You know how both Google and Apple blur faces, license plates, and other potentially personal information in their maps imagery? It’s an important and much-appreciated step to protect user privacy. In the modern AI-driven age, it comes as no surprise that there might be a better and more elegant solution. Enter German AI startup Brighter AI. Its Deep Natural Anonymization 2.0 technology promises facial and data anonymity through AI modification of images rather than blurring. The result is a far more natural look for the final images, one that downright makes blurring seem like a “caveman” approach to the problem.
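To illustrate the “caveman” approach being replaced, here is a minimal NumPy sketch of traditional pixelation-based anonymization: averaging over small tiles inside a detected bounding box. The box coordinates are assumed to come from a separate face/plate detector (hypothetical here); generative replacement, as Brighter AI describes it, would instead synthesize a new, natural-looking face in that region.

```python
import numpy as np

def pixelate_region(img, box, block=8):
    """Classic blur/pixelate anonymization: average each block-sized tile
    inside the given (x0, y0, x1, y1) bounding box. The box is assumed to
    come from an upstream detector (not implemented here)."""
    x0, y0, x1, y1 = box
    out = img.copy()
    region = out[y0:y1, x0:x1].astype(float)
    h, w = region.shape[:2]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            tile = region[by:by + block, bx:bx + block]
            tile[...] = tile.mean(axis=(0, 1))  # flatten tile to its mean color
    out[y0:y1, x0:x1] = region.astype(img.dtype)
    return out
```

Pixelation destroys the identifying detail but leaves an obvious visual artifact, which is exactly what a generative approach avoids.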
Deep Natural Anonymization 2.0
According to industry sources, Apple plans to acquire Brighter AI and use its technology in its offerings. The company’s Vision Pro VR/AR headset might be the first to benefit. It seems a perplexing pairing at first glance, but Apple is reportedly concerned that the Vision Pro raises particular privacy issues: the company believes it is far easier to capture video and images discreetly with the headset than with a smartphone.
While there is arguably nothing discreet about wearing a VR/AR headset today, there might be some truth to this hypothesis. Think about it: once such tech becomes more normalized, casually wearing one in all sorts of public places might actually become mundane. And even today, when it definitely is not, the sheer novelty of seeing someone with a headset strapped to their face is likely more than enough to distract you from the subtle visual indicator on the front of the device that comes on when it is capturing footage.
Anyway, such AI anonymization tech sounds pretty cool and useful in many ways beyond Vision Pro as well. If nothing else, Apple could use it to “de-blur” its map images and make them way more pleasant to look at.
Copyright for syndicated content belongs to the linked Source : GSMArena.com – https://www.gsmarena.com/apple_is_reportedly_preparing_to_buy_an_ai_startup_to_anonymize_private_data_in_images-news-61466.php