As the technology industry continues to evolve, two cutting-edge innovations stand out for their rapid advancement: Artificial Intelligence and Augmented Reality. In recent months, products such as the Apple Vision Pro and Google Gemini have captured the attention of casual and avid technology users alike. While each serves its own purpose, we believe there is strong potential for these technologies to complement and support each other.
At Perception, we are always looking for ways to enhance new technologies and incorporate new breakthroughs into our work. In the automotive industry, we have visualized complex AI to elevate and optimize driving. We have also partnered with VR and AR pioneers to conceptualize landscapes and applications for immersive digital and mixed reality experiences. With both of these technologies advancing every day, blending the two could bring some truly futuristic capabilities to life.
Eye Tracking for Mixed Reality
Artificial Intelligence, with its ability to analyze large amounts of data and make real-time decisions, provides a powerful engine that technology companies are leveraging more and more. It enables devices to perceive the user, understand context and adapt their behavior accordingly. Augmented Reality, on the other hand, overlays digital content onto the real world, seamlessly blending physical and virtual environments to enhance how we use and interact with our surroundings.
One of the most interesting ways of blending AI and AR is through immersive experiences. By using AI’s capabilities of language processing and machine learning, AR devices can better understand and respond to user input, creating more personalized experiences.
Visualizing the Capabilities of AI
Recently, our team conceptualized what this could look like. We designed a prototype application that bridges AI and AR in the context of sports viewing. The AR glasses recognize the environment and respond to cues, while AI provides real-time game information and recommendations based on the user’s preferences and past interactions. The AI learns what users like to see during the game by tracking what their eyes are drawn to. Whether it is player information or referee calls, the AI adapts and tailors the AR display to their interests. This not only helps the user interact with the game, but also provides a more informative, immersive experience.
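To make the idea concrete, the gaze-driven personalization described above could be sketched roughly like this. This is a minimal illustration, not our prototype's actual code: the `GazePreferenceModel` class, its category names and the dwell-time numbers are all hypothetical, standing in for the eye-tracking data a real headset would supply.

```python
from collections import defaultdict

class GazePreferenceModel:
    """Toy model: accumulate gaze dwell time per content category
    and surface the overlays a viewer looks at most."""

    def __init__(self):
        self.dwell = defaultdict(float)  # seconds of gaze per category

    def record_gaze(self, category: str, seconds: float) -> None:
        # Called whenever the headset reports the eyes resting on an overlay
        self.dwell[category] += seconds

    def top_overlays(self, n: int = 2) -> list:
        # Rank categories by accumulated attention and keep the top n
        ranked = sorted(self.dwell, key=self.dwell.get, reverse=True)
        return ranked[:n]

model = GazePreferenceModel()
model.record_gaze("player_stats", 4.2)
model.record_gaze("referee_calls", 1.1)
model.record_gaze("player_stats", 3.0)
print(model.top_overlays())  # ['player_stats', 'referee_calls']
```

A production system would of course replace the simple dwell-time tally with a learned model, but the loop is the same: observe attention, update preferences, adapt the display.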
Perception's FanVision AR Prototype
This convergence can go beyond casual entertainment to support human connection and communication. Language barriers often disrupt human interaction, but by combining AI and AR, we envision an application that can interpret and display translations in real time. For example, the AR device can recognize that someone is communicating in sign language. The AI interprets the gestures in real time and renders a translation within the AR display, even suggesting ways to respond and instructions for signing back, breaking down the language barrier.
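The translation pipeline envisioned above can be sketched as a few stages: capture gesture frames, classify each one, and assemble a caption for the AR display. Everything here is illustrative; `GestureFrame`, `classify_gesture` and the toy lookup table are hypothetical stand-ins for a trained sign-recognition model running on real camera data.

```python
from dataclasses import dataclass

@dataclass
class GestureFrame:
    landmarks: list  # hand keypoints from the headset cameras (hypothetical)

def classify_gesture(frame: GestureFrame) -> str:
    """Stand-in for an AI sign-recognition model; a real system would
    run a trained classifier over the hand landmarks."""
    lookup = {1: "HELLO", 2: "HOW", 3: "YOU"}  # toy landmark-count lookup
    return lookup.get(len(frame.landmarks), "?")

def translate_signs(frames: list) -> str:
    # Pipeline: recognize each gesture, then join the glosses into a caption
    glosses = [classify_gesture(f) for f in frames]
    return " ".join(glosses)

frames = [GestureFrame([0]), GestureFrame([0, 0]), GestureFrame([0, 0, 0])]
print(translate_signs(frames))  # HELLO HOW YOU
```

In a real application the caption stage would also feed a language model that suggests replies, closing the loop the article describes.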
The possibilities for blended AI and AR applications are endless. Retailers could use AR to help consumers visualize how furniture or appliances would look in their home, while AI could use that data to recommend the best fit based on what the user has interacted with, measured and sampled. Carpenters could use AR to visualize their projects, while AI predicts the measurements and tools needed to complete a job based on environmental conditions.
AR Windshield for Jeep
The convergence of AI and AR is already beginning to take root. OpenAI has announced a visionOS ChatGPT application for the Apple Vision Pro. With it, users can ask questions and receive answers in real time, displayed in a pop-up window within their environment. However, its level of interactivity with the user’s surroundings is limited.
Based on our work crafting futuristic technology for sci-fi films, we can already envision how this partnership with OpenAI could become even more immersive. For example, if the user asks a question about their television, the AI and AR technologies could recognize and highlight the television and provide relevant information based on what it looks like, its model and how it’s performing.
By blending AI and AR, technology like Tony Stark’s glasses or EDITH isn’t too far away. By harnessing the power of AI to enhance the capabilities of AR, we can unlock a new era of immersive experiences with the potential to reshape and improve our way of living. As we continue to explore the endless possibilities of this pairing, one thing is certain: the future of technology is fast approaching.
Experience Perception
Perception is an Emmy-nominated design lab pioneering the visionary process of Science Fiction Thinking to architect the future. We divide our time equally between the parallel worlds of science-fiction—working with trailblazing filmmakers—and science-fact—collaborating with the world’s most innovative technology brands.