Every Single Sensor inside the Apple Vision Pro and What It’s Individually Designed To Do

For a $3,499 USD device that’s designed to replace your phone, laptop, watch, tablet, television, and even your mouse, you bet that Apple’s Vision Pro is absolutely crammed with sensors that track you, your movements, eyesight, gestures, voice commands, and your position in space. As per Apple’s own announcement, the Vision Pro has as many as 14 cameras on the inside and outside, a LiDAR scanner, and multiple IR and invisible-LED illuminators to help it get a sense of where you are and what you’re doing. Aside from these, the headset also has a dedicated R1 Apple Silicon chip that crunches data from all these sensors (and a few others) to help create the best representation of Apple’s gradual shift toward “Spatial Computing”.

What is “Spatial Computing”?

“Vision Pro is a new kind of computer,” says Tim Cook as he reveals the mixed reality headset for the very first time. “It’s the first Apple product you look through, and not at,” he adds, marking Apple’s shift to Spatial Computing. What’s Spatial Computing, you ask? Well, the desktop was touted as the world’s first Personal Computer, or PC as we so ubiquitously call it today. The laptop shrank the desktop to a portable format, and the phone shrank it further… all the way down to the watch, which put your personal computer on your wrist. Spatial Computing marks Apple’s first shift away from Personal Computing, in the sense that you’re no longer limited by a display – big or small. “Instead, your surroundings become a canvas,” Cook summarizes, as he hands the stage to VP of Design Alan Dye.

Spatial Computing marks a new era of computing where the four corners of a traditional display no longer constrain your working environment. Instead, your real environment becomes your working environment, and just like you’ve got folders, windows, and widgets on a screen, the Vision Pro lets you create folders, windows, and widgets in your 3D space. Dye explains that in Spatial Computing, you don’t have to minimize a window to open a new one – simply drag one window to the side and open another. Apple’s visionOS turns your room and your visual periphery into an OS, letting you create multiple screens/windows wherever you want, move them around, and resize them. Think Minority Report or Tony Stark’s holographic computer… but with a better, classier interface.
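
To make the paradigm concrete, here’s a minimal Swift sketch of the mental model, with entirely hypothetical names (the real visionOS SDK wasn’t public at the time of writing): a window stops being a rectangle on a screen and becomes an object with a pose in the room.

```swift
// Conceptual sketch: in spatial computing a "window" carries a 3D pose in the
// room instead of 2D screen coordinates. Names are illustrative, not the
// actual visionOS API.

struct SpatialWindow {
    let title: String
    var x: Double, y: Double, z: Double  // meters, relative to the room origin
    var width: Double, height: Double    // meters
}

var workspace = [
    SpatialWindow(title: "Mail",   x: -0.8, y: 1.5, z: -2.0, width: 0.9, height: 0.6),
    SpatialWindow(title: "Safari", x:  0.8, y: 1.5, z: -2.0, width: 0.9, height: 0.6),
]

// "Drag one window to the side and open another": no minimizing required.
workspace[0].x -= 0.5
workspace.append(SpatialWindow(title: "Notes", x: 0, y: 1.2, z: -1.5, width: 0.6, height: 0.8))
print(workspace.map(\.title))  // ["Mail", "Safari", "Notes"]
```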

How the M2 and R1 Chips Handle Spatial Computing

At the heart of the Vision Pro are two chips that work together to blend virtuality and reality seamlessly. Apple’s M2 silicon handles general computing and multitasking, while a new R1 chip, proprietary to the headset, works with all the sensors inside and outside the device to track your eyes, process control input, and help virtual elements exist convincingly within the real world – doing impressive things like casting shadows on your surroundings, shifting perspective as you move around, and fading away when someone walks into your frame.

The R1 chip is pretty much Apple’s secret sauce in the Vision Pro. It handles data from every single sensor on the device, simultaneously tracking your environment, your position in it, your hands, and even your eye movements with stunning accuracy. Your eye movements form the basis of how the Vision Pro knows which elements you’re thinking of interacting with, practically turning your eyes into bona fide cursors. As impressive as that is, the R1 also uses your eye data to decide which parts of the scene to render in full detail and which not to. Given that you can only focus on a limited area at any given time, the R1 chip renders just the region you’re looking at with crisp clarity, rather than spending resources rendering the entire scene at full resolution. It’s a phenomenally clever way to optimize battery use while providing a brilliantly immersive experience. However, that’s not all…
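
Apple hasn’t published the R1’s internals, but the technique described above is widely known as foveated rendering. Here’s a minimal, purely illustrative Swift sketch of the idea – all names and thresholds are invented, not Apple’s actual pipeline:

```swift
import Foundation

// Hypothetical sketch of foveated rendering: render at full resolution only
// near the gaze point, and progressively cheaper farther away.
// All names and thresholds here are illustrative, not Apple's actual API.

enum RenderQuality {
    case full      // crisp, native-resolution shading
    case reduced   // half-rate shading
    case coarse    // heavily downsampled periphery
}

struct GazeSample {
    let x: Double  // normalized screen coordinates, 0...1
    let y: Double
}

/// Pick a shading rate for a screen tile based on its distance from the gaze.
/// A real system would derive these thresholds from the eye's acuity falloff
/// and the headset's optics.
func quality(forTileAt tx: Double, _ ty: Double, gaze: GazeSample) -> RenderQuality {
    let distance = hypot(tx - gaze.x, ty - gaze.y)
    switch distance {
    case ..<0.1:  return .full     // foveal region: ~full acuity
    case ..<0.3:  return .reduced  // near periphery
    default:      return .coarse   // far periphery: acuity drops sharply
    }
}

// Example: with the user looking at the center, a corner tile renders coarse.
let gaze = GazeSample(x: 0.5, y: 0.5)
print(quality(forTileAt: 0.95, 0.95, gaze: gaze))  // -> coarse
```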

Apple Engineer Reveals the (Scary) Powerful Capabilities of the R1 Chip

A former neurotechnology engineer at Apple lifted the veil on exactly how complex – and somewhat scary – the Vision Pro’s internal tech is. Staying within the bounds of his NDA, Sterling Crispin shared in a tweet how the Vision Pro tracks your eyes and knows how you’re navigating its interface so flawlessly. Fundamentally, the R1 chip is engineered to be borderline magical at predicting a user’s eye journey and intent. “One of the coolest results involved predicting a user was going to click on something before they actually did […] Your pupil reacts before you click in part because you expect something will happen after you click,” Crispin mentions. “So you can create biofeedback with a user’s brain by monitoring their eye behavior, and redesigning the UI in real time to create more of this anticipatory pupil response.”

“Other tricks to infer cognitive state involved quickly flashing visuals or sounds to a user in ways they may not perceive, and then measuring their reaction to it,” Crispin further explains. “Another patent goes into details about using machine learning and signals from the body and brain to predict how focused, or relaxed you are, or how well you are learning. And then updating virtual environments to enhance those states. So, imagine an adaptive immersive environment that helps you learn, or work, or relax by changing what you’re seeing and hearing in the background.”
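
To make the eye-tracking idea concrete: at its simplest, detecting an “anticipatory” pupil response means comparing recent pupil diameter against a rolling baseline. The Swift sketch below is purely illustrative – Apple’s unpublished models are certainly far more sophisticated, reportedly using machine learning over many body and brain signals:

```swift
import Foundation

// Purely illustrative: flag an "anticipatory" moment when pupil diameter
// rises meaningfully above its recent baseline. The threshold and window
// sizes are invented; nothing here reflects Apple's actual models.

func isAnticipating(pupilDiametersMm: [Double], threshold: Double = 0.15) -> Bool {
    guard pupilDiametersMm.count >= 10 else { return false }
    // Baseline: everything except the last three samples.
    let baseline = pupilDiametersMm.dropLast(3).reduce(0, +)
                 / Double(pupilDiametersMm.count - 3)
    // Recent: the last three samples.
    let recent = pupilDiametersMm.suffix(3).reduce(0, +) / 3
    return recent - baseline > threshold  // dilation above baseline
}

// A steady pupil that suddenly dilates just before a click:
let samples = [3.0, 3.0, 3.1, 3.0, 3.0, 3.1, 3.0, 3.3, 3.4, 3.5]
print(isAnticipating(pupilDiametersMm: samples))  // true
```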

A Broad Look at Every Sensor on the Apple Vision Pro

Sensors power the Vision Pro’s spatial computing abilities, and here’s a look at all the sensors Apple highlighted in the keynote, along with a few others that sit under the Vision Pro’s hood. This list isn’t complete, since the Vision Pro hasn’t been available for a tech teardown yet, but it includes every sensor Apple has mentioned.

Cameras – The Vision Pro has an estimated 14 cameras capturing detail inside and outside the headset. Ten of them (2 main, 4 downward-facing, 2 TrueDepth, and 2 side-facing) sit on the outer part of the headset and sense your environment in stereoscopic 3D, while 4 IR cameras inside the headset track your eyes and perform 3D scans of your iris to authenticate the user.
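
The stereoscopic pairs rely on a principle worth spelling out: the same point in the world lands at slightly different horizontal positions in the two cameras, and that disparity encodes depth. A back-of-the-envelope Swift sketch, with invented camera parameters (Apple hasn’t published the Vision Pro’s camera geometry):

```swift
import Foundation

// Classic pinhole stereo relation: depth = focalLength * baseline / disparity.
// The numbers below are invented for illustration only.

let focalLengthPixels = 700.0   // focal length expressed in pixels
let baselineMeters    = 0.064   // distance between the two main cameras

/// Depth of a point seen `disparity` pixels apart in the left/right images.
func depthMeters(disparityPixels: Double) -> Double {
    focalLengthPixels * baselineMeters / disparityPixels
}

print(depthMeters(disparityPixels: 44.8))  // ≈ 1.0 m away
print(depthMeters(disparityPixels: 22.4))  // ≈ 2.0 m: half the disparity, twice the depth
```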

LiDAR Scanner – The LiDAR scanner uses light to measure distances, creating a 3D map of the world around you. The technology is used in most self-driving automotive systems, and Apple already ships a LiDAR scanner on Pro-model iPhones and iPads to aid AR and low-light autofocus (Face ID, by contrast, relies on the separate TrueDepth infrared system). On the Vision Pro, the LiDAR scanner sits front and center, right above the nose, capturing a detailed depth map of the world around you; the headset’s sensors also capture a 3D model of your face that it then uses as an avatar during FaceTime.
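
LiDAR (light detection and ranging) rests on a simple time-of-flight relation: distance equals the speed of light times the round-trip time, divided by two. A toy illustration in Swift:

```swift
import Foundation

// Time-of-flight ranging: a light pulse travels to the object and back,
// so the one-way distance is half the round trip.
let speedOfLight = 299_792_458.0  // meters per second

func distanceMeters(roundTripSeconds: Double) -> Double {
    speedOfLight * roundTripSeconds / 2
}

// A wall about 1.5 m away returns the pulse in roughly 10 nanoseconds.
print(distanceMeters(roundTripSeconds: 10e-9))  // ≈ 1.499 m
```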

IR Cameras – An IR camera does the job of a regular camera when a regular camera can’t: IR sensors work in absolute darkness too, giving them a significant edge over conventional cameras. That’s why the headset has 4 IR cameras on the inside, and an undisclosed number of IR cameras/sensors on the outside, helping the device see regardless of lighting conditions. The IR cameras inside the headset do a remarkable job of tracking your eyes as well as building a 3D scan of your iris for Apple’s secure Optic ID authentication system.

Illuminators – While these aren’t sensors, they play a key role in letting the sensors do their job. The Vision Pro has 2 IR illuminators on the outside that flash invisible infrared dot grids to help accurately scan a person’s face (very similar to Face ID). On the inside, the headset has invisible LED illuminators surrounding each eye that help the IR cameras track eye movement and reactions, and perform detailed scans of your iris. These illuminators are crucial in low-light settings, giving the IR cameras data to work with.

Accelerometer & Gyroscope – Although Apple didn’t mention these in the keynote, the Vision Pro almost certainly has accelerometers and gyroscopes to track movement and tilt. Like any good headset, the Vision Pro tracks motion with 6 degrees of freedom: translation along three axes (left/right, forward/backward, up/down) as well as rotation around them (pitch, yaw, and roll). The accelerometer captures linear movement, while the gyroscope tells the headset how your head is rotating and tilting. These sensors, along with the cameras and scanners, give the R1 chip the data it needs to know where you’re standing, moving, and looking.
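
The standard way to fuse the two sensors is a complementary filter: the gyroscope gives smooth, fast updates but drifts over time, while the accelerometer’s gravity reading is drift-free but jittery. Below is a minimal one-axis Swift sketch of the idea – not Apple’s actual sensor fusion, which is far more sophisticated:

```swift
import Foundation

// One-axis complementary filter: fuse gyroscope rate with accelerometer tilt.
// The gyro integrates smoothly but drifts; the accelerometer's gravity
// reading is drift-free but noisy. Blending the two gives a stable angle.

struct TiltEstimator {
    var angleRadians = 0.0
    let alpha = 0.98  // trust the gyro 98%, the accelerometer 2%

    mutating func update(gyroRate: Double,    // rad/s around one axis
                         accelAngle: Double,  // tilt implied by gravity, rad
                         dt: Double) {        // seconds since last sample
        let gyroAngle = angleRadians + gyroRate * dt
        angleRadians = alpha * gyroAngle + (1 - alpha) * accelAngle
    }
}

var estimator = TiltEstimator()
// Simulate holding the head at a steady 0.1 rad tilt with a drifting gyro.
for _ in 0..<500 {
    estimator.update(gyroRate: 0.002, accelAngle: 0.1, dt: 0.01)
}
print(estimator.angleRadians)  // settles near 0.1 rad despite the gyro drift
```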

Microphones – The Vision Pro has an undisclosed number of microphones built into the headset that serve two broad purposes: voice detection and spatial audio. Voice commands form a core part of how you interact with the headset, which is why the Vision Pro’s microphones let you perform search queries, summon apps and websites, and talk naturally to Siri. The microphones also perform an acoustic scan of your room, just the way the cameras perform a visual one, so the headset can match its sound to the room you’re in – delivering the right amount of reverb, the right tonal frequencies, and so on. Moreover, as you turn your head, sounds stay anchored in place – a trick that relies on the headset’s tracking data – creating a sonic illusion that lets your ears believe what your eyes see.
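
Keeping a sound “in the same place” as you turn is, at its core, a matter of counter-rotating the source position by the head’s orientation before spatializing it. A simplified, yaw-only Swift sketch (a real system adds full 3D rotation, distance cues, and room-matched reverb):

```swift
import Foundation

// World-anchored audio, yaw only: a source fixed in the room must appear to
// move the opposite way in head-relative coordinates as you turn your head.

struct Position { var x: Double; var z: Double }  // +x right, -z forward

/// Transform a world-space source into head-relative space by rotating it
/// through the inverse of the head's yaw.
func headRelative(source: Position, headYawRadians: Double) -> Position {
    let c = cos(headYawRadians)
    let s = sin(headYawRadians)
    return Position(x: source.x * c - source.z * s,
                    z: source.x * s + source.z * c)
}

let speaker = Position(x: 0, z: -2)  // 2 m straight ahead in the room
// Turn your head 90° to the left: the speaker should now sit to your right.
let relative = headRelative(source: speaker, headYawRadians: .pi / 2)
print(relative)  // ≈ (x: 2, z: 0) — off to the right, as expected
```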

Other Key Components

Aside from the sensors, the Vision Pro is filled with a whole slew of tech components, from screens to battery packs. Here’s a look at what else lies underneath the Vision Pro’s hood.

Displays – Given its name, the Vision Pro obviously focuses heavily on your visual sense… and it does so with some of the most incredible displays ever put in a headset. The Vision Pro has two stamp-sized micro-OLED displays (one for each eye), each boasting more pixels than a 4K television. That gives the Vision Pro’s main displays a staggering 23 million pixels combined, refreshed every 12 milliseconds (roughly 83 frames per second). Meanwhile, the outside of the headset has a display too, which shows your eyes to people around you. While the quality of this display isn’t known, it’s a curved OLED screen with a lenticular film in front of it that creates the impression of depth, so people see dimension in your eyes rather than a flat image.
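
Those figures are easy to sanity-check: 23 million pixels across two panels is about 11.5 million per eye, comfortably above a 4K TV’s roughly 8.3 million, and a 12-millisecond refresh interval works out to 1000 / 12 ≈ 83 frames per second:

```swift
// Sanity-checking the display math from Apple's stated figures.
let totalPixels  = 23_000_000.0
let pixelsPerEye = totalPixels / 2        // ≈ 11.5 million per panel
let fourKPixels  = 3840.0 * 2160.0        // ≈ 8.3 million
print(pixelsPerEye > fourKPixels)         // true: each eye beats a 4K TV

let refreshIntervalMs = 12.0
print(1000.0 / refreshIntervalMs)         // ≈ 83.3 frames per second
```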

Audio Drivers – The headset’s band has audio drivers built into each temple, firing rich, environmentally responsive audio into your ears as you wear the headset. Apple mentioned that the Vision Pro has dual audio drivers for each ear, which could indicate sound quality that rivals the AirPods Max.

Fans – To keep the headset cool, the Vision Pro has an undisclosed number of fans that maintain optimal temperatures inside the headset. The fans are quiet yet incredibly powerful, cooling not one but two chips. A grille detail on the bottom channels out the hot air.

Digital Crown – Borrowing from the Apple Watch, the Vision Pro has a Digital Crown that summons the home screen at a press, and rotates to dial your level of immersion up or down, drowning out the world around you for a true VR experience.

Shutter Button – The Digital Crown is accompanied by a shutter button that lets you capture three-dimensional photos and videos that can be viewed within the Vision Pro headset.

Battery – Lastly, the Vision Pro has an independent battery unit that attaches to the headset using a proprietary connector. The separate battery pack helps reduce the weight of the headset itself, which is already hefty thanks to its metal-and-glass construction; given how heavy batteries are, an external pack helps distribute the load. Apple hasn’t shared the battery’s milliamp-hour capacity, but did mention that it delivers 2 hours of usage on a full charge. How the battery itself charges hasn’t been mentioned either.


Apple to Announce Their First Ever Augmented Reality Glasses in 3 days… Here’s What to Expect

Apple hasn’t launched a single new product category since it unveiled the AirPods back in 2016. Sure, the AirPods Max debuted in 2020, but that was less a bold leap than a natural progression. The point I’m really trying to make is that it’s been a while since the company was ‘recklessly innovative’, and it seems like we might just get a taste of that three days from now at WWDC.

Augmented Reality has always been Tim Cook’s favorite buzzword, and he’s consistently pushed for Apple to have a presence in the space. All of this is expected to culminate in what analysts and leakers call “Reality”, Apple’s first XR headset. This cutting-edge device, expected to be unveiled at Apple’s Worldwide Developers Conference, aims to pioneer the relatively uncharted realm of mixed-reality technology. With a price tag of approximately $3,000, the ‘Reality’ headset has been seven years in the making, and has apparently been mired in controversy too, with a large chunk of Apple’s own employees expressing doubt and disdain. Here’s everything we know about the Reality headset (or could it be a pair of glasses?) that’s set to launch this Monday.

Concept Images by Kylin Wu

The headset’s design journey has oscillated between thick and obtrusive, like your average VR headset, and as slim as a pair of spectacles – or, realistically, a pair of chunky ski goggles. At its heart, however, lies the innovative xrOS, designed to provide an interface that echoes the familiar iOS experience. The new operating system (pretty much confirmed thanks to a trademark filed by Apple in New Zealand) is set to revolutionize how users interact with their devices, presenting a traditional Home Screen in an entirely new dimension filled with apps and customizable widgets.

One of the most exciting features of ‘Reality’ is its ability to merge digital elements with the real world. The xrOS software could potentially project AR app interface elements onto actual objects, creating a seamless mixed-reality overlay effect. This represents a significant leap forward in AR technology, blurring the boundaries between the physical and digital worlds. According to MacRumors, the ‘Reality’ device will achieve this using “dual high-resolution 4K micro OLED displays with up to 3,000 pixels per inch for a rich, realistic, and immersive viewing experience.” To operate the device, the user’s hands and eyes will be monitored by over a dozen optical cameras. The user can select an on-screen item by simply looking at it and activate it by making a hand gesture, such as a pinch.
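
That look-then-pinch model is straightforward to express as a tiny selection loop: gaze determines the target, and a pinch commits it. The Swift sketch below is hypothetical – the real xrOS APIs hadn’t shipped at the time of writing:

```swift
import Foundation

// Hypothetical look-then-pinch selection loop. Gaze determines the target;
// a pinch gesture activates whatever is currently under the gaze.

struct GazeTarget {
    let name: String
    let frame: (x: ClosedRange<Double>, y: ClosedRange<Double>)
}

func targetUnderGaze(_ gaze: (x: Double, y: Double),
                     in targets: [GazeTarget]) -> GazeTarget? {
    targets.first { $0.frame.x.contains(gaze.x) && $0.frame.y.contains(gaze.y) }
}

let targets = [
    GazeTarget(name: "Safari",   frame: (0.0...0.3, 0.0...0.3)),
    GazeTarget(name: "Messages", frame: (0.4...0.7, 0.0...0.3)),
]

let gaze = (x: 0.55, y: 0.12)  // the user is looking at Messages
let pinchDetected = true       // the hand-tracking cameras saw a pinch

if pinchDetected, let target = targetUnderGaze(gaze, in: targets) {
    print("Activating \(target.name)")  // -> Activating Messages
}
```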

The core of xrOS will feature re-imagined versions of Apple’s staple apps. From Safari to Messages, Apple TV+ to Apple Music, users will have the flexibility to work with multiple apps simultaneously, ensuring a dynamic and engaging user experience. Apple is also set to transform existing services into immersive viewing experiences. Imagine watching videos as if on a giant screen, or engaging in guided meditations enhanced by immersive visuals, audio, and voiceovers. Services like Apple Fitness+, Apple TV+, and a 3D version of Apple’s collaborative Freeform tool are set to offer these radical experiences in xrOS. In addition to the reimagined versions of existing apps, Apple is likely to introduce new offerings tailored to the unique capabilities of the ‘Reality’ headset. These could include a Books app for immersive reading, a Health app focusing on psychological wellness, and a Camera app that captures images from the headset’s cameras, promising a whole new level of interaction and engagement.

Apple is reportedly also working with a select number of game developers to help them update their existing content for mixed reality. Furthermore, Apple reportedly has a robust set of tools that will allow non-developers to create their own AR/VR experiences, even without coding skills. These user-created AR apps could be distributed on the App Store alongside developer-created apps.

The Reality headset doesn’t come without its fair share of controversy. It remains one of the most divisive products within Apple itself, with multiple people leaving the project for other divisions, or leaving the company entirely. Multiple engineers have expressed the opinion that Tim Cook should wait until the product is “good enough” for consumers… a feeling people on Apple’s board have expressed too. According to credible Apple reporter Mark Gurman of Bloomberg, Cook gave multiple key executives and personnel a preview of the Reality headset a little over a month ago. However, it seems Cook has been adamant about releasing the headset as soon as possible, albeit as a developer-focused product rather than a consumer-ready gadget. This will probably help lay the groundwork for a much more consumer-friendly Reality headset somewhere down the line. Until then, we have our fingers crossed and our calendars set for June 5th, 10 am PT!

