What is Spatial Computing: Design in Apple’s Vision of the Future

Apple made waves earlier this month when it finally revealed its long-awaited foray into the world of mixed or extended reality. That the company has had its eyes on this market is hardly a secret. In fact, the delayed (at least by market standards) announcement had some wondering if it was all just wishful thinking. At WWDC 2023, Apple definitely showed the world that it means serious business, perhaps even too serious. The Apple Vision Pro headset itself is already a technological marvel, but in typical Apple fashion, the company didn’t dwell too much on the specs that would make many pundits drool. Instead, Apple homed in on how the sleek headset almost literally opens up a whole new world and breaks down the barriers that limited virtual and augmented reality. More than just the expensive hardware, Apple is selling an even more costly new computing experience, one that revolves around the concept of “Spatial Computing.” But what is Spatial Computing, and does it have any significance beyond viewing photos, browsing the Web, and walking around in a virtual environment? As it turns out, it could be a world-changing experience, both virtually and in reality.

Designer: Apple

Making Space: What is Spatial Computing?

Anyone who has been keeping tabs on trends in the modern world has probably already heard about virtual reality, augmented reality, or even extended reality. Although they sound new to our ears, their origins actually go far, far back, long before Hollywood even got a whiff of them. At the same time, however, we’ve been hearing about these technologies so much, especially from certain social media companies, that you can’t help but roll your eyes at “yet another one” coming our way. Given the hype, it’s certainly understandable to be wary of all the promises that Apple has been making, but that would be underselling what makes Spatial Computing really feel like THE next wave in computing.

It’s impossible to discuss Spatial Computing without touching base with VR and AR, the granddaddies of what is now collectively called “eXtended Reality” or XR. Virtual Reality (VR) is pretty much the better-known of the two, especially because it is easier to implement. Remember that cardboard box with a smartphone inside that you strap to your head? That’s pretty much the most basic example of VR, which practically traps you inside a world full of pixels and intangible objects. Augmented Reality (AR) frees you from that made-up world and instead overlays digital artifacts on real-world objects, much like those Instagram filters everyone seems to love or love to hate. The catch is that these are still intangible virtual objects, and nothing you do in the real world really changes them. Mixed Reality (MR) fixes that and bridges the two so that a physical knob can actually change some virtual configuration, or a virtual switch can toggle a light somewhere in the room.

In that sense, Spatial Computing is the culmination of all these technologies but with a very specific focus, which you can discern from its name. In a nutshell, it turns the whole world into your computer, making any available space an invisible wall where you can hang your apps’ windows. Yes, there will still be windows (with a small “w”) because of how our software is currently designed, but you can hang as many as you want in whatever space you have. Or you can just have one super gigantic video player taking up your vision. The idea also makes use of our brain’s innate ability to associate things with spaces (the theory behind the “Memory Palace”) to help us organize our room-sized computer desktop. In a sense, it makes the computer practically invisible, allowing you to directly interact with applications as if they existed physically in front of you, because they practically do.
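
For developers, this window-centric model maps quite directly onto familiar app structure. Here’s a minimal, hypothetical sketch of how a visionOS app might declare ordinary windows, a volumetric window, and a fully immersive space in SwiftUI; the scene identifiers and views are illustrative, not taken from any Apple sample:

```swift
import SwiftUI

@main
struct SpatialDeskApp: App {
    var body: some Scene {
        // An ordinary 2D window the user can "hang" anywhere in the room.
        WindowGroup(id: "browser") {
            Text("A floating browser window")
        }

        // A volumetric window for 3D content that occupies real space.
        WindowGroup(id: "model-viewer") {
            Text("A 3D model viewer")
        }
        .windowStyle(.volumetric)

        // A fully immersive space, e.g. that giant video player filling your vision.
        ImmersiveSpace(id: "theater") {
            Text("Immersive theater")
        }
    }
}
```

The point is that the “room-sized desktop” is still composed of the same scenes and views developers already write; the operating system, not the app, decides where in your room they hang.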

Apple Reality

Of course, you could say that even Microsoft’s HoloLens already did all that. What makes Spatial Computing and Apple’s implementation different is how the virtual and the real affect each other, much like in mixed reality. There is, for example, the direct way we can control the floating applications using nothing but our own bodies, whether it’s with hand gestures or even just the movement of our eyes. This is the fulfillment of all those Minority Report fantasies, except you don’t even need to wear gloves. Even your facial expressions can have an effect on your FaceTime doppelganger, a very useful trick since you won’t have a FaceTime camera available while wearing the Apple Vision Pro.

Apple’s visionOS Spatial Computing, however, is also indirectly affected by your physical environment, and this is where it gets a little magical and literally spatial. According to Apple’s marketing, your virtual windows will cast shadows on floors or walls, and they’ll be affected by ambient light as well. Of course, you’ll be the only one who sees those effects, but they make the windows and other virtual objects feel more real to you. The Vision Pro will also dim its display to mimic the effect of dimming your lights when you want to watch a movie in the dark. It can even analyze surrounding objects and their textures to mix the audio so that it sounds like it’s really coming from all directions and bouncing off those objects.

The number of technologies needed to make this seamless experience possible is quite staggering, which is why Apple didn’t focus too much on the optics, often the key selling point of XR headsets. From the sensors to the processors to the AI that interprets all that data, it’s no longer surprising that it took Apple this long to announce the Vision Pro and its Spatial Computing. It is, however, also its biggest gamble, and it could very well ruin the company if it crashes and burns.

Real Design

Spatial Computing is going to be a game-changer, but it’s not a change that will happen overnight, no matter how much Apple wants it to. This is where computing is heading, whether we like it or not, but it’s going to take a lot of time as well. And while it may have “computing” in its name, its ramifications will impact almost all industries, not just entertainment and, well, computing. When Spatial Computing does take off, it will even change the way we design and create things.

Many designers are already using advanced computing tools like 3D modeling software, 3D printers, and even AI to assist their creative process. Spatial Computing will take it up a notch by letting designers have a more hands-on approach to crafting. Along with “digital twins” and other existing tools, it will allow designers and creators to iterate over designs much faster, letting them measure a hundred times and print only once, saving time, resources, and money in the long run.

Spatial Computing also has the potential to change the very design of products themselves, but not in the outlandish way that the Metaverse has been trying to do. In fact, Spatial Computing flips the narrative and gives more importance to physical reality rather than having an expensive, one-of-a-kind NFT sneaker you can’t wear in real life. Spatial Computing highlights the direct interaction between physical and virtual objects, and this could open up a new world of physical products designed to interact with apps or, at the very least, influence them by their presence and composition. It might be limited to what we would consider “computing,” but in the future, computing will pretty much be the way everyone will interact with the world around them, just like how smartphones are today.

Human Nature

As grand as Apple’s Vision might be, it will be facing plenty of challenges before its Spatial Computing can be considered a success, not the least of which is the price of the Vision Pro headset itself. We’ve highlighted those Five Reasons Why the Apple Vision Pro Might Fail, and the biggest reason will be the human factor.

Humans are creatures of habit as well as tactile creatures. It took years, maybe even decades, for people to get used to keyboards and mice, and some people struggle with touch screens even today. While Apple’s Spatial Computing promises the familiar controls of existing applications, the way we will interact with them will be completely gesture-based and, therefore, completely new. Add to that the fact that even touch screens give our fingers something to feel, and you can already imagine how alien those mid-air hand gestures might feel for the first few years.

Apple surely did its due diligence in ergonomic and health studies, but it’s not hard to see why this won’t be the most common way people do computing for a while, even if the Vision Pro were dirt cheap. Granted, today’s computers and mobile devices are hardly ergonomic by design, but plenty of solutions have been developed for them by now. Spatial Computing is still uncharted territory, even after VR and AR have long blazed a trail. It will definitely take time for our bodies to get used to it before Spatial Computing becomes second nature, and Apple will have to stay strong until then.

Final Thoughts

As expected, Apple wasn’t content to announce just another AR headset to join an uncertain market. The biggest surprise was its version of Spatial Computing, delivered through what it formally calls visionOS. Much of what we’ve seen is largely marketing and promises, but this is Apple we’re talking about. It might as well be reality, even if it takes a while to fully happen.

Unlike the entertainment-centric VR or the almost ridiculous Metaverse, Spatial Computing definitely feels like the next evolution of computing that will be coming sooner rather than later. It’s definitely still at an early stage, even if the seeds were planted nearly two decades ago, but it clearly shows potential to become more widely accepted because of its more common and general applications. It also has the potential to change our lives in less direct ways, like changing the way we learn or even design products. It’s not yet clear how long it will take, but it’s not hard to see how Apple’s Vision of the future could very well be our own.


Apple Vision Pro for $999? An engineer built the Vision Pro’s eye + hand-tracking interface for the Meta Quest Pro

If every tech reviewer who got to try on the Vision Pro after Apple’s WWDC event can be considered a reliable source, the Vision Pro is absolutely ‘magical’. Almost everyone who got to try it on (even Disney’s CEO Bob Iger) has the same feeling of being simultaneously sucked in and blown away by how incredibly immersive and intuitive the tech is. The resolution is flawless, the eye-tracking is brilliant, and the overall experience has changed the minds of quite a few skeptics. There’s a downside, however… This magical experience costs a whopping $3500 USD.

For YouTuber ThrillSeeker, this downside seemed a little too rich. Ultimately, the Apple Vision Pro’s unique interface could be boiled down to three distinct features – Passthrough (the ability to see the world through your headset), Eye Tracking, and Hand Tracking… and the $999 Meta Quest Pro has all three of those features. “I’ve been in VR for half a decade, and have been making videos about AR and VR for most of that time,” said the YouTuber. “I struggle to believe that Apple has somehow created something so radically superior, so transformative, that it warrants the use of the word Magical.” A lot of the Vision Pro’s magic is the result of its highly intuitive UI, which lets you interact with elements simply by looking at them and pinching your fingers. The Meta Quest Pro is capable of doing all these things too, although nobody at Meta really built them out… so ThrillSeeker decided to give things a go.

Designer: ThrillSeeker

ThrillSeeker started by shooting a tweet to Meta’s CTO, Andrew Bosworth, hoping for some leads and support, but understandably never heard from him (I assume everyone at Meta was just taking a while to recover from the Apple Keynote). Deciding to take matters into his own hands (and eyes), he went on to build the eye and hand-tracking system, designing a mock app drawer (the visionOS home page) to test out his UI.

The entire interface was designed and coded within Unity, where ThrillSeeker tapped into the Quest Pro’s eye-tracking abilities and turned them into a controller of sorts. Most VR headsets ship with controllers, and these controllers cast invisible virtual rays to point at objects, which the headset then treats as a cursor. ThrillSeeker simply turned the wearer’s gaze into that laser pointer, allowing app icons to pop forward when you look at them (just like on the Vision Pro). Tapping your fingers together would select/grab the icon, allowing you to manipulate it and move it around.
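
ThrillSeeker built his version in Unity, but the underlying idea is plain geometry. Here’s a rough, self-contained Swift sketch of the same technique – treat the gaze as a pointer ray, test it against the icons, and use a pinch as the click. All of the types, names, and thresholds below are illustrative; real gaze and hand data would come from the headset’s SDK:

```swift
import simd

// A hypothetical app icon floating in space, approximated by a bounding sphere.
struct AppIcon {
    let name: String
    let center: SIMD3<Float>
    let radius: Float
}

// Ray-sphere intersection: distance along the ray to the nearest hit, if any.
// `direction` is assumed to be normalized.
func hitDistance(origin: SIMD3<Float>, direction: SIMD3<Float>, icon: AppIcon) -> Float? {
    let oc = origin - icon.center
    let b = simd_dot(oc, direction)
    let c = simd_dot(oc, oc) - icon.radius * icon.radius
    let discriminant = b * b - c
    guard discriminant >= 0 else { return nil }
    let t = -b - discriminant.squareRoot()
    return t >= 0 ? t : nil
}

// Treat the gaze direction exactly like a controller's pointer ray:
// whichever icon it hits first is the one that "pops" forward.
func iconUnderGaze(origin: SIMD3<Float>, gaze: SIMD3<Float>, icons: [AppIcon]) -> AppIcon? {
    icons
        .compactMap { icon in
            hitDistance(origin: origin, direction: gaze, icon: icon).map { (icon, $0) }
        }
        .min { $0.1 < $1.1 }?
        .0
}

// A pinch (thumb and index fingertips closer than ~1.5 cm) acts as the click.
func isPinching(thumbTip: SIMD3<Float>, indexTip: SIMD3<Float>) -> Bool {
    simd_distance(thumbTip, indexTip) < 0.015
}
```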

The pop-out 3D app icons

Even though it’s highly preliminary, ThrillSeeker’s demo proved one thing – that Apple’s magical UI isn’t entirely inconceivable; it’s just that nobody at Meta (or Sony or HTC) ever thought to build it in the first place. His demonstration proves that this eye and hand-controlled interface is absolutely possible with the existing tech in a $999 Quest Pro device. ThrillSeeker is planning on making the APK for this demo available in the near future for all Meta Quest Pro users. We’ll add the link here as soon as he does!


Every Single Sensor inside the Apple Vision Pro and What It’s Individually Designed To Do

For a $3,499 USD device that’s designed to replace your phone, laptop, watch, tablet, television, and even your mouse, you can bet that Apple’s Vision Pro is absolutely crammed with sensors that track you, your movements, eyesight, gestures, voice commands, and your position in space. As per Apple’s own announcement, the Vision Pro has as many as 14 cameras on the inside and outside, 1 LiDAR scanner, and multiple IR and invisible LED illuminators to help it get a sense of where you are and what you’re doing. Aside from this, the headset also has a dedicated R1 Apple Silicon chip that crunches data from all these sensors (and a few others) to help create the best representation of Apple’s gradual shift towards “Spatial Computing”.

What is “Spatial Computing”?

“Vision Pro is a new kind of computer,” says Tim Cook as he reveals the mixed reality headset for the very first time. “It’s the first Apple product you look through, and not at,” he adds, marking Apple’s shift to Spatial Computing. What’s Spatial Computing, you ask? Well, the desktop was touted as the world’s first Personal Computer, or PC as we so ubiquitously call it today. The laptop shrank the desktop to a portable format, and the phone shrank it further… all the way down to the watch, which put your personal computer on your wrist. Spatial Computing marks Apple’s first shift away from Personal Computing, in the sense that you’re no longer limited by a display – big or small. “Instead, your surroundings become a canvas,” Tim summarizes, as he hands the stage to VP of Design, Alan Dye.

Spatial Computing marks a new era of computing where the four corners of a traditional display no longer constrain your working environment. Instead, your real environment becomes your working environment, and just like you’ve got folders, windows, and widgets on a screen, the Vision Pro lets you create folders, windows, and widgets in your 3D space. Dye explains that in Spatial Computing, you don’t have to minimize a window to open a new one. Just drag one window to the side and open another. Apple’s visionOS turns your room and your visual periphery into an OS, letting you create multiple screens/windows wherever you want, move them around, and resize them. Think Minority Report or Tony Stark’s holographic computer… but with a better, classier interface.
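
That “drag one aside, open another” flow mirrors how multi-window SwiftUI apps already behave on Apple’s other platforms, so the developer-facing side could plausibly be as simple as the hypothetical sketch below (the window identifier is made up for illustration and would have to match a declared WindowGroup):

```swift
import SwiftUI

// A hypothetical control that spawns an extra window in the room.
struct NewWindowButton: View {
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Button("Open another window") {
            // Each call creates a new, independent window; the one you
            // dragged aside stays exactly where you left it.
            openWindow(id: "browser")
        }
    }
}
```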

How the M2 and R1 Chips Handle Spatial Computing

At the heart of the Vision Pro headset are two chips that work together to blend virtuality and reality seamlessly. The Vision Pro is equipped with Apple’s M2 silicon chip to handle computing and multitasking, along with a new R1 silicon chip that’s proprietary to the headset. The R1 works with all the sensors inside and outside the headset to track your eyesight, control input, and help virtual elements exist seamlessly within the real world, doing impressive things like casting shadows on the world around you, changing angles when you move around, or disappearing/fading when someone walks into your frame.

The R1 chip is pretty much Apple’s secret sauce in the Vision Pro. It handles data from every single sensor on the device, simultaneously tracking your environment, your position in it, your hands, and even your eye movements with stunning accuracy. Your eye movements form the basis of how the Vision Pro knows what elements you’re thinking of interacting with, practically turning your eyes into bona fide cursors. As impressive as that is, the R1 also uses your eye data to know which parts of the scene to render in full detail and which not to. Given that you can only focus on a limited area at any given time, the R1 chip knows to render just that part of your visual periphery with crisp clarity, rather than spending resources rendering out the entire scene. It’s a phenomenally clever way to optimize battery use while providing a brilliantly immersive experience. However, that’s not all…
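
This technique is generally known as foveated rendering. The sketch below shows the core idea in Swift – grade each screen tile’s render quality by its angular distance from the gaze direction. The thresholds and quality tiers are illustrative guesses, not Apple’s actual pipeline:

```swift
import Foundation
import simd

// A conceptual sketch of gaze-based (foveated) rendering: spend full
// resolution only within a few degrees of where the eye is pointing.

enum RenderQuality { case full, half, quarter }

// Angle (in radians) between the gaze direction and the direction to a tile.
func angleBetween(_ a: SIMD3<Float>, _ b: SIMD3<Float>) -> Float {
    let cosTheta = simd_dot(simd_normalize(a), simd_normalize(b))
    return acos(max(-1, min(1, cosTheta)))  // clamp to avoid NaN from rounding
}

// Tiles within ~5 degrees of the gaze get full resolution; the periphery
// gets progressively coarser. These thresholds are illustrative.
func quality(forTileDirection tile: SIMD3<Float>, gaze: SIMD3<Float>) -> RenderQuality {
    let degrees = angleBetween(tile, gaze) * 180 / .pi
    switch degrees {
    case ..<5:  return .full
    case ..<15: return .half
    default:    return .quarter
    }
}
```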

Apple Engineer Reveals the (Scary) Powerful Capabilities of the R1 Chip

A neurotechnology engineer at Apple lifted the veil on exactly how complex and somewhat scary the Vision Pro’s internal tech is. While bound by NDA, Sterling Crispin shared in a tweet how the Vision Pro tracks your eyesight and knows how you’re navigating its interface so flawlessly. Fundamentally, the R1 chip is engineered to be borderline magical at predicting a user’s eye journey and intent. “One of the coolest results involved predicting a user was going to click on something before they actually did […] Your pupil reacts before you click in part because you expect something will happen after you click,” Crispin mentions. “So you can create biofeedback with a user’s brain by monitoring their eye behavior, and redesigning the UI in real time to create more of this anticipatory pupil response.”

“Other tricks to infer cognitive state involved quickly flashing visuals or sounds to a user in ways they may not perceive, and then measuring their reaction to it,” Crispin further explains. “Another patent goes into details about using machine learning and signals from the body and brain to predict how focused, or relaxed you are, or how well you are learning. And then updating virtual environments to enhance those states. So, imagine an adaptive immersive environment that helps you learn, or work, or relax by changing what you’re seeing and hearing in the background.” Here’s a look at Sterling Crispin’s tweet.

A Broad Look at Every Sensor on the Apple Vision Pro

Sensors dominate the Vision Pro’s spatial computing abilities, and here’s a look at all the sensors Apple highlighted in the keynote, along with a few others that sit under the Vision Pro’s hood. This list isn’t complete, since the Vision Pro isn’t available for a tech teardown, but it includes every sensor mentioned by Apple.

Cameras – The Vision Pro has an estimated 14 cameras that help it capture details inside and outside the headset. Up to 10 cameras (2 main, 4 downward, 2 TrueDepth, and 2 sideways) on the outer part of the headset sense your environment in stereoscopic 3D, while 4 IR cameras inside the headset track your eyes as well as perform 3D scans of your iris, helping the device authenticate the user.

LiDAR Sensor – The purpose of the LiDAR sensor is to use light to measure distances, creating a 3D map of the world around you. It’s used in most self-driving automotive systems and in recent iPhone Pro models, where it helps with depth-aware photography and AR. On the Vision Pro, the LiDAR sensor sits front and center, right above the nose, capturing a perfect view of the world around you, as well as capturing a 3D model of your face that the headset then uses as an avatar during FaceTime.
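
Turning those distance measurements into a 3D map comes down to a standard pinhole-camera unprojection. Here’s a minimal Swift sketch of the math, with illustrative intrinsics rather than anything Apple has published:

```swift
import simd

// How a single depth reading becomes a 3D point: a standard pinhole-camera
// unprojection. The intrinsics below are illustrative, not Apple's values.
struct CameraIntrinsics {
    let fx: Float, fy: Float   // focal lengths in pixels
    let cx: Float, cy: Float   // principal point (optical center) in pixels
}

// `depth` is the distance along the camera's forward axis, in meters.
func unproject(u: Float, v: Float, depth: Float, k: CameraIntrinsics) -> SIMD3<Float> {
    let x = (u - k.cx) / k.fx * depth
    let y = (v - k.cy) / k.fy * depth
    return SIMD3<Float>(x, y, depth)
}

// Running every pixel of a depth frame through this yields a point cloud,
// and merging point clouds over time builds the 3D map of the room.
```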

IR Camera – An IR camera plays a key role by doing the job of a regular camera when a regular camera can’t. IR sensors work in absolute darkness too, giving them a significant edge over conventional cameras. That’s why the headset has 4 IR cameras on the inside, and an undisclosed number of IR cameras/sensors on the outside, to help the device see regardless of lighting conditions. The IR cameras inside the headset do a remarkable job of eye-tracking, as well as building a 3D scan of your iris for Apple’s secure OpticID authentication system.

Illuminators – While these aren’t sensors, they play a key role in allowing the sensors to do their job perfectly. The Vision Pro headset has 2 IR illuminators on the outside that flash invisible infrared dot grids to help accurately scan a person’s face (very similar to FaceID). On the inside, however, the headset has invisible LED illuminators surrounding each eye that help the IR cameras track eye movements and reactions and perform detailed scans of your iris. These illuminators play a crucial role in low-light settings, giving the IR cameras data to work with.

Accelerometer & Gyroscope – Although Apple didn’t mention the presence of these in the headset, it’s all but certain that the Vision Pro has multiple accelerometers and gyroscopes to help it track movement and tilt. Like any good headset, the Vision Pro enables tracking with 6 degrees of freedom: movement along three axes (left/right, forward/backward, up/down) plus rotation around those same axes. The accelerometer helps the headset capture these movements, while the gyroscope helps the headset understand when you’re tilting your head. These sensors, along with the cameras and scanners, give the R1 chip the data it needs to know where you’re standing, moving, and looking.
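
The classic way to blend these two sensors is a complementary filter: the gyroscope is smooth but drifts over time, while the accelerometer’s gravity reading is drift-free but noisy, so you weight the two against each other. The Swift sketch below shows this for a single pitch angle; it’s a textbook illustration, and whatever fusion the R1 actually runs is certainly far more sophisticated:

```swift
import Foundation

// Why both sensors matter: the gyroscope reports rotation smoothly but drifts;
// the accelerometer senses gravity (a drift-free "down" reference) but is noisy.
// A complementary filter blends the two into one stable tilt estimate.
struct TiltEstimator {
    private(set) var pitch: Float = 0   // radians
    let gyroWeight: Float = 0.98        // trust the gyro in the short term

    mutating func update(gyroPitchRate: Float,    // rad/s from the gyroscope
                         accel: SIMD3<Float>,     // measured gravity vector
                         dt: Float) {             // seconds since last update
        let gyroPitch = pitch + gyroPitchRate * dt
        // Axis conventions here are illustrative.
        let accelPitch = atan2(-accel.x, sqrt(accel.y * accel.y + accel.z * accel.z))
        pitch = gyroWeight * gyroPitch + (1 - gyroWeight) * accelPitch
    }
}
```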

Microphones – The Vision Pro has an undisclosed number of microphones built into the headset that serve two broad purposes – voice detection and spatial audio. Voice commands form a core part of how you interact with the headset, which is why the Vision Pro has microphones that let you perform search queries, summon apps/websites, and talk naturally to Siri. However, the microphones also need to perform an acoustic scan of your room, just the way the cameras need to do a visual scan. They do this so that the headset can match the sound to the room you’re in, delivering the right amount of reverb, tonal frequencies, etc. Moreover, as you turn your head, sounds stay anchored in place, and that acoustic map, combined with head tracking, creates a sonic illusion that allows your ears to believe what your eyes see.
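
The “sounds stay anchored” trick is conceptually simple: store the source in world coordinates, then re-express its direction in head coordinates every frame. Here’s a minimal Swift sketch of that transform, assuming the head pose comes from the headset’s tracking; all names are illustrative:

```swift
import simd

// The source lives in *world* coordinates; each frame its direction is
// re-expressed in *head* coordinates using the current head pose. Turn your
// head and the world-space source stays fixed, so its head-relative
// direction (and thus which ear hears it louder) changes.
func headRelativeDirection(worldSource: SIMD3<Float>,
                           headPosition: SIMD3<Float>,
                           headOrientation: simd_quatf) -> SIMD3<Float> {
    let toSource = worldSource - headPosition
    // The inverse rotation brings the world-space vector into head space.
    return simd_normalize(headOrientation.inverse.act(toSource))
}
```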

Other Key Components

Aside from the sensors, the Vision Pro is filled with a whole slew of tech components, from screens to battery packs. Here’s a look at what else lies underneath the Vision Pro’s hood.

Displays – Given its name, the Vision Pro obviously focuses heavily on your visual sense… and it does so with some of the most incredible displays ever seen. The Vision Pro has two stamp-sized displays (one for each eye), each boasting more pixels than a 4K television, giving the Vision Pro’s main displays a staggering 23 million pixels combined. Apple says new images are streamed to these displays within 12 milliseconds, which works out to roughly 83 frames per second. Meanwhile, the outside of the headset has a display too, which shows your eyes to people around you. While the quality of this display isn’t known, it is a curved OLED screen with a lenticular film in front of it that creates the impression of a 3D display, so people see depth in your eyes rather than just a flat image.

Audio Drivers – The headset’s band also has audio drivers built into each temple, firing rich, environmentally-responsive audio into your ears as you wear the headset. Apple mentioned that the Vision Pro has dual audio drivers for each ear, which could possibly indicate quality that rivals the AirPods Max.

Fans – To keep the headset cool, the Vision Pro has an undisclosed number of fans that help maintain optimal temperatures inside the headset. The fans are quiet, yet incredibly powerful, cooling down not one but two chips inside the headset. A grill detail on the bottom helps channel out the hot air.

Digital Crown – Borrowing from the Apple Watch, the Vision Pro has a Digital Crown that rotates to summon the home screen, as well as to toggle the immersive environment that drowns out the world around you for a true VR experience.

Shutter Button – The Digital Crown is accompanied by a shutter button that allows you to capture three-dimensional photos and videos that can be viewed within the Vision Pro headset.

Battery – Lastly, the Vision Pro has an independent battery unit that attaches to the headset using a proprietary connector. The reason the headset has a separate battery pack is to reduce the weight of the headset itself, which already uses metal and glass. Given how heavy batteries are, an external battery helps distribute the load. Apple hasn’t shared the milliamp-hour capacity of the battery, but it did mention that a full charge gives you 2 hours of usage. How the battery charges hasn’t been mentioned either.


Apple Vision Pro gets accessorised in the form of a premium leather Head Band by BandWerk

Apple has set the tech community abuzz with the announcement of its long-awaited mixed reality headset, one that’s set to change the way we interact with our world. The headset had been in development for many years, with countless patents and prototype versions marking its journey.

The next-generation headset announced at the annual WWDC 2023 conference is by far the most technically advanced headset the world has seen. The Vision Pro, backed by Apple’s software integration, makes possible a seamless transition between the real world, the virtual world, and the mixed reality interface that blends the two.

Designer: BandWerk

Scan through all the tech news lately and Apple’s surprise announcement is making all the headlines. To that end, premium iPhone case maker BandWerk is not letting go of the opportunity to grab a share of the pie with an announcement of its own. The German accessory maker for the Apple ecosystem has revealed its plans to offer handcrafted leather headbands for the $3,500 Apple headset slated for launch early next year.

The premium headband, destined to arrive in five color options – Grey, Creme, Beige, Orange, and Brown – will adapt to the silhouette of the final commercially available headset. For now, BandWerk has only revealed a concept version of the headbands, which will fill the void left by the single strap option that deep-pocketed buyers will get with the Vision Pro. According to the company, the commercially available luxury headbands will have a precise fit with maximum comfort for longer stints of VR exploration. Durability is another perk that’s meant to justify the $159 price tag.

The headband will come with a color-matching fabric Light Seal and will be crafted out of premium Italian leather. The accessory will be made in Germany and initially shipped to the United States and the United Kingdom. We can expect more accessories to surface for the Vision Pro headset as it nears public launch. For now, though, the Apple headset and this third-party headband accessory will be the privilege of the filthy rich or die-hard Apple fans who can afford to buy an exorbitantly priced gadget.


Apple Vision Pro: 5 Reasons Why The Headset May FAIL Despite Its High Popularity

Let me just get things out of the way by saying that I was one of the millions of people passionately glued to the YouTube screen watching the Apple WWDC keynote and the Vision Pro announcement. When Tim took the stage for the final time to coyly tease “One More Thing”, I felt the kind of goosebumps I’ve rarely felt before. Throughout the 40-minute-long presentation of the Vision Pro, my mind and heart were captivated by the sheer brilliance of this new device. I overwhelmingly loved everything I saw, but the tech commentator in my brain set off some instant alarms.

As brilliant as the Vision Pro announcement was, there are still FIVE broad reasons why the Vision Pro could be a big crash-and-burn for Apple. Here’s what they are:

  1. Price
  2. Human Factors
  3. VR Fatigue/Boredom
  4. Inter-Device Cannibalization
  5. Lack of Social Media Integration

Price

Months back, when the Vision Pro was merely a figment of tech commentators’ imaginations, most rumors held that the device would be priced between $3,000 and $10,000 USD and would be available ONLY to developers to help set the groundwork for a public launch. The reality, however, is that this $3,499 device was announced as a consumer product and not a developer one… and that changes things significantly. At a whopping $3,499, the Vision Pro is much too expensive for the average user… and this can have negative effects on the product’s overall success. The Vision Pro’s ridiculous price tag sets an incredibly high barrier to entry for consumers, severely limiting how many people actually use the MR headset. This in turn disincentivizes developers from constantly working on new apps, services, games, experiences, updates, etc. The lack of developer support in turn results in a flawed UX for Apple’s most elite, enthusiastic customer base. To combat this, Apple would either have to launch a cheaper, inferior product and become just another Meta, or go down the Microsoft route and simply halt new product development, the way Microsoft never released a new HoloLens after 2019.

Human Factors

The second crucial reason that could spell certain death for the Vision Pro is also closely tied to why Microsoft never developed a third HoloLens – a large chunk of the human population tends to experience incredible nausea in a VR headset. During experiments with AR headsets for military applications, Microsoft and DARPA realized that almost a third of all soldiers were becoming violently sick while wearing the HoloLens. Between 10 and 30% of humans don’t respond well to virtual reality – the blurry, pixelated, uncanny valley of VR can sometimes cause the brain to think it’s experiencing the blurred-vision symptoms of poisoning, which results in nausea and vomiting. The minute you have people adversely reacting to your product, it really doesn’t bode well for its future. Case in point: DARPA halted its plans to have military personnel wear AR glasses on the combat field.

VR Fatigue/Boredom

For the rest of the population, who don’t have a sickness problem with VR, the novelty tends to wear off pretty quickly. Almost everyone who bought an Oculus reports finding it exciting for only a few months. After this, interest in the tech plateaus, and the VR headset ends up sitting on a shelf or in the back of a cupboard, according to a significant majority of VR headset buyers. The erstwhile leader of the metaverse, Meta (Oculus), hasn’t really been able to crack this problem, despite constantly launching new experiences and games (and even announcing the Quest 3 just days ago). That being said, Apple deserves a tonne of credit for focusing so much of its time and effort on highlighting all the areas where the Vision Pro could be incredibly useful… but not everyone will be okay with the idea of perpetually having 4K screens an inch away from their eyeballs. VR fatigue and boredom are real; and even though the Vision Pro boasts an entire day’s worth of battery life when plugged in, chances are that people won’t want to be strapped into a VR headset for more than an hour a day.

Inter-Device Cannibalization

Interestingly enough, the Vision Pro’s success also depends on the failure of every other Apple device… which presents a unique stalemate for the company. Under normal circumstances, Apple’s devices encourage multitasking. You can work on your MacBook while listening to a podcast or music on your AirPods, or watch Apple TV+ while browsing TikTok on your iPhone. The devices don’t really replace each other and can coexist rather comfortably… but they can’t with the Vision Pro. The Vision Pro is designed to replace your MacBook, iPad, iPhone, Apple TV, and Apple Watch displays. Its built-in audio drivers also make AirPods redundant while you’re strapped into the headset. This effectively means all your other Apple devices are pretty much useless when you’re using the Vision Pro, and that poses a threat to Apple’s own hardware. If you’re watching a movie on the Vision Pro, you’re less compelled to buy the Apple TV 4K. If you’ve got a massive built-in virtual display, you don’t really need the Apple Pro Display XDR or the iMac, and if you can control virtual elements with finger gestures, you don’t need the iPad and you certainly don’t need the Apple Pencil. The success of the Apple Vision Pro hinges heavily on you NOT using any of Apple’s other devices… and as a result, the inverse is true too. If you’re much more comfortable using an Apple TV because you’re watching movies as a group, or if you’re working on the MacBook or iPad with the intent of collaborating with a co-worker right beside you, you’re probably not going to isolate yourself with a personalized mixed reality headset.

Lack of Social Media Integration

Here’s a statistic that still bends my brain to this day. In the year 2021, the entire world collectively watched 9.6 trillion minutes of Netflix according to data from the Wall Street Journal. That may sound like a lot, but it’s nothing compared to the staggering 22.6 trillion minutes spent watching TikTok. There’s really no debate that social media occupies an overwhelmingly higher amount of time than conventional entertainment… however, Apple didn’t highlight social media even ONCE in their entire 45-minute segment on the Vision Pro.

In the past 3 years, we’ve become so addicted and accustomed to how we browse and interact with social media on the smartphone that it’s nearly impossible to seamlessly shift that to a wearable platform. As a Quest 2 owner, I’ve opened Instagram only twice on the VR headset. Typing comments on a headset is a hassle, uploading stories/reels in a 9:16 portrait format on a virtual or mixed-reality headset just feels odd, and the smartphone has a solid reputation for rapidly creating and sharing content, while the Vision Pro doesn’t. Moreover, social media has thrived wonderfully on a small screen, so the idea of browsing TikTok on a massive virtual display really doesn’t present any major benefits.

Now Apple DOES have until next year to figure out how to overcome these problems, given that the Vision Pro doesn’t hit stores until 2024. This also gives Apple enough time to really gauge consumer and developer feedback, and adapt accordingly. More so than ever, it’ll also be interesting to see what the Vision Pro does for the metaverse, which has kind of been on life support up until now. Someone check in on Zuckerberg too, while we’re at it…


Apple Vision Pro Just Brilliantly Destroyed Meta’s Entire Hardware Business… And Possibly Even Its Own

I’d really hate to be Mark Zuckerberg right now. In October 2021 he pivoted to the metaverse, only to pivot to AI in November 2022. Now, Apple’s Vision Pro stole his massive lead with a product so revolutionary, it’s probably going to crush his entire hardware ambitions.

Apple just announced the Vision Pro, an entirely new revolutionary product category, with a Mixed Reality headset that champions what they call “spatial computing” – an upgrade from the personal computing abilities of the laptop and smartphone. The brilliance of this is that it singlehandedly has the potential to redefine and reinvigorate the metaverse. The tragedy is that it also simultaneously kills all of Apple’s other businesses. The Vision Pro’s technical genius deserves an entire article on its own, but for now let’s just focus on exactly how magical this new product is, and what it means for Apple as a hardware company.

One More Thing…

Just as the WWDC keynote was coming to a close, Tim Cook, with a twinkle in his eye, uttered the same words that Steve Jobs did when he unveiled Apple’s most revolutionary product – the iPhone. Sixteen years on, Cook’s reutterance of those words promises to disrupt the entire tech industry all over again. The Vision Pro is an MR headset that brings an entirely new category to Apple’s product offering. In short, it has two Apple Silicon chipsets (including an M2 chip), dozens of cameras and sensors, an iris recognition system that scans your eye for biometrics, directional audio units in the strap, two postage-stamp-sized 4K screens on the inside for immersive viewing, and a curved OLED display with a lenticular layer that lets other people see your eyes while you’re wearing the headset. That’s just the short version.

Apple’s Greatest Device Yet

The Vision Pro turns your world into a computing device. You can work, play, watch movies, view 3D content, FaceTime with friends and family, and access every app on the App Store through it. There’s quite literally nothing you cannot do on the Vision Pro, which is what makes it such an incredible device. In fact, just announcing it and its features took up nearly an hour of the WWDC live stream, highlighting exactly how important it is to Apple’s future. In Tim Cook’s version of the future, the Vision Pro replaces computing devices entirely. You don’t need laptops, phones, watches, or even VR controllers to interact with the digital world. The Vision Pro handles your laptop or desktop’s abilities, allowing you to make presentations, write emails, edit files, and do practically anything on a massive virtual canvas. Similarly, you don’t need a phone or tablet when all your phone/tablet apps are available on the Vision Pro. When you’re relaxing, the Vision Pro gives you a massive screen to watch movies and TV shows, or even view 3D content and panoramic images immersively.

How the Vision Pro Redefines Computing

The Vision Pro’s interface isn’t really an interface anymore… It’s your entire world (or as Apple calls it, visionOS). Everything you see is a canvas for a rectangular window. You can simultaneously have your work screen, a Pinterest board, and Ted Lasso existing within your visual periphery. Each element occupies 3D real estate in your vision and isn’t bound by a screen. You can select, layer, resize, or move elements of your world simply by using your hands, eliminating the need for a controller. And with a simple turn of a knob (or a crown), you can choose to see the world around you or immerse yourself in a digital realm while still staying connected to your surroundings.

How the Vision Pro Redefines Interaction

A screen on the front of the Vision Pro acts as your digital eyes (or what Apple calls EyeSight), so that when people are talking to you, they see your eyes. If you’re immersed in content, your eyes aren’t made visible on the screen, so they know not to disturb you – it’s a lot like how people know you’re not engaging with them if you’re not making eye contact. However, if they need to grab your attention while you’re in an immersive experience (like a movie), they can merely step close to you, and EyeSight kicks in. They suddenly become visible to you within your headset, and your eyes become visible to them. It’s an impressive handshake of multiple different technologies that resulted in Apple filing as many as 5000 patents for the Vision Pro device.

Meta is Royally Screwed

As impressive as Zuckerberg’s Meta Quest Pro is, it really doesn’t hold a candle to Apple’s Vision Pro. The Apple Vision Pro is an incredibly meticulously designed product that runs on not one but TWO chipsets – an M2 chip and a new R1 chip that handles just how digital elements react to your physical world. It’s got two 4K-class screens on the inside that together cram as many as 23 million pixels into postage-stamp-sized panels – the equivalent of 64 pixels in the space occupied by 1 pixel on the iPhone screen. The outside of the device has a screen too (a lenticular 3D one, no less) that projects your eyes so that people can make eye contact with you while you have the headset on. As far as sensors go, the Vision Pro has one LiDAR scanner, two TrueDepth cameras, two main cameras, four downward cameras, two side cameras, and two IR illuminators… just on the outside. The inside has four IR cameras and multiple invisible LED illuminators that track your eyes, letting you use them as a cursor. Your hands become the controls, allowing you to tap, pinch, and manipulate elements that your eyes look at. This entire interaction is so seamless and nuanced that you don’t need a remote or VR controllers. Oh, and did I mention the Vision Pro uses OpticID, a new authentication system that scans your eyes, which Apple positions as even more secure than TouchID and FaceID? Even Meta’s highest-end device (which is roughly 1/3rd the price of the Vision Pro) doesn’t have anywhere near as much impressive tech as the Vision Pro… and if I were Zuckerberg, I’d honestly be crying in a corner right now, because in Meta’s own metaverse… they’re in second place.

An alliance with Disney

Strangely enough, the one person who shared the stage with Tim Cook was Disney CEO Bob Iger, who promised some great new partnerships between the world’s biggest tech company and the world’s biggest entertainment company. Disney’s entertainment offerings are now going to be front and center on Apple’s Vision Pro, with a tight partnership between the two giants to make entertainment more immersive. This announcement also falls in line with Disney’s 100-year anniversary, going to show exactly how much Disney has to offer its fans through the Vision Pro. Strangely enough, this core focus on entertainment excludes one major platform – social media. The Vision Pro doesn’t really do much to enhance how people interact with apps like Instagram, Facebook, or TikTok, which feels like yet another one-two punch aimed at Meta and Zuckerberg…

Apple may have shot itself in the foot too

Aside from its whopping $3499 price, the Vision Pro does something absolutely unique, in that it replaces every single other Apple device. When you’re strapped into the headset, you’re pretty much never going to look at an iPhone, MacBook, Apple Watch, iMac, or TV. Heck, you’re not even going to wear AirPods… and needless to say, that’s bad for Apple. The Vision Pro is such a strangely isolating experience that it stops you from using Apple’s other hardware devices… and that’s absolutely new. You can use your iPhone simultaneously with a MacBook, AirPods, Apple Watch, etc… but when you’re wearing the Vision Pro, every single other Apple device becomes unnecessary. Spatial computing is great for the Vision Pro, but it’s terrible for all of Apple’s other devices… and this poses an incredibly interesting threat to Apple’s hardware endeavors. Sure, if the Vision Pro takes off, Meta is absolutely, royally, wholeheartedly screwed because there’s no reason someone who wants a Vision Pro would settle for a Quest 2 or 3. However, it’ll be interesting to see if people who buy the Vision Pro ever buy a single other Apple computer like a MacBook, iPad, or Apple TV unit.
