Apple Vision Pro’s first accessory might be a protective cover

If you’re excited about Apple’s upcoming mixed reality headset, the Vision Pro, you’ll be pleased to know that Apple has reportedly won 70 more design patents, some of which may turn into official accessories. With the headset set to make its debut in early 2024, early adopters are also looking at what else they can add to enhance the experience of using it. And if you like protecting your gadgets, Apple may be releasing an official cover for it too.

Designer: Apple

One of the patents Apple has reportedly won is for a cover for its upcoming “spatial computing” device. It uses a soft-touch woven fabric and is designed to blend in with the device so it won’t look too bulky or out of place. Its main function is to protect the most vulnerable parts of the gear: it guards the front and sides of the headset against scratches and debris, while leaving the back and the Light Seal uncovered so it won’t interfere with your viewing and computing experience.

The cover will keep the EyeSight display protected without hindering the user from seeing what needs to be seen. There is also a tab on the left side that makes it easy to remove the cover when you don’t need it while using the headset. Based on some product renders, the cover seems to integrate well with the Vision Pro and may even make it look cooler.

We can expect a slew of accessories for the upcoming Apple device that should make AR and VR more accessible and usable for the average person. Whether they’re “official” ones from Apple or from popular third-party accessory brands like CASETiFY, expect to be flooded with these add-ons even before you decide whether you’re getting a Vision Pro or not.

Best of Tech Design – Yanko Design Awards 2023

It’s that time of year again when we take stock of the past 12 months and reflect on the ups and downs in order to get a brief glimpse of what lies ahead. It has been a pretty active year for the tech industry in many segments, from companies recovering or folding from the events of the past 3 years to the explosion of AI-powered services. Of course, there has also been a torrent of new designs and product types, such as the gaming handheld fever that’s gripping the PC market. In other words, there has been a great storm swirling in the tech and consumer electronics world, and now that the dust has finally settled, we’re taking a look at some of the best product designs that you can grab today or in just a few months.

Best Mixed Reality Headset – Apple Vision Pro

After years of speculation and waiting, Apple finally revealed its hand and showed the world its vision for its mixed reality platform. Of course, that includes the hardware that will become the gateway to that reality, the Vision Pro. In typical Apple fashion, the headset isn’t just a rehash of any old VR/AR gear and is specifically designed not only to showcase the power of Apple’s Spatial Computing but also to provide convenience and comfort in all aspects.

Designer: Apple

Why it’s noteworthy

As expected from any Apple product, the Vision Pro has a minimalist, understated design where every part is made with meticulous attention to detail. In addition to luxurious, comfortable materials, the headset is designed to be lightweight and well-balanced, allowing for long hours of use without straining the head. It was also made to immerse the wearer in a virtual world without taking them away from the real world, especially when communicating with others by making eye contact. Apple’s visionOS platform is carefully tailored to mix the real and the virtual seamlessly. Admittedly, the pricing for the Vision Pro is going to be high, but that’s also on par with any high-quality product that Apple makes.

Best Gaming Handheld – Lenovo Legion Go

Although some manufacturers have been churning out portable gaming PCs for years now, things really heated up when the big names in the industry started jumping in. Valve got the ball rolling with the Steam Deck, followed closely by the ASUS ROG Ally. Lenovo may just be the latest to jump on board, but thanks to both its expertise and the advantage of hindsight, the Lenovo Legion Go is pretty much ahead of the game not just in terms of specs but also in design.

Designer: Lenovo

Why it’s noteworthy

Yes, the Legion Go looks big, but mostly because it has the biggest screen among handheld gaming PCs in the market today. Its removable controllers give the perfect Nintendo Switch vibe, but the FPS mouse mode also delivers a new and unique way to play games. The high-performance hardware leaves few complaints, aside from the usual battery life, and the built-in kickstand removes the need for a separate accessory. There’s definitely still room for improvement, so it’s actually exciting to see what Lenovo has planned next for this device category.

Best Foldable Phone – OPPO Find N3

Foldable phones are becoming mainstream to the point that there might now be too many to choose from. While Samsung still has the lion’s share of people’s attention, it has long relinquished being the best in this market. It has easily let its rivals pass it by, and OPPO, which is relatively new to the scene, has just produced what is pretty much the best foldable phone you can buy (depending on where you are).

Designer: OPPO

Why it’s noteworthy

The OPPO Find N3 combines not just the best specs but also the best designs currently available in this category. It folds flat, has stylish color options, and has a more normal, wider shape that makes it more usable as a regular phone when folded. And it doesn’t skimp on the cameras, which is something that most foldable phones still have a hard time getting right. This particular model might not be available in all markets, but the OnePlus Open is a dead ringer and might be available in countries where the OPPO Find N3 isn’t.

Best Laptop/Portable PC – HP Envy Move

When thinking of “portable computers,” most will probably think of laptops since those are the most common designs available. They’re not always the best, though, especially when you consider the ergonomics involved. Desktop PCs, however, are just too large and bulky to be portable; even thin all-in-one PCs are firmly rooted to desks. HP is challenging that status quo with a unique PC that is as portable as a laptop yet as usable as an all-in-one desktop.

Designer: HP

Why it’s noteworthy

The HP Envy Move is, for all intents and purposes, an AIO or All-in-One PC that was designed to be carried around, especially thanks to a built-in foldable handle and a pocket for storing a wireless keyboard. Its large 23.8-inch screen has plenty of room for work and entertainment, and you won’t have to crane your neck down every time you use it. Yes, it’s probably going to be awkward to carry it with you everywhere, but when you have a few fixed places to set up work or sit down for a movie, this computer will give you the full desktop experience in no time flat.

Best Portable Power Station – Bluetti AC500 + B300S

A stable supply of electricity has almost become a luxury these days. Power grids can go down without prior notice, or you might find yourself spending days and nights outdoors. Gas generators no longer cut it and are dangerous liabilities rather than assets, but there is, fortunately, a large selection of safer and greener battery-powered generators available today. Perhaps too many, in fact, since they come in all sizes and capacities, but Bluetti’s latest modular AC500 inverter and B300S expansion batteries offer the perfect flexibility for all your power needs.

Designer: Bluetti

Why it’s noteworthy

The modular design of the Bluetti AC500 means you can stack as many as four of these 3,072Wh batteries if you need to, enough to power a home for a day or two. Or you can bring only the inverter and one battery module if you just need to go camping over the weekend. Best of all, you can charge the batteries using solar power, so you don’t even have to worry about power outlets in case of emergencies or when you’re stuck outside. It’s a great way to enjoy the conveniences of modern life by having access to safe and sustainable power at any time.

Best Powerbank – Anker 737 GaN Prime 24K

Our dependence on computers and smartphones becomes pretty obvious when we start scrambling for a charger and find no power outlet available. There’s a variety of portable batteries, a.k.a. power banks, available today, but not all of them deliver the power you need to charge multiple devices, let alone a power-hungry laptop. Anker is one of the leading brands in this space, and it has a solution that addresses that need without forcing you to carry a veritable brick in your bag.

Designer: Anker

Why it’s noteworthy

The Anker 737 GaN Prime 24K offers portability and power in a compact package. It has enough output and battery capacity to charge a MacBook, for example, which means smartphones, accessories, and even drones are no sweat at all. GaN (Gallium Nitride) technology makes charging not only faster but also safer, while a helpful LCD shows all the stats you need to keep tabs on the power bank itself as well as the devices it’s charging. The block is admittedly chunky, but compared to power banks of similar capacity, it’s surprisingly compact and light, making it the perfect all-around partner for any mobile worker.

Best Smart Speaker – Sonos Era 300

The hype around smart speakers may seem to have died down, but that’s mostly because they have become almost too common. Every modern Bluetooth-enabled speaker these days has some smarts now, leaving manufacturers more freedom to explore other designs. Sonos is one of the pioneers in the wireless speaker space, and it continues to push through with new designs even in the face of unrelenting competition. Its new Sonos Era 300 this year makes an impact not only with its design but also with its commitment to the environment.

Designer: Sonos

Why it’s noteworthy

The Sonos Era 300 has a distinctive shape that almost looks like a flattened speaker icon, with a front half that flares out and a back that is more or less uniform in size. That shape isn’t accidental, as it allows the multitude of woofers and tweeters to be positioned for maximum efficiency and performance. What really makes the Sonos Era 300 extra special, however, is the attention it gives to sustainability, choosing its materials carefully, lowering power consumption, and making repairs quick and easy in order to reduce its negative impact on the environment.

Best Wireless Earbuds – Sony WF-1000XM5

Apple may have made wireless earbuds popular, but it is far from being the only game in town. When it comes to design and especially sound quality, the AirPods are easily eclipsed by products from more experienced brands. Reclaiming its foothold in this audio space, Sony has launched a new pair of high-end buds that truly immerse you in your favorite music, regardless of your environment.

Designer: Sony

Why it’s noteworthy

The Sony WF-1000XM5 boasts one of, if not the, best noise-canceling experiences that tiny buds can offer. It even uses some AI special sauce to reduce noise interference when you’re making calls, allowing for clear and crisp audio on both ends of the line. Best of all, its minimalist design doesn’t call attention to itself, but its stylish appearance won’t embarrass you if people do notice.

Best Robot Vacuum Cleaner – SwitchBot S10

We have long gone past the days when robot vacuum cleaners only vacuumed floors and still left everything else to humans. Now they can dump their dirt into bins on their own and even wipe the floor with a mop when needed. Human intervention can’t be removed completely, like in taking out the dust bin or refilling water in the tank, but the SwitchBot S10 further reduces the need for manual interaction by making the robot smart enough to dump its own dirty water and stock up on clean water all by itself.

Designer: SwitchBot

Why it’s noteworthy

The small and narrow auto-emptying station of the SwitchBot S10 is rather deceptive, making you think it’s less capable than its larger competitors. But that’s because the exciting action happens away from that bin and at the separate water station that lets the robot vacuum exchange dirty water for clean water using the same pipes you already have for drains and faucets. It can even refill a humidifier’s water tank on its own, presuming it’s SwitchBot’s humidifier, of course. These almost completely automated processes reduce the need to get your hands dirty, literally, and reduce the risk of getting contaminated and sick from handling waste.

Best Smartwatch – Apple Watch Ultra 2

Just like with smart speakers, the smartwatch market has more or less normalized by now after a long struggle to carve out its niche. Designs may no longer be changing drastically, but innovation hasn’t stopped completely either. Now it’s a race to put in the best health sensors you can cram into such a small space, but Apple has long been ahead in that race, and the Apple Watch Ultra 2 cements its lead even further.

Designer: Apple

Why it’s noteworthy

The Apple Watch Ultra series delivered what users have been asking for for a long time: more space to see the information they need at a glance without having to tap their way through screens. The Ultra 2 also adds a new double tap gesture so you won’t even have to use your other hand for basic actions like stopping a timer, ending a call, or, better yet, taking a photo with your iPhone. It also introduces Apple’s first carbon-neutral products, offsetting the negative impact of its smartwatches’ production with “carbon credits” from nature-based projects.

Apple Vision Pro Air Typing experience takes a small step toward usability

It’s truly mind-blowing to see virtual objects floating before our eyes, but the magic and illusion start to break down once we try to manipulate those objects. Input has always been a tricky subject in mixed reality, either because we can’t see our actual hands or we can’t feel what we’re supposed to be touching, which is physically nothing. Until the perfect haptic feedback gloves become a reality, we have to make do with tricks and workarounds to make input less awkward and more convenient. That’s especially true with typing on air, and Apple is apparently using some special techniques to offer a more usable experience on the Vision Pro mixed reality headset.

Designer: Apple (via Brian Tong)

Apple’s first teaser for the Vision Pro headset and visionOS platform didn’t show typing of any sort. It focused, instead, on icons, windows, and menus, virtual 3D objects that are easier to interact with using hand gestures. Of course, sooner or later you will be faced with the need to input text, and the usual method of voice recognition won’t always cut it. visionOS, fortunately, does include a virtual floating keyboard like other VR systems, but the way you use it is quite special and, to some extent, ingenious.

For one, you can interact with the keyboard like you would any part of the Vision Pro’s interface, which is to look at the UI element to focus on it and then use hand gestures. In this case, pinching a letter is the equivalent of selecting it, just like what you’d do for menu items or icons in visionOS. It makes the gesture grammar consistent, but it’s also an awkward way to type.

You can also “peck” at the keys with your fingers, making you feel like you’re typing on air. The difference the Vision Pro makes, however, is that it tricks your eyes into believing you’re actually pressing down on those keys. Thanks to Apple’s flavor of spatial computing, hovering your real-world finger over a virtual key makes that key glow, and tapping it triggers an animation that looks like the key is actually moving down, just like on a real keyboard. There’s also a click sound, similar to the sound effect you’d normally hear from an iOS virtual keyboard, to complete the audiovisual illusion.
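
Apple hasn’t published how the visionOS keyboard is actually built, but the hover-glow and press-down feedback described above maps closely onto standard SwiftUI affordances. Below is a minimal, hypothetical sketch of a single virtual key with those cues; the view names, sizes, and animation values are assumptions for illustration, not Apple’s implementation.

```swift
import SwiftUI

/// A single virtual key, sketching the feedback described above. This is NOT
/// Apple's actual keyboard: the hover highlight comes from SwiftUI's hoverEffect,
/// and the "press" is faked by scaling and nudging the key.
struct VirtualKey: View {
    let letter: String
    let onType: (String) -> Void

    var body: some View {
        Button {
            // A real app would also play a short click sound here to complete the illusion.
            onType(letter)
        } label: {
            Text(letter)
                .font(.title2)
                .frame(width: 56, height: 56)
                .background(.regularMaterial, in: RoundedRectangle(cornerRadius: 12))
        }
        .buttonStyle(PressDownKeyStyle())
        .hoverEffect(.highlight) // the key lights up when a finger (or gaze) hovers over it
    }
}

/// Shrinks and nudges the key while pressed to mimic key travel; a true spatial
/// app might instead push the key back along the z-axis.
struct PressDownKeyStyle: ButtonStyle {
    func makeBody(configuration: Configuration) -> some View {
        configuration.label
            .scaleEffect(configuration.isPressed ? 0.92 : 1.0)
            .offset(y: configuration.isPressed ? 2 : 0)
            .animation(.easeOut(duration: 0.08), value: configuration.isPressed)
    }
}
```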

Of course, your fingers aren’t actually hitting anything physical, so there’s still a disconnect that will probably confuse your brain. The visual effect, which is really only possible thanks to spatial computing, is still an important step forward in helping our minds believe that there’s a “real” three-dimensional object, in this case, a keyboard, right in front of us. It’s not going to be the most efficient way to input text, but fortunately, you can connect a wireless keyboard to the Vision Pro and you’ll be able to see your actual hands typing away on it.

Is the Apple Watch Series 9 secretly going to become the new Controller for the Vision Pro headset?

As Apple revealed the latest fleet of the Apple Watch collection, one feature stood out as the most remarkable as well as the most intriguing. The Watch Series 9 and Watch Ultra 2 both boasted a new gesture input – the ability to tap your fingers twice to register a button press. This works remarkably well if your hands are occupied or dirty, letting you answer or end calls, snooze alarms, play or pause music, and even trigger your iPhone’s shutter simply by tapping your index finger and thumb together… without touching your Apple Watch at all. Sounds impressive, but also sounds extremely familiar, doesn’t it? Because tapping your fingers is exactly how the Apple Vision Pro registers click inputs too.

Designer: Apple

When Apple debuted the Vision Pro at WWDC in June, their biggest claim was that the Vision Pro was an entirely controller-free AR/VR headset, letting you manipulate virtual objects using just your hands. However, news emerged that Apple was, indeed, figuring out a traditional controller substitute that would be much more reliable than just human hands. It seems like the Apple Watch could be that perfect alternative.

The Watch Series 9 and Watch Ultra 2 were unveiled this year with a few standout upgrades. The Series 9 now reaches 2,000 nits of peak brightness, double last year’s, while the Ultra 2 goes up to 3,000 nits. Both rely on the new S9 SiP (the watch’s dedicated chipset), which now runs Siri locally on the device without relying on the internet. The watches are also accompanied by new bands, including the FineWoven fabric that now replaces all leather accessories in Apple’s catalog… but more importantly, both the Watch Series 9 and Watch Ultra 2 accept the new finger-tapping gesture, which performs the primary action in whatever app is active. The feature is due to roll out next month as Apple calibrates how it works… but the implications go beyond just the watch. In fact, the Watch could be the secret controller the Vision Pro truly needs to enhance its Spatial Computing experience.

Sure, the Vision Pro has multiple cameras that track your environment while also keeping an eye on your hands to see where you’re pointing, tapping, and pinching. The big caveat, however, is any situation where the Vision Pro CAN’T see your hands. If you’ve got your hands under a table, in your pocket, or behind your back, the Vision Pro potentially wouldn’t be able to recognize your fingers clicking away… and that’s a pretty massive drawback for a $3,500 device. Potentially, though, the Apple Watch helps solve that problem by detecting finger taps… although only on one hand.

The ‘Double Tap’ feature on the watch relies on the S9 SiP. The chipset uses machine learning to interpret data from the accelerometer, gyroscope, and optical heart sensor to detect when you tap your fingers twice. The feature only works with the hand that’s wearing the Watch (you can’t tap your right-hand fingers while the Watch is on your left wrist), but even that’s enough to solve the Vision Pro’s big problem. Moreover, the new Ultra Wideband chip on the watch can help with spatial tracking, letting your Vision Pro know when your hands are in sight and when they aren’t. While Apple hasn’t formally announced compatibility between the Watch and the Vision Pro, we can expect more details when Apple’s spatial computing headset formally launches next year. The Vision Pro could get its own dedicated keynote event, or it could be folded into the new iPad/MacBook announcements that often happen at the beginning of the calendar year.
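
Apple hasn’t detailed the model behind Double Tap, but the sensor-fusion idea is easy to illustrate. The Core Motion sketch below is a deliberately naive stand-in: it just flags two sharp wrist-acceleration spikes close together in time, with arbitrary thresholds, rather than the on-device machine learning Apple actually uses.

```swift
import CoreMotion
import Foundation

/// Toy stand-in for the Watch's Double Tap detection. Apple's real feature fuses
/// accelerometer, gyroscope, and heart-sensor data through an on-device ML model;
/// this sketch only looks for two sharp acceleration spikes within a short window.
final class NaiveDoubleTapDetector {
    private let motion = CMMotionManager()
    private var lastSpike: Date?
    private let spikeThreshold = 1.8          // g-force threshold, picked arbitrarily
    private let tapWindow: TimeInterval = 0.4 // max gap between the two taps, in seconds

    func start(onDoubleTap: @escaping () -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 100.0
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let a = data?.userAcceleration else { return }
            let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
            guard magnitude > self.spikeThreshold else { return }

            // A real detector would also debounce samples from a single tap.
            let now = Date()
            if let last = self.lastSpike, now.timeIntervalSince(last) < self.tapWindow {
                self.lastSpike = nil
                onDoubleTap() // e.g. answer a call or pause music
            } else {
                self.lastSpike = now
            }
        }
    }

    func stop() { motion.stopDeviceMotionUpdates() }
}
```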

What if Instagram Went Spatial? Unofficial UI on Apple Vision Pro Shows How

Unofficial Instagram UI for Apple Vision Pro

The internet sure has a short memory. It’s barely been three months since Apple debuted the Vision Pro, and it pretty much looks like we’ve entirely forgotten about it. However, people experimenting with the developer kit seem to be incredibly impressed with its underlying tech (some even let out audible gasps when they tried the Vision Pro). So while the hardware is still a while away from officially hitting the shelves, it’s safe to say that developers are excited to build spatial-ready versions of their apps, platforms, websites, and games. Last month we looked at an unofficial Spotify UI for the Vision Pro, and now we’ve got a taste of what Instagram could look like through Apple’s headset.

Designer: Ahmed Hafez

Visualized by Cairo-based designer Ahmed Hafez, this Instagram UI comes with neutral frosted glass elements that allow the content to stand out against the background. This approach works rather wonderfully in the spatial world as the contrast allows you to easily see text and elements whether you’re in an illuminated space or even a dimly lit one. Theoretically, it looks like Apple may have ended the “light-mode/dark-mode” UI debate by just making everything frosted.

The interface looks a lot like Instagram’s desktop (and, more recently, iPad) interface. It’s wider than its mobile counterpart and comes with menus on the left and content on the right. You can view stories in the upper carousel, or move higher up to access follow requests, close friends, notifications, and DMs.

The fix for the light vs. dark issue is present in the interface too. While the glassy elements don’t change color, you can alternate between white and black text for better visibility. The interface, however, isn’t traditionally landscape. It’s still quite vertical, which is perfect for spatial computing because you can simply move it to the side and keep other tabs or apps open – a promise Apple made rather clearly in its WWDC keynote.
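
On the developer side, visionOS exposes this frosted look as a system material, so a concept like this doesn’t require custom rendering. Here is a minimal SwiftUI sketch of how a hypothetical spatial feed panel might adopt it; the layout and view names are invented for illustration and aren’t Instagram’s actual app.

```swift
import SwiftUI

/// Hypothetical spatial feed panel. The frosted look comes from visionOS's system
/// glass background, which keeps content legible in bright or dim rooms without
/// needing separate light and dark modes.
struct SpatialFeedPanel: View {
    var body: some View {
        NavigationSplitView {
            List {
                Label("Home", systemImage: "house")
                Label("Search", systemImage: "magnifyingglass")
                Label("Messages", systemImage: "paperplane")
            }
        } detail: {
            ScrollView {
                // Stories carousel and feed posts would go here.
                Text("Stories and posts")
                    .font(.title3)
                    .padding()
            }
        }
        .glassBackgroundEffect() // visionOS frosted-glass backdrop behind the whole panel
    }
}
```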

The Vision Pro is still at least half a year away from formally being available to consumers, although rumors say that Apple’s seeing quite a few roadblocks with its production and plans on cutting the number of production units drastically from its original 400,000 units down to 150,000. That being said, the company isn’t giving up on the idea any time soon, and the Vision Pro is mainly paving the way for a Vision Air device that will be much more affordable. Before that happens, though, it’s important for developers to create a strong app ecosystem to justify the shift from physical computing to spatial computing. This fan-made IG interface is the first step in that direction!

Unofficial Spotify Interface for the Apple Vision Pro brings ‘Spatial Computing’ to the music app

While the highly-anticipated headset is still a ways away, a designer has created a conceptual Spotify interface for the Apple Vision Pro and it looks absolutely stunning. With three panels filling your entire periphery, Kyuna Petrova’s Spotify UI lets you browse music, view album art, set your queue, see artist info, and play/pause/seek music all at once. The interface also features the signature blurred glass background that changes hues with your environment, creating an immersive app experience that feels incredibly real and within your palm’s reach!

Designer: Kyuna Petrova

Petrova’s Spotify UI for spatial computing is quite different from Spotify’s own desktop computing interface. The larger display in virtual reality offers much more visual real estate for digital assets. The entire UI can be split into four blurred glass canvases – one taskbar on the left, one control bar at the bottom, a main canvas for browsing music, and another for displaying the album art of the active song along with the artist bio. This is an extension of Spotify’s desktop app/website but with just more meat in the sauce. The result breaks the boundaries of the traditional rectangular display, creating something much more immersive.

While the interface doesn’t have a ‘dark mode’, the blurred glass instinctively shifts hues depending on your environment. In a well-lit space, the UI stays light, but in a darker area, it transforms into a dark UI that’s high on contrast and just glorious to look at. It’s a shame Petrova didn’t show what the app looks like when resized, but I imagine the album art and playback control bar take center stage. I’d also love to see immersive visualizations that fill your environment with psychedelic dynamic art as the music plays too. After all, why would I want to stare at a living room when I could look at responsive motion graphics that dance to my music?!

5 Ways Spatial Computing Will Succeed and 5 Ways It Will Flop

As if we haven’t had our fill of buzz-worthy terms like “eXtended Reality” or the “Metaverse,” Apple came out with a new product that pushed yet another old concept into the spotlight. Although the theory behind spatial computing has been around for almost two decades now, it’s one of those technologies that needed a more solid implementation from a well-known brand to actually hit mainstream consciousness. While VR and AR have the likes of Meta pushing the technologies forward, Apple is banking more heavily on mixed reality, particularly spatial computing, as the next wave of computing. We’ve already explained what Spatial Computing is and even took a stab at comparing the Meta Quest Pro with the new kid on the block, the Apple Vision Pro. And while it does seem that Spatial Computing has a lot of potential in finally moving the needle forward in terms of personal computing, there are still some not-so-minor details that need to be ironed out first before Apple can claim complete victory.

Designer: Apple

Spatial Computing is the Future

As a special application of mixed reality, Spatial Computing blurs the boundaries between the physical world and the applications that we use for work and play. But rather than just having virtual windows floating in mid-air the way VR and AR experiences do it, Apple’s special blend of spatial computing lets the real world directly affect the way these programs behave. It definitely sounds futuristic enough, but it’s a future that is more than just fantasy and is actually well-grounded in reality. Here are five reasons Spatial Computing, especially Apple’s visionOS, is set to become the next big thing in computing.

Best of Both Realities

Spatial computing combines the best of VR and AR into a seamless experience that will make you feel as if the world is truly your computer. It doesn’t have the limitations of VR and lets you still see the world around you through your own eyes rather than through a camera. At the same time, it still allows you to experience a more encapsulated view of the virtual world by effectively dimming and darkening everything except your active application. It’s almost like having self-tinting glasses, except it only affects specific areas rather than your whole view.

More importantly, spatial computing doesn’t just hang around your vision the way AR stickers would. Ambient lighting affects the accents on windows, while physical objects can change the way audio sounds to your ears. Virtual objects cast shadows as if they’re physically there, even though you’re the only one who can see them. Given this interaction between physical and virtual realms, it’s possible to have more nuanced controls and devices in the future that will further blur the boundaries and make using these spatial apps feel more natural.
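
For what it’s worth, visionOS exposes part of this real-virtual interplay to developers through RealityKit. The minimal sketch below places a virtual cube that casts a grounding shadow onto real surfaces, one of the cues mentioned above; the entity, size, and material are placeholders.

```swift
import SwiftUI
import RealityKit

/// Minimal RealityKit sketch: a virtual cube configured to cast a "grounding"
/// shadow onto real-world surfaces, one of the cues that makes spatial objects
/// feel physically present.
struct ShadowedCubeView: View {
    var body: some View {
        RealityView { content in
            let cube = ModelEntity(
                mesh: .generateBox(size: 0.2), // 20 cm cube, placeholder size
                materials: [SimpleMaterial(color: .white, isMetallic: false)]
            )
            cube.components.set(GroundingShadowComponent(castsShadow: true))
            content.add(cube)
        }
    }
}
```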

Clear Focus

The term “metaverse” has been thrown around a lot in the past years, in no small part thanks to the former Facebook company’s marketing, but few people can actually give a solid definition of the term, at least one that most people will be able to understand. To some extent, this metaverse is the highest point of virtual reality technologies, a digital world where physical objects can have some influence and effect. Unfortunately, the metaverse is also too wild and too amorphous, and everyone has their own idea or interpretation of what it can or should be.

In contrast, spatial computing has a narrower and more focused scope, one that adds a literal third dimension to computing. Apple’s implementation, in particular, is more interested in taking personal computing to the next level by freeing digital experiences from the confines of flat screens. Unlike the metaverse, which almost feels like the Wild West of eXtended reality (XR) these days, spatial computing is more content on doing one thing: turning the world into your desktop.

Relatable Uses

As a consequence of its clearer focus, spatial computing has more well-defined use cases for its futuristic-sounding features. Apple’s demo may have reminded some of scenes from the film Minority Report, but the applications shown are more mundane and more familiar. There are no mysterious and expensive NFTs or fantastic walks around Mars, though the latter is definitely possible. Instead, you’re greeted by familiar software and experiences from macOS and iOS, along with the photos, files, and data that you hold dear every day.

It’s easy enough to take this kind of familiarity for granted, but it’s a factor that sells better over a longer period of time. When the novelty of VR and the metaverse wears off, people are left wondering what place these technologies will have in their lives. Sure, there will always be room for games and virtual experiences that would be impossible in the physical world, but we don’t live in those virtual worlds most of the time. Spatial computing, on the other hand, will almost always have a use for you, whether it’s entertainment or productivity, because it brings all-too-familiar personal computing into the three-dimensional physical world.

Situational Awareness

One of the biggest problems with virtual reality headsets is that they can’t really be used except in enclosed or safe spaces, often in private or at least with a group of trusted people. Even with newer “passthrough” technologies, the default mode of devices like the Meta Quest Pro is to put you inside a 360-degree virtual world. On the one hand, that allows for digital experiences that would be impossible to integrate into the real world without looking like mere AR stickers. On the other hand, it also means you’re shutting out other people and even the whole world once you put on the headset.

The Apple Vision Pro has a few tricks that ironically make it more social even without the company mentioning a single social network during its presentation. You can see your environment, which means you’ll be able to see not only people but even the keyboard and mouse that you need to type an email or a novel. More importantly, however, other people will also be able to see your “eyes” or at least a digital twin of them. Some might consider it gimmicky, but it shows how much care Apple gives to those subtle nuances that make human communication feel more natural.

Simpler Interactions

The holy grail of VR and AR is to be able to manipulate digital artifacts with nothing but your hands. Unfortunately, current implementations have been stuck in the world of game controllers, using variants of joysticks to move things in the virtual world. They’re just a step away from using keyboards and mice, which creates a jarring disconnect between the virtual objects that look almost real in front of our eyes and the artificial way we interact with them.

Apple’s spatial computing device simply uses hand gestures and eye tracking to do the same, practically taking the place of a touchscreen and a pointer. Although we don’t actually swipe to pan or pinch to zoom real-world objects, some of these gestures have become almost second nature thanks to the popularity of smartphones and tablets. It might take a bit of getting used to, but we are more familiar with the direct movements of our hands than with memorizing buttons and triggers on a controller. It simplifies the vocabulary considerably, which places less burden on our minds and helps reduce anxiety when using something shiny and new.

Spatial Computing is Too Far into the Future

Apple definitely turned heads during its Vision Pro presentation and has caused many people to check their bank accounts and reconsider their planned expenses for the years ahead. As expected of the iPhone maker, it presented its spatial computing platform as the next best thing since the invention of the wheel. But while it may indeed finally usher in the next age of personal computing, it might still be just the beginning of a very long journey. As they say, the devil is in the details, and the following five details could see spatial computing and the Apple Vision Pro take a back seat for at least a few more years.

Missing Haptics

We have five (physical) senses, but most of our technologies are centered primarily on visual experiences, with audio coming second. The sense of touch is often taken for granted, as if we were disembodied eyes and ears using telekinesis to control these devices. Futuristic designs that rely on “air gestures” make almost the same assumption, disregarding the human need to touch and feel, even if it’s just a physical controller. Even touch screens, which offer very little tactile feedback, are still something physical our fingers can touch, providing the necessary connection our brains need between what we see and what we’re trying to control.

Our human brains could probably evolve to make the need for haptic feedback less important, but that’s not going to happen in time to make the Apple Vision Pro a household item. It took years for us to even get used to the absence of physical keys on our phones, so it might take even longer for us to stop looking for that physical connection with our computing devices.

Limited Tools

The Apple Vision Pro makes use of simple hand gestures to control apps and windows, and one can also use typical keyboards and mice with no problem at all. Beyond these, however, this kind of spatial computing lags behind the many tools that are already available and in wide use on desktop computers and laptops. Tools that take personal computing beyond the typical office work of preparing slides, typing documents, or even editing photos. Tools that empower creators who design both physical products and the digital experiences that will fill this spatial computing world.

A stylus, for example, is a common tool for artists and designers, but unless you’re used to non-display drawing tablets, a spatial computing device will only get in the way of your work. While a 3D model floating in front of you might be easier to look at than one on a flat monitor, your fingers will be less accurate at manipulating points and edges than specialized tools. These are admittedly things that can be improved over time rather than deal breakers. But at launch, the Apple Vision Pro’s spatial computing applications might be limited to the more common use cases, which makes it feel like a luxurious experiment.

Physical Strain

Just as our minds are not used to it, our bodies are even less accustomed to wearing headsets for long periods of time. Apple has made the Vision Pro as light and as comfortable as it can, but unless it shrinks to the size and weight of slightly large eyeglasses, it will never really be that comfortable. Companies have been trying to design such eyewear with little success, and we can’t really expect them to make a sudden leap in just a year’s time.

Other parts of our bodies might also feel the strain over time. Our hands might get sore from all the hand-waving, and our eyes could feel even more tired with the high-resolution display so close to our retinas. These health problems might not be so different from what we have today with monitors and keyboards, but the ease of use of something like the Vision Pro could encourage longer periods of exposure and unhealthy lifestyles.

Accessibility

As great as spatial computing might sound to most of us, it is clearly made for the majority of able-bodied, fully sighted people. Over the years, personal computing has become more inclusive, with features that let people with different disabilities still have an acceptable experience despite some limitations. Although spatial computing devices like the Vision Pro do make it easier to use other input devices, such as accessibility controllers, the very design of headsets makes them less accessible by nature.

Affordability

The biggest drawback of the first commercial spatial computing implementation is that very few people will be able to afford it. The prohibitive price of the Apple Vision Pro marks it as a luxury item, and its high-quality design definitely helps cement that image even further. This is nothing new for Apple, of course, but it does severely limit how spatial computing will grow. Compared to more affordable platforms like the Meta Quest, it might be seen as something that benefits only the elite, despite the technology having even more implications for the masses. That, in turn, is going to make people question whether the Vision Pro would be such a wise investment, or whether they should just wait it out until prices become more approachable.

Apple Vision Pro vs. Meta Quest Pro: The Design Perspective

Apple finally lifted the veil on its much-anticipated entry into the mixed reality race, and the Internet was unsurprisingly abuzz with comments on both sides. Naturally, comparisons were made between this shiny newcomer and the long-time market leader, which is now Meta, whether you like it or not. Given their already tenuous relationship, the launch of the Apple Vision Pro only served to increase the rivalry between these frenemies. It’s definitely not hard to paint some drama between the two tech giants vying for the same mixed reality or spatial computing market, whichever buzzword you prefer to use. But is there really a direct competition between these two products, or do they have very different visions with almost nothing in common except for having to put a screen over our eyes? We take a deeper look at the Apple Vision Pro and the Meta Quest Pro to see where they differ, not only in their design but also in their vision.

Designer: Apple, Meta

What is the Meta Quest Pro

Let’s start with the older of the two, one that dates back to the time when Facebook was also the name of the company. Originally created by Oculus, the Quest line of VR headsets soon bore the Meta name, though not much else has changed in its core focus and the way it works. In a nutshell, the Meta Quest Pro, along with its siblings and predecessors, falls under the category of virtual reality systems, which means it gives you a fully enclosed experience confined within virtual walls. It practically blocks off the rest of the real world while you’re wearing it; the Quest Pro does have a “passthrough” feature that lets you see the world around you through the headset’s cameras, but the quality is definitely lower than what your eyes would naturally see.

In terms of product design, the Quest Pro doesn’t stray too far from the typical formula of consumer electronics, which is to say that there’s plenty of plastic material all around. To be fair, Meta aimed to make the Quest hardware more accessible to more people to help spread its adoption, so it naturally had to cut a few corners along the way. The choice of materials was also made to lighten the gear that might be sitting on your head for hours, but it also doesn’t remove the less-than-premium feel, nor does it completely alleviate that heft.

To its credit, the design of the Quest Pro does help make the headset feel a little less burdensome by balancing the weight between the front and back parts. While the front has most of the hardware and optics that make the Quest Pro work, the back has the battery that powers the device. Having that battery present still adds to the overall weight of the machine, but Meta opted to prioritize mobility and convenience over lightening the load.

What is the Apple Vision Pro

The Apple Vision Pro, in comparison, takes an almost completely opposite approach from the Meta Quest Pro or all other headsets in general. In typical Apple fashion, the company paid special attention to design details that make the hardware both elegant and comfortable. The Vision Pro makes use of premium materials like laminated glass and woven fabrics, as well as heavier components like aluminum alloy. It’s a device that looks elegant and fashionable; an undeniable part of Apple’s hardware family.

Apple’s answer to the battery problem is both simple and divisive. The Vision Pro simply doesn’t have a battery, at least not on the headset itself. You’d have to connect an external power source via a cable, though that battery can be shoved inside your pocket to get it out of the way. It doesn’t completely hinder mobility and even opens the doors for third-party designs to come up with other ideas on how to solve this puzzle.

The biggest difference between Apple’s and Meta’s headsets, however, is in their use and purpose. The Vision Pro is closer to being an augmented reality headset compared to the Quest Pro, blending both virtual and real worlds in a single, seamless view. The Vision Pro also has the ability to block out or at least dim everything aside from the virtual window you’re using, but that’s only a side feature rather than a core function.

VR/AR vs. Spatial Computing

At its most basic, the Meta Quest Pro is really a virtual reality headset while the Apple Vision Pro is designed for a form of mixed reality now marketed as “spatial computing.” To most people, the two are almost interchangeable, but those sometimes subtle differences set these two worlds apart, especially in how they are used. It’s certainly possible to mix and match some features and use cases, but unless they’re specifically designed to support those, the experience will be subpar.

The Meta Quest Pro, for example, is the first in its line that can truly be considered to have AR functionality thanks to its higher-fidelity “passthrough” feature, allowing you to see virtual objects overlaid on top of the real world. That said, its core focus is still on virtual reality, which, by nature, closes off the rest of the world from your sight. Looking at the world through cameras is really only a stopgap measure and can be a little bit disorienting. That’s not even considering how most of the Quest ecosystem’s experiences happen in virtual reality, including the use of “normal” computer software, particularly programs that require a keyboard and a mouse.

On the other hand, the Apple Vision Pro was made specifically for mixed reality, specifically spatial computing, where the real and the digital are blended seamlessly. In particular, it puts those applications, including familiar ones from macOS and iOS, in floating windows in front of you. visionOS’s special trick is to actually have the real world affect those virtual objects, from having them cast shadows to tweaking the audio to sound as if they’re bouncing off the furniture in the room. The Vision Pro can emulate the enclosed view of a VR headset by darkening everything except the virtual window you’re using, but it’s unavoidable that you’ll still see some of the real world “bleeding” through, especially in bright ambient light.

The Vision Pro’s and visionOS’s capability to blend the real and the virtual is no small feat. Not only does it enable you to use normal applications with normal computer peripherals, it also makes better use of real-world space. It lets you, for example, assign specific applications and experiences to parts of the house. Apple’s technologies also create more natural-looking interactions with people, even if your actual body parts are invisible or even absent. All these don’t come without costs, though, and it remains to be seen if people will be willing to pay that much for such a young technology.

Controls and Interaction

The Meta Quest Pro hails from a long line of VR and AR headsets, and nowhere is this more obvious than in the way you interact with virtual objects. The headset is paired with two controllers, one for each hand, which are pretty much like joysticks with buttons and motion sensors. Make no mistake, the technology has come a long way and you no longer need to have external beacons stationed elsewhere in the room just to make the system aware of your location or that of your hands. Still, holding two pieces of plastic all the time is a very far cry from how we usually manipulate things in the real world or even from the way we use computers or phones.

Apple may have achieved the holy grail of virtual computing with its more natural input method of using hand gestures without controllers or even gloves. There’s still a limited vocabulary of gestures available, but we’re almost used to that given how we have been using touch screens for the past decade or so. At the same time, however, the Vision Pro doesn’t exclude the use of more precise input instruments, including those controllers, if necessary. The fact that you can actually see the real objects makes it even easier to use any tool, which expands the Vision Pro’s uses considerably.

Philosophy and Vision

Although it’s easy to paint the Apple Vision Pro and Meta Quest Pro as two sides of the same eXtended Reality (XR) coin, the philosophies that drive their designs are almost as opposed to each other as the companies themselves. Meta CEO Mark Zuckerberg was even quoted saying as much while downplaying the Vision Pro’s innovations. In a nutshell, he doesn’t share Apple’s vision of the future of computing.

It shouldn’t come as a surprise that Zuckerberg’s vision revolves around social experiences, something that might indeed be better served by a fully virtual reality. Not only does it make out-of-this-world experiences like the Metaverse possible, it can also make inaccessible real-world places more accessible to groups of people. Meta’s marketing for the Quest Pro mostly revolves around fun and engaging experiences, content consumption, and a bit of creativity on the side.

The Apple Vision Pro, on the other hand, seems to be about empowering the individual by breaking computing free from the confines of flat and limited screens. There are, of course, features related to connecting with other people, but most of the examples have been limited to FaceTime chats more than huddling around a virtual campfire. It has already been noted repeatedly how Apple’s presentation was bereft of any mention of social media, which some have taken as a knock against Facebook. Of course, social media is now an unavoidable part of life, but it exists only as just another app in visionOS rather than as a core focus.

Ironically, the Vision Pro is perhaps even more social than the Quest Pro, at least as far as more natural connections are concerned. Instead of fun yet comical avatars, people will get to see a life-like semblance of your bust during meetings, complete with eye movements and facial expressions. And when someone needs your attention in the meatspace, the Vision Pro will project your eyes through the glass, making sure that the other person knows and feels that you’re actually paying attention to them.

Pricing

It’s hard to deny how impressive all the technologies inside the Vision Pro are, and it’s easy to understand why Apple took this long to finally let the cat out of the bag. As mentioned, however, these innovations don’t come without a cost, and in this case, it is a very literal one. Right off the bat, Apple’s inaugural spatial computing gear is priced at $3,499, making it cost twice as much as the average MacBook Pro. It might be destined to replace all your Apple devices in the long run, but it’s still a very steep price for an unproven piece of technology.

The Meta Quest Pro is, of course, just a third of that, starting at $1,000. Yes, it uses less expensive materials, but its technologies are also more common and have stood the test of time. The Quest platform has also gone through a few iterations of polish, with developers creating unique applications that play to the hardware’s strengths. That said, although the Quest Pro sounds more dependable, insider insights at Meta have painted a somewhat uncertain future for the company’s Metaverse ambitions. Apple’s announcement might then serve to light a fire under Meta and push it to pick up the pace and prove that its vision is the right one.

Final Thoughts

As expected of the Cupertino-based company, Apple turned heads when it announced the Vision Pro. It blew away expectations not just because of the quality of its design but also because of the ambitious vision Apple revealed for the next wave of computing. Right now, it may all sound novel and gimmicky, and it will take some time before the technology truly takes root and bears fruit. Spatial computing has the potential to truly revolutionize computing, but only if it also becomes more accessible to the masses.

The Vision Pro isn’t a death knell for the Meta Quest but more of a wake-up call. There will definitely be a need for an alternative to Apple’s technologies, especially for those who refuse to live in that walled garden. Meta definitely has a lot of work to do to reach the bar that Apple just raised. Whether those alternatives come from Meta or from other vendors, there’s no doubt that the extended reality market just burst to life with a single “One More Thing” from Apple.

Top 5 Apple Vision Pro-inspired concepts that need to be brought to life ASAP

Apple completely turned the tech industry topsy-turvy with the announcement of the Vision Pro headset right at the end of its WWDC keynote. The revolutionary mixed reality headset advocates “spatial computing” – an upgrade from the personal computing of the laptop and smartphone. The MR headset has two Apple Silicon chipsets (including the M2 chip), an array of cameras and sensors, an iris recognition system that scans your eyes for biometrics, directional audio units in the strap, two postage-stamp-sized 4K screens on the inside for immersive viewing, and a curved OLED display with a lenticular layer that lets other people see your eyes while you’re wearing the headset. Quite obviously, such an innovation is bound to send ripples of inspiration across the tech and design worlds. Designers and innovators have been tinkering with ingenious Apple Vision Pro-inspired conceptual designs, and we’re presenting a few of our favorites to you. Have fun!

1. Apple Vision Joystick Pro Max

Alex Casabò designed the “Apple Vision Joystick Pro Max” – a conceptual design for Apple Vision Pro controllers. The controllers are designed to perfectly complement the headset’s modern aesthetic and are geared up to provide smooth functionality no matter what open-world title you throw at them. The joysticks aim to maximize the gaming experience for all kinds of modern games designed with VR environments in mind. Each controller features a touch-sensitive top surface for smart controls, such as swinging a sword or reloading a sniper rifle.

2. Bandwerk’s Headband

German accessory maker Bandwerk designed handcrafted leather headbands for the $3,500 Apple headset slated for launch early next year. The sophisticated, premium headband will be available in five color options – Grey, Creme, Beige, Orange, and Brown – and it will subtly adapt to the silhouette of the final commercially available headset.

3. Ortolani’s AirPods Max Concept

Parker Ortolani designed this Apple Vision Pro-inspired AirPods Max concept. The innovative and redefined design features Vision Pro elements to provide improved comfort and functionality. It will offer new software functionality such as Adaptive Audio, and will be available in a variety of stunning new finishes to perfectly complement your other Apple devices.

4. Dark Knight Apple Vision Pro Case

Max Arnautov decided to create the ultimate Batman x Apple crossover! He designed an attachable case concept for the Apple Vision Pro that draws inspiration from the Dark Knight. Slip this case onto the Apple Vision Pro, wear your headset, and you’ll look like Dark Knight Jr! This will be a hard yes for the lovers of the Batmanverse.

5. Mode Indicator

Moe Slah created this conceptual feature for the Apple Vision Pro to upgrade and elevate the user experience not only for the wearer but also for onlookers. Called the “Mode Indicator,” this app would beam a visual indication of the wearer’s current activity. The user can also choose in-app text or design custom animations for a more personalized experience.

Mode Indicator for Apple Vision Pro beams visual cues of user’s current state of activity

Apple created a stir in the tech industry with the announcement of the Vision Pro headset, and now news of the visionOS software development kit hints at the endless possibilities it will unearth for the headset. The SDK gives third-party developers the ability to leverage the headset’s hardware for exciting new features.

In the announcement, Apple invited the global developer community to create an entirely “new class of spatial computing apps that take full advantage of the infinite canvas in Vision Pro and seamlessly blend digital content with the physical world to enable extraordinary new experiences.”

Designer: Moe Slah

Now that the visionOS SDK will have developers getting creative, the new app experiences will serve categories like productivity, design, gaming, and, of course, UI. Just in time for this announcement comes a conceptual feature for the Apple Vision Pro that elevates the user experience not only for the wearer but also for curious onlookers. Dubbed the “Mode Indicator,” this app would beam visual indications of the wearer’s current activity.

For instance, it could show whether the wearer is playing a mixed reality game, engrossed in a meeting, binge-watching an action series, or calming down with a meditation session. The user can also choose in-app text or design custom animations for a more personalized experience. Simple yet effective, the app would automatically detect the mode of activity and change the content on the headset’s outer display.

For starters, Moe envisions the app with a preset “Gaming Mode” for interactive and gaming experiences, an “Entertainment Mode” for multimedia consumption, and a “Meditation Mode” for calm and focused states of mind. Clearly, the Mode Indicator feature enhances understanding and engagement between the wearer and those around them, fostering a more harmonious and collaborative environment. That will be important if we foresee a future where VR headsets seamlessly integrate into daily life.
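
The concept is simple enough to sketch in code. Everything below is hypothetical: Apple doesn’t currently let third-party apps draw on the Vision Pro’s outer EyeSight display, so the renderer protocol stands in for whatever such an API might eventually look like.

```swift
import Foundation

/// Hypothetical model of the "Mode Indicator" concept described above.
enum ActivityMode: String {
    case gaming        = "Gaming Mode"        // interactive and gaming experiences
    case entertainment = "Entertainment Mode" // movies, series, multimedia
    case meditation    = "Meditation Mode"    // calm, focused sessions
    case meeting       = "In a Meeting"
}

/// Stand-in for an outer-display API that Apple has not actually exposed.
protocol OuterDisplayRenderer {
    func show(text: String)                // in-app text chosen by the user
    func play(animationNamed name: String) // or a custom animation
}

struct ModeIndicator {
    let renderer: OuterDisplayRenderer

    /// Called whenever the app detects that the wearer's activity has changed.
    func activityDidChange(to mode: ActivityMode) {
        renderer.show(text: mode.rawValue)
        let animation = mode.rawValue.replacingOccurrences(of: " ", with: "-").lowercased()
        renderer.play(animationNamed: animation)
    }
}
```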
