Sony Teases Futuristic Phones and PlayStation Controllers for the Coming Years

“Ten years from now, we will be living in a more multi-layered world where physical and virtual realities overlap without boundaries,” says the entertainment and hardware company. With the number of pies they’ve got their fingers in (remember how they even announced a car a few years ago?), it makes sense for Sony to operate not one but ten steps ahead to make sure they’re leaders in every single industry they’re in. That even means condensing a home theater to a size small enough to fit around your neck. Today, the company unveiled its Creative Entertainment Vision, a visualization of what Sony wants the future to look like. It’s an exercise a lot of companies undertake, helping consumers, investors, and even competitors understand what direction technology and innovation are headed in. Sony doubles down on mixed reality and spatial entertainment in this segment (they aren’t, after all, an enterprise or productivity company), showcasing a few unique concepts that feature holographic floating screens, absurdly futuristic gaming controllers, and super-slim spectacles that transform into AR eyewear.

Designer: Sony

Somewhere around the 1-minute and 12-second mark, Sony reveals a few very interesting concepts. One of them is what looks like a futuristic PlayStation controller with its own holographic screen. The controller looks familiar yet unlike any controller we’ve seen. It features a hollow center with a grip on each end. The center is supposedly where the hologram projects out of, while thumb-pads on the left and right come with unusual controls to help maneuver the game. The controller is also spatially aware and can be tilted as a means of input.

Another concept was a tablet that looks like, as Apple likes to call its iPad, a slab of magic glass. Except this one really does feel like glass, and the experience is tantamount to magic. There are no bezels, no cameras, not even any visible electronics. The glass is transparent when the screen is off, and translucent when you’re watching something, so you can still see through it.

The third is an extension of Sony’s tablet vision, but in the form of a smartphone. According to Sony, the future of phones isn’t rectangular slabs of glass, it’s capsule-shaped oval slabs of glass. I don’t know what that says about the future of videos and TikTok, but this new form of screen real estate certainly feels unique. Sony displays a music interface on this mobile device, with album art extending practically from edge to edge. The games, videos, and music in all the concepts above are tied to Sony’s hit PlayStation game Horizon Zero Dawn.

These concepts are also coupled with a set of AR glasses that completely immerse you in a virtual world. As slim as a pair of sunglasses, albeit with ski goggle-style eyepieces, this concept piece offers a kind of immersion even the Vision Pro can’t promise. You’re turned into a full-body AI avatar, immersed in a virtual world that exists separate from reality. Made more for entertainment, it lets you play games, interact with people, or even watch movies in a way you never have before. Sony even previews a scene where the city’s streets are filled with ghosts and a giant Stay Puft Marshmallow Man lookalike treads across buildings, lifted right out of the Ghostbusters movie.

While these concepts don’t specifically confirm what Sony plans to release in the near future, it’s important to understand their ‘vision’ of what’s to come. Ideas change based on consumer feedback and technological innovation, but in an ideal world, Sony believes the future will be about crafting new and wonderful realities, and living in stories instead of watching or playing them.

The post Sony Teases Futuristic Phones and PlayStation Controllers for the Coming Years first appeared on Yanko Design.

VR controller concept for artists and designers offers a more intuitive design

The Apple Vision Pro’s take on spatial computing would have us imagine a seamless integration of the physical and digital worlds. That dream of the future is shared by virtual, augmented, and mixed reality technologies, and they almost deliver that promise when it comes to the visual aspect. The illusion, however, breaks when you start interacting with and manipulating those virtual entities, an experience that quickly becomes less natural compared to how we do it with physical objects. The problem lies in the tools we use for this, which are often a game controller or two sticks that function in the same way. This concept for a virtual reality controller tries to reshape that standard design into something that, while still technically the same, offers a more familiar form for artists and designers.

Designer: Jiwoong Yan

When you think about it, it’s almost amazing how digital creatives are able to make do with the input tools available to us in the present. At the very least, a stylus approximates the experience of drawing with a pen on paper, though some people are even able to create mind-blowing art using a keyboard and a mouse. On the one hand, it’s pretty convenient that we don’t have to deal with a dozen different pens, brushes, and other tools when creating digital art, but, at the same time, the disconnect between the tool and the desired outcome is often jarring.

This is especially true in a medium like VR that strives for some fidelity to reality. It’s even worse because it makes us believe we’re seeing virtual objects we can touch, yet we can never really touch them and have to be satisfied with pointing and clicking with both hands. Medium is a concept design that offers a compelling compromise for artists and designers. It still has the same two-piece approach that puts a controller in each hand, but these are designed to actually mimic the tools artists would be familiar with.

The right hand, for example, can be held either like a paintbrush or a can of spray paint, and the handle can be rotated to accommodate different ways people hold these tools. The left-hand controller, on the other hand (no pun intended), is like a painter’s palette, though it will probably show more than just colors in the virtual representation that you’ll see through VR glasses. Using these two pieces might feel intuitive for some artists familiar with painting, with the “palette” providing tools and options for the “brush” that you draw with.

Such a design is theoretically already possible with today’s technologies, but it requires a manufacturer to take the risk of actually producing a device that might appeal only to a small segment of VR users. But with these companies trying to push mixed reality and spatial computing harder, it might only be a matter of time before more specialized variants of controllers become available, at least as a stopgap measure until we can directly manipulate those virtual worlds with nothing but our hands.

The post VR controller concept for artists and designers offers a more intuitive design first appeared on Yanko Design.

The Apple Vision Pro is already playing a critical role in the Automotive, Filmmaking, and Healthcare industries

Who knew that Porsche would become the Vision Pro’s most valuable customer?!

Addressing the public for the first time since the Vision Pro went on sale in March, Tim Cook gave viewers an update on the Vision Pro’s progress during this year’s iPad keynote. It isn’t clear exactly how many spatial headsets the company has sold so far, but the Vision Pro is surely finding its footing in certain industries beyond the average movie-watching and multi-screen workspace scenarios that Apple sold us on at WWDC last year, when the headset was first announced.

Cook mentioned that the Vision Pro is already becoming a crucial part of Porsche’s showroom experience, with the automotive giant investing heavily in building spatial experience centers around the Vision Pro and its cars. Prospective buyers can wear the Vision Pro to easily and quickly see all of a car’s color options in virtual reality instead of looking at images or swatches in a catalog. The Vision Pro’s incredibly high-resolution displays help customers experience the car in ways that were never thought possible, allowing Porsche to add a new dimension to its showroom UX in ways other car companies cannot. Additionally, the headset enables track experiences and can be used to train service technicians, harnessing the true power of spatial computing. Much like the Apple Watch eventually settled into being a healthcare device, even though the company originally positioned it as a fashion-tech wearable, the Vision Pro is only now finding its footing months after its announcement and delivery.

What’s remarkable is that Apple’s Vision Pro managed to break into the filmmaking and healthcare industries just months after being delivered – something Meta hasn’t really spoken at length about when it comes to its devices, and something Microsoft’s own HoloLens took years to achieve (at least in the healthcare and military research industries). Cook spoke about Dr. Tommy Korn using the Vision Pro to improve surgical eye care through simulations and visualizations, while director Jon M. Chu was using the Vision Pro to oversee the entire post-production process for his upcoming film Wicked.

While entertainment and healthcare seemed like surefire areas where the Vision Pro would create some form of procedural disruption, seeing Porsche invest so heavily in reinventing its showroom and technical training domains around the Vision Pro is fascinating. It’s been just over two months since the first Vision Pro was delivered to customers, so one can only wait and see what updates Apple provides over the next few months. The 2024 WWDC will mark the first anniversary of the headset’s announcement, and maybe we’ll get a few more upgrades to the device’s software as well as some updates on its industrywide acceptance. Hopefully even a price drop, perhaps? Or maybe that’s just wishful thinking!

The post The Apple Vision Pro is already playing a critical role in the Automotive, Filmmaking, and Healthcare industries first appeared on Yanko Design.

Overlay next-gen home display wants to skim down multiple display demands with one AR Screen

Thanks to smart home technology, we have become dependent on digital displays for communication, infotainment, and even for preparing meals in the kitchen. This means we need multiple displays: one in the family room for entertainment, another in the bathroom to read news and weather updates, one in the study for working from home, and, if you’re not a Michelin-starred chef yourself, a display in the kitchen to run the recipe guide!

Investing in all these displays, each serving only one purpose in one place, doesn’t really add up in 2024. If you’re of the same mind, Overlay – designed for Samsung – is the “Next Home Display” that wants to trim down the multiple-display demand and fill it with one versatile unit that has mobile roots.

Designer: Susanna Kim

More than an adjustable, mobile display, the Overlay is a sensor-enabled contraption that can move about the house – on preset commands or on call – to serve a multitude of infotainment objectives and more. The onboard mapping sensors allow the mobile display to map the space and divide it into understandable zones. Users can pin a desired location on the map and set content (entertainment, information, etc.) for each pin with a time. For instance, the user can pin the dining table area to watch Netflix while dining.

When it is time for dinner, the Overlay will automatically arrive at the table and turn on Netflix, as scheduled. The user can adjust the height of the display to their liking and watch a movie on Netflix without additional setup (it has an integrated speaker system). In case you are too lazy to adjust the height, the Overlay is designed to do that automatically for you. The auto-height-adjusting display rests on a solid base equipped with small-radius omni-wheels for smooth maneuvering within the mapped space.
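The pin-and-schedule mechanism described above can be sketched in a few lines of code. This is a minimal, hypothetical model, not Samsung’s actual design: every name, field, and the time-window logic is an illustrative assumption about how a zone pin could tie a mapped location to content and a schedule.

```python
# Illustrative sketch of the Overlay's pin-and-schedule idea: each pin
# ties a zone from the mapped floor plan to content and a time window.
# All names and the scheduling logic are assumptions for illustration.
from dataclasses import dataclass
from datetime import time

@dataclass
class Pin:
    zone: str     # a zone from the mapped space, e.g. "dining table"
    content: str  # what to show when the display arrives
    start: time
    end: time

def active_pin(pins, now: time):
    """Return the pin whose time window contains `now`, if any."""
    for pin in pins:
        if pin.start <= now <= pin.end:
            return pin
    return None

pins = [
    Pin("dining table", "Netflix", time(19, 0), time(20, 30)),
    Pin("kitchen", "recipe guide", time(17, 0), time(18, 0)),
]

match = active_pin(pins, time(19, 15))
print(match.zone, match.content)  # dining table Netflix
```

In a real device, the matched pin would also trigger navigation to the zone’s mapped coordinates before playback starts.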

It can also be teamed up with the TV or other devices in the house as an enlarged display for them. The unit can overlay more information about what you’re watching on TV, or stand by the washing machine and give you a heads-up when it’s time to move your clothes to the dryer. It comes with an AI assistant to take voice commands and has a transparent screen to display match or player statistics when you’re watching a game of football on TV, for instance.

Alongside being an omnipresent mobile display, the Overlay comes with motion sensors and a LiDAR camera that give it the ability to identify objects and perhaps provide plant and pet care tips to the user. Over and above monitoring the condition of plants in its mapped space, it can even regulate watering for personalized, automated care. All this functionality demands a lot of power, for which the Overlay is pretty self-sufficient as well. It is powered by a rechargeable battery that it juices up by heading straight to the charging dock when power is running low (no human intervention required here either). Capable of bringing an MR experience to the home, the Overlay is designed in four distinct colors that should complement any home effortlessly.

The post Overlay next-gen home display wants to skim down multiple display demands with one AR Screen first appeared on Yanko Design.

First Time Using the Apple Vision Pro: It Blew My Mind

From the moment I set my eyes on the Apple Vision Pro, the intuitive nature of its interface struck me. Controlling the device through natural gestures—like tapping fingers together for selection or pinching to zoom—felt like an extension of my own movements, creating a seamless user experience that was truly impressive. “Everything’s all eye track,” I marveled, amazed by the device’s responsiveness, which made every interaction feel intuitive and natural.

Designer: Apple

During the initial setup, the Digital Crown—borrowed from the Apple Watch—brought up the home view with a simple press. The blend of futuristic technology with familiar elements made the icons react as I looked at them, creating a magical experience. This immediate and responsive engagement reinforced the intuitive nature of the user interface. As I explored this advanced technology, I remember thinking, “I haven’t read any reviews on the Vision Pro, and that’s a good thing.” Approaching the device without any preconceptions allowed me to truly immerse myself in the experience.

As I navigated through a demo photo library, the ambient lighting dimmed, focusing my attention on images that transported me to places like Iceland and the Oregon coast, displayed panoramically. “That was so amazing,” I exclaimed, overwhelmed by the vividness and the immersive experience the photos provided.

Viewing spatial photos and videos added incredible depth to everyday moments. Watching a family birthday party captured with the Apple Vision Pro felt as if I stood among the celebrating children, bringing these moments to life. “So no one else can see this except you and me, huh?” I remarked to Avnish, my guide through this journey, who was able to see what I saw through an iPad. This added layer of interaction enhanced my appreciation for the technology as I watched a spatial video shot with the iPhone 15 Pro, captivated by the depth and realism.

The design of the Apple Vision Pro was notably sleek and modern, with a lightweight, comfortable frame suitable for extended wear. The minimal physical buttons enhanced its streamlined appearance, highlighting its advanced gesture and eye-tracking capabilities. A dedicated button for capturing spatial photos and videos turned real-world moments into vivid digital clarity, showcasing Apple’s meticulous attention to hardware design.

Manipulating my environment with a turn of the Digital Crown was particularly impressive. I could adjust my immersion levels from partial to full, exploring digital renditions of places like Mount Hood National Forest as if I were truly there. This smooth transition back to reality, while remaining connected with those around me, showcased the device’s seamless integration into personal and social settings.

Spatial multitasking introduced a new way to interact with applications, allowing me to manipulate windows in a spatial context as if handling physical objects. This dynamic, intuitive approach transformed traditional interfaces into a vibrant, three-dimensional workspace.

The entertainment capabilities of the Apple Vision Pro were striking. Watching 3D movies like “Super Mario Bros. Movie” and “Avatar: The Way of Water” transformed any space into a personal cinema. The high-resolution display and spatial audio created a viewing experience that far surpassed traditional setups. “That’s impressive,” I remarked, blown away by the depth and immersion of the features.

The Apple Immersive Video demo was a highlight, transporting me to the center of the action—flying over landscapes, diving with sharks, and standing on a soccer field. This segment was so engaging that I was left nearly speechless, managing only to say, “That was so amazing.”

An interactive session where a butterfly landed on my hand and a close encounter with a dinosaur showcased the Apple Vision Pro’s unique capabilities, blurring the lines between digital and physical realities. These experiences felt real and tangible, enhancing my appreciation for the device’s ability to create such vivid and interactive moments.

Finally, with the Apple Vision Pro, I got the chance to rehearse – well, more like pretend – to present Apple’s famous “one more thing” on the stage of the Steve Jobs Theater. It felt so real that I almost waved to the nonexistent crowd! I’ve been to press events there before, but never on stage. The closest I’ve gotten was the last third of the theater.

Photo credit: YouTuber MKBHD demonstrating Keynote on the Vision Pro (MKBHD on YouTube)

After the demo—and yes, I highly encourage anyone and everyone remotely interested in spatial computing to visit their local Apple Store—I had the option to purchase a brand-new Vision Pro configured right there. The Solo Knit Band, Dual Loop Band, and, importantly, the Light Seal are available in size 21W, which fits me perfectly. Apple has streamlined the sizing process in the Apple Store app, which now includes a 3D face scan for a customized fit, guiding you through capturing the necessary facial dimensions.

The Apple Vision Pro demo was a breathtaking introduction to futuristic technology that felt straight out of science fiction. Its intuitive interface and gesture controls impressed me immediately, making every interaction feel natural and fluid. While the immersive experience of exploring vibrant, distant locales and engaging with life-like spatial videos was captivating, the demo ended too soon, leaving me eager for a more extended, immersive exploration with the Vision Pro. I’m looking forward to delving deeper into its potential in a longer session.

The post First Time Using the Apple Vision Pro: It Blew My Mind first appeared on Yanko Design.

Meta Quest 3S images leak online, hinting at an even more affordable VR headset

Upscaled using AI

The Meta Quest 3 was supposed to be the cheaper alternative to the Meta Quest Pro… but now leaked photos from an internal presentation show a new device called the Meta Quest 3S, a ‘lite’ version of the already wildly popular VR headset. Surfaced by user u/LuffySanKira on Reddit, screenshots supposedly from a Meta user research session offer a glimpse of the potential Quest 3S. The images showcase the rumored headset alongside the standard Quest 3, revealing some key specifications.

Designer: Meta

The Quest 3S is expected to be a more affordable version of its pricier counterpart. According to the leaks, it will feature a display resolution of 1920 x 1832 with 20 pixels per degree (PPD). This falls short of the Quest 3’s 2208 x 2064 resolution and 25.5 PPD. Storage capacity is also speculated to be lower, at 256GB compared to the Quest 3’s 512GB.
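As a rough sanity check on those numbers, pixels-per-degree relates resolution to field of view: PPD is approximately horizontal per-eye pixels divided by horizontal FOV in degrees. A back-of-envelope sketch (assuming the first figure in each resolution is the horizontal per-eye pixel count, which the leak doesn’t specify):

```python
# Back-of-envelope: PPD ~= horizontal per-eye pixels / horizontal FOV,
# so the leaked resolution and PPD figures imply an approximate FOV.
# This is a simplification for illustration, not an official formula.

def implied_fov(horizontal_pixels: int, ppd: float) -> float:
    """Approximate horizontal field of view in degrees."""
    return horizontal_pixels / ppd

quest_3s_fov = implied_fov(1920, 20)    # leaked Quest 3S figures
quest_3_fov = implied_fov(2208, 25.5)   # Quest 3 figures

print(f"Quest 3S implied FOV: {quest_3s_fov:.1f} degrees")  # 96.0
print(f"Quest 3 implied FOV: {quest_3_fov:.1f} degrees")    # 86.6
```

Real headset optics distribute pixels unevenly across the lens, so these figures are only ballpark, but they show why a lower PPD doesn’t automatically mean a narrower view.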

The leaked images provide a visual comparison as well. The Quest 3S appears slightly smaller overall, with the most noticeable difference being the front sensors. The Quest 3 has three oval cutouts, while the Quest 3S sports a configuration of six stacked cutouts, three on either side. These leaks are yet to be confirmed by Meta, but they offer an exciting possibility for VR fans seeking a more accessible entry point into the world of virtual reality.

The post Meta Quest 3S images leak online, hinting at an even more affordable VR headset first appeared on Yanko Design.

Some Apple Vision Pros are cracking down the center. To understand why, look at the shape of the KitKat bar.

Feel free to call it the “Spatial KitKat Hypothesis”…

The Apple Vision Pro wasn’t designed to bend, but when you’ve got two straps pulling on the massive headset from either side with a human head in the middle acting as a wedge of sorts, the headset’s bound to feel some stress at its weakest point. Theoretically, that weakest point lies at the nose bridge, or the narrowest part of the Vision Pro’s design. If you imagine the Vision Pro to be a massive KitKat, or a Toblerone, or any bar of chocolate for that matter, it’s usually the narrowest part that’s designed to snap, resulting in a perfectly broken piece of chocolate. The problem here, however, is that this particular ‘chocolate’ is a cutting-edge spatial computer that costs upwards of $3500.

It seems like Apple products don’t have great luck when it comes to structural soundness. If you remember, exactly 10 years ago #BendGate was plaguing the 2014 iPhone 6, a scandal that arose after people found their iPhones bending in their pockets when they sat down. Sure, Apple worked hard to fix the iPhone 6’s flimsy design (in part because people were just walking into Apple Stores and folding iPhones in half), but #BendGate still lives on in infamy, especially through its latest avatar, what people are calling #CrackGate. Multiple users are reporting that the Vision Pro’s glass is cracking almost perfectly down the center, for no apparent reason. The crack runs almost symmetrically from the nose upwards, leaving an extremely visible fault line down the middle. While it doesn’t seem to affect the Vision Pro’s actual functionality, it’s like getting a scratch down the side of your Lamborghini: emotionally gut-wrenching.

The reason isn’t really clear (in part because Apple hasn’t officially addressed the issue or offered repairs), but multiple users have their own theories. Some sleuths noticed that the crack almost always emerges right near the LiDAR sensor, sparking speculation that invisible light from the sensor may be weakening the glass. Others claim it’s a heat-related issue, caused by the fact that most people don’t turn their Vision Pros off after using them for the day, causing the headset to heat up and the aluminum frame to expand, cracking the glass. The latter theory makes much more sense than the former, but there’s yet another factor that could contribute to the glass’s structural weakness, and its simplest explanation lies in the shape of a KitKat bar.

Unless you’re an absolute psychopath who chomps right into the KitKat bar, chances are you follow the protocol of breaking it along its linear grooves, creating individual KitKat fingers that you can easily eat. The bar is practically designed for this interaction, allowing you to snap off individual ‘batons’ that you can either share or eat on your own. The physics behind this design is as simple as it gets. The individual fingers are connected by a thin valley of chocolate, which can be snapped with little pressure. The reason the KitKat always breaks at this groove is that it’s easily the most vulnerable part of the chocolate bar. The Vision Pro has the same problem. The ‘nose bridge’ on the front is where the Vision Pro’s glass panel is at its narrowest. Apply enough stress to the area and chances are, just like a KitKat bar, it’ll break there first.

Heat could be a contributing factor to this structural weakness, but let’s not forget, the Vision Pro comes with a headband secured to its sides. Wear the Vision Pro on your face, and the headbands tug on the headset from the left and right, while your face, being the solid mass it is, applies forward pressure. The rest of the Vision Pro is made from aluminum, a material famous for bending easily (no points for guessing what the iPhone 6 was made of), but glass – especially curved and hardened glass like the panel on the front of the Vision Pro – doesn’t bend; it cracks. The result? A fracture at its weakest point, caused by people wearing the headset too tight, coupled with the heat issues that come from not turning the Vision Pro off every night.
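The mechanics behind the KitKat analogy boil down to one relationship: average stress is force divided by cross-sectional area, so for the same strap tension the narrowest section of the glass sees proportionally higher stress. The numbers below are purely illustrative assumptions, not Vision Pro measurements:

```python
# Illustrative only: the "KitKat" argument is just stress = force / area.
# For the same pulling force, a narrower cross-section carries
# proportionally higher stress, so a brittle material fails there first.
# All numbers are made up for illustration, not measured values.

def stress(force_newtons: float, cross_section_mm2: float) -> float:
    """Average tensile stress in N/mm^2 (MPa) across a cross-section."""
    return force_newtons / cross_section_mm2

strap_tension = 10.0                          # hypothetical headband load
wide_section = stress(strap_tension, 40.0)    # wider part of the glass
nose_bridge = stress(strap_tension, 10.0)     # narrowest part of the glass

# A quarter of the area means four times the stress at the bridge.
print(f"{nose_bridge / wide_section:.0f}x higher stress at the bridge")
```

Real fracture behavior also depends on geometry, surface flaws, and thermal expansion, but the inverse relationship between area and stress is why the narrowest point is the predictable failure site.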

There are two solutions to this problem. The first comes from MKBHD and a bunch of other tech experts, who recommend turning the Vision Pro off after use and disconnecting the battery pack every night so the headset doesn’t keep running and heating up. The second is much more obvious, and is an indication of Apple’s hubris. In the pursuit of creating ‘the greatest spatial device ever seen’, Apple’s premium choice of materials is coming back to bite it. The Vision Pro’s aluminum structure is notoriously heavy, causing neck fatigue for people wearing it for long hours… but more importantly, the use of glass on the front seems highly unnecessary. A well-polished plastic facade would have worked just as well, even if it didn’t line up with Apple’s ‘luxury’ image. It would have been stronger, easier to produce, and would probably have helped Apple cut costs, savings it could have passed on to the consumer. Instead, Apple is being predictably silent while multiple users fume at the prospect of a difficult-to-ignore crack on their rather expensive $3,500 headset.

The post Some Apple Vision Pros are cracking down the center. To understand why, look at the shape of the KitKat bar. first appeared on Yanko Design.

Honda UNI-ONE wheelchair finds innovative use in VR worlds as extended reality mobility experience

Honda introduced the UNI-ONE personal mobility chair for people with lower-limb impairments at the end of 2022. The Segway-like electric wheelchair alternative, with its flexible movement capabilities, will officially debut at South by Southwest (SXSW) in Austin, Texas, next month, with a VR application twist.

The Japanese automaker will leverage the self-balancing personal mobility device (mostly intended for people with disabilities) for a seamless virtual reality experience it is calling the “Honda Extended Reality (XR)” experience. The idea of fusing real-world riding on the UNI-ONE with a virtual world environment sounds like a winning proposition, and Honda doesn’t want to let the opportunity go.

Designer: Honda

SXSW attendees will get first-hand exposure to this unique VR experience from March 10-13 at Honda’s booth #729 at the SXSW Creative Industries Expo at the Austin Convention Center. This amalgam of two different technologies is aimed at solving a hardware limitation of comprehensive metaverse experiences, which are otherwise limited to visual input and confined to a small space. According to Hirokazu Hara, vice president of New Business Development at American Honda Motor, this will expand the “joy and freedom of personal mobility into entertainment applications.”

Hara added that this never-before-seen combination will elevate the multimodal immersive experience three-fold. The self-balancing tech, dubbed the Honda Omni Traction Drive System (HOT Drive System), and the advanced sensors on the 154-pound UNI-ONE (permitting movement and tilt in any direction) will enable new kinds of VR and AR entertainment. This could take extended reality technology and application development possibilities to another level, drawing the interest of early adopters more than ever before.

For instance, racing through a track on a virtual planet with lower gravity than Earth’s will be possible on a hands-free device capable of a top speed of 3.7 mph. The rig will combine the visual input from a VR headset with the freedom of movement to make users feel as if they are racing on a real track in an alien landscape. The fact that Honda is investing so much in this possibility with the UNI-ONE says a lot about how the future may pan out in the metaverse. According to Honda, the extended reality (XR) technology will be perfect for malls, theme parks, or any other indoor or outdoor entertainment hubs with plenty of open space to move around.

The post Honda UNI-ONE wheelchair finds innovative use in VR worlds as extended reality mobility experience first appeared on Yanko Design.

World’s First AR Glasses that correct Partial Retinal Blindness: Hands-on with Eyecane AR at MWC 2024

I’ve always said that great technology doesn’t cater only to the needs of the dominant 95%; it also factors in the needs of the often-neglected 5%. To that end, AR technology is great, but it hasn’t been applied in a way that benefits that remaining 5% – and Cellico wants to change that. The medical-tech company unveiled the Eyecane AR glasses at MWC, the world’s first augmented reality device designed to compensate for age-related macular degeneration (AMD).

Designed to look and feel like your standard sunglasses, the Eyecane AR helps people with retinal disease see clearly. A 4K camera at the center of the glasses records the world, feeding media into a tiny projected display within the Eyecane AR’s lenses. AMD causes blind spots within people’s vision, but the Eyecane AR’s cameras help fill in those blind spots with digitally captured imagery in real time, helping people see fully and clearly again.

Designer: Cellico

AMD affects as many as 1 in 200 people by the time they reach 60, going up to as many as 1 in 5 by the time they hit their 90s. The affliction, caused by the degeneration of the macula (the central part of the retina), results in blurry or sometimes even no vision at the center of your visual field. Think of a large black dot in otherwise relatively clear vision. Given that a lot of the important things we see fall in this central zone, people with AMD can have a tough time looking at objects, identifying people, and navigating scenarios. Cellico’s solution is incredibly simple – have a camera capture whatever is in that gap, and display it in the corner of your eye, where you can still see things relatively clearly. Creating something of a picture-in-picture effect, Eyecane AR allows people with AMD to regain their macular vision simply by having a camera capture that region and display it in another part of their field of view.

A snap-on sunshade helps people see clearly in bright settings too

By harnessing the power of a compact 4K 20MP camera seamlessly integrated into smart glasses and complemented by an intuitive mobile app, Eyecane AR captures and processes real-time images with precision, even applying optical image stabilization. These images are then projected onto an augmented reality display in Full-HD, effectively shifting central vision to the peripheral field of view. This groundbreaking approach not only restores clarity but also rekindles independence for those navigating the challenges of AMD.

A simulation of how people with AMD perceive the world, and how the Eyecane AR can help fill in the gap with a PIP on the left side.

Moving the PIP to the center of the screen shows what images would look like for people with regular vision.

The beauty of the Eyecane AR lies in the fact that it can be used right out of the box without a hospital visit. The Eyecane app has a built-in scotometry program that analyzes your vision for you, pinpointing the blind spot or the problematic area in your vision. The app then helps the AR glasses’ camera calibrate and focus on that region, capturing the image and displaying it in a corner of your peripheral vision. The entire process takes mere minutes, and helps quickly restore macular vision simply by relying on the inherent properties of augmented reality displays!
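The core remapping idea described above – cropping the region hidden by the blind spot and pasting it into the peripheral field of view – can be sketched in a few lines of image code. This is a conceptual illustration only, not Cellico’s actual pipeline; the function name and coordinates are hypothetical stand-ins for values the scotometry test would supply.

```python
import numpy as np

def remap_macular_region(frame: np.ndarray, blind_spot: tuple, size: int,
                         pip_corner: tuple) -> np.ndarray:
    """Copy the image region hidden by a central blind spot into a
    picture-in-picture window in the peripheral field of view.

    frame      -- H x W x 3 camera image
    blind_spot -- (row, col) center of the scotoma (hypothetical values
                  that the app's scotometry test would supply)
    size       -- side length of the square region to relocate
    pip_corner -- (row, col) top-left corner of the peripheral PIP
    """
    out = frame.copy()
    r, c = blind_spot
    half = size // 2
    # Crop the patch the user cannot see...
    patch = frame[r - half:r + half, c - half:c + half]
    # ...and paste it where peripheral vision is still intact.
    pr, pc = pip_corner
    out[pr:pr + size, pc:pc + size] = patch
    return out

# Synthetic 480x640 frame with a distinctive red pixel at dead center.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame[240, 320] = [255, 0, 0]
shifted = remap_macular_region(frame, blind_spot=(240, 320), size=100,
                               pip_corner=(10, 10))
# The center pixel now also appears inside the top-left PIP window.
```

In the real product this relocation happens optically on the AR display rather than by editing camera frames, but the geometry – central crop, peripheral paste – is the same.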

Key Features of Eyecane AR:

  • 4K camera with Optical Image Stabilization (OIS) embedded in AR glasses
  • Mobile app featuring a customized image-processing engine
  • Full HD Reflective Freeform crystal Lens offering a Field of View (FoV) of 40°
  • Voice control functionality for seamless user interaction
  • Electric Auto Sunshade coated with an LC film, ensuring optimal visual comfort in diverse lighting conditions
  • Lightweight construction, crafted from Titanium and Ultem materials, prioritizing comfort and wearability

The post World’s First AR Glasses that correct Partial Retinal Blindness: Hands-on with Eyecane AR at MWC 2024 first appeared on Yanko Design.

If the Apple Vision Pro and the Google Glass had a baby, these AR glasses would be it…

Showcasing their tech at the Mobile World Congress, Everysight is riding the new AR (or should we call it Spatial?) wave with the Maverick, their small, sleek, and stylish glasses that challenge today’s bulky mixed reality glasses and headsets with something so close to regular glasses, you wouldn’t be able to tell the difference. Designed with a projected display that lets you see data and metrics overlaid on the real world, the Maverick uses an entire slew of sensors to track position, orientation, and head-tilt to ensure that digital elements remain in your line of sight (LOS) and correctly oriented. In fact, the Maverick even bagged multiple awards last year, including the iF Design Award and the Red Dot Award.

Designer: Everysight

Unlike most AR headsets that can make you look a little dystopian when walking down the street or sitting in a subway (we’re looking at you, Vision Pro wearers), the Maverick stands at the intersection of great tech and fashion. Weighing in at under 47 grams, these glasses boast a sporty, ergonomic frame that promises comfort for all-day wear, a crucial consideration for devices intended to be part of our daily lives. This comfort does not come at the expense of durability or style, making them a versatile accessory suitable for any occasion.

AR glasses are only as good as their displays (something that most Vision Pro users will swiftly point out), which is why the Maverick impresses with its crisp, high-contrast visuals despite its tiny package. Utilizing a Sony Color microOLED display, it delivers stunning visuals characterized by vibrant colors and sharp details. The high-brightness display guarantees an optimal viewing experience in both indoor and outdoor settings, a testament to the glasses’ adaptability and user-centric design.

Ease of use is at the forefront of the Maverick design, with an intuitive interface that allows users to navigate and control features through simple gestures. This user-friendly approach is further enhanced by advanced sensors like a 3D accelerometer, gyro, and magnetometer, which provide accurate line-of-sight tracking for an immersive augmented reality experience.
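To give a sense of how an accelerometer and magnetometer can combine for line-of-sight tracking, here is a minimal sketch of a classic tilt-compensated compass heading: gravity gives pitch and roll, which are used to project the magnetic vector onto the horizontal plane before computing yaw. This is an illustrative textbook formulation, not Everysight’s implementation, and the axis/sign conventions are assumptions about the sensor frame.

```python
import math

def line_of_sight_heading(accel, mag):
    """Estimate head yaw (compass heading, in degrees) from a 3-axis
    accelerometer and magnetometer -- the kind of fusion that, together
    with a gyro, keeps AR overlays aligned with the wearer's line of sight.

    accel -- (ax, ay, az) gravity vector in the sensor frame
    mag   -- (mx, my, mz) magnetic field in the sensor frame
    """
    ax, ay, az = accel
    # Pitch and roll recovered from the direction of gravity.
    pitch = math.atan2(-ax, math.hypot(ay, az))
    roll = math.atan2(ay, az)
    mx, my, mz = mag
    # Rotate the magnetic vector into the horizontal plane (tilt compensation).
    mx_h = mx * math.cos(pitch) + mz * math.sin(pitch)
    my_h = (mx * math.sin(roll) * math.sin(pitch) + my * math.cos(roll)
            - mz * math.sin(roll) * math.cos(pitch))
    heading = math.degrees(math.atan2(-my_h, mx_h))
    return heading % 360.0

# Level head, magnetic field along the sensor's x-axis -> heading of 0 degrees.
north = line_of_sight_heading((0.0, 0.0, 9.81), (30.0, 0.0, -20.0))
```

In practice the gyro would be fused in (e.g., with a complementary or Kalman filter) to smooth this heading between magnetometer samples, which is why all three sensors appear together in the Maverick’s spec.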

Battery life is a perennial concern for wearable technologies, and here, Maverick impresses with over 8 hours of continuous operation. This endurance is complemented by efficient power management, ensuring that the glasses support a day’s worth of activities without needing a recharge. Such longevity is essential for users who demand reliability from their smart devices.

The Maverick glasses are also designed with inclusivity in mind. They offer an RX solution with personalized lenses tailored to individual prescriptions, ensuring that users with varying visual needs can enjoy the benefits of smart eyewear without compromise.

Connectivity is robust, with Bluetooth 5.2 ensuring seamless pairing with a wide range of devices, including iOS and Android smartphones, as well as Apple Watch and Android Wear. This connectivity underpins the Maverick’s versatility, making it a central hub for notifications and digital interactions on the go.

In the box, users will find everything needed to start their journey with Maverick: tinted removable visors, a charging cable, a carrying case, a pouch, a cleaning cloth, and interchangeable nose pieces. Everysight is selling a developer edition of the Maverick for $399, although it’s unclear when the public rollout will begin, and what the price will be for regular consumers.

Everysight’s Maverick glasses represent a significant advancement in smart eyewear, proving that it’s possible to stay stylish while benefiting from the latest in wearable technology. They set a new benchmark for combining practicality with elegance, ensuring users can stay connected in a visually compelling, productive, and convenient manner. If these glasses could brew coffee, we might never find a reason to take them off.

The post If the Apple Vision Pro and the Google Glass had a baby, these AR glasses would be it… first appeared on Yanko Design.