Xreal Air 2 Ultra hands-on at CES 2024: Next-gen AR glasses in need of killer apps

Even though Apple didn't have an official presence at CES 2024 in Las Vegas, the trade show still had a whole area dominated by mixed reality tech. One of the most popular booths there was none other than Xreal (formerly Nreal), which decided to ride the Apple Vision Pro hype train and unveil its latest AR glasses, the Xreal Air 2 Ultra, in Las Vegas. The Chinese firm pitches its latest glasses as "an affordable alternative to" the likes of the $3,499 Vision Pro; they're currently priced at $699, a tad more than the $499 Meta Quest 3, as Xreal attempts to lure developers into its ecosystem.

Unlike the rest of the Xreal Air 2 series, the Air 2 Ultra finally brings back 6DoF (six degrees of freedom) tracking, a feature last seen on the Nreal Light. In other words, you can physically walk around a virtual space rather than being stuck in one spot. The 6DoF tracking is mainly handled by the two front-facing 3D environment sensors which, according to Xreal founder and CEO Chi Xu, are an advancement over prior models and are less physically obtrusive than the ones on the Ray-Ban Meta smart glasses. As with the Light, the Air 2 Ultra also supports hand tracking for interacting with virtual objects directly.

There's also a slight upgrade in the display department, featuring a wider 52-degree field of view — up from 46 degrees on the Air 2 and Air 2 Pro. It should otherwise be the same Sony micro OLED panels with a crisp 1080p resolution for each eye, along with a refresh rate of up to 120Hz and a brightness of up to 500 nits. Likewise, the Air 2 Ultra inherited the electrochromic dimming feature from the Pro, which offers two shade levels to minimize distraction from the outside world.

Given the extra hardware for 6DoF and hand tracking, it's no surprise that the Air 2 Ultra weighs slightly more than the Air 2 Pro (80 grams versus 75 grams), but it still looks like a pair of regular sunglasses, and I definitely wouldn't mind wearing them in public. Xreal did its best to minimize the weight gain using a titanium frame, and obviously, these glasses still require an external power source (a smartphone or a PC) via USB-C. My total hands-on time was around 20 minutes, and at no point did I feel any discomfort, though there's no telling whether that would still be the case after wearing them for the rest of the day.

A screenshot of the Xreal Air 2 Ultra's mixed reality from a third-person perspective.
Photo by Joel Chokkattu / Engadget

Given the lack of third-party apps at the moment (I do miss the Angry Birds demo on the Nreal Light), Xreal could only offer a concept demo to show off the Air 2 Ultra's 6DoF experience. This mainly involved a massive virtual desktop with multimedia players, a social media window showing the latest messages from my made-up friends and a personal 3D cinema with a library of three movies. To the left, there was a vertical slider for changing colors on a smart light bar in the real world. I was also given three AR hexagonal discs: one for toggling between a work profile and a casual profile, one for switching between the casual modes (contacts, social and movie) and one for displaying a virtual pet.

The overall room tracking worked smoothly, even as I walked up to the large 3D avatars on my right, but things got a little trickier when it came to hand interaction. The hand tracking alone seemed fine (at least according to the skeletons rendered over my hands), but I struggled to pinch the light bar's color slider — it ended up at the wrong hue on several occasions. The interaction with the AR cardboard discs was also laggy at times, though I did enjoy being able to bring my virtual pet on one of the discs up close — I couldn't pick a favorite between the dung beetle and the fiddler crab.

Xreal Air 2 Ultra
Photo by Joel Chokkattu / Engadget

In response to the technical issues I ran into, Xreal's Xu pointed out that his team had been experiencing the same since the show floor opened. Our demo unit also crashed once, but it was fine after rebooting the smartphone — an Oppo Find X5, which got worryingly warm right before it gave up. This goes to show that the Air 2 Ultra's performance is only as good as the computational device it's attached to.

Speaking of which, Xreal says these glasses are also compatible with the Samsung Galaxy S22 and S23, along with Apple's iPhone 15 and any Windows or Mac machine that can run the company's Nebula environment. Xreal is also developing a dedicated companion device to go with the Air 2 Ultra, though there's no word yet on specs or a time frame.

Xreal Air 2 Ultra
Photo by Richard Lai / Engadget

Considering the show floor hiccups, it's only fair to revisit the Xreal Air 2 Ultra in a less chaotic environment later — especially once more developers are on board after it starts shipping in March. Still, we wouldn't go as far as agreeing with Xreal's implication that its AR glasses can replace Apple's Vision Pro outright, as only the latter, like any VR headset, can offer a completely immersive experience. It'll ultimately boil down to the range of apps on each platform, but if you're looking for something you wouldn't mind wearing for prolonged periods, the Air 2 Ultra would most likely be the better choice.

This article originally appeared on Engadget at https://www.engadget.com/xreal-air-2-ultra-hands-on-next-gen-ar-glasses-in-need-of-killer-apps-203943588.html?src=rss

MouthPad turns your tongue into a mouse for your phone

You may one day be able to use your tongue as a mouse for your laptop, tablet or phone, thanks to a new product that made its first public appearance at CES 2024 in Las Vegas. The MouthPad (an obvious spin on the word "mousepad") is what its makers call a tongue-operated touchpad that "sits at the roof of your mouth" and can be connected to your devices just like a standard Bluetooth mouse. I got to see the MouthPad here in Las Vegas, though, to be clear, I did not put it in my mouth to try it for myself. Instead, I watched as the company's co-founder Tomás Vega used the device to navigate an iPhone and open the camera as we took a selfie together.

The MouthPad is basically like a retainer with a touchpad, battery and Bluetooth radio built in. It's made of a resin that the company says is the same "dental-grade material that is commonly used in dental aligners, bite guards and other oral appliances." The device's battery was made by a company called Varta, which MouthPad's makers also said has "a long track record of producing safe, medical implant-grade batteries." All this is to say that while it can feel strange to put a battery-powered electrical device in your mouth, at least it might be reassuring to know that this uses technology that has existed in the oral health industry for a long time.

I watched Vega place the 7.5-gram mouthpiece right on his palate, where it sat surrounded by his upper teeth. He closed his mouth, and the iPhone he held up showed a cursor moving around as he opened apps and menus. I asked him to open up the camera and he obliged, and we took a selfie. This was evidently not a pre-recorded demo paired with good acting.

The MouthPad, a tongue-operate controller, held up in mid-air. It's a clear dental tray with an orange touchpad in the middle and some circuitry throughout.
Photo by Cherlynn Low / Engadget

Now, because I didn't try it myself, I can't tell you if it's comfortable or easy to use. But the spec sheet states that the MouthPad is about 0.7mm (0.028 inches) thick, apart from where there are capsules, while the touchpad itself on the roof of the mouth is 5mm (0.2 inches) thick. From what I saw, it didn't look much bulkier than my own retainers, and when Vega smiled after putting the MouthPad on, I could only really see one small black piece on top of one of his teeth.

You'll have to take out the MouthPad when you're eating, but you can speak while it's in your mouth. You might have a slight lisp the way you would with regular retainers, but I could understand Vega perfectly. The company said the device currently lasts about five hours on a charge, though the team is working on improving that to eight hours by March. Recharging the device takes about an hour and a half, though Vega and his team said that, of the 30 or so people who currently have a MouthPad, most tend to charge theirs while they're eating and rarely seem to run out of juice.

The company explained that the MouthPad uses Apple's Assistive Touch feature to navigate iOS, but it can be recognized by other devices as a Bluetooth mouse. It's already on sale for those who sign up for early access, but general availability is coming later this year. Each MouthPad is individually 3D-printed, based on dental impressions sent in by customers as part of the ordering process. Early access users will also receive assistance from the company during setup and calibration, as well as throughout their use of the device.

Close up on the smile of a person wearing the MouthPad 2024. Some lines indicate the person has a clear tray over their teeth, while a small piece of equipment is on top of a tooth on the right side of their mouth.
Photo by Cherlynn Low / Engadget

Tongue-operated controllers are not new, but the MouthPad is one of the more elegant and sophisticated options to date. It also works with a wide variety of devices and seems far enough along in production to be ready for sale. Whether the human tongue is a suitable organ for computer interactions, however, is something we can only determine after long-term use in the real world.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/the-mouthpad-turns-your-tongue-into-a-mouse-for-your-phone-184541021.html?src=rss

The Morning After: I drove in Sony Honda’s EV simulator

Since Sony Honda Mobility revealed its EV collaboration at last year's CES, not much has changed externally. The Afeela EV now has a LIDAR notch above the windshield, and there have been some design refinements and tweaks, but for CES 2024, the company was trying to express exactly how all of Sony's entertainment and sensor expertise would combine with Honda's automotive know-how, and why we should care about its high-tech EV.

The Afeela will create its own noise cancellation bubble, apparently “tailoring the cabin for entertainment” using Sony’s Spatial Audio technology. According to SHM’s renderings, there appear to be roughly 30 speakers. These were put to use in one of the most surreal experiences I’ve had at CES: playing Horizon Forbidden West inside a car.

No, there isn’t a PS5 baked into this concept EV, but a demonstration involving PlayStation’s long-running Remote Play feature. Sure, the Bluetooth connection to the controller was temperamental (CES is just hundreds of Bluetooth and Wi-Fi signals clashing), but conceptually, it works. I also got to “test-drive” the Afeela through a simulator in a realistic computer-generated city, courtesy of Epic Games’ Unreal Engine 5, with even the digital wing mirrors reflecting what they would see in real life. The simulator’s dash display then offered an AR overlay, showing vehicles, objects and pedestrians, flagging nearby hazards in red.

I’m sure many wonder if SHM’s EV will ever exist as a consumer vehicle, but at CES, it’s found the perfect audience for its gadget-packed car.

— Mat Smith

You can get these reports delivered daily direct to your inbox. Subscribe right here!

The biggest stories you might have missed

GyroGlove is a hand-stabilizing glove for people with tremors

The best budget gaming laptops

Walmart makes a rare CES appearance to promote AI-powered shopping

Hertz is selling 20,000 EVs and replacing them with gas-powered vehicles

‘Teach’ your dog to ‘play’ this ‘piano’

CES is all about pet tech.


At CES, a startup showed off TheButter, a four-key instrument with light-up pads your dog can “play.” Your pet has to follow along the sequence of lights, each one triggering another few notes of whatever song you’ve equipped it with. Once done, you should reward their effort with a treat or some other form of encouragement — no, it’s not automated. TheButter is now available to buy in the US for $99, and you’ll also get the companion app to set your dog’s training routine.


What to expect from Samsung Unpacked 2024, including the Galaxy S24

CES may soon be over, but…


Samsung is running its first event of the year a little earlier than usual. It will start on January 17 at 1PM ET. We’re expecting the company to unveil its Galaxy S24 smartphone family and possibly a few more gadgets. Fortunately, thanks to leaks, we have a good idea of what to expect from the company’s latest smartphones.


Next-gen MEMS ultrasonic solid-state earbud drivers will deliver the bass

We’ve heard the difference at CES 2024.

While MEMS drivers may be the next big thing in true wireless earbuds, the first models with the solid-state components still require a hybrid setup. These products pair a MEMS speaker with a dynamic driver. The current-gen driver from xMEMS, a California-based company that develops audio components, is called Cowell, and it’s already available in earbuds from the likes of Creative. Its new driver will arrive in products in 2025. We’ve tested them out, and you can really hear the difference.


25 gadgets from CES 2024 that you can buy right now

Available, now.

In a rare twist for anyone who's followed product announcements at tech shows, a lot of the tech at CES 2024 is actually available to buy already. While it may not be a Sony Honda EV or a transparent TV, some of the latest monitors, headphones and more are already up for grabs.


This article originally appeared on Engadget at https://www.engadget.com/the-morning-after-i-drove-in-sony-hondas-ev-simulator-181231086.html?src=rss

Google removes ‘underutilized’ Assistant features to focus on ‘quality and reliability’

Google has announced that it will eliminate at least 17 features from its Assistant product, following news that it had laid off "hundreds" of employees from the division. The company is cutting "underutilized features" to "focus on quality and reliability," it wrote in a blog post, even though a good number of people may still rely on those functions.

"Beginning on January 26, when you ask for one of these features, you may get a notification that it won't be available after a certain date," wrote Google Assistant VP Duke Dukellis. 

The company didn't specify how removing certain commands will improve Assistant, nor did it describe any specific quality and reliability problems. It did say, though, that improvements in the past were aided by user feedback, so it may have been receiving complaints about Assistant's core usability of late.

The 17 functions being removed include accessing or managing your cookbook; using your voice to send an email, video or audio message; rescheduling events in Google Calendar with your voice; and using App Launcher in Google Assistant driving mode on Google Maps to read and send messages, make calls and control media. Google also describes what Assistant can still do related to those functions, or alternate ways of doing them. A list is here, though Google said these are just "some" of the affected features.

The company is also changing the way Assistant works on your phone. The microphone icon in the Google search bar will no longer pull up Assistant, but merely start a Google voice search, "which is its most popular use case," Dukellis wrote. The "Hey Google" hot word and power button long-press will continue to activate Assistant as before. 

After laying off 12,000 people last year, Google said it planned to focus on AI in the future, so it's interesting that one of its early AI products is being pruned. Earlier today, Google confirmed that it had laid off hundreds of people from at least three divisions, including Assistant, hardware devices and core engineering. 

At its October Pixel 8 event, the company announced plans to launch Assistant with Bard, a version that generates personalized answers based on events, dates and conversations stored on your phone. However, Google didn't say if that version has anything to do with cutbacks in current Assistant functionality. 

This article originally appeared on Engadget at https://www.engadget.com/google-removes-underutilized-assistant-features-to-focus-on-quality-and-reliability-141141513.html?src=rss

What to expect from Samsung Unpacked 2024, including the Galaxy S24 smartphone launch

CES 2024 is in the books and that means the tech world can kick back and re— oh, wait, there's the small matter of a Samsung Unpacked on the horizon. Samsung is running its first event of the year a little earlier than usual: it will start on January 17 at 1PM ET. Barring a major shock, Samsung will unveil its Galaxy S24 smartphones.

Samsung Galaxy S24 lineup

As is always the case, the rumor mill has been churning for weeks when it comes to Samsung’s Galaxy S24 smartphones. Thanks to leaker Evan Blass, who claimed to have obtained a spec sheet for all three of the devices, we have a decent idea of what Samsung has in store for the Galaxy S24, S24+ and S24 Ultra. As in years past, Samsung has a "reserve" page up now on its site as well for those who want to be first in line to buy the latest smartphones.

It’s likely to be another year of iterative changes on the hardware front. There will very likely be spec bumps to most of the components and the S24 devices will probably offer faster and more efficient performance than their predecessors. However, you shouldn't anticipate having a wildly different looking phone if you tend to upgrade to the latest handset every year or two... except in the case of the Galaxy S24 Ultra, which is slated to have a flat display and a titanium frame.

The most important hardware upgrade is arguably in the engine room. Samsung is expected to employ the Qualcomm Snapdragon 8 Gen 3 chipset. That’s significant given Qualcomm’s efforts to support on-device AI operations with its CPUs and Samsung’s recent work in the generative AI space.

To that end, the biggest change to the Galaxy lineup this year is likely to come in the form of AI features. Samsung recently unveiled its own generative AI models, which can handle tasks such as translations, summarizing documents, drafting emails, helping out with coding and, yes, whipping up images based on text prompts. 

It's widely believed that Samsung's Gauss generative AI tech will make its public debut in the Galaxy S24 smartphones, and it's likely to be labeled as Galaxy AI. The company has been hinting at some of the AI updates, such as with this tease of a feature called Zoom with Galaxy AI.

Everything else: Generative AI, fitness trackers and laptops

The new smartphones will undoubtedly be the star of the show and Samsung will probably spend quite a bit of time going over the generative AI functions. That might not leave much bandwidth for other announcements. There is a chance that we might see the Galaxy Fit 3 fitness tracker, according to some rumormongers. The event may mark the release of One UI 6.1 for Galaxy devices too.

Beyond that, there have been suggestions that Samsung will show off several Galaxy Book Pro laptops, while there's also a possibility the company will unveil new tablets, smartwatches and earbuds. We'll find out soon enough just what the company has planned for its first mass market devices of 2024.

This article originally appeared on Engadget at https://www.engadget.com/what-to-expect-from-samsung-unpacked-2024-including-the-galaxy-s24-smartphone-launch-140010394.html?src=rss

Leak suggests Sony may soon offer a DualSense V2 controller with 12 hours of battery

Sony might have an updated PlayStation 5 controller available soon. GamesRadar+ spotted a brand new V2 DualSense Wireless Controller on Best Buy's Canadian online shop, and anyone who is sick of finding their wireless controller dead when gaming is in for a treat. The device is listed as having 12 hours of battery life — up from a maximum of five hours in its current iteration. 

Apart from the major boost in battery life, the listed Sony V2 DualSense Wireless Controller is pretty much a mirror of its predecessor. It has a headset jack, built-in microphone, and haptic triggers across its rear and face. It's listed for 90 CAD (about 67 USD), almost identical to the previously released V2 DualSense Wireless Controller's $69 retail price. Though the 12-hour model is visible on the website, it's not actually available for purchase, so it's unclear when (or even if) Sony will release it. 

Interestingly, a patent filed by Sony in November 2023 described a new controller outfitted with a touchscreen instead of a touchpad. It also potentially employs predictive AI assistance to light up certain buttons, analog sticks and shoulder triggers as hints for gameplay. Just like the controller currently sitting on Best Buy's website, this one's fate is up in the air. 

This article originally appeared on Engadget at https://www.engadget.com/leak-suggests-sony-may-soon-offer-a-dualsense-v2-controller-with-12-hours-of-battery-115527945.html?src=rss

SpaceX and T-Mobile send the first text messages from orbiting Starlink satellites

SpaceX has sent and received its first text messages via T-Mobile using its D2D (direct-to-device) Starlink satellites launched just over a week ago, the company announced. First revealed in August 2022, the project aims to provide satellite internet connectivity to regular cell phones so that T-Mobile customers can stay online even when they're in a terrestrial dead zone. 

T-Mobile said that it aims to publicly launch text services in 2024, with voice, data and IoT (internet of things) plans coming in 2025. Globally, SpaceX has partnered with Rogers in Canada, Australia's Optus, KDDI in Japan and others. 

The scheme requires larger, special versions of the Starlink satellites with D2D capability. SpaceX launched the first six of those on January 2, completing early tests with no issues. "On Monday, January 8, less than 6 days after launch, we sent and received our first text messages to and from unmodified cell phones on the ground to our new satellites in space using T-Mobile network spectrum... [indicating that] the system works," SpaceX wrote in a blog post. 

Starlink direct-to-device (D2D) satellite
SpaceX

When the plan was announced, T-Mobile CEO Mike Sievert said the technology is like putting a cellular tower in the sky. He added that it could one day eliminate dead zones, allowing people to easily get in touch with loved ones even if they're in the middle of the ocean. 

SpaceX said that the system, which uses LTE/4G (not 5G) protocols, is a bit more complicated than cell towers in the sky, though. Since the satellites move at tens of thousands of miles per hour relative to the Earth, data must be handed off seamlessly between them. Doppler shift, timing delays and the relatively low transmission power of smartphones must also be accounted for. 
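To get a rough sense of why Doppler shift matters here, a back-of-the-envelope calculation helps. The numbers below are illustrative assumptions, not figures from SpaceX or T-Mobile: a typical low-Earth-orbit speed of about 7.5 km/s (roughly 16,800 mph) and a mid-band 1.9 GHz LTE carrier.

```python
# Back-of-the-envelope Doppler estimate for a direct-to-device LEO link.
# Assumed numbers (not from the article): ~7.5 km/s orbital speed, typical
# of low Earth orbit, and a hypothetical 1.9 GHz LTE carrier frequency.

C = 299_792_458.0  # speed of light, m/s

def doppler_shift_hz(carrier_hz: float, radial_velocity_ms: float) -> float:
    """First-order Doppler shift for a transmitter moving at radial_velocity_ms."""
    return carrier_hz * radial_velocity_ms / C

orbital_speed = 7_500.0  # m/s, roughly 16,800 mph
carrier = 1.9e9          # Hz, a mid-band LTE frequency

worst_case = doppler_shift_hz(carrier, orbital_speed)
print(f"Worst-case Doppler shift: ~{worst_case / 1e3:.0f} kHz")
```

Under those assumptions the shift works out to tens of kilohertz, which dwarfs LTE's 15 kHz subcarrier spacing — hence the need for continuous compensation on the satellite side.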

The two companies aren't the first to test such a system. Working with communications specialist AST SpaceMobile, AT&T successfully conducted the first two-way satellite audio call on its network in April, calling a number in Japan with a stock Samsung Galaxy S22 smartphone. AT&T also complained to the FCC that SpaceX and T-Mobile's plan was "woefully insufficient" regarding the risk of harmful interference to ground-based networks. 

This article originally appeared on Engadget at https://www.engadget.com/spacex-and-t-mobile-send-the-first-text-messages-from-orbiting-starlink-satellites-103526219.html?src=rss

Everything you missed at CES 2024 Day 2 on the show floor in Las Vegas: AI, trending gadgets and more

The show floor at CES 2024 opened on Tuesday, and people have been racking up their steps, canvassing Las Vegas’ vast convention centers and hotel ballrooms to see all the latest and weirdest tech products. The Engadget team has been getting our cardio in, braving both vehicular and human traffic to get face and hand time (and other body parts?) with the most intriguing demos here, while companies haven’t stopped holding press conferences and announcing new items. If you don’t have time to parse through every individual headline or are here in Vegas and want to know where to go, here’s a recap of the biggest news out of CES 2024’s second day.

One of the biggest booths at the show is, as usual, Google, and the company also had a fair amount of news to share. In keeping with the same theme it’s been doing the last few years of “Better Together,” Google shared updates to its inter-device software like Fast Pair and announced it’s working with Samsung to integrate and rename its Nearby Share feature to Quick Share, which is the current name of Samsung’s version of the same thing. This should hopefully simplify things for Android users, and give them a more cohesive alternative to Apple’s AirDrop. Details were pretty scarce on whether there are changes coming to Samsung users, but those who have Nearby Share should see a new icon pretty soon. 

Google also added support for people to Chromecast TikTok videos to compatible TVs and screens and is bringing its apps to some Ford, Nissan and Lincoln vehicles later this year. Android Auto will also be able to share your electric vehicle’s battery levels to Google Maps so it can factor in recharge stations, charge times and stops into your routes. This is, again, similar to a feature in Apple's next-gen CarPlay.

Speaking of EVs, Honda also debuted new EV concepts called the Saloon and the Space Hub. The Saloon is a sedan with an aerodynamic design that rides low to the ground, while the Space Hub is a boxier minivan whose seats have passengers facing each other. Honda said it will develop a model based on the Saloon concept for North American markets in 2026, with no word yet on the Space Hub.

In other transportation news, Hyundai brought an updated version of its S-A2 Air Taxi to the show. The S-A2 is an electric vertical takeoff and landing (eVTOL) vehicle with a cruising speed of 120mph once it reaches an altitude of 1,500 feet. It's designed to fly short trips of 25 to 40 miles, and the company envisions it as an everyday transportation solution for urban areas.

We also got more smart home news from companies other than Google, including Amazon, which said it will adopt the Matter standard for Casting, but it won’t support Chromecast or Apple’s AirPlay. How nice. We also saw new face-scanning and palm-reading door locks, smart outdoor lights by Nanoleaf and a new Weber Searwood smart grill that’s cheaper and more versatile.

There has been a smattering of mobile news, including the Clicks iPhone keyboard case and a surprising, adorable device called the Rabbit R1. It's pitched as an AI-powered assistant in what's basically a cute, squarish walkie-talkie co-designed by Teenage Engineering. It has a tiny 2.88-inch touchscreen, an analog scroll wheel, two mics, a speaker and a 360-degree camera you can spin to face toward you or out the back of the handset. You're supposed to talk to the Rabbit AI by pushing down a button (like a walkie-talkie) and asking it to do anything from booking an Uber to looking for a recipe tailored to your specific list of ingredients.

There’s been a lot more at the show, but I wanted to take some time to shout out a bunch of intriguing accessibility products. We saw the OrCam Hear system that’s designed to help people with hearing loss isolate the voices of specific speakers in crowded environments. There’s also the GyroGlove, which is a hand-stabilizing glove for people with hand tremors, as well as the Mouthpad, which lets you control your phone, tablet or laptop by using your tongue.

We also saw an update to the Audio Radar system that provides visual cues for gamers who are hard of hearing to see where sounds are coming from and what type of sounds they might be. It’s very heartening to see all this development in assistive technology at CES, especially when the industry often spends so much time and money on less-worthy endeavors.

We’re nearing the end of the show and as we get ready to do our final sweeps of the show floor, the Engadget team is also looking back and contemplating the best things we saw at CES 2024. We’ll be putting together our Best of CES awards list soon, so make sure you come back to see what we decided were the winners of the show.


This article originally appeared on Engadget at https://www.engadget.com/ces-2024-day-2-recap-a-wild-rabbit-gadget-appears-while-google-offers-its-own-take-on-apple-software-tricks-022245111.html?src=rss

Next-gen MEMS ultrasonic solid-state earbud drivers will deliver the bass

While MEMS drivers may be the next big thing in true wireless earbuds, the first models with the solid-state components still require a hybrid setup. These products pair a MEMS speaker with a dynamic driver to ensure proper bass performance. The current-gen driver from xMEMS, a California-based company that develops the audio components, is called Cowell, and it's already available in earbuds from the likes of Creative and Noble Audio.

The next-gen MEMS driver is called Cypress, and while it won't arrive in new products until 2025, I got a chance to hear the difference between it and Cowell at CES 2024 here in Las Vegas — and it's quite striking. With Cowell, there's bass, but it's subdued and the emphasis is on the highs and the mids. It sounds great, on both complete products and reference designs, offering punchy highs, full mids and great clarity. With Cypress alone, though, there's a blanket of warm, bassy low-end that really fills out the soundstage. It will be a massive improvement for what MEMS drivers are capable of doing for wireless earbuds. 

"We moved to a sound from ultrasound principle where we have ultrasonic modulation and demodulation to deliver 30 to 40 times greater low-frequency pressure for anti-noise generation for ANC earbuds, while still delivering all of the benefits of our solid state speakers," xMEMS vice president of marketing Mike Housholder explained. "Wide dynamic range, with excellent low-frequency performance for deep bass and noise cancellation."

xMEMS Cypress reference design with external electronics
Photo by Billy Steele/Engadget

Indeed, that 30 to 40 times louder bass response was clearly evident on a Cypress reference design. The prototype was built to showcase the MEMS drivers on their own, without the secondary dynamic driver today's true wireless models require for bass. The result is the pristine clarity you'd expect from a set of high-end wireless headphones or even some audiophile-grade cans. The additional bass isn't loud and boomy; instead it's warm and full, inviting you to stay and listen a while. And that I did: I had a hard time putting the Cypress prototype down even when I felt I'd overstayed my welcome. 

On the whole, MEMS drivers offer a host of benefits over coil speakers that should all lead to better audio quality in your earbuds. They're more efficient in terms of mechanical response, with faster speeds there contributing to increased detail and clarity — something I certainly noticed on the Noble Audio FoKus Triumph wireless earbuds. This model pairs Cowell with a 6.5mm dynamic driver, but the boost in fidelity in the mids and highs is apparent. And getting a set of earbuds with MEMS drivers doesn't mean you'll pay more. The two models Creative has already debuted are $130 and $150. The same will be true for upcoming products with the ultrasonic Cypress drivers, according to Housholder. 

"We see ourselves going to market first in flagship products," he said. "As with our current products, we really see the sweet spot for our products anywhere 150 and up, [which] is easily achievable day one. And then over time and over volume, getting down to that $100 price point."

Various MEMS drivers for IEMs and wireless earbuds
Photo by Billy Steele/Engadget

And that's really the big takeaway for me. For years, companies have offered true wireless earbuds with some of the features of more premium flagship models, but usually without the sonic performance of pricier options. With MEMS drivers, audio quality is greatly improved in affordable models that cost half the price of top-of-the-line Sony or Sennheiser noise-canceling earbuds. And with Cypress, xMEMS can offer audio companies the ability to improve overall sound quality without having to raise prices.

xMEMS has also developed what it calls a DynamicVent to relieve occlusion in sleep earbuds. The component can automatically open or close depending on whether the buds detect ambient noise, like a snoring spouse. When open, the DynamicVent offers a semi-open fit like AirPods; when closed, the ear is completely sealed off. The open vent should also keep the sounds of your own breathing, or of the earbuds rubbing against a pillow, from disturbing your sleep. xMEMS is showing off the DynamicVent at CES in a set of reference sleep buds equipped with its Cowell MEMS drivers.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/next-gen-mems-ultrasonic-solid-state-earbud-drivers-will-deliver-the-bass-214131547.html?src=rss

I played Horizon Forbidden West inside Sony and Honda’s Afeela concept EV at CES 2024

A year since Sony Honda Mobility (SHM) announced its debut EV concept, the Afeela, the company is back at CES 2024 in Las Vegas to offer more details, more collaborations and a driving simulator.

The name of the concept vehicle hasn’t changed since last we saw it. What is new, however, is the car's ability to be driven around with a PlayStation controller. I didn’t get to do that — it was a stunt operated by one of the company's employees — but there was a DualSense controller involved in my demo.

So let’s begin where SHM left off. At CES 2023, Sony revealed the Afeela concept EV, which packed in 45 cameras and an expansive “media bar” that spread across the vehicle's dash, showing a mix of car information, navigation, music players and more. The steering wheel was redesigned as a yoke so the driver could better view that sumptuous dash. The company also teased some mixed-reality tricks in collaboration with Epic Games.

The Afeela EV itself looks mostly the same as the prototype from last year, although it now has a substantial LiDAR bar above the windscreen that looks like a giant smartphone notch. The company says that the car will be available for pre-order in 2025 before going on sale in the US the following year.

At CES 2024, I got to step inside an Afeela, while an SHM representative gave me a tour of everything that’s so far been crammed into this concept vehicle.

Combining both Sony and Honda’s expertise, the Afeela will create its own noise cancellation bubble, apparently “tailoring the cabin for entertainment” using Sony’s Spatial Audio technology. According to SHM’s renders, there appear to be roughly 30 speakers, although that’s more than likely to change as the concept further evolves. A spokesperson added that over 42 sensors grace the Afeela’s initial spec sheet.

Sony Honda's Afeela EV hands-on at CES 2024
Photo by Mat Smith/Engadget

In one of the most surreal experiences I’ve had at CES, I also got to play Horizon Forbidden West on the Afeela's expansive dashboard display. No, there isn’t a PS5 baked into this concept EV (why not, though?) — it was a demonstration of PlayStation’s long-running Remote Play feature. Sure, the Bluetooth connection to the controller was temperamental (CES is just hundreds of Bluetooth and Wi-Fi signals clashing), but conceptually, you get that it’s possible. The two screens for rear passengers would also be able to display games, movies and more; however, they were just dummy screens in this demo car.

SHM also announced during Sony’s CES show that it’s already teamed up with Microsoft to use its Azure OpenAI technology to create a “Mobility Personal Agent” — a conversational in-car virtual assistant for drivers and passengers alike.

It’s also working with Polyphony Digital, the company behind Gran Turismo, on a nebulous goal of developing vehicles that “fuse the virtual and the real, mainly in the area of human senses.” For now, that collaboration has resulted in an Afeela you can drive in Gran Turismo.

But it’s the new dash, combined with AR graphics overlays and that LiDAR notch, which intrigues me most. The EV will draw information and imagery from its sensors and create 3D models of the outside world. This can be used for frivolous things, like Godzilla-styled monsters on your dashcam feed and augmented reality games. Or more simply, rich, detailed overlays for navigation to nearby businesses and destinations.

While we weren’t driving the Afeela EV around Las Vegas, SHM tried to offer the next best thing: a driving simulator made in collaboration with Epic Games (and what appears to be that Unreal Engine 5 Matrix demo). Steering around the virtual world in an Afeela cockpit (in a moody black colorway, different from the light gray showroom car I sat in earlier), the digital wing mirrors also displayed the same highly realistic 3D world. The dash display then offered an AR overlay, showing vehicles, objects and pedestrians, and flagging nearby hazards in red.

Sony Honda's Afeela EV hands-on at CES 2024

SHM is still putting a lot of its focus on developing its autonomous driving technology and advanced driver assistance systems — the latter being non-autonomous helper features, similar to Tesla’s Autopilot. With Qualcomm’s Snapdragon Ride SoC powering the concept vehicle’s advanced driving features, the Afeela could reach limited Level 3 autonomous driving capability, at which point (and we’re not there yet) a vehicle can manage most aspects of driving without any human intervention. A spokesperson added it would be capable of Level 2 Plus autonomous driving in urban settings. SHM also teased traffic monitoring through the sensors, detection of objects like traffic cones and, apparently, a so-called Vision Transformer that will detect environmental characteristics “in a broader perspective” — which could translate into predicting traffic jams before you meet them, or suggesting alternative routes.

The car will also put all those sensors to more frivolous use, detecting a driver's approach and opening the door for them. The same sensors, including LiDAR and cameras, will guide the Afeela as it parks itself.

Many of us still wonder if SHM’s EV will ever exist as a consumer vehicle. The commitment to getting its car on roads by 2026 is still there, and while CES may be the perfect audience for the hype being served up, will car buyers think the same?


This article originally appeared on Engadget at https://www.engadget.com/i-played-horizon-zero-dawn-inside-sony-and-hondas-afeela-concept-ev-at-ces-2024-205902922.html?src=rss