Galaxy S24 and S24 Plus hands-on: Samsung’s AI phones are here, but with mixed results

I’ve never thought of Samsung as a software company, let alone as a name to pay attention to in the AI race. But with the launch of the Galaxy S24 series today, the company is eager to have us associate it with the year’s hottest tech trend. The new flagship phones look largely the same as last year’s models, but on the inside, change is afoot. At a hands-on session during CES 2024 in Las Vegas last week, I focused on checking out the new software on the Galaxy S24 and S24 Plus.

Thanks to a new Snapdragon 8 Gen 3 processor (in the US) customized “for Galaxy,” the S24 series is capable of a handful of new AI-powered tasks that seem very familiar. In fact, if you’ve used Microsoft’s Copilot, Google’s Bard or ChatGPT, a lot of these tools won’t feel new. What is new is the fact that they’re showing up on the S24s, with most of the processing handled on-device by Samsung’s recently announced Gauss generative AI model, which the company has been quietly building out.

Samsung’s Galaxy AI features on the S24

There are five main areas where generative AI is making a big difference in the Galaxy S24 lineup — search, translation, note creation, message composition, and photo editing and processing. Aside from the notes and composition features, most of these updates seem like versions of existing Google products. In fact, the new Circle to Search feature is a Google service that is debuting on the S24 series, in addition to the Pixel 8 and Pixel 8 Pro.

Circle to Search

With Circle to Search, you basically long-press the middle of the screen’s bottom edge, the Google logo and a search bar pop up, and you can draw a ring around anything on the display. Well, almost anything. DRM-protected content or things shielded from screenshots, like your banking app, are off limits. Once you’ve made your selection, a panel slides up showing it alongside results from Google’s Search Generative Experience (SGE).

You can scroll down to see image matches, followed by shopping, text, website and other types of listings that SGE thought were relevant. I circled the Samsung clock widget, a picture of beef wellington and a lemon, and each time I got pretty accurate results. I was also impressed by how quickly Google correctly identified a grill that I circled in an Engadget article featuring a Weber Searwood, especially since the grill in that picture was shot at an off angle.

This is basically image search via Google or Lens, except it saves you from having to open another app (and take screenshots). You’ll be able to circle items in YouTube videos, your friend’s Instagram Stories and, let’s be honest, ads. Though I was intrigued by the feature and its accuracy, I’m not sure how often I’d use it in the real world. The long-press gesture to launch Circle to Search works whether you use gesture-based navigation or the three-button layout. The latter might be slightly confusing, since you’re essentially holding your finger down on the home button, but not exactly on it.

Circle to Search is launching on January 31st, and though it’s reserved for the Galaxy S24s and Pixel 8s for now, it’s not clear whether older devices might get the feature.

Chat Assist to tweak the tone of your messages

The rest of Samsung’s AI features are actually powered by the company’s own language models, not Google’s. This part is worth making clear, because when you use the S24 to translate a message from, say, Portuguese to Mandarin, you’ll be using Samsung’s database, not Google’s. I really just want you to direct your anger at the right target when something inevitably goes wrong.

I will say, I was a little worried when I first heard about Samsung’s new Chat Assist feature. It uses generative AI to help reword a message you’ve composed to change up the tone. Say you’re in a hurry, firing off a reply to a friend who you know can get anxious and misinterpret texts. The S24 can take your sentences, like “On my way back now what do you need,” and make them less curt. The options I saw were “casual,” “emojify,” “polite,” “professional” and “social,” the last being a hashtag-filled caption presumably for your social media posts.

I typed “Hey there. Where can I get some delicious barbecue? Also, how are you?” Then I tapped the AI icon above the keyboard and selected the “Writing Style” option. After about one or two seconds, the system returned variations of what I wrote.

At the top of the results was my original, followed by the Professional version, which I honestly found hilarious. It said “Hello, I would like to inquire about the availability of delectable barbecue options in the vicinity. Additionally, I hope this message finds you well. Thank you for your attention to this matter.”

It reminded me of an episode of Friends where Joey uses a thesaurus to sound smarter. Samsung’s AI seems to have simply replaced every word with a slightly bigger word, while also adding some formal greetings. I don’t think “inquire about the availability of delectable barbecue options in the vicinity” is anything a human would write.

That said, the casual option was a fairly competent rewording of what I’d written, as was the polite version. I cannot imagine a scenario where I’d pick the “emojify” option, except for the sake of novelty. And while the social option pained me to read, at least the hashtags of #Foodie and #BBQLover seemed appropriate.
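
Samsung hasn’t said exactly how Chat Assist works under the hood, but tone rewriting like this is conventionally a single instruction-following call to a language model. Here’s a minimal sketch of that general pattern; the tone names come from my demo, while the prompt wording and the `generate` stub are my own assumptions, not Samsung’s implementation:

```python
# A minimal sketch of prompt-based tone rewriting, the general technique
# behind features like Chat Assist. Everything here is illustrative: the
# `generate` stub stands in for whatever on-device model the phone calls.

TONES = ("casual", "emojify", "polite", "professional", "social")

def generate(prompt: str) -> str:
    """Stub for an on-device language model; a real implementation
    would invoke an LLM runtime here."""
    return f"<model output for prompt: {prompt[:48]}...>"

def rewrite(message: str, tone: str) -> str:
    if tone not in TONES:
        raise ValueError(f"unsupported tone: {tone}")
    prompt = (
        f"Rewrite the following message in a {tone} tone. "
        "Preserve the meaning and reply with only the rewritten text.\n\n"
        f"Message: {message}"
    )
    return generate(prompt)

print(rewrite("Hey there. Where can I get some delicious barbecue?", "professional"))
```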

Samsung Translate

You can also use Samsung’s AI to translate messages into one of 13 languages in real time, which is fairly similar to a feature Google launched on the Pixel 6 in 2021. The S24’s interface looks reminiscent of the Pixel’s, too, with both offering two text input fields. Like Google, Samsung also has a field at the top for you to select your target language, though the system is capable of automatically recognizing the language being used. I never got to test this in a foreign language that I understand, though, so I have no real way of confirming how accurate the S24’s Portuguese translations were.

Samsung’s translation engine is also used for a new feature called Live Translate, which basically acts as an interpreter for you during phone calls made via the native dialer app. I tried this by calling one of a few actors Samsung had on standby, masquerading as managers of foreign-language hotels or restaurants. After I dialed the number and turned on the Live Translate option, Samsung’s AI read out a brief disclaimer explaining to the “manager at a Spanish restaurant” that I was using a computerized system for translation. Then, when I said “Hello,” I heard a disembodied voice say “Hola” a few seconds later.

The lag was pretty bad and it threw off the cadence of my demo, as the person on the other end of the call clearly understood English and would answer in Spanish before my translated request was even sent over. So instead of:

Me: Can I make a reservation please?

S24: … ¿Puedo hacer una reserva por favor?

Restaurant: Si, cuantas personas y a que hora?

S24 (to me): … Yes, for how many people and at what time?

My demo actually went:

Me: Can I make a reservation please?

[pause]

Restaurant: Si, cuantas personas y a que hora?

S24: ¿Puedo hacer una reserva por favor?

[pause]

S24 (to me): Yes, for how many people and at what time?

It was slightly confusing. Do I think this is representative of all Live Translate calls in the real world? No, but Samsung will need to work on cutting down the lag if it wants this feature to be helpful rather than confusing.
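
Samsung hasn’t detailed Live Translate’s internals, but an on-call interpreter is conventionally a three-stage pipeline (speech recognition, machine translation, speech synthesis), and every stage adds delay. This toy sketch, with stub functions and made-up latencies standing in for the real on-device models, shows how the per-turn lag accumulates:

```python
import time

# Stub stages with made-up latencies; real on-device models replace these.
def transcribe(audio: str) -> str:  # speech-to-text
    time.sleep(0.5)
    return audio

def translate(text: str, src: str, dst: str) -> str:  # machine translation
    time.sleep(0.5)
    return f"[{src}->{dst}] {text}"

def synthesize(text: str) -> str:  # text-to-speech
    time.sleep(0.5)
    return f"<spoken: {text}>"

def interpret_turn(audio: str, src: str, dst: str) -> str:
    """One conversational turn: the stages run strictly in sequence,
    so their latencies add up before the other party hears anything."""
    start = time.time()
    spoken = synthesize(translate(transcribe(audio), src, dst))
    print(f"turn delivered after {time.time() - start:.1f}s: {spoken}")
    return spoken

interpret_turn("Can I make a reservation please?", "en", "es")
```

If that per-turn delay is longer than the other person’s natural response time, replies and translations interleave out of order, which is exactly what happened in my demo.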

Galaxy AI reorganizing your notes

I was most taken by what Samsung’s AI can do in its Notes app, which historically has had some pretty impressive handwriting recognition and indexing. With the AI’s assistance, you can quickly reformat large blocks of text into easy-to-read headers, paragraphs and bullets. You can also swipe sideways to see different themes, with various colors and font styles.

Notes can also generate summaries for you, though most of the summaries on the demo units didn’t appear very astute or coherent. After it auto-formatted a note titled “An Exploration of the Celestial Bodies in Our Solar System,” the first section was aptly titled “Introduction,” but the first bullet point under that was, confusingly, “The Solar System.” The second bullet point was two sentences, starting with “The Solar System is filled with an array of celestial bodies.”

Samsung also borrowed another feature from the Pixel ecosystem, using its speech-to-text software to transcribe, summarize and translate recordings. The transcription of my short monologue was accurate enough, but the speaker labels weren’t. Summaries of the transcriptions were similar to those in Notes, in that they’re not quite what I’d personally highlight.

[Photo: The Galaxy S24 held in mid-air, with the viewfinder of its camera app showing on the screen. Sam Rutherford / Engadget]

That’s already a lot to cover, and I haven’t even gotten to the photo editing updates yet. My colleague Sam Rutherford goes into a lot more detail on those in his hands-on with the Galaxy S24 Ultra, which has the more sophisticated camera system. In short, though, Samsung offers edit suggestions, generative background filling and an instant slow-mo tool that fills in frames when you choose to slow down a video.

Samsung Galaxy S24 and S24 Plus hardware updates

That brings me to the hardware. On the regular Galaxy S24 and S24 Plus, you’ll be getting a 50-megapixel main sensor, a 12MP ultra-wide camera and a 10MP telephoto lens with 3x optical zoom. Up front is a 12MP selfie camera. So, basically, the same setup as last year. The S24 has a 6.2-inch Full HD+ screen, while the S24 Plus sports a 6.7-inch Quad HD+ panel, and both offer adaptive refresh rates that can go between 1Hz and 120Hz. In the US, all three S24 models use a Snapdragon 8 Gen 3 for Galaxy processor, with the base S24 starting out with 8GB of RAM and 128GB of storage. Both the S24 and S24 Plus have slightly larger batteries than their predecessors, with their respective 4,000mAh and 4,900mAh cells coming in 100mAh and 200mAh bigger than before.

Though the S24s look very similar to last year’s S23s, my first thought on seeing them was how much they looked like iPhones. That’s neither a compliment nor an indictment. And to be clear, I’m only talking about the S24 and S24 Plus, not the Ultra, which still has the distinctive look of a Note.

[Photo: Four Galaxy S24 handsets in white, cream, black and purple, laid on a table with their rear cameras facing up. Sam Rutherford / Engadget]

It feels like Samsung spent so much time upgrading the software and joining the AI race this year that it completely overlooked the S24’s design. And unlike the latest iPhones, the S24s are missing support for the newer Qi2 wireless charging standard, which includes magnetic attachment, a la Apple’s MagSafe.

Wrap-up

I know it’s just marketing-speak and empty catchphrases, but I’m very much over Samsung’s use of what it thinks is trendy to appeal to people. Don’t forget, this is the company that had an “Awesome Unpacked” event in 2021 filled to the brim with cringeworthy moments and an embarrassingly large number of utterances of the words “squad” and “iconic”.

That doesn’t mean what Samsung’s done with the Galaxy S24 series is completely meaningless. Some of these features could genuinely be useful, like summarizing transcriptions or translating messages in foreign languages. But after watching the company follow trend after trend (like introducing Bixby after the rise of digital assistants, or bringing scene optimizers to its camera app after Chinese phone makers did), launching generative AI features feels hauntingly familiar. My annoyance at Samsung’s penchant for #trendy #hashtags aside, the bigger issue is that if the company is simply jumping on a fad instead of thoughtfully developing meaningful features, consumers risk losing support for these tools down the road. Just look at what happened to Bixby.

Apple Vision Pro hands-on, redux: Immersive Video, Disney+ app, floating keyboard and a little screaming

With pre-orders for the Apple Vision Pro headset opening this week, the company is getting ready to launch one of its most significant products ever. This morning, it announced an “entertainment format pioneered by Apple” called Apple Immersive Video, as well as new viewing environments in the Disney+ app featuring scenes from the studio’s beloved franchises like the Avengers and Star Wars.

We already went hands-on once back at WWDC when the headset was first announced, but two of our editors, Dana Wollman and Cherlynn Low, had a chance to go back and revisit the device (and in Dana’s case, experience it for the first time). Since we already walked you through some of the basic UI elements in our earlier piece, we decided to focus on some of the more recently added features, including Apple Immersive Video, the new Disney+ environments, the built-in “Encounter Dinosaurs” experience and the floating keyboard, which didn’t work for us when we first tried the device in June of last year. Here, too, we wanted to get at what it actually feels like to use the device, from the frustrating to the joyful to the unintentionally eerie. (Yes, there was a tear, and also some screaming.)

Fit, comfort and strap options

Cherlynn: The best heads-up display in the world will be useless if it can’t be worn for a long time, so comfort is a crucial factor in the Apple Vision Pro’s appeal. This is also a very personal factor with a lot of variability between individual users. I have what has been described as a larger-than-usual head, and a generous amount of hair that is usually flat-ironed. This means that any headgear I put on tends to slip, especially if the band is elastic.

Like the version that our colleague Devindra Hardawar saw at WWDC last year, the Vision Pro unit I tried on today came with a single strap that stretches around the back of your head. It was wide, ridged and soft, and at first I thought it would be very comfortable. But 15 minutes into my experience, I started to feel weighed down by the device, and five minutes after that, I was in pain. To be fair, I should have flagged my discomfort to Apple earlier, and alternative straps were available for me to swap in, but I wanted to avoid wasting time. When I finally told the company’s staff about my issues, they changed the strap to one with two loops, one of which went over the top of my head.

[Photo: A woman with dark hair wearing the Apple Vision Pro headset, sitting back on a gray couch. Apple]

Dana: The fitting took just long enough — required just enough tweaking — that I worried for a minute that I was doing it wrong, or that I somehow had the world’s one unfittable head. First, I struggled to get the lettering to look sharp. It was like sitting at an optometrist's office, trying out a lens that was just slightly too blurry for me. Tightening the straps helped me get the text as crisp as it needed to be, but that left my nose feeling pinched. The solution was swapping out the seal cushion for the lighter of the two options. (There are two straps included in the box, as well as two cushions.) With those two tweaks — the Dual Loop Band and the light seal cushion — I finally felt at ease.

Cherlynn: Yep, that Dual Loop band felt much better for weight distribution, and it didn’t keep slipping down my hair. It’s worth pointing out that Apple did first perform a scan to determine my strap size, and they chose the Medium for me. I also had to keep turning a dial on the back right to make everything feel more snug, so I had some control over how tightly the device sat. Basically, you’ll have quite a lot of options to adapt the Vision Pro to your head.

Apple Immersive Video and spatial videos

Dana: Sitting up close in the center of Apple Immersive and spatial videos reminded me of Jimmy Stewart’s character in It’s A Wonderful Life: I was both an insider and an outsider at the same time. In one demo, we saw Alicia Keys giving the most special of performances: just for us, in a living room. In a different series of videos — these meant to demonstrate spatial video — we saw a family at mealtime, and a mother and daughter outside, playing with bubbles.

As I watched these clips, particularly the family home videos that reminded me of my own toddler, I felt immersed, yes, but also excluded; no one in the videos sees you or interacts with you, obviously. You are a ghost. I imagined myself years from now, peering in from the future on bygone videos of my daughter, and felt verklempt. I did not expect to get teary-eyed during a routine Apple briefing.

Cherlynn: The Immersive Video part of my demo was near the end, by which point I had already been overwhelmed by the entire experience and did not quite know what more to expect. The trailer kicked off with Alicia Keys singing in my face, which I enjoyed. But I was more surprised by the kids playing soccer with some rhinos on the field, and when the animals charged towards me, I physically recoiled. I loved seeing the texture of their skin and the dirt on the surface, and was also impressed when I saw the reflection of an Apple logo on the surface of a lake at the end. I didn’t have the same emotional experience that Dana did, but I can see how it would evoke some strong feelings.

Disney+ app

Dana: Apple was very careful to note that the version of the Disney+ app we were using was in beta; a work in progress. But what we saw was still impressive. Think of it like playing a video game: Before you select your race course, say, you get to choose your player. In this case, your “player” is your background. Do you want to sit on a rooftop from a Marvel movie? The desert of Tatooine? Make yourself comfortable in whatever setting tickles your fancy, and then decide if you actually want to be watching Loki or Goosebumps in your Star Wars wasteland. It’s not enough to call it immersive. In some of these “outdoor” environments in particular, it’s like attending a Disney-themed drive-in. Credit to Disney: They both understand – and respect – their obsessive fans. They know their audience.

Cherlynn: As a big Marvel fangirl, I really geeked out when the Avengers Tower environment came on. I looked around and saw all kinds of Easter eggs, including a takeout container from Shawarma Grill on the table next to me. It feels a little silly to gush about the realism of the images, but I saw no pixels. Instead, I looked at a little handwritten note that Tony Stark had clearly left behind and felt like I was almost able to pick it up. When we switched over to the Tatooine environment, I was placed in the cockpit of Luke Skywalker’s landspeeder, and when I reached out to grab the steering controls, I was able to see my own hands in front of me. I was slightly disappointed that I couldn’t actually interact with those elements, but it was definitely a satisfying experience for a fan.

Typing experience

Cherlynn: Devindra mentioned that the floating keyboard wasn’t available at his demo last year, and he was curious to hear what it was like. I was actually surprised that it worked, and fairly well at that. When I selected the URL bar by looking at it and tapping my thumb and forefinger together, the virtual keyboard appeared. I could either look at the keys I wanted and tap my fingers together to press them or, and this is where I was most impressed, lean forward and push the buttons with my hands.

It’s not as easy as typing on an actual keyboard would be, but I was quite tickled by the fact that it worked. Kudos to Apple’s eye- and hand-tracking systems, because they were able to detect what I was looking at or aiming for most of the time. My main issue with the keyboard was that it felt a little too far away and I needed to stretch if I wanted to press the buttons myself. But using my eye gaze and tapping wasn’t too difficult for a short phrase, and if I wanted to input something longer I could use voice typing (or pair a Bluetooth keyboard if necessary).

[Photo: A screenshot of the Vision Pro home screen, with about a dozen apps floating above a lake. Apple]

Dana: This was one of the more frustrating aspects of the demo for me. Although there were several typing options – hunting and pecking with your fingers, using eye control to select keys, or just using Siri – none of them felt adequate for anything resembling extended use. It took several tries for me to even spell Engadget correctly in the Safari demo. This was surprising to me, as so many other aspects of the broader Apple experience – the pinch gesture, the touch keyboard on the original iPhone – “just work,” as Apple loves to say about itself. The floating keyboard here clearly needs improvement. In the meantime, it’s hard to imagine using the Vision Pro for actual work; it feels much further along as a personal home theater.

Meditation

Cherlynn: As someone who’s covered the meditation offerings from companies like Apple and Fitbit a fair amount, I wasn’t sure what to expect of the Vision Pro. Luckily, this experience took place in the earlier part of the demo, so I wasn’t feeling any head strain yet and was able to relax. I leaned back on the couch and watched as a cloud, similar to the Meditation icon on the Apple Watch, burst into dozens of little “leaves” that floated around me in darkness. When the one-minute session started, soft, comforting music played in the background while a voice guided me through what to do. The leaves pulsed, and between the relaxing visuals and the calming sounds, the whole thing felt quite soothing. It’s funny how oddly appropriate a headset is for something like meditating, where you can literally block out the distractions of the world and simply focus on your breathing. This was a fitting use of the Vision Pro that I certainly did not anticipate.

Dana: I wanted more of this. A dark environment, with floating 3D objects and a prompt to think about what I am grateful for today. The demo only lasted one minute, but I could have gone longer.

Encounter Dinosaurs

Cherlynn: Fun fact about me: Dinosaurs don’t scare me, but butterflies do. Yep. Once you’ve stopped laughing, you can imagine the trauma I had to undergo at this demo. I’d heard from my industry friends and Devindra all about how they watched a butterfly land on their fingers in their demos at WWDC, before dinosaurs came bursting out of a screen to roar at them. Everyone described this as a realistic and impressive technological demo, since the Vision Pro was able to accurately pinpoint for everyone where their fingers were and have the butterflies land exactly on their fingertips.

I did not think I’d have to watch a butterfly land on my body today, and just generally do not want that in life. But for this demo, I kept my eyes open to see just how well Apple would do, and, because I had a minor calibration issue at the start, I had to do it twice. The first time it happened, I… screamed a bit. I could see the butterfly’s wings and legs. That’s really what creeped me out the most — seeing the insect’s legs make “contact” with my finger. There was no tactile feedback, but I could almost feel the whispery sensation of the butterfly’s hairy-ass legs on my finger. Ugh.

Then the awful butterfly flew away and a cute baby dinosaur came out, followed by two ferocious dinosaurs that I then stood up to “pet.” Things were much more fun after that, and it was actually quite an impressive showcase of the Vision Pro’s ability to blend the real world with immersive experiences, as I was able to easily see and walk around a table in front of me to approach the dinosaur.

Dana: Unlike Cher, I did not scream, though I did make a fool of myself. I held out my hand to beckon one of the dinosaurs, and it did in fact walk right up to me and make a loud sound in my face. I “pet” it before it retreated. Another dinosaur appeared. I once again held out my hand, but that second dino ignored me. As the demo ended, I waved and heard myself say “bye bye.” (Did I mention I live with a toddler?) I then remembered there were other adults in the room, observing me use the headset, and felt sheepish. Which describes much of the Vision Pro experience, to be honest. You could maybe even say the same of any virtual reality headset worth its salt. It is immersive to the point that you will probably, at some point, throw decorum to the wind.

[Photo: The Disney+ app floating above a living room in a screenshot of the visionOS interface on the Apple Vision Pro. Apple]

Final (ish) thoughts

Cherlynn: I had been looking forward to trying on the Vision Pro for myself and was mostly not disappointed. The eye- and hand-tracking systems are impressively accurate, and I quickly learned how to navigate the interface, so much so that I was speeding ahead of the instructions given to me. I’m not convinced that I’ll want to spend hours upon hours wearing a headset, even if the experience was mind-blowing. The device’s $3,500 price is also way out of my budget.

But of all the VR, AR and MR headsets I’ve tried on in my career, the Apple Vision Pro is far and away the best, and easily the most thought-out. Apple also took the time to show us what you look like to other people when you’re using the device, via a feature called EyeSight, which puts a visual feed of your eyes on the outside of the visor. Depending on what you’re doing in visionOS, the display shows animations indicating whether you’re fully immersed in an environment or able to see the people around you.

Dana: The Vision Pro was mostly easier to use than I expected, and while it has potential as an all-purpose device that you could use for web browsing, email, even some industrial apps, its killer application, for now, is clearly watching movies (home videos or otherwise). I can’t pretend that Apple is the first to create a headset offering an immersive experience; that would be an insult to every virtual reality headset we’ve tested previously (sorry, Apple, I’m going to use the term VR). But if you ask me what it felt like to use the headset, particularly photo and video apps, my answer is that I felt joy. It is fun to use. And it is up to you if this much fun should cost $3,500.

Update, January 17 2024, 3:04PM ET: This article was edited to clarify the TV shows you can view in the Disney+ app’s immersive environments. You can only watch Disney+ shows in the environments, like the Avengers Tower or the landspeeder on Tatooine. A previous misspelling of the word Tatooine was also corrected, and we clarified which head strap option was available at the WWDC demo.

Audio Radar helps gamers with hearing loss ‘see’ sound effects instead

Audio cues can sometimes be crucial for success in games. Developers frequently design the sound environments of their games to be not only rich and immersive, but to also contain hints about approaching enemies or danger. Players who are hard of hearing can miss out on this, and it's not fair for them to be disadvantaged due to a disability. A product called Audio Radar, which launched at CES 2024, can help turn sound signals into visual cues so that gamers with hearing loss can "see the sound," according to its maker, AirDrop Gaming LLC.

The setup is fairly simple. A box plugs into a gaming console, interprets the audio output and converts that data into lights. A series of RGB light bars surrounds the screen, displaying different colors depending on the type of sound coming from the direction each bar represents. Put simply, it means that if you're walking around a Minecraft world, like I did at the company's booth on the show floor, you'll see lights of different colors appear on the different bars.

Red lights mean enemy sounds are coming from the direction of the corresponding bar, while green is for neutral sounds. An onscreen legend also explains what the sounds mean, though that might just be for the modded Minecraft scenario on display at CES.
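
AirDrop Gaming hasn't published how the box works internally, but the mapping it describes boils down to two lookups: direction (which surround channel is loud) picks the bar, and sound type picks the color. Here's a toy sketch of that idea, with channel names, sound classes and colors that are my assumptions rather than the company's actual implementation:

```python
# Illustrative direction-and-type -> light mapping, in the spirit of what
# Audio Radar describes. Channel layout, classes and colors are assumptions.

CHANNEL_TO_BAR = {
    "front_left": "left_bar", "front_right": "right_bar",
    "rear_left": "bottom_left_bar", "rear_right": "bottom_right_bar",
}
CLASS_TO_COLOR = {"enemy": (255, 0, 0), "neutral": (0, 255, 0)}

def light_frame(levels: dict[str, float], sound_class: str,
                threshold: float = 0.2) -> dict[str, tuple[int, int, int]]:
    """Return {bar: (r, g, b)} for channels loud enough to display."""
    color = CLASS_TO_COLOR.get(sound_class, (255, 255, 255))
    return {
        CHANNEL_TO_BAR[ch]: color
        for ch, level in levels.items()
        if ch in CHANNEL_TO_BAR and level >= threshold
    }

# A neutral sound (say, a dragon overhead) loudest in the rear channels:
print(light_frame({"rear_left": 0.7, "rear_right": 0.6, "front_left": 0.1}, "neutral"))
```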

I walked around the scene briefly and could see green lights hovering above a pen of farm animals, while purple lights fluttered in tandem with a dragon flying overhead. I did find it a little confusing, but that's probably because I know very little about Minecraft; as someone with hearing, I also might not appreciate the added information as much as someone without it.

With an SDK that the company launched at the show, developers will be able to tie the lights and visual feedback to specific elements in their games, giving them control over what their hard-of-hearing players see. In the meantime, Audio Radar is using its own software to detect stereo or surround sound signals and convert them into feedback in lights and colors.

Though the product may seem to be in its early stages, several major gaming companies appear to have taken an interest in Audio Radar. AirDrop Gaming's CEO Tim Murphy told me that Logitech is "providing support as we further develop our product and design our go-to-market strategy." Microsoft CEO Satya Nadella was also spotted at the booth on opening day.

Audio Radar is beginning to ship more widely this year, and the company continues to develop products for gamers who are deaf and hard of hearing. The system works with Xbox, PlayStation and PC.

Our favorite accessibility innovations at CES 2024

So much of what we see at CES tends to be focused on technological innovation for the sake of innovation, or obvious attempts to tap into whatever trend is gripping the internet's attention that year. In the last few shows, though, there has been a heartening increase in attention to assistive products that are designed to help improve the lives of people with disabilities and other different needs. At CES 2024, I was glad to see more development in the accessibility category, with many offerings appearing to be more thoughtfully designed in addition to being clever. It's so easy to get distracted by the shiny, eye-catching, glamorous and weird tech at CES, but I wanted to take the time to give due attention to some of my favorite accessibility products here in Las Vegas.

GyroGlove

Before I even packed my bags, numerous coworkers had sent me the link to GyroGlove's website after it was recognized as a CES Innovation Awards honoree. The device is a hand-stabilizing glove that uses gyroscopic force to help people with hand tremors minimize the shakes. Because the demo unit on the show floor was too large for me and, more importantly, I don't have hand tremors, I couldn't accurately assess the glove's effectiveness.

But I spoke with a person with Parkinson's disease at the booth, who had been wearing one for a few days. She said the GyroGlove helped her perform tasks like buttoning up a shirt more easily, and that she intended to buy one for herself. At $5,899, the device is quite expensive, which is sadly typical of assistive products these days. But GyroGlove's makers said they're in talks with some insurance providers in the US, which could lead to the device being covered for those in America who could benefit from it. That's one of the biggest reasons we named GyroGlove one of our winners for CES 2024.

[Photo: A look up at the MouthPad inside a person's mouth. Cherlynn Low / Engadget]

MouthPad

I did not think I'd be looking deep into a person's mouth and up their nose at CES 2024, but here we are. Sometimes you have to do strange things to check out unconventional gadgets. The MouthPad is as unusual as it gets. It's a tongue-operated controller for phones, tablets, laptops and basically anything else that will accept a Bluetooth mouse input. The components include a touchpad mounted onto the palate of what's essentially a retainer, as well as a battery and a Bluetooth radio.

As odd as the concept sounds, it could actually be a boon for people who aren't able to use their limbs, since the tongue, as a muscle, can offer more precise movement and control than, say, your eyes. If you're feeling apprehensive about sticking a device inside your mouth, it might be helpful to know that the battery comes from the same company that makes them for medical-grade implants, while the rest of the dental tray is made from a resin that's commonly used in aligners and bite guards. The product is currently available as an early access package that includes setup and calibration assistance, with a new version (with longer battery life) slated for launch later this year.

OrCam Hear

Assistive tech company OrCam won our Best of CES award for accessibility in 2022, so I was eager to check out what it had in store this year. I wasn't disappointed. The company had a few updated products to show off, but the most intriguing was a new offering for people with hearing loss. The OrCam Hear system is a three-part package consisting of a pair of earbuds, a dongle for your phone and an app. Together, the different parts work to filter out background noise while identifying and isolating specific speakers in a multi-party conversation.

At a demo during a noisy event at CES 2024, I watched and listened as the voices of selected people around me became clear or muffled as company reps dragged their icons in or out of my field of hearing. I was especially impressed when the system was able to identify my editor next to me and let me choose to focus on or filter out his voice. 

Audio Radar

If you're a gamer, you'll know how important audio cues can sometimes be for a successful run. Developers frequently design the sound environment for their games to be not only rich and immersive, but to also contain hints about approaching enemies or danger. Players who are hard of hearing can miss out on this, and it's not fair for them to be disadvantaged due to a disability. 

A product called Audio Radar can help turn sound signals into visual cues, so that gamers with hearing loss can "see the sound," according to the company. The setup is fairly simple. A box plugs into a gaming console to interpret the audio output and convert it into lights. A series of RGB light bars surround the screen, and display different colors depending on the type of sound coming from the respective direction they represent.

CES 2024 marked not just Audio Radar's official launch; it was also where the company introduced an SDK for game developers to create custom visual cues for players who are hard of hearing. The company's founder and CEO Tim Murphy told Engadget that it's partnering with Logitech, with the gaming accessory maker "providing support as we further develop our product and design our go-to-market strategy."

[Photo: A person wearing the TranscribeGlass on the right side of a pair of black-framed glasses. Cherlynn Low / Engadget]

TranscribeGlass

Google Glass was resurrected at CES 2024. Sort of. A new product called TranscribeGlass is a small heads-up display you can attach to any frames, and the result looks a lot like the long-dead Google device. It connects to your phone and uses that device's onboard processing to transcribe what it hears, then projects the text onto a tiny transparent display hovering above the eye. You'll be able to resize the font, adjust the scrolling speed and choose your preferred language model, since TranscribeGlass uses third-party APIs for translation. Yes, it converts foreign languages into one you understand, too.
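
TranscribeGlass hasn't shared its companion app's internals, but the loop it describes (the phone listens, a third-party API transcribes, text gets pushed to the display) is straightforward to sketch. Here's a rough illustration using the third-party SpeechRecognition package as a stand-in transcription API; `send_to_glasses` is a hypothetical stub for the Bluetooth link to the display:

```python
# Sketch of a companion-app loop like the one TranscribeGlass describes.
# Uses the third-party SpeechRecognition package (pip install SpeechRecognition)
# as a stand-in transcription API; the real product's internals likely differ.
import speech_recognition as sr

def send_to_glasses(text: str) -> None:
    print(f"[display] {text}")  # hypothetical stand-in for the link to the HUD

recognizer = sr.Recognizer()
with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)
    while True:  # keep captioning until interrupted
        audio = recognizer.listen(source, phrase_time_limit=5)
        try:
            send_to_glasses(recognizer.recognize_google(audio))
        except sr.UnknownValueError:
            pass  # nothing intelligible in this chunk of audio
```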

The company is targeting year's end for launch, and hoping to offer the device at $199 to start. When I tried it on at the show floor, I was surprised by how light and adjustable the hardware was. I had to squint slightly to see the captions, and was encountering some Bluetooth lag, but otherwise the transcriptions took place fairly quickly and appeared to be accurate. The TranscribeGlass should last about eight hours on a charge, which seems reasonable given all that it's doing. 

Samsung's subtitle accessibility features

Though we didn't catch a demo of this in person, Samsung did briefly mention a "sign language feature in Samsung Neo QLED" that "can be easily controlled with gestures for the hearing impaired, and an Audio Subtitle feature [that] turns text subtitles into spoken words in real-time for those with low vision." We weren't able to find this at the show, but the concept is certainly meaningful. Plus, the fact that Samsung TVs have mainstream appeal means these features could be more widely available than most of the niche products we've covered in this roundup.

MouthPad turns your tongue into a mouse for your phone

You may one day be able to use your tongue as a mouse for your laptop, tablet or phone, thanks to a new product making its first public appearance at CES 2024 in Las Vegas, following its announcement last year. The MouthPad (an obvious spin on the word "mousepad") is what its makers call a tongue-operated touchpad that "sits at the roof of your mouth" and can be connected to your devices just like a standard Bluetooth mouse. To be clear, I did not put one in my own mouth here in Las Vegas. Instead, I watched as the company's co-founder Tomás Vega used the device to navigate an iPhone and open the camera as we took a selfie together.

The MouthPad is basically like a retainer with a touchpad, battery and Bluetooth radio built in. It's made of a resin that the company says is the same "dental-grade material that is commonly used in dental aligners, bite guards and other oral appliances." The device's battery was made by a company called Varta, which MouthPad's makers also said has "a long track record of producing safe, medical implant-grade batteries." All this is to say that while it can feel strange to put a battery-powered electrical device in your mouth, at least it might be reassuring to know that this uses technology that has existed in the oral health industry for a long time.

I watched Vega place the 7.5-gram mouthpiece right on his palate, where it sat surrounded by his upper teeth. He closed his mouth, and the iPhone he held up showed a cursor moving around as he opened apps and menus. I asked him to open up the camera and he obliged, and we took a selfie. This was evidently not a pre-recorded demo paired with good acting.

[Photo: The MouthPad, a tongue-operated controller, held up in mid-air. It's a clear dental tray with an orange touchpad in the middle and some circuitry throughout. Cherlynn Low / Engadget]

Now, because I didn't try it myself, I can't tell you if it's comfortable or easy to use. But the spec sheet states that the MouthPad is about 0.7mm (0.027 inches) thick, apart from where there are capsules, while the touchpad itself on the roof of the mouth is 5mm (0.19 inches) thick. From what I saw, it didn't look much bulkier than my own retainers, and when Vega smiled after putting the MouthPad on, I could only really see one small black piece on top of one of his teeth.

You'll have to take out the MouthPad when you're eating, but you can speak while it's in your mouth. You might have a slight lisp the way you would with regular retainers, but I could understand Vega perfectly. The company said the device currently lasts about five hours on a charge, though the team is working on improving that to eight hours by March. Recharging takes about an hour and a half, though Vega and his team said that, of the 30-ish people who currently have a MouthPad, most tend to charge theirs while they're eating and rarely seem to run out of juice.

The company explained that the MouthPad uses Apple's AssistiveTouch feature to navigate iOS, but it can be recognized by other devices as a standard Bluetooth mouse. It's already on sale for those who sign up for early access, with general availability coming later this year. Each MouthPad is individually 3D-printed, based on dental impressions sent in by customers as part of the ordering process. Early access users will also receive assistance from the company during setup and calibration, as well as throughout their use of the device.
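
Because the MouthPad presents itself as a standard Bluetooth mouse, a host needs no special software to read it. As an illustration, here's how any paired Bluetooth mouse's events could be read on Linux with the third-party evdev package; the device path is an assumption you'd look up with evdev.list_devices():

```python
# Reading any Bluetooth mouse -- which is how the MouthPad presents itself --
# on Linux via the third-party evdev package (pip install evdev).
from evdev import InputDevice, ecodes

dev = InputDevice("/dev/input/event5")  # hypothetical path for the paired device
print(f"Listening to: {dev.name}")

for event in dev.read_loop():
    # Relative-motion events carry cursor movement deltas.
    if event.type == ecodes.EV_REL and event.code == ecodes.REL_X:
        print(f"moved horizontally by {event.value}")
    # Key events carry button presses; value 1 means pressed.
    elif event.type == ecodes.EV_KEY and event.code == ecodes.BTN_LEFT and event.value == 1:
        print("left click")
```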

[Photo: Close-up of the smile of a person wearing the MouthPad; a clear tray covers their teeth, with a small piece of equipment on top of a tooth on the right side. Cherlynn Low / Engadget]

Tongue-operated controllers are not new, but the MouthPad is one of the more elegant and sophisticated options to date. It also works with a wide variety of devices and seems far enough along in the production process to be ready for sale. Whether the human tongue is a suitable organ for computer interactions, however, is something we can only determine after long-term use in the real world.

Everything you missed at CES 2024 Day 2 on the show floor in Las Vegas: AI, trending gadgets and more

The show floor at CES 2024 opened on Tuesday, and people have been racking up their steps, canvassing Las Vegas’ vast convention centers and hotel ballrooms to see all the latest and weirdest tech products. The Engadget team has been getting our cardio in, braving both vehicular and human traffic to get face and hand time (and other body parts?) with the most intriguing demos here, while companies haven’t stopped holding press conferences and announcing new items. If you don’t have time to parse through every individual headline or are here in Vegas and want to know where to go, here’s a recap of the biggest news out of CES 2024’s second day.

Google, as usual, has one of the biggest booths at the show, and the company had a fair amount of news to share. In keeping with the “Better Together” theme it has pushed for the last few years, Google shared updates to its inter-device software like Fast Pair and announced it’s working with Samsung to integrate and rename its Nearby Share feature to Quick Share, which is the current name of Samsung’s version of the same thing. This should hopefully simplify things for Android users and give them a more cohesive alternative to Apple’s AirDrop. Details were pretty scarce on whether changes are coming for Samsung users, but those who have Nearby Share should see a new icon pretty soon.

Google also added the ability to Chromecast TikTok videos to compatible TVs and screens, and it’s bringing its apps to some Ford, Nissan and Lincoln vehicles later this year. Android Auto will also be able to share your electric vehicle’s battery level with Google Maps so it can factor recharge stations, charge times and stops into your routes. This is, again, similar to a feature in Apple’s next-gen CarPlay.

Speaking of EVs, Honda debuted new EV concepts called the Saloon and the Space Hub. The Saloon is a sedan with an aerodynamic design that rides low to the ground, while the Space Hub is a boxier minivan whose seats face its passengers toward each other. Honda said it will develop a model based on the Saloon concept for North American markets in 2026, with no word yet on the Space Hub.

In other transportation news, Hyundai brought an updated version of its S-A2 air taxi to the show. The S-A2 is an electric vertical take-off and landing (eVTOL) vehicle with a cruising speed of 120mph at an altitude of 1,500 feet. It’s designed to fly short trips of between 25 and 40 miles, and the company envisions it as an everyday transportation solution for urban areas.

We also got more smart home news from companies other than Google, including Amazon, which said it will adopt the Matter standard for Casting, but it won’t support Chromecast or Apple’s AirPlay. How nice. We also saw new face-scanning and palm-reading door locks, smart outdoor lights by Nanoleaf and a new Weber Searwood smart grill that’s cheaper and more versatile.

There has been a smattering of mobile news, including the Clicks iPhone keyboard case and a surprising, adorable device called the Rabbit R1. It’s pitched as an AI-powered assistant housed in what’s basically a cute, squarish walkie-talkie co-designed by Teenage Engineering. It has a tiny 2.88-inch touchscreen, an analog scroll wheel, two mics, a speaker and a 360-degree camera you can spin to face toward you or out through the back of the handset. You talk to the Rabbit AI by pushing down a button (like a walkie-talkie) and asking it to do things like book an Uber or find a recipe tailored to your specific list of ingredients.

There’s been a lot more at the show, but I wanted to take some time to shout out a bunch of intriguing accessibility products. We saw the OrCam Hear system, which is designed to help people with hearing loss isolate the voices of specific speakers in crowded environments. There’s also the GyroGlove, a hand-stabilizing glove for people with hand tremors, as well as the MouthPad, which lets you control your phone, tablet or laptop with your tongue.

We also saw an update to the Audio Radar system that provides visual cues for gamers who are hard of hearing to see where sounds are coming from and what type of sounds they might be. It’s very heartening to see all this development in assistive technology at CES, especially when the industry often spends so much time and money on less-worthy endeavors.

We’re nearing the end of the show and as we get ready to do our final sweeps of the show floor, the Engadget team is also looking back and contemplating the best things we saw at CES 2024. We’ll be putting together our Best of CES awards list soon, so make sure you come back to see what we decided were the winners of the show.

GyroGlove is a hand-stabilizing glove for people with tremors

A busy, stimulating convention like CES can exacerbate hand tremors for those living with Parkinson's disease. For Roberta Wilson-Garrett, however, a new wearable device has been helping keep the tremors at bay. Wilson-Garrett has been using the GyroGlove, which launched here at CES 2024 in Las Vegas. It's a hand-stabilizing glove designed to "counteract hand tremors by utilizing advanced gyroscopic technology," giving wearers more control over their mobility.

In the few days she's been wearing the GyroGlove, Wilson-Garrett says she's been able to perform certain tasks more easily: things like buttoning up a shirt, moving a cup of coffee around or writing down a note. One morning, she forgot that she didn't have the glove on and grabbed her coffee, only for her hand to shake and the drink to spill over.

It's in little daily activities like these that assistive technology can help give people with disabilities some sense of control and independence again. The current iteration of the GyroGlove comprises three parts: the fabric glove, the gyroscope in the stabilization module and a battery pack on the forearm. Though the company's reps said they designed the glove to be easy for people with hand tremors to put on, they wanted to help me get the device on. I held my palm out, and a representative slipped the GyroGlove on.

The unit at the booth was too large for me, so my experience wasn't fully representative. Though I tried to move my hand in a way that might approximate tremors, I didn't quite feel any counteracting force or stabilizing effect.

If anything, I just felt a fairly heavy weight on the back of my palm and a constant low whir from the gyroscope spinning inside the module. According to the company's founder Faii Ong, the gyroscope spins at over four times the speed of a jet turbine. The device is powered by rechargeable lithium-polymer batteries that last about four hours of continuous use, which Wilson-Garrett said was in line with her experience. She also said she's heard of some people who manage to get two days out of a charge if they use the device more intermittently, depending on the frequency of their tremors.

The components were designed to be bulky and easy for people with hand tremors to grip and maneuver. Large buttons on the battery pack allow for power control and navigation of the screen on the power unit, which also displays the battery status in large icons and font. 

[Photo: A person in a pink blazer wearing the GyroGlove hand stabilizer, holding their forearm and fist up to their chest. Liviu Oprescu / Engadget]

All of these parts are attached to a comfortable harness, which felt stretchy, soft and spongy. The company said the fabric was "benchmarked against top yoga and athleisure brands" and "manufactured by the very same leading manufacturers." Altogether, the GyroGlove weighs about 580 grams (or about 1.27 pounds), with the stabilization and power modules each coming in at 200 grams. 

During my time with the device, I mostly held my hand up awkwardly in mid-air while gesturing at our video producer, and that prolonged strain might explain why the GyroGlove felt so heavy to me. Wilson-Garrett, however, said she found the glove comfortable to wear all day, and I noticed she was using her hand more naturally than I was. It's likely she has grown accustomed to the GyroGlove's weight and presence, and has adapted to it.

Ultimately, I'm not a person who lives with significant hand tremors, and I had tried on the wrong size of the device, so I can't really judge its effectiveness. Wilson-Garrett, who has been living with Parkinson's disease for at least six years, said she's happy with it and intends to purchase one.

The GyroGlove is available for sale worldwide for $5,899 (though it's $1,000 cheaper for a limited time). Like many assistive devices, it carries a high price that not everyone can pay. Ong said the GyroGlove is registered with the FDA and the Australian Therapeutic Goods Administration (TGA) as a medical device, and that the company is in talks with insurance providers in the US about covering the glove for those who need it. It's also worth noting that the GyroGlove is not meant to replace medication or other types of treatment.

The company's reps said it has hopes for future iterations to be smaller and offer more sophisticated stabilization. For now, the fact that GyroGlove is an actual device you can buy (if you have the money for it) is a good sign of its potential ability to help the many people living with hand tremors.

Everything you missed at CES 2024 Day 1: Samsung and Sony dominated, as did chips and laptops

The first truly busy day of CES 2024 has come and gone and it feels like we’ve been run over by a giant metaphorical eighteen-wheeler full of press conferences. From home robots to electric vehicles to AI, laptops and processors, there was news from pretty much all areas of tech. There were pleasant surprises like Samsung’s cute new Ballie robot ball and Sony’s spatial content creation headset, and intriguing concepts like Razer’s vibrating cushion for gamers. We also got exactly what we expected in the form of new processors from the likes of AMD, Intel and NVIDIA, as well as the subsequent flood of laptops carrying the just-announced chips for 2024.

And for everyone else, this CES also saw the launch of things like headphones, electric vehicles, gaming handhelds, grills, gaming phones, e-ink tablets, strange hybrid devices, noise-suppressing masks, standing desks and more. It’s a free-for-all, and we’re nowhere near done. Here’s just a small selection of the biggest news out of CES 2024’s press day, right before the show officially opens.

Samsung at CES

Samsung and Sony’s press conferences had some of the best surprises this year. Samsung showed us a new version of its Ballie robot, which is cute as heck. It’s basically a yellow bowling ball with a projector built in and can send you text messages and video clips of what’s at home while you’re out. You can ask it to close your curtains, turn on your lights or stream your favorite yoga video to your ceiling while you lie on your back for a meditative session. Samsung told The Washington Post that Ballie will be available for sale some time this year, but did not say how much it would cost. I guess that’s another surprise we can look forward to in the coming months.

Sony steals the show 

Then there's Sony, which brought us a few unexpected demos, starting by driving its Afeela concept electric car onstage using a PlayStation controller. Then it showed off a mixed reality headset for “spatial content creation,” which sounds somewhat similar to Apple’s Vision Pro and Microsoft’s HoloLens. Sony’s does appear to target content creators, though, and looks like a pared-down PSVR2 headset. It’ll be powered by a Snapdragon XR2+ Gen 2 chipset, sport dual 4K OLED microdisplays and offer user and space tracking. The new Sony headset still has no name and no price, but it will be available later this year.

Chips galore at CES 2024

Also dominating our news feeds on Day 1 was the barrage of chip news coming from Intel, AMD and NVIDIA. AMD, for example, launched a new Radeon RX 7600 XT GPU, a slight upgrade from last year’s entry-level model. The company also brought processors with neural processing units for AI acceleration to its desktop offerings by announcing the Ryzen 8000G series.

Meanwhile, NVIDIA unveiled the RTX 4080 Super, RTX 4070 Ti Super and RTX 4070 Super, which will cost $999, $799 and $599 respectively. It also announced updates for its GeForce Now cloud gaming service, adding G-Sync support and day passes for streaming. Intel kept things fairly tame and tidy, simply giving us its complete 14th-generation CPU family, including HX-series chips like a 24-core i9 model. It also launched the Core U Processor Series 1, which is designed to balance performance and power efficiency in thin and light laptops.

The usual CES gadgets: PCs, laptops, TVs, and more

Speaking of laptops, most PC makers followed up the chip news flood by announcing all their new models containing the latest silicon. We saw notebooks from Alienware, Lenovo, MSI, Acer, Asus, and Razer, among others. MSI also had a new gaming handheld to show us, which is the first of its category to use Intel’s just-announced Core Ultra chip.

Asus also put that chip in a non-laptop product, debuting a new homegrown NUC. Meanwhile, Lenovo continued to challenge our notions of what a laptop is with its ThinkBook Plus Gen 5, which is a weird gadget mermaid of sorts. Its top half is a 14-inch Android tablet, while its bottom half is a Windows keyboard, and all of it is just funky.

Speaking of odd Android tablets, TCL was here with a new version of its NXTPAPER e-ink-ish tablet. This year’s model can switch between a matte e-paper-like display and a full-color LCD at the push of a button. The company also showed off a 115-inch mini-LED TV, the biggest mini-LED TV with quantum dot technology to date.

We also got to check out Razer’s Project Esther, a proof-of-concept vibrating cushion showcasing the company’s new Sensa HD haptics platform for more immersive gaming experiences. That might be one of my favorite demos so far because… well… it vibrates. It’s a vibrating cushion that fits most office and gaming chairs.

Car stuff

There was plenty of car and transportation news, too, like Kia’s new fleet of modular vans and Volkswagen adding ChatGPT powers to its in-car voice assistant. The CES 2024 show floor was also littered with piles of headphones, earbuds (and earwax) thanks to announcements from JBL, Sennheiser and lesser-known names like Mojawa, which put an AI-powered running coach in its bone-conduction headphones.

At the Pepcom showcase, we also saw some intriguing and fun products, like the Skyted Silent Mask that lets you talk in private in public, as well as the LifeSpan standing desk bike that lets you cycle really hard to generate enough power to charge your phone.

Intrigued? Check out our articles and videos with more details on everything I’ve mentioned and more. Or if you prefer, we’ll be back tomorrow to recap all the biggest news again to make your life easier. We’ve got plenty of press conferences coming up, and the show floor has officially opened, which means there’s still lots of stuff to check out in the days to come. 

OrCam Hear hands-on: A surprisingly effective voice isolation platform for people with hearing loss

Imagine being at a crowded convention or noisy bar and trying to have a conversation with someone across from you. It's tough enough for people with typical hearing to focus on what the other person is saying, let alone for those with hearing loss. Assistive technology company OrCam has rolled into CES 2024 with a host of new products, including a set of devices and an iPhone app designed to help those with hearing loss deal with auditory overload. The platform is called OrCam Hear and, after a quick hands-on at the show in Las Vegas, I'm pleasantly surprised.

OrCam Hear consists of a pair of earbuds and a dongle that plugs into any phone, and you'll use the app to control who you want to listen to. The system listens to voices for a few seconds (via the dongle) and uses AI to create speaker profiles for each person that then allows you to "selectively isolate specific voices even in noisy environments." This targets the issue sometimes known as the "cocktail party problem" that's a challenge for hearing aids.
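If you're curious what that selective isolation might look like under the hood, here's a minimal, hypothetical sketch in Python of embedding-based voice gating. To be clear, this is an illustration of the general technique, not OrCam's actual code: the embed function, the similarity threshold and every name below are assumptions for demonstration only.

    import numpy as np

    def cosine_similarity(a, b):
        # Standard cosine similarity between two embedding vectors.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def isolate_voices(frames, embed, allowed_profiles, threshold=0.75):
        # Pass through audio frames whose speaker embedding matches any
        # "allowed" profile (i.e., a speaker the listener dragged into the
        # ring); silence everything else. 'embed' stands in for a model
        # that maps a short audio frame to a fixed-length speaker embedding.
        output = []
        for frame in frames:
            e = embed(frame)
            if any(cosine_similarity(e, p) >= threshold for p in allowed_profiles):
                output.append(frame)  # selected speaker: keep the audio
            else:
                output.append(np.zeros_like(frame))  # unselected: mute
        return output

A real system would presumably have to do this with very low latency and handle overlapping speakers in the same frame, which is where the on-device AI earns its keep.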

During a demo, my editor Terrence O'Brien and I spoke to two people whose voice profiles were already set up in the app. We stood around a table with Terrence on my right and the two company spokespeople across from us, about five feet away. I put the earbuds in (after they were sanitized), and the noise around me immediately sounded a little less loud and a lot more muffled.

A close-up of the OrCam Hear's dongle plugged into an iPhone held in mid-air. (Photo by Terrence O'Brien / Engadget)

I looked at everyone around me and though I could see their lips moving, I couldn't hear anyone speaking. After OrCam's reps used the app to drag a floating circle into the ring surrounding me, I started to hear the person diagonally across from me talk. And though the executive next to him was also moving his mouth, I could still only hear the voice of the person selected. Only after we moved the other speaker's icon into the ring did I start to hear them.

What impressed me more, though, was how the system handled relatively new participants like Terrence. He didn't have a profile set up in the app, and I initially couldn't hear him at all. A few seconds into the demo, though, a new circle appeared with a gray icon indicating a new "Anonymous" person had been recognized. When we dragged that into the ring, I was suddenly able to hear Terrence. This was all the more impressive because Terrence was wearing a fairly thick mask, which would have made him hard to understand anyway. Yet I was able to clearly make out what he was saying.

The OrCam Hear isn't perfect, of course. I was still able to hear the speakers directly as they talked, and the audio playing through the earbuds was slightly delayed, so there was a small echo. But people with hearing loss, for whom this product is designed, aren't likely to experience that. There was also some audio distortion when the selected speakers were talking, but not so much that it impeded my comprehension.

OrCam said that the Hear platform is "currently in a technology preview phase and is expected to be shipped later in the year." Hopefully, that gives the company time to iron out quirks and make the app available on both iOS and Android, so that the assistive tech can be truly inclusive and accessible to more people.

You can now Chromecast TikTok videos to your TV

In the last few years, Google has used CES to show off new ways for Android, Chrome and all manner of non-Apple products to play nice with each other. At CES 2024 in Las Vegas, the company is also bringing updates to Chromecast, Fast Pair and Nearby Share, alongside some new features for cars. If you've always wanted to cast TikTok to your TV, because you're one of the handful of people from the TV generation who use the app, you can now do that on Chromecast-enabled screens. Soon, though Google doesn't specify when, you'll also be able to cast livestreams from TikTok.

In line with the theme of tighter inter-device integration, this year Google is rolling out the ability to move what's playing on Spotify and YouTube Music from compatible Pixel phones to docked Pixel Tablets when within range. That's reminiscent of an Apple feature that hands off audio between HomePods and iPhones.

More devices with Chromecast built-in will also be launching this year, including the 2024 LG TV series. Later this year, LG's hospitality and healthcare TVs will also be getting Chromecast support, so you can cast to the TV in, say, your hotel room without having to log into your own Google account. Google is also expanding Fast Pair support to Chromecasts with Google TV, which will make it easier to connect headphones to TV dongles. It added that Fast Pair is coming to more Google TV devices later this year.

One of the best features on Apple devices is AirDrop, which lets iPhones easily transfer photos, files and contact information in person. Google's (attempt at an) answer to that since 2020 has been Nearby Share, though proximity-based file-sharing tools on Android have existed for much longer. Samsung also introduced its own Quick Share system in 2020, offering a very similar experience to Google's.

In a move that should make things less confusing and more unified across the Android ecosystem, Google announced it's "collaborating with Samsung" to bring "the best of our sharing solutions together into a singular cross-Android solution under the Quick Share name." In a press release, Google said it's "integrated the experiences and created the best default, built-in option for peer-to-peer content sharing across all types of devices in the Android and Chromebook ecosystems."

Google also said it's working with LG and other "leading PC manufacturers" to make Quick Share a pre-installed app on most Windows PCs. When Quick Share rolls out to current Nearby Share-enabled devices next month, you should see a new icon. Tapping it will show a list of devices available around you and let you select who to send your media to. As on Apple devices, you can go into your Android settings to choose whether you can be seen by everyone, contacts only or just your own devices.

It's worth noting that with last year's release of iOS 17, Apple upgraded AirDrop to make it possible to share files simply by bringing the tops of two iPhones together.

In addition to everything covered in this post, Google is also updating Android Auto and making more devices act as hubs for the Matter smart home standard to enable better uptake and integration. We have separate articles on each of those topics, so check them out for all the finer details.
