The MSI Claw is the first gaming handheld built on Intel’s Core Ultra chips

MSI has introduced a handheld gaming device called the Claw at CES 2024 in Las Vegas. Unlike its biggest rivals, the Steam Deck and the ASUS ROG Ally, it's powered by Intel processors instead of AMD's. The Claw runs on Intel's new Core Ultra chips and integrates Intel's XeSS technology, which uses AI upscaling to boost frame rates for a smoother gaming experience. MSI says that will allow users to enjoy even resource-intensive AAA games on a handheld device. 

The device also uses MSI's thermal design, called Cooler Boost Hyperflow, which redirects airflow to cool internal components so the Claw doesn't overheat even after extended gaming sessions. Its battery lasts for two hours under full workload, the same as the ROG Ally's. In fact, the Claw looks quite similar to its ASUS counterpart, down to the curves at the bottom for better grip. 

It also has a 7-inch full HD display with a 120Hz refresh rate. MSI's software lets users fully customize macros for individual games, and it gives them access to Android games in addition to Windows titles on the handheld. We'll be taking the Claw for a spin at CES, so keep an eye out for a hands-on where we'll talk about its performance. 

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/the-msi-claw-is-the-first-gaming-handheld-built-on-intels-core-ultra-chips-033813849.html?src=rss

Vivoo’s new at-home UTI test kit and app can tell you if you have a urinary tract infection

Following the smart toilet it debuted at CES 2023, Vivoo is at it again for CES 2024 with another urine analysis product. The company has unveiled an at-home digital urinary tract infection (UTI) testing kit that provides what it calls "gold standard accuracy results" via a two-minute test. 

To use it, just pee on the provided UTI test strip and scan it to obtain results via Vivoo's app in "seconds," the company says. If the result is positive, customers can then connect with a doctor to obtain a prescription if required. The company says the product "saves customers time, prevents confusion in readings, and digitalizes the data so customers can share results with healthcare providers via the app, if instant treatment is desired." From the looks of it, the results are obtained via the strip, then deciphered by the app.

Vivoo notes that UTIs are the most common type of outpatient infection, with six in ten women experiencing one in their lifetime. Normally, you'd send your urine off to a lab for analysis or use an existing at-home test kit. The company says the new product spares users the bureaucracy of lab testing while, unlike regular test kits, also retaining the relevant data for users who need it. 

In fact, many women experience recurrent UTIs caused by bacteria that have become resistant to at least one, or even multiple, types of antibiotics. By keeping a record of past infections, Vivoo's app could help patients and medical professionals track the problem and treat it appropriately. 

Last year, the company unveiled a smart toilet device that clips onto existing toilets and provides data like your body's water, magnesium, pH, protein and sodium levels. Later, it released strips for vaginal pH. The new home UTI test will come to market in Q2 2024, but pricing isn't yet available. 

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/vivoos-new-at-home-uti-test-kit-and-app-can-tell-you-if-you-have-a-urinary-tract-infection-030021462.html?src=rss

Belkin auto-tracking Stand Pro swivels iPhone 360 degrees with your movement during video calls, recordings

The Apple iPhone already does some great things, but Belkin believes it can add some prowess to the phone's video and FaceTime capabilities. To that end, the accessories manufacturer is bringing the Belkin Stand Pro to CES 2024. This motorized dock for the iPhone 12 and later swivels 360 degrees to track users' movement while the camera is on.

We've seen numerous iterations of the iPhone dock over the years, from docking stations that simply played music to ones that wirelessly charge the phone and lend it real utility. Case in point: the rotating functionality the Belkin Stand Pro brings to the iPhone.

Designer: Belkin

The Belkin Stand Pro has a cylindrical base that can rotate 360 degrees. From the base extends a MagSafe-equipped motorized arm that holds and charges the iPhone. The arm can also tilt 90 degrees up and down for added convenience.

The Stand Pro is essentially designed to track your movement. Say you're cooking in the kitchen with the iPhone playing a video recipe tutorial; the dock rotates to follow you wherever you go, whether you're picking out spices or walking to the fridge. When you don't want the iPhone to follow you, an onboard button turns tracking off, and a built-in LED indicates whether tracking is on.

The iPhone pairs with the Belkin dock over NFC, and the dock works with apps including Camera, FaceTime, Instagram, WhatsApp and more. Once an app is running, the Stand Pro automatically rotates the docked iPhone to keep you in frame at all times using the iPhone's own subject recognition, with no third-party app required. This makes the Belkin Stand Pro the first iPhone accessory to use Apple's DockKit framework.
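The auto-framing behavior described above boils down to a feedback loop: detect the subject's position in the camera frame, then pan the motorized base to re-center it. Here's a toy Python sketch of that idea using a simple proportional controller with a deadband; the function name, gain and threshold values are invented for illustration and have nothing to do with Apple's actual DockKit APIs or Belkin's firmware.

```python
# Toy auto-framing controller: given the detected subject's horizontal
# position in the camera frame, compute how far the dock should pan to
# re-center it. Gain and deadband values are illustrative only.

def pan_correction(subject_x: float, frame_width: float,
                   gain: float = 0.1, deadband: float = 0.05) -> float:
    """Return a pan adjustment in degrees to re-center the subject.

    subject_x: horizontal pixel position of the detected subject.
    frame_width: width of the camera frame in pixels.
    """
    # Normalized offset from frame center, in the range [-0.5, 0.5]
    offset = subject_x / frame_width - 0.5
    if abs(offset) < deadband:
        return 0.0                 # close enough to center: avoid jitter
    return offset * gain * 360.0   # proportional pan, scaled to degrees
```

A real dock would run this every frame and clamp the motor speed, but the core "detect, compute offset, rotate" cycle is the same.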

Priced at $179.99, the Belkin dock also functions as a wireless charger, delivering up to 15W fast wireless charging when plugged into a 30W USB-C charger. For filming away from a power outlet, the Stand Pro features a battery good for up to five hours on a single charge.

The post Belkin auto-tracking Stand Pro swivels iPhone 360 degrees with your movement during video calls, recordings first appeared on Yanko Design.

I’m ashamed how much I love Mercedes-AMG and will.i.am’s attempt to turn cars into DJs

If you’ve ever wanted to turn your car into a DJ, with the sound controlled by how you drive, then you need to buy a Benz, stat. Mercedes-AMG and will.i.am have turned up at CES 2024 in Las Vegas with what they’re calling MBUX SOUND DRIVE (all caps, as if to be bellowed). Sadly, it’s hard to talk about what it is and what it does without robbing it of its mystery, so apologies in advance: It’s essentially a system that pulls data from the car’s suite of sensors, which then helps control a specially-deconstructed music file. But, as joyless as that description sounds, once you’ve experienced it, you’ll wonder why it hasn’t been done before. Not to mention that, at the risk of gushing, it really does deepen the emotional connection between driving and the music you’re listening to.

The announcement came as part of Mercedes' CES push, which this year is focused on the power of its audio setup. Alongside the announcement of MBUX SOUND DRIVE, it's boasting of a new partnership with Amazon Music and Audible. That'll see Dolby Atmos versions of their exclusive audio dramas, podcasts and books come to compatible vehicles. (The highlight of the event was when legendary British audio producer Dirk Maggs, the figure responsible for the later radio versions of The Hitchhiker's Guide to the Galaxy, took to the stage.)

MBUX SOUND DRIVE works by pairing musical elements in a song with ten inputs taken from the car. Start the car and all you get is the track's bed, so to speak, looping in the background while it waits for you to get moving. Push on the accelerator at low speeds and it'll add some bass reverb to the song, while turning the steering wheel gets you extra effects or kicks in the chorus loop. It's only when you open the car up on a clear highway that the main music and lyrics start blasting, rewarding you for moving along. And then, when you're coasting toward a stop light, the lead vocal and melody peel away, returning you to the far less intrusive backing track.
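The layering logic described above can be sketched as a simple mapping from driving state to per-stem gains. Every stem name, threshold and gain below is invented for illustration; Mercedes hasn't published how MBUX SOUND DRIVE actually weights its ten inputs.

```python
# Illustrative sketch of driving-state-driven stem layering.
# Stem names and thresholds are made up for this example; they are
# not taken from Mercedes' actual MBUX SOUND DRIVE implementation.

def active_stems(speed_kmh: float, throttle: float, steering_deg: float) -> dict:
    """Return a per-stem gain (0.0 to 1.0) for the current driving state."""
    stems = {"bed": 1.0, "bass": 0.0, "chorus": 0.0, "lead": 0.0}
    if throttle > 0.1 and speed_kmh < 50:
        stems["bass"] = min(1.0, throttle * 2)  # low-speed acceleration adds bass
    if abs(steering_deg) > 15:
        stems["chorus"] = 1.0                   # turning brings in the chorus loop
    if speed_kmh > 80:
        stems["lead"] = 1.0                     # open road: full song with vocals
    elif speed_kmh < 20 and throttle < 0.05:
        stems["lead"] = 0.0                     # coasting: vocals peel away
    return stems
```

A mixer would then crossfade each stem toward its target gain, which is why the transitions in the car feel musical rather than abrupt.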

If nothing else, it’s a spectacular piece of hardware and software development, given the fact even the fanciest in-car platform wasn’t designed to do this. It’s worth pointing out the extent of the achievement that’s enabled something like this to happen on an existing system. And there are plans to extend it further so, for instance, if the windshield wipers detect rain, the music will change to reflect the mood.

The demo I experienced had 16 tracks pre-loaded, including The Black Eyed Peas' I Gotta Feeling and Le French's Night Drive. These songs have all been broken down and rebuilt to take advantage of MBUX SOUND DRIVE's separated format. When you're just cruising around a Las Vegas parking lot, it's all pretty restrained, even if you do put some heavy reverb on while you're parked. In fact, the whole experience at slow speeds could almost be described as teasing, offering you hints of the song you know and love, but never giving you the whole thing.

It’s only when you (or in this case, your qualified driver) puts their foot down and you suddenly start screaming down the road that the whole song kicks in. Even a song like I Got A Feeling, hardly the most bombastic, suddenly feels epic in this format. The closest thing I can compare it to is those moments in Grand Theft Auto when you’re opening it up on the highway and a great track kicks in. Of course, the best example of that would be cruising down the road while David Bowie’s Somebody Up There Likes Me plays. But, despite will.i.am’s promises that when the system arrives halfway through 2024 all genres will be well-represented, I’m not so sure. After all, it’s clear that tracks primarily based on discrete loops are going to be the easiest to translate and the most well-suited to the environment.

In terms of the future, will.i.am shared his hopes that tracks could be hard-coded to reflect a geography. He used the example of a car going through a tunnel, which would prompt a gas car driver to put their foot down to fill the space with engine noise. But in our electric future, where there is no engine noise, drivers will instead have to content themselves with a jolt from their favorite song. He added that he also dreams of building easter eggs into songs, which would only start playing when the car reaches a specific location. On one hand, I'm curious how many musicians would take the time to remix their existing songs for the size of the addressable market, which, in this case, is only Mercedes-Benz vehicles equipped with a second-generation MBUX system. Then again, money talks.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/im-ashamed-how-much-i-love-mercedes-amg-and-williams-attempt-to-turn-cars-into-djs-023948867.html?src=rss

Razer is bringing the world’s first HD haptics gaming chair cushion to CES 2024

CES 2024 is here, and we're seeing all manner of new gaming gear and accessories. One notable mention is Razer's Project Esther, the world's first HD haptics gaming chair cushion. Yes, that's right, you can get one step closer to feeling like you're in the game — especially if you combine it with a VR headset.

The Project Esther concept includes 16 haptic actuators and offers ultra-low latency. Developers can control the haptics' directionality, multiple-device integration and multi-actuator experiences. It also has automatic audio-to-HD haptics conversion, making it a plug-and-play solution. Razer claims the cushion is compatible with most gaming and office chairs, so you won't have to get a whole new setup to use it. 
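The "audio-to-HD haptics conversion" mentioned above is, at its simplest, a matter of turning a signal's short-term loudness into vibration strength. Razer hasn't published how Sensa actually performs this mapping, so the windowing, scaling and function below are purely illustrative.

```python
# Toy audio-to-haptics converter: map the RMS energy of each window of
# an audio signal to a 0-255 actuator intensity. This is an invented
# sketch, not Razer's actual Sensa conversion pipeline.

def audio_to_haptics(samples: list[float], window: int = 256) -> list[int]:
    """Convert mono audio samples (-1.0 to 1.0) to per-window intensities."""
    intensities = []
    for start in range(0, len(samples), window):
        chunk = samples[start:start + window]
        # RMS energy of the window drives the vibration strength
        rms = (sum(s * s for s in chunk) / len(chunk)) ** 0.5
        intensities.append(min(255, int(rms * 255)))
    return intensities
```

A real system would also split the signal into frequency bands and route them to different actuators to get the directionality Razer describes, but loudness-to-intensity is the core of any plug-and-play conversion.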

Razer's Project Esther gaming chair cushion is the result of years of haptics development. In 2018, the company unveiled Razer HyperSense haptics in its then-new Nari Ultimate headset, which used advanced digital signal processing and wideband voice-coil actuators to produce haptics for PC gaming; Razer expanded the tech to Xbox headsets the following year. 

CES 2019 brought Razer's HyperSense high-resolution vibration system, which the company programmed into keyboard wrist rests, mice and a chair. In 2022, Razer took another big step, buying Interhaptics, a company specializing in, as the name suggests, haptics. Project Esther is built on Interhaptics technology. 

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/razer-is-bringing-the-worlds-first-hd-haptics-gaming-chair-cushion-to-ces-2024-020027892.html?src=rss

Razer updates its Iskur gaming chair with a ‘6D’ lumbar system for CES 2024

Razer’s new accessories at CES 2024 are every bit as lavish as you’d expect. At this year’s convention, the company has a follow-up to its first gaming chair, an 11-port USB-C dock, a gaming cushion with HD haptics and a monitor-mounted light bar with Chroma RGB illumination.

Razer Iskur V2 gaming chair

The Razer Iskur V2 Gaming Chair is the successor to the 2020 original. The new model’s highlight is its “6D Adjustable and Adaptive Lumbar Support System.” Described as the only one of its kind, the lumbar support has a spring-loaded mechanism to adjust to the body’s weight and posture, alongside manual controls for the lumbar area’s protrusion and height.

Closeup of a person's palm pressing down on the Iskur V2's cushy, gray-patterned seat.
Razer

Razer says it gathered feedback from ergonomics experts and esports communities in designing the Iskur V2. The chair can recline up to 152 degrees and lets you control its tilt. It includes high-density foam cushions and is made of EPU-grade synthetic leather. The company describes it as offering “extensive customization in height, position, and angle,” and it has a memory foam head cushion.

The Iskur V2 costs $650 and is available to order today from Razer’s website.

Razer USB C Dock

The Razer USB C Dock on a desk, with a gaming laptop behind it, several open ports and an SD card halfway out of its slot.
Razer

Razer also has a new 11-port dock compatible with Windows PCs, Macs, iPads and Chromebooks. On the port front, the Razer USB C Dock has four USB-A ports, two USB-C ports, gigabit Ethernet, HDMI, a 3.5mm audio combo jack and slots for UHS-I SD and microSD cards.

The dock’s HDMI port can output up to 4K at 60Hz, and its audio jack supports 7.1 surround sound. The accessory is made from an aluminum alloy, and its USB ports support 85W laptop charging.

The Razer USB C Dock costs $120 and is available today from Razer.

Project Esther cushion

Following its 2022 acquisition of Interhaptics, Razer is showcasing Project Esther, “the world’s first HD haptics gaming cushion,” which sounds like it could have been made from a CES-themed Mad Lib. The cushion hasn’t been announced as a commercial product (at least not yet), but it’s designed to flaunt Razer Sensa haptics.

Similar in size and shape to a standalone chair massager, the Project Esther mat stretches from where you sit up across the chair's back. Devs can control Sensa’s “directionality, multi-actuator experiences, and multiple-device integration between different platforms and peripherals.” The tech is plug-and-play, automatically converting audio to HD haptics.

The cushion offers “wideband, high-definition haptics,” thanks to 16 built-in actuators. It has adjustable straps, and Razer says it’s compatible with most gaming and office chairs. (If it ever makes it to market, it could perhaps help the haptic-obsessed save money on gaming chairs.) It supports low-latency connections to keep its rumbles synced with your gaming or media content.

Aether Monitor Light Bar

Product marketing image of Razer's Aether Monitor Light Bar. The elongated bar, viewed from an angle, emits RGB lighting above, behind and below.
Razer

Razer loves RGB lighting, and the company has a new bar to prove it. The Aether Monitor Light Bar is a mountable accessory with front- and rear-facing LEDs.

The light bar has a Color Rendering Index (CRI, a rating of color accuracy) score of 95 and can light a 60cm x 30cm (about 2 x 1 feet) area. Its Chroma RGB lighting supports over 16.8 million colors and “a myriad of lighting effects.”

The bar also includes capacitive touch controls. Its brightness, color temperature and Chroma effects are all user-customizable. It supports the Matter smart home standard, and users can tweak its settings through the Razer Gamer Room app.

The Aether Monitor Light Bar will be available in March for $130.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/razer-updates-its-iskur-gaming-chair-with-a-6d-lumbar-system-for-ces-2024-020026353.html?src=rss

Sony drove its Afeela EV onto the CES stage using a PlayStation controller

Sony's partnership with Honda around a new concept EV called the Afeela has been a highlight of CES for several years now. And while we're not any closer to finding out if and when this car will become a reality, Sony had a fun way to show off the latest iteration of the vehicle: it drove the car onto its CES 2024 stage with a PlayStation DualSense controller. Sure, it was just a fun gimmick rather than any evidence of a PlayStation-controlled vehicle coming down the road, but CES is all about the spectacle. 

Sony / Honda Afeela concept EV
Sony

We'll keep an eye out for more details on the Afeela, but Sony just invited Microsoft on stage to talk about how the in-vehicle experience is going to get smarter thanks to — you guessed it — AI. We're getting close to CES bingo here, folks. 

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/sony-drove-its-afeela-ev-onto-the-ces-stage-using-a-playstation-controller-014403857.html?src=rss

Sony’s mixed reality headset for ‘spatial content creation’ arrives later this year

Sony's CES 2024 presentation didn't have much news for the first 25 minutes, but then the company revealed a new XR head-mounted display and controllers with... no name so far, aimed at "spatial content creation." With a matte gray finish, the headset looks like a stripped-down PSVR2, and there appear to be two cameras facing out from the front. There is also a controller-wand and a smaller peripheral similar in size to a ring. The new hardware is apparently aimed at creators and artists who manipulate and craft products in virtual spaces. It will be available later in 2024, though pricing will be announced at a later date.

Although Sony didn't go into great detail onstage, a press release dropped after the show wrapped with some key specs. The headset is powered by Qualcomm's Snapdragon XR2+ Gen 2, which was announced just as CES began. This means it's a self-contained device that doesn't require a computer to run. That chip is driving dual 4K OLED microdisplays, and provides "user and space tracking" for mixed reality experiences. Sony CEO Kenichiro Yoshida said it would offer a "crisp viewing experience" and "intuitive interaction for 3D design", teasing a device aimed at professionals, similar to its professional-level cameras and devices.

The device has "video see-through" functionality and a total of six cameras and sensors. A pair of controllers were shown off, one described as a "ring controller" for manipulating objects and another as a "pointing controller" for... pointing. Sony envisions creators being able to craft 3D models in realtime using the controllers and traditional input devices like keyboards in tandem:

"By holding the pointing controller in the dominant hand and attaching the ring controller to the fingers of the other hand, creators can model 3D objects using both controllers and a keyboard, while wearing the head-mounted display."

Sony is also talking up the headset's balance, saying it has fine-tuned "the balance of the device's center of gravity." The display portion of the headset also flips up and out of the wearer's field of vision, allowing them to dip in and out of their work without needing to remove the headset entirely.

Sony at CES 2024
Sony

From the work shown on stage, it seems positioned less as a Vision Pro rival and more as a creative take on Microsoft's HoloLens. The company showed mock-ups of a user tinkering with a bipedal robot while wearing the new headset, manipulating the robot's arm as two nearby monitors showed the work in greater detail. Sony says you'll be able to review and collaborate on work remotely through third-party creative apps. Hopefully, we'll hear and see more at Sony's booth later this week. 

Update, January 8, 10:30PM ET: Added details from Sony's post-show PR.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/sony-spatial-content-creation-headset-at-ces-2024-013936595.html?src=rss

World’s First 3D Printed Edible Eel: Sushi Ready

Hot on the heels of 3D-printed salmon comes the world’s first 3D-printed eel, made by Steakholder Foods using its line of 3D meat printers. The current iteration of the eel is plant-based, but the company plans to ethically harvest eel cells and cultivate them once “economies of scale allow for price-competitive cell development.” These are fascinating times for the sushi industry!

Steakholder’s printing process involves laying down alternating layers of varied textures to resemble the meat it’s mimicking as closely as possible, so it’s not just a solid block of uniform texture and flavor. The printing technology also allows the company to produce meat alternatives using significantly fewer ingredients than others currently on the market.

Above: A filet of grouper being printed.

Steakholder Foods CEO Arik Kaufman says, “The launch of our printed eel marks a pivotal moment in the seafood industry…This technology is designed to enable partners to generate products on a potential industrial scale of hundreds of tons monthly, not only at lower costs compared to wild eel, but also with the flexibility to create a variety of printed products using the same production line.”

Would you eat 3D-printed eel? I would. As a matter of fact, I want some right now. Ideally, laid atop some rice with wasabi and soy sauce on the side. Great, now I want sushi. But I just had Mexican! I suppose I still have a little room…

[via TechEBlog]

How to watch Intel’s CES 2024 keynote

Intel is one of the biggest names that's in Las Vegas for CES 2024. The company has several talks and panels lined up, including a keynote from CEO Pat Gelsinger. You'll be able to watch that particular event live at 8pm ET on January 9 at Intel's website, along with the CES 2024 site and app.

What to expect

Intel hasn't divulged too much about what Gelsinger will dig into. However, it probably shouldn't come as a surprise that AI is one of the topics at hand. According to Intel, Gelsinger will talk about "the critical roles that silicon and software play in making AI more accessible, providing powerful compute and enabling modern economies." 

The company has also just revealed its full slate of Intel Core 14th-gen processors for desktops and laptops, including HX-series mobile CPUs. Intel says more than 60 14th-gen HX-powered systems are coming to market from its partners this year, while thin-and-light laptops featuring the new Intel Core U Processor Series 1 lineup will start hitting retailers by the end of March.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/how-to-watch-intels-ces-2024-keynote-010001600.html?src=rss