Dell unveils a massive 52-inch 6K ultrawide curved monitor at CES 2026

PC and gaming monitors are a fixture of CES announcements, and at the 2026 show Dell may just have one of the most impressive. The company launched an ultrawide, curved 52-inch 6K monitor at the show this year, a productivity behemoth designed for stock traders, engineers and other professionals. Dell claims the UltraSharp 52 Thunderbolt Hub Monitor is the first 52-incher to market that’s also ultrawide, curved and supports 6K resolution (but with that many qualifiers almost anything can be a world first).

Given Dell’s experience in the monitor realm, this could be a dream display for professionals who handle vast data sets in applications such as trading platforms, AutoCAD, 3D rendering software and spreadsheets. It sports a 120Hz refresh rate on an IPS Black panel and emits up to 60 percent less blue light than competing monitors. It delivers an impressive 129 ppi (for comparison, a 4K 32-inch monitor delivers 138 ppi), and an ambient light sensor helps avoid eye strain during long work sessions.
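
Pixel density follows directly from resolution and diagonal size, so the article's numbers are easy to sanity-check. A minimal sketch of the standard ppi formula, using the 4K 32-inch comparison point cited above:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: the panel's diagonal pixel count divided by its diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The article's reference point: a 4K (3840 x 2160) panel at 32 inches.
print(round(ppi(3840, 2160, 32)))  # 138
```

The same formula applied to the UltraSharp 52's (unannounced) exact resolution would yield the quoted 129 ppi.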

Users can connect up to four PCs to the monitor simultaneously, which can use picture-by-picture to treat each partitioned screen as an individual monitor. It also features built-in KVM (keyboard, video, mouse) features that let users control all connected PCs via a single mouse and keyboard. The monitor can also charge your laptop with up to 140W of power via a Thunderbolt 4 connection.

Dell also rolled out a new 32-inch 4K QD-OLED monitor with support for both True Black 500 HDR and Dolby Vision. It claims true-to-life color accuracy out of the box and excellent gamut coverage at 99 percent of DCI-P3. This monitor would be a strong fit for film and photo editing.

The Dell UltraSharp 52 Thunderbolt Hub Monitor is priced at $2,900 with stand or $2,800 without. It will be widely available starting January 6. The Dell UltraSharp 32 4K QD-OLED Monitor comes in at $2,600 and will be available beginning February 24.

This article originally appeared on Engadget at https://www.engadget.com/computing/dell-unveils-a-massive-52-inch-6k-ultrawide-curved-monitor-at-ces-2026-140024842.html?src=rss

NVIDIA’s G-Sync Pulsar tech can minimize motion blur for gamers

NVIDIA has unveiled G-Sync Pulsar, which it calls the “latest evolution of [its] pioneering VRR (variable refresh rate) technology,” at CES 2026. The new tech promises a stutter-free gaming experience with buttery smooth motion, made possible by pulsing the display’s backlight. G-Sync Pulsar displays have multiple horizontal backlight sections that are pulsed independently from top to bottom, unlike traditional displays whose backlight is always on. When the backlight is always active, the image fades from one frame to the next. Displays with the new tech give the pixels in a frame enough time to stabilize before they’re backlit, so each frame is shown in the right location, effectively reducing monitor-based motion blur.

The company says G-Sync Pulsar can effectively quadruple your refresh rate. If you’re playing at 250 fps, that means it can deliver a perceived effective motion clarity of over 1,000 Hz. That enables easier tracking and shooting in-game, making displays with the technology especially suited for esports. You can see the difference in motion in Counter-Strike 2 between a 360Hz monitor without G-Sync Pulsar and one with the technology switched on in the video below.
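
The arithmetic behind those figures is simple; a quick sketch, with the caveat that the 4x multiplier is NVIDIA's own claim rather than an independently measured value:

```python
def perceived_motion_clarity_hz(fps: float, multiplier: float = 4.0) -> float:
    # NVIDIA says Pulsar "effectively quadruples" perceived refresh rate,
    # hence the default multiplier of 4; this just models that claim.
    return fps * multiplier

print(perceived_motion_clarity_hz(250))  # 1000.0, matching the article's example
```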

The first four displays designed specifically to support G-Sync Pulsar and Ambient Adaptive Technology, which automatically adjusts color temperature and brightness based on ambient lighting, will be available starting on January 7. Acer, AOC, ASUS and MSI will each release a 27-inch 2,560 × 1,440 IPS display with a 360Hz refresh rate and 500 nits of peak HDR brightness.

This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/nvidias-g-sync-pulsar-tech-can-minimize-motion-blur-for-gamers-140000058.html?src=rss

Razer’s Project Motoko is a concept gaming headset that doubles as an AI wearable

We see plenty of far-out ideas on the CES show floor, and this year Razer brought in a concept piece called Project Motoko. The device is Razer's take on blurring the line between a gaming headset and an AI-powered wearable for daily life. Or it's a way for Ghost in the Shell fans to feel affronted by Razer taking The Major's name in vain, take your pick. 

Project Motoko is powered by Qualcomm's Snapdragon platforms. The headset has a pair of first-person view cameras positioned at eye level that can support real-time object and text recognition. It also has a wide field of attention that can capture things happening beyond the normal human eye's peripheral vision, and its microphone array is designed to capture both near and distant audio. In addition to taking in environmental details, Project Motoko can also operate as a wearable AI assistant, and the model is compatible with several different chatbots. The company's press release noted that it can integrate with Grok, OpenAI and Gemini.

"Project Motoko is more than a concept, it’s a vision for the future of AI and wearable computing," Nick Bourne, Razer's global head of mobile console division, said in the press release about the device. "By partnering with Qualcomm Technologies, we’re building a platform that enhances gameplay while transforming how technology integrates into everyday life. This is the next frontier for immersive experiences."

Sometimes concept designs we see at CES remain just that: thought experiments with no plan for commercial release. However, according to a rep from Razer: "Our goal is to bring this headset to market in the near future, with a dev-kit available first followed by a full retail release." It should be interesting to see if Razer does commit to Project Motoko in the longer term. Razer is also well-established as a brand for serious gamers, so seeing the company explore making an AI wearable that could appeal to a broader audience, if still a geeky one, is an intriguing move.

Update, January 6, 2026, 1:08PM ET: Added statement from Razer rep about future commercial prospects.

This article originally appeared on Engadget at https://www.engadget.com/gaming/razers-project-motoko-is-a-concept-gaming-headset-that-doubles-as-an-ai-wearable-140000534.html?src=rss

Ring relaunches its suite of smart home sensors

Ring turned up to CES with a whole host of announcements, including a revamped range of home sensors. Ring Sensors (for that is their name) is a new lineup of tools, built on Amazon’s Sidewalk low-power networking protocol. That includes updated versions of its door, window and break glass sensors, as well as a new OBD-II car alarm, motion detectors and panic buttons. You’ll be able to pre-order the new car alarm today, while the rest of the new sensors will be available at some point in March. And, in tandem with that news, Amazon is announcing that Sidewalk is expanding outside of the US, starting in Canada and Mexico.

At the same time, the company is launching a number of enhancements to its app platform, including the Ring Appstore. This will let users purchase and integrate with third-party apps which have been built to cater to “specific use cases, from small business operations to everyday needs around the home.” The company added that, in the coming weeks, users will be able to browse a growing number of apps designed to help “you get more value from your Ring cameras.”

The company is also throwing more AI into its system to better coordinate its alerts, including Unusual Event Alerts. These will learn the patterns around your home and, when they spot something out of the ordinary, send you a ping. Active Warnings, meanwhile, will use computer vision to identify potential threats and offer “specific warnings based on details like location and actions.”

Finally, Ring has teamed up with Watch Duty, a non-profit alert platform designed to share useful information about local wildfires. The pair have added a Fire Watch feature to the Neighbors app to help communities keep each other in the loop about local fire and smoke events. Ring users are encouraged to share details from their own Ring cameras to “support first responders on the ground.”

This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/ring-relaunches-its-suite-of-smart-home-sensors-140000667.html?src=rss

Segway’s Navimow brand unveiled a new line of robotic lawn mowers at CES 2026

Segway, the maker of Steve Wozniak's favorite mode of self-balancing transport, unveiled a new series of robotic lawn mowers under its Navimow brand at CES 2026, designed for all manner of Roomba-esque mowing action. The lineup includes four residential series.

Navimow's lineup includes the flagship X4 Series for large yards up to 1.5 acres in size. Its AWD system can handle slopes up to 40 degrees, and it sports dual 180-watt cutting motors. The largest model, the X450, will retail for $3,000. A smaller X430 rated for yards up to 1 acre will go for $2,500.

The Navimow i2 series comes in AWD and LiDAR variants and is designed for "everyday" lawn maintenance. The AWD variant's three-wheel-drive system can handle 24-degree slopes and is built for muddy or slippery terrain. The larger i2 AWD model, called the i210 AWD, is rated for yards up to a quarter-acre in size and will retail for $1,300. A smaller model dubbed the i206 AWD can handle yards as large as 0.15 acres and will go for $1,000.

The i2 LiDAR variant can scan 200,000 points per second to create a detailed spatial map of your yard, allowing it to navigate complex paths and, crucially, work at night. That model is rated for yards up to 0.37 acres in size. Pricing has not been announced for the i215 LiDAR model.

Finally, the H2 series integrates three vision technologies into one model, with LiDAR, Network RTK and cameras combining into what Navimow is calling LiDAR+. Built for slopes up to 24 degrees, the H2 is being released in two models: the H210 for yards up to 0.25 acres and the H220 for half-acre yards. Pricing for the H2 series has not been released yet.

The i2 AWD Series and the X4 Series will be available for pre-order beginning January 16.

This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/segways-navimow-brand-unveiled-a-new-line-of-robotic-lawn-mowers-at-ces-2026-130007014.html?src=rss

Ugreen launches a smart home security platform at CES

Ugreen makes plenty of things, but you’re probably familiar with the name in the context of its NAS systems (should that be NASes? Who knows). Naturally, the company has turned up to CES 2026 with new NAS hardware, but it’s also branching out into home security. It’s announcing SynCare, an AI-infused all-in-one surveillance platform which, it rather boldly claims, will become an “attentive, integrated guardian” of your home.

Leading the pack is the SynCare Video Doorbell with head-to-toe 4K video, intelligent detection and 24/7 recording — especially if you’ve got it hooked up to your Ugreen NAS. That works in tandem with SynCare cameras offering 4K video on a pan-tilt base and, of course, AI to recognise “people, pets and key events.” Ugreen is also offering a tablet, the SynCare Smart Display, a “home hub” to let you manage your cameras from a single place in your home. 

The company is quick to highlight the major benefit of an at-home system like this: no need to pay for a monthly subscription. And, of course, footage from your home stays inside your home at all times, making it a better option for folks who value their privacy. Sadly, Ugreen isn’t ready to share pricing or availability information for the series, saying it’ll arrive in the back end of 2026.

This article originally appeared on Engadget at https://www.engadget.com/wearables/ugreen-launches-a-smart-home-security-platform-at-ces-130000389.html?src=rss

Meta has delayed the international rollout of its display glasses

Meta is pausing release of its Ray-Ban Display smart glasses to the UK, France, Italy and Canada due to "unprecedented demand and limited inventory," the company said on Monday at CES 2026. There's no new date for the expansion that was originally set for early 2026. "We'll continue to focus on fulfilling orders in the US while we re-evaluate our approach to international availability," Meta wrote on its blog.

Since Meta's display glasses first went on sale, acquiring them has been a challenge. They're not available online and can only be found in a limited number of retail outlets, including select Ray-Ban, Sunglass Hut, LensCrafters and Best Buy locations in the United States. To buy them, you need to book an appointment for a demo at one of those stores via Meta's website. Ahead of launch, Meta said it saw "strong" demand for demos, with locations booked ahead for several weeks.

There was optimism that availability would increase as the company expected buying options to "expand" the longer they were on sale. However, with the delay of the planned international launch, it appears that the company still has a mismatch between supply and demand. 

Meta's $799 Ray-Ban Display glasses are its first to incorporate a heads-up display and are also equipped with a camera, stereo speakers, six microphones, WiFi 6 and a finger tracking Neural Band controller. In her review, Engadget's senior reporter Karissa Bell noted that the Ray-Ban display "enables wearers to do much more than what's currently possible with [other] Ray-Ban or Oakley models" — provided you don't mind the look of the chunky, chunky frames. 

This article originally appeared on Engadget at https://www.engadget.com/social-media/meta-has-delayed-the-international-rollout-of-its-display-glasses-120056833.html?src=rss

Meta’s EMG wristband is moving beyond its AR glasses

Meta has been experimenting with EMG technology for years. In 2025, the company commercialized it for the first time in its Meta Ray-Ban Display glasses, which users control via a dedicated neural band that is able to interpret subtle muscle movements in the wrist.

Now, at CES 2026, the company is offering its first look at how its neural band could be used to control devices outside of its smart glasses lineup. Meta has teamed up with Garmin, as well as a handful of research partners, to explore some intriguing use cases for its wrist-based controller.

The social media company has previously worked with Garmin on fitness integrations for its glasses. But at CES, the companies were showing off a very early demo of using Meta's neural band inside a car to control the built-in infotainment system.

The experience is part of Garmin's "Unified Cabin" concept, which explores a bunch of AI-centric in-car experiences. The demo I tried was fairly limited: while wearing a neural band, I was able to navigate two apps on a touchscreen display in Garmin's cockpit setup. In one, I used pinch and swipe gestures to manipulate an onscreen model of a car, much like how I would use the band to zoom in and out of an image while wearing the display glasses. The second demo, somewhat bizarrely, was a game of 2048. I used the same swipe gestures to move the tiles around. 

Neither of those are the kinds of experiences you immediately think of when you imagine "in-car entertainment," but Garmin, which works with a number of major car brands on infotainment systems, seems to be thinking about some more practical use cases too. The company told me that it will explore using the neural band to control vehicle functions like rolling down windows or unlocking doors. 

Elsewhere, Meta also announced a research collaboration with the University of Utah that will explore how its EMG tech can be used to help people who have ALS, muscular dystrophy and other conditions that affect the use of their hands.

Researchers will work with Meta to test gestures that could enable people to control smart speakers, blinds, thermostats, locks and other household devices using the neural band.  "Meta Neural Band is sensitive enough to detect subtle muscle activity in the wrist — even for people who can’t move their hands," the company explains in a blog post. Researchers will also look at using the band for mobility use cases, like the University of Utah's TetraSki program, which currently uses a joystick or mouth-based controller to help participants ski.

Update, Tuesday, January 6, 2026, 3:40PM PT: Added a video from Garmin’s demo.

This article originally appeared on Engadget at https://www.engadget.com/wearables/metas-emg-wristband-is-moving-beyond-its-ar-glasses-120000503.html?src=rss