Fujifilm’s latest Instax camera looks like a vintage Super 8

Fujifilm just revealed the Instax mini Evo Cinema camera, which looks suspiciously like a vintage Super 8. More specifically, it was designed to mimic the Single-8 from 1965, a rival format to Super 8. Fujifilm's latest device captures video, just like its retro inspiration.

However, this is an Instax, and the line has primarily been dedicated to snapping and printing out still images on the fly. The Evo Cinema can still do that, albeit in a slightly different way. Users shoot a video and the camera can convert a shot from the footage into an Instax print. That's pretty cool. The bad news? It requires some kind of QR code tomfoolery.

The camera also comes equipped with something called the Eras Dial, which has nothing to do with Taylor Swift and everything to do with adjusting various effects and filters to create footage "inspired by different eras." There are ten "eras" to choose from, including a 1960s vibe. The filter levels here are adjustable. We'll have to take a look at some footage to see how everything translates.

The Eras Dial.
Fujifilm

Fujifilm is dropping the Instax Evo Cinema on January 30, but only in Japan for now. We don't have a price yet.

This is just the latest nifty camera gizmo the company has thrust upon the world. It recently released an Instax model that has a secondary camera for selfies.

This article originally appeared on Engadget at https://www.engadget.com/cameras/fujifilms-latest-instax-camera-looks-like-a-vintage-super-8-194537863.html?src=rss

The Shine 2.0 is a compact wind turbine for your next camping trip

As power gets more dicey, personal energy generation only gets more appealing. Shine’s compact turbine isn’t going to power your house any time soon (though Rachel Carr, the company’s co-founder, told me they have plans in that direction), but it can generate the energy needed to refill a smartphone in as little as 17 minutes. Of course, what it can generate depends on wind speed. That same charge could take as long as 11 hours if there’s only a slight breeze.

That power curve, and the ability to operate at night, sets the turbine apart from solar panels. Of course, on a completely still day, the Shine is as inert as a becalmed sailing ship, but if the wind picks up to even a light breeze, it gets to work making power. The turbine even automatically pivots on the included stand to face into the wind.

Shine turbine 2.0
Shine

The Shine 2.0 looks like a thin space football and has a screw-off cap that reveals a hollow compartment for the stand and tie-downs. The cap then doubles as a key to unlock the blades. It all weighs just three pounds, which is impressively light considering it also houses a 12,000mAh battery that can output up to 75 watts. This is the second version of the turbine, and updates include a USB-C port (instead of USB-A) and app connectivity.
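For a rough sense of what those numbers add up to, here's a quick back-of-envelope sketch. The pack voltage and the phone battery size below are my assumptions, not figures from Shine, so treat the output as ballpark math rather than spec-sheet fact.

```python
# Back-of-envelope math for the Shine 2.0's stated specs.
# Assumptions (not from Shine): a nominal 3.7 V pack voltage and a
# ~15 Wh smartphone battery (roughly a 4,000 mAh phone).

PACK_MAH = 12_000          # stated battery capacity
PACK_VOLTAGE = 3.7         # assumed nominal cell voltage
PHONE_WH = 15.0            # assumed phone battery energy

pack_wh = PACK_MAH / 1000 * PACK_VOLTAGE   # ≈ 44 Wh of storage
refills = pack_wh / PHONE_WH               # ≈ 3 phone charges per full pack

# The 17-minute claim implies generation on the order of 50 W in strong
# wind; the 11-hour figure works out to only a watt or two in light air.
implied_fast_watts = PHONE_WH / (17 / 60)
implied_slow_watts = PHONE_WH / 11

print(f"{pack_wh:.0f} Wh pack ≈ {refills:.1f} phone refills")
print(f"implied generation: ~{implied_fast_watts:.0f} W (strong wind), "
      f"~{implied_slow_watts:.1f} W (light breeze)")
```

In other words, a full battery is good for roughly three phone charges, and the 17-minute claim implies the turbine is harvesting somewhere around 50 watts when the wind really cooperates.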

The company claims you can set the entire thing up in around two minutes. I watched Carr take the turbine from fully closed to unfurled and ready for the stand in about that long. Unfortunately, there was no wind rushing through the CES show floor so I couldn’t see it spin on its own, but Carr was kind enough to spin it for me.

Spinning the Shine Turbine 2.0
Amy Skorheim for Engadget

Possibly the most exciting part is Shine’s plan for more expansive power generation. Shine 3.0, which the company is working on now, will be a 100- to 300-watt system, and grid-tied turbines are on the wish list.

Pre-orders are now open for the Shine 2.0 through Indiegogo for $399, and units should begin shipping this spring.

Update, January 7 2026, 4:00PM ET: This story has been updated to correct the wattage output and include the co-founder’s name.

This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/the-shine-20-is-a-compact-wind-turbine-for-your-next-camping-trip-191000940.html?src=rss

ASUS and XREAL teamed up at CES to make gaming smartglasses with two important upgrades

The latest generation of smartglasses, which can create huge virtual screens without the need to lug around giant monitors, is a real boon to frequent travelers. However, their specs aren’t often tailored to the needs of gamers, so at CES 2026, ASUS and XREAL partnered to make a pair with two very important features you don’t normally get from rivals.

The new ROG XREAL R1 AR glasses are based on the existing XREAL One Pro, so naturally they share a lot of the same components and specs, including dual micro-OLED displays with a per-eye resolution of 1,920 x 1,080, three degrees of freedom (natively), 700-nit peak brightness, a 57-degree FOV and built-in speakers tuned by Bose. However, the big difference on the R1s is that instead of maxing out at a 120Hz refresh rate, ASUS and XREAL’s collab goes all the way up to 240Hz. That’s a pretty nice bump, especially for people with older hardware or anyone who might not have access to a high-refresh-rate display or just doesn’t want to lower their standards while traveling.

The ROG XREAL R1 AR smartglasses deliver 1,920 x 1,080 resolution to each eye with a 240Hz refresh rate and 57-degree FOV.
Sam Rutherford for Engadget

The other big addition is the R1’s included ROG Control Dock, which from what I’ve seen is slightly better suited for home use. It’s designed to be a simple hub with two HDMI 2.0 jacks, one DisplayPort 1.4 connector and a couple of USB-C slots (one is for power), so you can quickly switch between multiple systems like your desktop and console with a single touch. That said, depending on the situation you might not even need the dock at all because the R1s can also be connected to compatible PCs or gaming handhelds like the ROG Ally X and ROG Xbox Ally X (see the synergy there?) directly via USB-C. 

When I got to try them out at CES, the R1s delivered a very easy-to-use and relatively streamlined kit. At 91 grams, they are barely heavier than the original XREAL One Pro (87g), so they don’t feel too weighty or cumbersome. I also really like the inclusion of electrochromic lenses, which allow you to change the tint of the glasses with the touch of a button. This lets you adjust how much or how little light comes in through the front to best suit your environment. And thanks to support for three DOF, you can pin your virtual screen in one location or let it follow you around.

Of course, ASUS and XREAL couldn't resist putting RGB lighting on the ROG XREAL R1 AR smartglasses.
Sam Rutherford for Engadget

Now it is important to remember that in order to get 240Hz on the smartglasses, you need hardware capable of pushing that kind of performance. So depending on the title, when the R1s are connected to something like a gaming handheld, you might not be able to get there. Luckily, I had the chance to use the specs when connected to a PC as well, which let me really appreciate the smoothness you get from faster refresh rates. General image quality was also quite good thanks to the glasses’ 1080p resolution, so I had no trouble reading text or discerning small UI elements.

The ROG Control dock makes it easy to connect multiple devices to the ROG XREAL R1 AR smartglasses, but it may be a bit too bulky to pull out in tight situations like on a plane.
Sam Rutherford for Engadget

My one small gripe is that I kind of wish its 57-degree FOV was a tiny bit bigger, but that’s more of a limitation of current optical technology as there aren't a ton of similarly sized specs that can go much higher (at least not yet). That said, even with its current FOV, you can still create up to a 171-inch virtual screen at four meters away, which is massively bigger than any portable screen you might entertain carrying around.
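That 171-inch figure checks out with a bit of basic trigonometry, assuming the 57-degree number is the diagonal field of view (the spec doesn't say, so that's my assumption):

```python
import math

# Quick geometry check on the "171-inch screen at four meters" claim,
# assuming the 57-degree figure is the diagonal field of view.
FOV_DEG = 57
DISTANCE_M = 4.0

diagonal_m = 2 * DISTANCE_M * math.tan(math.radians(FOV_DEG / 2))
diagonal_in = diagonal_m / 0.0254

print(f"{diagonal_m:.2f} m ≈ {diagonal_in:.0f} inches diagonal")
# ≈ 4.34 m, or roughly 171 inches — consistent with the stated figure.
```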

Unfortunately, ASUS and XREAL haven’t announced official pricing or a release date for the R1s yet, but hopefully they won’t cost too much more than the XREAL One Pro, which are currently going for $649.


This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/asus-and-xreal-teamed-up-at-ces-to-make-gaming-smartglasses-with-two-important-upgrades-190500897.html?src=rss

Brunswick’s latest boats at CES 2026 feature edge AI, self-docking capabilities and solar power

If you've never docked a boat before, consider yourself lucky. There are plenty of popular TikTok channels devoted to shaming those who bring their craft back home clumsily or berth them with something less than finesse. Tricky crosswinds, unpredictable surf and even the jeers of passersby can make it a stressful experience at the best of times.

Brunswick, which owns more than 50 water-borne brands like Sea Ray, Bayliner and Mercury Marine, has a solution. It's demonstrating some self-docking tech called AutoCaptain at CES 2026 that makes this process a cinch, plus a fleet of other innovations that, in some cases, leave some of the smart cars on the show floor looking a bit remedial.

One of those technologies is edge AI. While in-car AI is an increasingly common feature, those agents are exclusively running remotely, relying on cellular connections to offload all the processing power required to drive a large language model.

Sadly, that won't always work on a boat.

One of Brunswick's tech-equipped boats
Brunswick

"One of the things about AI for boats is you don't have connectivity, so there is some edge compute required," David M. Foulkes told me. He's chairman and CEO of Brunswick.

Many of the company's boats do have active cellular connectivity, but head far enough offshore, and you're on your own unless you're packing Starlink or the like.

To solve that, Brunswick is using advanced SoCs from NVIDIA and other providers that can run a limited agent offline — on the edge, as it were. When offline, Misty, as the on-boat AI assistant is called, won't be able to make dinner reservations or craft a 3,000-word treatise on the history of America's relationship with Greenland. It can, however, help with navigation or boat settings.
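For illustration only, here's a minimal sketch of that kind of connectivity-aware fallback. None of the names or calls below are Brunswick or NVIDIA APIs; the "models" are stubs that just show the routing between a full cloud agent and a constrained on-board one.

```python
# Illustrative sketch only: route to a cloud model when connected, fall back
# to a limited on-board model otherwise. All names here are hypothetical.

EDGE_TOPICS = {"navigation", "boat_settings", "manual_lookup"}

def cloud_reply(prompt: str) -> str:
    return f"[cloud model] full answer to: {prompt}"

def edge_reply(prompt: str) -> str:
    return f"[on-board model] limited answer to: {prompt}"

def classify(prompt: str) -> str:
    # Toy classifier; a real system would do this with the on-board model itself.
    if "dock" in prompt or "route" in prompt:
        return "navigation"
    if "trim" in prompt or "lights" in prompt:
        return "boat_settings"
    return "general"

def assistant_reply(prompt: str, is_online: bool) -> str:
    if is_online:
        return cloud_reply(prompt)        # anything goes when connected
    if classify(prompt) in EDGE_TOPICS:
        return edge_reply(prompt)         # constrained offline skill set
    return "I'm offline — I can still help with navigation and boat settings."

print(assistant_reply("plot a route around the shoal", is_online=False))
print(assistant_reply("book dinner for four tonight", is_online=False))
```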

"It'll help answer the kind of questions that you might need to take out a manual to understand and maybe act as an assistant to make your boating a bit smoother," Foulkes said.

When the company's smart boats are connected, they offer some degree of remote control. No, you can't drive it around the docks and freak out your pier-mates, but you can check on the boat remotely to make sure nobody's trying to stow away. You can even precondition it to get the cuddy cabin nice and cool before you come aboard.

Navan C30

Power for that, and a variety of other onboard systems, can come from an integrated power system called Fathom, which has a lot in common with modern smart home tech. Solar panels on the roof (nicely disguised beneath a black mesh) collect power to recharge an onboard battery, with capacities upwards of 30 kWh depending on the boat's size. That battery can also be recharged by the onboard motors, like the three 425-horsepower V10s on the Sea Ray SLX 360 drydocked at the Brunswick booth at CES 2026.

The juice in that battery can then be used to power a variety of onboard systems, even charging a pair of electric hydrofoils, which another of the company's boats, called the Navan C30, had strapped on the roof.

You'll also find cameras on the roof of these boats. That's how the AutoCaptain feature works: numerous fisheye lenses scan the water in every direction. Approach a pier and the AI assistant asks if you'd like some help docking. Just tap the button on the touchscreen, then kick back and let Misty do the driving.

Between automated docking, the in-cabin AI assistant and the smart power distribution system, Brunswick's boats offered some impressive tech. But then they'd have to, given the cost. The Sea Ray SLX 360 Outboard has a starting price of $586,000. The smaller Navan C30 is rather more attainable, but still extreme, at $227,500. That’s still probably cheaper than hiring a real captain, though.


This article originally appeared on Engadget at https://www.engadget.com/transportation/brunswicks-latest-boats-at-ces-2026-feature-edge-ai-self-docking-capabilities-and-solar-power-185500213.html?src=rss

Niko is a robotic lift for people with limited mobility that doesn’t require a caregiver’s help

A startup called ReviMo has developed a robotic system that provides a way for people with limited mobility to lift and transfer themselves — like from a bed to a wheelchair, or to the toilet — without the assistance of a caregiver. ReviMo's Niko has two sets of arms: one that forms a "scooping seat" that slides underneath the person to lift them up, and the other encircling their torso and providing a backrest. It can be operated both by remote and using the controls on its dashboard. Niko in its current iteration can carry up to 250 pounds, but the team says it's working on a version that can support up to 400 pounds.

In addition to aiding in transfers, Niko can lift the rider to a standing level and offers retractable handlebars for support. It also has the potential to be a big help to caregivers, who in many cases assist with multiple transfers every day. Even in a situation where a person isn't able to operate it by themselves, Niko facilitates a transfer that requires much less physical exertion than today's common methods, like sling-based mechanical lift systems. At CES, founder Aleksandr Malaschenko gave a demonstration of its lifting capabilities, using it to scoop him up from a chair and bring him out into the aisle. 

Niko is designed to work with most wheelchairs and be compact enough to navigate small bathrooms. It can position a person right above a toilet, and there are disposable seat covers. The goal is to help people with limited mobility achieve more independence.

It is the kind of device that, if it delivers on its promises, could be a game-changer for people with limited mobility and paralysis, and their loved ones. My dad was diagnosed with ALS when I was a kid, and I learned how to operate a Hoyer lift by the time I was in middle school. This strikes me as something we would have really appreciated having around. Malaschenko has said the inspiration for the robotic system came from being a caregiver for his grandfather following a stroke. 

Niko will cost about $15,000, though the team said it's working to get it covered by insurance. The company is also offering lower prices for customers who sign up for one of its premium waitlists, and there are options to rent it for those who would only need a lift and transfer system temporarily. 

This article originally appeared on Engadget at https://www.engadget.com/transportation/niko-is-a-robotic-lift-for-people-with-limited-mobility-that-doesnt-require-a-caregivers-help-184500703.html?src=rss

Ubisoft is shutting down a studio 16 days after it unionized

Ubisoft is closing a Canadian studio just over two weeks after it unionized. In a dizzying claim, the company told GamesIndustry.biz that the closure of Ubisoft Halifax was part of "company-wide actions to streamline operations" and unrelated to the unionization.

On December 22, Ubisoft Halifax announced that 61 of its workers had joined the Game & Media Workers Guild of Canada. At the time, the studio's lead programmer, Jon Huffman, told CTV News that 73.8 percent of employees voted in favor of unionizing. Ominously in hindsight, he had described the decision as a "huge relief." The studio was working on mobile titles within the Rainbow Six and Assassin's Creed franchises.

Ubisoft's official statement framed the shutdown as part of a broader pattern of financial belt-tightening. "Over the past 24 months, Ubisoft has undertaken company-wide actions to streamline operations, improve efficiency, and reduce costs," the company said. "As part of this, Ubisoft has made the difficult decision to close its Halifax studio. 71 positions will be affected. We are committed to supporting all impacted team members during this transition with resources, including comprehensive severance packages and additional career assistance."

In October, Ubisoft announced that Massive Entertainment, developer of The Division series, Star Wars Outlaws and Avatar: Frontiers of Pandora, was offering buyouts to some employees. The company framed that move as a "voluntary career transition program." Over the past few years, Ubisoft has closed offices and laid off workers in San Francisco, London and Leamington. In 2024, the company's headcount dropped by eight percent.

This article originally appeared on Engadget at https://www.engadget.com/gaming/ubisoft-is-shutting-down-a-studio-16-days-after-it-unionized-183000983.html?src=rss

Narwal Flow 2 at CES 2026: Sees Everything, Cleans Smarter

Robot vacuums quietly went from novelty to background appliance, yet many still behave like polite bumper cars. They avoid walls, follow schedules, and send maps, but they do not really understand what they are seeing. A cable, a sock, and a pet toy often get the same treatment, which is why people still hover nearby during automatic cleaning runs, ready to intervene when the robot inevitably gets confused by something obvious.

Narwal Flow 2 is the latest step in the brand’s attempt to build a robot that actually sees and decides. It builds on earlier DirtSense and dual-camera work, but now leans on a NarMind Pro autonomous system and a foundation-model brain to recognize unlimited objects, assign risk levels, and adjust both path and cleaning strategy. This is less about more suction and more about better judgment, the kind that changes behavior based on whether it is looking at a table leg, a pet bowl, or a crawling mat.

Designer: Narwal

The 2026 flagship also adopts a brand-new design outlook, with a rational arc-form dock featuring a frosted glass panel on the front and easy-lift water tanks shaped for straight-up lifting. The integrated status light bar communicates through the frosted glass instead of scattered LEDs, giving the dock a more premium, sleek presence. It is designed to look less like an appliance you hide in a corner and more like a considered object that can live in visible spaces without visual friction.

A Robot That Sees and Decides

The Narwal Flow 2 uses dual RGB cameras and a VLA OmniVision model running on a 10 TOPS AI platform to capture 1.5 million data points per second. It categorizes objects as no-risk, low-risk, mid-risk, or high-risk, then adjusts distance and behavior accordingly. Walls invite close cleaning within 8 mm, pet bowls get 20 mm of space, and high-risk items like pet waste trigger a protective bypass at 70 mm.
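To make the tiering concrete, here's a tiny illustrative lookup built from the clearances quoted above. The tier assignments and names are my own reading of the description, not Narwal's actual implementation.

```python
# Illustrative sketch only: the four risk tiers and the example clearances
# Narwal quotes, arranged as a simple lookup. Which tier each object belongs
# to is inferred from the description, not taken from the company's code.

TIER_CLEARANCE_MM = {
    "no_risk": 0,      # open floor: clean right over it
    "low_risk": 8,     # walls: "close cleaning within 8 mm"
    "mid_risk": 20,    # pet bowls: "20 mm of space"
    "high_risk": 70,   # pet waste: "protective bypass at 70 mm"
}

OBJECT_TIER = {"wall": "low_risk", "pet_bowl": "mid_risk", "pet_waste": "high_risk"}

def clearance_for(detected: str) -> int:
    """Clearance in millimetres for a detected object (default: no risk)."""
    return TIER_CLEARANCE_MM[OBJECT_TIER.get(detected, "no_risk")]

print(clearance_for("pet_waste"))  # 70
```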

Adaptive smart cleaning means Flow 2 uses different strategies for dry debris, wet spills, and heavy messes. Dual-direction mopping keeps the side brush from dragging dirty water into clean zones, with a reverse pass to protect the brush and a forward pass to lift stains. Cloud-based recognition feeds back into the model, so the robot becomes more tuned to a specific home over time, learning which corners collect dust and which zones need extra attention.

Living with Pets, Babies, and Busy Schedules

In Pet Care Mode, Flow 2 automatically identifies pet-active zones and adapts for deeper cleaning there, while treating pet bowls, beds, and toys as objects to avoid bumping or soaking. The same visual system that keeps it away from waste can be used to scan for a missing pet on command, turning the robot into a quiet scout when you are not home and want to make sure your dog is not locked in a bedroom.

Baby Care Mode shifts behavior around cribs and crawling mats. Flow 2 can drop into ultra-quiet mode near a sleeping baby, recognize toys left on the floor and nudge you to pick them up, and avoid rolling over dedicated play areas to keep them as clean as possible. The goal is not to replace parenting, but to make the robot feel like it understands which zones are more sensitive than others, adjusting volume and intensity without manual scheduling.

The updated dock and mapping round out the picture. TrueColor 3D mapping turns the home into a more intuitive map where you can tap rooms or furniture for targeted cleaning, while AI Floor Tag remembers floor types and zones. The all-in-one base station now uses a reusable dust bag and washable debris filter, along with hot-water self-cleaning and hot-air drying, so the system stays hygienic without filling a trash bag with single-use consumables every few weeks or emitting odors between runs.

Mopping That Stays Clean While It Cleans

The FlowWash mopping system treats the mop like a moving track rather than a pair of pads. Sixteen angled nozzles continuously infuse the track with fresh water, while a reverse-rolling mop applies 12 N of downward pressure and 140 °F heat. A tight scraper presses against the fabric to strip away dirt in real time, so the surface touching the floor is constantly refreshed instead of slowly turning into a gray sponge you would not want to touch.

Wastewater extraction and storage, with a built-in stirrer in the dirty tank, prevents residue and odors from settling. That matters in homes where mopping is not just about dust, but about food spills, pet accidents, and whatever kids drag in from outside. The system is designed so that by the time Flow 2 returns to its dock, both the floor and the mop have been treated, not just one at the expense of the other.

On a mixed floor with tile in the kitchen and wood in the living room, Flow 2 can push harder and use hotter water on stubborn kitchen stains, then ease off as it moves into more delicate areas. EdgeReach capabilities let the track mop get within 0.19 in of walls and baseboards, reducing the need for manual follow-up with a traditional mop that you have to wring out by hand.

Beyond the Floor

The Flow 2 is not the only thing Narwal is launching at CES 2026. The V50 Series cordless vacuum brings the same auto-empty, smart dirt detection philosophy to a stick form, with a compact dock that handles a 3.2-quart dust bin, active dust scraping, and push-in charging. At 3.1 pounds with dual detachable batteries and 210 AW of suction, it combines CarpetFocus Mode and full-cycle de-tangling with a dirt-detection headlight and multi-cyclone H13 filtration, turning a handheld into something that feels almost as hands-free as a robot.

The U50 Series mattress vacuum targets a different corner of the home, using 137 °F iron-heating, UVC sterilization, 60,000 taps per minute, and 16,000 Pa of suction to pull mites and allergens out of mattresses and upholstery. It weighs just 3.7 pounds and uses sealed, disposable dust bags with a transparent window, so you can treat beds and sofas without dealing with messy dust cups or touching what comes out. Together, the V50 and U50 show Narwal extending its maintenance-free, AI-aware design language into spaces the robot cannot reach, keeping the entire home cleaner without multiplying the number of chores you actually have to do.

Narwal Flow 2: See Further, Think Deeper, Clean Smarter

Flow 2 is a sign that robot vacuums are finally moving from smart enough not to fall down the stairs to smart enough to adapt to how you live. It still has big suction numbers and a long spec sheet, but the interesting part is how it sees pets, babies, and messes differently, and how it keeps its own mop clean while it works. For a category that has been chasing power for years, that kind of judgment feels like the more meaningful upgrade, especially when the alternative is manually zoning a map and hoping the robot does not knock over a water bowl or wake up a napping toddler on its next routine pass.

The post Narwal Flow 2 at CES 2026: Sees Everything, Cleans Smarter first appeared on Yanko Design.

LG’s CLOiD robot can fold laundry and serve food… very slowly

When LG announced that it would demo a laundry-folding, chore-doing robot at CES 2026, I was immediately intrigued. For years, I've wandered the Las Vegas Convention Center halls and wondered when someone might create a robot that can tackle the mundane but useful tasks I despise, like folding laundry. With CLOiD (pronounced like "Floyd"), LG has proven that this is theoretically possible, but probably not likely to happen any time soon.

I went to the company's CES booth to watch its demonstration of CLOiD's abilities, which also include serving food, fetching objects and fitness coaching. During a very carefully choreographed 15-minute presentation, I watched CLOiD grab a carton of milk out of the fridge, put a croissant in an oven, sort and fold some laundry and grab a set of keys off a couch and hand them to the human presenter.

Throughout the demonstration, LG showed off how its own appliances can play along with the robot. When it rolled over to the fridge, the door automatically opened, as did the oven. When the LG-branded robot vacuum needed to move around a hamper, CLOiD helpfully cleared the path. But the robot also moved very slowly, which you can see in the highlight video below. 

The appliance maker is selling the setup as a part of its vision for a "zero labor home" where its appliances and, I guess, robotics technology can come together to take care of all your chores and household upkeep. Maybe I'm jaded from a decade of watching CES vaporware, but I left the slick demo thinking the concept is unlikely to amount to much anytime soon.

On one hand, it is exciting to see robots competently performing tasks that would actually be useful to most people. But this technology is still far from accessible. Even LG isn't making any firm commitments about CLOiD's future as anything more than a CES demo. The company has instead said that CLOiD is a signal of its interest in creating "home robots with practical functions" and "robotized appliances," like fridges with doors that can open automatically. 

That may be a more reasonable target for the company (and yet another way for LG to sell us more appliance upgrades). But it's still pretty far from anything approaching the fantasy of a "zero labor home."

This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/lgs-cloid-robot-can-fold-laundry-and-serve-food-very-slowly-181902306.html?src=rss

These robotic sneakers gave me a surprising boost at CES

I'll admit that I've always kind of taken walking for granted. Other than a knee injury more than a decade ago, my ability to walk long distances has largely been limited only by my own choices. That's not the case for everyone, though. And robotics company Dephy has created a pair of robotic sneakers, called the Sidekick, that are meant to help people who want to walk more than their bodies might otherwise be capable of.

The system consists of two parts: an ankle-worn exoskeleton and a special pair of sneakers that attach to it. The exoskeleton hooks onto the back of the shoe and is secured with a strap around your calf. The battery-powered device is equipped with sensors that can detect and adapt to the wearer's gait in order to deliver an extra "boost" with each step.

The whole setup is pricey, at $4,500, but Dephy is betting that people who have "personal range anxiety" might be willing to pay for the extra confidence the Sidekick can provide. "This is a device that's kind of like [having] an extra calf muscle," Dephy CEO Luke Mooney told me. 

The Sidekick.
Karissa Bell for Engadget

I was able to take the Sidekick for a spin around the CES show floor and it was a truly surprising sensation. The best way I can describe walking with the Sidekick powered on is that with every step forward there's a noticeable upward push from under your heel. It wasn't enough to throw me off balance, but it did feel a bit strange.

The Sidekick has adjustable power levels based on how much help you might need. At the highest level, it definitely felt unnecessarily pushy. The lower levels were still noticeable but felt less disruptive. I just felt… bouncy. Later, when Mooney turned off the power entirely, I noticed that my feet felt weirdly heavy in a way they hadn't just a few minutes before. 

Mooney was quick to tell me that I'm not Dephy's target demographic. "A lot of times people who are fit, or like athletes, actually struggle to adopt to the technology because their body's so in tune with how they move," he said. "Whereas folks who are not as physically active and fit, their body's ready to accept help."

The company's technology will be used in products more focused on athletic performance, however. Dephy has partnered with Nike on its upcoming robotic sneaker currently known as Project Amplify. Mooney declined to share details on the collaboration, but the shoemaker has claimed that some early testers have been able to improve their mile times by two minutes. 

I tried the Sidekick early in the day. Several hours later, though, when I was walking between the Las Vegas Convention Center halls for the third or fourth time, I started thinking about those robotic sneakers again. I was getting close to 10,000 steps and hadn't sat down for hours. My feet were sore. I remembered that strange, bouncy boost and thought it sounded kind of nice.

This article originally appeared on Engadget at https://www.engadget.com/wearables/these-robotic-sneakers-gave-me-a-surprising-boost-at-ces-174500005.html?src=rss

Spotify now lets you share what you’re listening to in real time via chat

Spotify is rolling out more social features to keep people on the platform. It's adding a new tool to its messaging platform that lets users see what their friends and family members are listening to in real time.

Once activated, a user's listening activity will be displayed at the top of the chat. The other person in the chat can tap the bar to play a particular track, save it or react with an emoji. People can also, of course, comment directly to either praise or rag on the song selection.

There's another little addition to Spotify's messaging system. Users will now be able to invite chat participants to start a Jam, which is the app's collaborative listening feature. Premium users will find a "Jam" button in the top right corner, which sends an invite. This lets two people add tracks to a shared queue and listen together. Free users can join one of these sessions but cannot initiate one.

It's worth noting that the messaging platform is currently just a one-on-one affair. There's no option for a group chat, so users won't be able to spy on multiple people simultaneously. These tools are rolling out gradually for iOS and Android right now, but won't be broadly available for a few weeks.

This article originally appeared on Engadget at https://www.engadget.com/entertainment/music/spotify-now-lets-you-share-what-youre-listening-to-in-real-time-via-chat-173749120.html?src=rss