Hackers made robot vacuums randomly yell racial slurs

Robot vacuums across the country were hacked over the space of several days, according to reporting by ABC News. The attackers could not only control the robovacs, but also use their speakers to hurl racial slurs and abusive comments at anyone nearby.

All of the affected robots were of the same make and model, the Chinese-made Ecovacs Deebot X2s. This particular robovac has developed a reputation for being easy to hack, thanks to a critical security flaw. ABC News, for instance, was able to get full control over one of the robots, including the camera.

One victim of this week’s hacks was a Minnesota lawyer named Daniel Swenson. He told ABC that he was watching TV when the robot started making weird noises, like “a broken-up radio signal or something.” Through the app, Swenson could tell that a stranger was accessing the live camera feed and the remote control feature.

He reset the password and rebooted the vacuum, but that’s when the weirdness really started. It immediately started moving again of its own accord and the speakers began emitting a human voice. This voice was yelling racist obscenities right in front of Swenson’s son.

"I got the impression it was a kid, maybe a teenager," said Swenson. "Maybe they were just jumping from device to device messing with families." Ultimately, he said it could have been worse, such as if the vacuum silently spied on his family for days on end.

Swenson’s device was hacked on May 24. That same day, another Deebot X2s in Los Angeles began chasing a dog around. This vacuum’s speakers also shouted abusive comments. Five days later, a similar incident happened in El Paso. It remains unclear how many of the company’s devices have been hacked in total.

At the root of this issue is a security flaw that allows bad-faith actors to bypass the required four-digit security PIN and gain control of the vacuum. The issue originally came to light in December 2023. The Bluetooth connector also has a flaw that allows for complete access from up to 300 feet away, but since the attacks occurred throughout the country, the Bluetooth vulnerability is an unlikely culprit.

According to Gizmodo, the company has developed a patch to eliminate the aforementioned security flaw that’ll roll out sometime in November. We reached out to Ecovacs to get a confirmation on this.

This article originally appeared on Engadget at https://www.engadget.com/cybersecurity/hackers-are-making-robot-vacuums-randomly-yell-racial-slurs-184017187.html?src=rss

Intel’s upcoming Arrow Lake H laptop chips will offer beefier GPUs for AI workloads

Alongside its new family of Arrow Lake desktop hardware, Intel today also gave us a few tidbits about its upcoming Arrow Lake H mobile chips for high-performance laptops. First off, they're not expected to arrive until the first quarter of 2025 — but the slight wait might be worth it, as Intel says they will offer powerful new Xe GPUs with XMX. Thanks to that upgrade, the GPU alone will offer four times the AI workload processing of Intel's previous chips, alongside double the ray tracing performance and twice as much cache (8MB L2).

Notably, though, these new chips will still lag behind the company's less powerful Lunar Lake processors when it comes to NPU and overall AI TOPS (tera operations per second) figures. Arrow Lake H's NPU will hit 13 TOPS, the new GPU will reach 77 TOPS and the CPU will offer 9 TOPS. All told, that adds up to 99 TOPS of performance. Lunar Lake, meanwhile, sports a 48 TOPS NPU and up to 120 TOPS of system-wide AI performance.

Intel Arrow Lake H
Intel

The difference makes sense when you consider what these chips are meant for. Lunar Lake is mostly geared towards ultraportables and slim workstations, while Arrow Lake H chips are targeted at demanding notebooks with desktop-like performance. While they can technically be called AI PCs, Arrow Lake H's low NPU performance doesn't meet the bar for Microsoft's Copilot+ badge (which requires an NPU with at least 40 TOPS). You'll be able to run basic AI features, like Windows Studio Effects in video chats, but not more complicated tasks like Recall.

Intel didn't have many other details to share about Arrow Lake H, but we'll likely hear more at CES 2025.

This article originally appeared on Engadget at https://www.engadget.com/computing/laptops/intels-upcoming-arrow-lake-h-laptop-chips-will-offer-beefier-gpus-for-ai-workloads-150021214.html?src=rss

Amazon will start offering regular and grocery items in a single same-day order

Amazon said on Wednesday that it’s rolling out new online ordering methods for Prime members, including the ability to bundle standard orders and groceries in one same-day shipment. The company is also adding more combined Amazon / Whole Foods fulfillment centers and trialing a store where robots pack your Amazon orders while you shop for groceries.

The company’s bundling of same-day Amazon.com orders with groceries kicks off in the Phoenix area. Customers there can shop “tens of thousands of grocery items” (including fresh ones) alongside regular Amazon orders for things like AirPods or Lego sets. The items will be bundled in one order and arrive together in a user-selected, same-day or overnight delivery window.

The company plans to expand the combined same-day model to more areas after it tests and learns from the Phoenix trial.

Along similar lines, Amazon is expanding its product range in some Amazon Fresh fulfillment centers. The company is modifying 26 of them globally to add “the best of Whole Foods Market and household goods on Amazon.com.” Like the Phoenix-area trial, it aims to more efficiently combine orders from separate branches of the sprawling online store.

Finally, the company is trialing a Whole Foods Market of the future in Plymouth Meeting, Pennsylvania (about 19 miles from Philadelphia). The store will add an automated Amazon.com micro-fulfillment center to serve up household items (Amazon used the examples of Tide Pods and Pepsi) while you cruise the aisles, buying organic spinach and pita bread.

You’ll place your order on your phone, and robots will prepare your items in the back of house while you shop. The app will include a countdown of the estimated time remaining before the order is finished. The idea is for the micro-fulfillment center order to be ready by the time you check out: Take your phone to the counter, grab your Tide Pods and get hopping.

The service will launch at the Plymouth Meeting store in 2025. The video below shows a visualization of the futuristic Philly-area location in action.

This article originally appeared on Engadget at https://www.engadget.com/big-tech/amazon-will-start-offering-regular-and-grocery-items-in-a-single-same-day-order-184227542.html?src=rss

Lyft promises upfront hourly rates and traffic delay pay for its drivers

Contract workers for ride-sharing apps and services aren’t known for having the steadiest or even sanest of pay rates. Lyft just announced some new earnings improvements for its drivers that will help cover their expenses when rides take longer than they should and show them how much they’ll earn before they pick up a fare.

In a post on its official blog, Lyft said the changes aim to “tackle drivers’ biggest frustrations, and make it more rewarding” to drive for the platform.

One of the biggest improvements affects how drivers are paid if they are stuck in traffic or go out of their way to help a rider. Lyft is implementing a new “5-minute-delay pay” structure that will increase a driver’s pay if any ride takes five minutes longer than expected. “Out-of-your-way pay” covers drivers who have to drive out of the normal coverage area only to turn around and drive all the way back without any fares.

Lyft is also adding a new earnings dashboard to its mobile app for drivers. The new interface will show drivers’ daily, weekly and yearly earnings, as well as the estimated hourly rate for each ride “so drivers don’t have to make the mental calculation,” according to the post.

Drivers who drive electric vehicles for Lyft are also seeing some new benefits. EV drivers can choose only to receive rides that fall within their vehicle’s battery range and find nearby charging stations on the Lyft drivers app.

Of course, these new policies and changes won’t solve Lyft drivers’ problems overnight. It’ll take time to see if they make a dent (the good kind of dent).

This article originally appeared on Engadget at https://www.engadget.com/transportation/lyft-promises-upfront-hourly-rates-and-traffic-delay-pay-for-its-drivers-220142458.html?src=rss

The OnePlus 12 smartphone drops to a record low of $650 for Prime Day

In our review of the OnePlus 12, we said the smartphone's affordability was one of the best things about it. Now, thanks to Amazon's upcoming Prime Day sale, that's an even more compelling point. A 19 percent discount drops the price to $650 for the model with 12GB of RAM and 256GB of storage. That's a new all-time low (it's gone for $700 a few times previously). The model with 16GB/512GB is also $150 off and down to $750.

Besides being a great value for a flagship phone, the OnePlus 12 also has solid cameras that take sharp and clear images. Just note that the camera module is strikingly bulky, and you'll see a watermark from Hasselblad (the camera company OnePlus partners with) unless you opt to remove it.

The battery life is another win. We measured a lifespan of over 26 hours in our video rundown test. And after just 10 minutes of wall charging, the phone went from 10 to 55 percent. It reached 37 percent in the same amount of time sitting on a OnePlus wireless charger, which is available separately for $50.

The design is pleasingly "retro" (if you can apply that term to an aesthetic that harkens back a mere five years) with the rounded corners and tapered edges of something like the Galaxy S10+. The OnePlus 12 also takes notes from the past when it comes to AI — in that it mostly forgoes the buzzy new tech in favor of basic (but solid) smartphone functionality.

The screen and processor are fully modern, however, with a super bright and crisp 120Hz screen and a Snapdragon 8 Gen 3 chip that we found to offer fluid performance, despite clocking in a tad low on some benchmarks. 

Elsewhere, Amazon is also selling the OnePlus Open for $1,300, which is a $400 discount on the phone we named the more affordable pick for a flagship foldable. That's a price the Open has hit previously, as recently as last month. 

Follow @EngadgetDeals on Twitter for the latest tech deals and buying advice, and stay tuned to Engadget.com for all of the best tech deals coming out of October Prime Day 2024.

This article originally appeared on Engadget at https://www.engadget.com/the-oneplus-12-smartphone-drops-to-a-record-low-of-650-for-prime-day-220729473.html?src=rss

Google’s theft protection features have started showing up for some Android users

Three new theft protection features that Google announced earlier this year have reportedly started rolling out on Android. The tools — Theft Detection Lock, Offline Device Lock and Remote Lock — are aimed at giving users a way to quickly lock down their devices if they’ve been swiped, so thieves can’t access any sensitive information. Android reporter Mishaal Rahman shared on social media that the first two tools had popped up on a Xiaomi 14T Pro, and said some Pixel users have started seeing Remote Lock.

Theft Detection Lock is triggered by the literal act of snatching. The company said in May that the feature “uses Google AI to sense if someone snatches your phone from your hand and tries to run, bike or drive away.” In such a scenario, it’ll lock the phone’s screen. 

Offline Device Lock, on the other hand, can automatically lock the screen after a thief has disconnected the phone from the internet. You can already remotely lock your phone with Google’s Find My Device, but the third feature, Remote Lock, lets you do so without having to scramble to figure out your Google account password. All you’d need for this is “your phone number and a quick security challenge using any device.”

This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/googles-theft-protection-features-have-started-showing-up-for-some-android-users-210634941.html?src=rss

OpenAI rolls out Canvas, its newest ChatGPT interface

OpenAI has debuted a new workspace interface for ChatGPT called Canvas. The AI giant unveiled its new ChatGPT workspace on its official blog and immediately made it available for ChatGPT Plus and Team users. Enterprise and Edu users will be able to access Canvas sometime next week.

Canvas is a virtual interface space for writing and coding projects that allows users to consult with ChatGPT on certain portions of a project. A separate window opens beside the main chat space, and users can put writing or code on this new “canvas” and highlight sections to have the model focus on and edit “like a copy editor or code reviewer,” according to the blog.

Canvas can be opened manually by typing “use canvas” in your prompt, or it can open automatically when ChatGPT “detects a scenario in which it could be helpful,” according to the blog post. There are also several shortcuts for writing and coding projects. For writing projects, users can ask ChatGPT for suggested edits or length adjustments, or ask it to change the reading level of a block of text, from graduate school level down to kindergarten. It can also add "relevant emojis for emphasis and color."

Coders can have ChatGPT review their code and add inline suggestions for improvements. It can also mark up your work with logs and comments to aid in debugging and make understanding your code easier. It's also capable of fixing bugs and porting code to a different language, such as JavaScript, TypeScript, Python, Java, C++ or PHP, in Canvas mode.

OpenAI’s Canvas feature brings ChatGPT in line with other AI tools that offer separate workspaces for focusing on specific parts of a project, like Anthropic's Artifacts and the AI-powered code editor Cursor.

Update, October 4, 12:55PM ET: This story was edited after publishing to include more context on the code and text functionality of the Canvas feature.

This article originally appeared on Engadget at https://www.engadget.com/ai/openai-rolls-out-canvas-its-newest-chatgpt-interface-230335185.html?src=rss

Snap’s fifth-generation Spectacles bring your hands into augmented reality

Snap’s latest augmented reality glasses have a completely new — but still very oversized — design, larger field of view and all-new software that supports full hand tracking abilities. But the company is only making the fifth-generation Spectacles available to approved developers willing to commit to a year-long $99/month subscription to start.

It’s an unusual strategy, but Snap says it’s taking that approach because developers are, for now, best positioned to understand the capabilities and limitations of augmented reality hardware. They are also the ones most willing to commit to a pricey $1,000+ subscription to get their hands on the tech.

Developers, explains Snap’s director of AR platform Sophia Dominguez, are the biggest AR enthusiasts. They’re also the ones who will build the kinds of experiences that will eventually make the rest of Snapchat’s users excited for them too. “This isn't a prototype,” Dominguez tells Engadget. “We have all the components. We're ready to scale when the market is there, but we want to do so in a thoughtful way and bring developers along with our journey.”

Snap gave me an early preview of the glasses ahead of its Partner Summit event, and the Spectacles don’t feel like a prototype the way its first AR-enabled Spectacles did in 2021. The hardware and software are considerably more powerful. The AR displays are sharper and more immersive, and they already support over two dozen AR experiences, including a few from big names like Lego and Niantic (Star Wars developer Industrial Light & Magic also has a lens in the works, according to Snap.)

To state the obvious, the glasses are massive. Almost comically large. They are significantly wider than my face, and the arms stuck out past the end of my head. A small adapter helped them fit around my ears more snugly, but they still felt like they might slip off my face if I jerked my head suddenly or leaned down.

Still, the new frames look slightly more like actual glasses than the fourth-generation Spectacles, which had a narrow, angular design with dark lenses. The new frames are made of thick black plastic and have clear lenses that are able to darken when you move outside, sort of like transition lenses.

The fifth-generation Spectacles are the first to have clear lenses.
Karissa Bell for Engadget

The lenses house Snap’s waveguide tech that, along with “Liquid Crystal on Silicon micro-projectors,” enable their AR abilities. Each pair is also equipped with cameras, microphones and speakers.

Inside each arm is a Qualcomm Snapdragon processor. Snap says the dual processor setup has made the glasses more efficient and prevents the overheating issues that plagued their predecessor. The change seems to be an effective one. In my nearly hour-long demo, neither pair of Spectacles I tried got hot, though they were slightly warm to the touch after extended use. (The fifth-generation Spectacles have a battery life of about 45 minutes, up from 30 minutes with the fourth-gen model.)

Snap's newest AR Spectacles are extremely thick.
Karissa Bell for Engadget

Snap has also vastly improved Spectacles’ AR capabilities. The projected AR content was crisp and bright. When I walked outside into the sun, the lenses dimmed, but the content was very nearly as vivid as when I had been indoors. At a resolution of 37 pixels per degree, I wasn’t able to discern individual pixels or fuzzy borders like I have on some other AR hardware.

But the most noticeable improvement from Snap’s last AR glasses is the bigger field of view. Snap says it has almost tripled the field of view from its previous generation of Spectacles, increasing the window of visible content to 46 degrees. Snap claims this is equivalent to having a 100-inch display in the room with you, and my demo felt significantly more immersive than what I saw in 2021.

The fourth-generation Spectacles (above) were narrow and not nearly as oversized as the fifth-gen Spectacles (below).
Karissa Bell for Engadget

It isn’t, however, fully immersive. I still found myself at times gazing around the room, looking for the AR effects I knew were around me. At other points, I had to physically move around my space in order to see the full AR effects. For example, when I tried out a human anatomy demo, which shows a life-sized model of the human body and its various systems, I wasn’t able to see the entire figure at once. I had to move my head up and down in order to view the upper and lower halves of the body.

The other big improvement to the latest Spectacles is the addition of full hand tracking abilities. Snap completely redesigned the underlying software powering Spectacles, now called Snap OS, so the entire user interface is controlled with hand gestures and voice commands.

You can pull up the main menu on the palm of one hand, sort of like Humane’s AI Pin, and simply tap the corresponding icon to do things like close an app or head back to the lens explorer carousel. There are also pinch and tap gestures to launch and interact with lenses. While Snap still calls these experiences lenses, they look and feel more like full-fledged apps than the AR lens effects you’d find in the Snapchat app.

Lego has a game that allows you to pick up bricks with your hands and build objects. I also tried a mini golf game where you putt a golf ball over an AR course. Niantic created an AR version of its Tamagotchi-like character Peridot, which you can place among your surroundings.

The interface for Snapchat's AI assistant, MyAI, on Spectacles.
Snap

You can also interact with Snapchat’s generative AI assistant, MyAI, or “paint” the space around you with AR effects. Some experiences are collaborative, so if two people with Spectacles are in a room together, they can view and interact with the same AR content together. If you only have one pair of Spectacles, others around you can get a glimpse of what you’re seeing via the Spectacles mobile app. It allows you to stream your view to your phone, a bit like how you might cast VR content from a headset to a TV.

The new gesture-based interface felt surprisingly intuitive. I occasionally struggled with lenses that required more precise movements, like picking up and placing individual Lego bricks, but the software never felt buggy or unresponsive.

There are even more intriguing use cases in the works. Snap is again partnering with OpenAI so that developers can create multimodal experiences for Spectacles. “Very soon, developers will be able to bring their [OpenAI] models into the Spectacles experience, so that we can really lean into the more utilitarian, camera-based experiences,” Dominguez says. “These AI models can help give developers, and ultimately, their end customers more context about what's in front of them, what they're hearing, what they're seeing.”

CEO Evan Spiegel has spent years touting the promise of AR glasses, a vision that for so long has felt just out of reach. But if the company’s 2021 Spectacles showed AR glasses were finally possible, the fifth-generation Spectacles feel like Snap may finally be getting close to making AR hardware that’s not merely an experiment.

For now, there are still some significant limitations. The glasses are still large and somewhat unwieldy, for one. While the fifth-gen Spectacles passably resemble regular glasses, it’s hard to imagine walking around with them on in public.

Then again, that might not matter much to the people Snap most wants to reach. As virtual and mixed reality become more mainstream, people have been more willing to wear the necessary headgear in public. People wear their Apple Vision Pro headsets on airplanes, in coffee shops and other public spaces. As Snap points out, its Spectacles, at least, don’t cover your entire face or obscure your eyes. And Dominguez says the company expects its hardware to get smaller over time.

Snap's fifth-generation Spectacles are its most advanced, and ambitious, yet.
Karissa Bell for Engadget

But the company will also likely need to find a way to reduce Spectacles’ price. Each pair reportedly costs thousands of dollars to produce, which helps explain Snap’s current insistence on a subscription model, but it’s hard to imagine even hardcore AR enthusiasts shelling out more than a thousand dollars for glasses that have less than one hour of battery life.

Snap seems well aware of this too. The company has always been upfront with the fact that it’s playing the long game when it comes to AR, and that thinking hasn’t changed. Dominguez repeatedly said that the company is intentionally starting with developers because they are the ones “most ready” for a device like the fifth-gen Spectacles and that Snap intends to be prepared whenever the consumer market catches up.

The company also isn’t alone in finally realizing AR hardware. By all accounts, Meta is poised to show off the first version of its long-promised augmented reality glasses next week at its developer event. Its glasses, known as Orion, are also unlikely to go on sale anytime soon. But the attention Meta brings to the space could nonetheless benefit Snap as it tries to sell its vision for an AR-enabled world.

This article originally appeared on Engadget at https://www.engadget.com/social-media/snaps-fifth-generation-spectacles-bring-your-hands-into-into-augmented-reality-180026541.html?src=rss

Uber’s rider ID program is available everywhere in the US as of tomorrow

Uber just announced the expansion of safety features directed toward drivers, including a national rollout of enhanced rider verification, which begins tomorrow. If a rider undergoes these additional verification steps they’ll get a “Verified” badge on their account, which will let drivers know everything is on the up and up.

The company says it designed these new verification measures “in response to driver feedback that they want to know more about who is getting in their car.” The company began testing this feature earlier this year, and it must have been a success, as it's getting a national rollout. Lyft has its own version of this tool, though it's still being tested in select markets.

Uber verifies riders by cross-checking account information against third-party databases, though it’ll also accept official government IDs. The program will also allow users to bring in their verification status from the CLEAR program.

While rider ID is the most notable safety tool announced, Uber’s also bringing its Record My Ride feature to the whole country after another successful beta test. This lets drivers record the entirety of the ride via their smartphone cameras, without the need to invest in a dashcam. The footage is fully encrypted, with Uber saying nobody can access it unless a driver sends it in for review. The company hopes this will allow it to “more quickly and fairly resolve any incidents that may arise.”

The Uber block feature.
Uber

Drivers can now cancel any trip without a financial penalty and they can “unmatch” from any riders they don't feel comfortable picking up. Finally, there’s a new PIN verification feature in which drivers can request riders to enter a number to confirm they are, in fact, the correct guest.

Uber tends to focus its resources on riders over drivers, so this is a nice change of pace. It is kind of a bummer, however, that drivers require this kind of enhanced verification system just to root out some bad apples and keep doing their jobs. In other words, don’t be a jerk during your next Uber ride.

Correction, September 17 2024, 10:45AM ET: This story and its headline originally stated that Uber's rider verification program was rolling out nationwide as of today. The rollout starts tomorrow, September 18. We apologize for the error.

This article originally appeared on Engadget at https://www.engadget.com/transportation/ubers-rider-id-program-is-now-available-everywhere-in-the-us-143037313.html?src=rss

The Morning After: The AirPods Pro’s new hearing aid features are a big deal

Folded between all the new hardware announcements, Apple surprised us last week with news of FDA-approved hearing aid features for the AirPods Pro. No new hardware needed — it’s all in software updates. In the last decade, we’ve seen several companies tackle hearing-aid technology, aided by the boom in wireless tech. Now, arguably the most influential company in consumer tech is trying it. John Falcone outlines why this is a big deal. Or, at least, a very good deal.

— Mat Smith

The biggest stories you might have missed

The iPhone 16 event is over, and now we’ve got plenty of thoughts to share after playing with all of Apple’s new hardware. In this episode, Devindra and Cherlynn chat about the entire iPhone 16 and Pro lineup, and Billy Steele joins to chat about his experience with the AirPods 4 and Apple Watch Series 10. It turns out the Apple Watch stole the show.

Listen here

Annapurna

The entire Annapurna Interactive team has left the company after its executives walked out, according to a Bloomberg report. Apparently, the video game publisher had been negotiating with Annapurna Pictures to spin off Annapurna Interactive into its own entity. Those talks broke down, so “all 25 members of the Annapurna Interactive team collectively resigned,” the team said in a joint statement.

Continue reading.

This article originally appeared on Engadget at https://www.engadget.com/general/the-morning-after-the-airpods-pros-new-hearing-aid-features-are-a-big-deal-111529376.html?src=rss