Engadget Podcast: Does Humane’s AI Pin live up to the hype?

Humane’s hyped-up AI Pin is finally here and, unfortunately, it stinks. This week, Cherlynn and Devindra are joined by Michael Fisher (AKA MrMobile) and Wired Reviews Editor Julian Chokkattu to chat about the AI Pin and the many ways it fails. It’s often inaccurate, it takes crummy photos, and it gets way too hot. Not so great for something you’re supposed to wear all day! Is there any hope for AI-dependent gadgets? Also, Washington Post columnist Christopher Velazco joins to discuss Apple’s approval of used iPhone components for repairs.


Listen below or subscribe on your podcast app of choice. If you've got suggestions or topics you'd like covered on the show, be sure to email us or drop a note in the comments! And be sure to check out our other podcast, Engadget News!

Topics

  • Too much heat, too few features: Humane’s AI Pin doesn’t live up to the hype – 1:09

  • Other News: Apple will allow devices to be repaired with secondhand parts soon – 44:08

  • Google’s Cloud Next ’24 event announces an AI video generation tool, an Arm-based CPU for data centers, and Google Photos tools for all subscribers – 53:10

  • Working on – 1:00:59

  • Pop culture picks – 1:05:40

Subscribe!

Credits 

Hosts: Cherlynn Low and Devindra Hardawar
Guests: Michael Fisher (MrMobile) and Julian Chokkattu
Producer: Ben Ellman
Music: Dale North and Terrence O'Brien

This article originally appeared on Engadget at https://www.engadget.com/engadget-podcast-humane-ai-pin-review-110052283.html?src=rss

Apple Vision Pro two months later: A telepresence dream

Two months after I started using the Apple Vision Pro, it hasn't transformed the way I live. It hasn't replaced my TV, and it doesn't make me want to give up my powerful desktop or slim laptops. It's just another tool in my gadget arsenal — one I can don to catch up on X-Men '97 in bed, or to help me dive deep into research while I'm away from my office. The Vision Pro becomes normal so quickly, it's almost easy to forget how groundbreaking it actually is. Its screens are still absolutely stunning, and the combination of eye tracking and Apple's gesture controls makes for the most intuitive AR/VR interface I've seen yet.

While the Vision Pro still isn't something most people should consider buying, Apple has thrown out a few bones to early adopters. There are more games popping up on the App Store and Arcade every week, and there are also a handful of 3D films being offered to Apple TV+ subscribers. The addition of Spatial Personas also goes a long way towards making the Vision Pro more of a telepresence machine (more on that below). But we're still waiting for the company to make good on the promise of 180-degree Immersive Video, as well as to let users create higher quality Spatial Videos on iPhones.

Using the Apple Vision Pro with a MacBook Air
Photo by Devindra Hardawar/Engadget

How I use the Apple Vision Pro

Once the pressure of reviewing every aspect of the Vision Pro was over, I started incorporating it into my life like a typical user. (Full disclosure: I returned the unit I originally bought, but Apple sent along a sample for further testing.) Mostly, that means not forcing myself to use the headset for large chunks of the day. Instead, my Vision Pro time is more purpose-driven: I slip it on in the morning and project my MacBook's screen to clear out emails and catch up on Slack conversations, all while a YouTube video is virtually projected on my wall.

In the middle of a work session, or sometimes right before diving into a busy workday, I run through a five- or ten-minute meditation session with the Mindfulness app. I can easily meditate without any headgear, but I've found the app's calm narration and the immersive environment it creates (since it completely blocks out the real world) to be incredibly helpful. It's like having your own yoga teacher on staff, ready to help calm your brain whenever you have a free moment.

I've also learned to appreciate the Vision Pro as a way to expand where I can get work done. As someone who's been primarily working from home since 2009, I learned early on that changing locations was an easy way to keep myself engaged. I try not to write in the same place where I've been checking email in the morning, for example. I normally hop between a PC desktop and large monitor (currently it's Alienware's 32-inch 4K OLED) in my office, and a MacBook Air or Pro for writing around the house. Sometimes I'll go to a nearby park or cafe when I need to zone into a writing assignment for several hours.

Using the Apple Vision Pro
Photo by Devindra Hardawar/Engadget

With the Vision Pro, I can actually handle some serious multitasking from my deck or kitchen without being tied to a desktop computer. I've found that useful for covering events to avoid getting stuck inside my basement office (I can have a video streaming on a virtual window, as well as Slack and web browsers open via a projected MacBook). I've juggled conference calls while being sick in bed with the Vision Pro, because it felt more comfortable than staring down at a tiny laptop display.

I still haven’t traveled much with the headset, but I can foresee it being useful the next time I take a weekend trip with my family. Tested's Norman Chan told me he's used the Vision Pro during long flights, where it makes the hours just disappear. I'm still working myself up to that — I'd much rather use a small laptop and headphones on planes, but I can imagine the beauty of watching big-screen movies on the Vision Pro while everyone else is staring at tablets or cramped seat-back displays.

The Vision Pro remains a fantastic way to watch movies or TV shows at home, as well. When I'm too tired to head downstairs after putting my kids to sleep, I sometimes just veg in bed while projecting YouTube videos or anime on the ceiling. That's where I experienced a trippy temporal shift while watching X-Men '97: As soon as its remastered theme song spun up, I was immediately transported back to watching the original show on a 13-inch TV in my childhood bedroom. If I could somehow jump back into the past, Bishop-style, it would be impossible to convince my 10-year-old self that I'd eventually be watching a sequel series in a futuristic headset, projected in a 200-inch window. How far we've come.

Apple Vision Pro Spatial Persona collaboration with Norm Chan from Tested.
Photo by Devindra Hardawar/Engadget

Spatial Personas are a telepresence dream

When Apple first announced the Vision Pro last year, I couldn't help but be creeped out by its Persona avatars. They looked cold and inhuman, the exact sort of thing you'd imagine from soulless digital clones. The visionOS 1.1 update made them a bit less disturbing, but I didn't truly like the avatars until Apple unveiled Spatial Personas last week. Instead of being confined to a window, Spatial Personas hover in your virtual space, allowing you to collaborate with friends as if they were right beside you.

The concept isn't entirely new: I tested Microsoft Mesh a few years ago with a HoloLens 2 headset, which also brought digital avatars right into my home office. But they looked more like basic Miis from the Nintendo Wii than anything realistic. Meta's Horizon Workrooms did something similar in completely virtual spaces, but that's not nearly as impressive as collaborating digitally atop a view of the real world.

Apple's Spatial Personas are far more compelling than Microsoft’s and Meta's efforts because they're seamless to set up — you just have to flip on Spatial mode during a FaceTime chat — and they feel effortlessly organic. During a Spatial Persona call with Norm from Tested, we were conversing as if he was sitting right in front of me in my home theater. We were able to draw and write together in the Freeform app easily — when I stood up and reached out to the drawing board, it was almost as if we were standing beside each other at a real whiteboard.

Apple Vision Pro Spatial Persona collaboration
Photo by Devindra Hardawar/Engadget

SharePlay with Spatial Personas

We were also able to customize our viewing experiences while watching a bit of Star Trek Beyond together using SharePlay in the Vision Pro. Norm chose to watch it in 2D, I watched in 3D, and our progress was synchronized. It felt more engrossing than a typical SharePlay session, since I could just lean over and chat with him instead of typing out a message or saying something over a FaceTime call. I also couldn't help but imagine how easy it would be to record movie commentaries for podcasts using Spatial Personas. (We'd have to use separate microphones and computers, in addition to Vision Pros, but it would make for a more comfortable recording session than following movies on a monitor or TV.)

Our attempts to play games together failed, unfortunately, because we were running slightly different versions of Game Room, and we didn’t have enough time during our session to sync our apps up. I was eventually able to try out Chess and Battleship with other Vision Pro-equipped friends and, once again, it felt like they were actually playing right beside me. (Norm and CNET's Scott Stein also looked like they were having a ball with virtual chess.)

The main stumbling block for Spatial Personas, of course, is that they require a $3,500 headset. Apple is laying the groundwork for truly great telepresence experiences, but it won't matter for most people until they can actually afford a Vision Pro or a cheaper Apple headset down the line.

With Horizon Workrooms, Meta allowed non-VR users to join virtual meetings using Messenger on phones and computers, so they weren’t left out. Standard FaceTime users can also join Vision Pro chats alongside Spatial Personas, but they'll be stuck in a window. And unlike Meta's offering, regular users won't be able to see any virtual environments (though they can still collaborate in specific apps like Freeform). Meta's big advantage over Apple is capacity: Horizon Workrooms supports up to 16 people in VR, plus 34 more calling in from other devices, while Spatial Persona chats are limited to five participants.

Apple Immersive Video
Apple

No momentum for Immersive Video

Apple's 180-degree Immersive Video format was one of the most impressive aspects of the Vision Pro when I previewed it last year, and the handful of experiences at launch were pretty compelling. But the Immersive Video well has been dry since launch — the only new experience was a five-minute short showing off the 2023 MLS Playoffs, which was mostly disappointing.

The short's resolution and depth were impressive enough that I felt like I was actually on the pitch, but the experience is disorienting because it cuts far too often, and with no sense of rhythm. Once you get settled into a scene, perhaps watching someone line up a shot on goal, the camera view changes and you have no idea where you are. It's almost like a five-minute lesson in what not to do with Immersive Video. Hopefully, the MLS has a longer experience in the works.

I'm not expecting a tsunami of Immersive Video content, since the Vision Pro is still an obscenely expensive device meant for developers and professionals, but it would be nice to see more of a push from Apple. The company is teasing another six-minute episode of Prehistoric Planet for later this month, but again that isn't really much. Where are the creators pushing Immersive Video to new heights? While the content is likely hard to work with since it's shot in 3D and 8K, the format could be a perfect way for Apple to extol the virtues of its new chips.

In lieu of more Immersive Videos, I’ve been spending more time re-watching Spatial Videos captured with my iPhone 15 Pro. They still look more realistic than 2D clips, but I’ve grown to dislike the 1080p/30fps limitation. It’s just hard to accept that resolution when I know my phone can also produce crisp 4K and 60fps footage. The $3 app Spatialify helps somewhat by unlocking 1080p/60fps and 4K/30fps spatial video capture, but its footage is also shakier and buggier than the built-in camera's. At this point, I’ll consider using Spatialify if my phone is on a tripod or gimbal, but otherwise I’ll stick with the native camera app.

Using the Apple Vision Pro with a MacBook Air outside
Photo by Devindra Hardawar/Engadget

What’s next for the Apple Vision Pro

We’ll likely have to wait until Apple’s WWDC 24 event in June before we hear about any more major upgrades for Vision Pro or visionOS. That would be appropriate, since last year’s WWDC was the headset’s big debut (and a hellish day for us trying to cover all the news). Now that the hardware is in the wild, Apple has to convince developers that it’s worth building Vision Pro apps alongside their usual iOS, iPadOS and macOS wares. It’s not just some mythical spatial computing platform anymore, after all.

This article originally appeared on Engadget at https://www.engadget.com/apple-vision-pro-two-months-later-a-telepresence-dream-181550906.html?src=rss

Engadget Podcast: Why pay $10 a month to get away from Google search?

Google has gone from being the go-to search engine to something people are paying to avoid entirely. This week, Cherlynn and Devindra chat with 404 Media co-founder Jason Koebler about his experience moving away from Google and towards Kagi, a $10-a-month search engine without ads or data tracking. Funny enough, Kagi still relies on Google’s index, so it’s a lot like using that site before the onslaught of ads, sponsored posts and AI results. Also, we discuss the company’s lies around Chrome’s incognito mode, as well as the news that it will be deleting user data collected in that mode. (Be sure to check out the 404 Media podcast too!)


Listen below or subscribe on your podcast app of choice. If you've got suggestions or topics you'd like covered on the show, be sure to email us or drop a note in the comments! And be sure to check out our other podcast, Engadget News!

Topics

  • Why Jason Koebler moved from Google to Kagi's paid search engine – 0:45

  • Google says it will destroy data collected from users browsing in Incognito mode – 15:01

  • Gurman report: Apple is working on personal home robots – 24:55

  • Amazon just walked out on its self-checkout tech – 30:43

  • FCC set to vote to restore Net Neutrality – 43:00

  • Apple adds Spatial Personas to make the Vision Pro experience less lonely – 45:09

  • Proposed California state law would give tech workers the “right to disconnect” – 47:17

  • Tekken director responds to fighting game fans’ request for a Waffle House stage – 49:57

  • Around Engadget – 51:22

  • Working on – 54:31

  • Pop culture picks – 59:13

Subscribe!

Credits 

Hosts: Cherlynn Low and Devindra Hardawar
Guest: Jason Koebler
Producer: Ben Ellman
Music: Dale North and Terrence O'Brien

This article originally appeared on Engadget at https://www.engadget.com/engadget-podcast-google-search-kagi-incognito-123049753.html?src=rss

Microsoft may have finally made quantum computing useful

The dream of quantum computing has always been exciting: What if we could build a machine working at the quantum level that could tackle complex calculations exponentially faster than a computer limited by classical physics? But despite iterative hardware announcements from IBM, Google and others, quantum computers still aren't being used for any practical purposes. That might change with today's announcement from Microsoft and Quantinuum, who say they've developed the most error-free quantum computing system yet.

While classical computers and electronics rely on binary bits as their basic unit of information (they can be either on or off), quantum computers work with qubits, which can exist in a superposition of two states at the same time. The trouble with qubits is that they're prone to error, which is the main reason today's quantum computers (known as Noisy Intermediate Scale Quantum [NISQ] computers) are just used for research and experimentation.
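(For reference, a qubit's state is conventionally written as the superposition |ψ⟩ = α|0⟩ + β|1⟩, with |α|² + |β|² = 1; measuring it yields 0 with probability |α|² and 1 with probability |β|².)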

Microsoft's solution was to group physical qubits into logical qubits, which allows it to apply error diagnostics and correction without destroying them, and to run it all on Quantinuum's hardware. The result was an error rate 800 times better than relying on physical qubits alone. Microsoft claims it was able to run more than 14,000 experiments without any errors.
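To build intuition for why grouping helps, here's a minimal sketch in Python: a three-qubit repetition code decoded by majority vote. This is a toy model, not Microsoft and Quantinuum's actual (far more sophisticated) error-correction scheme, but it shows how a logical error rate can fall well below the physical one:

    import random

    PHYSICAL_ERROR_RATE = 0.01  # assumed error probability per physical qubit

    def logical_bit_corrupted(p=PHYSICAL_ERROR_RATE, copies=3):
        # Encode one logical bit across `copies` physical qubits, flip each
        # independently with probability p, then decode by majority vote.
        flips = sum(random.random() < p for _ in range(copies))
        return flips > copies // 2  # decoding fails only if most copies flip

    trials = 200_000
    errors = sum(logical_bit_corrupted() for _ in range(trials))
    print(f"physical error rate: {PHYSICAL_ERROR_RATE}")  # 0.01
    print(f"logical error rate:  {errors / trials:.6f}")  # ~3p^2, i.e. ~0.0003

With a 1 percent physical error rate, the vote only fails when at least two of the three copies flip, so the logical error rate falls to roughly 3p², or about 0.03 percent. Real codes, with more qubits and smarter decoding, are how you get to figures like "800 times better."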

According to Jason Zander, EVP of Microsoft's Strategic Missions and Technologies division, this achievement could finally bring us to "Level 2 Resilient" quantum computing, which would be reliable enough for practical applications.

"The task at hand for the entire quantum ecosystem is to increase the fidelity of qubits and enable fault-tolerant quantum computing so that we can use a quantum machine to unlock solutions to previously intractable problems," Zander wrote in a blog post today. "In short, we need to transition to reliable logical qubits — created by combining multiple physical qubits together into logical ones to protect against noise and sustain a long (i.e., resilient) computation."

Microsoft's announcement is a "strong result," according to Aram Harrow, a professor of physics at MIT focusing on quantum information and computing. "The Quantinuum system has impressive error rates and control, so it was plausible that they could do an experiment like this, but it's encouraging to see that it worked," he said in an e-mail to Engadget. "Hopefully they'll be able to keep maintaining or even improving the error rate as they scale up."

Microsoft Quantum Computing
Microsoft

Researchers will be able to get a taste of Microsoft's reliable quantum computing via Azure Quantum Elements in the next few months, where it will be available as a private preview. The goal is to push even further to Level 3 quantum supercomputing, which will theoretically be able to tackle incredibly complex issues like climate change and exotic drug research. It's unclear how long it'll take to actually reach that point, but for now, at least we're moving one step closer towards practical quantum computing.

"Getting to a large-scale fault-tolerant quantum computer is still going to be a long road," Professor Harrow wrote. "This is an important step for this hardware platform. Along with the progress on neutral atoms, it means that the cold atom platforms are doing very well relative to their superconducting qubit competitors."

This article originally appeared on Engadget at https://www.engadget.com/microsoft-may-have-finally-made-quantum-computing-useful-164501302.html?src=rss

Apple brings Spatial Persona avatars to Vision Pro to help you feel less alone

Apple is making the Vision Pro a bit more social with the introduction of Spatial Personas, which breaks those avatars out of their restricted windows and plops them right next to you in virtual space. The goal is to make collaborating and hanging out feel more natural in the Vision Pro — you can work on presentations together, watch movies over SharePlay, or play games as if your friends are right beside you. The feature works with up to five participants, and it'll be available today for everyone with a Vision Pro running visionOS 1.1 or later.

Meta has tackled virtual collaboration similarly with Horizon Workrooms, but Apple's implementation reminds me more of Microsoft Mesh, which let me interact with virtual companions in AR using the HoloLens 2. Like the Vision Pro itself, Spatial Personas seem a bit more refined than Microsoft's 2021-era technology. You can enable or disable them at will from a FaceTime call, and Apple says everyone will be able to adjust content how they like, without affecting how their virtual companions will see it.

While I found Apple's Personas to be a bit creepy and robotic during my Vision Pro review, the company has steadily improved them to better capture different facial expressions and hairstyles. When they're stuck in a FaceTime window, Personas are a sub-par replacement for actually seeing your friends' faces. But they may be more useful if they can make it seem like your remote friends are actually sitting beside you.

This article originally appeared on Engadget at https://www.engadget.com/apple-brings-spatial-persona-avatars-to-vision-pro-to-help-you-feel-less-alone-144824140.html?src=rss

HP Spectre x360 14 review (2024): Keeping the 2-in-1 laptop dream alive

The idea behind convertible, or 2-in-1, PCs has remained the same over the last decade: Why buy a tablet when your laptop can fold a full 360 degrees, allowing you to use it as a large slate, or as a screen propped up without a keyboard in the way? Most PC makers have moved on from the concept entirely, but HP remains one of the holdouts. While Windows never became the tablet-friendly platform Microsoft envisioned, there's still plenty of value in having a machine that can transform to suit your needs.

That was my takeaway two years ago when I tested HP's 16-inch Spectre x360, and now the company has returned with a smaller model, the Spectre x360 14. It features Intel's latest CPUs with AI-accelerating NPUs (neural processing units), faster Intel Arc graphics and a beautiful 2.8K OLED display. And best of all, it's still usable as a tablet, unlike its larger sibling.

Even if you never plan to twist its screen around, though, the HP Spectre x360 14 is still an attractive premium laptop. For some, it may also serve as a more traditional alternative to Dell's new XPS 14, which has an invisible trackpad and a capacitive function row. While that computer looks great, getting used to its less conventional features takes some time. The Spectre x360 14, on the other hand, is both attractive and familiar to anyone who's ever used a laptop. (Its rotating screen takes just 10 seconds to figure out for the first time, while Dell's invisible trackpad still tripped me up hours after I started testing it.)

HP Spectre x360 14 front view
Photo by Devindra Hardawar/Engadget

Design and hardware

That familiarity could also be seen as a shortcoming of HP's. The Spectre x360 14 has everything you expect to see in a premium laptop today: A sleek metal case, a gorgeous screen with ultra-thin bezels and a luxuriously large trackpad with haptic feedback. But really, it doesn't look that much different from the 13-inch Spectre x360 I reviewed in 2019. It would be nice to see HP take a few major design leaps, but on the other hand, I can't blame the company for sticking with a winning design.

With the Spectre x360 14, HP focused on minor updates. It has a wide 14-inch screen with a 16:10 aspect ratio, compared to the previous model's 13.5-inch display that was a squarish 3:2. Its trackpad offers configurable haptic feedback and is 19 percent larger than before, so much so that it completely dominates the palm area. HP stuck with its wonderfully responsive keyboard, but its key caps are 12 percent larger, making them easier to hit. And to simplify functionality a bit, HP unified the power button and fingerprint sensor (the laptop also supports Windows Hello facial biometrics).

The port situation hasn't changed. There are two USB-C connections along the right rear (including one on its unique chopped corner), as well as a drop-down USB Type-A port on the left and a headphone jack on the corner. As usual, it would have been nice to see some sort of card reader built in, especially for a machine aimed at creative professionals.

HP Spectre x360 14 headphone jack and USB-A port
Photo by Devindra Hardawar/Engadget

The Spectre x360 14 may look very similar to its siblings, but HP says it's been tweaked significantly under the hood. It now supports 28-watt Intel Core Ultra CPUs, instead of the previous 14-watt options, and offers 10 percent more airflow than before. The company also managed to engineer those improvements without increasing the machine's 17-millimeter height. At 3.2 pounds, it's a bit more portable than the 3.5-pound MacBook Pro 14-inch.

The Spectre's 9-megapixel webcam is also a major upgrade from the previous 5MP option. The new sensor offers hardware-enabled low light adjustment thanks to quad-binning, the process of taking data from four pixels and combining them into one. That allows cameras with smaller pixels to let in more light, resulting in a brighter overall picture. During Google Meet and Zoom calls, the webcam delivered a sharp picture with bright and bold colors. It looked almost like a mirrorless camera once I enabled Windows Studio Effects background blur, though the picture occasionally looked overexposed in direct sunlight.
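For the curious, here's a minimal, illustrative sketch of what 2x2 binning does. This is a toy example in Python, not HP's actual sensor pipeline:

    import numpy as np

    def bin_2x2(raw: np.ndarray) -> np.ndarray:
        # Sum each 2x2 block of an (H, W) sensor readout into one pixel,
        # trading resolution for roughly 4x the collected signal per pixel.
        h, w = raw.shape
        return raw.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

    raw = np.random.poisson(lam=5, size=(8, 8))  # toy low-light readout
    print(bin_2x2(raw).shape)  # (4, 4): a quarter the pixels, each brighter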

HP Spectre x360 14 USB-C ports
Photo by Devindra Hardawar/Engadget

Video chats also sounded great through the laptop's quad-speaker array, which consists of two upward-firing tweeters along the keyboard and two woofers along the front. There's not much low-end (especially compared to Apple's MacBook Pro speakers), but voices and music sound surprisingly clear. The speakers can also get pretty loud without distortion, which is impressive for such a thin system.

While the laptop has an NPU-equipped processor, which powers features in Paint, ClipChamp and Windows Studio Effects, the Spectre x360 14 isn't technically an "AI PC" under Intel and Microsoft's definition. The reason? It doesn't have a dedicated button for Windows Copilot. Personally, though, I haven't found that key to be very useful on the XPS 14 and 16. Triggering Copilot from the taskbar or Windows sidebar isn't very difficult, and it's certainly not onerous enough to warrant giving up a spot on the keyboard.

HP Spectre x360 14 screen stand formation
Photo by Devindra Hardawar/Engadget

In use

The HP Spectre x360 14 I reviewed performed similarly to other machines we've tested with Intel's Core Ultra 7 155H chip. It’s fast and relatively efficient, especially compared to systems from two years ago. My review unit, which came with 32GB of RAM and a 2TB SSD, was around 36 percent faster in the PCMark 10 benchmark than the Spectre x360 16 from 2022 (6,493 points, up from 4,785). This year’s Spectre also more than doubled the older model's Cinebench R23 multi-core score, a testament to the improvements Intel has made since its 11th-gen CPUs.

Device | Geekbench 6 CPU | PCMark 10 | Cinebench R23 | 3DMark Wildlife Extreme
HP Spectre x360 14 (Intel Core Ultra 7, 2023) | 2,273/11,735 | 6,493 | 1,651/8,481 | 5,952
ASUS ZenBook 14 OLED (Intel Core Ultra 7, 2023) | 2,240/10,298 | 6,170 | 1,599/7,569 | 4,827
Apple MacBook Pro 14-inch (Apple M3, 2023) | 3,142/11,902 | N/A | 1,932/10,159 | 8,139
HP Spectre x360 16 (Intel i7-11390H, 2022) | N/A | 4,785 | 1,515/3,722 | N/A
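For a quick sanity check, the gains cited above fall straight out of that table (a trivial, illustrative calculation):

    # Relative gains over the 2022 Spectre x360 16, from the table above.
    spectre_14_pcmark, spectre_16_pcmark = 6_493, 4_785
    print(f"PCMark 10 gain: {spectre_14_pcmark / spectre_16_pcmark - 1:.0%}")  # ~36%

    spectre_14_r23, spectre_16_r23 = 8_481, 3_722  # Cinebench R23 multi-core
    print(f"Cinebench R23 ratio: {spectre_14_r23 / spectre_16_r23:.2f}x")  # ~2.28x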

The most noticeable upgrade for the Spectre x360 isn't AI smarts; it's Intel's Arc graphics, which are dramatically faster than the company's older integrated graphics. In 3DMark's TimeSpy Extreme benchmark, the laptop almost kept up with the NVIDIA RTX 3050 in the x360 16 (1,435 points compared to 1,730). That's impressive for a machine that's far slimmer and lighter. Sure, it's no gaming rig, but I was still able to play Halo Infinite in 1080p at around 30 fps. I'm sure it would handle smaller indie titles just fine.

Thanks to the wealth of RAM and Intel's Core Ultra chip, my review model tackled everything I threw at it without any noticeable slowdown. During a typical workday, I juggle dozens of browser tabs, photo editing apps, YouTube streams, video chats, Slack and Evernote. The Spectre x360's OLED display also made everything look fantastic, even if I was just staring at words on a news site. It supports a variable refresh rate up to 120Hz, so scrolling through documents and sites was very smooth.

HP Spectre x360 14 keyboard
Photo by Devindra Hardawar/Engadget

When I first tested a Spectre x360 five years ago, I immediately fell in love with its keyboard. Typing felt incredibly satisfying, thanks to a healthy amount of key travel and feedback. It was one of those rare designs that almost felt like it was begging me to use it, like a finely tuned piano that's simply urging you to play. Thankfully, HP didn't mess with any of that keyboard magic: The large new key caps are even more comfortable to use, and the actual typing experience is as great as ever.

I have a few complaints about the Spectre x360's new trackpad, though. It's smooth and accurate for swiping, and its haptic feedback is indistinguishable from that of a trackpad that physically depresses. But HP's palm rejection software feels sloppy — occasionally, while typing up a storm, my palm would brush the trackpad, moving the cursor and selecting another window. It happened often enough to kill my writing flow. I'm hoping this is something HP can sort out with a software update eventually.

As a convertible notebook, the Spectre x360 14 is far more useful than the 16-inch model. A gentle push on the screen is all it takes to flip it around the keyboard — it becomes a tablet when fully turned around, or you can stop halfway and prop the Spectre up in its “tent” mode. The 14-inch x360 is better at being a slate, simply because it's lighter and easier to hold with one hand (though you'll probably want to prop it on your lap for longer sessions).

Rotating the screen is also less cumbersome than on the 16-inch model, since the display is narrower. I used the tent formation to watch YouTube videos in bed, while on the couch I occasionally folded the keyboard behind the Spectre so I could use it like a large touchscreen with a stand. I appreciate the versatility of 2-in-1 convertibles more than the flexible OLED screens we're seeing on new machines. The format is cheaper to implement, and for my purposes, convertibles are simply more pragmatic.

The Spectre x360's major flaw is battery life: It lasted five hours and ten minutes in the PCMark 10 Modern Office test, whereas the ZenBook 14 OLED pushed through for 12 hours and 43 minutes. There's a cost for keeping its frame so thin, after all. During real-world testing, it would typically need to charge around six hours into my workday. 

HP Spectre x360 14
Photo by Devindra Hardawar/Engadget

Pricing and the competition

The Spectre x360 14 is a decent deal for a high-end convertible, starting at $1,450 with an Intel Core Ultra 5 125H, 16GB of RAM and a 512GB SSD. At the time of writing, that configuration has been discounted by $300, which is an even better value. (Credit to HP for not offering a meager 8GB RAM option, which would only lead to headaches for most users.) For $1,900, you can bump up to a Core Ultra 7 155H chip, 32GB of RAM and a 2TB SSD.

Your options are somewhat limited if you're looking for other upper-tier convertible laptops. Dell's XPS 13 2-in-1 is still running older 12th-gen Intel chips, and you'll have to look to the mid-range Inspiron and Latitude lines for more modern options. We’re also still waiting to see Lenovo’s Yoga lineup get upgraded to newer Intel chips. And we haven’t tested Samsung’s Galaxy Book4 360, but it doesn’t have the style of HP’s design.

Microsoft's Surface Laptop Studio 2 is also technically a convertible (its screen pulls forward, instead of flipping around), but it starts at $1,900. For that price, you're better off going for the x360 14's beefier hardware, instead of the Surface's unique screen.

HP Spectre x360 14 case rear
Photo by Devindra Hardawar/Engadget

Wrap-up

It's unclear how much life is left in the convertible PC format, but I wouldn't be surprised if HP ends up being one of the last companies still giving it a shot. The Spectre x360 14 is one of the best laptops you can buy today — the fact that it can also be flipped around in multiple orientations is just icing on the cake.

This article originally appeared on Engadget at https://www.engadget.com/hp-spectre-x360-14-review-2024-keeping-the-2-in-1-laptop-dream-alive-140045823.html?src=rss