Apple is slowly turning the iPhone into a fairly capable satellite communications device. It first rolled out Emergency SOS via Satellite on the iPhone 14 lineup. Previously, you could only rely on the spacecraft when dialing emergency services, but in iOS 18 this capability will extend to any situation where you're away from cellular or Wi-Fi coverage. I got to check out the updated experience at Apple Park. Here's how it will work with iMessage and SMS messages.
When you’ve been disconnected from cellular or Wi-Fi for some time, an alert will appear to say you’ll need to hop on satellite communications to send messages. Tapping this notification brings up the new connection assistant, which contains all your satellite-powered tools, like Find My, roadside assistance and emergency SOS.
You can start a chat from this page or just go to the Messages app, where a prompt will appear in the Dynamic Island to instruct you on how to find an overhead satellite. You can also go to the Satellite option in Settings or in the Control Center to get set up when cellular or Wi-Fi signals aren’t available.
The connection experience here isn’t much different from before: you’ll be told which direction to point your iPhone, along with suggestions for avoiding obstructions. Once you’re linked, an indicator in the Dynamic Island shows a green check mark, and it’ll continue to change colors and shapes if your connection begins to weaken.
To prevent networks from getting congested with promotional SMS messages, Apple has made it so the person who’s off the grid is the one who can initiate SMS chats over satellite. The exceptions are people listed as your emergency contacts or your iCloud family members; their messages will still come through when you’re on a satellite connection. Regardless of whether you’re using iMessage or SMS, you’ll only be able to send texts, emojis or Tapback reactions. Compressing even those into packets small enough to transmit over satellite is already challenging, never mind images and video.
In my demo on an iPhone that was tweaked so it couldn’t connect to Wi-Fi or cellular, I watched an Apple rep connect to an available satellite via the Dynamic Island’s interface, then send a text to another person. On the sender’s phone, the words “iMessage. Satellite” appeared above the blue bubble, and the same showed up on the recipient’s phone over the gray bubble. Read receipts aren’t supported over satellite, so I only saw the “delivered” and “sent” status labels under the bubbles.
It’s nice to see both SMS and iMessage supported over satellite, even if only the latter is end-to-end encrypted. Apple’s decision to include SMS is thoughtful, and though I’d like RCS to be covered as well, that platform’s messages are too large or complicated to compress effectively. They are, after all, going to satellite infrastructure over 800 miles away from Earth, and targeting spacecraft that are moving at 15,000 miles per hour.
The company still hasn’t shared details on pricing for satellite connectivity and related features once the free trial is over, but for now, those with an iPhone 14 or newer will be able to use these features for free. Apple’s provision here is a little confusing at the moment, but basically you get two years of free satellite service from the time you purchase your new iPhone, and in November 2023 the company added another year to the free trial. In theory that’s a total window of three years, though it might depend on when you bought your iPhone, since Apple didn’t initially offer a two-year period.
iOS 18 is expected to come to iPhones this fall, and we'll undoubtedly find out more about Messages via Satellite before then. We'll update this article with more information as and when we get it.
This article originally appeared on Engadget at https://www.engadget.com/how-messages-via-satellite-will-work-on-ios-18-and-how-much-it-will-cost-130020976.html?src=rss
There was so much Apple had to cram into its WWDC 2024 keynote that some features were left out of the spotlight. Here at the company's campus, I've had the chance to speak with various executives, as well as get deeper dives into iOS 18, iPadOS 18, Apple Intelligence, watchOS 11 and more. In these sessions, I've been able to learn more about how specific things work, like exactly what steps you take to customize your iPhone's home screen and Control Center. I also got to see some other updates that weren't even briefly mentioned during the keynote, like new support for hiking routes in Apple Maps and what training load insights look like on watchOS 11. Of all the unmentioned features I've come to discover, here are my favorites.
Maps: Create and share custom routes
I've always been a Google Maps girl, in part because that app had superior information compared to Apple Maps in its early years. These days, I stick to Google Maps because it has all my saved places and history. When I found out that iOS 18 would bring updates to Apple Maps, particularly to do with hiking and routes, I was intrigued.
Basically, in iOS 18, when you go into search in Maps, you'll see a new option under "Find Nearby" called hikes. It'll show you recommended hikes, which you can filter by type (a loop, for example) and length. Tapping into one brings up a topographical view with elevation details, an estimated difficulty and a projected duration. You can save each route for offline reference later and add notes, too; saved routes live in a new Library view in your profile in Maps.
You'll also be able to create new routes in Maps by tapping anywhere to start defining your route. You can keep tapping to add waypoints, which the trail will continue to connect, then hit a "Close loop" button to finish your trail. These routes can be shared, though it's not yet clear if you can share one with, say, a friend or driver to have them take your preferred path to your destination.
The hikes that Apple will serve up in Maps are created by its own team, which is working with US National Parks, so they'll only be available for the 63 national parks in the country to begin with. In other words, it's not porting information from AllTrails, for example. In a press release, Apple said thousands of hikes will be available to browse at launch.
As a city dweller who only sometimes hikes, my excitement is less about hiking and more about the potential of sharing my custom routes to show people how they should walk to my building or favorite restaurant from the train station. It's a compelling feature, and arguably a reason I'd choose Apple Maps over Google's.
Calendar integration with Reminders
Frankly, the Maps update might be my favorite out of everything that wasn't shown off during the WWDC 2024 keynote, by a huge margin. But some of the new tools coming to Calendar tickle my fancy too. Specifically, the new integration with Reminders makes it easier not just to schedule your tasks right into your daybook, but also to check them off from the Calendar app. You'll soon be able to move reminders around by long-pressing and dragging them, so that note to call your mom can be placed in a 5PM slot on Wednesday instead of sitting in your Reminders app. In addition, Calendar is getting new views that better detail your level of activity each day of the month, similar to how the Fitness app quickly shows your daily rings progress in its monthly view.
Tapback insights showing who exactly responded with what emoji
This one actually was mentioned during the keynote, but some details about how Tapback works weren't described at yesterday's show. If you're like me, you might not even remember that Tapback refers to those reactions you can send in Messages by double-tapping a blue or gray bubble. With iOS 18, you'll get more options than the limited selection of heart, thumbs up, thumbs down, "Haha," exclamation points and question mark. They're also going to show up in full color with the update, instead of the existing (boring) gray.
What I found out later on, though, is that when you double-tap a message that already has reactions attached, a new balloon appears at the top of your screen showing who has responded with which emoji. This should make it easier to lurk in a group chat, but it could also double as an unofficial polling tool: ask your friends to react with specific emojis to indicate different answers. That makes Messages a bit more like Slack, and I wish WhatsApp and Telegram would take note.
Others: Math Notes on iPhone, updates to Journal and Safari
There are quite a lot of features coming to iOS 18 that didn't get much love on the WWDC stage, like the Journal app's new widget for the home screen, which shows prompts for reflection and lets you create new entries. Journal also has a new insights view that displays your writing streaks and other historical data, plus a new tool that lets you add your state of mind to each entry from within the app.
Meanwhile, Safari is getting a new "Highlights" button in the search (or URL) bar, and tapping it will show a machine-learning-generated summary of the webpage you're on. Tapping into this brings up a panel with more information like navigation directions to a restaurant mentioned on the page, for example, or a phone number to call up a business. You can also quickly launch the reader view from this pane.
I wasn't super enthusiastic about either of those, largely because I don't use the Journal app much and I don't need Safari summarizing a webpage for me. But there are some other buried updates that I really wanted to shout out. For example, Math Notes on iPad with the Apple Pencil certainly got a lot of time, but it wasn't till I looked at Apple's iOS 18 press release that I found out the iPhone's Notes app is also getting a version of it. According to the screenshot Apple included, you can tally up and split expenses among a group of friends: write a list of expenses and how much each item cost, add the items into a formula with plus and equals signs, then divide the total by the number of people in your group. Not quite Splitwise, but I could see this becoming more powerful over time.
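The arithmetic in that screenshot boils down to summing an itemized list and dividing by the headcount. Here's a minimal Python sketch of the same calculation; the item names, prices and group size below are made up for illustration, not taken from Apple's example:

```python
# A rough sketch of the expense-splitting math Apple's Math Notes
# screenshot depicts. All values here are hypothetical.
expenses = {
    "pizza": 24.00,
    "drinks": 18.50,
    "dessert": 9.50,
}
group_size = 4  # hypothetical number of friends splitting the bill

total = sum(expenses.values())   # add each item, like the plus signs in Notes
per_person = total / group_size  # then divide by the headcount

print(f"Total: ${total:.2f}, each pays ${per_person:.2f}")
```

With these made-up numbers, that prints "Total: $52.00, each pays $13.00" — the same sum-then-divide formula Notes evaluates inline.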
I was also intrigued by some of the Smart Script features on iPadOS 18, especially when I realized that you can move your handwritten words around just by dragging them further apart, and the rest of your scrawled text moves in tandem. This is hard to describe, and I'll have to wait till I can try it for myself to show you an animated example. But it was impressive, even if it's not extremely useful.
Finally, the Passwords app and other privacy updates got a shout out during the keynote, but I learned more about how things like accessory setup and contact sharing with apps work. Apple is releasing a new accessory setup kit so that device makers can adopt a pairing interface similar to how you'd connect your AirPods or Apple Watch to your iPhone. If developers don't use this approach, the new Bluetooth connection interface will be clearer about what you're actually granting access to when you let an app see other devices on your network. Though it wasn't entirely skipped during the keynote, the Passwords app is something that makes me happy, since I'm absolutely sick of having to dig through settings to get codes for apps that I unlock with my iPhone's authenticator.
There are plenty of features that were talked about that I'm excited by and got to learn more about, including the new dynamic clock style in the Photos face on watchOS 11, pinned collections in the redesigned Photos app and iPadOS mirroring for easier remote tech support. Oh, and that AirPlay feature that'll let you send money to friends by holding your phones together? Yes! Being able to pause and adjust your Activity rings in watchOS and that Training Load insight? Hallelujah!
And though I can see the appeal of locked and hidden apps, I'm not sure I'd find much use for that and it would probably exacerbate my already prone-to-suspicion nature.
I'm also a little wary of things like Genmoji and Image Playground, which are both Apple Intelligence features that won't hit all iOS 18 devices. There will be metadata information indicating when images were generated by Apple's AI, and guardrails in place to prevent the creation of abusive and exploitative content.
Clearly, there are plenty of updates coming to Apple's phones, tablets, laptops and wearables later this year, and I can't wait to try them out. The public beta should be ready around the end of summer this year, which is when most people (who are willing to risk an unstable platform) can check them out.
This article originally appeared on Engadget at https://www.engadget.com/my-favorite-ios-18-ipados-18-and-watchos-11-features-that-flew-under-the-radar-at-wwdc-2024-113044069.html?src=rss
Apple Intelligence is coming, but not to every iPhone out there. In fact, you'll need to have a device with an A17 Pro processor or M-series chip to use many of the features unveiled during the Apple Intelligence portion of WWDC 2024. That means only iPhone 15 Pro owners (and those with an M-series iPad or MacBook) will get the iOS 18-related Apple Intelligence (AI?) updates like Genmoji, Image Playground, the redesigned Siri and Writing Tools. Then there are things like Math Notes and Smart Script on iPadOS 18 and the new features in Messages coming via iOS 18 that will be arriving for anyone who can upgrade to the latest platforms. It's confusing, and the best way to anticipate what you're getting is to know what processor is in your iPhone, iPad or Mac.
Why won't the iPhone 14 Pro get Apple Intelligence?
It's not evident exactly why older devices using an A16 chip (like the iPhone 14 Pro) won't work with Apple Intelligence, given that its neural engine seems more than capable compared to the M1's. A closer look at the spec sheets of those two processors shows that the main differences appear to be in memory and GPU prowess. Specifically, the A16 Bionic supports a maximum of 6GB of RAM onboard, while the M1 starts at 8GB and goes up to 16GB. In fact, all the supported devices have at least 8GB of RAM, which could hint at why your iPhone 14 Pro won't be able to handle making Genmoji.
Though it might not seem quite fair that owners of a relatively recent iPhone won't get to use Apple Intelligence features, you'll still be getting a healthy amount of updates via iOS 18. Here's a quick breakdown of what is coming via iOS 18, and what's only coming if your iPhone supports Apple Intelligence.
What iOS 18 features will be coming to iPhones?
Basically everything described during the iOS portion of yesterday's WWDC 2024 keynote is coming to all iPhones (that can update to iOS 18). That includes the customizable home screen, Control Center, dedicated Passwords app, redesigned Photos app, new Tapback emoji reactions, text effects, scheduled sending and more. Messages via Satellite is only coming to iPhone 14 or newer, and you'll be able to send text messages, emojis and Tapbacks, but not images or videos.
You'll also be tied to the same satellite service plan you got at the time you purchased your iPhone 14. If you bought your iPhone 14 in January 2024, you received a free two-year subscription to Emergency SOS via Satellite and the other satellite communication features that now include texting. That means that to continue texting people via satellite after January 2026, you'll need to start paying for a plan.
There are a whole host of updates coming with iOS 18 that Apple didn't quite cover in its keynote either, and I'll be putting up a separate guide about that in a bit. But suffice to say that apps like Maps, Safari, Calendar and Journal are getting new functions that, together with the other changes mentioned so far, add up to a meaty OS upgrade.
What Apple Intelligence features are older devices missing out on?
In short, all of them. If you have an iPhone 15 Pro or an iPad (or Mac) with an M-series chip, you'll get a redesigned Siri, Genmoji and Image Playground, as well as Writing Tools baked into the system. That means tools like proofreading, summarizing or helping you adjust your tone in apps like Mail, Notes and Keynote are limited to the AI-supported devices. If you don't have one of those, you'll get none of this.
The redesigned Siri, which is only coming through Apple Intelligence, will be able to understand what's on your screen to contextually answer your queries. If you've been texting with your friend about which baseball player is the best, you can ask Siri (by long-pressing the power button or just saying "Hey Siri") "How many home runs has he hit?" The assistant will know who "he" is in this context, and understand you're referring to the athlete, not the friend you're chatting with.
Apple Intelligence is also what brings the ability to type to Siri — and you can invoke this keyboard to talk to the assistant by double tapping the bottom of the screen.
This also means the new glowing-edge animation that appears when Siri is triggered is limited to Apple Intelligence-supported devices. You'll still be looking at that little orb at the bottom of your screen when you talk to the assistant on an iPhone 14 Pro or older.
This article originally appeared on Engadget at https://www.engadget.com/apple-intelligence-what-devices-and-features-will-actually-be-supported-185850732.html?src=rss
Google is crossing genres with its latest wearable for kids, combining a gaming system and an activity tracker in the Fitbit Ace LTE. The company is pitching this as a “first-of-its-kind connected smartwatch that transforms exercise into play and safely helps kids lead more active, independent lives.” Basically, think of it as a Nintendo Switch pared down into an activity tracker for children aged 7 and up, with a few safety and connectivity features built in.
The main idea here is to get kids up and moving in exchange for progress in the Ace LTE’s onboard games. But there are also basic tools that let parents (and trusted contacts) stay in touch with the wearer. Through the new Fitbit Ace app (which adults can install on iOS or Android), guardians can set play time, monitor activity progress and place calls or send messages. On the watch itself, kids can use the onscreen keyboard or microphone to type or dictate texts, or choose an emoji.
The Fitbit Ace LTE’s hardware: Basically a Pixel Watch 2
Since the Fitbit Ace LTE uses a simplified version of the hardware in the Pixel Watch 2, it’s pretty responsive. One major difference, though, is that the kid-friendly tracker uses Gorilla Glass 3 on its cover, on top of the 5 ATM of water resistance that both models share. Google does include a protective case with each Ace LTE, and it doesn’t add much weight.
There are also other obvious differences because the Pixel Watch 2 has a circular face while the Fitbit Ace LTE has a “squircle” (square with rounded corners) OLED with two large buttons on the right side. The latter’s band is also a lot narrower, and it comes “with technology built in,” according to Google’s vice president of product management Anil Sabharwal. That's just a fancy way to say that the Ace LTE recognizes when you swap in a new strap and each accessory comes with unique content.
Cherlynn Low for Engadget
The company is calling these straps “Cartridges,” another reminder of how the Fitbit Ace LTE is a gaming console wannabe. When you snap a new one on, you’ll see an animation of all the bonus material you just got, including new backgrounds and items for your Tamagotchi-esque pet, called an “eejie.” Each band also adds a unique cartoony strip, called a Noodle, that makes its way around the edge of the watch’s display over the course of the day, charting the wearer’s progress toward daily goals, similar to Apple’s Activity rings.
I’m dancing around the main part of the Fitbit Ace LTE’s proposition, because I wanted to get the hardware out of the way. The most interesting concept here is the idea of a wearable gaming system. The Ace LTE’s home screen looks fairly typical. It shows you the time and the Noodle activity ring around it, as well as some small font at the very bottom showing the number of points collected.
To the left of this page is what Sabharwal called a “playlist” — a collection of daily quests. Like on other iOS or Android games, this is a bunch of targets to hit within a dictated time frame to ensure you’re engaged, and achieving these goals leads to rewards.
Eejie: Like Tamagotchi but less cute
Most of these rewards are things you can use to jazz up your digital pet’s home over on the right of the home screen. Google calls these things “eejies” — that name doesn’t actually mean anything. Some engineers in a room looked at the letters “I” “J” and “I” and sounded them out and thought sure, why not. (No, those letters don't actually stand for anything, either.)
According to Google, “Eejies are customizable creatures that feed off daily activity — the more kids reach their movement goals, the more healthy and happy their eejie gets.” When kids complete daily activities, they earn arcade tickets that can be exchanged for new outfit or furniture items for their eejies; attaching a new watch strap unlocks items, too.
Even though they’re supposed to be “customizable creatures,” the eejies are anthropomorphic and look like… well, kids. Depending on how you style them, they sort of look like sullen teenagers, even. Don’t expect a cute Pikachu or Digimon to play with; these eejies are two-legged beings with heads, arms and necks. I’d prefer something cuter, but perhaps the target demographic likes feeding and playing with a strange avatar of themselves.
When multiple Ace LTE wearers meet up, their eejies can visit each other and leave emoji messages. Of course, how fun that is depends on how many of your (kid’s) friends have Ace LTEs.
Gaming on the Fitbit Ace LTE
Even without that social component though, the Ace LTE can be quite a lot of fun. It is the home of Fitbit Arcade, a new library of games built specifically for this wearable. So far, I’ve only seen about six games in the collection, including a room escape game, a fishing simulator and a Mario Kart-like racer.
The first game I tried at Google’s briefing was Smoky Lake, the fishing game. After a quick intro, I tapped on a shadow of a fish in the water, and flung my arm out. I waited till the Ace LTE buzzed, then pulled my wrist in. I was told that I had caught a puffer fish, and swiped through to see more information about past catches. I earned five arcade tickets with this catch.
I gleefully tried again and caught what I was told was the “biggest pineapple gillfish” acquired that day. Other hauls the Ace LTE I was wearing had acquired included a “ramen squid” and a “blob fish,” and tapping an icon on the upper left brought up my library of things that had been caught.
Cherlynn Low for Engadget
I then played a round of Pollo 13, a racing game where I played as a chicken in a bathtub competing in an intergalactic space match against my arch nemesis. There, I tilted my wrist in all directions to steer, keeping my vehicle on track or swerving to collect items that sped me up. Just as I expected based on my prior Mario Kart experience (and also my general lack of skill at driving in real life), I sucked at this game and came in last. Sabharwal gently informed me that this was the poorest result they had seen all day.
I didn’t get to check out other titles installed, like Galaxy Rangers, Jelly Jam or Sproutlings but I was most intrigued by a room escape game, which is my favorite genre.
Google doesn’t want to encourage obsession or addiction to the Ace LTE’s games, though. “We don’t want kids to overexercise. We don’t want kids to feel like they have a streak and if they miss a day, ‘Oh my God, the world is over!’” Sabharwal said.
To that end, progress in each game is built around encouraging the wearer to meet movement goals to advance to new stages. Every two to three minutes, you’ll be prompted to get up and move. In Smokey Lake, for instance, you’ll be told that you’ve run out of bait and have to walk a few hundred steps to go to the bait shop. This can be achieved by walking a number of steps or doing any activity that meets similar requirements. Google is calling this “interval-based gaming,” playing on the idea of “interval-based training.” After about five to 10 sessions, the company thinks each wearer will hit the 60 to 90 minutes of daily required activity recommended by the World Health Organization.
Cherlynn Low for Engadget
The idea of activity as currency for games isn’t exactly novel, but Google’s being quite careful in its approach. Not only is it trying to avoid addiction, which for the target age group is a real concern, but the company also says it built the Ace LTE “responsibly from the ground up” by working with “experts in child psychology, public health, privacy and digital wellbeing.” It added that the device was “built with privacy in mind, front and center,” and that only parents will ever be shown a child’s location or activity data in their apps. Location data is deleted after 24 hours, while activity data is deleted after a maximum of 35 days. Google also said “there are no third-party apps or ads on the device.”
While activity is the main goal at launch, there is potential for the Ace LTE to track sleep and other aspects of health to count towards goals. Parts of the Ace LTE interface appeared similar to other Fitbit trackers, with movement reminders and a Today-esque dashboard. But from my brief hands-on, it was hard to fully explore and compare.
Though I like the idea of the Ace LTE and was definitely entertained by some of the games, I still have some reservations. I was concerned that the device I tried on felt warm, although Sabharwal explained it was likely because the demo units had been charging on and off all day. I also didn’t care for the thick bezels around the screen, though that didn’t really adversely impact my experience. What did seem more of a problem was the occasional lag I encountered waiting for games to load or to go to the home screen. I’m not sure if that was a product of early software or if the final retail units will have similar delays, and will likely need to run a full review to find out.
The Fitbit Ace LTE is available for pre-order today for $230 on the Google Store or Amazon and it arrives on June 5. You’ll need to pay an extra $10 a month for the Ace Pass plan, which includes LTE service (on Google’s Fi) and access to Fitbit Arcade and regular content updates. If you spring for an annual subscription, you’ll get a collectable Ace Band (six are available at launch) and from now till August 31, the yearly fee is discounted at 50 percent off, making it about $5 a month.
Update, May 29, 3:15PM ET: This story has been edited to clarify that the Fitbit Ace LTE's hardware is a simplified version of the Pixel Watch 2. It is not capable of sleep or stress tracking.
This article originally appeared on Engadget at https://www.engadget.com/fitbit-ace-lte-hands-on-wearable-gaming-to-make-exercise-fun-but-not-too-fun-140059054.html?src=rss
Google is crossing genres with its latest wearable for kids, combining a gaming system and an activity tracker in the Fitbit Ace LTE. The company is pitching this as a “first-of-its-kind connected smartwatch that transforms exercise into play and safely helps kids lead more active, independent lives.” Basically, think of it as a Nintendo Switch pared down into an activity tracker for children aged 7 and up, with a few safety and connectivity features built in.
The main idea here is to get kids up and moving in exchange for progress in the Ace LTE’s onboard games. But there are also basic tools that let parents (and trusted contacts) stay in touch with the wearer. Through the new Fitbit Ace app (which adults can install on iOS or Android), guardians can set play time, monitor activity progress and place calls or send messages. On the watch itself, kids can use the onscreen keyboard or microphone to type or dictate texts, or choose an emoji.
The Fitbit Ace LTE’s hardware: Basically a Pixel Watch 2
Since the Fitbit Ace LTE uses a simplified version of the Pixel Watch 2’s hardware, it’s pretty responsive. One notable difference is that the kid-friendly tracker uses Gorilla Glass 3 on its cover, in addition to the 5 ATM of water resistance that both models share. Google also includes a protective case with each Ace LTE, and it doesn’t add much weight.
There are other obvious differences, too: the Pixel Watch 2 has a circular face, while the Fitbit Ace LTE has a “squircle” (a square with rounded corners) OLED with two large buttons on the right side. The latter’s band is also a lot narrower, and it comes “with technology built in,” according to Google’s vice president of product management Anil Sabharwal. That's just a fancy way to say the Ace LTE recognizes when you swap in a new strap, and each accessory comes with unique content.
Cherlynn Low for Engadget
The company is calling these straps “Cartridges” — another reminder of how the Fitbit Ace LTE is a gaming console wannabe. When you snap a new one on, you’ll see an animation of all the bonus material you just got. They include new backgrounds and items for your Tamagotchi-esque pet called “eejie.” Separate bands also add unique cartoony strips, called Noodles, that travel around the edges of the watch's display each day, charting the wearer’s progress toward daily goals, similar to Apple's activity rings.
I’m dancing around the main part of the Fitbit Ace LTE’s proposition, because I wanted to get the hardware out of the way. The most interesting concept here is the idea of a wearable gaming system. The Ace LTE’s home screen looks fairly typical. It shows you the time and the Noodle activity ring around it, as well as some small font at the very bottom showing the number of points collected.
To the left of this page is what Sabharwal called a “playlist” — a collection of daily quests. Like on other iOS or Android games, this is a bunch of targets to hit within a dictated time frame to ensure you’re engaged, and achieving these goals leads to rewards.
Eejie: Like Tamagotchi but less cute
Most of these rewards are things you can use to jazz up your digital pet’s home over on the right of the home screen. Google calls these things “eejies” — a name that doesn’t actually mean anything. Some engineers in a room looked at the letters “I,” “J” and “I,” sounded them out and thought, sure, why not. (No, those letters don't actually stand for anything, either.)
Cherlynn Low for Engadget
According to Google, “Eejies are customizable creatures that feed off daily activity — the more kids reach their movement goals, the more healthy and happy their eejie gets.” When daily activities are completed and each child earns arcade tickets (or when a new watch strap is attached), they can exchange them for new outfit or furniture items for their eejies.
Even though they’re supposed to be “customizable creatures,” the eejies are anthropomorphic and look like… well, kids. Depending on how you style them, they sort of look like sullen teenagers, even. Don’t expect a cute Pikachu or Digimon to play with; these eejies are two-legged beings with heads, arms and necks. I’d prefer something cuter, but perhaps the target demographic likes feeding and playing with a strange avatar of themselves.
When multiple Ace LTE wearers meet up, their eejies can visit each other and leave emoji messages. Of course, how fun that is depends on how many of your (kid’s) friends have Ace LTEs.
Gaming on the Fitbit Ace LTE
Even without that social component, though, the Ace LTE can be quite a lot of fun. It’s the home of Fitbit Arcade, a new library of games built specifically for this wearable. So far, I’ve only seen about six games in the collection, including a room escape game, a fishing simulator and a Mario Kart-like racer.
The first game I tried at Google’s briefing was Smokey Lake, the fishing game. After a quick intro, I tapped on a shadow of a fish in the water and flung my arm out. I waited till the Ace LTE buzzed, then pulled my wrist in. I was told that I had caught a puffer fish, and swiped through to see more information about past catches. I earned five arcade tickets with this catch.
I gleefully tried again and caught what I was told was the “biggest pineapple gillfish” acquired that day. Other hauls the Ace LTE I was wearing had acquired included a “ramen squid” and a “blob fish,” and tapping an icon on the upper left brought up my library of things that had been caught.
Cherlynn Low for Engadget
I then played a round of Pollo 13, a racing game where I played as a chicken in a bathtub competing in an intergalactic space match against my arch nemesis. There, I tilted my wrist in all directions to steer, keeping my vehicle on track or swerving to collect items that sped me up. Just as I expected based on my prior Mario Kart experience (and also my general lack of skill at driving in real life), I sucked at this game and came in last. Sabharwal gently informed me that this was the poorest result they had seen all day.
I didn’t get to check out other titles installed, like Galaxy Rangers, Jelly Jam or Sproutlings but I was most intrigued by a room escape game, which is my favorite genre.
Google doesn’t want to encourage obsession or addiction to the Ace LTE’s games, though. “We don’t want kids to overexercise. We don’t want kids to feel like they have a streak and if they miss a day, ‘Oh my God, the world is over!’” Sabharwal said.
To that end, progress in each game is built around meeting movement goals to advance to new stages. Every two to three minutes, you’ll be prompted to get up and move. In Smokey Lake, for instance, you’ll be told that you’ve run out of bait and have to walk a few hundred steps to the bait shop (or do any other activity that meets a similar requirement). Google is calling this “interval-based gaming,” a play on “interval-based training.” After about five to 10 sessions, the company thinks each wearer will hit the 60 to 90 minutes of daily activity recommended by the World Health Organization.
Cherlynn Low for Engadget
The idea of activity as currency for games isn’t exactly novel, but Google’s being quite careful in its approach. Not only is it trying to avoid addiction, which for the target age group is a real concern, but the company also says it built the Ace LTE “responsibly from the ground up” by working with “experts in child psychology, public health, privacy and digital wellbeing.” It added that the device was “built with privacy in mind, front and center,” and that only parents will ever be shown a child’s location or activity data in their apps. Location data is deleted after 24 hours, while activity data is deleted after a maximum of 35 days. Google also said “there are no third-party apps or ads on the device.”
While activity is the main goal at launch, there is potential for the Ace LTE to track sleep and other aspects of health to count towards goals. Parts of the Ace LTE interface appeared similar to other Fitbit trackers, with movement reminders and a Today-esque dashboard. But from my brief hands-on, it was hard to fully explore and compare.
Though I like the idea of the Ace LTE and was definitely entertained by some of the games, I still have some reservations. I was concerned that the device I tried on felt warm, although Sabharwal explained it was likely because the demo units had been charging on and off all day. I also didn’t care for the thick bezels around the screen, though they didn’t really detract from my experience. More of a problem was the occasional lag I encountered waiting for games to load or to return to the home screen. I’m not sure whether that was a product of early software or if final retail units will have similar delays; I’ll likely need to run a full review to find out.
The Fitbit Ace LTE is available for pre-order today for $230 on the Google Store or Amazon, and it arrives on June 5. You’ll need to pay an extra $10 a month for the Ace Pass plan, which includes LTE service (on Google’s Fi) and access to Fitbit Arcade and regular content updates. If you spring for an annual subscription, you’ll get a collectible Ace Band (six are available at launch), and from now till August 31, the yearly fee is discounted 50 percent, which works out to about $5 a month.
Update, May 29, 3:15PM ET: This story has been edited to clarify that the Fitbit Ace LTE's hardware is a simplified version of the Pixel Watch 2. It is not capable of sleep or stress tracking.
This article originally appeared on Engadget at https://www.engadget.com/fitbit-ace-lte-hands-on-wearable-gaming-to-make-exercise-fun-but-not-too-fun-140059054.html?src=rss
All over the PC industry today, we’re learning of new systems and products launching in conjunction with Microsoft’s Copilot+ push. But HP isn’t just showing off new Snapdragon-powered laptops as part of the program. The company up and decided to nuke its entire product naming scheme and unify most of its sub-series.
While HP was never the worst offender in the world of awful product names — I’m looking at you, Sony, LG and Lenovo — being able to quickly identify the make and model of a device is crucial when you’re deciding what to buy. HP’s vice president of consumer PC products Pierre-Antoine Robineau admits as much, saying “to be fair, we don’t make things easy with our portfolio.” He referred to the company’s brands like Spectre, Pavilion and Envy, saying that if you ask ChatGPT what they are, the answers you’d get might refer to a ghost or a gazebo.
To simplify things, HP is getting rid of all those names on its consumer product portfolio and unifying everything under the Omni label. It’ll use Omnibook to refer to laptops, Omnidesk for desktops and Omnistudio for all-in-ones. For each category, it’ll add a label saying “3,” “5,” “7,” “X” or “Ultra” to indicate how premium or high-end the model is. That means the Omnibook Ultra is the highest-tier laptop, while the Omnidesk 3 might be the most basic or entry-level desktop system. That sort of numbering echoes Sony’s recent streamlined nomenclature of its home theater and personal audio offerings.
If Omnibook sounds familiar, that’s because HP actually had a product with that name, and it was available from 1993 to about 2002. The Omni moniker makes sense now in the 2020s, HP says, because these are devices that can do just about anything and act as multiple things at once. (As long as they don’t claim to be omniscient, omnipresent or omnipotent, I’ll let this slide.)
The company is also cleaning things up on the commercial side of its business, where the word “Elitebook” has traditionally been the most recognized label. It’s keeping that name, adopting the same Elitebook, Elitedesk and Elitestudio distinctions across categories and using the same “Ultra” and “X” labels to denote each model’s tier. However, instead of “3,” “5” or “7” here, HP is using even numbers (2, 4, 6 or 8), in part because it has used even series numbers like “1040” and “1060” in the Elitebook line before. Keeping similar numbers around can help IT managers with the shift in names, HP said.
The first new laptops under this naming system are the Omnibook X and the Elitebook Ultra. They share very similar specs, with the Elitebook offering software that makes it easier for IT managers to deploy to employees. Both come with 14-inch 2.2K touchscreens that were, at least in my brief time with them during a recent hands-on, bright and colorful.
I didn’t get to explore much of the new Windows 11, since the units available either ran existing software or were locked. I presume, though, that these would have other Copilot+ PC goodies that Microsoft announced earlier today.
What I can tell you is that I prefer the aesthetic of HP’s older Spectre models. The company’s machines turned heads and caught eyes thanks to their shiny edges and uniquely cut-off corners. I’m a sucker for razor sharp edges and gold or silver finishes, so that line of laptops really called to me.
In contrast, the HP Omnibook X seems plain. It comes in white or silver (the Elitebook is available in blue) and has a uniform thickness along its edges. It’s still thin and light, at 14mm (or about 0.55 inches) and 1.33 kilograms (or 2.93 pounds). But it’s certainly lost a little flavor, and I crave some spice in a device.
That’s not to say the Omnibook is hideous. It’s fine! I actually like the color accents on the keyboard deck. The power button is a different shade of blue depending on the version you get, while the row of function keys is a light shade of gray or blue. Typing on the demo units felt comfortable, too, though I miss the clicky feedback on older Elitebooks and would like a tad more travel on the keyboard.
You might also need to invest in a dongle if you want a card reader or have lots of accessories, though the two USB-C sockets and one USB-A port might be enough in a pinch. Thankfully, there’s a headphone jack, too. Like every other Copilot+ PC announced today, the Omnibook and Elitebook are both powered by Qualcomm’s Snapdragon X Elite processor and promise 26 hours of battery life when playing local video. HP says its “next-gen AI PCs” have dedicated NPUs “capable of 45 trillion operations per second (TOPS),” slightly more than the 40 TOPS Microsoft is claiming for its Copilot+ PCs.
The company is also distinguishing its own AI PCs by adorning them with a logo that’s the letters “A” and “I” twisted into a sort of DNA helix. You’ll find it on the keyboard deck and the spine of the machine. It’s not big enough to be annoying, though you’ll certainly see it.
If you're already a fan of the HP Omnibook X or Elitebook Ultra, you can pre-order them today. The Omnibook X will start at $1,200 and come with 1 TB of storage, while the Elitebook Ultra starts at $1,700. Both systems will begin shipping on June 18.
This article originally appeared on Engadget at https://www.engadget.com/hp-omnibook-x-hands-on-vintage-branding-in-the-new-era-of-ai-180038627.html?src=rss
Ahead of Global Accessibility Awareness Day this week, Apple is issuing its typical annual set of announcements around its assistive features. Many of these are useful for people with disabilities, but also have broader applications as well. For instance, Personal Voice, which was released last year, helps preserve someone's speaking voice. It can be helpful to those who are at risk of losing their voice or have other reasons for wanting to retain their own vocal signature for loved ones in their absence. Today, Apple is bringing eye-tracking support to recent models of iPhones and iPads, as well as customizable vocal shortcuts, music haptics, vehicle motion cues and more.
Built-in eye-tracking for iPhones and iPads
The most intriguing feature of the set is the ability to use the front-facing camera on iPhones or iPads (at least those with the A12 chip or later) to navigate the software without additional hardware or accessories. With this enabled, people can look at their screen to move through elements like apps and menus, then linger on an item to select it.
That pause to select is something Apple calls Dwell Control, which has already been available elsewhere in the company's ecosystem like in Mac's accessibility settings. The setup and calibration process should only take a few seconds, and on-device AI is at work to understand your gaze. It'll also work with third-party apps from launch, since it's a layer in the OS like Assistive Touch. Since Apple already supported eye-tracking in iOS and iPadOS with eye-detection devices connected, the news today is the ability to do so without extra hardware.
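To make the dwell idea concrete, here is a minimal, purely illustrative sketch of dwell-to-select logic. This is not Apple's code; the class name, the one-second threshold and the per-frame update API are all assumptions made up for illustration. It just shows the behavior described above: an element is selected only after the gaze has rested on it continuously for a threshold duration, and glancing away restarts the timer.

```python
DWELL_SECONDS = 1.0  # hypothetical threshold; the real value is Apple's to choose

class DwellSelector:
    """Toy model of dwell-based selection: feed it the currently
    gazed-at element every frame; it fires once the gaze has
    lingered on the same element long enough."""

    def __init__(self, threshold=DWELL_SECONDS):
        self.threshold = threshold
        self.target = None     # element the gaze is currently resting on
        self.elapsed = 0.0     # how long the gaze has stayed there

    def update(self, gazed_element, dt):
        """Call once per frame with the gazed element and the frame
        time in seconds. Returns the element to select, or None."""
        if gazed_element != self.target:
            # Gaze moved to a new element: restart the dwell timer.
            self.target = gazed_element
            self.elapsed = 0.0
            return None
        self.elapsed += dt
        if self.elapsed >= self.threshold:
            self.elapsed = 0.0  # reset so we don't re-fire every frame
            return self.target
        return None

# Lingering on an app icon for four 0.2s frames crosses a 0.5s threshold.
sel = DwellSelector(threshold=0.5)
for _ in range(3):
    sel.update("app_icon", 0.2)   # not yet selected
print(sel.update("app_icon", 0.2))  # "app_icon" is selected
```

In a real implementation the per-frame input would come from the gaze estimator and hit-testing against on-screen elements; the timer logic itself is the part this sketch captures.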
Vocal shortcuts for easier hands-free control
Apple is also working on improving the accessibility of its voice-based controls on iPhones and iPads. It again uses on-device AI to create personalized models for each person setting up a new vocal shortcut. You can set up a command for a single word or phrase, or even an utterance (like "Oy!" perhaps). Siri will understand these and perform your designated shortcut or task. You can have these launch apps or run a series of actions that you define in the Shortcuts app, and once set up, you won't have to first ask Siri to be ready.
Another improvement coming to vocal interactions is "Listen for Atypical Speech," which has iPhones and iPads use on-device machine learning to recognize speech patterns and customize their voice recognition around your unique way of vocalizing. This sounds similar to Google's Project Relate, which is also designed to help technology better understand those with speech impairments or atypical speech.
To build these tools, Apple worked with the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign. The institute is also collaborating with other tech giants like Google and Amazon to further development in this space across their products.
Music haptics in Apple Music and other apps
For those who are deaf or hard of hearing, Apple is bringing haptics to music players on iPhone, starting with millions of songs on its own Music app. When enabled, music haptics will play taps, textures and specialized vibrations in tandem with the audio to bring a new layer of sensation. It'll be available as an API so developers can bring greater accessibility to their apps, too.
Help in cars — motion sickness and CarPlay
Drivers with disabilities need better systems in their cars, and Apple is addressing some of the issues with its updates to CarPlay. Voice control and color filters are coming to the interface for vehicles, making it easier to control apps by talking and for those with visual impairments to see menus or alerts. To that end, CarPlay is also getting bold and large text support, as well as sound recognition for noises like sirens or honks. When the system identifies such a sound, it will display an alert at the bottom of the screen to let you know what it heard. This works similarly to Apple's existing sound recognition feature in other devices like the iPhone.
Apple
For those who get motion sickness while using their iPhones or iPads in moving vehicles, a new feature called Vehicle Motion Cues might alleviate some of that discomfort. Since motion sickness is based on a sensory conflict from looking at stationary content while being in a moving vehicle, the new feature is meant to better align the conflicting senses through onscreen dots. When enabled, these dots will line the four edges of your screen and sway in response to the motion it detects. If the car moves forward or accelerates, the dots will sway backwards as if in reaction to the increase in speed in that direction.
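The sway behavior described above can be sketched in a few lines. This is not Apple's implementation; the function name, gain and pixel clamp are invented for illustration. The point is the mapping: the dots drift opposite to the measured acceleration, so a forward lurch pushes them backwards, the way loose objects in a car appear to move.

```python
def dot_offset(accel_x, accel_y, gain=4.0, max_px=12.0):
    """Return (dx, dy) pixel offsets for the edge dots.

    accel_x/accel_y: device acceleration in m/s^2 (forward = +x).
    The offset is the negated, scaled acceleration, clamped so the
    dots never stray more than max_px from their resting position.
    """
    def clamp(v):
        return max(-max_px, min(max_px, v))
    return (clamp(-gain * accel_x), clamp(-gain * accel_y))

# Accelerating forward at 2 m/s^2: dots shift 8 px backwards.
print(dot_offset(2.0, 0.0))
```

On a phone, the input would come from the accelerometer (filtered to isolate vehicle motion from hand jitter), with the offsets applied to the dots lining the screen edges each frame.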
Other Apple Accessibility updates
There are plenty more features coming to the company's suite of products, including Live Captions in visionOS, a new Reader mode in Magnifier, support for multi-line braille and a virtual trackpad for those who use Assistive Touch. It's not yet clear when all of these announced updates will roll out, though Apple has historically made these features available in upcoming versions of iOS. With its developer conference WWDC just a few weeks away, it's likely many of today's tools will officially arrive with the next iOS.
This article originally appeared on Engadget at https://www.engadget.com/apple-brings-eye-tracking-to-recent-iphones-and-ipads-140012990.html?src=rss
In a video showcasing the prowess of Google's new Project Astra experience at I/O 2024, an unnamed demonstrator asked Gemini, "do you remember where you saw my glasses?" The AI impressively responded, "Yes, I do. Your glasses were on a desk near a red apple," despite said object not being in view when the question was asked. But these weren't your bog-standard assistive vision aid; these glasses had a camera onboard and some sort of visual interface!
The tester picked up their glasses and put them on, and proceeded to ask the AI more questions about things they were looking at. Clearly, there is a camera on the device that's helping it take in the surroundings, and we were shown some sort of interface where a waveform moved to indicate it was listening. Onscreen captions appeared to reflect the answer that was being read aloud to the wearer, as well. So if we're keeping track, that's at least a microphone and speaker onboard too, along with some kind of processor and battery to power the whole thing.
We only caught a brief glimpse of the wearable, but from the sneaky seconds it was in view, a few things were evident. The glasses had a simple black frame and didn't look at all like Google Glass. They didn't appear very bulky, either.
In all likelihood, Google is not ready to actually launch a pair of glasses at I/O. It breezed right past the wearable's appearance and barely mentioned them, only to say that Project Astra and the company's vision of "universal agents" could come to devices like our phones or glasses. We don't know much else at the moment, but if you've been mourning Google Glass or the company's other failed wearable products, this might instill some hope yet.
Catch up on all the news from Google I/O 2024 right here!
This article originally appeared on Engadget at https://www.engadget.com/google-just-snuck-a-pair-of-ar-glasses-into-a-project-astra-demo-at-io-172824539.html?src=rss