My favorite iOS 18, iPadOS 18 and watchOS 11 features that flew under the radar at WWDC 2024

There was so much Apple had to cram into its WWDC 2024 keynote that some features were left out of the spotlight. Here at the company's campus, I've had the chance to speak with various executives, as well as get deeper dives into iOS 18, iPadOS 18, Apple Intelligence, watchOS 11 and more. In these sessions, I've been able to learn more about how specific things work, like the exact steps you take to customize your iPhone's home screen and Control Center. I also got to see some other updates that weren't mentioned even briefly during the keynote, like new support for hiking routes in Apple Maps and what Training Load insights look like on watchOS 11. Of all the unmentioned features I've come to discover, here are my favorites.

I've always been a Google Maps girl, in part because that app had superior information in Apple Maps' early years. These days, I stick to Google Maps because it has all my saved places and history. When I found out that iOS 18 would bring updates to Apple Maps, particularly around hiking and custom routes, I was intrigued.

Basically, in iOS 18, when you search in Maps, you'll see a new option under "Find Nearby" called Hikes. It'll show you recommended hikes, and you can filter by the type of hike (a loop, for example) and specify a length. You'll find options in the vicinity, and tapping into one will show you a topographical view with elevation details, how challenging it should be and the estimated duration. You can save each route for offline reference later and add notes, too. Saved routes live in a new Library view, which you'll find in your profile in Maps.
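To picture what that filtering boils down to, here's a minimal sketch; the fields and category names are my own assumptions, not Apple's actual data model:

```python
from dataclasses import dataclass

@dataclass
class Hike:
    name: str
    kind: str          # "loop" or "out-and-back" (hypothetical categories)
    length_km: float
    elevation_gain_m: int
    difficulty: str

trails = [
    Hike("Rim Trail", "loop", 4.8, 120, "easy"),
    Hike("Summit Push", "out-and-back", 11.2, 840, "hard"),
    Hike("Lakeshore Loop", "loop", 7.5, 90, "moderate"),
]

# Filter the way the Maps UI does: loops only, under a chosen length.
matches = [t for t in trails if t.kind == "loop" and t.length_km <= 8]
for t in matches:
    print(f"{t.name}: {t.length_km} km, +{t.elevation_gain_m} m, {t.difficulty}")
```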

You'll also be able to create new routes in Maps by tapping anywhere to start defining your path. You can keep tapping to add waypoints, and the trail will continue to connect them; hit a "Close loop" button to finish your trail. These routes can be shared, though it's not yet clear if you can share one with, say, a friend or driver to have them take your preferred path to your destination.
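Conceptually, closing the loop just appends your starting point to the list of tapped waypoints and totals the leg distances. A rough illustration with made-up coordinates:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points in km."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

waypoints = [(37.3349, -122.0090), (37.3400, -122.0010), (37.3310, -121.9980)]

def close_loop(points):
    # "Close loop" simply appends the start so the trail ends where it began.
    return points + [points[0]]

route = close_loop(waypoints)
total = sum(haversine_km(route[i], route[i + 1]) for i in range(len(route) - 1))
print(f"Loop length: {total:.2f} km")
```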

[Image: Two iPhones separated by the iOS 18 logo, showing the Maps app and Notes app respectively. Credit: Apple]

The hikes that Apple will serve up in Maps are created by its own team, which is working with US national parks, so routes will only be available for the country's 63 national parks to begin with. In other words, Apple isn't porting information from AllTrails, for example. In a press release, the company said thousands of hikes will be available to browse at launch.

As a city dweller who only sometimes hikes, my excitement is less about hiking and more about the potential of sharing custom routes to show people how to walk to my building or favorite restaurant from the train station. It's a compelling feature, and arguably a reason I'd choose Apple Maps over Google's.

Frankly, the Maps update might be my favorite of everything that wasn't shown off during the WWDC 2024 keynote, by a huge margin. But some of the new tools coming to Calendar tickle my fancy too. Specifically, the new Reminders integration makes it easier to not just schedule your tasks right into your daybook, but also check them off from the Calendar app. You'll soon be able to move reminders around by long-pressing and dragging them, so that note to call your mom can be placed in a 5PM slot on Wednesday instead of sitting in your Reminders app. In addition, Calendar is getting new views that better detail your level of activity each day of the month, similar to how the Fitness app shows your daily rings progress in its monthly view.
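In data terms, that drag-to-schedule interaction amounts to a reminder gaining a concrete time slot. A tiny illustrative sketch; the field names are my own, not Apple's:

```python
from datetime import datetime

reminder = {"title": "Call Mom", "due": None}  # unscheduled, lives in Reminders

def drop_on_slot(item, when):
    # Dragging a reminder onto a calendar slot just assigns it a concrete time.
    item["due"] = when
    return item

print(drop_on_slot(reminder, datetime(2024, 6, 12, 17, 0)))  # 5PM Wednesday slot
```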

This one wasn't skipped entirely during the keynote, but there are details about how Tapback works that weren't described at yesterday's show. If you're like me, you might not even remember that Tapback refers to those reactions you can send in Messages by double-tapping a blue or gray bubble. With iOS 18, you'll get more options than the limited selection of heart, thumbs up, thumbs down, "Haha," exclamation points and question mark. They'll also show up in full color with the update, instead of the existing (boring) gray.

What I found out later, though, is that when you double-tap a message that already has reactions attached, a new balloon appears at the top of your screen showing who has responded with which emoji. This should make it easier to lurk in a group chat, but it could also double as an unofficial polling tool: ask your friends to react with specific emojis to indicate different answers. That makes Messages a bit more like Slack, and I wish WhatsApp and Telegram would take note.
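The poll idea is just a per-emoji tally of who reacted with what. A minimal sketch with a made-up reaction log:

```python
from collections import Counter

# Hypothetical reaction log: (sender, emoji) pairs from a group chat.
reactions = [
    ("Ana", "🍕"), ("Ben", "🍣"), ("Cal", "🍕"),
    ("Dee", "🍕"), ("Eli", "🌮"),
]

tally = Counter(emoji for _, emoji in reactions)
for emoji, votes in tally.most_common():
    voters = [name for name, e in reactions if e == emoji]
    print(f"{emoji}: {votes} ({', '.join(voters)})")
```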

There are quite a lot of features coming to iOS 18 that didn't get much love on the WWDC stage, like the Journal app's new widget for the home screen, which shows prompts for reflection and lets you create new entries. Journal also has a new insights view that displays your writing streaks and other historical data, plus a new tool that lets you add your state of mind to each entry from within the app.

Meanwhile, Safari is getting a new "Highlights" button in the search (or URL) bar, and tapping it will show a machine-learning-generated summary of the webpage you're on. Tapping into this brings up a panel with more information like navigation directions to a restaurant mentioned on the page, for example, or a phone number to call up a business. You can also quickly launch the reader view from this pane.

I wasn't super enthusiastic about either of those, largely because I don't use the Journal app much and I don't need Safari summarizing webpages for me. But there are some other buried updates that I really wanted to shout out. For example, Math Notes for iPad with Apple Pencil certainly got a lot of stage time, but it wasn't till I looked at Apple's iOS 18 press release that I found out the iPhone's Notes app is also getting a version of it. According to the screenshot Apple included, it looks like you can tally up and split expenses between a group of friends: write a list of expenses and how much each item cost, add the names of each expense to a formula with plus and equals signs, then divide that total by the number of people in your group. Not quite Splitwise, but I could see this becoming more powerful over time.
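To make the expense-splitting concrete, here's the same arithmetic the screenshot implies, sketched with invented numbers:

```python
expenses = {"dinner": 96.00, "taxi": 24.50, "museum tickets": 45.00}

total = sum(expenses.values())
people = 3
share = total / people

print(f"total = {total:.2f}")                 # 165.50
print(f"each of {people} pays {share:.2f}")   # 55.17
```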

I was also intrigued by some of the Smart Script features on iPadOS 18, especially when I realized that you can move your handwritten words around just by dragging them further apart, and the rest of your scrawled text moves in tandem. This is hard to describe, and I'll have to wait till I can try it for myself to show you an animated example. But it was impressive, even if it's not extremely useful.

Finally, the Passwords app and other privacy updates got a shout-out during the keynote, but I learned more about how things like accessory setup and contact sharing with apps work. Apple is releasing a new accessory setup kit so that device makers can adopt a pairing interface similar to how you'd connect AirPods or an Apple Watch to your iPhone. If developers don't use this approach, the new Bluetooth connection interface will be clearer about what other devices are on your network and what you're actually granting an app access to. Though it wasn't entirely skipped during the keynote, the Passwords app is something that makes me happy, since I'm absolutely sick of having to dig through settings to get verification codes for apps that use my iPhone's built-in authenticator.

There are plenty of features that did get stage time that I'm excited by and have since learned more about, including the new dynamic clock style in the Photos face on watchOS 11, pinned collections in the redesigned Photos app and iPhone mirroring for easier remote tech support. Oh, and Tap to Cash, the feature that'll let you send money to friends by holding your phones together? Yes! Being able to pause and adjust your Activity rings in watchOS, and that Training Load insight? Hallelujah!

And though I can see the appeal of locked and hidden apps, I'm not sure I'd find much use for them, and they'd probably exacerbate my already suspicion-prone nature.

I'm also a little wary of things like Genmoji and Image Playground, which are both Apple Intelligence features that won't hit all iOS 18 devices. There will be metadata indicating when images were generated by Apple's AI, as well as guardrails in place to prevent the creation of abusive and exploitative content.

Clearly, there are plenty of updates coming to Apple's phones, tablets, laptops and wearables later this year, and I can't wait to try them out. The public beta should be ready around the end of summer, which is when most people (who are willing to risk an unstable platform) can check these features out.


Apple Intelligence: What devices and features will actually be supported?

Apple Intelligence is coming, but not to every iPhone out there. In fact, you'll need a device with an A17 Pro processor or M-series chip to use many of the features unveiled during the Apple Intelligence portion of WWDC 2024. That means only iPhone 15 Pro owners (and those with an M-series iPad or Mac) will get the iOS 18-related Apple Intelligence (AI?) updates like Genmoji, Image Playground, the redesigned Siri and Writing Tools. Then there are things like Math Notes and Smart Script on iPadOS 18, and the new Messages features in iOS 18, which will arrive for anyone who can upgrade to the latest platforms. It's confusing, and the best way to anticipate what you're getting is to know what processor is in your iPhone, iPad or Mac.

It's not evident exactly why older devices with an A16 chip (like the iPhone 14 Pro) won't work with Apple Intelligence, given that its neural engine seems more than capable compared to the M1's. A closer look at the spec sheets of those two processors shows that the main differences appear to be memory and GPU prowess. Specifically, the A16 Bionic supports a maximum of 6GB of RAM, while the M1 starts at 8GB and goes up to 16GB. In fact, all the supported devices have at least 8GB of RAM, and that could hint at why your iPhone 14 Pro won't be able to handle making Genmoji.
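To see why 8GB might be the floor, here's a rough back-of-envelope sketch. Apple hasn't published the size of its on-device model, so every number below is an assumption for illustration only:

```python
# All figures are guesses for illustration; Apple hasn't published specifics.
params = 3e9            # a plausible parameter count for an on-device model
bytes_per_param = 0.5   # 4-bit quantized weights

weights_gb = params * bytes_per_param / 1e9   # 1.5 GB of weights
runtime_overhead_gb = 1.5                     # KV cache, activations, runtime

footprint = weights_gb + runtime_overhead_gb
print(f"model footprint ≈ {footprint:.1f} GB")  # ~3 GB
# On a 6GB iPhone, that leaves little room for the OS and foreground apps;
# on 8GB devices, there's workable headroom.
```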

Though it might not seem quite fair that owners of a relatively recent iPhone won't get to use Apple Intelligence features, you'll still be getting a healthy amount of updates via iOS 18. Here's a quick breakdown of what is coming via iOS 18, and what's only coming if your iPhone supports Apple Intelligence.

Basically everything described during the iOS portion of yesterday's WWDC 2024 keynote is coming to all iPhones that can update to iOS 18. That includes the customizable home screen, Control Center, dedicated Passwords app, redesigned Photos app, new Tapback emoji reactions, text effects, scheduled sending and more. Messages via satellite is only coming to the iPhone 14 or newer, and you'll be able to send text messages, emojis and Tapbacks, but not images or videos.

You'll also be tied to the satellite service plan that came with your iPhone 14 at purchase. If you bought your iPhone 14 in January 2024, you received a free two-year subscription for Emergency SOS via satellite and other satellite communication features, which now include texting. That means that to continue texting people via satellite after January 2026, you'll need to start paying for a plan.
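If you want to mark the cutoff on your own calendar, the window is simple date math. A tiny sketch, with a hypothetical purchase date:

```python
from datetime import date

purchase = date(2024, 1, 15)  # hypothetical iPhone 14 purchase date
FREE_YEARS = 2                # the included satellite service window

expiry = purchase.replace(year=purchase.year + FREE_YEARS)
print(f"Free satellite features end: {expiry}")  # 2026-01-15
```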

There are a whole host of updates coming with iOS 18 that Apple didn't quite cover in its keynote either, and I'll be putting up a separate guide about that in a bit. But suffice to say that apps like Maps, Safari, Calendar and Journal are getting new functions that, together with the other changes mentioned so far, add up to a meaty OS upgrade.

As for which features require Apple Intelligence: in short, all of them. If you have an iPhone 15 Pro or an iPad (or Mac) with an M-series chip, you'll get the redesigned Siri, Genmoji and Image Playground, as well as Writing Tools baked into the system. That means features like proofreading, summarizing or adjusting your tone in apps like Mail, Notes and Keynote are limited to the AI-supported devices. If you don't have one of those, you'll get none of this.

The redesigned Siri, which is only coming through Apple Intelligence, will be able to understand what's on your screen to contextually answer your queries. If you've been texting with a friend about which baseball player is the best, you can ask Siri (by long-pressing the power button or just saying "Hey Siri") "How many home runs has he hit?" The assistant will know who "he" is in this context, and understand you're referring to the athlete, not the friend you're chatting with.

Apple Intelligence is also what brings the ability to type to Siri — you can invoke a keyboard to talk to the assistant by double-tapping the bottom of the screen.

This also means the new glowing-edge animation that appears when Siri is triggered is limited to Apple Intelligence-supported devices. You'll still be looking at that little orb at the bottom of your screen when you talk to the assistant on an iPhone 14 Pro or older.

There are loads more features coming via Apple Intelligence, which appears to be set for release later this year. 


Fitbit Ace LTE hands-on: Wearable gaming to make exercise fun (but not too fun)

Google is crossing genres with its latest wearable for kids, combining a gaming system and an activity tracker in the Fitbit Ace LTE. The company is pitching this as a “first-of-its-kind connected smartwatch that transforms exercise into play and safely helps kids lead more active, independent lives.” Basically, think of it as a Nintendo Switch pared down into an activity tracker for children aged 7 and up, with a few safety and connectivity features built in.

The main idea here is to get kids up and moving in exchange for progress in the Ace LTE’s onboard games. But there are also basic tools that let parents (and trusted contacts) stay in touch with the wearer. Through the new Fitbit Ace app (which adults can install on iOS or Android), guardians can set play time, monitor activity progress and make calls or send messages. On the watch itself, kids can use the onscreen keyboard or microphone to type or dictate texts, or choose an emoji.

Since the Fitbit Ace LTE uses a simplified version of the Pixel Watch 2’s hardware, it’s pretty responsive. One major difference, though, is that the kid-friendly tracker uses Gorilla Glass 3 on its cover, in addition to the 5 ATM of water resistance that both models share. Google does include a protective case with each Ace LTE, and it doesn’t add much weight.

There are also other obvious differences because the Pixel Watch 2 has a circular face while the Fitbit Ace LTE has a “squircle” (square with rounded corners) OLED with two large buttons on the right side. The latter’s band is also a lot narrower, and it comes “with technology built in,” according to Google’s vice president of product management Anil Sabharwal. That's just a fancy way to say that the Ace LTE recognizes when you swap in a new strap and each accessory comes with unique content.

[Image: The Fitbit Ace LTE on a wrist held in mid-air, with a cartoon room on the screen. Credit: Cherlynn Low for Engadget]

The company is calling these straps “Cartridges” — another reminder that the Fitbit Ace LTE is a gaming console wannabe. When you snap a new one on, you’ll see an animation of all the bonus material you just got, including new backgrounds and items for your Tamagotchi-esque pet, called an “eejie.” Separate bands also add unique cartoony strips, called Noodles, that wrap around the edges of the watch’s display each day and chart the wearer’s progress toward daily goals, similar to Apple’s Activity rings.

I’m dancing around the main part of the Fitbit Ace LTE’s proposition, because I wanted to get the hardware out of the way. The most interesting concept here is the idea of a wearable gaming system. The Ace LTE’s home screen looks fairly typical. It shows you the time and the Noodle activity ring around it, as well as some small font at the very bottom showing the number of points collected.

To the left of this page is what Sabharwal called a “playlist” — a collection of daily quests. Like on other iOS or Android games, this is a bunch of targets to hit within a dictated time frame to ensure you’re engaged, and achieving these goals leads to rewards.

Most of these rewards are things you can use to jazz up your digital pet’s home over on the right of the home screen. Google calls these things “eejies” — that name doesn’t actually mean anything. Some engineers in a room looked at the letters “I” “J” and “I” and sounded them out and thought sure, why not. (No, those letters don't actually stand for anything, either.)

[Image: The Fitbit Ace LTE on a wrist held in mid-air, with a digital character inside a pink bedroom on the screen. Credit: Cherlynn Low for Engadget]

According to Google, “Eejies are customizable creatures that feed off daily activity — the more kids reach their movement goals, the more healthy and happy their eejie gets.” Kids earn arcade tickets by completing daily activities (or by attaching a new watch strap), and they can exchange those tickets for new outfit or furniture items for their eejies.

Even though they’re supposed to be “customizable creatures,” the eejies are anthropomorphic and look like… well, kids. Depending on how you style them, they sort of look like sullen teenagers, even. Don’t expect a cute Pikachu or Digimon to play with; these eejies are two-legged beings with heads, arms and necks. I’d prefer something cuter, but perhaps the target demographic likes feeding and playing with a strange avatar of themselves.

When multiple Ace LTE wearers meet up, their eejies can visit each other and leave emoji messages. Of course, how fun that is depends on how many of your (kid’s) friends have Ace LTEs.

Even without that social component, though, the Ace LTE can be quite a lot of fun. It’s the home of Fitbit Arcade, a new library of games built specifically for this wearable. So far, I’ve only seen about six games in the collection, including a room escape game, a fishing simulator and a Mario Kart-like racer.

The first game I tried at Google’s briefing was Smoky Lake, the fishing game. After a quick intro, I tapped on a shadow of a fish in the water, and flung my arm out. I waited till the Ace LTE buzzed, then pulled my wrist in. I was told that I had caught a puffer fish, and swiped through to see more information about past catches. I earned five arcade tickets with this catch. 

I gleefully tried again and caught what I was told was the “biggest pineapple gillfish” acquired that day. Other hauls the Ace LTE I was wearing had acquired included a “ramen squid” and a “blob fish,” and tapping an icon on the upper left brought up my library of things that had been caught.

[Image: The Fitbit Ace LTE on a wrist held in mid-air. Credit: Cherlynn Low for Engadget]

I then played a round of Pollo 13, a racing game where I played as a chicken in a bathtub competing in an intergalactic space match against my arch nemesis. There, I tilted my wrist in all directions to steer, keeping my vehicle on track or swerving to collect items that sped me up. Just as I expected based on my prior Mario Kart experience (and also my general lack of skill at driving in real life), I sucked at this game and came in last. Sabharwal gently informed me that this was the poorest result they had seen all day.

I didn’t get to check out the other installed titles, like Galaxy Rangers, Jelly Jam or Sproutlings, but I was most intrigued by the room escape game, since escape rooms are my favorite genre.

Google doesn’t want to encourage obsession or addiction to the Ace LTE’s games, though. “We don’t want kids to overexercise. We don’t want kids to feel like they have a streak and if they miss a day, ‘Oh my God, the world is over!’” Sabharwal said.

To that end, progress in each game is built around encouraging the wearer to meet movement goals to advance to new stages. Every two to three minutes, you’ll be prompted to get up and move. In Smoky Lake, for instance, you’ll be told that you’ve run out of bait and have to walk a few hundred steps to the bait shop. This can be achieved by walking that number of steps or doing any activity that meets similar requirements. Google is calling this “interval-based gaming,” playing on the idea of “interval-based training.” After about five to 10 sessions, the company thinks each wearer will hit the 60 to 90 minutes of daily activity recommended by the World Health Organization.
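Google’s numbers roughly hang together, as this toy model shows; the per-session step gate and the session split are my own illustrative guesses:

```python
# Toy model of "interval-based gaming": play is gated on bursts of movement.
daily_goal_min = 60          # low end of the WHO guidance cited above
sessions_per_day = 8         # within the 5 to 10 range Google mentions
burst_min = daily_goal_min / sessions_per_day

print(f"about {burst_min:.1f} active minutes per session")  # 7.5

def can_keep_playing(steps_since_prompt, required_steps=300):
    # Hypothetical gate: the bait shop is "300 steps away."
    return steps_since_prompt >= required_steps

print(can_keep_playing(120))  # False: go walk to the bait shop first
print(can_keep_playing(340))  # True
```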

[Image: The Fitbit Ace LTE on a wrist held in mid-air, with two game titles on a carousel in view. Credit: Cherlynn Low for Engadget]

The idea of activity as currency for games isn’t exactly novel, but Google’s being quite careful in its approach. Not only is it trying to avoid addiction, which for the target age group is a real concern, but the company also says it built the Ace LTE “responsibly from the ground up” by working with “experts in child psychology, public health, privacy and digital wellbeing.” It added that the device was “built with privacy in mind, front and center,” and that only parents will ever be shown a child’s location or activity data in their apps. Location data is deleted after 24 hours, while activity data is deleted after a maximum of 35 days. Google also said “there are no third-party apps or ads on the device.”

While activity is the main goal at launch, there is potential for the Ace LTE to track sleep and other aspects of health to count towards goals. Parts of the Ace LTE interface appeared similar to other Fitbit trackers, with movement reminders and a Today-esque dashboard. But from my brief hands-on, it was hard to fully explore and compare.

Though I like the idea of the Ace LTE and was definitely entertained by some of the games, I still have some reservations. I was concerned that the device I tried on felt warm, although Sabharwal explained it was likely because the demo units had been charging on and off all day. I also didn’t care for the thick bezels around the screen, though they didn’t really adversely impact my experience. What did seem more of a problem was the occasional lag I encountered waiting for games to load or to return to the home screen. I’m not sure if that was a product of early software or if final retail units will have similar delays; I’ll likely need to run a full review to find out.

The Fitbit Ace LTE is available for pre-order today for $230 on the Google Store or Amazon, and it arrives on June 5. You’ll need to pay an extra $10 a month for the Ace Pass plan, which includes LTE service (on Google’s Fi network) and access to Fitbit Arcade and regular content updates. If you spring for an annual subscription, you’ll get a collectible Ace Band (six are available at launch), and from now till August 31, the yearly fee is discounted by 50 percent, making it about $5 a month.
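The launch-promo math checks out; a quick sketch using the prices above:

```python
device = 230.00
monthly = 10.00

annual_full = 12 * monthly             # 120.00
annual_discounted = annual_full * 0.5  # 60.00 during the launch promo

print(f"promo works out to {annual_discounted / 12:.2f}/month")  # 5.00
print(f"first-year cost: {device + annual_discounted:.2f}")      # 290.00
```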

Update, May 29, 3:15PM ET: This story has been edited to clarify that the Fitbit Ace LTE's hardware is a simplified version of the Pixel Watch 2. It is not capable of sleep or stress tracking.


HP Omnibook X hands-on: Vintage branding in the new era of AI

All over the PC industry today, we’re learning of new systems and products launching in conjunction with Microsoft’s Copilot+ push. But HP isn’t just showing off new Snapdragon-powered laptops as part of the program. The company up and decided to nuke its entire product-naming scheme, unifying most of its sub-brands.

While HP was never the worst offender in the world of awful product names — I’m looking at you, Sony, LG and Lenovo — being able to quickly identify the make and model of a device is crucial when you’re deciding what to buy. HP’s vice president of consumer PC products Pierre-Antoine Robineau admits as much, saying “to be fair, we don’t make things easy with our portfolio.” He referred to the company’s brands like Spectre, Pavilion and Envy, saying that if you ask ChatGPT what they are, the answers you’d get might refer to a ghost or a gazebo.

To simplify things, HP is getting rid of all those names on its consumer product portfolio and unifying everything under the Omni label. It’ll use Omnibook to refer to laptops, Omnidesk for desktops and Omnistudio for all-in-ones. For each category, it’ll add a label saying “3,” “5,” “7,” “X” or “Ultra” to indicate how premium or high-end the model is. That means the Omnibook Ultra is the highest-tier laptop, while the Omnidesk 3 might be the most basic or entry-level desktop system. That sort of numbering echoes Sony’s recent streamlined nomenclature of its home theater and personal audio offerings.

If Omnibook sounds familiar, that’s because HP actually had a product with that name, and it was available from 1993 to about 2002. The Omni moniker makes sense now in the 2020s, HP says, because these are devices that can do just about anything and act as multiple things at once. (As long as they don’t claim to be omniscient, omnipresent or omnipotent, I’ll let this slide.)

The company is also cleaning things up on the commercial side of its business, where the word “Elitebook” has traditionally been the most recognized label. It’s keeping that name, adopting the same Elitebook, Elitedesk and Elitestudio distinctions across categories and using the same “Ultra” and “X” labels to denote each model’s tier. However, instead of “3,” “5” or “7” here, HP is using even numbers (2, 4, 6 or 8), in part because it has used even series numbers like “1040” and “1060” in the Elitebook line before. Keeping similar numbers around can help IT managers with the shift in names, HP said.

The first new laptops under this new naming system are the Omnibook X and the Elitebook Ultra. They share very similar specs, with the Elitebook offering software that makes it easier for IT managers to deploy the machines to employees. Both come with 14-inch 2.2K touchscreens that were, at least in my brief time with them during a recent hands-on, bright and colorful.

I didn’t get to explore much of the new Windows 11 experience, since the units available either ran existing software or were locked. I presume, though, that these laptops will have the other Copilot+ PC goodies that Microsoft announced earlier today.

What I can tell you is that I prefer the aesthetic of HP’s older Spectre models. The company’s machines turned heads and caught eyes thanks to their shiny edges and uniquely cut-off corners. I’m a sucker for razor sharp edges and gold or silver finishes, so that line of laptops really called to me.

In contrast, the HP Omnibook X seems plain. It comes in white or silver (the Elitebook is available in blue) and has a uniform thickness along its edges. It’s still thin and light, at 14mm (or about 0.55 inches) and 1.33 kilograms (or 2.93 pounds). But it’s certainly lost a little flavor, and I crave some spice in a device.

That’s not to say the Omnibook is hideous. It’s fine! I actually like the color accents on the keyboard deck. The power button is a different shade of blue depending on the version you get, while the row of function keys is a light shade of gray or blue. Typing on the demo units felt comfortable, too, though I miss the clicky feedback on older Elitebooks and would like a tad more travel on the keyboard.

You might also need to invest in a dongle if you want a card reader or have lots of accessories to connect, but the two USB-C sockets and one USB-A port might be enough in a pinch. Thankfully, there’s a headphone jack, too. Like every other Copilot+ PC announced today, the Omnibook and Elitebook are both powered by Qualcomm’s Snapdragon X Elite processor and promise 26 hours of battery life when playing local video. HP says its “next-gen AI PCs” have dedicated NPUs that are “capable of 45 trillion operations per second (TOPS),” which is slightly more than the 40 TOPS Microsoft requires for its Copilot+ PCs.
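That 45 TOPS figure is a modest margin over Microsoft’s floor; a quick, illustrative check:

```python
hp_tops, copilot_floor_tops = 45, 40

# Percentage headroom HP's NPU claims over the Copilot+ requirement.
headroom = (hp_tops / copilot_floor_tops - 1) * 100
print(f"{headroom:.1f}% above the 40 TOPS floor")  # 12.5%
```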

The company is also distinguishing its own AI PCs by adorning them with a logo that’s the letters “A” and “I” twisted into a sort of DNA helix. You’ll find it on the keyboard deck and the spine of the machine. It’s not big enough to be annoying, though you’ll certainly see it.

If you're already a fan of the HP Omnibook X or Elitebook Ultra, you can pre-order them today. The Omnibook X will start at $1,200 and come with 1 TB of storage, while the Elitebook Ultra starts at $1,700. Both systems will begin shipping on June 18.


Apple brings eye-tracking to recent iPhones and iPads

Ahead of Global Accessibility Awareness Day this week, Apple is issuing its typical annual set of announcements around its assistive features. Many of these are useful for people with disabilities, but they have broader applications as well. For instance, Personal Voice, which was released last year, helps preserve someone's speaking voice. It can be helpful to those who are at risk of losing their voice, or who have other reasons for wanting to retain their own vocal signature for loved ones in their absence. Today, Apple is bringing eye-tracking support to recent models of iPhones and iPads, as well as customizable vocal shortcuts, music haptics, vehicle motion cues and more.

The most intriguing feature of the set is the ability to use the front-facing camera on iPhones or iPads (at least those with the A12 chip or later) to navigate the software without additional hardware or accessories. With this enabled, people can look at their screen to move through elements like apps and menus, then linger on an item to select it. 

That pause to select is something Apple calls Dwell Control, which has already been available elsewhere in the company's ecosystem like in Mac's accessibility settings. The setup and calibration process should only take a few seconds, and on-device AI is at work to understand your gaze. It'll also work with third-party apps from launch, since it's a layer in the OS like Assistive Touch. Since Apple already supported eye-tracking in iOS and iPadOS with eye-detection devices connected, the news today is the ability to do so without extra hardware.
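Conceptually, dwell selection is a timer that fires once your gaze has rested on the same element long enough. A toy sketch, with an assumed threshold since Apple hasn't published one:

```python
# Toy dwell-control loop: select whatever the gaze has rested on long enough.
DWELL_SECONDS = 1.0  # hypothetical threshold; Apple hasn't published one

def dwell_select(gaze_samples, hz=30):
    """gaze_samples: sequence of element IDs, one per frame at `hz` fps."""
    needed = int(DWELL_SECONDS * hz)
    current, run = None, 0
    for element in gaze_samples:
        run = run + 1 if element == current else 1
        current = element
        if run >= needed:
            return element  # fire the tap
    return None

samples = ["Mail"] * 10 + ["Photos"] * 35  # gaze settles on Photos
print(dwell_select(samples))               # Photos
```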

Apple is also working on improving the accessibility of its voice-based controls on iPhones and iPads. It again uses on-device AI to create personalized models for each person setting up a new vocal shortcut. You can set up a command for a single word or phrase, or even an utterance (like "Oy!" perhaps). Siri will understand these and perform your designated shortcut or task. You can have these launch apps or run a series of actions that you define in the Shortcuts app, and once set up, you won't have to first ask Siri to be ready. 
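Mechanically, a vocal shortcut is a mapping from a trained utterance to an action. A minimal, entirely hypothetical sketch:

```python
# Toy registry mapping a trained utterance to an action, per the description above.
shortcuts = {
    "oy": lambda: print("Opening camera..."),
    "lights": lambda: print("Running 'Movie night' scene..."),
}

def on_recognized(utterance):
    # Normalize what the recognizer heard, then run the matching action.
    action = shortcuts.get(utterance.lower().strip("!"))
    if action:
        action()

on_recognized("Oy!")  # Opening camera...
```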

Another improvement coming to vocal interactions is "Listen for Atypical Speech," which has iPhones and iPads use on-device machine learning to recognize speech patterns and customize their voice recognition around your unique way of vocalizing. This sounds similar to Google's Project Relate, which is also designed to help technology better understand those with speech impairments or atypical speech.

To build these tools, Apple worked with the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign. The institute is also collaborating with other tech giants like Google and Amazon to further development in this space across their products.

For those who are deaf or hard of hearing, Apple is bringing haptics to music players on iPhone, starting with millions of songs on its own Music app. When enabled, music haptics will play taps, textures and specialized vibrations in tandem with the audio to bring a new layer of sensation. It'll be available as an API so developers can bring greater accessibility to their apps, too. 

Drivers with disabilities need better systems in their cars, and Apple is addressing some of the issues with its updates to CarPlay. Voice control and color filters are coming to the interface for vehicles, making it easier to control apps by talking and for those with visual impairments to see menus or alerts. To that end, CarPlay is also getting bold and large text support, as well as sound recognition for noises like sirens or honks. When the system identifies such a sound, it will display an alert at the bottom of the screen to let you know what it heard. This works similarly to Apple's existing sound recognition feature in other devices like the iPhone.

[Image: A graphic demonstrating Vehicle Motion Cues on an iPhone, with a drawing of a car and arrows on either side of its rear. Credit: Apple]

For those who get motion sickness while using their iPhones or iPads in moving vehicles, a new feature called Vehicle Motion Cues might alleviate some of that discomfort. Since motion sickness is commonly caused by a sensory conflict between looking at stationary content and being in a moving vehicle, the new feature is meant to better align the conflicting senses through onscreen dots. When enabled, these dots will line the four edges of your screen and sway in response to the motion the device detects. If the car moves forward or accelerates, the dots will sway backwards, as if in reaction to the increase in speed in that direction.
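You can model the effect as an offset applied to the edge dots, opposite in direction to the measured acceleration. A toy version with an invented gain factor:

```python
# Toy version of the effect: edge dots offset opposite to measured acceleration.
GAIN = 12.0  # hypothetical pixels per m/s^2

def dot_offset(accel_forward_ms2):
    # Car accelerates forward -> dots sway backwards (negative screen offset).
    return -GAIN * accel_forward_ms2

for a in (0.0, 1.5, -2.0):  # cruising, speeding up, braking
    print(f"accel {a:+.1f} m/s^2 -> dot offset {dot_offset(a):+.1f} px")
```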

There are plenty more features coming to the company's suite of products, including Live Captions in VisionOS, a new Reader mode in Magnifier, support for multi-line braille and a virtual trackpad for those who use Assistive Touch. It's not yet clear when all of these announced updates will roll out, though Apple has historically made these features available in upcoming versions of iOS. With its developer conference WWDC just a few weeks away, it's likely many of today's tools get officially released with the next iOS.


Google just snuck a pair of AR glasses into a Project Astra demo at I/O

In a video showcasing the prowess of Google's new Project Astra experience at I/O 2024, an unnamed demonstrator asked Gemini, "Do you remember where you saw my glasses?" The AI impressively responded, "Yes, I do. Your glasses were on a desk near a red apple," despite said object not actually being in view when the question was asked. But these glasses weren't your bog-standard vision aid; they had a camera onboard and some sort of visual interface!

The tester picked up their glasses and put them on, and proceeded to ask the AI more questions about things they were looking at. Clearly, there is a camera on the device that's helping it take in the surroundings, and we were shown some sort of interface where a waveform moved to indicate it was listening. Onscreen captions appeared to reflect the answer that was being read aloud to the wearer, as well. So if we're keeping track, that's at least a microphone and speaker onboard too, along with some kind of processor and battery to power the whole thing. 

We only caught a brief glimpse of the wearable, but from the sneaky seconds it was in view, a few things were evident. The glasses had a simple black frame and didn't look at all like Google Glass. They didn't appear very bulky, either. 

In all likelihood, Google is not ready to actually launch a pair of glasses at I/O. It breezed right past the wearable's appearance and barely mentioned the glasses, saying only that Project Astra and the company's vision of "universal agents" could come to devices like our phones or glasses. We don't know much else at the moment, but if you've been mourning Google Glass or the company's other failed wearable products, this might instill some hope yet.


Google’s Project Astra uses your phone’s camera and AI to find noise makers, misplaced items and more.

When Google first showcased its Duplex voice assistant technology at its developer conference in 2018, it was both impressive and concerning. Today, at I/O 2024, the company may be bringing up those same reactions again, this time by showing off another application of its AI smarts with something called Project Astra. 

The company couldn't even wait till its keynote today to tease Project Astra, posting a video of a camera-based AI app to its social media accounts yesterday. At the keynote itself, though, Google DeepMind CEO Demis Hassabis shared that his team has "always wanted to develop universal AI agents that can be helpful in everyday life." Project Astra is the result of progress on that front.

According to a video that Google showed during a media briefing yesterday, Project Astra appears to be an app with a viewfinder as its main interface. A person holding up a phone pointed its camera at various parts of an office and said, "Tell me when you see something that makes sound." When a speaker next to a monitor came into view, Gemini responded, "I see a speaker, which makes sound."

The person behind the phone stopped and drew an onscreen arrow to the top circle on the speaker and said, "What is that part of the speaker called?" Gemini promptly responded "That is the tweeter. It produces high-frequency sounds."

Then, in the video that Google said was recorded in a single take, the tester moved over to a cup of crayons further down the table and asked "Give me a creative alliteration about these," to which Gemini said "Creative crayons color cheerfully. They certainly craft colorful creations."

The rest of the video shows Gemini in Project Astra identifying and explaining parts of code on a monitor, and telling the user what neighborhood they were in based on the view out the window. Most impressively, Astra was able to answer "Do you remember where you saw my glasses?" even though said glasses were completely out of frame and had not previously been pointed out. "Yes, I do," Gemini said, adding, "Your glasses were on a desk near a red apple."

After Astra located those glasses, the tester put them on and the video shifted to the perspective of what you'd see on the wearable. Using a camera onboard, the glasses scanned the wearer's surroundings to see things like a diagram on a whiteboard. The person in the video then asked "What can I add here to make this system faster?" As they spoke, an onscreen waveform moved to indicate it was listening, and as it responded, text captions appeared in tandem. Astra said "Adding a cache between the server and database could improve speed."

The tester then looked over to a pair of cats doodled on the board and asked "What does this remind you of?" Astra said "Schrodinger's cat." Finally, they picked up a plush tiger toy, put it next to a cute golden retriever and asked for "a band name for this duo." Astra dutifully replied "Golden stripes."

This means that not only was Astra processing visual data in real time, it was also remembering what it saw and working from an impressive backlog of stored information. This was achieved, according to Hassabis, because these "agents" were "designed to process information faster by continuously encoding video frames, combining the video and speech input into a timeline of events, and caching this information for efficient recall."
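Here's a stripped-down sketch of that timeline-and-recall idea; the event structure and recall logic are my own invention, not DeepMind's:

```python
# Minimal sketch of the "timeline of events" idea Hassabis describes:
# cache what each frame showed, then answer recall questions from the cache.
timeline = []  # (t_seconds, kind, content)

def observe(t, kind, content):
    timeline.append((t, kind, content))

observe(3.0, "frame", "speaker next to monitor")
observe(9.5, "frame", "glasses on desk near a red apple")
observe(14.2, "speech", "give me an alliteration about these crayons")

def recall(keyword):
    # Return the most recent cached event mentioning the keyword, if any.
    hits = [e for e in timeline if keyword in e[2]]
    return hits[-1] if hits else None

print(recall("glasses"))  # (9.5, 'frame', 'glasses on desk near a red apple')
```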

It's also worth noting that, at least in the video, Astra responded quickly. Hassabis noted in a blog post that "While we’ve made incredible progress developing AI systems that can understand multimodal information, getting response time down to something conversational is a difficult engineering challenge."

Google has also been working on giving its AI more range of vocal expression, using its speech models to enhance how the agents sound, giving them "a wider range of intonations." This sort of mimicry of human expressiveness in responses is reminiscent of Duplex's pauses and utterances, which led people to think Google's AI might be a candidate for the Turing test.

While Astra remains an early feature with no discernible launch plans, Hassabis wrote that in the future, these assistants could be available "through your phone or glasses." No word yet on whether those glasses are an actual product or the successor to Google Glass, but Hassabis did write that "some of these capabilities are coming to Google products, like the Gemini app, later this year."


Ask Google Photos to help make sense of your gallery

Google is inserting more of its Gemini AI into many of its products, and the next target in its sights is Photos. At its I/O developer conference today, CEO Sundar Pichai announced a feature called Ask Photos, which is designed to help you find specific images in your gallery by talking to Gemini.

Ask Photos will show up as a new tab at the bottom of your Google Photos app. It'll start rolling out to Google One subscribers first, in US English, over the coming months. When you tap over to that panel, you'll see the Gemini star icon and a welcome message above a bar that prompts you to "search or ask about Photos."

According to Google, you can ask things like "show me the best photo from each national park I've visited," which not only draws upon GPS information but also requires the AI to exercise some judgement in determining what is "best." The company's VP for Photos Shimrit Ben-Yair told Engadget that you'll be able to provide feedback to the AI and let it know which pictures you preferred instead. "Learning is key," Ben-Yair said.
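A query like that decomposes into grouping photos by place and ranking within each group. A minimal sketch with invented records and scores standing in for Gemini's judgment:

```python
# Hypothetical photo records: Ask Photos would derive these from GPS + ML scoring.
photos = [
    {"park": "Yosemite", "file": "IMG_001.jpg", "score": 0.71},
    {"park": "Yosemite", "file": "IMG_014.jpg", "score": 0.93},
    {"park": "Zion",     "file": "IMG_203.jpg", "score": 0.88},
]

# Keep the highest-scoring photo per park.
best = {}
for p in photos:
    if p["park"] not in best or p["score"] > best[p["park"]]["score"]:
        best[p["park"]] = p

for park, pic in best.items():
    print(park, "->", pic["file"])
```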

You can also ask Photos to find your top photos from a recent vacation and generate a caption to describe them so you can more quickly share them to social media. Again, if you didn't like what Gemini suggested, you can also make tweaks later on.

For now, you'll have to type your query to Ask Photos — voice input isn't yet supported. And as the feature rolls out, those who opt in to use it will see their existing search feature get "upgraded" to Ask. However, Google said that "key search functionality, like quick access to your face groups or the map view, won't be lost."

The company explained that there are three parts to the Ask Photos process: "Understanding your question," "crafting a response" and "ensuring safety and remembering corrections." Though safety is only mentioned in the final stage, it should be baked in the entire time. The company acknowledged that "the information in your photos can be deeply personal, and we take the responsibility of protecting it very seriously."

To that end, queries are not stored anywhere, though they are processed in the cloud (not on device). People will not review conversations or personal data in Ask Photos, except "in rare cases to address abuse or harm." Google also said it doesn't train "any generative AI product outside of Google Photos on this personal data, including other Gemini models and products."

Your media continues to be protected by the same security and privacy measures that cover your use of Google Photos. That's a good thing, since one of the potentially more helpful ways to use Ask Photos might be to get information like passport or license expiry dates from pictures you might have snapped years ago. It uses Gemini's multimodal capabilities to read text in images to find answers, too.
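At its simplest, pulling an expiry date out of a photo is OCR plus pattern matching. A toy sketch over pretend OCR output (the date format is an assumption):

```python
import re

# Pretend OCR output from a passport photo; the format is invented for illustration.
ocr_text = "PASSPORT ... Date of expiry: 12 MAR 2027 ..."

match = re.search(r"expiry:?\s*(\d{1,2}\s+[A-Z]{3}\s+\d{4})", ocr_text, re.I)
if match:
    print("Expiry date:", match.group(1))  # 12 MAR 2027
```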

Of course, AI isn't new in Google Photos. You've always been able to search the app for things like "credit card" or a specific friend, using the company's facial and object recognition algorithms. But Gemini AI brings generative processing so Photos can do a lot more than just deliver pictures with certain people or items in them.

Other applications include getting Photos to tell you what themes you might have used for the last few birthday parties you threw for your partner or child. Gemini AI is at work here to study your pictures and figure out what themes you already adopted.

There are a lot of promising use cases for Ask Photos, which is an experimental feature at the moment and is "starting to roll out soon." Like other Photos tools, it might begin as a premium feature for One subscribers and Pixel owners before trickling down to everyone who uses the free app. There's no official word yet on when or whether that might happen, though.
