Engadget Podcast: Recapping WWDC 2024 from Apple Park

There was no new Apple hardware at WWDC 2024, but Apple still had tons of news around AI and its upcoming operating systems. In this bonus episode, Cherlynn and Devindra brave the California heat to discuss Apple Intelligence and how it's different from other AI solutions. They also dive into other new features they're looking forward to, like iPhone mirroring in macOS Sequoia and iPadOS 18's surprisingly cool Calculator app.


Listen below or subscribe on your podcast app of choice. If you've got suggestions or topics you'd like covered on the show, be sure to email us or drop a note in the comments! And be sure to check out our other podcast, Engadget News!

Hosts: Devindra Hardawar and Cherlynn Low
Music: Dale North and Terrence O'Brien

Devindra: What's up, folks? This is Devindra here, and we are live at Apple Park. Cherlynn and I are in the middle of covering Apple's WWDC conference. Cherlynn, what's up? How's it going?

Cherlynn: I feel quite zen right now, because even though I have a lot more meetings coming up, we are seated outside, it's nice out, and even though it's really hot, I'm not dying. It's nice. I'm chill.

Devindra: It's nice. We've both gone through four to five meetings, we've gone through the keynote, and we're writing a bunch of news, folks. So we're just gonna sit down and give you our thoughts about what's going on. Cherlynn and I also did a video that's up on our YouTube channel recapping why we think Apple Intelligence is doing things a little differently, and maybe better, than stuff from Microsoft and Google. But yeah, Cherlynn, you've been talking with Apple a lot.

What is your general takeaway from this year's WWDC?

Cherlynn: Yeah, to set the stage a little: this morning at 8AM I had my first meeting, and it's been four meetings since, like you said, Devindra, covering topics like Apple Intelligence, privacy, iOS 18, iPadOS 18 and watchOS 11 as well. My main thing is that yes, throughout the keynote we heard things that we've seen on other platforms, right?

Like they're blatantly copying Magic Eraser from Google's photo editor with this thing called Clean Up in Photos. And they're adding things like the ability to rearrange your apps and skin them the way you can with Android's Material You. But the way Apple's thought things through proves, and continues to prove, to be different from everyone else.

It's a bit more thoughtful, a bit cleaner, a bit more sophisticated. And again, I think you see this most in Apple Intelligence. And Devindra, you've been asking everybody here: can we say AI? Can we?

Devindra: I don't know. One thing I figured out, at least as we were writing about Apple Intelligence, is that making an acronym for it is tough, because I can't just call it AI and then talk about that stuff versus Copilot or versus OpenAI. So I've started using "Apple AI" as a way to shorten it. But I have been asking Apple folks here, basically everyone we've encountered, how they shorten Apple Intelligence, and the resounding response I get is like a data processing error.

It's like watching a human just stop being able to process information. They look over to the PR person, like, what do I say here? But the response I always get is: Apple Intelligence. That's all we say. We only ever say Apple Intelligence. One person said "personal intelligence," which is a phrase Tim Cook used.

But yeah, it is funny that it seems almost like a corporate command not to call Apple Intelligence "AI" or shorten it that way in any way.

Cherlynn: They think of the letters AI as standing for Apple Intelligence, it seems. And the word they fall back on when they don't want to say Apple Intelligence is just "Intelligence," which is only a couple of syllables more than saying AI. Still, though, AI is so much easier to say, in my opinion.

Devindra: It just feels like they have stumbled into this weird branding hole, where they took the letters A and I, but they can't use "AI," even though it is also an AI-powered thing. I just think it's funny, and it shows how absurd these companies can be at some points. But yeah, let's briefly talk about Apple Intelligence, Cherlynn. I'm more impressed by what Apple's doing here because it does seem like they're announcing features we actually would want to use, and it's more centered on features within apps, stuff like making Siri better, rather than what Microsoft did.

Microsoft was just like, hey, nobody likes our search engine, so we put AI in our search engine, and then everybody all of a sudden thought it was cool. Then they rebranded it as Copilot, put that in Windows, and it's like, dot dot dot, profit. I don't think it actually led to anything.

I don't care about Copilot in Windows; it hasn't been functionally useful for me. But just looking at the stuff Apple has shown off here, I want to use this new Siri, I want to use a lot of these new features they're showing off. I don't know if you feel differently.

Cherlynn: I think Siri is only one part of the Apple Intelligence puzzle.

I think there's a lot of other stuff that they demoed that would be very intriguing. I do feel like a lot of their Writing Tools, things that we're going to see on Mac and iPad, are things we've seen elsewhere: Copilot, Gemini and OpenAI have all offered some version of rewriting something for you, summarizing it, providing a TL;DR.

Apple, obviously being the vertical integration king that it is, is good at bringing it in so that when you highlight a body of text, you can see this blue or yellow circle up near the top left, where you find your copy and paste options, and you might also go there to get a writing tool. Like, help me adjust the tone of this cover letter I'm writing, for example. It's stuff that we've seen, but applied just a bit better, a bit more thoughtfully. As for the Siri stuff, they've redesigned Siri to better understand you if you interrupt yourself in the middle of issuing a command. If you're like, oh, adjust this timer, oh, sorry, set it for 15, not 20 minutes, that sort of thing, it will do it. It's smart enough. It is definitely more thought out, more system-wide and deeply integrated. And to use their own words, it's more personally, contextually aware than, say, Gemini on a Pixel phone. And that's the only real other place I can compare it to, because Copilot on Surface PCs doesn't seem that deeply integrated just yet.

Devindra: It almost seems like Copilot is directionless. It's like Microsoft was just like, hey, OpenAI is cool, do you like ChatGPT? Look, we put it in Windows. Are you not entertained? Aren't you happy about this? And I wasn't; I've tested this stuff for a while. I think Google's at least trying to be a little more thoughtful about how it's doing it with Gemini.

Like it's trying to hook into all the Google services and all the stuff you're already relying on. But Apple's whole thing is that they are building on the privacy standards they have talked about before. A lot of this processing is happening on device with their local models, and they do go to the cloud for some of the more complex things.

But we also read about the Private Cloud Compute system they were talking about, and even that seems cool. It's weird to even discuss something like this, but they have basically created a cloud solution that they say is more secure, with an anonymized connection.

First of all, Apple's models only send little bits of data to their cloud, and it's anonymized, like the Private Relay thing they have on iPhones. These servers don't save your data and they don't save logs, so that's also something that will prevent authorities, like police or the FBI, from getting records of what you're doing; Apple is just keeping itself out of that. They also say they're publishing the images of the software being used on the servers for researchers to audit and take a look at, and your phone can only talk with servers running the exact software it expects, so your phone will have to keep getting updated. There are multiple layers of security here, which is not the sort of thing I think most people consider when they're using cloud services, at least from what I've seen.

Cherlynn: So, one thing: the irony of Copilot being directionless is just quite funny to me. You don't want a copilot, or a pilot, to be directionless. But anyway, yeah, Private Cloud Compute is definitely something that Apple is approaching differently compared to Microsoft and Google, in that it explicitly lays out how anonymized, protected and encrypted your data is. And true to Apple's point, which is something Craig Federighi pointed out during the keynote as well, they actually put it out there: they want independent verification and validation that their stuff is happening securely, including all the transfers of your data being processed. For example, right after the keynote, the owner of a certain social media platform claimed the OpenAI integration is going to be a security risk; I am referring to Elon Musk's posts on X. And from my understanding, having taken a lot of meetings since, the OpenAI integration is happening like this: whenever you ask an Apple Intelligence device a query, through Siri for example, the first thing it does is figure out whether it can handle it on device or whether it needs to go through Private Cloud Compute to the servers to process.

On device is obviously quite direct, right? But if it needs to pass your information on to ChatGPT, because you've asked it something along those lines, it will first surface a prompt: hey, do you want to pass your information to ChatGPT? And it will do that every time; you're not going to say yes once and then never be asked about ChatGPT access again. Then there is a contractual arrangement between OpenAI and Apple which prevents OpenAI from storing your requests. Also, Apple is just not handing any IP address information over; it's using that sort of Private Relay approach to pass on any information needed while hiding the actual IP. And once OpenAI has finished ChatGPT-ing your answers for you, it is supposed to erase your information and get rid of it. It is a contractual thing, supposedly, and that remains to be seen; it comes down to how much you trust OpenAI to do that. And similar concepts apply for Private Cloud Compute here.

So again, very well thought out, right? Just very Apple in its approach.

Devindra: Thoughtful, I think, is the word. I don't want to sound too much like a fanboy (I said this in the video too), and we have not seen this stuff in action yet, or in the wild. But I think the initial problems we saw were with something like Microsoft Recall, which was a cool idea.

But with Microsoft, it was a blunt force approach: hey, we're just going to remember everything you did on your computer by capturing everything you did on your computer, and we're going to save it in a database on your system that, hey, anybody can apparently access, with very little protection around it.

And it literally took days for security researchers to be like, what the hell is this? This is very easy to break through. Microsoft ended up having to basically rework how that feature works. Initially it was a feature that was always enabled and you had to opt out of it; now it's opt in. People had to complain to get these very obvious issues alleviated. At the very least, I don't have that sense with Apple. I feel like they've at least sat down, maybe also talked with researchers, and asked: is this cool? Is this actually copacetic in terms of privacy and user safety and everything?

So I don't know if you have any further thoughts on that.

Cherlynn: Because Apple knows the price to pay if it's caught with egg on its face is so high, and arguably higher for it than for any of its rivals, it is all the more invested in making sure this is being done the right way. And honestly, I wish Google and Microsoft would take notes. I will say there are a lot of other privacy things that are very intriguing to me. I'm fresh from a privacy-related demo. The Passwords app is a new thing that I'm very excited about; I'm very welcome... or I'm welcoming it? Whatever. They're also changing certain things, like allowing or limiting access to all your contacts in the permission settings for the various apps that ask.

For example, if I'm playing Match Factory, why do they need access to all my contacts? They don't. I also think it's funny, no, maybe not funny: one of the new features coming to iOS 18 is locking and hiding specific apps. Locking makes sense, I get it. Hiding, though, seems like you're Ashley Madison-ing things for everybody, letting the cheaters of the world keep secrets.

I don't know how I feel about that, but it's the atomic bomb thing, right? Do you make it and then let people use it how they will? I don't know who at Apple decided this was a necessary feature. Are you going to use this feature, Devindra?

Devindra: Listen, I could have my Tinder account somehow.

I don't know. But I do think the app locking thing is cool, because parents often have to give their devices to kids, and you don't want them to swipe away or do other stuff. This way you can lock an app if you want to show off your photos or show something to somebody and have them not poke around.

It's also a very common setup we see in TV shows and movies, where somebody's like, hey, can I take a look at this photo? And then they get all your personal data from your phone instantly because it's open. So it seems like a very smart way of dealing with privacy, too.

Cherlynn: And it's very Apple-esque in that if you lock an app, say Messages, for example, then the contents of that app won't show up in search, or in Siri Suggestions, or Spotlight suggestions, or even Maps suggestions. There's just a lot here. And just to take broader notes away from WWDC, like I said earlier, there are a lot of small changes that make everything seem very meaty. iOS 18 actually might be a big upgrade: the Messages updates that are coming, the new Tapback emoji.

Finally, we can do more than exclamation marks. Sometimes I just want to make a sad face, and I couldn't; I had to do a thumbs down. I like that they're coming. Oh, and back to Apple Intelligence. I know I'm jumping around a bit, but talking about emoji, another thing Apple did right from the get-go, I think because it's been able to observe the pitfalls other people have fallen into, is to say: OK, we're limiting this to very obviously cartoonish, graphic-like representations, so nothing photorealistic.

And when it's creating images of people in Genmoji, you can only use your own creations. You're basically choosing from a template, or from the people you have in your Photos or your People gallery. But because it's a cartoonish representation, people are never going to mistake it for someone who's actually real.

And there are guardrails in place that prevent you from making Image Playground generate something that looks harmful or violent or is exploitative. Which, again, goes to show Apple's thought this through, right?

Devindra: And I think a lot of people are asking: what are these models trained on?

Because Apple's talking about a lot of its own models: small ones that run directly on your device, larger ones that are in the cloud. And occasionally they'll reach out to OpenAI for ChatGPT stuff. Apple has told us that they are training their models on licensed data, like images, things like that.

Some stuff comes from the open web, and publishers can refuse to participate; they can say their site is not crawlable by Apple's tools. And they say that if somebody changes down the line what they want to be accessible to Apple's models, Apple will reflect that with further updates.

So again, it's opaque, but at least what they're telling us sounds better to me than what I've heard from Google, and certainly from OpenAI. So I think that's cool. Real quick, let's talk about macOS Sequoia, which has most of the same features; pretty much all of these features come across all of Apple's products.

So Apple Intelligence, by the way, is going to work on Macs running Apple Silicon: no M4 Macs yet, but M-series Macs from the M1 on, and also the iPhone 15 Pro. Cherlynn, you wrote a piece about the features people can expect if you have an iPhone 14 Pro. Basically, you're out of luck.

You get some iOS 18 features, but not everything, right?

Cherlynn: Yeah, all the iOS 18 features, but none of the Apple Intelligence features, unfortunately. So that redesigned Siri with the glowing edges? That's not coming. It is so pretty. I also want to say that the iPadOS things seem really cool: all the Pencil features, the handwriting stuff.

In a lot of the keynote, and in demos I've taken, some features are labeled as ML-powered. Smart Script, for example, is powered by ML, but it's not part of Apple Intelligence. So you are still going to get that in iPadOS 18 when you upgrade, regardless of whether you have an M1 iPad or an older one. But yeah, I gotta talk to you about Math Notes, Devindra. Were you blown away by that demo? When you just draw the equals sign, the thing sums itself; the solution just solves itself. It's Mean Girls Mathletes, but on a whole other level to me.

Devindra: It's cool, and that's also something they say is ML-powered, not necessarily Apple Intelligence-powered.

So if you have older iPads, you will see some benefits of that. It's cool, but I also feel like, I don't know, it's like a superpowered calculator. I don't know how many people have Apple Pencils and are scribbling down math formulas, but it's cool. I dig it.

Cherlynn: To begin with, the fact that an iPad never had a Calculator app before this is astonishing.

But now that it's here, Apple's clearly thought about it: look, we're bringing this to the bigger screen, we want it to be Pencil-friendly, we want it to be big-screen-friendly, let's really think about the layout here. And this explains why there's been a delay, and I actually get it. You can go into the history tab to see your previous calculations. There are a lot more calculations you can do on this calculator, like currency conversion, which I forgot to ask about: how is it pulling the actual rate? But whatever. And then you can go into the notes section, and I almost feel like "calculator" is a misnomer in this case, because it's doing way more than calculating and solving equations.

You can draw a blueprint of a house and have it measure the areas, like length and width, but at the same time map that to price calculations, like price estimates if you use this or that material. Something I saw happen was: you write price equals X, area equals Y, and then price times area equals, and once you draw the area, it solves. It's basically programming, but all done in the notes app.

And that really blew my mind a little bit, which I hate to admit, because I don't like to be so fangirly, but damn, that was cool.

Devindra: It was cool. Maybe that's the excuse for not having a calculator app built into the iPad before. One thing I want to mention about macOS Sequoia is iPhone mirroring, which was something I half predicted.

I wrote a wish-list piece about what I would want to see in Vision Pro and visionOS 2, and one thing was that I would really love to be able to mirror the iPhone just the way you can mirror a MacBook inside Vision Pro: have a full projection of the screen. That's not coming to visionOS 2; visionOS 2 seems like a very minor update.

But it is coming to macOS Sequoia. To use it, you need an iPhone with iOS 18 and a Mac with macOS Sequoia, and you get it almost instantly. I haven't seen how it actually works in real time, but it does seem like you hit a button, you get a window view of your iPhone, and you use it.

On your Mac, you see your home screen as you would in real life, you can scroll between apps, and your notifications very smartly get reintegrated into the Mac's notifications. That's fun. You can play games on your iPhone, and when you launch a game, the window will go widescreen. The audio seems to come through pretty quickly. It just seems like a really cool feature, because at least on Macs, I always have my phone nearby.

It's always doing other stuff, but I would love to be able to just have that open and also see other notifications coming in. It's just very extensible in terms of how you're interacting with your hardware. The iPhone's screen, by the way, stays locked while you're mirroring, so it doesn't look like somebody is remotely poking around on your phone.

One thing we learned from Apple, because I've been asking around about this: if you mirrored your Mac inside of the Vision Pro, and that Mac was also mirroring an iPhone, would you actually be able to do the iPhone stuff from within the Vision Pro? And I've heard from a couple of folks that it's basically not going to happen.

iPhone mirroring is part of Continuity, those features that let you copy and paste across devices and stuff, and you can only run one mirroring feature at a time. So basically you can't do that with the Mac. I'm still sitting here waiting for iPhone mirroring in visionOS.

Clearly, though, they have the capability. The Vision Pro is running an M2 chip, and if iPhone mirroring works on M1 and M2 Macs, there is no reason why it won't come to Vision Pro eventually. So I feel like we half won that bet, basically.

Cherlynn: I just gotta say that Windows and Android have been trying to do this forever.

I can remember years ago when the first Galaxy Books tried to do this; that's at least my earliest encounter with it. It works. I haven't seen this happen on the iPhone side of things yet, as in I haven't personally taken a demo, so I couldn't tell you if it's actually better or more thoughtful. But knowing Apple, knowing its deep integration prowess, it's probably gonna work better.

Devindra: Probably. Apple also gives very good demos; that's something we've learned. Do you have any further thoughts about WWDC or what's ahead for Apple, Cherlynn?

Cherlynn: I have so much to dive into in detail, like the watchOS stuff, the iOS and iPadOS features.

I guess, broad strokes: it feels almost revolutionary, because Apple is finally jumping on board the AI train, renaming the train and basically taking it over. And you know what? The thing is, I hate this, but now people are going to pay attention. Because now my friends are actually going to realize what Genmoji is supposed to do, what you can actually do by feeding an AI generator.

And we're gonna start seeing more writing-tool-assisted emails and reviews. I think it's momentous. I think people are really going to start paying attention to what AI means and what it can do. I don't know if it's good for the world, but it just feels big.

Devindra: We ran out of time talking to Apple people, but I did want to ask them: do you think these writing tools are actually helpful?

Because then all our emails, all our conversations are going to start to sound like weirdly robotic or extra-formal AI text, and I'm not a fan of that, not too interested. But the Genmoji stuff is cool, because we have had DALL-E and other things create AI images, and what do you do with them? Post them on social media? I don't know. Genmoji is just: if you want to create an emoji based on a specific feeling, you can create a thing to your liking. Just a really smart use of that technology. So anyway, I am in the process of installing the iOS 18 developer beta on my phone. I think, according to the rules, we can't talk about that, but we can talk about it when they launch the public beta, which is later next month, I believe. But we're going to be testing this stuff out, we're going to be thinking about these features. Any other takeaways from Apple, Cherlynn?

Cherlynn: No, but send us your thoughts! podcast@engadget.com is the most direct way to reach us. And come back: we do a Thursday livestream on our YouTube channel with direct Q&A sessions, where we can probably answer your questions in real time. And I'm pretty sure we'll still continue to dig deep into what we learned this week on our episode that drops on Fridays, or Thursday nights, right? Come back for all of that.

Devindra: Yeah, definitely. We're still gonna be doing a longer, normal podcast episode this week. Cherlynn and I are in California now, but we'll be flying back tomorrow and ready to podcast and livestream on Thursday. So we'll be back, folks. Let us know what you think about all this news: podcast@engadget.com. Thanks, folks. We're out.

This article originally appeared on Engadget at https://www.engadget.com/engadget-podcast-recapping-wwdc-2024-from-apple-park-011440662.html?src=rss

You can’t mirror your iPhone while mirroring your Mac on Apple Vision Pro

So close, yet so far. Ahead of WWDC 2024, I had hoped Apple would let you mirror your iPhone inside of the Vision Pro, just like how you can use your Mac on an enormous virtual display. Instead, we got iPhone Mirroring on macOS Sequoia. As the name implies, it will let you see everything on your iPhone from the comfort of your Mac.

But, I wondered, what if you mirrored a Mac that was mirroring an iPhone in the Vision Pro? It seems like the ideal workaround in theory, one that would solve the headset's annoying inability to play nicely with your iPhone. But, unfortunately, it won't work. We've heard from knowledgeable sources that Apple's hardware only supports one of its Continuity mirroring features at a time. So if you're sending your Mac's screen to the Vision Pro, you won't be able to mirror your iPhone at the same time.

We haven't heard the exact reason for that limitation, but I'd wager it comes down to networking limitations. Mirroring a sharp and lag-free version of your Mac's screen is difficult enough — juggling that alongside a perfectly rendered copy of your iPhone might be too tough for some Macs. Apple is already pushing beyond its current Continuity restrictions with visionOS 2, which will support higher resolution Mac mirroring, as well as the ability to virtualize an ultra-wide display. So perhaps there's room for multi-device mirroring down the line.

It's not hard to imagine Apple bringing the iPhone mirroring feature directly to the Vision Pro eventually, but ideally, it would also work alongside Mac mirroring in the headset.

Here are a few other tidbits we've learned about iPhone mirroring on macOS Sequoia while exploring WWDC: 

  • It requires both Wi-Fi and Bluetooth to work, and the iPhone is projected at 60 fps.

  • When you launch a game, the iPhone window flips into landscape view on your Mac. The game's sound also appears to be synchronized well.

  • Mirroring will use around the same amount of battery life on your iPhone as typical usage.

  • If you unlock your iPhone directly, the mirrored window closes immediately on your Mac.

  • You'll eventually be able to drag and drop files and other content between your iPhone and Mac. This feature will also be available on third-party apps.
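For the technically curious, the behavior described in those bullet points can be modeled as a tiny state machine. The Python sketch below is purely illustrative (the class and method names are made up by us, not any Apple API), but it captures the rules: mirroring needs both Wi-Fi and Bluetooth, runs at 60 fps, and ends the moment you unlock the iPhone directly.

```python
# Hypothetical model of the iPhone mirroring rules described above.
# All names here are illustrative; this is not an Apple API.

class MirroringSession:
    FPS = 60  # the iPhone is projected at 60 frames per second

    def __init__(self, wifi_on: bool, bluetooth_on: bool):
        # Mirroring requires both Wi-Fi and Bluetooth to work.
        self.active = wifi_on and bluetooth_on

    def unlock_iphone_directly(self):
        # Unlocking the iPhone itself closes the mirrored window.
        self.active = False
```

For example, a session started with Bluetooth off never activates, and an active session dies as soon as the phone is unlocked in hand.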

Update 6/12/24, 1:16PM ET: Early testers have discovered that visionOS 2 supports direct AirPlay mirroring from iPhones and iPads. This isn't the same as the Mac's iPhone mirroring feature, since you can't directly interact with the window within Vision Pro, but it's one way to keep tabs on your other devices. We've reached out to Apple for comment on this feature, which wasn't discussed during WWDC. 

Catch up here for all the news out of Apple's WWDC 2024.

This article originally appeared on Engadget at https://www.engadget.com/you-cant-mirror-your-iphone-while-mirroring-your-mac-on-apple-vision-pro-222021905.html?src=rss

Apple refuses to call Apple Intelligence ‘AI’

"How do you shorten Apple Intelligence?" That’s the question I’ve asked several Apple employees at WWDC 2024, and their practiced responses have become comically absurd.

“We just say Apple Intelligence,” they tell me. “Yeah, but do you say that every time? The AI acronym is right there!” I’d retort. The usual response is a stiff smile and clenched teeth, like a human programming error in real time. (Yes, I'm aware it's just overly aggressive media training in action.) One person suggested they also say "personal intelligence" — yes, a phrase that's longer than Apple Intelligence.

There's no doubt Apple Intelligence means many things to the company. It's an effort to compete with Microsoft's (still unproven) Copilot and Google's Gemini. It's a way to make Apple seem "hip" with ChatGPT. And it should enable a slew of new features for consumers. But Apple Intelligence is never "AI" to Apple.

Normally, I'd chalk this up to a silly branding quirk. But it becomes a problem as we cover Apple Intelligence. It's a long phrase that's just begging to be shortened to "AI," but then how do you distinguish that abbreviation from ChatGPT, Copilot and the general concept of AI? During the WWDC 2024 keynote, Apple only mentioned the phrase "artificial intelligence" three times: twice while referring to its previous AI-powered features, and once while referring to "other artificial intelligence tools" like ChatGPT.

At this point, I've just decided to call Apple Intelligence "Apple AI." It's shorter and it differentiates the product from competitors. And yes, it just means "Apple Apple Intelligence," but everyone still says "ATM machine" and "PIN number." It's not my fault Apple decided to co-opt the acronym "AI."


This article originally appeared on Engadget at https://www.engadget.com/apple-refuses-to-call-apple-intelligence-ai-195913202.html?src=rss

Apple Intelligence: What devices and features will actually be supported?

Apple Intelligence is coming, but not to every iPhone out there. In fact, you'll need to have a device with an A17 Pro processor or M-series chip to use many of the features unveiled during the Apple Intelligence portion of WWDC 2024. That means only iPhone 15 Pro owners (and those with an M-series iPad or MacBook) will get the iOS 18-related Apple Intelligence (AI?) updates like Genmoji, Image Playground, the redesigned Siri and Writing Tools. Then there are things like Math Notes and Smart Script on iPadOS 18 and the new features in Messages coming via iOS 18 that will be arriving for anyone that can upgrade to the latest platforms. It's confusing, and the best way to anticipate what you're getting is to know what processor is in your iPhone, iPad or Mac.

It's not clear exactly why older devices using an A16 chip (like the iPhone 14 Pro) won't support Apple Intelligence, given that their neural engine seems more than capable compared to the M1's. A closer look at the spec sheets of those two processors shows that the main differences appear to be in memory and GPU prowess. Specifically, the A16 Bionic supports a maximum of 6GB of RAM, while the M1 starts at 8GB and goes up to 16GB. In fact, all the supported devices have at least 8GB of RAM, which could hint at why your iPhone 14 Pro won't be able to handle making Genmoji.

Though it might not seem quite fair that owners of a relatively recent iPhone won't get to use Apple Intelligence features, you'll still be getting a healthy amount of updates via iOS 18. Here's a quick breakdown of what is coming via iOS 18, and what's only coming if your iPhone supports Apple Intelligence.

Basically everything described during the iOS portion of yesterday's WWDC 2024 keynote is coming to all iPhones (that can update to iOS 18). That includes the customizable home screen, Control Center, dedicated Passwords app, redesigned Photos app, new Tapback emoji reactions, text effects, scheduled sending and more. Messages via Satellite is only coming to iPhone 14 or newer, and you'll be able to send text messages, emojis and Tapbacks, but not images or videos. 

You'll also be tied to the satellite service plan that came with your iPhone 14 at purchase. If you bought your iPhone 14 in January 2024, you received a free two-year subscription covering Emergency SOS via Satellite and other satellite communication features, which now include texting. That means that to keep texting people via satellite after January 2026, you'll need to start paying for a plan.

There are also a whole host of updates coming with iOS 18 that Apple didn't quite cover in its keynote, and I'll be putting up a separate guide on those shortly. But suffice it to say that apps like Maps, Safari, Calendar and Journal are getting new functions that, together with the other changes mentioned so far, add up to a meaty OS upgrade.

As for which new features require Apple Intelligence: in short, all of them. If you have an iPhone 15 Pro or an iPad (or Mac) with an M-series chip, you'll get the redesigned Siri, Genmoji and Image Playground, as well as Writing Tools baked into the system. That means features like proofreading, summarizing and tone adjustment in apps like Mail, Notes and Keynote are limited to the AI-supported devices. If you don't have one of those, you'll get none of this.

The redesigned Siri, which is only coming through Apple Intelligence, will be able to understand what's on your screen to contextually answer your queries. If you've been texting with your friend about which baseball player is the best, you can ask Siri (by long-pressing the power button or just saying "Hey Siri") "How many home runs has he hit?" The assistant will know who "he" is in this context, and understand you're referring to the athlete, not the friend you're chatting with.

Apple Intelligence is also what brings the ability to type to Siri — and you can invoke this keyboard to talk to the assistant by double tapping the bottom of the screen. 

This also means the new glowing-edge animation that appears when Siri is triggered is limited to Apple Intelligence-supported devices. You'll still be looking at that little orb at the bottom of your screen when you talk to the assistant on an iPhone 14 Pro or older.

There are loads more features coming via Apple Intelligence, which appears to be set for release later this year. 

Catch up here for all the news out of Apple's WWDC 2024.

This article originally appeared on Engadget at https://www.engadget.com/apple-intelligence-what-devices-and-features-will-actually-be-supported-185850732.html?src=rss

Apple ID is now Apple Account

For all the AI features, customization options and everything else coming to Apple’s operating systems this year, there is one other notable update. The company is rebranding Apple ID to Apple Account in iOS 18, iPadOS 18, macOS Sequoia and watchOS 11.

The reason behind the change is to provide "a consistent sign-in experience across Apple services and devices," the company wrote in a blog post. Apple Account "relies on a user's existing credentials," so you won't have to change anything.

The betas of the new operating systems already use the term Apple Account, but MacRumors notes that Apple ID is still used in some places, such as the account sign-in page on Apple's website. The company is most likely going to complete the Apple Account transition by the time it rolls out the latest major public versions of the operating systems (which also include tvOS and visionOS) this fall.

Catch up here for all the news out of Apple's WWDC 2024.

This article originally appeared on Engadget at https://www.engadget.com/apple-id-is-now-apple-account-172019457.html?src=rss

The Morning After: Everything Apple announced at WWDC

Apple’s annual developer shindig kicked off with its traditional keynote outlining all the new tricks its products will soon do. There are big changes for iOS 18, iPadOS 18, macOS Sequoia and watchOS 11, not to mention visionOS 2. Some highlights include a standalone Passwords app, better health metrics on the Watch and Apple Intelligence, the company's own spin on AI. There’s plenty more, so keep reading for all the biggest stories from the show.

— Dan Cooper

Blackmagic is developing a camera for immersive Apple Vision Pro videos

Yes, iOS 18 will include RCS support

Apple’s new AI-powered Siri can use apps for you

Apple may integrate Google’s Gemini AI into iOS in the future

iOS 18 embraces Apple Intelligence, deeper customization and a more useful Siri

macOS Sequoia will let you see your iPhone mirrored on your Mac’s screen

iPadOS 18 is getting a big boost with Apple Intelligence

​​You can get these reports delivered daily direct to your inbox. Subscribe right here!

Image from WWDC 2024
Apple

Apple has finally bowed to pressure, bringing AI to its devices in the form of Apple Intelligence, powered by OpenAI. The system will bolster Siri, offering its generative AI smarts to write emails, summarize news articles and offer finer-grained control of your apps. Given Apple’s long-held distaste for machine learning gimmicks, it’ll be interesting to see if this can win where Google and Microsoft have floundered.

Continue Reading.

Image of apple passwords
Apple

Apple already has a dedicated password manager buried in its operating systems, but now it’ll be its own app. Passwords will act as a standalone password manager across every Apple platform and will even work on Windows via iCloud. Like iCloud Keychain, it’ll generate and record passwords to all of your sites and services, locking them behind biometric security.

Continue Reading.

This article originally appeared on Engadget at https://www.engadget.com/the-morning-after-everything-apple-announced-at-wwdc-111550649.html?src=rss

How Apple Intelligence could avoid Microsoft and Google’s AI mistakes

Apple's spin on AI is finally here, and it already seems smarter than Microsoft Copilot and Google Bard. Apple Intelligence focuses on privacy and "personal intelligence," with a bit of an assist from ChatGPT. While we haven't tested it ourselves yet, Apple appears to be avoiding the pitfalls of Microsoft's Recall feature, as well as Google Bard's unfortunate early gaffes. The company isn't trying to capture everything you're doing on your computer, and it's being careful about how it's using larger AI models like ChatGPT. 

Shortly after the WWDC 2024 keynote ended, Engadget's Cherlynn Low and Devindra Hardawar discussed why they think Apple is taking a more thoughtful approach to AI.

This article originally appeared on Engadget at https://www.engadget.com/how-apple-intelligence-could-avoid-microsoft-and-googles-ai-mistakes-000751533.html?src=rss

How does Apple send your data to its cloud AI servers? Very carefully, it claims.

For years, Apple has touted privacy as its major advantage over rivals like Google and Microsoft. Instead of relying on cloud processing to improve or organize your images, which requires sending your photos to Google's servers, Apple handles those tasks directly on your device. But with the advent of Apple Intelligence, its take on artificial intelligence, the company is stepping out of its comfort zone with "Private Cloud Compute." It says "private" right in the name, so it has to be secure, right?

While Apple AI will run some models locally, it will occasionally have to send data to Apple's servers for complex requests. So how is the company squaring this with its previous security stance? 

According to Craig Federighi, Apple's SVP of Software Engineering, the company is being very careful about how it's sending your data to its servers. "You're putting a lot of faith in the cloud... with Private Cloud Compute, the stakes are even higher," he said in a WWDC 2024 conversation with Apple's AI head, John Giannandrea, and YouTube influencer iJustine.

During the WWDC keynote, Federighi showed off how Apple AI could help him reschedule a meeting and determine if he could still attend his daughter's dance recital. Apple AI was able to determine who his daughter actually was, where her event was located, and the estimated travel time from his meeting.

Federighi says Apple isn't sending all of your data to the cloud; instead, it's uploading only the most important bits of information relevant to your Apple AI query. Additionally, your server request is anonymized, since it uses the same IP-masking technology as iCloud Private Relay. Federighi also noted that Apple's cloud servers have no permanent storage and aren't able to keep logs.

To make things even more secure, Federighi says Private Cloud Compute servers are running software with published images for security researchers to audit. Apple Intelligence devices can only talk with servers running those approved images — if there are any changes to the servers, the local devices will also need to be updated to see them.

That process may be a bit restrictive, but that's precisely the point. Federighi calls it "a step up" in the level of trust you can have in server computing. "It's essential that you know no one — not Apple, not anyone else — can access the information used to process your request," he said.

This article originally appeared on Engadget at https://www.engadget.com/how-does-apple-send-your-data-to-its-cloud-ai-servers-very-carefully-it-claims-233312425.html?src=rss

Apple may integrate Google’s Gemini AI into iOS in the future

Apple is integrating GPT-4o, the large language model that powers ChatGPT, into iOS 18, iPadOS 18 and macOS Sequoia thanks to a partnership with OpenAI announced at WWDC, the company’s annual developer conference, on Monday. But shortly after the keynote ended, Craig Federighi, Apple’s senior vice president of software engineering, said that the company might also bake Gemini, Google’s family of large language models, into its operating systems.

“We want to enable users ultimately to choose the models they want, maybe Google Gemini in the future,” Federighi said in a conversation with YouTuber iJustine after the keynote. “Nothing to announce right now.”

The news is notable because even though Apple mentioned plans in the keynote to add more AI models to its operating systems, it didn’t name Gemini specifically. Letting people choose the AI model they want on their devices, instead of simply foisting one on them, would give Apple devices a level of customization that competitors like Google and Samsung don't offer.

Catch up here for all the news out of Apple's WWDC 2024.

This article originally appeared on Engadget at https://www.engadget.com/apple-may-integrate-googles-gemini-ai-into-ios-in-the-future-220240081.html?src=rss

In case there weren’t enough emoji already, Apple’s Genmoji uses AI to generate even more

Currently, Unicode 15.1 supports just shy of 3,800 emoji. But for everyone out there who somehow thinks that's not nearly enough, today at WWDC 2024 Apple announced the ability to use AI to generate unique emoji based on your prompts.

Called Genmoji, which looks to be an awful portmanteau of the words "generate" and "emoji," these new creations are powered by Apple Intelligence, a new collection of AI features coming to the iPhone, iPad and Mac later this year. Similar to creating images with services like Midjourney and DALL-E, users will be able to whip up custom emoji by entering specific prompts. Once made, they can be shared with others as stickers, used as Tapback reactions or simply embedded inline in messages.

A demo of Apple Intelligence being used to make a custom Genmoji.
Apple

While the feature isn't expected to officially arrive until later this fall, there don't seem to be any major limitations on what you can dream up. In a teaser at WWDC, Apple showed examples like a smiley face with cucumbers over its eyes and a T-rex riding a skateboard while wearing a tutu. That said, knowing Apple, there are sure to be some restrictions on Genmoji made with more graphic prompts involving things like guns or blood.

Now, on some level, it could be fun to razz your friends with a Genmoji based on their latest mishap. At the same time, part of the magic of emoji has always been getting your point across with a limited set of icons. And it's truly hard to imagine how much value a bagel-with-lox Genmoji (see the lead image above) adds compared to the classic image. But since AI is so hot right now, seeing Apple Intelligence applied to emoji was probably inevitable. 🤷‍♂️

This article originally appeared on Engadget at https://www.engadget.com/in-case-there-werent-enough-emoji-already-apples-genmoji-uses-ai-to-generate-even-more-200011608.html?src=rss