So close, yet so far. Ahead of WWDC 2024, I had hoped Apple would let you mirror your iPhone inside of the Vision Pro, just like how you can use your Mac on an enormous virtual display. Instead, we got iPhone Mirroring on macOS Sequoia. As the name implies, it will let you see everything on your iPhone from the comfort of your Mac.
But, I wondered, what if you mirrored a Mac that was mirroring an iPhone in the Vision Pro? It seems like the ideal workaround in theory, one that would solve the headset's annoying inability to play nicely with your iPhone. But, unfortunately, it won't work. We've heard from knowledgeable sources that Apple's hardware only supports one of its Continuity mirroring features at a time. So if you're sending your Mac's screen to the Vision Pro, you won't be able to mirror your iPhone at the same time.
We haven't heard the exact reason for that limitation, but I'd wager it comes down to networking limitations. Mirroring a sharp and lag-free version of your Mac's screen is difficult enough — juggling that alongside a perfectly rendered copy of your iPhone might be too tough for some Macs. Apple is already pushing beyond its current Continuity restrictions with visionOS 2, which will support higher resolution Mac mirroring, as well as the ability to virtualize an ultra-wide display. So perhaps there's room for multi-device mirroring down the line.
It's not hard to imagine Apple bringing the iPhone mirroring feature directly to the Vision Pro eventually, but ideally, it would also work alongside Mac mirroring in the headset.
Here are a few other tidbits we've learned about iPhone mirroring on macOS Sequoia while exploring WWDC:
It requires both Wi-Fi and Bluetooth to work, and the iPhone is projected at 60 fps.
When you launch a game, the iPhone window flips into landscape view on your Mac. The game's sound also appears to be synchronized well.
Mirroring will drain your iPhone's battery at roughly the same rate as typical usage.
If you unlock your iPhone directly, the mirrored window closes immediately on your Mac.
You'll eventually be able to drag and drop files and other content between your iPhone and Mac. This feature will also be available on third-party apps.
Update 6/12/24, 1:16PM ET: Early testers have discovered that visionOS 2 supports direct AirPlay mirroring from iPhones and iPads. This isn't the same as the Mac's iPhone mirroring feature, since you can't directly interact with the window within Vision Pro, but it's one way to keep tabs on your other devices. We've reached out to Apple for comment on this feature, which wasn't discussed during WWDC.
"How do you shorten Apple Intelligence?" That’s the question I’ve asked several Apple employees at WWDC 2024, and their practiced responses have become comically absurd.
“We just say Apple Intelligence,” they tell me. “Yeah, but do you say that every time? The AI acronym is right there!” I’d retort. The usual response is a stiff smile and clenched teeth, like a human programming error in real time. (Yes, I'm aware it's just overly aggressive media training in action.) One person suggested they also say "personal intelligence" — yes, a phrase that's longer than Apple Intelligence.
There's no doubt Apple Intelligence means many things to the company. It's an effort to compete with Microsoft's (still unproven) Copilot and Google's Gemini. It's a way to make Apple seem "hip" with ChatGPT. And it should enable a slew of new features for consumers. But Apple Intelligence is never "AI" to Apple.
Normally, I'd chalk this up to a silly branding quirk. But it becomes a problem as we cover Apple Intelligence. It's a long phrase that's just begging to be shortened to "AI," but then how do you distinguish that abbreviation from ChatGPT, Copilot and the general concept of AI? During the WWDC 2024 keynote, Apple only mentioned the phrase "artificial intelligence" three times: twice while referring to its previous AI-powered features, and once while referring to "other artificial intelligence tools" like ChatGPT.
At this point, I've just decided to call Apple Intelligence "Apple AI." It's shorter and it differentiates the product from competitors. And yes, it just means "Apple Apple Intelligence," but everyone still says "ATM machine" and "PIN number." It's not my fault Apple decided to co-opt the acronym "AI."
Apple Intelligence is coming, but not to every iPhone out there. In fact, you'll need a device with an A17 Pro processor or M-series chip to use many of the features unveiled during the Apple Intelligence portion of WWDC 2024. That means only iPhone 15 Pro owners (and those with an M-series iPad or Mac) will get the iOS 18-related Apple Intelligence (AI?) updates like Genmoji, Image Playground, the redesigned Siri and Writing Tools. Meanwhile, features like Math Notes and Smart Script on iPadOS 18, along with the new Messages features in iOS 18, will arrive for anyone who can upgrade to the latest platforms. It's confusing, and the best way to anticipate what you're getting is to know what processor is in your iPhone, iPad or Mac.
Why won't the iPhone 14 Pro get Apple Intelligence?
It's not evident exactly why older devices using an A16 chip (like the iPhone 14 Pro) won't work with Apple Intelligence, given that its neural engine seems more than capable compared to the M1's. A closer look at the spec sheets of those two processors shows that the main differences appear to be in memory and GPU prowess. Specifically, the A16 Bionic can only support a maximum of 6GB of RAM, while the M1 starts at 8GB and goes up to 16GB. In fact, all the supported devices have at least 8GB of RAM, which may explain why your iPhone 14 Pro won't be able to handle making Genmoji.
Though it might not seem quite fair that owners of a relatively recent iPhone won't get to use Apple Intelligence features, you'll still be getting a healthy amount of updates via iOS 18. Here's a quick breakdown of what is coming via iOS 18, and what's only coming if your iPhone supports Apple Intelligence.
What iOS 18 features will be coming to iPhones?
Basically everything described during the iOS portion of yesterday's WWDC 2024 keynote is coming to all iPhones (that can update to iOS 18). That includes the customizable home screen, Control Center, dedicated Passwords app, redesigned Photos app, new Tapback emoji reactions, text effects, scheduled sending and more. Messages via Satellite is only coming to iPhone 14 or newer, and you'll be able to send text messages, emojis and Tapbacks, but not images or videos.
You'll also be tied to the satellite service plan that came bundled with your iPhone 14 at purchase. If you bought your iPhone 14 in January 2024, you received a free two-year subscription covering Emergency SOS via Satellite and other satellite communication features that now include texting. That means that to continue texting people via satellite after January 2026, you'll need to start paying for a plan.
There are a whole host of updates coming with iOS 18 that Apple didn't quite cover in its keynote either, and I'll be putting up a separate guide about those in a bit. But suffice it to say that apps like Maps, Safari, Calendar and Journal are getting new functions that, together with the other changes mentioned so far, add up to a meaty OS upgrade.
What Apple Intelligence features are older devices missing out on?
In short, all of them. If you have an iPhone 15 Pro or an iPad (or Mac) with an M-series chip, you'll get a redesigned Siri, Genmoji and Image Playground, as well as writing tools baked into the system. That means tools like proofreading, summarizing or helping you adjust your tone in apps like Mail, Notes and Keynote are limited to the AI-supported devices. If you don't have one of those, you'll get none of this.
The redesigned Siri, which is only coming through Apple Intelligence, will be able to understand what's on your screen to contextually answer your queries. If you've been texting with your friend about which baseball player is the best, you can ask Siri (by long-pressing the power button or just saying Hey Siri) "How many home runs has he hit?" The assistant will know who "he" is in this context, and understand you're referring to the athlete, not the friend you're chatting with.
Apple Intelligence is also what brings the ability to type to Siri — and you can invoke this keyboard to talk to the assistant by double-tapping the bottom of the screen.
This also means that the new glowing edge animation that appears when Siri is triggered is limited to Apple Intelligence-supported devices. You'll still be looking at that little orb at the bottom of your screen when you talk to the assistant on an iPhone 14 Pro or older.
The reason behind the change from Apple ID to Apple Account is to provide "a consistent sign-in experience across Apple services and devices," the company wrote in a blog post. Apple Account "relies on a user's existing credentials," so you won't have to change anything.
The betas of the new operating systems already use the term Apple Account, but MacRumors notes that Apple ID is still used in some places, such as the account sign-in page on Apple's website. The company is most likely going to complete the Apple Account transition by the time it rolls out the latest major public versions of the operating systems (which also include tvOS and visionOS) this fall.
Apple’s annual developer shindig kicked off with its traditional keynote outlining all the new tricks its products will soon do. There are big changes for iOS 18, iPadOS 18, macOS Sequoia and watchOS 11, not to mention visionOS 2. Some highlights include a standalone Passwords app, better health metrics on the Watch and Apple Intelligence, the company's own spin on AI. There’s plenty more to dig into, so keep reading for all the biggest stories from the show.
Apple has finally bowed to pressure, bringing AI to its devices in the form of Apple Intelligence, with an assist from OpenAI's ChatGPT. The system will bolster Siri, offering generative AI smarts to write emails, summarize news articles and offer finer-grained control of your apps. It’ll be interesting to see, given Apple’s long-held distaste for machine learning gimmicks, whether this can win where Google and Microsoft have floundered.
Apple's spin on AI is finally here, and it already seems smarter than Microsoft Copilot and Google Bard. Apple Intelligence focuses on privacy and "personal intelligence," with a bit of an assist from ChatGPT. While we haven't tested it ourselves yet, Apple appears to be avoiding the pitfalls of Microsoft's Recall feature, as well as Google Bard's unfortunate early gaffes. The company isn't trying to capture everything you're doing on your computer, and it's being careful about how it's using larger AI models like ChatGPT.
Shortly after the WWDC 2024 keynote ended, Engadget's Cherlynn Low and Devindra Hardawar discussed why they think Apple is taking a more thoughtful approach to AI.
For years, Apple has touted privacy as its major advantage over rivals like Google and Microsoft. Instead of relying on cloud processing to improve or organize your images, which requires sending your photos to Google's servers, Apple handles those tasks directly on your device. But with the advent of Apple Intelligence, its take on artificial intelligence, the company is stepping out of its comfort zone with "Private Cloud Compute." It says "private" right in the name, so it has to be secure, right?
While Apple AI will run some models locally, it will occasionally have to send data to Apple's servers for complex requests. So how is the company squaring this with its previous security stance?
According to Craig Federighi, Apple's SVP of Software Engineering, the company is being very careful about how it's sending your data to its servers. "You're putting a lot of faith in the cloud... with Private Cloud Compute, the stakes are even higher," he said in a WWDC 2024 conversation with Apple's AI head, John Giannandrea, and YouTube influencer iJustine.
During the WWDC keynote, Federighi showed off how Apple AI could help him reschedule a meeting and determine if he could still attend his daughter's dance recital. Apple AI was able to determine who his daughter actually was, where her event was located, and the estimated travel time from his meeting.
Federighi says Apple isn't sending all of your data to the cloud; instead, it's only uploading the most important bits of information relevant to your Apple AI query. Additionally, your server request is anonymous, since it uses the same IP masking technology as iCloud Private Relay. Federighi also noted that Apple's cloud servers have no permanent storage and don't have the ability to keep logs.
To make things even more secure, Federighi says Private Cloud Compute servers are running software with published images for security researchers to audit. Apple Intelligence devices can only talk with servers running those approved images — if there are any changes to the servers, the local devices will also need to be updated to see them.
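To make the idea concrete, here is a minimal Swift sketch of that kind of allowlist check. It is purely illustrative and not Apple's actual API: the `approvedImageDigests` set, the `ServerAttestation` type and the digest strings are hypothetical stand-ins for the published image measurements Apple describes.

```swift
import Foundation

// Hypothetical allowlist of audited server image digests shipped with the OS
// (placeholder values purely for illustration, not real Apple data).
let approvedImageDigests: Set<String> = [
    "digest-of-published-image-a",
    "digest-of-published-image-b",
]

// Hypothetical attestation a server might present before it receives any data.
struct ServerAttestation {
    let imageDigest: String // hash of the software image the server claims to run
}

// The device only agrees to send a request if the server's claimed image is on
// the published allowlist. If Apple changes its server images, devices would
// need an updated allowlist before they recognize the new builds.
func shouldSendRequest(given attestation: ServerAttestation) -> Bool {
    approvedImageDigests.contains(attestation.imageDigest)
}
```

The point of that design is that trust hinges on a short, publicly auditable list of builds rather than on whatever software a server happens to be running at the moment.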
That process may be a bit restrictive, but that's precisely the point. Federighi calls it "a step up" in the level of trust you can have with server computing. "It's essential that you know no one — not Apple, not anyone else — can access the information used to process your request," he said.
Apple is integrating GPT-4o, the large language model that powers ChatGPT, into iOS 18, iPadOS 18 and macOS Sequoia thanks to a partnership with OpenAI announced at WWDC, the company’s annual developer conference, on Monday. But shortly after the keynote ended, Craig Federighi, Apple’s senior vice president of software engineering, said that the company might also bake Gemini, Google’s family of large language models, into its operating systems.
“We want to enable users ultimately to choose the models they want, maybe Google Gemini in the future,” Federighi said in a conversation with YouTuber iJustine after the keynote. “Nothing to announce right now.”
The news is notable because even though Apple did mention plans to add more AI models to its operating systems during the keynote, it didn’t mention Gemini specifically. Letting people choose the AI model they want on their devices instead of simply foisting one on them would give Apple devices a level of customization that competitors like Google and Samsung don't offer.
Currently, Unicode 15.1 supports just shy of 3,800 emoji. But for everyone out there who for some reason thinks that's not nearly enough, today at WWDC 2024, Apple announced the ability to use AI to generate unique emoji based on your prompts.
Called Genmoji, which looks to be an awful portmanteau of the words "generate" and "emoji," these new creations are powered by Apple Intelligence, a new collection of AI features coming to the iPhone, iPad and Mac sometime later this year. Similar to creating images with services like Midjourney and DALL-E, users will be able to whip up custom emoji by inputting specific prompts. Once made, they can be shared with others as stickers, used as Tapback reactions or simply embedded inline in messages.
While the feature isn't expected to be officially available until later this fall, there don't seem to be any major limitations on what you can dream up. In a teaser at WWDC, Apple showed examples like a smiley face with cucumbers over its eyes and a T-rex riding a skateboard while wearing a tutu. That said, knowing Apple, there are sure to be some restrictions on Genmoji made using more graphic prompts involving things like guns or blood.
Now, on some level, it could be fun to razz your friends with Genmoji based on their latest mishap. But at the same time, part of the magic of emoji has always been getting your point across with a limited set of icons. It's also truly hard to imagine how much added value a bagel-with-lox Genmoji provides compared to the classic image. But since AI is so hot right now, seeing Apple Intelligence applied to emoji was probably an inevitability. 🤷‍♂️
WWDC is always where we learn about the year's biggest updates to Apple's operating systems. Given that the iPhone is Apple's most important product, it's no surprise that iOS takes up a major chunk of the attention each June. WWDC 2024 is no exception, as Apple had a ton of new features and updates to go over, many of which concerned AI (or Apple Intelligence, as the company is calling it).
As part of this new era, Siri is getting a major overhaul. The voice assistant will be able to get much more done as it will be more deeply integrated into your apps and have more contextual awareness. You'll be able to use Siri for things like photo editing, rewriting emails and prioritizing notifications. There's the option to type your Siri commands as well, which is a nice accessibility upgrade.
The language models will be able to rewrite, proofread and summarize text for you in apps such as Mail, Notes, Safari, Pages and Keynote, as well as third-party apps. Image generation will be available too in sketch, illustration and animation styles — so you won't be able to generate realistic images using Apple's tech. Image generation is built into apps such as Notes, Freeform and Pages.
You'll be able to use natural language prompts to search for photos of certain people. There's also the promise of more intelligent search in the browser and (at long last!) transcriptions of calls and Voice Memos, catching up to a feature Pixel devices have had for a while.
Although Apple Intelligence will pull from your personal information to make sure the systems are applicable to you, it will be aware of your personal data without collecting it, according to Apple software engineering SVP Craig Federighi.
Apple is employing a blend of both on-device and cloud-based AI processing. Your iPhone will handle as much of the legwork locally as it can, with more complex operations being sent to Apple's processing centers. That raises some questions about privacy, one of Apple's central selling points to would-be customers (especially after Apple openly took digs at rivals that use cloud servers for data processing), but Federighi gave some answers to those.
For one thing, the company has established something called Private Cloud Compute. Apple says the aim is to wield the power of the cloud for more advanced AI processing while making sure your iPhone data remains safe and secure.
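As a rough mental model of that split, here's a hypothetical Swift sketch. None of it reflects Apple's real implementation: the endpoint URL, the `runOnDeviceModel` helper and the explicit complexity flag are assumptions made purely for illustration.

```swift
import Foundation

enum RequestComplexity {
    case simple   // small enough for the on-device model
    case complex  // heavy enough to need server-side help
}

// Hypothetical stand-in for a local model call.
func runOnDeviceModel(_ prompt: String) -> String {
    "on-device result for: \(prompt)"
}

// Illustrative routing: keep simple work local, and send only the relevant
// payload for heavier requests to a placeholder cloud endpoint.
func handleIntelligenceRequest(_ prompt: String,
                               complexity: RequestComplexity) async throws -> String {
    switch complexity {
    case .simple:
        return runOnDeviceModel(prompt)
    case .complex:
        var request = URLRequest(url: URL(string: "https://example.invalid/intelligence")!)
        request.httpMethod = "POST"
        request.httpBody = Data(prompt.utf8)
        let (data, _) = try await URLSession.shared.data(for: request)
        return String(decoding: data, as: UTF8.self)
    }
}
```

In practice, the system rather than the developer would decide which requests leave the device, but the sketch captures the basic idea of keeping simple work local and uploading only the relevant payload for heavier requests.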
To use these new features on iOS, you'll need a device that has at least an Apple A17 Pro chipset — in other words, an iPhone 15 Pro or one of this year's upcoming models. Apple Intelligence features will be available for free on iOS 18, iPadOS 18 and macOS Sequoia this fall in US English.
Customization
Apple also focused on customization. You'll be able to make the home screen match your own vibe more than ever. You'll be able to change the colors of app icons, which can automatically take on a different look when you have Dark Mode enabled. Your apps won't need to be locked within a rigid grid anymore either. Your home screen can look almost as messy as you want.
Control Center is getting some big changes. You'll be able to access things like media playback and smart home controls from here. Developers will be able to take advantage of this and offer Control Center management for their apps too. It'll be possible to pin custom controls to the home screen for your most frequently used apps and functions (so you'll be able to switch out the flashlight control for something else, for instance). Custom controls can also be mapped to the physical action button as you see fit.
Messages
When it comes to Messages, there's another nice update in the form of scheduling. When you're catching up on things late at night, you'll be able to time a message to send in the morning, for instance. Those who use emoji reactions in Messages (aka Tapbacks) are getting a nice update too. You'll be able to choose from any emoji instead of the five basic reactions Apple has offered for years.
Text effects (the little animations that show up when you type a certain phrase) are getting an upgrade as well. Meanwhile, Apple will offer satellite messaging support on iPhone 14 and later devices. That's a major update, especially for those who go off the grid often, as messaging will be more useful beyond emergencies. You'll be able to send and receive texts, emoji and Tapbacks via iMessage and SMS.
There's also a key AI-related change coming to the Messages app. Your iPhone will be able to generate custom emoji based on what you're writing. You might need a PhD in semiotics to decipher some of the "Genmoji" you receive.
There's one other big update for Messages in iOS 18: Apple will add support for RCS (Rich Communication Services) to Messages. RCS is a more advanced messaging protocol than SMS. It enables better media sharing, Wi-Fi messaging, group chats and, crucially, better security thanks to end-to-end encryption. It should allow for more secure, media-rich messaging between iPhone and Android devices.
Apple for years refused to support RCS in order to keep iMessage a walled garden. But after persistent pressure from Google — and more importantly, new EU laws coming into force — Apple promised to start supporting RCS sometime this year. Apple, which is never petty about anything ever, almost completely glossed over the addition of RCS in its keynote, relegating it to a three-word mention.
Apps
The Photos app is getting its biggest redesign ever, Apple says. It's getting a visual overhaul, and one of the key aims is to help you find your photos more easily (filtering out screenshots should be a breeze, for one thing). Your snaps will be organized around memorable moments. Apple Intelligence will power features like Clean Up, which is effectively Apple's version of Google's Magic Eraser tool.
The Mail app will soon be able to categorize emails — just like Gmail has for years. Apple will also organize emails by sender and make it easy to archive or delete every email you get from a certain company. This will be optional, so you can stick to a single inbox if you wish.
Maps, meanwhile, will offer more detailed topographic maps to bring the app more into line with the Apple Watch. This should be useful for planning routes while hiking. As for the Journal app, it will now show stats for things like a daily writing streak.
Wallet is getting a handy new feature that will allow you to send cash without having to exchange personal details. All you need to do is tap your phones together. This could be handy for splitting the bill after dinner with a new acquaintance. Tickets saved to Wallet can now include stadium details, recommended Apple Music playlists and other information.
Calendar can show events and tasks from the Reminders app, while the Notes app can automatically solve any math equations you enter. The Home app will offer guest access, so you can give visitors control of select smart home accessories.
Another welcome change is the introduction of a dedicated Passwords app. This will work across iOS, iPadOS, visionOS and macOS and make it easier to find saved passwords from iCloud Keychain. Even better, there will also be Windows support via the iCloud for Windows app. Hopefully, this will make it easier for everyone to use a password manager and have unique passwords for every single account — something we strongly recommend.
This being Apple, of course it has some new privacy controls for apps in iOS 18. You'll have the option to lock apps behind an authentication method (i.e., your PIN or Face ID) so that when you pass your iPhone to someone to show them your camera roll, they can't go snooping in your Messages. You can also hide apps — perhaps ones you use for dating — in a locked folder too. Elsewhere on the app privacy front, you'll be able to decide which of your contacts an app has access to instead of giving them absolutely everyone's phone numbers and personal information.
Elsewhere, Apple is bringing Game Mode to iPhone. This aims to boost performance by minimizing background activity, while controllers and AirPods should be more responsive.
During an emergency call, dispatchers will be able to send a request to turn it into a video call or to share media from the camera roll. This, Apple suggests, can help first responders better prepare for an incident. The Health app, meanwhile, has been redesigned to make it easier to access vital info in an emergency.
On the accessibility front, users will be able to navigate their iPhone using eye tracking. You'll be able to set up a custom sound that will trigger tasks using the Vocal Shortcut feature, while Music Haptics aims to give those who are deaf or hard of hearing another way to experience music via the Taptic Engine.
A developer beta of iOS 18 is available today, and a public beta will follow in July. As always, the final version of iOS 18 will roll out to all eligible iPhones this fall.
If your device can run iOS 17, you'll be able to install iOS 18. The list of eligible devices includes the iPhone 11 and later lineups, along with the iPhone XS, XS Max and XR, and the second-generation iPhone SE.