As Apple races to add more advanced features to its smartphones, other companies are moving in the opposite direction. One such company, Light, just unveiled its latest and greatest minimalist phone, offering a stark contrast to the app-filled gadgets we all stare at way, way too much.
The Light Phone III eschews the e-paper screen found on the previous models, opting for a sleek black-and-white OLED. It still doesn’t offer any access to social media, the internet or even email. This is, first and foremost, a phone. It’s not completely bare, however, as the Light Phone III includes a camera and an embedded NFC chip for making payments. It also allows access to navigation tools, a simple music player, texting, voice notes, a calendar, a timer and an alarm.
It’s larger than previous generations, with a cute form factor that can only be described as “the Rabbit R1, but serious.” There’s no half-baked AI integration here, but there is an analog scroll wheel on the side for navigation and for making adjustments. It’s around the width of modern iPhones, but much shorter. The team says this was on purpose, to make it easier to text while holding the device vertically.
Light has also paid for private access to navigational information, so Google won’t get ahold of any tracking data. The camera looks to be a simple point-and-shoot that can’t compete with modern offerings from Google or Apple, but will get the job done.
Other specs include 6GB of RAM (up from 1GB in the Light Phone II), 128GB of storage and a newer Qualcomm chip. There's a fingerprint sensor on the power button and loudspeakers at the bottom. The battery is much larger than what was included with previous iterations, and it's user-replaceable.
Now, here’s the bad news. The Light Phone III is $800, which is more than twice the price of the $300 Light Phone II. That’s a whole lot of cheddar for what amounts to, well, a phone. Deleting all of the intrusive apps on your current smartphone costs $0, though that’s easier said than done.
However, Light is running a deal for early adopters. The phone’s available for $400 for a limited time to crowdfund mass production. Models should ship out in January.
This article originally appeared on Engadget at https://www.engadget.com/light-unveils-a-new-minimalist-phone-with-a-black-and-white-oled-screen-163225012.html?src=rss
Google is bringing some new and upgraded features to its hardware lineup as part of the June Pixel feature drop. The update will start rolling out today to all supported Pixel phones, tablets and smartwatches.
First of all, Google is expanding access to its Gemini Nano generative AI model, which will now be available on the Pixel 8 and Pixel 8a. Until now, it's only been present on the Pixel 8 Pro. At the outset, the model will be available as a developer option on the Pixel 8 and 8a, which can be enabled through the device settings.
The Recorder app is getting an AI-powered boost too. Google says you'll get more detailed summaries of recordings on Pixel 8, Pixel 8 Pro and Pixel 8a compared with earlier versions of the app. Speaker labels will be applied if you have a Pixel 6 or newer, and there's the option to export transcripts to text files and Google Docs.
Handily, you'll now be able to connect a Pixel 8a, Pixel 8 or Pixel 8 Pro to an external display via USB-C to view videos and photos on a larger screen. On the Pixel Fold, and on the Pixel 6 Pro and later Pro models, you'll be able to choose which lens to use while taking photos.
Meanwhile, Pixel 6 and newer models will support reverse phone number lookup directly from the call log. Those devices and Pixel Tablet will also be able to "automatically identify the best moment from your photo in HDR+ with just a single shutter press," Google says, which will seemingly make it easier to take a snap of your smiling face when it's in focus.
Pixel Watch is getting some updates too, including car crash detection on the second-gen model. If your wearable detects that you may have been in a severe car accident, it will ask if you're okay. If you don't respond or you need help, it can contact emergency services for you. The feature will also notify your contacts and provide them with your real-time location. Other updates include fall detection improvements and PayPal access via Google Wallet on both generations of the Pixel Watch.
Google says there will be easier access to the Google Home App on devices running Wear OS 3 and above too. You'll be able to access and control a smart home device from your watch face, for instance, and access favorite devices with a swipe. The Google Home Favorites widget will be available on phones and tablets running Android 12 and later too, though in public preview for now.
Last but not least, when a Pixel Tablet is docked in hub mode, it can receive richer notifications from a Nest doorbell. You'll be able to see who's at the door, chat with them via two-way talk or send a quick response.
This article originally appeared on Engadget at https://www.engadget.com/google-brings-gemini-nano-to-more-pixel-devices-and-enhances-recorder-summaries-160917592.html?src=rss
WWDC is always where we learn about the year's biggest updates to Apple's operating systems. Given that the iPhone is Apple's most important product, it's no surprise that iOS takes up a major chunk of the attention each June. WWDC 2024 is no exception, as Apple had a ton of new features and updates to go over, many of which concerned AI (or Apple Intelligence, as the company is calling it).
As part of this new era, Siri is getting a major overhaul. The voice assistant will be able to get much more done as it will be more deeply integrated into your apps and have more contextual awareness. You'll be able to use Siri for things like photo editing, rewriting emails and prioritizing notifications. There's the option to type your Siri commands as well, which is a nice accessibility upgrade.
The language models will be able to rewrite, proofread and summarize text for you in apps such as Mail, Notes, Safari, Pages and Keynote, as well as third-party apps. Image generation will be available too in sketch, illustration and animation styles — so you won't be able to generate realistic images using Apple's tech. Image generation is built into apps such as Notes, Freeform and Pages.
You'll be able to use natural language prompts to search for photos of certain people. There's also the promise of more intelligent search in the browser and (at long last!) transcriptions of calls and Voice Memos to catch up to a feature Pixel devices have had for a while.
Although Apple Intelligence will pull from your personal information to make sure the systems are applicable to you, it will be aware of your personal data without collecting it, according to Apple software engineering SVP Craig Federighi.
Apple is employing a blend of both on-device and cloud-based AI processing. Your iPhone will handle as much of the legwork locally as it can, with more complex operations being sent to Apple's processing centers. That raises some questions about privacy, one of Apple's central selling points to would-be customers (especially after Apple openly took digs at rivals that use cloud servers for data processing), but Federighi offered some answers to those concerns.
For one thing, the company has established something called Private Cloud Compute. Apple says the aim is to wield the power of the cloud for more advanced AI processing while making sure your iPhone data remains safe and secure.
To use these new features on iOS, you'll need a device that has at least an Apple A17 Pro chipset — in other words, an iPhone 15 Pro or one of this year's upcoming models. Apple Intelligence features will be available for free on iOS 18, iPadOS 18 and macOS Sequoia this fall in US English.
Customization
Apple also focused on customization. You'll be able to make the home screen match your own vibe more than ever. You'll be able to change the colors of app icons, which can automatically get a different look when you have Dark Mode enabled. Your apps won't need to be locked within a rigid grid anymore either. Your home screen can look almost as messy as you want.
Control Center is getting some big changes. You'll be able to access things like media playback and smart home controls from here. Developers will be able to take advantage of this and offer Control Center management for their apps too. It'll be possible to pin custom controls to the home screen for your most frequently used apps and functions (so you'll be able to switch out the flashlight control for something else, for instance). Custom controls can also be mapped to the physical action button as you see fit.
Messages
When it comes to Messages, there's another nice update in the form of scheduling. When you're catching up on things late at night, you'll be able to time a message to send in the morning, for instance. Those who use emoji reactions in Messages (aka Tapbacks) are getting a nice update too. You'll be able to choose from any emoji instead of the five basic reactions Apple has offered for years.
Text effects (the little animations that show up when you type a certain phrase) are getting an upgrade as well. Meanwhile, Apple will offer satellite messaging support on iPhone 14 and later devices. That's a major update, especially for those who go off the grid often, as messaging will be more useful beyond emergencies. You'll be able to send and receive texts, emoji and Tapbacks via iMessage and SMS.
There's also a key AI-related change coming to the Messages app. Your iPhone will be able to generate custom emoji based on what you're writing. You might need a PhD in semiotics to decipher some of the "Genmoji" you receive.
There's one other big update for Messages in iOS 18: Apple will add support for RCS (Rich Communication Services) to Messages. RCS is a more advanced messaging protocol than SMS. It enables better media sharing, Wi-Fi messaging, group chats and, crucially, better security thanks to end-to-end encryption. It should allow for more secure, media-rich messaging between iPhone and Android devices.
Apple for years refused to support RCS in order to keep iMessage a walled garden. But after persistent pressure from Google — and more importantly, new EU laws coming into force — Apple promised to start supporting RCS sometime this year. Apple, which is never petty about anything ever, almost completely glossed over the addition of RCS in its keynote, relegating it to a three-word mention.
Apps
The Photos app is getting its biggest redesign ever, Apple says. It's getting a visual overhaul, and one of the key aims is to help you find your photos more easily (filtering out screenshots should be a breeze, for one thing). Your snaps will be organized around memorable moments. Apple Intelligence will power features like Clean Up, which is effectively Apple's version of Google's Magic Eraser tool.
The Mail app will soon be able to categorize emails — just like Gmail has for years. Apple will also organize emails by sender and make it easy to archive or delete every email you get from a certain company. This will be optional, so you can stick to a single inbox if you wish.
Maps, meanwhile, will offer more detailed topographic maps to bring the app more into line with the Apple Watch. This should be useful for planning routes while hiking. As for the Journal app, it will now show stats for things like a daily writing streak.
Wallet is getting a handy new feature that will allow you to send cash without having to exchange personal details. All you need to do is tap your phones together. This could be useful for splitting the bill after dinner with a new acquaintance. Tickets saved to Wallet can now include stadium details, recommended Apple Music playlists and other information.
Calendar can show events and tasks from the Reminders app, while the Notes app can automatically solve any math equations you enter. The Home app, meanwhile, will offer guest access.
Another welcome change is the introduction of a dedicated Passwords app. This will work across iOS, iPadOS, visionOS and macOS and make it easier to find saved passwords from iCloud Keychain. Even better, there will also be Windows support via the iCloud for Windows app. Hopefully, this will make it easier for everyone to use a password manager and have unique passwords for every single account — something we strongly recommend.
This being Apple, of course it has some new privacy controls for apps in iOS 18. You'll have the option to lock apps behind an authentication method (i.e., your PIN or Face ID) so that when you pass your iPhone to someone to show them your camera roll, they can't go snooping in your Messages. You can also hide apps — perhaps ones you use for dating — in a locked folder too. Elsewhere on the app privacy front, you'll be able to decide which of your contacts an app has access to instead of giving them absolutely everyone's phone numbers and personal information.
Elsewhere, Apple is bringing Game Mode to iPhone. This aims to boost performance by minimizing background activity, while controllers and AirPods should be more responsive.
During an emergency call, dispatchers will be able to send a request to turn it into a video call or to share media from the camera roll. This, Apple suggests, can help first responders better prepare for an incident. The Health app, meanwhile, has been redesigned to make it easier to access vital info in an emergency.
On the accessibility front, users will be able to navigate their iPhone using eye tracking. You'll be able to set up a custom sound that will trigger tasks using the Vocal Shortcut feature, while Music Haptics aims to give those who are deaf or hard of hearing another way to experience music via the Taptic Engine.
A developer beta of iOS 18 is available today and a public version will roll out in July. As always, iOS 18 will roll out to all eligible iPhones this fall.
If your device can run iOS 17, you'll be able to install iOS 18. The list of eligible devices includes the iPhone 11 and later lineups, along with the iPhone XS, XS Max, XR and the second-gen SE.
This article originally appeared on Engadget at https://www.engadget.com/ios-18-gets-a-revamped-control-center-and-loads-of-home-screen-customization-options-172350046.html?src=rss
Apple is going all in on AI in the most Apple way possible. At WWDC, Apple's annual conference for developers, the company revealed Apple Intelligence, an Apple-branded version of AI that is more focused on infusing its software with the technology and upgrading existing apps to make them more useful. Apple Intelligence will be powered both by Apple’s homegrown tech as well as a partnership with OpenAI, the maker of ChatGPT, Apple announced.
One of Apple’s biggest AI upgrades is coming to Siri. The company’s built-in voice assistant will now be powered by large language models, the tech that underlies all modern-day generative AI. Siri, which has languished over the years, may become more useful now that it can interact more closely with Apple’s operating systems and apps. You can, for instance, ask Siri to give you a summary of a news article, delete an email or edit a photo. The assistant will also be able to take more than 100 actions, such as finding photos based on a general description of their contents, or extracting personal information from a picture of your ID to automatically fill in forms online. Finally, you can type your question into Siri instead of using your voice.
Apple Intelligence will highlight relevant content in Safari as you browse. You’ll also be able to use it to quickly catch up on priority notifications. And just like Gmail and Outlook, your devices will be able to create fleshed-out responses to emails and text messages on your behalf. Apple also announced a suite of new features called Writing Tools that uses AI to write, rewrite, proofread and summarize text across the system, useful for drafting emails and blog posts, for instance.
Apple Intelligence will use AI to record, transcribe and summarize your phone calls, rivaling third-party transcription services like Otter. All participants are automatically notified when you start recording, and a transcript of the conversation's main points is automatically generated at the end. You can also use AI to generate images, stickers and custom emoji (which Apple calls Genmoji) in any app.
Thanks to its partnership with OpenAI, Apple is also baking the base version of GPT-4o — OpenAI's newest large language model — into Siri as well as Writing Tools. Siri can act as an intermediary for user queries to GPT-4o, and Writing Tools can use the LLM to help compose text. Apple claims that unless you connect your paid ChatGPT account to your Apple device, the company won't store your requests or other identifying information like your IP address.
Apple Intelligence, which the company says will be in beta at launch, will be restricted to the iPhone 15 Pro and Pro Max and iPads and Macs with M1 (or higher) chipsets. Your device will also need to be set to US English.
Apple's AI features are a long time coming. Generative AI has shaken up Silicon Valley ever since OpenAI launched ChatGPT around the end of 2022. Since then, Apple’s rivals like Google, Samsung and Microsoft, as well as companies like Meta have raced to integrate AI features in all their primary products. Last month, Google announced that AI would be a cornerstone of the next version of Android and made major AI-powered changes to its search engine. Samsung, Apple’s primary smartphone competitor, added AI features to its phones earlier this year that can translate calls in real time and edit photos. Microsoft, too, unveiled AI-powered Copilot PCs, aimed at infusing Windows with AI features that include live captioning, image editing, and beefing up systemwide search.
Many of the new features coming to watchOS 11 are fitness-focused, with a new Training Load feature for the Activities app, pregnancy stats in Cycle Tracking and a brand new Vitals app. Apple's Worldwide Developers Conference (WWDC) on Monday showcased all the things we can expect when the operating system update for Apple Watches hits this fall.
The new app, Vitals, synthesizes data gathered overnight to give you a better understanding of your overall health. Apple Watch sensors will monitor details like heart rate, wrist temperature and respiration and combine that with data from the Apple Heart and Movement study so it can track changes over time and give you a heads up when things look outside their normal range. From what we've seen, Apple Watch batteries (outside of the Ultra model) don't quite make it through a night after a full day of use, so it'll be interesting to see how useful the new app will prove.
The Activity app is getting a few new abilities, including Training Load, which measures the intensity and duration of your workouts to see how they're impacting you over time. Using data from GPS, sensor metrics like heart rate and pace, as well as personal details like age and weight, the app will determine a rating for each workout from one (easy) to 10 (all out). And if you don't agree with the assessment, you can manually adjust it. In a post-workout summary, you can see whether you're training above or below your average.
The Workout app on the Apple Watch will now include a Check In button to let friends and family know when you're heading out for a run and when you've made it back home safely.
Using your iPhone, you can set more customizations in the Activity app too. Now you can adjust your goals for each day of the week and set rest days while still hanging onto your streaks. The summary screen in the Fitness app on iPhone is also customizable.
The Cycle Tracking app can now give you insights during pregnancy, showing applicable health data as it tracks the duration of your pregnancy. This includes a look at your heart rate, which typically rises during pregnancy, as well as a running timeline of the gestational age. Walking stability alerts during the third trimester can also help you avoid falls, a risk that sometimes arises at that stage. Mental health will also get some attention, with reminders to take monthly assessments to keep you aware of issues that pregnancy and postpartum conditions can trigger, like depression.
Smart Stack — the rolling list of active app widgets you access by swiping up from the bottom of the screen — is getting some updates as well. Now, instead of just active apps, the Smart Stack will include time-sensitive widgets, like precipitation alerts before it rains or the Translate widget when you're traveling. That alerted us to the fact that the Translate app is coming to the Apple Watch, with translation support for 20 languages. Live Activities and Check In will come to the Smart Stack, too.
Apple is opening up access to the Smart Stack to third parties. So, for example, you'll see that your Uber is arriving in the widget carousel. Developers will have access to the Double Tap feature as well, for hands-free interaction with more apps.
If you like the Photos watch face, there's good news here too. Machine learning models will help you find the best photos to feature by identifying and scoring the images of your friends and family based on facial expressions and image composition. It can even automatically crop and frame them for you.
Almost as an afterthought, Apple also mentioned the advent of turn-by-turn directions for hiking and walking routes you created yourself.
If you're itching to try out the new features for yourself, you can do so next month if you're part of Apple's beta software program. Developers gained access as of the announcement. And for regular folk, watchOS 11 will be available as a free software update this fall for Apple Watch Series 6 and newer.
This article originally appeared on Engadget at https://www.engadget.com/watchos-11-includes-a-new-vitals-app-to-see-all-your-key-health-metrics-175600647.html?src=rss
To start, iPadOS is getting deeper customization options for your home screen including the ability to put app icons pretty much wherever you want. Apple's Control Center has also been expanded with support for creating multiple lists and views, resizing and rearranging icons and more. There's also a new floating tab bar that makes it easy to navigate between apps, which can be further tuned to remember your favorites. Next, SharePlay is getting the ability to draw diagrams on someone else's iPad or control someone else's device remotely (with permission) for times like when you need to help troubleshoot.
After years of requests, the iPad is also getting its own version of the Calculator app, which includes a new Math Notes feature that supports the Apple Pencil and the ability to input handwritten formulas. Math Notes will even update formulas in real time or you can save them in case you want to revisit things later. Alternatively, the Smart Script tool in the Notes app uses machine learning to make your notes less messy and easier to edit.
General privacy is also being upgraded with a new feature that lets you lock an app. This allows a friend or family member to borrow your device without giving them full access to everything on your tablet. Alternatively, there’s also a new hidden apps folder so you can stash sensitive software in a more secretive way.
In Messages, Tapbacks are now compatible with all your emoji. Furthermore, you'll be able to schedule messages or send texts via satellite in case you aren't currently connected to Wi-Fi or a cellular network. Apple even says messages sent using satellite will feature end-to-end encryption.
The Mail and Photos apps are also getting similarly big revamps. Mail will feature new categorizations meant to make it easier to find specific types of offers or info (like plane flights). Meanwhile, the Photos app will sport an updated UI that will help you view specific types of images while hiding things like screenshots. And to better surface older photos and memories, there will be new categories like Recent Days and People and Pets to put similar types of pics all in a single collection.
Audio controls on iPads are also getting a boost with a new ability for Siri to understand gestures for “Yes” and “No” when you nod or shake your head while wearing AirPods. This should make it easier to give Apple's digital assistant simple responses in places like a crowded bus or a quiet waiting room, where you might be uncomfortable talking aloud.
However, the biggest addition this year is that alongside all the iPad-specific features, Apple’s tablet OS is also getting Apple Intelligence. This covers many of the company’s new AI-powered features like the ability to create summaries of websites, proofread or rewrite emails or even generate new art based on your prompts.
Apple says that to make its AI more useful, features will be more personalized and contextual. That said, to help protect your privacy and security, the company claims it won’t build profiles or sell data to outside parties. Generally, Apple says it will use on-device processing for most of its tools, though some features require help from the cloud.
As its iconic digital assistant, Siri is getting a big refresh via Apple Intelligence too. This includes better natural language recognition and the ability to understand and remember context from one query to another. Siri will also be able to help you use your device, allowing you to ask your tablet how to perform certain tasks, search for files or control apps and features using your voice.
Some examples of what Apple Intelligence can do include highlighting priority emails and putting them at the top of your inbox so you don't miss important messages or events. Or, if you're feeling more creative, you can use AI to create unique emoji (called Genmoji). In Photos, Apple Intelligence can help you edit images with tools like Clean Up. And for those who want the freedom to use other AI models, Apple is adding the option to integrate other services, the first of which will be ChatGPT.
Finally, other minor updates include a new Passwords app for stashing credentials across apps and websites, a new dedicated Game Mode with personalized spatial audio, expanded hiking results in Apple Maps and a new eye-tracking feature for improved accessibility.
This article originally appeared on Engadget at https://www.engadget.com/ipados-18s-smart-script-uses-machine-learning-to-make-your-handwriting-less-horrible-175306533.html?src=rss
Yesterday's Apple Worldwide Developers Conference keynote teased a lot of what users can expect this fall when big iOS, iPadOS, macOS and watchOS updates hit their devices. Changes coming include RCS support in Messages, a new Passwords app, a revamped Calculator app for iPhone and iPad and a bunch of artificial intelligence (AI) infusions across the board with the new "Apple Intelligence" system. The latter will bring some of the biggest updates to Apple devices in years, including generative AI image creation, "Genmoji" custom emoji, text summarization and even some ChatGPT integration. If you weren't able to catch the news live, here's a rundown of everything announced at WWDC 2024.
Apple Intelligence
Apple revealed its plans to incorporate AI into its operating systems at WWDC this year. Dubbed "Apple Intelligence," this new generative AI system will appear in iOS 18, iPadOS 18 and macOS Sequoia in the form of (what Apple believes to be) practical tools that most people can use regularly. Those features include new writing tools that can help you rewrite, proofread and summarize things like emails and other messages, as well as original emoji and image creation. Going hand-in-hand with original image generation is a new feature called Genmoji, which allows users to create their own unique emoji by typing in descriptions and requirements like "T-rex wearing a tutu on a surfboard."
Siri is getting an AI infusion, now that it will be powered in part by large language models. In addition to asking Siri to delete an email or edit a photo, users will also be able to ask the virtual assistant to summarize articles and webpages in Safari and even extract personal information from a picture of an ID so it can fill out an online form for them. The company emphasized the importance of "personal context" with Apple Intelligence, which will enable things like using natural language to search for photos that contain only specific family members or friends.
Apple highlighted how most Apple Intelligence actions will be done on-device to make the system as privacy-focused as possible. For queries that cannot be done locally, the work will be sent to Apple's processing centers. The company also created Private Cloud Compute, a feature that's supposed to utilize the cloud for more advanced AI processing while also making sure your data remains secure.
OpenAI's ChatGPT is also integrated into Apple Intelligence, allowing users to give Apple permission to share their queries with ChatGPT "when it might be helpful." Examples provided include asking for menu ideas that incorporate specific ingredients, or asking for decor advice while providing a photo of a space that needs sprucing up. ChatGPT will also work with the AI writing tools coming to iOS and iPadOS 18 in a new Compose feature. ChatGPT integration with iOS 18, iPadOS 18 and macOS Sequoia will roll out later this year, and apparently Apple intends to add support for other AI models in the future — meaning its partnership with OpenAI isn't a long-term exclusive.
iOS 18 and iPadOS 18
The next iPhone software update will roll out to users in the fall and, as expected, one of the biggest changes is support for Rich Communication Services, or RCS. The messaging protocol offers many improvements over SMS, including end-to-end encryption, better media sharing and support for proper group chats. Apple previously stated it would adopt support for RCS in 2024 to comply with EU regulations, so it's unsurprising to see it mentioned in iOS 18's forthcoming features. Also new to Messages will be the ability to "tapback" reply using emoji and stickers, text formatting and effects, and the ability to send messages via satellite.
iPhone users will have more control over their home screens in iOS 18 thanks to the fact that it will no longer be a locked grid system. Users will be able to move app icons more freely, plus they'll be able to change app icon colors as well as use a tint color picker. In terms of design and layout, this is one of the biggest changes to come to the iPhone's home screen in years, and it gives iOS users similar features to those Android users have had for a long time. In the same vein, Control Center will be updated in iOS 18 to include more customization options, and will allow users to program quick controls from third-party apps in addition to the native options.
The Photos app is getting a big redesign in iOS 18, putting an emphasis on intelligently organized groups of photos that revolve around memories, trips and other big events. The new design ditches the old tabbed layout and ushers in a one-page design where you can view all of your photos individually or browse them by Collections. Users will also be able to filter out things like screenshots and receipts, which would show up in a chronological view but would otherwise mess up a tightly curated group of vacation photos.
A couple of new privacy features stand out in iOS 18, namely the ability to lock and hide apps. For the former, users can lock an app so sensitive information stays behind a Face ID or Touch ID wall, preventing those you casually hand your iPhone to from seeing that information. Hiding an app, on the other hand, does exactly what you think: it stashes a program in a special hidden folder that others won't be able to see.
The Calculator app is getting a big overhaul in iOS 18, including improved unit conversions, a sidebar showing recent activity and integration with the Notes app. But what might be even more notable is the fact that the revamped Calculator app will not only be available on iPhones and Macs — it's coming to iPads for the first time as part of the iPadOS 18 update. Embedded within the iPadOS Calculator app is a new feature called Math Notes, which lets users write out math equations with the Apple Pencil and the app will solve many of them instantly.
iPadOS 18 will also feature a new Tab Bar, which looks similar to the Dynamic Island on iPhones. This bar makes it easier to access essential controls even when you're in apps, and depending on what you're doing, it can show up at the top of the screen or as a sidebar of sorts on the left of the display. The Notes app in iPadOS is getting another new feature called Smart Script, which will make users' handwriting more legible automatically.
macOS Sequoia
The next iteration of Apple's computer software will be called macOS Sequoia. In addition to many of the AI features also coming to iOS and iPadOS 18 as part of Apple Intelligence, the next macOS update will include iPhone mirroring, which lets users see and control their iPhone screen on a Mac screen. They'll be able to use their keyboard and trackpad to interact with the iPhone screen on their laptop, and they can even open iOS apps directly on their computers without picking up their iPhone at all.
A new Passwords app builds upon the technology of iCloud Keychain to save all of users' passwords and login credentials across devices and platforms (it will be available on Windows in addition to iOS and iPadOS). Along with standard passwords, the new app can save passkeys, verification codes and more, and give users the ability to securely share passwords with others.
Other updates coming in macOS Sequoia include a snap window arrangement tool with accompanying keyboard and menu shortcuts; Presenter Preview, which lets you see what you're about to share with call partners before they see it; and gaming upgrades like improved Windows porting capabilities using Game Porting Toolkit 2. Users will also get access to Image Playground in macOS Sequoia, Apple's AI image generator built into Apple Intelligence. It provides the ability to create AI-generated images in different styles, including animation, illustration and sketch.
watchOS 11
The next software update for the Apple Watch includes two big changes: Training Load and a new Vitals app. Training Load in watchOS 11 essentially uses many of the health and fitness metrics collected during workout tracking to estimate your effort level each time. Each workout will receive a rating from one (easy) to 10 (all out) that estimates how hard the user worked during that particular session.
The new Vitals app will show Apple Watch users how their captured health data, including heart rate, compares to baseline measurements. This will hopefully allow users to better understand when something might be off and outside the "normal" range.
The Activity app on iPhone is also getting an update to accompany watchOS 11, and will allow users to customize the data they see on the homepage so they can put their most important stats front and center. Cycle Tracking will also get an update to include more detailed pregnancy insights, including gestational age and information about the user's health metrics that may relate to pregnancy (like heart rate fluctuations).
visionOS 2
Until now, Apple's Vision Pro headset has only been available in the US. That's changing soon, as the company announced the device's rollout in additional countries in the coming months, including Australia, Canada, China, France, Germany, Japan, Singapore and the UK. As far as the headset's software (visionOS) goes, Apple announced that visionOS 2 will add spatial photos, a feature that adds depth to images in the Photos app, along with new UI gesture controls and improved Mac screen mirroring with support for higher resolutions and display sizes.
AirPods Pro audio updates
Apple briefly mentioned some software updates coming to AirPods Pro, including improved Voice Isolation, which should help the buds better pick up a user’s voice in noisy environments. A new Siri Interaction is coming to AirPods Pro as well: a silent nod of the head will allow users to answer an incoming call without saying a word out loud to Siri, while a shake of the head will decline it. These silent interactions will also be applicable to messages and notifications.
This article originally appeared on Engadget at https://www.engadget.com/apple-intelligence-ai-ios-18-and-the-biggest-announcements-at-wwdc-2024-184422501.html?src=rss
The rumors are true. Apple is adding a dedicated passwords manager app to most of its operating systems. These include macOS, iPadOS, visionOS and iOS. It’ll even work on Windows by accessing the Passwords app via iCloud. That’s pretty neat. There are way too many passwords out there.
The first-party service is powered by iCloud Keychain and will compete with some heavy hitters in the space, like LastPass and 1Password. The simply named Passwords app will be able to list various user logins and categorize them based on service type. For instance, banking passwords would be grouped differently than social media passwords. The app will also allow users to bypass manual password input by leveraging Face ID, Touch ID and autofill.
It’s worth noting that Apple already had a password manager, but it’s not exactly beloved and has been buried in the settings page. This new app, however, is quite a compelling option for those tied into the Apple ecosystem. The company didn’t say whether the app would be free or require a monthly subscription.
This article originally appeared on Engadget at https://www.engadget.com/apple-brings-a-full-featured-passwords-app-to-the-mac-iphone-ipad-and-windows-181607490.html?src=rss
Apple's macOS 15 update is called Sequoia. The 2024 Mac software, coming this fall, includes iPhone mirroring and notifications, a new Passwords app and Safari upgrades. Of course, it also includes Apple Intelligence. The new software was announced at Apple’s WWDC 2024 keynote at Apple Park.
Like the company’s other 2024 updates, macOS Sequoia includes Apple Intelligence baked in — but only for Apple Silicon Macs with an M1 or newer chip. The system-wide writing tools will work in Mail, Notes, Pages and third-party apps. The AI composition features can rewrite text, proofread and summarize content.
Sequoia also includes Image Playground, Apple’s image generation tool. It lets you create “playful images” in several styles, including animations, illustrations and sketches. The feature is built into Apple’s core apps and has a standalone app.
Typing to Siri also arrives on the Mac in Sequoia, letting you switch between voice and text-based chats with the assistant. You can also use Apple Intelligence’s ChatGPT integration, which asks for user permission to send your requests to OpenAI’s bot.
iPhone mirroring lets you use your Mac to view, control and interact with your phone. It lets you access iOS apps and receive notifications from your nearby handset. Your iPhone screen stays locked in StandBy mode (one of iOS 17’s updates) while you work on your computer.
macOS Sequoia also adds a new Windows-like snap window arrangement tool. Drag an app near the screen’s edge, and macOS will automatically suggest where to tile it. You can quickly place windows side by side or in corners. Sequoia will also include new keyboard and menu shortcuts to arrange tiles even faster.
Apple highlighted new video conferencing features in its WWDC keynote. Presenter Preview lets you see what you’re about to share with your call partner(s) before they see it, potentially saving folks some embarrassment. Meanwhile, Background Replacement (as its name implies) lets you swap out your real surroundings for built-in ones or your own photos in video calls.
1Password’s developers are likely squirming today with the introduction of Apple’s new Passwords app. Building on iCloud Keychain and the passwords previously buried in Safari’s settings (and system settings on iPhone and iPad), the standalone app will include all your saved credentials, verification codes and security alerts. It syncs across devices and will also appear on iOS, iPadOS, visionOS and even Windows (via iCloud for Windows).
Safari also gets some upgrades. These include Highlights, which automatically detects relevant info from webpages, and Summaries, which provides AI-fueled recaps of web content in a redesigned Reader mode.
macOS Sequoia has some gaming advancements, including improved Windows porting capabilities in Game Porting Toolkit 2. Apple said it will also be easier to port Mac games to iPad and iPhone, potentially giving developers an extra financial incentive to make or port titles for the Apple ecosystem.
This article originally appeared on Engadget at https://www.engadget.com/macos-sequoia-will-let-you-see-your-iphone-mirrored-on-your-macs-screen-180215857.html?src=rss
A year after Apple unveiled the Vision Pro, and about four months after its muted launch, the spatial computing headset still feels surprisingly undercooked. Simple features, like the ability to organize icons on the visionOS home screen, are nowhere to be found. Content that truly shows off the Vision Pro's immersive capabilities is still rare (the recent Marvel experience was just a glimpse of what's possible).
According to the latest report from Bloomberg's Mark Gurman, the company will show off visionOS 2 at its Worldwide Developers Conference (WWDC 2024), but the update will mostly focus on polishing the Vision Pro experience. We can expect native Vision Pro versions of Apple software (right now the headset uses iPad versions of many apps), as well as a Passwords app and new environments. Apple's major AI push will also reportedly be called "Apple Intelligence," a cheeky way of colonizing the term "AI."
Beyond minor polishing and bug fixes, here's what I'd like to see on the Vision Pro at WWDC 2024 (or really, anytime in the next year, Apple!).
iPhone and iPad screen mirroring
Perhaps the most baffling aspect of the Vision Pro is how it refuses to play well with the iPhone. If you ever need to unlock your phone to use an authentication app, or quickly peep a Slack message, you'll either have to remove the Vision Pro to use Face ID, or type in your PIN and squint through the headset's middling cameras. Why?!
If Apple can already deliver sharp and lag-free macOS mirroring, it's not a huge leap to give us something similar for iPhones and iPads. Sure, ideally you'd be able to manage your text messages and other tasks in the Vision Pro without relying on other devices. Realistically, though, the Messages app doesn't always receive texts as quickly as your iPhone, and its history of texts and contacts often differs too.
Offering a quick pop-up of your iPhone's screen would erase those issues, and it would keep you within the flow of whatever you're working on in the Vision Pro. As for the lack of Face ID, Apple could tie your iPhone's authentication to your Apple ID. You already have to unlock your Vision Pro with a PIN or Optic ID scan, as well as sign into your Apple ID, so Apple already knows who you are.
Mirroring my MacBook Air's screen inside the Vision Pro.
Photo by Devindra Hardawar/Engadget
When it comes to iPads, screen mirroring could be just as useful as it is on Macs. If you were typing away on a document on an iPad Pro with a Magic Keyboard, why shouldn't you be able to continue doing that on the Vision Pro? Supporting less powerful iPads could also be useful, since they could mirror downloaded media or games. Why burden the headset's M2 processor when you could tap into an M2 chip on an iPad Air?
Taking this concept a step further, it would also be nice to have Apple Watch mirroring eventually. Imagine lifting your wrist and having a glanceable view of notifications or media controls while using the Vision Pro. What if you could immediately see a 300-inch version of your Apple TV's home screen as soon as you sit down on your couch? Apple has the potential to shape reality itself while using its headset, so why not lean into that for its own devices?
More native Vision Pro apps
Recent rumors suggest we'll see native versions of Apple's apps on the Vision Pro (many are just repackaged iPad apps right now), but I'm hoping to see more developers jump on the platform. There still aren't any Vision Pro apps for Netflix, YouTube or Spotify. If you want to use those services, you'll have to log into a web browser, or rely on a third-party app like Supercut. This isn't the seamless spatial computing future I was promised, Apple.
Now I'm sure it'll be tough for Apple to get YouTube to play nice with the Vision Pro, especially as Google just recently struck a mysterious partnership with the AR headset company Magic Leap. But not being able to get Netflix and Spotify on the headset remains a huge problem for Apple. Without the apps we live with every day, Vision Pro will always seem undercooked.
Cast audio to speakers and home theater systems
The Vision Pro's built-in speakers are fine, but they lack the depth of a proper pair of bookshelf speakers or Apple's own HomePod. And they certainly don't have the low-end kick you'd get from a complete home theater system and subwoofer. So why can't we just send audio easily to those devices?
Let us AirPlay to HomePods on a whim! Let me sit in my home theater and enjoy the massive speakers surrounding me, while watching Fury Road at near-IMAX scale on the Vision Pro! While I enjoy using AirPods Pro for immersive audio on the go, they can't hold a candle to the Dolby Atmos-equipped towers in my basement.
I'm sure home theater users aren't a high-priority consideration for Apple, but at the moment, who else is known for spending way too much money on hardware that isn't meant for everyone?
This article originally appeared on Engadget at https://www.engadget.com/apple-vision-pro-features-wed-love-to-see-at-wwdc-2024-151822925.html?src=rss