How to mirror your iPhone on macOS Sequoia

With macOS Sequoia and iOS 18, Apple has a handy new way to hop between devices while on desktop. iPhone Mirroring shows your phone’s screen on your computer; you can even use your mouse and keyboard to interact with it. Here’s how to set up and get the most out of iPhone Mirroring.

First, iPhone Mirroring has several conditions. It only works with Apple Silicon Macs (late 2020 and later) or Intel-based models with the Apple T2 Security Chip (2018 to 2020). Of course, you’ll need to install macOS Sequoia first to use the feature. Any iPhone running iOS 18 will do.

The feature only works when your iPhone is locked (it’s okay if it’s charging or using Standby). If you unlock your iPhone while using iPhone Mirroring, the feature will temporarily disconnect.

Both devices also need Wi-Fi and Bluetooth turned on, and you’ll have to sign in with the same Apple Account on each. Your account needs two-factor authentication (using a trusted device or phone number) activated. The feature won’t work if your phone’s Personal Hotspot is active or if you’re using AirPlay, Sidecar or internet sharing on your Mac.

Screenshot of the iPhone Mirroring app icon in the macOS dock. Other apps flank it to the left and right.
Screenshot by Will Shanklin for Engadget

Open the iPhone Mirroring app on your Mac. It should already be in your Dock (see the screenshot above), but you can also find it in your Applications folder.

The app starts with a welcome screen. Click “Continue,” then follow the prompt to unlock your iPhone.

Next, approve iPhone notifications on your Mac. This feature shows your handset’s alerts in your Mac’s Notification Center. (When you click an iOS alert on your Mac, it will open the corresponding app in the iPhone Mirroring app.) iPhone notifications on your Mac work even when the iPhone Mirroring app is closed or inactive, or if your phone isn’t nearby.

After approving notifications, a final screen will confirm that iPhone Mirroring is ready. Click the “Get Started” button to start. Once it loads, you’ll see your iPhone’s screen.

First, you may want to resize the iPhone Mirroring app. Apple only gives you three sizes: actual size, smaller and larger. You can switch between them with keyboard shortcuts: larger (Cmd +), actual size (Cmd 0) and smaller (Cmd -). You can also resize the window from the View section of your Mac’s menu bar. Dragging the edges of the window to resize it (as with other macOS apps) won’t work here.

In most cases, interacting with your virtual iPhone on your Mac is as simple as mimicking its usual touch gestures with your trackpad and typing in text fields using your Mac’s keyboard.

macOS screenshot of the iPhone Mirroring app (showing Spotify, playing a John Lee Hooker album) overlaying a webpage in Chrome on Mac. A red arrow points at two buttons (Home and App Switcher) at the top of the virtual iPhone window.
Screenshot by Will Shanklin for Engadget

Swipe-based gestures for Home, App Switcher and Control Center won’t work on Mac, but they have shortcuts. If you move your pointer to the top of the iPhone Mirroring window, a new area will appear, revealing buttons for the iOS Home Screen (left) and the App Switcher (right). (See the screenshot above.) This area also lets you click-hold and drag the app to reposition it.

You can also go to the Home Screen by clicking on the horizontal bar at the bottom of the app’s window or using the Cmd 1 keyboard shortcut. In addition, Cmd 2 activates the App Switcher, and Cmd 3 triggers a Spotlight search. Or, swipe down with two fingers on your Mac’s trackpad from the iPhone Home Screen (in the Mac app) for Spotlight.

There’s no way to activate the iOS Control Center from your Mac. You also can’t manually change the orientation of the virtual iPhone screen, but it will rotate automatically if you launch a game that starts by default in landscape mode:

The game Bloons 5 in landscape mode, running in the iPhone Mirroring app on macOS Sequoia.
Screenshot by Will Shanklin for Engadget

iPhone audio will play on your Mac while using the feature. Some iPhone videos will play in the iPhone Mirroring window, too. However, copyrighted content will be restricted in some cases, so some videos will only be viewable through corresponding macOS apps or desktop browser windows.

Apple’s Universal Clipboard can be useful while using iPhone Mirroring. Copy something on your virtual iPhone, and you can paste it on your Mac, and vice versa. You can also use AirDrop to transfer files between the two devices while using iPhone Mirroring.

iPhone Mirroring will time out if you don’t use the virtual phone for a while, or if you move your handset away from your computer. If it times out, just follow the app’s prompt to reconnect.

macOS screenshot showing the settings window for the iPhone Mirroring app.
Screenshot by Will Shanklin for Engadget

You can choose whether to require authentication every time you use iPhone Mirroring. In the Mac app, choose iPhone Mirroring > Settings in the menu bar (or press Cmd ,), and you’ll see a barebones settings screen.

You can choose “Ask Every Time” or “Authenticate Automatically.” The former requires your Mac login password, Touch ID or Apple Watch confirmation to use your virtual iPhone on your desktop. Meanwhile, the latter will log into your phone automatically without authenticating each time.

You can also reset iPhone access in this settings screen. This removes your entire setup, and you’ll need to start the process from scratch the next time you open the iPhone Mirroring app.

If you have more than one iPhone tied to your Apple Account, you can choose which one to use with iPhone Mirroring under Settings > Desktop & Dock on your Mac. If this applies to you, you’ll see the option under the “Use iPhone widgets” section. (If you only have one iPhone under your Apple Account, this option won’t appear.)

For more information on Apple’s latest models, you can check out Engadget’s reviews of the iPhone 16 and 16 Pro series phones, along with the latest MacBooks.

This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/how-to-mirror-your-iphone-on-macos-sequoia-130003743.html?src=rss

Meta will use AI to create lip-synced translations of creators’ Reels

Meta just announced an intriguing tool that uses AI to automatically dub Reels into other languages, complete with lip-sync. This feature was revealed at the annual Meta Connect livestream event and was introduced by CEO Mark Zuckerberg.

Zuckerberg showed this off during the keynote, and everything seemed to work flawlessly. The technology not only translates the content, according to Meta, but will also “simulate the speaker’s voice in another language and sync their lips to match.” It’s worth noting, however, that this didn’t appear to be a live demo, but it was still pretty impressive.

As for a rollout, the company says the feature will arrive first on “some creators’ videos” in English and Spanish in the US and Latin America, though it didn’t give a timetable. The company did mention that more languages are coming soon.

That wasn’t the only AI tool spotlighted during Meta Connect. The company’s AI platform will now allow voice chats, with a selection of celebrity voices to choose from. Meta AI is also getting new image capabilities, as it will be able to change and edit photos based on instructions from text chats within Instagram, Messenger and WhatsApp.

Catch up on all the news from Meta Connect 2024!

This article originally appeared on Engadget at https://www.engadget.com/ai/meta-will-use-ai-to-create-lip-synced-translations-of-creators-reels-175949373.html?src=rss

Strava makes it easier to keep your activity data private

Workout tracker app Strava has a history of being used to stalk people, identifying where they live or their typical running paths (take a look at this Reddit thread of people commiserating, for instance). While the platform has some safety features, a new tool should make it easier to confirm your privacy settings immediately following an activity. Strava is launching Quick Edit, which provides all users with immediate access to edit and privacy settings in the app after syncing an activity. 

Quick Edit lets you modify a few aspects of your activity, such as who can see its details. It also gives you the option to quickly hide certain information, such as your start time, pace or heart rate. You can even opt to hide your entire route and map. These features already exist in Strava, but Quick Edit could be helpful if you're running somewhere new and forgot to change your settings, or if you're leaving from home and want to keep your address private. Basically, it's one extra reminder to check that your privacy settings are as secure as you want. If you skip the Quick Edit screen, Strava will apply your default settings.

The new feature also has a few non-safety options to explore. Quick Edit will prompt you to customize your activity title and upload photos and videos you took while out exploring. Just remember, if you make your map private, don't counteract that by sharing anything that could identify exactly where you are. You can also access advanced edits like gear and specific workout types through the Quick Edit screen.

This article originally appeared on Engadget at https://www.engadget.com/apps/strava-makes-it-easier-to-keep-your-activity-data-private-130024746.html?src=rss

The Google Photos video editor is getting AI, because of course it is

Google added some new features and updates to the video editor in Google Photos for Android and iOS users, according to the app's support page.

The biggest update brings new “AI-powered video presets” to both versions of the app. These new presets automatically trim the length of videos, adjust the lighting, change the speed and apply new effects with just a few taps. Some of the AI-powered effects allow motion tracking, automatic zoom and slow-motion. The new “presets” tab is located underneath the video timeline.

This isn’t the first AI feature added to the Google Photos app. Last May, Google added its “Ask Photos” feature, a Gemini-powered AI chatbot that allowed for more detailed and conversational photo searches for US users.

Google also tweaked and added some Android-specific features. The new trim tool has improved controls for more precise cuts. There’s also a new “auto enhance” feature that can automatically improve the colors and stabilize videos, and a new “speed” tool that can ramp up or slow down the action.

The new features start rolling out today.

This article originally appeared on Engadget at https://www.engadget.com/mobile/the-google-photos-video-editor-is-getting-ai-because-of-course-it-is-180958935.html?src=rss

Duolingo, best known as a language learning app, now makes a piano

Duolingo just announced a portable piano. Yes, we are talking about the same app that’s become synonymous with learning a foreign language. The app also has some music-learning courses, so this keyboard is intended for that and not for, uh, pushing down keys to trigger Spanish phrases.

The company teamed up with Loog for this instrument, so this is basically a reskin of the pre-existing Loog Piano. It swaps out the red for a, dare I say, Brat green and it ships with a neat little smartphone stand, for keeping an eye on the app during practice sessions. Other than that, it looks nearly identical to the OG version.

That’s not a bad thing. The Loog x Duolingo Piano is a three-octave digital keyboard with built-in stereo speakers, wood sides and a rechargeable battery for portable use. The keys are likely one of the biggest selling points, as they allow for dynamics (piano to forte). These Loog keyboards are pretty much the only digital pianos with velocity-sensitive keys at this size and price point.

It doubles as a standard MIDI controller, via USB-C, and there’s a sustain port and a headphone jack. This particular version also ships with Duolingo flashcards for budding piano players. Of course, it also integrates with the company’s app, for on-the-fly tutorials. There’s a Loog Piano app coming, but we don't know when. We reached out to the company for a concrete release date and confirmation that the Duolingo piano will integrate with the Loog app. We’ll update this post when we find out, though I’d be extremely surprised if there isn’t cross-app functionality.

Just like the original Loog Piano, this one costs $249. Preorders are live right now, though it doesn’t ship until November. If you really have a hankering for a student-grade portable piano, the standard Loog Piano ships immediately.

This article originally appeared on Engadget at https://www.engadget.com/audio/duolingo-best-known-as-a-language-learning-app-now-makes-a-piano-172012643.html?src=rss

HP’s Print AI will offer a better way to print websites

HP just announced HP Print AI, which is being advertised as “the industry’s first intelligent print” experience. Beyond squeezing in tech’s two favorite letters (AI), the software looks to “simplify and enhance printing from setup to support.” There are several tools here, but the most interesting aspect is something called Perfect Output.

This could actually solve the problem of printing from web pages, which typically produces something just a hair above absolute garbage. The company says the embedded algorithms will reduce all of that unnecessary white space and will get rid of ads.

Image size will also be optimized, so printing from a website should look about as good as something that came from a word processor. HP says everything will “fit perfectly on the page for the first time.” Perfect Output isn’t just for websites, as the company says it’ll also make short work of spreadsheets, which are another frustrating thing to print out.
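HP hasn’t published how its algorithms decide what to cut, but the basic idea of dropping ad-marked elements before a page is rendered for print can be sketched in a few lines. This is a toy illustration using Python’s standard `html.parser`, not HP’s actual approach; the `class="ad"` convention is an assumption for the example:

```python
# Toy sketch of pre-print cleanup: drop elements whose class includes
# "ad" before rendering. Illustrative only, not HP's Perfect Output.
from html.parser import HTMLParser

class AdStripper(HTMLParser):
    def __init__(self):
        super().__init__()
        self.out = []
        self.skip_depth = 0  # >0 while inside an ad element

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "") or ""
        if self.skip_depth or "ad" in classes.split():
            self.skip_depth += 1  # entering (or nested inside) an ad
        else:
            self.out.append(f"<{tag}>")  # attributes dropped for brevity

    def handle_endtag(self, tag):
        if self.skip_depth:
            self.skip_depth -= 1
        else:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if not self.skip_depth:
            self.out.append(data)

page = '<p>Article text</p><div class="ad"><p>Buy now!</p></div><p>More text</p>'
s = AdStripper()
s.feed(page)
print("".join(s.out))  # <p>Article text</p><p>More text</p>
```

Real print cleanup would also reflow the remaining content and rescale images, but the filtering step is the part that kills the “absolute garbage” look.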

This feature begins rolling out today, but only to select customers as a beta. HP told Engadget that Perfect Output will work with any of the company’s printers, so long as the correct driver is installed and it’s connected to a Windows 10 or Windows 11 machine. Once some customer feedback comes in, it should go into a wider release.

HP Print AI will also use artificial intelligence to customize support for each user, with the company saying that its “intelligent technology anticipates” the needs of consumers. HP says this will be especially useful when it comes to setup and for remembering user preferences. There’s also a chatbot in there that allows for language-based queries, which runs off of a proprietary LLM the company calls a "print language model." So it's technically a, sigh, PLM. 

For now, these tools are tied to driver software. HP says that they’ll be featured prominently in a forthcoming app update scheduled for next year. 

This article originally appeared on Engadget at https://www.engadget.com/computing/accessories/hps-print-ai-will-offer-a-better-way-to-print-websites-170523565.html?src=rss

TikTok Music is on its way out

TikTok Music is shutting down, ending TikTok's attempt to translate its base app's popularity into music streaming. The service announced that accounts will close by November 28, with all user data and login information deleted.

Google subscribers whose subscription ends after November 28 should automatically get a refund or can request one through Google Play before TikTok Music shuts down. On the other hand, Apple users must request a refund through Apple support before the 28th to get one. Anyone who actually uses TikTok Music might want to wait a minute, though, as the premium service will no longer be available once a refund is processed. Speaking of deadlines, anyone who wants to transfer their playlists from TikTok Music to another music streamer has to do so by October 28. 

TikTok Music first launched in Indonesia and Brazil in July 2023. It replaced another music platform called Resso from ByteDance (TikTok's parent company). Around the same time, it became available as a closed beta test in Australia, Mexico and Singapore, fully launching in those locations that October. Despite ByteDance filing for a "TikTok Music" trademark application in May 2022, the platform never made it to the US. 

This article originally appeared on Engadget at https://www.engadget.com/apps/tiktok-music-is-on-its-way-out-143058957.html?src=rss

Neurable’s brainwave-tracking Master & Dynamic headphones tell you when to take a break

“It’s the most powerful wearable tracking the most important organ in your body.”

Dr. Ramses Alcaide is explaining the electroencephalography (EEG) technology that his company Neurable uses to track activity with its brain-computer interface (BCI). Alcaide is the CEO and co-founder, and notes that a huge problem with EEG sensors is that they are often affixed to bulky, awkward-looking headsets — not exactly something you want to wear out in public. And to him, that’s why the technology hasn’t yet “created the type of impact that they could [on] the world.” Sure, we’ve seen a variety of headbands over the last decade, but those add an additional device to your bag. Alcaide argues there’s a better way to use EEG tech that’s even less intrusive.

Neurable began at the University of Michigan in 2011, where its technology was initially created. The overall platform is an AI system that uses filtering to isolate and boost the signal in noisy brain data. The company spun out in 2015 and has been working to bring its EEG-powered tech to “smaller everyday devices,” as Alcaide describes them.

“[It] took a lot of time, but what we’ve been able to do is take what was traditionally these large systems and bring it down to everyday devices using AI,” he says.

Devices like headphones, earbuds, helmets, AR glasses and more can be equipped with EEG sensors so that they can track neurodegenerative diseases and neurodivergence based on brain activity. For example, the ability to track Alzheimer's or ADHD before a person knows they even have it is part of the plan for Neurable. Right now though, the company’s first step is one of those “everyday wearables” that can track decreases in focus to create what Alcaide calls “good wellness hygiene.”

The earpads have EEG sensors woven into the fabric.
Billy Steele for Engadget

The company’s first device is the MW75 Neuro: a set of headphones built in collaboration with Master & Dynamic. Based on the existing MW75, this version has dry fabric EEG sensors in the ear pads, sending 12 EEG channels to the Neurable app for the software to do its AI analysis and signal processing. The app then interprets the data “with high confidence” and “lab-level accuracy,” according to the company.

The Neurable app is where all the data from the MW75 Neuro is displayed. First, it essentially gamifies mental hygiene with focus tracking. You earn points for high (2), medium (2) and low (1) focus levels, accumulating points throughout the day. You can then view week-to-week comparisons as well as individual session summaries with attention span graphs. During these periods, the system can prompt you to take a break when focus decreases, which Neurable says should help with burnout to some degree. Of course, “burnout” isn’t something that’s easy to quantify, or even tangibly measure, since there’s more at play than your focus or attention.
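The point accumulation described above is simple enough to sketch. This is a hypothetical illustration of the scoring scheme (using the per-level values the app describes), not Neurable's actual code:

```python
# Hypothetical sketch of Neurable-style focus scoring: each tracked
# interval is labeled with a focus level, and points accumulate daily.
# Point values mirror those described in the article.
POINTS = {"high": 2, "medium": 2, "low": 1}

def daily_score(focus_levels):
    """Sum points for a day's sequence of focus-level labels."""
    return sum(POINTS[level] for level in focus_levels)

# e.g. a session logged mostly in high focus with one low-focus dip
session = ["high", "high", "medium", "low", "high"]
print(daily_score(session))  # 2+2+2+1+2 = 9
```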

The MW75 Neuro isn’t just meant to keep you working. The company says monitoring your focus levels can assist you with gaming, meditation, reading and even decision-making. Noise cancellation can block out distractions during periods when you need to be locked in, which doesn’t only apply to the office. Neurable says no matter the activity, its app provides the data necessary to recognize your performance over time and identify when you need to take breaks or maybe find a different environment in order to be productive.

“This is just scratching the iceberg,” Alcaide explains. “We're not claiming or diagnosing everything, [but] it really shows you a glimpse of the future that these everyday wearables can deliver on.”

The MW75 Neuro looks exactly like the MW75, aside from the cloth earpads and extra branding.
Billy Steele for Engadget

Of course, the MW75 Neuro is a set of noise-canceling headphones, which means you’ll get a host of audio features on top of the fancy brain tech. Master & Dynamic CEO Jonathan Levine told me that this version of the headphones has an identical industrial design to the regular MW75. 40mm Beryllium drivers carry M&D’s trademark warm sound profile and four microphones are employed for active noise cancellation (ANC) and calls. There are still a host of sound modes and you can customize the EQ and more inside the M&D Connect app.

Besides the ear pads, there are some other changes on the MW75 Neuro. Neurable’s version supports Adaptive Transparency mode for starters, but the key difference is inside. The electronics were completely redesigned to add EEG processors that power the AI tech, including an ARM Cortex chip. Since the sensor-packed cushions on this model are fabric instead of leather, Levine says the variation does change the sound profile slightly. And during my testing I noticed that they aren’t quite as comfortable as those on the original model either. If you pre-order from Master & Dynamic, the company will throw in non-EEG leather ear pads for free. 

There’s a big hit to battery life, too. Neurable says the MW75 Neuro offers 10 hours of EEG tracking on a charge (8 hours with ANC on), compared to up to 28 hours with ANC on the regular version. I don’t think you’re going to use Neurable’s features for more than a few hours at a time, but you should know they do impact longevity.

Once you start a focus session, a timer begins in the app and continues until you turn it off. There’s a button up top if you need to take a break, otherwise the headphones continue tracking your brainwaves until you tell them to stop. There’s also an indicator on the timer screen to let you know if the sensors are properly connected. A reliable connection ensures optimal EEG signal quality during the session.

The Neurable app provides detailed graphs and summaries on your productivity.
Neurable

During my tests, I used the MW75 Neuro to track short focus sessions. It’s nice that the whole system runs in the background without any distractions – other than the break suggestions. Of course, when you look at the graph afterward, you’ll have to think back to remember what any dips lined up with, but I felt like the app’s prompts to take a break were well-timed and probably overdue. The software can give you voice or push notifications (or both), and the app provides a separate 10-minute timer for the so-called Brain Breaks.

I don’t have any lab-grade tech to thoroughly evaluate what Neurable is doing on these headphones from a tracking standpoint. And I’ll admit that my short time with the MW75 Neuro isn’t enough time to fully evaluate their utility. But, I can begin to see how they could help over time, especially for those of us who are incentivized by streaks and daily scores. I found it interesting to see how much time I spent in high and medium focus, as well as trying to recall if a text or Slack message may have caused me to stumble during a session.

Neurable is actually working to help with that common distraction. The company is allowing developers to build apps for the MW75 Neuro, including one in the works that will automatically pause Spotify when you lose focus. To help with messages, the company is working on a chat integration that allows you to respond with head movements while remaining in the productivity zone. Alcaide argues that 90 percent of text messages can be responded to in a simple manner with a response created by ChatGPT, so the headphones’ accelerometer can be used to detect a nod or shake for automatic replies. This goes beyond what Apple is doing with Siri Interactions on AirPods since it helps facilitate an appropriate response.
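At its core, accelerometer-based nod/shake detection comes down to comparing motion energy across head axes: up-and-down (pitch) movement dominates in a nod, side-to-side (yaw) movement in a shake. This is a hypothetical sketch of that idea, not Neurable's implementation:

```python
# Hypothetical sketch of nod/shake detection from head-worn
# accelerometer samples. Not Neurable's code: it just compares
# accumulated motion energy on the pitch and yaw axes.
def detect_gesture(samples, threshold=1.0):
    """samples: list of (pitch_delta, yaw_delta) readings.
    Returns 'nod' (yes), 'shake' (no) or None for no gesture."""
    pitch_energy = sum(abs(p) for p, _ in samples)
    yaw_energy = sum(abs(y) for _, y in samples)
    if max(pitch_energy, yaw_energy) < threshold:
        return None  # head held still: no reply gesture
    return "nod" if pitch_energy > yaw_energy else "shake"

# Up-and-down motion dominates, so this reads as a "yes" nod
print(detect_gesture([(0.5, 0.05), (-0.6, 0.02), (0.4, 0.01)]))  # nod
```

A production system would filter out walking and other ambient motion before classifying, but the axis comparison is the essence of the gesture input.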

“When the iPhone came out, a touchscreen was the interface,” he continues. “For [Neurable], it’s going to be the neural interface and the accelerometer. It’s going to enable us to do a lot of the same things we do with our phone with our everyday wearable.”

The MW75 Neuro is available for pre-order today in the US in silver, onyx, navy and olive color options for $699. Neurable plans to make the headphones available in Europe and the UK in 2025 for €729 / £629. That’s a lot for a set of headphones, but the regular MW75 is $599, so there’s only a $100 premium for Neurable’s tech.

This article originally appeared on Engadget at https://www.engadget.com/audio/headphones/neurables-brainwave-tracking-master--dynamic-headphones-tell-you-when-to-take-a-break-120004736.html?src=rss