Samsung’s Galaxy S24 lineup puts generative AI front and center

Samsung unveiled its Galaxy S24 devices at its first Unpacked of the year. As expected, the three smartphones have a heavy focus on artificial intelligence-powered features, from the likes of live translations to image editing.

Galaxy AI, as Samsung is calling the devices’ overarching AI system, is behind a number of communication-focused functions. For one thing, Galaxy S24 devices will natively support live, two-way translations on phone calls without the need for a third-party app, Samsung says. Since processing for most AI features is handled on-device with the help of the Snapdragon 8 Gen 3 chipset and its neural processing unit, the conversations will stay private (well, aside from eavesdroppers who might catch one half of the chat). You'll have the option to entirely disable online processing of data for any AI features.

A demo of Samsung's Live Translate feature on a Galaxy S24 device.
Photo by Sam Rutherford/Engadget

On a similar note, on-device processing also means that you won’t need cellular data or Wi-Fi to use AI features such as Interpreter, which displays split-screen translations of an in-person conversation. Your device will also be able to generate transcripts of recordings — these can be summarized or translated as needed.

When it comes to dashing off text-based messages, Samsung says its Chat Assist feature can help you find the right tone. Samsung Keyboard can translate messages between 13 languages too. A Note Assist function in Samsung Notes can summarize texts, generate templates and create covers to help you identify the note you’re looking for.

Meanwhile, as you’re driving, Android Auto can summarize incoming messages and suggest relevant responses and actions for you to approve via voice command. These could include things like telling someone your estimated time of arrival.

A new search experience means that you'll be able to draw a circle around something on your screen and see related results from Google. Depending on your location and the search query, you may see an AI-generated overview that pulls information from the web to offer context and more details.

A demo of the digital zoom feature on a Samsung Galaxy S24 device.
Photo by Sam Rutherford/Engadget

AI will be a driving force behind the Galaxy S24 lineup's camera systems too. Samsung suggests it will help with digital zoom, image stabilization and when capturing photos and videos in low-light. A Super HDR feature is designed to help you see a lifelike preview of an image before pressing the shutter button.

When it comes to image editing, the suite of AI tools might come in useful. Galaxy AI will offer suggested tweaks to improve a photo, while the Generative Edit function can fill in parts of an image's background. This may prove handy if a shot is crooked and you want to straighten it, as the feature should let you move the subject and fix the background. Of note, you will need a network connection to use Generative Edit. Also, whenever you use generative AI to modify a photo, your phone will apply a watermark to the image and its metadata.

If you want to slow down a video that has a lot of activity, the Instant Slow-mo feature might help out. Samsung says this can generate extra frames based on movements in the original video to slow down the action smoothly. Last but not least, the camera systems in certain Galaxy S models feature HDR integration with third-party social apps. This means that when you look up an image in Gallery or your Instagram feed or reels, you'll see photos and videos in Super HDR.

It's little surprise that Samsung is going all in on AI with its latest Galaxy phones. The company previewed its AI models at the tail end of 2023, and word at the time suggested Samsung would deploy those functions broadly in the following months. Moreover, Samsung needs to keep pace with Google, which has been focusing more on AI features on Pixel phones for the last few years. Recent Pixel models are able to handle AI processing on-device too.

This article originally appeared on Engadget at https://www.engadget.com/samsungs-galaxy-s24-lineup-puts-generative-ai-front-and-center-180034530.html?src=rss

Galaxy S24 and Pixel 8 owners can soon search for anything by drawing a circle around it

On Wednesday, Google introduced Circle to Search, a gesture-based way to quickly find info without leaving your app. The feature will be exclusive (at least at first) to the new Galaxy S24 and the Pixel 8 / Pixel 8 Pro starting at the end of January.

Google pitches Circle to Search as “a new way to search anything on your Android phone without switching apps.” You can activate the feature by long-pressing the home button or navigation bar. Then, circle something on your screen with your finger and see the results pop up at the bottom. To return to what you were doing, “simply swipe away and you’re right back where you started,” Google Search VP Cathy Edwards wrote in a company blog post.

Demo in three phone screens (lined up horizontally) of Google's Circle to Search feature. On the left, a social post with a circled corndog. Center: the corn dog highlighted with a search result pop-up at the bottom. Right: full search results for the corndog query.
Google

Despite its name, Circle to Search isn’t limited to circling. “With a simple gesture, you can select images, text or videos in whatever way comes naturally to you — like circling, highlighting, scribbling or tapping,” Google Search VP Elizabeth Reid wrote.

Circle to Search also works alongside multisearch, Google’s text / image search feature launched in the Google app in 2022. The company suggests circling to select a corn dog in a viral social post and asking, “Why are these so popular?” (“You’ll quickly learn that these sweet and savory treats are Korean corn dogs,” Google explains.) The feature works with anything on your screen, including products, other items or text in videos.

A phone screen showing search results (in the Google mobile app) for a visual search of a mysterious board game. The results reveal the game is Pucket.
Google

In more hardware-agnostic news, the company is injecting generative AI into Lens multisearch in the Google app. The company says this allows you to ask “more complex or nuanced questions.” It provided an example of seeing a mysterious and unlabeled board game at a yard sale, snapping a pic and asking Google Lens, “How do you play this?”

Google says the feature will provide a generative AI-fueled overview using the web’s most relevant info. The results will include supporting links to let you scour the web for more details.

AI-powered multisearch overviews roll out this week in the Google app on Android and iOS in the US (English only). The feature is open to everyone who meets those criteria — no beta opt-in necessary. Meanwhile, Circle to Search will be available on January 31 for “select premium Android smartphones,” starting with the Galaxy S24 series, Pixel 8 and Pixel 8 Pro.

This article originally appeared on Engadget at https://www.engadget.com/galaxy-s24-and-pixel-8-owners-can-soon-search-for-anything-by-drawing-a-circle-around-it-180029757.html?src=rss

How to pre-order the Samsung Galaxy S24 Ultra

Like clockwork, the new year has brought new Samsung Galaxy smartphones. The company announced the new Galaxy S24 lineup today, which includes the flagship S24 Ultra along with the Galaxy S24+ and S24. The handsets will look familiar to Samsung diehards, and the company spent most of its launch event hyping AI features rather than hardware upgrades. The new phones boast AI perks like an enhanced photo editor, a “circle to search” feature, quick summarization tools and more. If you're in the market for a new smartphone, here’s how you can pre-order the Samsung Galaxy S24 Ultra, S24+ and S24, along with everything else announced at Samsung Unpacked 2024.

This article originally appeared on Engadget at https://www.engadget.com/how-to-pre-order-the-samsung-galaxy-s24-ultra-180028971.html?src=rss

Galaxy S24 and S24 Plus hands-on: Samsung’s AI phones are here, but with mixed results

I’ve never thought of Samsung as a software company, let alone as a name to pay attention to in the AI race. But with the launch of the Galaxy S24 series today, the company is eager to have us associate it with the year’s hottest tech trend. The new flagship phones look largely the same as last year’s models, but on the inside, change is afoot. At a hands-on session during CES 2024 in Las Vegas last week, I was more focused on checking out the new software on the Galaxy S24 and S24 Plus.

Thanks to a new Snapdragon 8 Gen 3 processor (in the US) customized “for Galaxy,” the S24 series is capable of a handful of new AI-powered tasks that seem very familiar. In fact, if you’ve used Microsoft’s Copilot, Google’s Bard or ChatGPT, a lot of these tools won’t feel new. What is new is the fact that they’re showing up on the S24s, and are mostly processed on-device by Samsung’s recently announced Gauss generative AI model, which it has been quietly building out.

Samsung’s Galaxy AI features on the S24

There are five main areas where generative AI is making a big difference in the Galaxy S24 lineup: search, translations, note creation, message composition, and photo editing and processing. Aside from the notes and composition features, most of these updates seem like versions of existing Google products. In fact, the new Circle to Search feature is a Google service that is debuting on the S24 series, in addition to the Pixel 8 and Pixel 8 Pro.

Circle to Search

With Circle to Search, you press and hold the middle of the screen’s bottom edge, the Google logo and a search bar pop up, and you can draw a ring around anything on the display. Well, almost anything. DRM-protected content and things shielded from screenshots, like your banking app, are off limits. Once you’ve made your selection, a panel slides up showing your selection, along with results from Google’s Search Generative Experience (SGE).

You can scroll down to see image matches, followed by shopping, text, website and other types of listings that SGE thought were relevant. I circled the Samsung clock widget, a picture of beef wellington and a lemon, and each time I was given pretty accurate results. I was also impressed by how quickly Google correctly identified a grill that I circled on an Engadget article featuring a Weber Searwood, especially since the picture I drew around was at an off angle.

This is basically image search via Google or Lens, except it saves you from having to open another app (and take screenshots). You’ll be able to circle items in YouTube videos, your friend’s Instagram Stories (or, let’s be honest, ads). Though I was intrigued by the feature and its accuracy, I’m not sure how often I’d use it in the real world. The long-press gesture to launch Circle to Search works whether you use gesture-based navigation or the three-button layout. The latter might be slightly confusing, since you pretty much hold your finger down on the home button, but not exactly.

Circle to Search is launching on January 31st, and though it’s reserved for the Galaxy S24s and Pixel 8s for now, it’s not clear whether older devices might get the feature.

Chat Assist to tweak the tone of your messages

The rest of Samsung’s AI features are actually powered by the company’s own language models, not Google’s. This part is worth making clear, because when you use the S24 to translate a message from, say, Portuguese to Mandarin, you’ll be using Samsung’s database, not Google’s. I really just want you to direct your anger at the right target when something inevitably goes wrong.

I will say, I was a little worried when I first heard about Samsung’s new Chat Assist feature. It uses generative AI to help reword a message you’ve composed to change up the tone. Say you’re in a hurry, firing off a reply to a friend whom you know can get anxious and misinterpret texts. The S24 can take your sentences, like “On my way back now what do you need” and make it less curt. The options I saw were “casual,” “emojify,” “polite,” “professional” and “social,” which is a hashtag-filled caption presumably for your social media posts.

I typed “Hey there. Where can I get some delicious barbecue? Also, how are you?” Then I tapped the AI icon above the keyboard and selected the “Writing Style” option. After about one or two seconds, the system returned variations of what I wrote.

At the top of the results was my original, followed by the Professional version, which I honestly found hilarious. It said “Hello, I would like to inquire about the availability of delectable barbecue options in the vicinity. Additionally, I hope this message finds you well. Thank you for your attention to this matter.”

It reminded me of an episode of Friends where Joey uses a thesaurus to sound smarter. Samsung’s AI seems to have simply replaced every word with a slightly bigger word, while also adding some formal greetings. I don’t think “inquire about the availability of delectable barbecue options in the vicinity” is anything a human would write.

That said, the casual option was a fairly competent rewording of what I’d written, as was the polite version. I cannot imagine a scenario where I’d pick the “emojify” option, except for the sake of novelty. And while the social option pained me to read, at least the hashtags of #Foodie and #BBQLover seemed appropriate.

Samsung Translate

You can also use Samsung’s AI to translate messages into one of 13 languages in real-time, which is fairly similar to a feature Google launched on the Pixel 6 in 2021. The S24’s interface looks reminiscent of the Pixel’s, too, with both offering two text input fields. Like Google, Samsung also has a field at the top for you to select your target language, though the system is capable of automatically recognizing the language being used. I never got to test this in a foreign language that I understand, so I have no real way of confirming how accurate the S24 was in Portuguese.

Samsung’s translation engine is also used for a new feature called Live Translate, which basically acts as an interpreter for you during phone calls made via the native dialer app. I tried this by calling one of a few actors Samsung had on standby, masquerading as managers of foreign-language hotels or restaurants. After I dialed the number and turned on the Live Translate option, Samsung’s AI read out a brief disclaimer explaining to the “manager at a Spanish restaurant” that I was using a computerized system for translation. Then, when I said “Hello,” I heard a disembodied voice say “Hola” a few seconds later.

The lag was pretty bad and it threw off the cadence of my demo, as the person on the other end of the call clearly understood English and would answer in Spanish before my translated request was even sent over. So instead of:

Me: Can I make a reservation please?

S24: … ¿Puedo hacer una reserva por favor?

Restaurant: Si, cuantas personas y a que hora?

S24 (to me): … Yes, for how many people and at what time?

My demo actually went:

Me: Can I make a reservation please?

pause

Restaurant: Si, cuantas personas y a que hora?

S24: ¿Puedo hacer una reserva por favor?

pause

S24 (to me): Yes, for how many people and at what time?

It was slightly confusing. Do I think this is representative of all Live Translate calls in the real world? No, but Samsung will need to work on cutting down lag if it wants to be helpful and not confusing.
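The sequencing problem is easy to see with a little arithmetic. Here is a toy model of a strictly sequential interpreter pipeline (the stage names and latency figures are invented for illustration; this is not Samsung's implementation), showing how the lag stacks up per turn:

```python
# Toy model of a strictly sequential interpreter pipeline (stage names
# and latency figures are invented for illustration; this is not
# Samsung's implementation). Each conversational turn pays the full
# capture + translate + synthesize cost before the other party hears it.

SPEECH_TO_TEXT_S = 1.0   # assumed per-stage latencies, in seconds
TRANSLATE_S = 0.5
TEXT_TO_SPEECH_S = 1.0

def turn_latency():
    """Delay between one party finishing a sentence and the other
    hearing the translation, when the stages run back to back."""
    return SPEECH_TO_TEXT_S + TRANSLATE_S + TEXT_TO_SPEECH_S

def conversation_latency(turns):
    """Total added lag over a short call: every exchange pays it."""
    return turns * turn_latency()

print(turn_latency())           # 2.5 seconds of dead air per exchange
print(conversation_latency(6))  # 15.0 extra seconds over six turns
```

Cutting that lag usually means streaming — translating partial utterances while the speaker is still talking instead of waiting for the full sentence — which is presumably the direction Samsung would need to go.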

Galaxy AI reorganizing your notes

I was most taken by what Samsung’s AI can do in its Notes app, which historically has had some pretty impressive handwriting recognition and indexing. With the AI’s assistance, you can quickly reformat your large blocks of text into easy-to-read headers, paragraphs and bullets. You can also swipe sideways to see different themes, with various colors and font styles.

Notes can also generate summaries for you, though most of the summaries on the demo units didn’t appear very astute or coherent. After it auto-formatted a note titled “An Exploration of the Celestial Bodies in Our Solar System,” the first section was aptly titled “Introduction,” but the first bullet point under that was, confusingly, “The Solar System.” The second bullet point was two sentences, starting with “The Solar System is filled with an array of celestial bodies.”

Samsung also borrowed another feature from the Pixel ecosystem, using its speech-to-text software to transcribe, summarize and translate recordings. The transcription of my short monologue was accurate enough, but the speaker labels weren’t. Summaries of the transcriptions were similar to those in Notes, in that they’re not quite what I’d personally highlight.

The Galaxy S24 held in mid-air, with the viewfinder of its camera app showing on the screen.
Photo by Sam Rutherford / Engadget

That’s already a lot to cover, and I haven’t even gotten to the photo editing updates yet. My colleague Sam Rutherford goes into a lot more detail on those in his hands-on with the Galaxy S24 Ultra, which has the more-sophisticated camera system. In short though, Samsung offers edit suggestions, generative background filling and an instant slow-mo tool that fills in frames when you choose to slow down a video.

Samsung Galaxy S24 and S24 Plus hardware updates

That brings me to the hardware. On the regular Galaxy S24 and S24 Plus, you’ll be getting a 50-megapixel main sensor, 12MP ultra-wide camera and 10MP telephoto lens with 3x optical zoom. Up front is a 12MP selfie camera. So, basically, the same setup as last year. The S24 has a 6.2-inch Full HD+ screen, while the S24 Plus sports a 6.7-inch Quad HD+ panel, and both offer adaptive refresh rates between 1Hz and 120Hz. In the US, all three S24 models use a Snapdragon 8 Gen 3 for Galaxy processor, with the base S24 starting at 8GB of RAM and 128GB of storage. Both the S24 and S24 Plus have slightly larger batteries than their predecessors, with their respective 4,000mAh and 4,900mAh cells coming in 100mAh and 200mAh bigger than before.

Though the S24s look very similar to last year’s S23s, my first thought on seeing them was how much they looked like iPhones. That’s neither a compliment nor an indictment. And to be clear, I’m only talking about the S24 and S24 Plus, not the Ultra, which still has the distinctive look of a Note.

Four Galaxy S24 handsets in white, cream, black and purple, laid down on a table with their rear cameras facing up.
Photo by Sam Rutherford / Engadget

It feels like Samsung spent so much time upgrading the software and focusing on joining the AI race this year that it completely overlooked the S24’s design. Plus, unlike the latest iPhones, the S24s are also missing support for the newer Qi2 wireless charging standard, which includes magnetic support, a la Apple’s MagSafe.

Wrap-up

I know it’s just marketing-speak and empty catchphrases, but I’m very much over Samsung’s use of what it thinks is trendy to appeal to people. Don’t forget, this is the company that had an “Awesome Unpacked” event in 2021 filled to the brim with cringeworthy moments and an embarrassingly large number of utterances of the words “squad” and “iconic”.

That doesn’t mean what Samsung’s done with the Galaxy S24 series is completely meaningless. Some of these features could genuinely be useful, like summarizing transcriptions or translating messages in foreign languages. But after watching the company follow trend after trend (like introducing Bixby after the rise of digital assistants, or bringing scene optimizers to its camera app after Chinese phone makers did), launching generative AI features feels hauntingly familiar. My annoyance at Samsung’s penchant for #trendy #hashtags aside, the bigger issue here is that if the company is simply jumping on a fad instead of actually thoughtfully developing meaningful features, then consumers run the risk of losing support for tools in the future. Just look at what happened to Bixby.

This article originally appeared on Engadget at https://www.engadget.com/galaxy-s24-and-s24-plus-hands-on-samsungs-ai-phones-are-here-but-with-mixed-results-180008236.html?src=rss

WhatsApp Channels now let owners send voice messages

WhatsApp has introduced new features that Channels can use to interact with their followers. The biggest one, perhaps, is voice updates. Channel admins and owners, such as celebrities, can now send voice messages to their group. Meta says Puerto Rican rapper Bad Bunny was the first celebrity to test it out, but it's rolling out the feature broadly today. Voice messaging is one of WhatsApp's most popular features, probably because it allows users to send messages without having to type, even while they're driving or doing something else. Apparently, WhatsApp users send 7 billion voice messages on the app every day, so it was most likely a very easy decision for Meta to bring the feature to Channels.

In addition to voice updates, admins can now also share polls in chat that their followers can answer. Plus, WhatsApp now allows Channels to have as many as 16 administrators if they want to make sure that their followers are always up to date with the latest news. As for fans, they'll now be able to share a Channel update as their Status, which is the messaging service's version of Instagram Stories. Since it's possible to share voice notes as a WhatsApp Status, that presumably means they can also share a Channel voice update if they want. All of these features are making their way to WhatsApp users around the world. 

This article originally appeared on Engadget at https://www.engadget.com/whatsapp-channels-now-let-owners-send-voice-messages-150016866.html?src=rss

DJI’s Mic 2 now records high-quality audio to your smartphone via Bluetooth

After making a cameo appearance in the Osmo Pocket 3 camera, DJI's Mic 2 wireless microphone system has officially arrived with some nice upgrades over its popular predecessor. It can now connect directly to your smartphone via Bluetooth, while also offering improved internal recording quality, AI noise reduction, a bigger touchscreen, easier control and more. 

The transmitters come in grey with a new see-through design and DJI introduced a white color option as well. They're slightly smaller than before, but largely resemble their predecessors with a clip, magnetic mount and 3.5mm mic input. The power and link buttons are now on the same side and round instead of oblong, with the record button and USB-C input on the other side. 

DJI's Mic 2 now records high-quality audio to your smartphone via Bluetooth
DJI

In one welcome change, DJI moved the power-on LED to the sides, rather than near the front as before, where it would annoyingly appear on camera. The DJI logo is front and center, though, so you'll still need a piece of black tape to cover that up. 

The receiver has changed substantially, with a larger 1.1-inch touchscreen and a new thumbwheel to make adjustments easier. DJI has made connecting the transmitter directly to your phone simpler as well via included USB-C and Lightning adapters. 

A big plus of the Mic 2 over other kits like the Rode Wireless GO II is the charging case that's sold with the two-transmitter kit. The case now supports up to 18 hours of use on a charge, up from 15 before, and the transmitters' battery life has been upgraded from 5.5 to six hours.


Topping the list of new features is direct Bluetooth connection support, letting you pair a transmitter mic to your phone (or DJI's Osmo Pocket 3 and Osmo Action 4) without the need for a receiver. That'll allow creators on a budget to purchase a transmitter mic by itself for $99, or add DJI's Lavalier Mic for an additional $35. 

Linking a phone is relatively easy — hold the record button for three seconds to put it in Bluetooth mode, then press and hold the link button for two seconds. From there, your phone should detect the transmitter. It worked great with my Pixel 7a, even though it's not on the approved list, and I was able to start recording video with much better quality audio, to say the least. 

There are a couple of caveats: the AI noise cancelling feature doesn't work when connected to a smartphone and you can only use one transmitter at a time. If you have the transmitter/receiver combo, though, you can also get audio by connecting the receiver directly to your phone as before. 


Speaking of audio, the Mic 2 has a couple of improvements in quality. It promises "brighter and more natural sounding voices" for the interviews or standup work where it's mainly used. And though the original DJI Mic supported internal recording as a backup to camera files, the Mic 2 now captures that at higher 32-bit float quality, letting you max out gain without fearing distortion. It also supports a higher acoustic overload point (AOP), up to 120 dB from 114 dB, meaning you'll see less distortion at higher audio levels. 
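The 32-bit float upgrade is worth unpacking. In a fixed-point (16-bit) recording, any sample pushed past full scale is hard-clipped and unrecoverable; in a float container, over-unity samples are preserved, so a simple gain reduction in post restores the waveform. A minimal sketch with toy numbers (not DJI's actual signal chain):

```python
# Why 32-bit float internal recording lets you "max out gain without
# fearing distortion" (a toy illustration, not DJI's signal chain).
# Fixed-point 16-bit capture hard-clips anything past full scale;
# a float container keeps values beyond 1.0, so pulling the gain
# back down in post recovers the original waveform.

INT16_MAX = 32767

def record_int16(samples, gain):
    """16-bit capture: samples past full scale are clipped for good."""
    out = []
    for s in samples:
        v = int(s * gain * INT16_MAX)
        v = max(-INT16_MAX - 1, min(INT16_MAX, v))  # hard clip
        out.append(v / INT16_MAX)
    return out

def record_float32(samples, gain):
    """Float capture: over-unity samples are preserved, not clipped."""
    return [s * gain for s in samples]

source = [0.2, 0.9, 1.0, 0.7]   # waveform that already peaks at full scale
hot_gain = 2.0                  # gain accidentally set twice too high

clipped = record_int16(source, hot_gain)
floaty = record_float32(source, hot_gain)
recovered = [s / hot_gain for s in floaty]  # fix it in post

print(clipped)    # peaks flattened at 1.0 -- audible distortion
print(recovered)  # matches the source waveform
```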

The other quality trick is AI noise cancelling, allowing the Mic 2 to lower the environmental noise so vocals stand out better. DJI promises that it works in "complex and noisy environments, such as streets and restaurants." 

A full review is to come, but I tried out the Mic 2 in a variety of situations, including inside a car, riding on a bicycle and in a howling wind. It performed well in nearly all those situations, with all distracting noise blocked in the car and bike shots, leaving just some pleasant environmental sound. However, it was unable to block out a direct 30-40 MPH wind on a sand dune, even with the included wind muff installed. It still worked well enough for me to get the shot, though, which was impressive.


Key features carrying over from the last model include the option for a safety track recorded at a lower -6 dB (in case you accidentally blow out the levels), an 820-foot range with the transmitter/receiver combo (524 feet in the EU), magnetic clips and a muff for each transmitter. For the receiver, DJI has also included preset gains for different cameras so that it'll work relatively well out of the box. It doesn't include all recent cameras, so hopefully firmware updates will address that. 
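As for the safety track, the -6 dB figure maps to roughly half the linear amplitude, which is why it survives peaks that clip the main track. A quick check of the decibel math (illustrative only):

```python
# What a "-6 dB safety track" means in practice (illustrative math,
# not DJI's firmware): decibels map to linear gain via 10^(dB/20),
# so -6 dB is roughly half amplitude. A peak that hits full scale
# (1.0) on the main track lands near 0.5 on the safety copy, leaving
# headroom to rescue an otherwise clipped take.

def db_to_gain(db):
    """Convert a decibel offset to a linear amplitude multiplier."""
    return 10 ** (db / 20)

safety_gain = db_to_gain(-6)
main_peak = 1.0                       # main track hits full scale
safety_peak = main_peak * safety_gain

print(round(safety_gain, 3))   # ~0.501, i.e. about half amplitude
print(round(safety_peak, 3))
```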

With the new options, particularly the smartphone Bluetooth connectivity and Osmo Pocket 3/Action 4 support, the Mic 2 is again likely to strike a chord with creators. It's now available for $349 with two transmitters, a receiver and charging case, $219 for a transmitter and receiver and $99 for individual transmitters. You can also purchase the charging case separately for $69. 

This article originally appeared on Engadget at https://www.engadget.com/djis-mic-2-now-records-high-quality-audio-to-your-smartphone-via-bluetooth-130018964.html?src=rss

The Morning After: A closer look at Apple’s Vision Pro

With pre-orders opening later this week, Engadget experienced a more in-depth demo of Apple's mixed-reality headset. Editor-in-chief Dana Wollman and deputy editor Cherlynn Low were fitted with the Vision Pro for some more extensive demos, including immersive video, a little bit of Disney+ and attempts to type in thin air on the Vision Pro’s floating keyboard.

TMA
Apple

They discuss the fitting process, the attention to detail in Disney+’s viewer app and where there's room for improvement with keyboards, comfort and utility. This is the company’s first new product for a while — and I had strong feelings about its last one. Early impressions suggest Apple seems to have made a fluid, intelligent headset experience — but are you willing to spend $3,499 on it?

— Mat Smith

​​You can get these reports delivered daily direct to your inbox. Subscribe right here!​​

The biggest stories you might have missed

Google Maps finally adds Waze’s in-tunnel navigation feature

Hulu and Max win big at 75th Primetime Emmys

The 2024 Moto G Play gives you a 50-megapixel camera for $150

Google is laying off hundreds of workers who sell ads to big businesses

Take-Two’s lawyers think Remedy’s new R logo is too similar to Rockstar’s R logo

The Last of Us Part 2 Remastered review

A new roguelike No Return mode steals the show.

When The Last of Us Part I, a PS5 remake of the 2013 original, launched, it was hard to stomach the $70 price tag. Yes, the game looked incredible and there were some new modes, but the level design and gameplay were identical. It was, for all intents and purposes, a money grab.

With The Last of Us Part II Remastered, that seems less true. First, it’s a $10 upgrade for people who bought the PS4 versions (or $50 outright). Second, there’s a new roguelike game mode called No Return, which may be worth that upgrade price on its own. Nathan Ingraham, TLOU die-hard, explains.

Continue reading.

Yamaha takes on Teenage Engineering with its own colorful groovebox

The SEQTRAK is an all-in-one production studio.

TMA
Yamaha

Yamaha is a pillar of the electronic music-making world, but it’s perhaps best known for its stage synthesizers and studio monitors. Now, it’s taking on Teenage Engineering with the SEQTRAK groovebox. Stylistically, it seems heavily inspired by TE.

The SEQTRAK includes a drum machine, sampler, FM and sample-based synthesizers and that semi-eponymous sequencer along with a built-in battery (three to four hours expected playtime) plus a built-in speaker, so it works without plugging in anything else. The SEQTRAK is available to pre-order at retailers for $399, which undercuts the heady pricing of Teenage Engineering’s similar product.

Continue reading.

Apple shipped more smartphones than anyone else last year

It’s the first time Apple has held the top spot.

Both IDC's and Canalys' most recent analyses of smartphone shipments show Apple beat Samsung to ship more smartphones than any other company. IDC's preliminary data said Apple shipped 234.6 million units in 2023, equal to 20.1 percent of the market. In comparison, Samsung shipped 226.6 million units for 19.4 percent of the market. This is the first time Samsung has fallen from the number-one spot since 2010.
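As a sanity check, dividing each company's shipments by its reported share should imply the same total market size from both rows (a quick back-of-the-envelope calculation, not figures from the IDC report itself):

```python
# Sanity-checking the IDC figures: units shipped divided by market
# share should imply a consistent total 2023 market size from both
# the Apple and Samsung rows.

apple_units, apple_share = 234.6, 0.201      # millions of units, share
samsung_units, samsung_share = 226.6, 0.194

total_from_apple = apple_units / apple_share
total_from_samsung = samsung_units / samsung_share

print(round(total_from_apple))    # ~1167 million units
print(round(total_from_samsung))  # ~1168 million units
```

Both figures point to a total market of roughly 1.17 billion smartphones shipped in 2023, so the shares and unit counts hang together.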

Back then, Nokia was in the lead.

Continue reading.

This article originally appeared on Engadget at https://www.engadget.com/the-morning-after-a-closer-look-at-apples-vision-pro-121522078.html?src=rss

Apple updates US App Store guidelines allowing developers to link to third-party payments

Apple is relaxing a key App Store rule that has long been a source of frustration to developers. The iPhone maker will allow U.S. developers to link to outside websites for in-app purchases, according to the company’s updated developer guidelines.

The change comes shortly after the United States Supreme Court rejected an appeal to reconsider a lower court ruling requiring Apple to allow developers to direct customers to alternative payment methods. The change only applies to iOS and iPadOS apps in the U.S. app stores and developers are still required to pay a commission for in-app purchases not made via the App Store.

It seems that Apple will continue to maintain tight control over payments, even under the new rules. According to a support page, developers will need approval from Apple before they can take advantage of the new rule, and app makers will only be permitted to notify users about alternative payment methods in specific ways. For example, the company’s guidelines to developers stipulate that links can only be shown in an app one time, and only in “a single, dedicated location.” App makers are also prohibited from using in-app pop-ups or mentioning outside payments in their App Store listing.

The company is also officially requiring developers to pay it a commission for purchases made outside of its App Store. The commission is set at 12 percent for developers who are part of its small business program, and 27 percent for larger developers. But, as 9to5Mac points out, the company may have some trouble enforcing those terms. 
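Some quick math shows why developers are unenthused. Once a third-party payment processor's cut is added on top of Apple's 27 percent, the developer's take is nearly identical to paying the standard 30 percent in-app commission (the processor rate below is an assumed typical figure, not something Apple specifies):

```python
# Back-of-the-envelope comparison of a developer's take under Apple's
# standard in-app commission versus the new out-of-app terms. The
# card-processing rate is an assumed typical figure, not from Apple.

price = 10.00
in_app_cut = 0.30        # standard App Store commission
out_of_app_cut = 0.27    # Apple's commission on external purchases
processor_fee = 0.029    # assumed third-party payment processing rate

dev_in_app = price * (1 - in_app_cut)
dev_external = price * (1 - out_of_app_cut - processor_fee)

print(round(dev_in_app, 2))    # 7.0  -- selling through the App Store
print(round(dev_external, 2))  # 7.01 -- linking out, after both fees
```

On a $10 purchase, the difference works out to about a penny, which is the substance of the "anticompetitive" complaints quoted below.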

In court documents, the company argued that it would be “exceedingly difficult and, in many cases, impossible” to collect the fees. In its messaging to developers, however, the company says that they are required to submit monthly reports, even if they haven’t processed any transactions, and that the company has the right to audit their records.

Still, the change is a significant concession for Apple, which has long been criticized by developers for App Store rules sometimes viewed as draconian and arbitrary. The company’s rule barring developers from communicating with users about alternative (and often cheaper) payment methods was a central aspect of the Epic v. Apple trial in 2021. The company had previously loosened some of these rules following the trial and a subsequent class-action lawsuit from developers. Apple also allows dating apps in the Netherlands to offer alternative payment options.

Some high-profile developers who have previously run up against Apple’s App Store policies were sharply critical of the company’s latest changes. Epic CEO Tim Sweeney called it a “bad-faith ‘compliance’ plan” in a post on X. He called the 27 percent fee “anticompetitive” and said that “Apple will front-run competing payment processors with their own ‘scare screen’ to disadvantage them.” He added that Epic would pursue a legal challenge to Apple’s changes in District Court.

David Heinemeier Hansson, cofounder of the Hey email app, which publicly battled with Apple over its payment policies, also slammed the changes. “Apple is going to poison the one victory Epic secured in their lawsuit so bad nobody would ever think to use it,” he wrote on X.

Apple didn’t immediately respond to a request for comment.

This article originally appeared on Engadget at https://www.engadget.com/apple-updates-us-app-store-guidelines-allowing-developers-to-link-to-third-party-payments-235836357.html?src=rss

Supreme Court declines appeals from Apple and Epic Games in App Store case

The US Supreme Court has declined to hear the appeals filed by both Apple and Epic Games following a judge’s ruling that Apple must allow developers to offer alternative methods to pay for apps and services other than through the App Store. It did not provide an explanation as to why it refused to review either appeal, but it means the permanent injunction giving developers a way to avoid the 30 percent cut Apple takes will remain in place.

Apple made the appeal to the high court back in September of last year, requesting that it review the circuit court’s decision, which Apple deemed “unconstitutional.” The case brought forward by Epic Games is the first to challenge the business model of the App Store, which helps Apple rake in billions. In May 2023, Apple said that developers generated about $1 trillion in total billings through the App Store in 2022. Gaming apps sold on the App Store generate an estimated $100 billion in revenue each year.

While the Ninth Circuit ruled in favor of Epic’s appeal that Apple has indeed broken California's Unfair Competition law, it rejected Epic’s claim that the App Store is a monopoly. In addition to declining to hear Apple’s appeal, SCOTUS also will not review Epic’s appeal that the district court had made “legal errors.”

Epic claimed that Apple violates federal antitrust laws through its business model; however, this is not an issue the high court will consider. The CEO of Epic Games, Tim Sweeney, called the appeal denial “a sad outcome” on X.

Epic Games has been front and center in the fight against Apple’s developer transaction fee policy since 2020. Other companies, including Spotify and the New York Times, are also trying to challenge app store policies on Apple and Google platforms. The Coalition for App Fairness, which now consists of more than 60 companies, believes no developer should be required to use the App Store exclusively. The Epic lawsuit was just the start — problems have been piling up for Apple. Even the Department of Justice (DOJ) is reportedly considering filing an antitrust case against it. The DOJ has been conducting an investigation into whether Apple’s App Store practices have killed competition in the space.

This article originally appeared on Engadget at https://www.engadget.com/supreme-court-declines-appeals-from-apple-and-epic-games-in-app-store-case-192755323.html?src=rss

Apple Vision Pro hands-on, redux: Immersive Video, Disney+ app, floating keyboard and a little screaming

With pre-orders for the Apple Vision Pro headset opening this week, the company is getting ready to launch one of its most significant products ever. It announced this morning an “entertainment format pioneered by Apple” called Apple Immersive Video, as well as new viewing environments in the Disney+ app featuring scenes from the studio’s beloved franchises like the Avengers and Star Wars.

We already got hands-on once back at WWDC when the headset was first announced, but two of our editors, Dana Wollman and Cherlynn Low, had a chance to go back and revisit the device (and in Dana’s case, experience it for the first time). Since we’ve already walked you through some of the basic UI elements in our earlier piece, we decided to focus on some of the more recently added features, including Apple Immersive Video, the new Disney+ environments, a built-in “Encounter Dinosaurs” experience, as well as the floating keyboard, which didn’t work for us when we first tried the device in June of last year. Here, too, we wanted to really get at what it actually feels like to use the device, from the frustrating to the joyful to the unintentionally eerie. (Yes, there was a tear, and also some screaming.)

Fit, comfort and strap options

Cherlynn: The best heads-up display in the world will be useless if it can’t be worn for a long time, so comfort is a crucial factor in the Apple Vision Pro’s appeal. This is also a very personal factor with a lot of variability between individual users. I have what has been described as a larger-than-usual head, and a generous amount of hair that is usually flat-ironed. This means that any headgear I put on tends to slip, especially if the band is elastic.

Like the version that our colleague Devindra Hardawar saw at WWDC last year, the Vision Pro unit I tried on today came with a single stretchy strap that wraps around the back of your head. It was wide, ridged and soft, and I at first thought it would be very comfortable. But 15 minutes into my experience, I started to feel weighed down by the device, and five minutes later, I was in pain. To be fair, I should have flagged my discomfort to Apple earlier, and alternative straps were available for me to swap out. But I wanted to avoid wasting time. When I finally told the company’s staff about my issues, they changed the strap to one that had two loops, with one that went over the top of my head.

A woman with dark hair wearing the Apple Vision Pro headset, sat back on a gray couch.
Apple

Dana: The fitting took just long enough — required just enough tweaking — that I worried for a minute that I was doing it wrong, or that I somehow had the world’s one unfittable head. First, I struggled to get the lettering to look sharp. It was like sitting at an optometrist's office, trying out a lens that was just slightly too blurry for me. Tightening the straps helped me get the text as crisp as it needed to be, but that left my nose feeling pinched. The solution was swapping out the seal cushion for the lighter of the two options. (There are two straps included in the box, as well as two cushions.) With those two tweaks — the Dual Loop Band and the light seal cushion — I finally felt at ease.

Cherlynn: Yep, that Dual Loop band felt much better for weight distribution, and it didn’t keep slipping down my hair. It’s worth pointing out that Apple did first perform a scan to determine my strap size, and they chose the Medium for me. I also had to keep turning a dial on the back right to make everything feel more snug, so I had some control over how tightly the device sat. Basically, you’ll have quite a lot of options to adapt the Vision Pro to your head.

Apple Immersive Video and spatial videos

Dana: Sitting up close in the center of Apple Immersive and spatial videos reminded me of Jimmy Stewart’s character in It’s A Wonderful Life: I was both an insider and outsider at the same time. In one demo, we saw Alicia Keys performing the most special of performances: just for us, in a living room. In a different series of videos — these meant to demonstrate spatial video — we saw the same family at mealtime, and a mother and daughter outside, playing with bubbles.

As I watched these clips, particularly the family home videos that reminded me of my own toddler, I felt immersed, yes, but also excluded; no one in the videos sees you or interacts with you, obviously. You are a ghost. I imagined myself years from now, peering in from the future on bygone videos of my daughter, and felt verklempt. I did not expect to get teary-eyed during a routine Apple briefing.

Cherlynn: The Immersive Video part of my demo was near the end, by which point I had already been overwhelmed by the entire experience and did not quite know what more to expect. The trailer kicked off with Alicia Keys singing in my face, which I enjoyed. But I was more surprised by the kids playing soccer with some rhinos on the field, and when the animals charged towards me, I physically recoiled. I loved seeing the texture of their skin and the dirt on the surface, and was also impressed when I saw the reflection of an Apple logo on the surface of a lake at the end. I didn’t have the same emotional experience that Dana did, but I can see how it would evoke some strong feelings.

Disney+ app

Dana: Apple was very careful to note that the version of the Disney+ app we were using was in beta; a work in progress. But what we saw was still impressive. Think of it like playing a video game: Before you select your race course, say, you get to choose your player. In this case, your “player” is your background. Do you want to sit on a rooftop from a Marvel movie? The desert of Tatooine? Make yourself comfortable in whatever setting tickles your fancy, and then you can decide if actually you want to be watching Loki or Goosebumps in your Star Wars wasteland. It’s not enough to call it immersive. In some of these “outdoor” environments in particular, it’s like attending a Disney-themed drive-in. Credit to Disney: They both understand – and respect – their obsessive fans. They know their audience.

Cherlynn: As a big Marvel fangirl, I really geeked out when the Avengers Tower environment came on. I looked around and saw all kinds of easter eggs, including a takeout container from Shawarma Grill on the table next to me. It feels a little silly to gush about the realism of the images, but I saw no pixels. Instead, I looked at a little handwritten note that Tony Stark had clearly left behind and felt like I was almost able to pick it up. When we switched over to the Tatooine environment, I was placed in the cockpit of Luke Skywalker’s landspeeder, and when I reached out to grab the steering controls, I was able to see my own hands in front of me. I felt slightly disappointed to not actually be able to interact with those elements, but it was definitely a satisfying experience for a fan.

Typing experience

Cherlynn: Devindra mentioned that the floating keyboard wasn’t available at his demo last year, and he was curious to hear what it was like. I was actually surprised that it worked, and fairly well in my experience. When I selected the URL bar by looking at it and tapping my thumb and forefinger, the virtual keyboard appeared. I could either use my eyes to look at the keys I wanted, then tap my fingers together to push them. Or, and this is where I was most impressed, I could lean forward and press the buttons with my hands.

It’s not as easy as typing on an actual keyboard would be, but I was quite tickled by the fact that it worked. Kudos to Apple’s eye- and hand-tracking systems, because they were able to detect what I was looking at or aiming for most of the time. My main issue with the keyboard was that it felt a little too far away and I needed to stretch if I wanted to press the buttons myself. But using my eye gaze and tapping wasn’t too difficult for a short phrase, and if I wanted to input something longer I could use voice typing (or pair a Bluetooth keyboard if necessary).

A screenshot of the Vision Pro home screen, with about a dozen apps floating above a lake.
Apple

Dana: This was one of the more frustrating aspects of the demo for me. Although there were several typing options – hunting and pecking with your fingers, using eye control to select keys, or just using Siri – none of them felt adequate for anything resembling extended use. It took several tries for me to even spell Engadget correctly in the Safari demo. This was surprising to me, as so many other aspects of the broader Apple experience – the pinch gesture, the touch keyboard on the original iPhone – “just work,” as Apple loves to say about itself. The floating keyboard here clearly needs improvement. In the meantime, it’s harder to imagine using the Vision Pro for actual work. The Vision Pro feels much further along as a personal home theater.

Meditation

Cherlynn: As someone who’s covered the meditation offerings by companies like Apple and Fitbit a fair amount, I wasn’t sure what to expect of the Vision Pro. Luckily, this experience took place in the earlier part of the demo, so I wasn’t feeling any head strain yet and was able to relax. I leaned back on the couch and watched as a cloud, similar to the Meditation icon in the Apple Watch, burst into dozens of little “leaves” and floated around me in darkness. As the 1-minute session started, soft, comforting music played in the background while a voice guided me through what to do. The leaves pulsed, and between the relaxing visuals and the calming sounds, the whole thing felt quite soothing. It’s funny how oddly appropriate a headset is for something like meditating, where you can literally block out distractions in the world and simply focus on your breathing. This was a fitting use of the Vision Pro that I certainly did not anticipate.

Dana: I wanted more of this. A dark environment, with floating 3D objects and a prompt to think about what I am grateful for today. The demo only lasted one minute, but I could have gone longer.

Encounter Dinosaurs

Cherlynn: Fun fact about me: Dinosaurs don’t scare me, but butterflies do. Yep. Once you’ve stopped laughing, you can imagine the trauma I had to undergo at this demo. I’d heard from my industry friends and Devindra all about how they watched a butterfly land on their fingers in their demos at WWDC, before dinosaurs came bursting out of a screen to roar at them. Everyone described this as a realistic and impressive technological demo, since the Vision Pro was able to accurately pinpoint for everyone where their fingers were and have the butterflies land exactly on their fingertips.

I did not think I’d have to watch a butterfly land on my body today, and just generally do not want that in life. But for this demo, I kept my eyes open to see just how well Apple would do, and, because I had a minor calibration issue at the start of this demo, I had to do this twice. The first time this happened, I… screamed a bit. I could see the butterfly’s wings and legs. That’s really what creeped me out the most — seeing the insect’s legs make “contact” with my finger. There was no tactile feedback, but I could almost feel the whispery sensation of the butterfly’s hairy ass legs on my finger. Ugh.

Then the awful butterfly flew away and a cute baby dinosaur came out, followed by two ferocious dinosaurs that I then stood up to “pet.” It was much more fun after that, and actually quite an impressive showcase of the Vision Pro’s ability to blend the real world with immersive experiences, as I was able to easily see and walk around a table in front of me to approach the dinosaur.

Dana: Unlike Cher, I did not scream, though I did make a fool of myself. I held out my hand to beckon one of the dinosaurs, and it did in fact walk right up to me and make a loud sound in my face. I “pet” it before it retreated. Another dinosaur appeared. I once again held out my hand, but that second dino ignored me. As the demo ended, I waved and heard myself say “bye bye.” (Did I mention I live with a toddler?) I then remembered there were other adults in the room, observing me use the headset, and felt sheepish. Which describes much of the Vision Pro experience, to be honest. You could maybe even say the same of any virtual reality headset worth its salt. It is immersive to the point that you will probably, at some point, throw decorum to the wind.

The Disney+ app floating above a living room in a screenshot of the visionOS interface on the Apple Vision Pro.
Apple

Final (ish) thoughts

Cherlynn: I had been looking forward to trying on the Vision Pro for myself and was mostly not disappointed. The eye- and hand-tracking systems are impressively accurate, and I quickly learned how to navigate the interface, so much so that I was speeding ahead of the instructions given to me. I’m not convinced that I’ll want to spend hours upon hours wearing a headset, even if the experience was mind-blowing. The device’s $3,500 price is also way out of my budget.

But of all the VR, AR and MR headsets I’ve tried on in my career, the Apple Vision Pro is far and away the best, and easily the most thought-out. Apple also took the time to show us what you would look like to other people when using the device, with a feature called EyeSight that would put a visual feed of your eyes on the outside of the visor. Depending on what you’re doing in visionOS, the display would show some animations indicating whether you’re fully immersed in an environment or if you can see the people around you.

Dana: The Vision Pro was mostly easier to use than I expected, and while it has potential as an all-purpose device that you could use for web browsing, email, even some industrial apps, its killer application, for now, is clearly watching movies (home videos or otherwise). I can’t pretend that Apple is the first to create a headset offering an immersive experience; that would be an insult to every virtual reality headset we’ve tested previously (sorry, Apple, I’m going to use the term VR). But if you ask me what it felt like to use the headset, particularly photo and video apps, my answer is that I felt joy. It is fun to use. And it is up to you if this much fun should cost $3,500.

Update, January 17 2024, 3:04PM ET: This article was edited to clarify the TV shows you can view in the Disney+ app's immersive environment. You can only watch Disney+ shows in the environments, like the Avengers Tower or the landspeeder on Tatooine. A previous misspelling of the word Tatooine was also edited, as well as clarification around the head strap option that was available at the WWDC demo.

This article originally appeared on Engadget at https://www.engadget.com/apple-vision-pro-hands-on-redux-immersive-video-disney-app-floating-keyboard-and-a-little-screaming-180006222.html?src=rss