Like clockwork, the new year has brought new Samsung Galaxy smartphones. The company announced the new Galaxy S24 lineup today, which includes the flagship S24 Ultra along with the Galaxy S24+ and S24. The handsets will look familiar to Samsung diehards, and the company spent most of its launch event hyping AI features rather than hardware upgrades. The new phones boast AI perks like an enhanced photo editor, a “circle to search” feature, quick summarization tools and more. If you’re in the market for a new smartphone, here’s how you can pre-order the Samsung Galaxy S24 Ultra, S24+ and S24, along with everything else announced at Samsung Unpacked 2024.
Galaxy S24 and S24 Plus hands-on: Samsung’s AI phones are here, but with mixed results
I’ve never thought of Samsung as a software company, let alone as a name to pay attention to in the AI race. But with the launch of the Galaxy S24 series today, the company is eager to have us associate it with the year’s hottest tech trend. The new flagship phones look largely the same as last year’s models, but on the inside, change is afoot. At a hands-on session during CES 2024 in Las Vegas last week, I was more focused on checking out the new software on the Galaxy S24 and S24 Plus.
Thanks to a new Snapdragon 8 Gen 3 processor (in the US) customized “for Galaxy,” the S24 series is capable of a handful of new AI-powered tasks that seem very familiar. In fact, if you’ve used Microsoft’s Copilot, Google’s Bard or ChatGPT, a lot of these tools won’t feel new. What is new is the fact that they’re showing up on the S24s, and are mostly processed on-device by Samsung’s recently announced Gauss generative AI model, which it has been quietly building out.
Samsung’s Galaxy AI features on the S24
There are five main areas where generative AI is making a big difference in the Galaxy S24 lineup — search, translations, note creation, message composition, and photo editing and processing. Aside from the notes and composition features, most of these updates seem like versions of existing Google products. In fact, the new Circle to Search feature is a Google service that is debuting on the S24 series, in addition to the Pixel 8 and Pixel 8 Pro.
Circle to Search
With Circle to Search, you long-press the middle of the screen’s bottom edge, the Google logo and a search bar pop up, and you can draw a ring around anything on the display. Well, almost anything. DRMed content or things protected from screenshots, like your banking app, are off limits. Once you’ve made your selection, a panel slides up showing your selection, along with results from Google’s Search Generative Experience (SGE).
You can scroll down to see image matches, followed by shopping, text, website and other types of listings that SGE thought were relevant. I circled the Samsung clock widget, a picture of beef wellington and a lemon, and each time I was given pretty accurate results. I was also impressed by how quickly Google correctly identified a grill that I circled on an Engadget article featuring a Weber Searwood, especially since the picture I drew around was at an off angle.
This is basically image search via Google or Lens, except it saves you from having to open another app (and take screenshots). You’ll be able to circle items in YouTube videos, your friend’s Instagram Stories (or, let’s be honest, ads). Though I was intrigued by the feature and its accuracy, I’m not sure how often I’d use it in the real world. The long-press gesture to launch Circle to Search works whether you use gesture-based navigation or the three-button layout. The latter might be slightly confusing, since you pretty much hold your finger down on the home button, but not exactly.
Circle to Search is launching on January 31st, and though it’s reserved for the Galaxy S24s and Pixel 8s for now, it’s not clear whether older devices might get the feature.
Chat Assist to tweak the tone of your messages
The rest of Samsung’s AI features are actually powered by the company’s own language models, not Google’s. This part is worth making clear, because when you use the S24 to translate a message from, say, Portuguese to Mandarin, you’ll be using Samsung’s database, not Google’s. I really just want you to direct your anger at the right target when something inevitably goes wrong.
I will say, I was a little worried when I first heard about Samsung’s new Chat Assist feature. It uses generative AI to help reword a message you’ve composed to change up the tone. Say you’re in a hurry, firing off a reply to a friend who you know can get anxious and misinterpret texts. The S24 can take your sentences, like “On my way back now what do you need” and make them less curt. The options I saw were “casual,” “emojify,” “polite,” “professional” and “social,” which is a hashtag-filled caption presumably for your social media posts.
I typed “Hey there. Where can I get some delicious barbecue? Also, how are you?” Then I tapped the AI icon above the keyboard and selected the “Writing Style” option. After about one or two seconds, the system returned variations of what I wrote.
At the top of the results was my original, followed by the Professional version, which I honestly found hilarious. It said “Hello, I would like to inquire about the availability of delectable barbecue options in the vicinity. Additionally, I hope this message finds you well. Thank you for your attention to this matter.”
It reminded me of an episode of Friends where Joey uses a thesaurus to sound smarter. Samsung’s AI seems to have simply replaced every word with a slightly bigger word, while also adding some formal greetings. I don’t think “inquire about the availability of delectable barbecue options in the vicinity” is anything a human would write.
That said, the casual option was a fairly competent rewording of what I’d written, as was the polite version. I cannot imagine a scenario where I’d pick the “emojify” option, except for the sake of novelty. And while the social option pained me to read, at least the hashtags of #Foodie and #BBQLover seemed appropriate.
Samsung Translate
You can also use Samsung’s AI to translate messages into one of 13 languages in real time, which is fairly similar to a feature Google launched on the Pixel 6 in 2021. The S24’s interface looks reminiscent of the Pixel’s, too, with both offering two text input fields. Like Google, Samsung also has a field at the top for you to select your target language, though the system is capable of automatically recognizing the language being used. I never got this to work correctly in a foreign language that I understand, and have no real way of confirming how accurate the S24 was in Portuguese.
Samsung’s translation engine is also used for a new feature called Live Translate, which basically acts as an interpreter for you during phone calls made via the native dialer app. I tried this by calling one of a few actors Samsung had on standby, masquerading as managers of foreign-language hotels or restaurants. After I dialed the number and turned on the Live Translate option, Samsung’s AI read out a brief disclaimer explaining to the “manager at a Spanish restaurant” that I was using a computerized system for translation. Then, when I said “Hello,” I heard a disembodied voice say “Hola” a few seconds later.
The lag was pretty bad and it threw off the cadence of my demo, as the person on the other end of the call clearly understood English and would answer in Spanish before my translated request was even sent over. So instead of:
Me: Can I make a reservation please?
S24: … ¿Puedo hacer una reserva por favor?
Restaurant: Si, cuantas personas y a que hora?
S24 (to me): … Yes, for how many people and at what time?
My demo actually went:
Me: Can I make a reservation please?
pause
Restaurant: Si, cuantas personas y a que hora?
S24: ¿Puedo hacer una reserva por favor?
pause
S24 (to me): Yes, for how many people and at what time?
It was slightly confusing. Do I think this is representative of all Live Translate calls in the real world? No, but Samsung will need to work on cutting down lag if it wants to be helpful and not confusing.
Galaxy AI reorganizing your notes
I was most taken by what Samsung’s AI can do in its Notes app, which historically has had some pretty impressive handwriting recognition and indexing. With the AI’s assistance, you can quickly reformat your large blocks of text into easy-to-read headers, paragraphs and bullets. You can also swipe sideways to see different themes, with various colors and font styles.
Notes can also generate summaries for you, though most of the summaries on the demo units didn’t appear very astute or coherent. After it auto-formatted a note titled “An Exploration of the Celestial Bodies in Our Solar System,” the first section was aptly titled “Introduction,” but the first bullet point under that was, confusingly, “The Solar System.” The second bullet point was two sentences, starting with “The Solar System is filled with an array of celestial bodies.”
Samsung also borrowed another feature from the Pixel ecosystem, using its speech-to-text software to transcribe, summarize and translate recordings. The transcription of my short monologue was accurate enough, but the speaker labels weren’t. Summaries of the transcriptions were similar to those in Notes, in that they’re not quite what I’d personally highlight.
That’s already a lot to cover, and I haven’t even gotten to the photo editing updates yet. My colleague Sam Rutherford goes into a lot more detail on those in his hands-on with the Galaxy S24 Ultra, which has the more-sophisticated camera system. In short though, Samsung offers edit suggestions, generative background filling and an instant slow-mo tool that fills in frames when you choose to slow down a video.
Samsung Galaxy S24 and S24 Plus hardware updates
That brings me to the hardware. On the regular Galaxy S24 and S24 Plus, you’ll be getting a 50-megapixel main sensor, a 12MP ultra-wide camera and a 10MP telephoto lens with 3x optical zoom. Up front is a 12MP selfie camera. So, basically, the same setup as last year. The S24 has a 6.2-inch Full HD+ screen, while the S24 Plus sports a 6.7-inch Quad HD+ panel, and both offer adaptive refresh rates that can go between 1 and 120Hz. In the US, all three S24 models use a Snapdragon 8 Gen 3 for Galaxy processor, with the base S24 starting out with 8GB of RAM and 128GB of storage. Both the S24 and S24 Plus have slightly larger batteries than their predecessors, with their respective 4,000mAh and 4,900mAh cells coming in 100mAh and 200mAh bigger than before.
Though the S24s look very similar to last year’s S23s, my first thought on seeing them was how much they looked like iPhones. That’s neither a compliment nor an indictment. And to be clear, I’m only talking about the S24 and S24 Plus, not the Ultra, which still has the distinctive look of a Note.
It feels like Samsung spent so much time upgrading the software and focusing on joining the AI race this year that it completely overlooked the S24’s design. And unlike the latest iPhones, the S24s are missing support for the newer Qi2 wireless charging standard, which includes magnetic attachment, a la Apple’s MagSafe.
Wrap-up
I know it’s just marketing-speak and empty catchphrases, but I’m very much over Samsung’s use of what it thinks is trendy to appeal to people. Don’t forget, this is the company that had an “Awesome Unpacked” event in 2021 filled to the brim with cringeworthy moments and an embarrassingly large number of utterances of the words “squad” and “iconic”.
That doesn’t mean what Samsung’s done with the Galaxy S24 series is completely meaningless. Some of these features could genuinely be useful, like summarizing transcriptions or translating messages in foreign languages. But after watching the company follow trend after trend (like introducing Bixby after the rise of digital assistants, or bringing scene optimizers to its camera app after Chinese phone makers did), launching generative AI features feels hauntingly familiar. My annoyance at Samsung’s penchant for #trendy #hashtags aside, the bigger issue here is that if the company is simply jumping on a fad instead of actually thoughtfully developing meaningful features, then consumers run the risk of losing support for tools in the future. Just look at what happened to Bixby.
WhatsApp Channels now let owners send voice messages
WhatsApp has introduced new features Channels can use to interact with their followers. The biggest one, perhaps, is voice updates. Channel admins and owners, such as celebrities, can now send voice messages to their followers. Meta says Puerto Rican rapper Bad Bunny was the first celebrity to test it out, but it's rolling out the feature broadly today. Voice messaging is one of WhatsApp's most popular features, probably because it allows users to send messages without having to type, even while they're driving or doing something else. Apparently, WhatsApp users send 7 billion voice messages on the app every day, so it was most likely a very easy decision for Meta to bring the feature to Channels.
In addition to voice updates, admins can now also share polls in chat that their followers can answer. Plus, WhatsApp now allows Channels to have as many as 16 administrators if they want to make sure that their followers are always up to date with the latest news. As for fans, they'll now be able to share a Channel update as their Status, which is the messaging service's version of Instagram Stories. Since it's possible to share voice notes as a WhatsApp Status, that presumably means they can also share a Channel voice update if they want. All of these features are making their way to WhatsApp users around the world.
This article originally appeared on Engadget at https://www.engadget.com/whatsapp-channels-now-let-owners-send-voice-messages-150016866.html?src=rss

DJI’s Mic 2 now records high-quality audio to your smartphone via Bluetooth
After making a cameo appearance in the Osmo Pocket 3 camera, DJI's Mic 2 wireless microphone system has officially arrived with some nice upgrades over its popular predecessor. It can now connect directly to your smartphone via Bluetooth, while also offering improved internal recording quality, AI noise reduction, a bigger touchscreen, easier control and more.
The transmitters come in grey with a new see-through design and DJI introduced a white color option as well. They're slightly smaller than before, but largely resemble their predecessors with a clip, magnetic mount and 3.5mm mic input. The power and link buttons are now on the same side and round instead of oblong, with the record button and USB-C input on the other side.
In one welcome change, DJI moved the power-on LED to the sides, rather than near the front as before, where it would annoyingly appear on camera. The DJI logo is front and center, though, so you'll still need a piece of black tape to cover that up.
The receiver has changed substantially, with a larger 1.1-inch touchscreen and a new thumbwheel to make adjustments easier. DJI has made connecting the transmitter directly to your phone simpler as well via included USB-C and Lightning adapters.
A big plus of the Mic 2 over other kits like the Rode Wireless GO II is the charging case that's sold with the two-transmitter kit. It now supports up to 18 hours of use on a charge, up from 15 before, and the transmitters have been upgraded from 5.5 to six hours.
Topping the list of new features is direct Bluetooth connection support, letting you pair a transmitter mic to your phone (or DJI's Osmo Pocket 3 and Osmo Action 4) without the need for a receiver. That'll allow creators on a budget to purchase a transmitter mic by itself for $99, or add DJI's Lavalier Mic for an additional $35.
Linking a phone is relatively easy — hold the record button for three seconds to put it in Bluetooth mode, then press and hold the link button for two seconds. From there, your phone should detect the transmitter. It worked great with my Pixel 7a, even though it's not on the approved list, and I was able to start recording video with much better quality audio, to say the least.
There are a couple of caveats: the AI noise cancelling feature doesn't work when connected to a smartphone and you can only use one transmitter at a time. If you have the transmitter/receiver combo, though, you can also get audio by connecting the receiver directly to your phone as before.
Speaking of, the Mic 2 has a couple of improvements in audio quality. It promises "brighter and more natural sounding voices" for the interviews or standup work where it's mainly used. And though the original DJI Mic supported internal recording as a backup to camera files, it now captures that at higher 32-bit float quality, letting you max out gain without fearing distortion. It also supports a higher acoustic overload point (AOP), up to 120 dB from 114 dB, meaning you'll see less distortion on higher audio levels.
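As a rough rule of thumb (this is our back-of-the-envelope math, not a DJI spec), decibels are logarithmic, so that 6 dB bump in AOP works out to roughly double the peak sound pressure the mic can tolerate before clipping:

```python
# Sound pressure ratio implied by raising the AOP from 114 dB to 120 dB.
# dB SPL is a logarithmic scale: ratio = 10 ** (difference_in_dB / 20).
old_aop_db = 114
new_aop_db = 120

pressure_ratio = 10 ** ((new_aop_db - old_aop_db) / 20)
print(f"Handles ~{pressure_ratio:.2f}x the peak sound pressure")  # ~2.00x
```

In practice, that extra headroom is what keeps loud sources like concerts or engines from distorting, all else being equal.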
The other quality trick is AI noise cancelling, allowing the Mic 2 to lower the environmental noise so vocals stand out better. DJI promises that it works in "complex and noisy environments, such as streets and restaurants."
A full review is to come, but I tried out the Mic 2 in a variety of situations, including inside a car, riding on a bicycle and in a howling wind. It performed well in nearly all those situations, with all distracting noise blocked in the car and bike shots, leaving just some pleasant environmental sound. However, it was unable to block out a direct 30-40 mph wind on a sand dune, even with the included wind muff installed. It still worked well enough for me to get the shot, though, which was impressive.
Key features carrying over from the last model include the option for a safety track recorded at a lower -6 dB (in case you accidentally blow out the levels), an 820-foot range with the transmitter/receiver combo (524 feet in the EU), magnetic clips and a muff for each transmitter. For the receiver, DJI has also included preset gains for different cameras so that it'll work relatively well out of the box. It doesn't include all recent cameras, so hopefully firmware updates will address that.
With the new options, particularly the smartphone Bluetooth connectivity and Osmo Pocket 3/Action 4 support, the Mic 2 is again likely to strike a chord with creators. It's now available for $349 with two transmitters, a receiver and charging case, $219 for a transmitter and receiver and $99 for individual transmitters. You can also purchase the charging case separately for $69.
The Morning After: A closer look at Apple’s Vision Pro
With pre-orders opening later this week, Engadget experienced a more in-depth demo of Apple's mixed-reality headset. Editor-in-chief Dana Wollman and deputy editor Cherlynn Low were fitted with the Vision Pro for some more extensive demos, including immersive video, a little bit of Disney+ and attempts to type in thin air on the Vision Pro’s floating keyboard.
They discuss the fitting process, the attention to detail in Disney+’s viewer app and where there's room for improvement with keyboards, comfort and utility. This is the company’s first new product in a while — and I had strong feelings about its last one. Early impressions suggest Apple seems to have made a fluid, intelligent headset experience — but are you willing to spend $3,499 on it?
— Mat Smith
You can get these reports delivered daily direct to your inbox. Subscribe right here!
The biggest stories you might have missed
Google Maps finally adds Waze’s in-tunnel navigation feature
Hulu and Max win big at the 75th Primetime Emmys
The 2024 Moto G Play gives you a 50-megapixel camera for $150
Google is laying off hundreds of workers who sell ads to big businesses
Take-Two’s lawyers think Remedy’s new R logo is too similar to Rockstar’s R logo
The Last of Us Part 2 Remastered review
A new roguelike No Return mode steals the show.
When a PS5 remake of the 2013 title The Last of Us Part I launched, it was hard to stomach the $70 price tag. Yes, the game looked incredible and there were some new modes, but the level design and gameplay were identical. It was, for all intents and purposes, a money grab.
With The Last of Us Part II Remastered, that seems less true. First, it’s a $10 upgrade for people who bought the PS4 versions (or $50 outright). Second, there’s a new roguelike game mode called No Return, which may be worth that upgrade price on its own. Nathan Ingraham, TLOU die-hard, explains.
Yamaha takes on Teenage Engineering with its own colorful groovebox
The SEQTRAK is an all-in-one production studio.
Yamaha is a pillar of the electronic music-making world, but it’s perhaps best known for its stage synthesizers and studio monitors. Now, it’s taking on Teenage Engineering with the SEQTRAK groovebox. Stylistically, it seems heavily inspired by TE.
The SEQTRAK includes a drum machine, sampler, FM and sample-based synthesizers and that semi-eponymous sequencer along with a built-in battery (three to four hours expected playtime) plus a built-in speaker, so it works without plugging in anything else. The SEQTRAK is available to pre-order at retailers for $399, which undercuts the heady pricing of Teenage Engineering’s similar product.
Apple shipped more smartphones than anyone else last year
It’s the first time Apple has held the top spot.
Both IDC's and Canalys' most recent analyses of smartphone shipments show Apple has beaten Samsung to ship more smartphones than any other company. IDC’s preliminary data said Apple shipped 234.6 million units in 2023, equal to 20.1 percent of the market. In comparison, Samsung shipped 226.6 million units for 19.4 percent of the market. This is the first time Samsung has fallen from the number-one spot since 2010.
Back then, Nokia was in the lead.
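As a quick sanity check (our arithmetic, not IDC's), the quoted unit counts and share percentages are mutually consistent — each vendor's figures imply a total 2023 smartphone market of roughly 1.17 billion units:

```python
# Cross-check IDC's quoted 2023 figures: units shipped (millions) and market share.
apple_units, apple_share = 234.6, 0.201
samsung_units, samsung_share = 226.6, 0.194

# Each vendor's numbers imply a total market size of units / share.
total_from_apple = apple_units / apple_share        # ~1167 million units
total_from_samsung = samsung_units / samsung_share  # ~1168 million units

print(f"Implied total market: {total_from_apple:.0f}M vs {total_from_samsung:.0f}M units")
```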
This article originally appeared on Engadget at https://www.engadget.com/the-morning-after-a-closer-look-at-apples-vision-pro-121522078.html?src=rss

Apple updates US App Store guidelines allowing developers to link to third-party payments
Apple is relaxing a key App Store rule that has long been a source of frustration to developers. The iPhone maker will allow U.S. developers to link to outside websites for in-app purchases, according to the company’s updated developer guidelines.
The change comes shortly after the United States Supreme Court rejected an appeal to reconsider a lower court ruling requiring Apple to allow developers to direct customers to alternative payment methods. The change only applies to iOS and iPadOS apps in the U.S. app stores and developers are still required to pay a commission for in-app purchases not made via the App Store.
It seems that Apple will continue to maintain tight control over payments, even under the new rules. According to a support page, developers will need approval from Apple before they can take advantage of the new rule, and app makers will only be permitted to notify users about alternative payment methods in specific ways. For example, the company’s guidelines to developers stipulate that links can only be shown in an app one time, and only in “a single, dedicated location.” App makers are also prohibited from using in-app pop-ups or mentioning outside payments in their App Store listing.
The company is also officially requiring developers to pay it a commission for purchases made outside of its App Store. The commission is set at 12 percent for developers who are part of its small business program, and 27 percent for larger developers. But, as 9to5Mac points out, the company may have some trouble enforcing those terms.
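For a concrete sense of those rates, here's a minimal sketch of the commission math (the 12 and 27 percent figures come from the article; the function name and example price are our own illustration):

```python
# Commission Apple says it is owed on purchases made via external links
# (rates from the article: 12% for small business program members, 27% otherwise).
def external_purchase_commission(sale_price: float, small_business: bool) -> float:
    rate = 0.12 if small_business else 0.27
    return round(sale_price * rate, 2)

# A hypothetical $9.99 in-app purchase completed on the developer's own website:
print(external_purchase_commission(9.99, small_business=True))   # 1.2
print(external_purchase_commission(9.99, small_business=False))  # 2.7
```

Note that those rates sit only a few points below Apple's usual in-app commissions, before any outside payment processor's own fees are factored in — which is part of why critics argue the savings are largely illusory.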
In court documents, the company argued that it would be “exceedingly difficult and, in many cases, impossible” to collect the fees. In its messaging to developers, however, the company says that they are required to submit monthly reports, even if they haven’t processed any transactions, and that the company has the right to audit their records.
Still, the change is a significant concession for Apple, which has long been criticized by developers for App Store rules sometimes viewed as draconian and arbitrary. The company’s rule barring developers from communicating with users about alternative (and often cheaper) payment methods was a central aspect of the Epic v. Apple trial in 2021. The company had previously loosened some of these rules following the trial and a subsequent class-action lawsuit from developers. Apple also allows dating apps in the Netherlands to offer alternative payment options.
Some high-profile developers who have previously run up against Apple’s App Store policies were sharply critical of the company’s latest changes. Epic CEO Tim Sweeney called it a “bad-faith ‘compliance’ plan” in a post on X. He called the 27 percent fee “anticompetitive” and said that “Apple will front-run competing payment processors with their own ‘scare screen’ to disadvantage them.” He added that Epic would pursue a legal challenge to Apple’s changes in district court.
David Heinemeier Hansson, cofounder of the Hey email app, which publicly battled with Apple over its payment policies, also slammed the changes. “Apple is going to poison the one victory Epic secured in their lawsuit so bad nobody would ever think to use it,” he wrote on X.
Apple didn’t immediately respond to a request for comment.
This article originally appeared on Engadget at https://www.engadget.com/apple-updates-us-app-store-guidelines-allowing-developers-to-link-to-third-party-payments-235836357.html?src=rss

Supreme Court declines appeals from Apple and Epic Games in App Store case
The US Supreme Court has declined to hear the appeals filed by both Apple and Epic Games following a judge’s ruling that Apple must allow developers to offer alternative methods to pay for apps and services other than through the App Store. It did not provide an explanation as to why it refused to review either appeal, but it means the permanent injunction giving developers a way to avoid the 30 percent cut Apple takes will remain in place.
Apple made the appeal to the high court back in September of last year, requesting that it review the circuit court’s decision, which Apple deemed “unconstitutional.” The case brought forward by Epic Games is the first to challenge the business model of the App Store, which helps Apple rake in billions. In May 2023, Apple said that developers generated about $1 trillion in total billings through the App Store in 2022. Gaming apps sold on the App Store generate an estimated $100 billion in revenue each year.
The Supreme Court denied both sides’ appeals of the Epic v. Apple antitrust case. The court battle to open iOS to competing stores and payments is lost in the United States. A sad outcome for all developers.
— Tim Sweeney (@TimSweeneyEpic) January 16, 2024
While the Ninth Circuit ruled in favor of Epic’s appeal that Apple has indeed broken California's Unfair Competition Law, it rejected Epic’s claim that the App Store is a monopoly. In addition to declining to hear Apple’s appeal, SCOTUS also will not review Epic’s appeal that the district court had made “legal errors.”
Epic claimed that Apple violates federal antitrust laws through its business model, however, this is not an issue the high court will consider. The CEO of Epic Games, Tim Sweeney, called the appeal denial “a sad outcome” on X.
Epic Games has been front and center in the fight against Apple’s developer transaction fee policy since 2020. Other companies, including Spotify and the New York Times, are also trying to challenge app store policies on Apple and Google platforms. The Coalition for App Fairness, which consists of more than 60 companies now, believes no developers should be required to use the app store exclusively. The Epic lawsuit was just the start — problems have been piling up for Apple. Even the Department of Justice (DOJ) is reportedly considering filing an antitrust case against it. The DOJ has been conducting an investigation into whether Apple’s App Store practices have killed competition in the space.
This article originally appeared on Engadget at https://www.engadget.com/supreme-court-declines-appeals-from-apple-and-epic-games-in-app-store-case-192755323.html?src=rss

Apple Vision Pro hands-on, redux: Immersive Video, Disney+ app, floating keyboard and a little screaming
With pre-orders for the Apple Vision Pro headset opening this week, the company is getting ready to launch one of its most significant products ever. It announced this morning an “entertainment format pioneered by Apple” called Apple Immersive Video, as well as new viewing environments in the Disney+ app featuring scenes from the studio’s beloved franchises like the Avengers and Star Wars.
We already got hands-on once back at WWDC when the headset was first announced, but two of our editors, Dana Wollman and Cherlynn Low, had a chance to go back and revisit the device (and in Dana’s case, experience it for the first time). Since we’ve already walked you through some of the basic UI elements in our earlier piece, we decided to focus on some of the more recently added features, including Apple Immersive Video, the new Disney+ environments, a built-in “Encounter Dinosaurs” experience, as well as the floating keyboard, which didn’t work for us when we first tried the device in June of last year. Here, too, we wanted to really get at what it actually feels like to use the device, from the frustrating to the joyful to the unintentionally eerie. (Yes, there was a tear, and also some screaming.)
Fit, comfort and strap options
Cherlynn: The best heads-up display in the world will be useless if it can’t be worn for a long time, so comfort is a crucial factor in the Apple Vision Pro’s appeal. This is also a very personal factor with a lot of variability between individual users. I have what has been described as a larger-than-usual head, and a generous amount of hair that is usually flat-ironed. This means that any headgear I put on tends to slip, especially if the band is elastic.
Like the version that our colleague Devindra Hardawar saw at WWDC last year, the Vision Pro unit I tried on today came with a strap that you stretch and ends up at the back of your head. It was wide, ridged and soft, and I at first thought it would be very comfortable. But 15 minutes into my experience, I started to feel weighed down by the device, and five more minutes later, I was in pain. To be fair, I should have flagged my discomfort to Apple earlier, and alternative straps were available for me to swap out. But I wanted to avoid wasting time. When I finally told the company’s staff about my issues, they changed the strap to one that had two loops, with one that went over the top of my head.
Dana: The fitting took just long enough — required just enough tweaking — that I worried for a minute that I was doing it wrong, or that I somehow had the world’s one unfittable head. First, I struggled to get the lettering to look sharp. It was like sitting at an optometrist's office, trying out a lens that was just slightly too blurry for me. Tightening the straps helped me get the text as crisp as it needed to be, but that left my nose feeling pinched. The solution was swapping out the seal cushion for the lighter of the two options. (There are two straps included in the box, as well as two cushions.) With those two tweaks — the Dual Loop Band and the light seal cushion — I finally felt at ease.
Cherlynn: Yep, that Dual Loop band felt much better for weight distribution, and it didn’t keep slipping down my hair. It’s worth pointing out that Apple did first perform a scan to determine my strap size, and they chose the Medium for me. I also had to keep turning a dial on the back right to make everything feel more snug, so I had some control over how tightly the device sat. Basically, you’ll have quite a lot of options to adapt the Vision Pro to your head.
Apple Immersive Video and spatial videos
Dana: Sitting up close in the center of Apple Immersive and spatial videos reminded me of Jimmy Stewart’s character in It’s A Wonderful Life: I was both an insider and outsider at the same time. In one demo, we saw Alicia Keys performing the most special of performances: just for us, in a living room. In a different series of videos — these meant to demonstrate spatial video — we saw the same family at mealtime, and a mother and daughter outside, playing with bubbles.
As I watched these clips, particularly the family home videos that reminded me of my own toddler, I felt immersed, yes, but also excluded; no one in the videos sees you or interacts with you, obviously. You are a ghost. I imagined myself years from now, peering in from the future on bygone videos of my daughter, and felt verklempt. I did not expect to get teary-eyed during a routine Apple briefing.
Cherlynn: The Immersive Video part of my demo was near the end, by which point I had already been overwhelmed by the entire experience and did not quite know what more to expect. The trailer kicked off with Alicia Keys singing in my face, which I enjoyed. But I was more surprised by the kids playing soccer with some rhinos on the field, and when the animals charged towards me, I physically recoiled. I loved seeing the texture of their skin and the dirt on the surface, and was also impressed when I saw the reflection of an Apple logo on the surface of a lake at the end. I didn’t have the same emotional experience that Dana did, but I can see how it would evoke some strong feelings.
Disney+ app
Dana: Apple was very careful to note that the version of the Disney+ app we were using was in beta; a work in progress. But what we saw was still impressive. Think of it like playing a video game: Before you select your race course, say, you get to choose your player. In this case, your “player” is your background. Do you want to sit on a rooftop from a Marvel movie? The desert of Tatooine? Make yourself comfortable in whatever setting tickles your fancy, and then you can decide if actually you want to be watching Loki or Goosebumps in your Star Wars wasteland. It’s not enough to call it immersive. In some of these “outdoor” environments in particular, it’s like attending a Disney-themed drive-in. Credit to Disney: They both understand – and respect – their obsessive fans. They know their audience.
Cherlynn: As a big Marvel fangirl, I really geeked out when the Avengers Tower environment came on. I looked around and saw all kinds of easter eggs, including a takeout container from Shawarma Grill on the table next to me. It feels a little silly to gush about the realism of the images, but I saw no pixels. Instead, I looked at a little handwritten note that Tony Stark had clearly left behind and felt like I was almost able to pick it up. When we switched over to the Tatooine environment, I was placed in the cockpit of Luke Skywalker’s landspeeder, and when I reached out to grab the steering controls, I was able to see my own hands in front of me. I felt slightly disappointed to not actually be able to interact with those elements, but it was definitely a satisfying experience for a fan.
Typing experience
Cherlynn: Devindra mentioned that the floating keyboard wasn’t available at his demo last year, and I was curious to see what it was like. I was actually surprised that it worked, and fairly well in my experience. When I selected the URL bar by looking at it and tapping my thumb and forefinger, the virtual keyboard appeared. I could either look at the keys I wanted and tap my fingers together to press them, or, and this is where I was most impressed, lean forward and press the buttons with my hands.
It’s not as easy as typing on an actual keyboard would be, but I was quite tickled by the fact that it worked. Kudos to Apple’s eye- and hand-tracking systems, because they were able to detect what I was looking at or aiming for most of the time. My main issue with the keyboard was that it felt a little too far away and I needed to stretch if I wanted to press the buttons myself. But using my eye gaze and tapping wasn’t too difficult for a short phrase, and if I wanted to input something longer I could use voice typing (or pair a Bluetooth keyboard if necessary).
Dana: This was one of the more frustrating aspects of the demo for me. Although there were several typing options – hunting and pecking with your fingers, using eye control to select keys, or just using Siri – none of them felt adequate for anything resembling extended use. It took several tries for me to even spell Engadget correctly in the Safari demo. This was surprising to me, as so many other aspects of the broader Apple experience – the pinch gesture, the original touch keyboard on the original iPhone – “just work,” as Apple loves to say about itself. The floating keyboard here clearly needs improvement. In the meantime, it’s hard to imagine using the Vision Pro for actual work; the device feels much further along as a personal home theater.
Meditation
Cherlynn: As someone who’s covered the meditation offerings from companies like Apple and Fitbit a fair amount, I wasn’t sure what to expect of the Vision Pro. Luckily, this experience took place in the earlier part of the demo, so I wasn’t feeling any head strain yet and was able to relax. I leaned back on the couch and watched as a cloud, similar to the Meditation icon on the Apple Watch, burst into dozens of little “leaves” that floated around me in darkness. As the one-minute session started, soft, comforting music played in the background while a voice guided me through what to do. The leaves pulsed, and between the relaxing visuals and calming sounds, it all felt quite soothing. It’s funny how oddly appropriate a headset is for something like meditating, where you can literally block out the distractions of the world and simply focus on your breathing. This was a fitting use of the Vision Pro that I certainly did not anticipate.
Dana: I wanted more of this. A dark environment, with floating 3D objects and a prompt to think about what I am grateful for today. The demo only lasted one minute, but I could have gone longer.
Encounter Dinosaurs
Cherlynn: Fun fact about me: Dinosaurs don’t scare me, but butterflies do. Yep. Once you’ve stopped laughing, you can imagine the trauma I had to undergo at this demo. I’d heard from my industry friends and Devindra all about how they watched a butterfly land on their fingers in their demos at WWDC, before dinosaurs came bursting out of a screen to roar at them. Everyone described this as a realistic and impressive technological demo, since the Vision Pro was able to accurately pinpoint for everyone where their fingers were and have the butterflies land exactly on their fingertips.
I did not think I’d have to watch a butterfly land on my body today, and generally do not want that in life. But for this demo, I kept my eyes open to see just how well Apple would do, and, because I had a minor calibration issue at the start, I had to do this twice. The first time it happened, I… screamed a bit. I could see the butterfly’s wings and legs. That’s really what creeped me out the most — seeing the insect’s legs make “contact” with my finger. There was no tactile feedback, but I could almost feel the whispery sensation of the butterfly’s hairy-ass legs on my finger. Ugh.
Then the awful butterfly flew away and a cute baby dinosaur came out, followed by two ferocious dinosaurs that I stood up to “pet.” It was much more fun after that, and actually quite an impressive showcase of the Vision Pro’s ability to blend the real world with immersive experiences, as I was easily able to see and walk around a table in front of me to approach the dinosaur.
Dana: Unlike Cher, I did not scream, though I did make a fool of myself. I held out my hand to beckon one of the dinosaurs, and it did in fact walk right up to me and make a loud sound in my face. I “pet” it before it retreated. Another dinosaur appeared. I once again held out my hand, but that second dino ignored me. As the demo ended, I waved and heard myself say “bye bye.” (Did I mention I live with a toddler?) I then remembered there were other adults in the room, observing me use the headset, and felt sheepish. Which describes much of the Vision Pro experience, to be honest. You could maybe even say the same of any virtual reality headset worth its salt. It is immersive to the point that you will probably, at some point, throw decorum to the wind.
Final (ish) thoughts
Cherlynn: I had been looking forward to trying on the Vision Pro for myself and was mostly not disappointed. The eye- and hand-tracking systems are impressively accurate, and I quickly learned how to navigate the interface, so much so that I was speeding ahead of the instructions given to me. I’m not convinced that I’ll want to spend hours upon hours wearing a headset, even if the experience was mind-blowing. The device’s $3,500 price is also way out of my budget.
But of all the VR, AR and MR headsets I’ve tried on in my career, the Apple Vision Pro is far and away the best, and easily the most thought-out. Apple also took the time to show us what you would look like to other people when using the device, with a feature called EyeSight that would put a visual feed of your eyes on the outside of the visor. Depending on what you’re doing in visionOS, the display would show some animations indicating whether you’re fully immersed in an environment or if you can see the people around you.
Dana: The Vision Pro was mostly easier to use than I expected, and while it has potential as an all-purpose device that you could use for web browsing, email, even some industrial apps, its killer application, for now, is clearly watching movies (home videos or otherwise). I can’t pretend that Apple is the first to create a headset offering an immersive experience; that would be an insult to every virtual reality headset we’ve tested previously (sorry, Apple, I’m going to use the term VR). But if you ask me what it felt like to use the headset, particularly photo and video apps, my answer is that I felt joy. It is fun to use. And it is up to you if this much fun should cost $3,500.
Update, January 17 2024, 3:04PM ET: This article was edited to clarify the TV shows you can view in the Disney+ app's immersive environments. You can only watch Disney+ shows in the environments, like the Avengers Tower or the landspeeder on Tatooine. A previous misspelling of the word Tatooine was also corrected, and we clarified which head strap option was available at the WWDC demo.
This article originally appeared on Engadget at https://www.engadget.com/apple-vision-pro-hands-on-redux-immersive-video-disney-app-floating-keyboard-and-a-little-screaming-180006222.html?src=rss
The iRobot Roomba 694 drops to a record low of $160
Life's busy enough for many of us without having to get bogged down in day-to-day home maintenance. So, if you have some cash to spare, why not make things easier for yourself by splurging on a robot vacuum? Several iRobot Roomba models are up to 50 percent off in a sale on Amazon. Some have dropped to record low prices, including the iRobot Roomba 694. At $160, that model is 42 percent off its usual price of $275.
Although it's not the first time the 694 has dropped to this price, it's always worth calling out since this is our top pick for the best budget robot vacuum. It's easy to use thanks to the three onboard buttons and connected iRobot app. There's Alexa and Google Assistant support too, so you can instruct the vacuum to start cleaning with a voice command.
You can set cleaning schedules so that the 694 travels through your home on a regular basis to pick up any dirt, and we found that it does a solid job of lifting muck from carpets as well as hard flooring. Battery life varies depending on the type of flooring. According to iRobot, the device will run for up to 90 minutes while cleaning hardwood floors, but in our testing the battery lasted around 45 minutes when it was deployed over several types of surfaces. Of course, when it's time to recharge, the Roomba will return to its dock and juice up.
On the downside, you only get the essentials you need to get started — the 694 doesn't come with any replacement filters or brushes. But given that you'd be saving well over $100 on this model thanks to the sale, you might be able to set aside some funds to buy those when the time comes.
Those who are looking for some added features may be more interested in the Roomba j9+, which is also down to a record low. At $599, it's $300 off the regular price. According to iRobot, the Roomba j9+ "sucks up dust and debris better than any other robot vacuum." It also has a Dirt Detective feature through which it can learn the areas of your home that tend to get the dirtiest, so it knows which rooms to prioritize, the level of suction to apply and how many cleaning passes are likely to be needed.
The Roomba j9+ can detect and avoid common obstacles such as cables and socks. Perhaps most importantly, it can spot and stay clear of pet waste. This model will also automatically empty its bin, which means you have even less to worry about.
Follow @EngadgetDeals on Twitter and subscribe to the Engadget Deals newsletter for the latest tech deals and buying advice.
This article originally appeared on Engadget at https://www.engadget.com/the-irobot-roomba-694-drops-to-a-record-low-of-160-152516337.html?src=rss
The 2024 Moto G Play gives you a 50-megapixel camera for $150
Motorola has unveiled the 2024 Moto G Play, and its $150 price is still its killer feature. Although you can accuse Motorola of churning out too many nearly identical cheap phones, at least this year’s model adds several new perks. These include a fast-focusing 50-megapixel rear camera, a 6nm Snapdragon 680 processor and double the storage of its predecessor.
The 2024 Moto G Play still has a 6.5-inch LCD with a middling 720p (1600 x 720) resolution. (However, its variable 90Hz refresh rate, impressive for this price point, also returns from last year’s model.) The handset runs Android 13 out of the box, has 64GB of built-in storage and supports microSD cards up to 1TB.
The phone’s rear camera has a 50-megapixel sensor with f/1.8 aperture, quad-pixel tech, HDR and phase detection autofocus. It shoots video at 1080p (or 720p) at 30fps. On the front is an 8MP sensor.
The new Moto G Play has a 5,000mAh battery (rated for “up to 46 hours” of use) and supports 15W rapid charging. This year’s model adds Dolby Atmos support for its speakers, and it’s certified for hi-res audio when used with compatible wired headphones. The phone is rated IP52 for dust and water protection.
The Moto G Play launches in the US on February 8 for $150. Motorola says it will be available unlocked from its website, Amazon and Best Buy. (A locked version will be sold at various wireless carriers.) Meanwhile, Canadian users can buy it a bit earlier, starting on January 26 from “select carriers and national retailers.”
This article originally appeared on Engadget at https://www.engadget.com/the-2024-moto-g-play-gives-you-a-50-megapixel-camera-for-150-140031208.html?src=rss