Sonos laid off around 100 employees on Wednesday, as first reported by The Verge and confirmed to Engadget. Workers from the company’s marketing department reportedly bore the brunt of the cuts, which come as Sonos tries to simultaneously sell the public on its new Ace headphones and fix its rebuilt mobile app, whose troubled state CEO Patrick Spence admitted was the result of his push for development speed.
The company confirmed the layoffs in a statement to Engadget. “We made the difficult decision to say goodbye to approximately 100 team members representing 6 percent of the company,” Spence said. “This action was a difficult, but necessary, measure to ensure continued, meaningful investment in Sonos' product roadmap while setting Sonos up for long term success.”
The company is also reportedly “winding down” some customer support offices, including one in Amsterdam scheduled for shutdown later this year. Sonos’ LinkedIn page lists about 1,800 employees worldwide, though the six-percent figure quoted in the statement implies a total workforce closer to 1,700. The company’s last layoffs, in June 2023, slashed seven percent of its workforce.
Although Engadget’s review of the company’s new Ace headphones was largely positive, the app complaints overshadowed the highly anticipated hardware launch. The new app was designed to address “performance and reliability issues” and rebuild the developer platform with “modern programming languages that will allow us to drive more innovation faster,” but its launch has been a debacle. It’s created headaches for the company’s most loyal customers and threatened to drag down the brand as it pushes into new product categories. It even led to the delay of two new products that were otherwise ready to roll.
The new Sonos app for Android, iOS and desktop launched in May without core functionality like sleep timers and alarms. Customers reported problems rearranging speakers across rooms, speakers that worked only intermittently and trouble completing other basic tasks. Others said they often couldn’t even load the app on the first try.
Sonos
For a taste of how broken the app is, Spence laid out a timeline to repair it in a blog post late last month. July and August were dedicated to improving stability when adding new products and implementing Music Library improvements. An August and September window is reserved for improving volume responsiveness, user interface, stability and error handling. September and October will include tweaks to alarm consistency and reliability, and the restoration of editing playlists and queues. Improvements to settings will also be addressed. (Phew!)
In Spence’s statement about Wednesday’s layoffs, he said the cuts won’t affect the work on the app. “Our continued commitment to the app recovery and delighting our customers remains our priority and we are confident that today’s actions will not impact our ability to deliver on that promise,” the CEO wrote.
The announcement wasn’t received well by the company’s Reddit community, which has been vocal about the app’s problems since launch. Some argued the layoffs targeted 100 rank-and-file workers when one high-profile departure would’ve done the trick. “I have to say that, I didn’t have both feet in the door to fire Patrick Spence, but any CEO who leaves his employees hung out to dry and then signs the paper that lays them off is a scumbag piece of shit,” u/teryan2006 wrote.
“Since I took over as CEO, one of my particular points of emphasis has been the imperative for Sonos to move faster,” Spence said on a July earnings call. “That is what led to my promise to deliver at least two new products every year — a promise we have successfully delivered on. With the app, however, my push for speed backfired.”
Update, August 14, 2024, 4:56 PM ET: This story has been updated to add the statement from Sonos CEO Patrick Spence.
This article originally appeared on Engadget at https://www.engadget.com/audio/sonos-still-trying-to-fix-its-broken-app-reportedly-lays-off-100-employees-203224705.html?src=rss
Waymo driverless cars in San Francisco have been gathering in a parking lot at night and honking at each other, as reported by CBS News. Videos have begun circulating showing dozens of the vehicles sitting in the same lot, just honking away without a care in the world. This has, obviously, irked some human neighbors who need sleep.
Is this a sign of the forthcoming AI apocalypse, or is it some robotaxis learning how to flirt? Unfortunately for those looking for a “robots in love” narrative, it’s neither. It’s just a quirk in the vehicles’ safety software. Simply put, the software mandates a honk when another car gets too close while reversing. These particular Waymo taxis sit right next to one another in a cramped parking lot when not in use and, well, there you go.
"We recently introduced a useful feature to help avoid low-speed collisions by honking if other cars get too close while reversing toward us," the company said in a statement. "It has been working great in the city, but we didn't quite anticipate it would happen so often in our own parking lots.”
Waymo says that it has updated the software to address the issue, noting that “our electric vehicles should keep the noise down for our neighbors moving forward.” So that’s that. Another mystery solved.
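For a concrete picture of the behavior Waymo describes, here’s a minimal, purely illustrative Python sketch of a honk-if-reversing-too-close rule plus the kind of parking-lot exception the update presumably added. Every name and threshold here is hypothetical; this is not Waymo’s actual software.

```python
# Toy model of the proximity-honk rule described above, with a depot exception.
# All names and thresholds are hypothetical and for illustration only.
from dataclasses import dataclass

HONK_DISTANCE_M = 2.0  # hypothetical "too close" threshold, in meters

@dataclass
class Vehicle:
    position_m: float  # position along a lane, in meters (1D for simplicity)
    reversing: bool    # True if the car is backing toward us
    in_depot: bool     # True when parked in the company's own lot

def should_honk(me: Vehicle, other: Vehicle) -> bool:
    """Honk if another car is reversing toward us and gets too close,
    but stay quiet when both cars are in the fleet's own parking lot."""
    if me.in_depot and other.in_depot:
        return False  # the post-update behavior: no honking at fleet-mates
    too_close = abs(me.position_m - other.position_m) < HONK_DISTANCE_M
    return other.reversing and too_close

# Two robotaxis backing into neighboring spots at 4 a.m.
a = Vehicle(position_m=0.0, reversing=False, in_depot=True)
b = Vehicle(position_m=1.5, reversing=True, in_depot=True)
print(should_honk(a, b))  # False in the depot; True on a public street
```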
This article originally appeared on Engadget at https://www.engadget.com/transportation/waymo-driverless-cars-have-gotten-inexplicably-chatty-honking-at-one-another-all-night-162115440.html?src=rss
Following a major update back in the spring, the Sonos app was very broken and missing key functionality. The company admitted it made a huge mistake in pushing the redesign too soon and explained that it has since uncovered more “issues” that have prevented it from adding those missing features. Users were quickly frustrated, and now the company is so mired in fixing its app problem that it has delayed two products that were otherwise ready to launch ahead of the holiday season. Let’s discuss how we got here, what happened with the app, the consequences Sonos is facing and what likely happens next.
The backstory
Sonos released a completely rebuilt version of its app for Android, iOS and desktop in May. The total redesign was focused on making it easier to play different kinds of content while also creating a hub that’s better suited for finding what you need. Of course, it also has to work with the company’s various speakers and soundbars, and the overhaul took place ahead of the release of the first Sonos headphones. Those cans, the Ace, brought new functionality that had to be supported in the app, so the company thought it was time to wipe the slate clean.
“We viewed re-architecting the app as essential to the growth of Sonos as we expand into new categories and move ambitiously outside of the home,” CEO Patrick Spence said on the company’s Q3 earnings call. “In addition to its more modern user interface, the new app has a modular developer platform based on modern programming languages that will allow us to drive more innovation faster, and thus let Sonos deliver all kinds of new features over time that the old app simply could not accommodate.” He also noted that “performance and reliability issues had crept in” over the company’s history, so the user experience had already suffered as a result of the aging platform.
Billy Steele for Engadget
Some customers will always be resistant to change. The new customizable interface took some getting used to, but that wasn’t the problem. The app was missing basic features like sleep timers and alarms. Users also reported being unable to rearrange speakers across rooms, speakers that worked only intermittently and trouble completing other basic tasks. Some said they couldn’t reliably load the app on the first try.
“We developed the new app to create a better experience, with the ability to drive more innovation in the future, and with the knowledge that it would get better over time. However, since launch we have found a number of issues,” Spence explained in July. “Fixing these issues has delayed our prior plan to quickly incorporate missing features and functionality."
Spence laid out a roadmap for fixing the problems in the same blog post, which initially included restoring the ability to add new products to your home setup. Even I had trouble adding the Ace headphones to the app at first, but after a few tries I eventually got it. Spence also said that the company had released updates to the app every two weeks since the redesign launched on May 7, and that it would continue that schedule alongside detailed release notes. The most recent version, which included TV Audio Swap with the Ace headphones and older Sonos soundbars, delivered a handful of very basic things — like the ability to clear the queue on the iOS version.
“Since I took over as CEO, one of my particular points of emphasis has been the imperative for Sonos to move faster,” Spence said on the earnings call. “That is what led to my promise to deliver at least two new products every year — a promise we have successfully delivered on. With the app, however, my push for speed backfired.”
The fallout
Sonos
Customers have been understandably upset since early May, which is well-documented in the r/sonos subreddit. But, user satisfaction isn’t the only issue that the company is facing. With the new version of the app so busted it had to be fixed before missing items could be added, Sonos has delayed the launch of two new products that were ready to go on sale in Q4. Spence said that the company enlisted “the original software architect of the Sonos experience,” Nick Millington, to do “whatever it takes” to remedy the issues.
And it’s not just a headache for customers. Sonos dealers and installers, who make up a significant part of the company’s business, are reportedly so frustrated with what they encounter trying to do their jobs that some of them have paused sales. “As an installer when you try and talk this up to somebody to buy, it is extremely embarrassing at this point when you have to just say ‘well, the parent company is having issues,’ it makes you look like the jackass,” one Reddit post explained.
Spence explained on the company’s Q3 earnings call that the app debacle would cost between $20 million and $30 million. CFO Saori Casey said that sum is due to the loss of sales stemming from the software problem and the delay of two new products that would’ve generated more revenue. As such, Sonos had to revise its Q4 financial guidance to lower expectations until this “chapter,” as Spence describes the current fiasco, is resolved.
While the company hasn’t revealed any details on what either of those two now-delayed products might be, there have been rumors that offer some clues. Bloomberg reported in November that the company was working on a soundbar that would surpass the capabilities of its current flagship, the Arc, in addition to a set-top TV streaming box. Both of those products would be hot-ticket items for the holidays, especially for the Sonos faithful.
Sonos Arc
Kyle Maack/Engadget
Bloomberg explained that the new soundbar would likely be $1,200, or $300 more than the Arc. Additionally, the new model would include “new technology” following the company’s acquisition of Mayht Holding BV. In Sonos’ announcement of that move, it said that Mayht “invented a new, revolutionary approach to audio transducers.” More specifically, the company’s engineering methods allow it to build transducers, a basic component of speakers, that are smaller and lighter without sacrificing quality.
According to that same Bloomberg report, the Sonos TV streaming box will be powered by an Android-based OS with various apps for different services. The device is also said to act as a hub for Sonos gear in the home. Dolby Atmos and Dolby Vision should be on the spec sheet, and voice control will reportedly play a significant role in the gadget’s operation.
Bloomberg further explained that Sonos was working on a new high-end amplifier that could cost at least $3,000 and new eight-inch ceiling speakers. Frustrated dealers and installers would likely be hesitant to sell such expensive items with the app in the current state, if Sonos could even add the functionality to make them work. Any of these could be the two products the company was forced to delay, and any of them would’ve likely padded the bottom line before the end of the year. And that would be on top of the early sales of the Ace, a device that should be popular this holiday season.
What happens next?
Sonos has a clear plan for how it intends to fix its buggy app, but there’s no timeline for exactly how long that will take. The company has yet to deliver a TrueCinema feature for its Ace headphones that will map a room (with the aid of a soundbar) to virtually recreate the acoustics for a more realistic experience. And as we learned recently, new products have been delayed for the time being.
“Building a new software foundation was the right investment for the future of Sonos, but our rollout in May has fallen dramatically short of the mark,” Spence said on the earnings call. “We will not rest until we’ve addressed the issues with our app, and have delivered new versions that materially improve our customers’ experiences.”
The company is at a standstill until its app is fixed, with a full product pipeline plugged up for the foreseeable future – if the reports are to be believed. From the looks of it, Sonos planned to release an app that wasn’t completely done, but one that it thought was stable enough to use, with the goal of adding features over time to improve the overall experience. However, the complete rebuild was essential to the new technology and devices Sonos has in the works, since the company has been clear the old app wouldn’t have supported them. And since one of those items was the Ace, Sonos had to decide if it could wait longer to release its first headphones, a highly-anticipated product, or just go ahead with what it had.
This article originally appeared on Engadget at https://www.engadget.com/audio/speakers/why-is-the-sonos-app-so-broken-140028060.html?src=rss
Spotify can now show its users in the European Union how much its plans cost within its iOS app after their trial period ends. The company has revealed that it's opting into Apple's "entitlement" for music streaming services in an update to an old blog post. This "entitlement" was created after the European Commission slapped Apple with a €1.8 billion ($1.95 billion) fine back in March for restricting alternative music streaming apps on the App Store. The commission's decision followed an investigation that was opened when Spotify filed a complaint against the tech company, accusing it of suppressing its service in favor of iTunes and Apple Music.
Apple initially rejected the update that Spotify submitted in April this year to add "basic pricing and website information" to its app in Europe. Now that Apple has approved the changes, users will be able to see pricing information, as well as promotional offers, within the Spotify app for iPhones. They'll also see a note saying that they can go to the Spotify website to subscribe to any of the service's plans. However, the service chose not to provide users with an in-app link that would give them access to external payment options. As The Verge notes, that's because Apple recently tweaked its App Store rules in the EU, stating that it will still take a cut of developers' sales even if customers pay via third-party providers.
"Unfortunately, Spotify and all music streaming services in the EU are still not able to freely give consumers a simple opportunity to click a link to purchase in app because of the illegal and predatory taxes Apple continues to demand, despite the Commission’s ruling," Spotify wrote in its post. It added that "if the European Commission properly enforces its decision, iPhone consumers could see even more wins, like lower cost payment options and better product experiences in the app."
This article originally appeared on Engadget at https://www.engadget.com/entertainment/music/apple-finally-allows-spotify-to-display-pricing-in-the-eu-123010178.html?src=rss
Google’s Pixel 9 lineup is powered by cutting-edge hardware like the Tensor G4 processor and tons of RAM that should help keep your phone feeling fast and fresh for years to come. But all that hardware is also designed to power brand new AI experiences.
“Android is reimagining your phone with Gemini,” wrote Sameer Samat, Google’s president of the Android Ecosystem, in a blog post published on Tuesday. “With Gemini deeply integrated into Android, we’re rebuilding the operating system with AI at the core. And redefining what phones can do.”
Here are the big new AI features coming with the new Pixel devices.
Gemini overlays and Gemini Live
Gemini, Google’s AI-powered chatbot, will be the default assistant on the new Pixel 9, Pixel 9 Pro, Pixel 9 Pro XL and Pixel 9 Pro Fold phones. To access it, simply hold down your phone’s power button and start talking or typing in your question.
A big new change is that you can now bring up Gemini on top of any app you’re using to ask questions about what’s on your screen, like finding specific information about a YouTube video you’re watching, for instance. You’ll also be able to generate images directly from this overlay and drag and drop them into the underlying app, as well as upload a photo into the overlay and ask Gemini questions about it.
Google
If you buy the pricier Pixel 9 Pro (starting at $999), Google is bundling in one free year of the Google One AI Premium Plan, which typically runs $19.99 a month and includes 2TB of cloud storage plus Gemini Advanced, which lets you use Gemini directly in Google products like Gmail and Docs to help you summarize text and conversations.
Crucially, Gemini Advanced also includes access to Gemini Live, which Google describes as a new “conversational experience” meant to make speaking with Gemini more intuitive (I’m not the only one having a hard time keeping track of all the things Google brands “Gemini,” don’t worry). You can use Gemini Live to have natural conversations about anything on your mind, and Google says it can help with complex questions or prepping for job interviews. You can also choose between a variety of voices that sound stunningly lifelike, according to demos Google showed Engadget earlier this month.
Google
Recently, OpenAI released Advanced Voice Mode, a similar feature, to paying ChatGPT customers with a voice assistant that can talk, sing, laugh and allegedly understand emotion. When asked if getting Gemini Live to sound as human-like as possible was one of Google’s goals, Sissie Hsiao, the company’s vice president and general manager of Gemini Experiences, told Engadget that Google was “not here to flex the technology. We’re here to build a super helpful assistant.”
Photos and Camera features
Google is using AI to make both taking and editing pictures dramatically better with the Pixel 9 phones, something it has focused on for years now. A new feature called Add Me, which will be released in preview with the new devices, will let you take a group photo, then take a separate picture of the photographer and seamlessly merge them into the main shot — handy if you don’t have anyone around to take a picture of your entire group.
Meanwhile, Magic Editor, the built-in, AI-powered editing tool on Android, can now suggest the best crops and even expand existing images by filling in details with generative AI to get more of the scene. Finally, a new “reimagine” feature will let you add elements like fall leaves or make grass greener — punching up your images, yes, but blurring the line between which of your memories are real and which are not.
Circle to Search now lets you share
You can already search anything that you see on your phone by simply circling it, but now, AI will intelligently clip whatever you’ve circled and let you instantly share it in a text message or an email. Handy.
Google
Pixel Screenshots
Google
If you can't figure out how to sort through the tons of pictures of receipts, tickets and screenshots from social media littering your phone's photo gallery, use AI to help. A brand new app called Pixel Screenshots, available on the new Pixel devices at launch, will go through your photo library (once you give it permission), pick out screenshots and then identify what's within each picture. You can also snap pictures of real-world signs (a poster for a music festival you want to attend, for example) and ask the app relevant questions, like when tickets for the festival go on sale.
Call Notes
A new feature called Call Notes will automatically save a private summary of each phone call, so you can refer back to a transcript later to quickly look up important information like an appointment time, address or phone number. Google notes that the feature runs fully on-device, which means nothing is sent to Google's servers for processing. And everyone on the call will be notified if you've activated Call Notes.
Pixel Studio
Google
We've been able to use AI to generate images for a long time now, but Google is finally building the feature right into Android thanks to Pixel Studio, a dedicated new image-generation app for Pixel 9 devices. The app relies on both an on-device model powered by the new Tensor G4 processor and Google's Imagen 3 model in the cloud. You can share any images you create in the app directly through messaging or email.
A similar feature called Apple Image Playground is coming to newer iPhones with iOS 18 in September.
Custom weather reports
Google will use AI to create custom weather reports for your specific location right at the top of a new Weather app so you "don't have to scroll through a bunch of numbers to get a sense of the day's weather," according to the company's blog post.
This article originally appeared on Engadget at https://www.engadget.com/ai/here-are-all-the-ai-features-coming-to-the-pixel-9-phones-173551511.html?src=rss
August used to be a relatively sleepy month for tech news — no longer! Now that Google has scooted its annual October Pixel event up by two months, the tech world is abubble, going over everything execs announced from Mountain View, California, at the Made By Google keynote on Tuesday.
The Pixel 9 launch event came with enthusiastic introductions for all the hardware we expected, including the new Pixel 9 and its sizable camera bump. The Pixel 9 Pro and the larger Pixel 9 Pro XL made their official debut, too, and the new foldable, the Pixel 9 Pro Fold, does indeed measure just 0.4 inches thick. The new Pixel Watch 3 and Pixel Buds Pro 2 were also revealed, along with plenty of software features, mostly in the form of Gemini integrations. One surprise was the quietly stated fact that Pixel 9 phones won't launch with the Android 15 operating system — they'll have Android 14 to start. For the play-by-play, you can read our liveblog or watch Google's stream. If you just want the highlights, here's everything announced at the 2024 Made by Google Pixel event.
Google Pixel 9
Photo by Sam Rutherford / Engadget
What you first notice about Google's latest Pixel 9 is the redesigned look. The camera band has been replaced with an oblong oval that stands proud from a slab that's about a tenth of an inch thinner than the Pixel 8. The screen size has bumped back up to 6.3 inches, after dipping to 6.2 inches on the Pixel 8, and is covered in Corning Gorilla Glass Victus 2. It's got a polished glass back with satin metal finishes on the frame and cameras.
Speaking of cameras, there's the same number as last year (two in the back, one in the front) but the previous generation's 12 MP ultrawide lens has been replaced with a 48 MP ultrawide lens. The other two cameras have the same specs, except the front cam now has autofocus for better selfies. To take advantage of those fancy sensors, new AI photography enhancements like Add Me and Reimagine join the existing Magic Editor, Night Sight and Best Take features.
The Pixel 9 houses the same Tensor G4 chip as its more expensive siblings, designed to be better at everyday tasks while using less battery. The base model now comes with 12GB of memory, eliminating the option of an 8GB model. That extra RAM will help handle the many Gemini integrations coming standard in Android 15. Pressing and holding the power button will overlay the assistant on whatever you're doing on-screen, where it can answer questions, pull details from other apps and produce contextualized recommendations based on images you take.
Engadget's Sam Rutherford spent some time hands-on with the Pixel 9 family of phones and so far, likes what he sees, noting that the new designs "look great" and the AI tools and features are shaping up to be useful iterations on what can otherwise seem like a buzzy bandwagon add-on.
The Pixel 9 comes with seven years of OS and security updates and is available in Obsidian, Porcelain, Wintergreen and Peony. It starts at $799 for 128GB of storage, pre-orders are now open and all Pixel 9 phones will hit shelves August 22.
Google Pixel 9 Pro and Pixel 9 Pro XL
Photo by Sam Rutherford / Engadget
Those not content with a standard-issue phone can opt for the Pixel 9 Pro or the Pixel 9 Pro XL instead. Notably, the Pro moniker doesn't necessarily mean bigger this year; the Pixel 9 Pro is the same size as the regular Pixel 9, both with 6.3-inch screens. Google created a new category in its lineup with the Pixel 9 Pro XL — a phone with the same general specs as the Pro model but with a larger, 6.8-inch display and a 5,060 mAh battery (versus the 4,700 mAh battery on the smaller version).
All three Pixel 9 models use the same Google Tensor G4 processor, but the two Pro phones have 16GB of RAM on hand to execute AI tricks and any other task you might demand from them. You can get either phone with 128GB of storage or a full terabyte. The Pro models also pack an additional 48 MP telephoto lens in the back and a heftier 42 MP selfie camera up front.
Both come with a year's subscription to the Google One AI Premium Plan which lets you access all of the tricks Gemini can do — after the free trial, you'll need to pay $20 monthly (the plan also comes with 2TB of storage).
The Pixel 9 Pro starts at $999 and the Pixel 9 Pro XL starts at $1,099. Both come in the same four colors: Obsidian, Porcelain, Hazel and Rose Quartz, and include a promised seven years of security and features updates. Like everything announced at the event, the phones are now open to pre-orders and will be on store shelves August 22.
Google Pixel 9 Pro Fold
Photo by Sam Rutherford / Engadget
We now officially know that Google's second foldable phone is not called the Pixel Fold 2, but rather the Pixel 9 Pro Fold. To go along with the enlarged name, there's a bigger, eight-inch inner screen, making it the largest on any phone out there. The outer screen is larger too, measuring 6.3 inches, up from 5.8 inches last year. Google claims the interior screen is 80 percent brighter than its predecessor and now maxes out at 2,700 nits.
A persistent complaint with foldables is how heavy and bulky they can feel. Google hopes a few design tweaks will help with that. The Pixel 9 Pro Fold now measures just 0.4 inches when closed, making it the thinnest foldable on the market (as long as you don't count the sizable camera bump) and at 257g it's about 25 grams lighter than the Pixel Fold.
The new foldable houses Google's Tensor G4 chip, comes standard with 16GB of RAM and offers your choice of 256GB or 512GB of storage. Like every piece of 2024 Pixel hardware, the 9 Pro Fold is tailored around Google's Gemini AI contrivances. Pressing the power button brings up the assistant, which you can use in split screen on the foldable. The phone also comes with a year of the One AI Premium plan, which jumps to $20 per month afterwards.
The three exterior cameras include a wide, ultrawide and telephoto lens with 5x optical zoom and up to 20x Super Res Zoom. There's a 10 MP camera on both the interior and on the front of the exterior screen. And, thanks to the foldable nature of the phone you can take selfies using the more powerful rear cameras by checking out the preview of the shot on the exterior screen.
We've already spent a little time with the new foldable and, so far, like what we see. It's thinner than the Z Fold 6 but packs a larger interior screen. And the AI tools the foldable enables actually seem useful.
The Google Pixel 9 Pro Fold costs the same $1,799 as the 2023 model and comes in either Obsidian or Porcelain. It too is now available for pre-order and will hit stores September 4.
Google Pixel Watch 3
Photo by Sam Rutherford / Engadget
Last year we said the Pixel Watch 2 was "catching up to its rivals," but still took issue with the disjointed Fitbit integration and the lack of wireless charging. Fitbit is still very much a part of the Pixel Watch 3 experience and charging still requires a cable — our full review will tell us whether those are dealbreakers or not.
The watch now comes in two sizes, with a larger 45mm case size joining the 41mm model. Thanks to thinner bezels, the 41mm display is 10 percent larger than on the Pixel Watch 2 and the 45mm screen is 40 percent larger. Both screens peak at 2,000 nits, which is twice as bright as 2023's watch, and both get as dim as 1 nit.
There's a new readiness score and cardio load tracking, which sounds a bit like the Training Load feature in Apple's watchOS 11 — all of which give you feedback on how hard you're pushing yourself. New integrations include displaying a live feed of your Nest cams from your watch and using the wearable as a Google TV remote. The battery offers the same 24 hours of use on a charge, but Google claims recharging will be 20 percent quicker with a 30-watt wall adapter (sold separately). Call Assist will add the "hold a minute" ability, which can answer your call and ask the caller to wait until you're in a better spot or have set down whatever you're working on.
The heart rate tracking has been updated to work more accurately while running, an activity that's particularly hard to track. Readiness score and Cardio Load combine to give you a Daily Readiness score, which tells you how intensely you should work out on a given day.
The "first of its kind" Loss of Pulse Detection feature will automatically call emergency services and direct them to your location if the algorithm detects a dangerous situation judging by your pulse, movement and other metrics. It will start out in select EU countries and parts of the UK, with more regions to come.
The 41mm Pixel Watch 3 retails for $349 for WiFi only and $449 with LTE. The 45mm model goes for $399 or $499 if you get cellular connectivity. Both come in your choice of black or silver, with an added hazel hue for the larger case size. You can pre-order them now and the watches will be on the shelves on September 10.
Google Pixel Buds Pro 2
Photo by Sam Rutherford / Engadget
Despite being smaller and lighter, the Pixel Buds Pro 2 somehow manage to pack an extra hour of battery life compared to the Pixel Buds Pro, now getting up to 12 hours of play with noise cancellation off. The Silent Seal feature is back and now should reduce twice as much noise as before. Plus they'll support Spatial audio with head tracking — but only when paired with a Pixel 6 or newer phone or a Pixel Tablet.
The Tensor chip inside, the first in a pair of Google earbuds, enables the new features and the company claims it can process audio significantly faster in order to adapt to your environment. And thanks to multi-path processing, noise-cancellation computations don't happen on the same channel as the audio, so the music you hear is unaltered.
New "twist-to-adjust" stabilizers should keep the buds in place when you're working out, but can be rotated in the other direction for a more comfortable feel. Conversation Detection is back, meaning you won't have to pull out a bud when you need to talk to someone — instead the music pauses when you start speaking and resumes when you're done. The Buds claim to be the lightest noise cancelling earbuds in their class.
And of course, Gemini is built in. You can do things like ask for walking directions or access your email. The buds also support Gemini Live, which Google's Sandeep Waraich demonstrated on stage using the prompt "Let's talk live." Gemini suggested things like breathing techniques to stay calm in a crowd and tips on how to approach someone you admire.
The Pixel Buds Pro 2 are available to pre-order now and will be on shelves on September 26. They come in Porcelain, Hazel, Wintergreen and Peony, and are selling for $229, which is $29 more than the Pixel Buds Pro were at launch.
Google Pixel Screenshot app
The new Pixel Screenshot app is only available on Pixel 9 phones at launch (no word yet on wider availability) and uses Gemini Nano (the on-board AI model) to save, extract and organize info you might otherwise forget the origin of. For example, you can take a screenshot of an Instagram post about a music festival and the AI will give you a summary as well as buttons to do things like add dates to your calendar the next time you access that screenshot from the app. A screenshot of a restaurant should produce options to call the business or navigate there via Google Maps.
Google says you should also be able to ask natural questions using the microphone, and Pixel Screenshots will answer them outright and bring up relevant annotated images. The app launches today along with everything else, and while it's reminiscent of Apple's redesigned Photos app, we'll have to try it for ourselves to really suss out the differences.
Android 15 and Gemini AI
The one thing we expected that didn't come into play was the launch of Android 15 — we heard a little about the new operating system during May's Google I/O event. But now we know it's not coming right away: new Pixel phones will launch with Android 14. The Google execs didn't hit that fact very hard, but a look at the Pixel 9 spec sheet confirms it. It's possible the new OS simply wasn't ready for a bumped-up release date the way the hardware was, and Android 15 may very well be sticking with its October launch window. We won't be surprised to see the OS drop later this fall.
When it comes, Android 15 will, unsurprisingly, revolve around giving Gemini the reins and letting Google's AI do your bidding. But all of those AI features will still be available when the phones launch, even if they're running Android 14.
Google has revamped its Assistant around Gemini. Google hardware chief Rick Osterloh said, "It's the biggest leap forward since we launched Google Assistant." And Google promises that the assistant won't just be for fancy new flagship devices, but existing mid-range ones as well — and not just Google phones, but all Android phones. Some of the event's live demos were performed on Samsung and Motorola handsets (although there were a few hiccups). Android president Sameer Samat called Gemini the "most widely available AI Assistant."
Gemini can be pulled up over whatever app you're using and answer your questions about what's on screen. Circle to Search now lets you quickly send whatever you happen to be looking at to your contacts with just a couple of taps.
You can now also ask Gemini to access files in your Drive or messages in your Gmail account to generate text-based content like bios, workout plans or itineraries. To keep your personal details private, requests involving the most sensitive info are handled by Gemini Nano, an AI model that lives on your phone.
Other examples included asking Gemini to identify the foods a YouTuber ate in a video and then add them to a list, or asking it to create a playlist described by vibes. With Gemini Live, you get a few new voices, including Dipper, Ursa and Vega. Gemini Live hits hard on the conversational aspect, and the responses did indeed sound natural — especially since you don't have to keep saying "hey Google."
Pixel Weather, Call Notes, Made You Look
Pixel Weather, a redesigned weather app for the Pixel 9 family, comes with a handy AI summary and is completely customizable. Call Notes can give you an AI-powered summary of a call after you hang up, and you can even review the full transcript. For privacy, the transcriptions and summaries are handled on-device instead of being sent to the cloud. The feature is, of course, completely optional.
Other features revealed in Mountain View include the Add Me feature in photos — which NBA All-Star Jimmy Butler came on stage to demonstrate. The AI trick allows the photographer to be in the shot: you take one photo without them, then another with them in place, and AI merges the two, ensuring reality's increasing subjectivity.
Pixel Studio can create images using text prompts and suggestions for different styles and fonts. Google's Alexander Schiffhauer noted that, thanks to RCS coming to iPhone, these and other images will appear the same for everyone in text message conversations.
On the Google Pixel 9 Pro Fold, the Made You Look feature will get your toddler to look at the camera and smile by displaying fun cartoon characters on the exterior screen while you take the image with the rear camera. Magic Editor combines classic photo editing with AI interpretations and generative AI capabilities so you can, for instance, add wildflowers and a hot air balloon to an image you took where the grass was boring and the sky was clear.
We also got a quick look at where Google's AI research is leading. Google hardware chief Rick Osterloh told us that Gemini will continue to evolve, particularly through integration with Project Astra, the Google DeepMind research prototype demonstrated at I/O earlier this year. The goal is an even more natural and context-aware assistant, allowing Gemini Live to understand your questions and photos so it can help you get things done, from homework and brainstorming to more complex tasks like figuring out how to open a business.
This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/the-google-pixel-9-the-pixel-9-pro-fold-and-everything-else-announced-at-the-made-by-google-pixel-launch-event-170033517.html?src=rss
About 50 percent of my photo album is receipts. That is, screenshots of everything I consider even mildly interesting. Whether it’s Uber drivers who never seem to be getting closer, hot tea from my friend’s Instagram Stories or unfathomable email threads, my gallery is full of unexplainable internet detritus. Best of all, just from viewing their thumbnails, I can never know where exactly a specific image is, because walls of text all look the same from afar. So when Google announced its new Pixel Screenshots app at its Made By Google event today, I was excessively excited.
The Screenshots app launches alongside the Pixel 9, Pixel 9 Pro and Pixel 9 Pro Fold, and uses Gemini AI to help locate specific images. After you grant the app access to your photos, the AI will not only ingest files it thinks are screenshots, but also start identifying what’s within each picture.
On the home page, you’ll see a row at the top called “Collections,” with a series of pre-organized snaps like “Gift Ideas,” “Boots” or “Places to visit.” These can be curated by you or suggested by the system.
Below this row is a grid of all your most recent captures, and at the bottom is a search bar and a Plus symbol next to it. Pressing that symbol will let you either launch the camera or import a photo from your album. This is helpful for pictures you’ve taken of real-world signs that contain information you want Gemini AI to help remember.
Tapping each screenshot in this app will expand the image and bring up a title, summary and buttons based on its contents. These are all AI-generated, so if you’re looking at a picture of a music festival’s Instagram post about upcoming dates, the title might say “Lollapalooza headline acts” with buttons to add specific events from that picture to your calendar. If you’ve pulled up an image of a restaurant’s website, then Screenshots might offer shortcuts to call the shop or navigate to the business address via Maps.
From the home page, you can either type into the search bar or tap the microphone icon in it and ask Google for things like “What was Sam’s WiFi password?” or “How much do I owe Cherlynn?” The app will scour your gallery and not only return images with possibly relevant info, but also attempt to answer your question up top. In the demo I saw at a recent hands-on event, a Google rep asked the app “When do the tickets for the festival go on sale?”
Screenshots responded almost instantly by pulling up a picture of a folk festival’s Instagram post, and seconds later showed the words “The tickets for the festival go on sale on August 5th.” This example was particularly impressive as there were multiple dates noted in the screenshot, one for the ticket sales starting and one for the festival itself kicking off. From the same interface, the company’s rep was able to get the Pixel 9 to set a reminder to buy the tickets in time.
It’s kind of a coincidence that Google is launching this app today, considering Apple’s redesign of its Photos app also pays extra attention to organizing and filtering out screenshots. My experience of both approaches is extremely limited at the moment, but currently I slightly prefer Google’s Screenshots app. It feels like a more focused and deliberate way to look for information and get help from AI, rather than getting distracted by my million selfies in the Photos app on my iPhone when I’m trying to look for a bank statement.
Using AI to make sense of our screenshots feels like a smart move, though there are of course privacy concerns. Microsoft already had to hit pause on the rollout of its Recall feature, which was supposed to remember everything you were doing on your computer by taking screenshots every few seconds. Google’s Screenshots app uses Gemini Nano, its on-device AI model for local processing, and the company says the feature won’t send your screenshots off your device (beyond the backups you might already have opted into via Google Photos).
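For a rough sense of what a local-only flow like that might look like, here’s a minimal, purely illustrative Python sketch: text pulled from screenshots is indexed on the device and queries are answered with no network calls. The class names and the plain keyword match standing in for an on-device model are all hypothetical, not Google’s actual implementation.

```python
# Toy on-device screenshot index: everything stays in local memory/storage.
# A trivial keyword match stands in for the on-device model's understanding.
from dataclasses import dataclass, field

@dataclass
class Screenshot:
    path: str
    text: str  # text an on-device model would extract from the image

@dataclass
class LocalScreenshotIndex:
    items: list = field(default_factory=list)

    def add(self, shot: Screenshot) -> None:
        self.items.append(shot)  # nothing leaves the device

    def search(self, query: str) -> list:
        terms = query.lower().split()
        return [s for s in self.items
                if any(t in s.text.lower() for t in terms)]

index = LocalScreenshotIndex()
index.add(Screenshot("IMG_001.png", "Folk Fest tickets on sale August 5th"))
index.add(Screenshot("IMG_002.png", "Cafe WiFi password: hunter2"))

for hit in index.search("when do festival tickets go on sale"):
    print(hit.path, "->", hit.text)  # prints only the festival screenshot
```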
The Pixel Screenshots app will be on the Pixel 9 family at launch, and the company has nothing to share on wider availability at the moment. But based on how Google has launched and rolled out apps like Recorder in the past, it’s likely that older Pixel devices will get Screenshots in time, as long as it’s received well by users.
This article originally appeared on Engadget at https://www.engadget.com/apps/the-pixel-screenshots-app-uses-ai-to-scour-the-screengrabs-i-cant-remember-why-i-saved-170043423.html?src=rss
With the new Pixel 9, Google is continuing its push for more AI-powered features while also developing a more durable design and addressing one of my longest-running requests: the addition of a more compact Pro model.
The new Pixel 9 family
Unlike previous generations, the Pixel 9 line will now be divided across three handsets. There’s the base Pixel 9 which features a 6.3-inch screen, the 6.8-inch Pixel 9 Pro XL and the newest member of the family: the Pixel 9 Pro, which has all the same features as the Pro XL but in a smaller chassis with a 6.3-inch screen. In essence, the P9 Pro is for everyone who always wanted the extra telephoto camera you got on previous top-tier Google phones, but without the need to upgrade to a physically larger device.
The standard Pixel 9 will be available in four colors: obsidian, porcelain, wintergreen and peony.
Photo by Sam Rutherford/Engadget
All three models are powered by Google’s latest Tensor G4 chip. While it has the same size screen as the regular Pixel 9, the P9 Pro’s display sports a slightly higher resolution (1,280 x 2,856 versus 1,080 x 2,424) and better peak brightness (3,000 nits versus 2,700 nits). It also gets more storage options (up to 1TB) and 16GB of RAM instead of the 12GB on the base model. However, both the Pixel 9 and Pixel 9 Pro feature the same 4,700 mAh batteries, so longevity should be quite comparable.
Updated styling
Now that we’ve got that out of the way, we can look at the line’s updated design. Every model features Gorilla Glass Victus 2 on the front and back along with a boxier frame that Google claims is twice as durable as the previous generation. There are also a few small cosmetic differences, such as different color options (most notably peony on the Pixel 9 and rose quartz on the Pro and Pro XL) and a matte satin finish on the base model versus a shiny, polished treatment for the more expensive Pro phones.
From the front, it almost looks like Google is stealing a page out of Apple’s playbook, as both the Pixel 9 and iPhone 14 have similar silhouettes with flat sides and rounded corners. But everything changes when you flip the phone around. Gone is the camera bar that recent Pixels had become known for and in its place is a tall pill-shaped module that looks kind of like a visor. It’s almost like what an Among Us character would look like if you made one into a phone.
For 2024, Google's top-tier phone will be split into two models: the 6.3-inch Pixel 9 Pro (left) and the 6.8-inch Pixel 9 Pro XL (right).
Photo by Sam Rutherford/Engadget
But more importantly, Google has upgraded the Pixel 9 line’s camera sensors with a new 50-MP main camera, a 48-MP ultra-wide that can also shoot macros and, for the Pro and Pro XL, a third 48-MP cam with a 5x telephoto zoom. Unfortunately, it was hard to get a good sense of how much image quality has improved during my short hands-on session, but I’d argue the biggest improvements are some of Google’s new camera features anyway.
This includes the debut of Zoom Enhance, which was originally teased back during the launch of the Pixel 8 but hadn’t been officially released until now. It takes soft blurry images and uses AI to increase both detail and sharpness. But the most impressive thing is that it seems to deliver on the TV show magic from series like CSI, where you can just press a button and suddenly a blurry pic becomes clear as day.
New software and camera features
In Google’s Magic Editor, there are two additions called Autoframe and Reimagine. The former relies on machine learning to analyze existing shots and recompose them to better highlight the subject or their surroundings while filling in the blanks, much like the Content-Aware Fill tool in Photoshop. Meanwhile, the latter can add new elements to a photo (it works best on foregrounds and backgrounds) simply by typing something in the prompt box. During our session, I replaced a road with a raging river, with surprisingly good results.
There’s also the Add Me tool, which uses augmented reality guides to help you shoot two group shots with different people holding the phone before merging everything together. This means that everyone can be in the final image without needing to ask a stranger for help. And as an expansion of last year’s Video Boost tool, you can now shoot videos with up to 20x zoom or clips with up to 8K resolution.
But perhaps the most intriguing new software comes in two exclusive standalone apps: Pixel Screenshots and Pixel Studio. Pixel Screenshots is very straightforward: it uses AI to analyze and search through all your saved screenshots so you can easily retrieve information like reservations, things mentioned in a text or anything else. That said, unlike Microsoft’s Recall feature in Windows 11, the Pixel 9 doesn’t create and save screenshots automatically; you have to do that on your own. This potentially sidesteps some of the more pressing security concerns, especially as everything in the Pixel Screenshots app happens on device.
The new Pixel Screenshots app on the Pixel 9 uses on-device AI to help search and organize all your screencaps.
Photo by Sam Rutherford/Engadget
Alternatively, for people who want to create brand new images, the Pixel Studio app uses AI to generate pretty much anything you can think of. You can even make custom stickers with your friends’ faces and combine them with other materials to create things like invitations.
Elsewhere, the Pixel Weather app features new AI-generated summaries of the day’s conditions, while updates to Clear Calling and the new Call Notes feature allow you to better hear and transcribe what’s being said. Finally, for more adventurous folk, Google’s Satellite SOS feature will allow you to text emergency services for help even when your phone doesn’t have a cellular or Wi-Fi connection. The service will be free for the first two years, though it remains to be seen how much it will cost after that.
Early impressions
Photo by Sam Rutherford/Engadget
All in all, the new Pixel 9 family isn’t a major departure from last year’s phones. That said, I think Google’s revamped designs look great and there are a ton of individual features and tools that seem quite powerful. So even if you don’t have plans for all of them, stuff like Satellite SOS, Add Me or Pixel Studio could make or break certain situations. Combine that with best-in-class image quality and great screens with top-notch brightness. But the biggest ongoing development is how Google continues to build out its library of class-leading software and services. The Pixel is already home to powerful features like the Pixel Recorder, Call Screen and more, and now it’s getting support for Satellite SOS plus apps like Pixel Studio, which is essentially a self-contained alternative to services like Midjourney. Year by year, it feels like Google is continuing to grow its lead in AI and software.
The standard Pixel 9 starts at $799 and will be available in four colors: obsidian, porcelain, wintergreen and peony. The Pixel 9 Pro and 9 Pro XL start at $999 and $1,099 respectively, and will be available in obsidian, porcelain, hazel and rose quartz. Pre-orders go live today with official sales beginning on August 22.
This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/google-pixel-9-and-9-pro-hands-on-a-smart-evolution-and-a-smaller-pro-model-170015733.html?src=rss
Opera One, the browser with a focus on generative AI features that Opera launched for desktop last year, is now available for iOS devices. It retains its desktop counterpart's cleaner look, but it comes with a full screen interface and features specifically designed for mobile use. The company said it experienced a 63 percent growth in new users across the European Union after the Digital Markets Act was implemented, and now it has "embraced the opportunities presented by the new regulatory landscape."
Users will be able to move the search bar to the bottom of the screen if that makes it easier to type in queries on the go, especially one-handed. They can also activate the search bar simply by swiping down, the same way they'd swipe down to look for apps on their phone. In addition, the browser's updated search function can make it faster to look up information: As soon as they start typing, a set of predictive chips will show up right above the keyboard with several possible options, including complete URLs for websites they may want to visit. The colors of the browser's top bar and bottom search bar change to blend in with the website the user is visiting, and both bars disappear when the user starts browsing.
And since Opera One has a focus on generative AI features, it comes with the company's built-in Aria browser assistant. Aria now has voice input, so users can speak queries out loud. Plus, users will be able to ask Aria to generate images using Google’s Imagen 2 image generation model. Finally, since Opera puts a focus on security, the One iOS browser comes with a built-in ad blocker and free VPN.
This article originally appeared on Engadget at https://www.engadget.com/apps/operas-ai-focused-web-browser-one-is-now-on-ios-130013697.html?src=rss
Apple takes a lot of strong positions, but its ultimate hill to die on might just be requiring apps to route purchases through the tech giant. The latest example comes from Patreon, which announced that Apple is requiring it to switch over to the iOS in-app purchase system or risk expulsion from the App Store. Patreon's entire purpose is to allow creators to offer "patrons" memberships in exchange for content. While some tiers are unpaid, creators offer paid options to make money — something this shift could impact.
Patreon users need to know about two main changes. By this November, creators will only be able to offer subscription-based plans on iOS, as the App Store doesn't support other formats, such as first-of-the-month or per-creation plans. As a result, Patreon is rolling out a 16-month-long migration process that will shift all memberships to subscriptions by November 2025. At that point, subscription-based plans will be the only option available, an unfortunate testament to Apple's far-reaching power.
Apple will also be taking a 30 percent cut of all subscriptions made in the Patreon iOS app after November of this year — something it's done for Patreon in-app commerce purchases since early 2024. Patreon has designed a tool that allows creators to increase their prices in the iOS app while leaving them as is on the web and Android. However, creators can turn it off if they'd rather keep their rates the same everywhere.
This article originally appeared on Engadget at https://www.engadget.com/apps/patreon-will-have-to-use-apples-in-app-purchase-system-or-be-removed-from-the-app-store-192631471.html?src=rss