Three new theft protection features that Google announced earlier this year have reportedly started rolling out on Android. The tools — Theft Detection Lock, Offline Device Lock and Remote Lock — are aimed at giving users a way to quickly lock down their devices if they’ve been swiped, so thieves can’t access any sensitive information. Android reporter Mishaal Rahman shared on social media that the first two tools had popped up on a Xiaomi 14T Pro, and said some Pixel users have started seeing Remote Lock.
"These three features actually appear to be rolling out globally, judging by all the replies I've received!" Rahman added in a follow-up post.
Theft Detection Lock is triggered by the literal act of snatching. The company said in May that the feature “uses Google AI to sense if someone snatches your phone from your hand and tries to run, bike or drive away.” In such a scenario, it’ll lock the phone’s screen.
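Google hasn't detailed how the underlying model works, but the general shape of the problem is motion classification. As a purely illustrative sketch — the thresholds and logic here are invented, not Google's — a snatch detector over accelerometer readings could look something like this:

```python
# Toy snatch heuristic over accelerometer magnitudes (m/s^2), sampled at
# roughly 50Hz, oldest first. Thresholds are invented for illustration;
# Google's feature presumably uses a trained classifier over more signals.
SNATCH_SPIKE = 25.0   # sharp jolt, well above normal handling
GETAWAY_AVG = 12.0    # sustained motion right after the jolt

def looks_like_snatch(samples: list[float]) -> bool:
    for i, accel in enumerate(samples):
        if accel >= SNATCH_SPIKE:
            window = samples[i + 1 : i + 51]  # roughly the next second
            if window and sum(window) / len(window) >= GETAWAY_AVG:
                return True  # the real feature would lock the screen here
    return False
```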
Offline Device Lock, on the other hand, can automatically lock the screen after a thief has disconnected the phone from the internet. You can already remotely lock your phone with Google’s Find My Device, but the third feature, Remote Lock, lets you do so without having to scramble to figure out your Google account password. All you’d need for this is “your phone number and a quick security challenge using any device.”
OpenAI has debuted a new workspace interface for ChatGPT called Canvas. The AI giant unveiled its new ChatGPT workspace on its official blog and immediately made it available for ChatGPT Plus and Team users. Enterprise and Edu users will be able to access Canvas sometime next week.
Canvas is a virtual workspace for writing and coding projects that lets users consult with ChatGPT on specific portions of a project. A separate window opens beside the main chat space, and users can put writing or code on this new “canvas” and highlight sections for the model to focus on and edit “like a copy editor or code reviewer,” according to the blog.
Canvas can be opened manually by typing “use canvas” in your prompt, or it will open automatically when ChatGPT “detects a scenario in which it could be helpful,” according to the blog post. There are also several shortcuts for writing and coding projects. For writing, users can ask ChatGPT for suggested edits or length adjustments, or ask it to change the reading level of a block of text, from graduate school level down to kindergarten. It can also add "relevant emojis for emphasis and color."
Coders can have ChatGPT review their code and add inline suggestions for improvements. It can also mark up your work with logs and comments to aid in debugging and make your code easier to understand. In Canvas mode, it's also capable of fixing bugs and porting code to another language, such as JavaScript, TypeScript, Python, Java, C++ or PHP.
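Canvas itself is a ChatGPT interface feature with no public API, but you can approximate its highlight-and-edit behavior with OpenAI's standard chat completions endpoint. Here's a minimal sketch — the function, prompts and model choice are our own, not anything OpenAI ships:

```python
# Hypothetical approximation of Canvas-style targeted editing; Canvas is
# a ChatGPT UI feature, not an API. Requires OPENAI_API_KEY to be set.
from openai import OpenAI

client = OpenAI()

def edit_highlighted(document: str, highlight: str, instruction: str) -> str:
    """Ask the model to rewrite only `highlight`, using `document` as context."""
    response = client.chat.completions.create(
        model="gpt-4o",  # stand-in model choice, ours
        messages=[
            {"role": "system",
             "content": "You are a copy editor. Rewrite only the excerpt the "
                        "user highlights and return the revised excerpt, "
                        "nothing else."},
            {"role": "user",
             "content": f"Full document:\n{document}\n\n"
                        f"Highlighted excerpt:\n{highlight}\n\n"
                        f"Instruction: {instruction}"},
        ],
    )
    return response.choices[0].message.content

# e.g. edit_highlighted(draft, second_paragraph, "lower the reading level")
```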
OpenAI’s Canvas feature brings ChatGPT in line with other AI tools that offer separate workspaces for focusing on specific parts of a project, such as Anthropic's Artifacts and the AI-powered code editor Cursor.
Update, October 4, 12:55PM ET: This story was edited after publishing to include more context on the code and text functionality of the Canvas feature.
Snap’s latest augmented reality glasses have a completely new — but still very oversized — design, a larger field of view and all-new software that supports full hand tracking. But the company is only making the fifth-generation Spectacles available to approved developers willing to commit to a year-long $99/month subscription to start.
It’s an unusual strategy, but Snap says it’s taking that approach because developers are, for now, best positioned to understand the capabilities and limitations of augmented reality hardware. They are also the ones most willing to commit to a pricey $1,000+ subscription to get their hands on the tech.
Developers, explains Snap’s director of AR platform Sophia Dominguez, are the biggest AR enthusiasts. They’re also the ones who will build the kinds of experiences that will eventually make the rest of Snapchat’s users excited for them too. “This isn't a prototype,” Dominguez tells Engadget. “We have all the components. We're ready to scale when the market is there, but we want to do so in a thoughtful way and bring developers along with our journey.”
Snap gave me an early preview of the glasses ahead of its Partner Summit event, and the Spectacles don’t feel like a prototype the way its first AR-enabled Spectacles did in 2021. The hardware and software are considerably more powerful. The AR displays are sharper and more immersive, and they already support over two dozen AR experiences, including a few from big names like Lego and Niantic. (Star Wars effects house Industrial Light & Magic also has a lens in the works, according to Snap.)
The glasses
To state the obvious, the glasses are massive. Almost comically large. They’re significantly wider than my face, and the arms stuck out past the back of my head. A small adapter helped them fit around my ears more snugly, but they still felt like they might slip off if I jerked my head suddenly or leaned down.
Still, the new frames look slightly more like actual glasses than the fourth-generation Spectacles, which had a narrow, angular design with dark lenses. The new frames are made of thick black plastic and have clear lenses that are able to darken when you move outside, sort of like transition lenses.
The fifth-generation Spectacles are the first to have clear lenses.
Karissa Bell for Engadget
The lenses house Snap’s waveguide tech, which, along with “Liquid Crystal on Silicon micro-projectors,” enables their AR abilities. Each pair is also equipped with cameras, microphones and speakers.
Inside each arm is a Qualcomm Snapdragon processor. Snap says the dual-processor setup has made the glasses more efficient and prevents the overheating issues that plagued their predecessor. The change seems to be an effective one. In my nearly hour-long demo, neither pair of Spectacles I tried got hot, though they were slightly warm to the touch after extended use. (The fifth-generation Spectacles have a battery life of about 45 minutes, up from 30 minutes with the fourth-gen model.)
Snap's newest AR Spectacles are extremely thick.
Karissa Bell for Engadget
Snap has also vastly improved Spectacles’ AR capabilities. The projected AR content was crisp and bright. When I walked outside into the sun, the lenses dimmed, but the content was very nearly as vivid as when I had been indoors. At a resolution of 37 pixels per degree, I wasn’t able to discern individual pixels or fuzzy borders like I have on some other AR hardware.
But the most noticeable improvement from Snap’s last AR glasses is the bigger field of view. Snap says it has almost tripled the field of view from its previous generation of Spectacles, increasing the window of visible content to 46 degrees. Snap claims this is equivalent to having a 100-inch display in the room with you, and my demo felt significantly more immersive than what I saw in 2021.
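That 100-inch claim roughly checks out as geometry. Assuming a 16:9 screen — our assumption, since Snap doesn't specify — a 46-degree horizontal field of view corresponds to a 100-inch display viewed from about 2.6 meters:

```python
import math

# Width of a 100-inch-diagonal 16:9 screen, then the viewing distance at
# which that width fills a 46-degree horizontal field of view.
diagonal_m = 100 * 0.0254                          # 2.54 m
width_m = diagonal_m * 16 / math.hypot(16, 9)      # ~2.21 m
distance_m = (width_m / 2) / math.tan(math.radians(46 / 2))
print(f"{distance_m:.1f} m")                       # ~2.6 m away
```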
The fourth-generation Spectacles (above) were narrow and not nearly as oversized as the fifth-gen Spectacles (below).
Karissa Bell for Engadget
It isn’t, however, fully immersive. I still found myself at times gazing around the room, looking for the AR effects I knew were around me. At other points, I had to physically move around my space in order to see the full AR effects. For example, when I tried out a human anatomy demo, which shows a life-sized model of the human body and its various systems, I wasn’t able to see the entire figure at once. I had to move my head up and down in order to view the upper and lower halves of the body.
Snap OS
The other big improvement to the latest Spectacles is the addition of full hand tracking abilities. Snap completely redesigned the underlying software powering Spectacles, now called Snap OS, so the entire user interface is controlled with hand gestures and voice commands.
You can pull up the main menu on the palm of one hand, sort of like Humane’s AI Pin, and simply tap the corresponding icon to do things like close an app or head back to the lens explorer carousel. There are also pinch and tap gestures to launch and interact with lenses. While Snap still calls these experiences lenses, they look and feel more like full-fledged apps than the AR lens effects you’d find in the Snapchat app.
Lego has a game that lets you pick up bricks with your hands and build objects. I also tried a mini golf game where you putt a ball over an AR course. Niantic created an AR version of its Tamagotchi-like character Peridot, which you can place among your surroundings.
The interface for Snapchat's AI assistant, MyAI, on Spectacles.
Snap
You can also interact with Snapchat’s generative AI assistant, MyAI, or “paint” the space around you with AR effects. Some experiences are collaborative, so if two people with Spectacles are in a room together, they can view and interact with the same AR content together. If you only have one pair of Spectacles, others around you can get a glimpse of what you’re seeing via the Spectacles mobile app. It allows you to stream your view to your phone, a bit like how you might cast VR content from a headset to a TV.
The new gesture-based interface felt surprisingly intuitive. I occasionally struggled with lenses that required more precise movements, like picking up and placing individual Lego bricks, but the software never felt buggy or unresponsive.
There are even more intriguing use cases in the works. Snap is again partnering with OpenAI so that developers can create multimodal experiences for Spectacles. “Very soon, developers will be able to bring their [OpenAI] models into the Spectacles experience, so that we can really lean into the more utilitarian, camera-based experiences,” Dominguez says. “These AI models can help give developers, and ultimately, their end customers more context about what's in front of them, what they're hearing, what they're seeing.”
Is AR hardware about to have a moment?
CEO Evan Spiegel has spent years touting the promise of AR glasses, a vision that has long felt just out of reach. But if the company’s 2021 Spectacles showed AR glasses were possible, the fifth-generation Spectacles suggest Snap may finally be getting close to making AR hardware that’s more than an experiment.
For now, there are still some significant limitations. The glasses are still large and somewhat unwieldy, for one. While the fifth-gen Spectacles passably resemble regular glasses, it’s hard to imagine walking around with them on in public.
Then again, that might not matter much to the people Snap most wants to reach. As virtual and mixed reality become more mainstream, people have grown more willing to wear the necessary headgear in public. People wear their Apple Vision Pro headsets on airplanes, in coffee shops and in other public spaces. As Snap points out, its Spectacles, at least, don’t cover your entire face or obscure your eyes. And Dominguez says the company expects its hardware to get smaller over time.
Snap's fifth-generation Spectacles are its most advanced, and ambitious, yet.
Karissa Bell for Engadget
But the company will also likely need to find a way to reduce Spectacles’ price. Each pair reportedly costs thousands of dollars to produce, which helps explain Snap’s current insistence on a subscription model, but it’s hard to imagine even hardcore AR enthusiasts shelling out more than a thousand dollars for glasses that have less than one hour of battery life.
Snap seems well aware of this too. The company has always been upfront about the fact that it’s playing the long game when it comes to AR, and that thinking hasn’t changed. Dominguez repeatedly said that the company is intentionally starting with developers because they are the ones “most ready” for a device like the fifth-gen Spectacles, and that Snap intends to be prepared whenever the consumer market catches up.
The company also isn’t alone in finally making AR hardware real. By all accounts, Meta is poised to show off the first version of its long-promised augmented reality glasses next week at its developer event. Its glasses, known as Orion, are also unlikely to go on sale anytime soon. But the attention Meta brings to the space could nonetheless benefit Snap as it tries to sell its vision for an AR-enabled world.
Uber just announced the expansion of safety features directed toward drivers, including a national rollout of enhanced rider verification, which begins tomorrow. If a rider undergoes these additional verification steps, they’ll get a “Verified” badge on their account, letting drivers know everything is on the up and up.
The company says it designed these new verification measures “in response to driver feedback that they want to know more about who is getting in their car.” The company began testing this feature earlier this year and it must have been a success, as it's getting a national rollout. Lyft has its own version of this tool, though it's still being tested in select markets.
Uber verifies riders by cross-checking account information against third-party databases, though it’ll also accept official government IDs. The program will also allow users to bring in their verification status from the CLEAR program.
While rider ID is the most notable safety tool announced, Uber’s also bringing its Record My Ride feature to the whole country after another successful beta test. This lets drivers record the entirety of the ride via their smartphone cameras, without the need to invest in a dashcam. The footage is fully encrypted, with Uber saying nobody can access it unless a driver sends it in for review. The company hopes this will allow it to “more quickly and fairly resolve any incidents that may arise.”
Uber
Drivers can now cancel any trip without a financial penalty, and they can “unmatch” from any riders they don't feel comfortable picking up. Finally, there’s a new PIN verification feature in which drivers can ask riders to enter a number to confirm they are, in fact, the correct passenger.
Uber tends to focus its resources on riders over drivers, so this is a nice change of pace. It is kind of a bummer, however, that drivers require this kind of enhanced verification system just to root out some bad apples and keep doing their jobs. In other words, don’t be a jerk during your next Uber ride.
Correction, September 17 2024, 10:45AM ET: This story and its headline originally stated that Uber's rider verification program was rolling out nationwide as of today. The rollout starts tomorrow, September 18. We apologize for the error.
Folded between all the new hardware announcements, Apple surprised us last week with news of FDA-approved hearing aid features for the AirPods Pro. No new hardware needed — it’s all in software updates. In the last decade, we’ve seen several companies tackle hearing-aid technology, aided by the boom in wireless tech. Now, arguably the most influential company in consumer tech is trying it. John Falcone outlines why this is a big deal. Or, at least, a very good deal.
The iPhone 16 event is over, and now we’ve got plenty of thoughts to share after playing with all of Apple’s new hardware. In this episode, Devindra and Cherlynn chat about the entire iPhone 16 and Pro lineup, and Billy Steele joins to chat about his experience with the AirPods 4 and Apple Watch Series 10. It turns out the Apple Watch stole the show.
The entire Annapurna Interactive team has left the company after its executives walked out, according to a Bloomberg report. Apparently, the video game publisher had been negotiating with Annapurna Pictures to spin off Annapurna Interactive into its own entity. Those talks broke down, so “all 25 members of the Annapurna Interactive team collectively resigned,” the team said in a joint statement.
OpenAI has unveiled yet another artificial intelligence model. This one is called o1, and the company claims it can perform complex reasoning tasks more effectively than its predecessors. Apparently, o1 was trained to “spend more time thinking through problems before they respond.” According to the company: “[the models] learn to refine their thinking process, try different strategies and recognize their mistakes.”
That more considered response means it’s significantly slower at processing prompts than GPT-4o. And while it might be thinking more, o1 hasn’t solved the problem of hallucinations — a term for AI models making up information. OpenAI’s chief research officer Bob McGrew told The Verge, “We can’t say we solved hallucinations.”
The FixHub is a USB-C powered soldering iron designed to help fix whatever’s on your workbench (and be easily fixable itself). The iron includes a 55Wh battery pack, which acts as a stand and temperature control. Founder Kyle Wiens told Engadget FixHub was born of frustration with soldering irons and their limits. So his company tried to fix those.
Elgato has introduced the Stream Deck Studio, a new version of its creative control tech targeting professionals. This 19-inch rackmount console has 32 LCD keys and two rotary dials. Oh, and a $900 price tag.
A decade — and countless clones — later, the original Flappy Bird is coming back. If you don’t recall the 2014 hit mobile game, you’d tap the screen to flap the bird’s wings and squeeze it through gaps between pipes. The game debuted in May 2013, but it didn't blow up until the following January. Developer Dong Nguyen soon revealed the game was raking in $50,000 per day from advertising. He decided to remove the game, but clones of his creation persisted. Under the banner of the Flappy Bird Foundation, a group of dedicated fans officially acquired the rights to the game, so now it’s flapping back.
Every year, the calculus of choosing which iPhone to get feels increasingly frustrating. Do you opt for the standard iPhone? Do you splurge for the latest and greatest Pro model, something that might take two years to pay off completely? Or should you just buy a slightly older used or refurbished model to get the best deal possible? And of course, there's always the safest bet: Save your money and keep your current phone for as long as possible.
After sitting with Apple's announcements for a day, one thing has become clear: The plain $799 iPhone 16 is a pretty solid deal, at least compared to basic iPhones from the last few years. It actually has a new processor, Apple's A18, instead of reusing an older chip. It brings over the Action button from the iPhone 15 Pro and it also has Apple's new camera button. The iPhone 16 is also relatively future-proof since it supports Apple Intelligence, something that doesn't work on the non-Pro iPhone 15 and older models.
Apple
While the $999 iPhone 16 Pro has an additional camera and supports more powerful photography and filmmaking features, the line between that device and the standard iPhone 16 is blurrier than ever before.
It certainly makes more sense to invest in the iPhone 16 today, instead of the poor, beleaguered iPhone SE. That device sports an aging A15 chip, the tiny 4.7-inch screen of yore and it still costs $429 like it did in 2022. It's rumored that we'll see a new iPhone SE sometime next year, but today's model is simply a terrible option in Apple's lineup.
If you don't care about Apple Intelligence and you'd like to replace an older device, you can currently find a refurbished iPhone 15 on Amazon and elsewhere between $500 and $600. But really, if you're willing to shell out that much money for a used device with a limited warranty, it might make more sense to grab an iPhone 16 and pay it off over time via your carrier. Trade-in deals can also shave off a significant chunk of a new phone's price. Verizon, for example, is currently offering $800 off a Pixel 9 or iPhone 15 when you swap out an older device.
To be clear, all of the preceding advice only applies if you need to replace a trashed iPhone, or you’re excited to play with Apple Intelligence. Judging from the latest rumors, we’ll likely see an ultra-thin iPhone model next year (similar to the latest iPad Pro), so it might just make sense to hold onto your existing device. And don’t forget, Apple’s AI features won’t be immediately available at launch — you’ll have to wait until an October update for the first batch of features, and Siri won’t get all of its new smarts until the first half of 2025.
If you missed it, Apple unveiled all the new iPhones we expected, a new Apple Watch that might steal a lot of the headlines (bigger, thinner, better!) and a surprising new push for its wireless headphones, with three new models and a software update for AirPods Pro that brings the company into a new product category, through software alone.
We’ll chew over those below, but first up: the iPhone 16 and the iPhone 16 Pro. There’s a surprising parity of new features in the two devices this year. The iPhone 16 gets an action button (left edge, above the volume buttons) and a new camera button, too.
Well, it’s more than just a button. With all sorts of sensors and tech crammed inside, you can adjust controls and settings in the camera — think zoom, exposure and aperture — with gentle slides and presses. Apple is already planning an update for a half-press to focus the camera and a full press to capture.
Why did Apple redesign the iPhone 16 camera module? The cynic in me thought this change, with lenses stacked vertically, was just a desperate indicator to show that folks had the newest iPhone, but there’s a reason. Apparently, this arrangement means the cameras can work in tandem to capture spatial video and photos.
The iPhone 16 Pro (and Pro Max) both get an array of upgraded shooting features. There is (thankfully!) parity between the 6.9-inch Pro Max and 6.3-inch Pro, with the smaller Pro getting 5x optical zoom this year. The ultra-wide-angle camera has been upgraded from 12 megapixels to 48 megapixels on both Pro models. The iPhone 16 Pro can shoot video in 4K at 120 fps, so you can switch to and from slow-mo footage after the fact — no slow-motion recording mode necessary.
The iPhone 16 Pros have a larger battery, delivering that perennial line: the “best iPhone battery life ever.” On Apple’s own specification cheat sheet, however, the claim gets a little more specific: the “Best iPhone battery life on Pro Max.” That’s something to scrutinize in reviews. And after last year’s shift to USB-C, a lighter titanium build and the Action button, this year’s changes aren’t quite as convincing an upgrade.
Having said that, what am I doing? I currently juggle an Apple iPhone 15 Pro Max and a Pixel 9 Pro XL, which are too big for me. Last year, I leaned on the iPhone 15 Pro Max, instead of the iPhone 15 Pro, for 5x camera zoom… and nothing else.
This time around, like the iPhone 14 series, there are pretty much identical specs across the iPhone 16 Pro duo — which is how it should be — so I’m lining up a pre-order for a black iPhone 16 Pro, with 1TB of storage. I might change my mind.
If you’re a Pro Max kind of iPhone owner, I think you could probably wait a year, but it’s also a great time to upgrade to the base iPhone 16. It gets those new buttons, it’ll work with Apple Intelligence in a few months and it has a pretty gorgeous lineup of colors.
We’ve got hands-on impressions on everything below.
— Mat Smith
Apple’s wearable had the biggest spec jump at its event yesterday, with the biggest display and thinnest design ever on an Apple Watch. The wide-angle OLED display is even a bit larger than the Apple Watch Ultra 2’s. Apple claims the screen is 40 percent brighter when viewed from an angle, thanks to that new OLED tech, and it’ll show a second hand ticking away even when your wrist is down. The company claims an 18-hour battery life, which someone needs to remind Apple is not a full day.
Apple launched its fourth-generation AirPods this week, as rumors suggested. They have a redesigned bud shape that’s supposed to fit more ear shapes. The new entry-level AirPods also have the H2 chip to power features like Spatial Audio. The charging case has a USB-C port, and it’s the smallest AirPods charging case to date. To throw a wrench into the works, Apple announced a second model supporting active noise cancellation and Transparency mode. Not to be confused with the AirPods Pro, which still exist.
Spare a thought for Deputy Editor Cherlynn Low. As we bundled her into a plane for Cupertino to report on all that Apple stuff, she was wrapping up her review of Google’s Pixel Watch 3. While we had compared Google’s latest wearable to Apple’s Watch Series 9, we think Apple has some wearable competition here, thanks to solid battery life and a comprehensive and intuitive suite of health-tracking features.
Sony has announced a PlayStation 5 Technical Presentation stream at 11AM ET today on the PlayStation YouTube channel. All indications point toward this being the official unveiling of the PS5 Pro. Mark Cerny, the lead architect of the PS5, will host the stream, which will focus on the PS5 and innovations in gaming technology. For less than 10 minutes.
Apple is giving every iPhone 16 and iPhone 16 Pro model brand new chips, instead of just using last year's hardware in its cheaper phones. The company unveiled the A18 and A18 Pro chips at the iPhone 16 launch event today, and as you'd expect, they're built with Apple Intelligence in mind. The chips offer more memory and a new 16-core Neural Engine, in addition to some incremental performance boosts over older models. More so than in the past few years — when you could point to new camera lenses or hardware tweaks as a reason to get the new iPhone — the chip is the key selling point for the iPhone 16 lineup.
Other than last year's iPhone 15 Pro and Pro Max, older iPhones can't run Apple Intelligence features like the revamped Siri, Genmoji and integrated ChatGPT search. (Anyone who splurged for those higher end iPhones chose wisely, as there's little reason to upgrade.) AI workloads require plenty of RAM to juggle large language models, so that alone disqualifies the iPhone 15 and 15 Plus, which only had 6GB of RAM on the A16 chip (a holdover from the iPhone 14 Pro). The iPhone 15 Pro and Pro Max, on the other hand, featured 8GB of RAM with the A17 Pro chip.
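Apple hasn't published a memory budget for its on-device model, but the back-of-the-envelope math is easy to sketch. Assuming a roughly 3-billion-parameter model quantized to about 4 bits per weight — figures in line with what Apple has described in its research posts, though the exact numbers here are our assumption — the weights alone approach 1.5GB:

```python
# Rough estimate, our own numbers: weight memory for a ~3B-parameter
# model at 4-bit quantization, before the KV cache and activations
# that inference also keeps resident.
params = 3e9
bytes_per_param = 0.5                    # 4 bits = half a byte
weights_gib = params * bytes_per_param / 1024**3
print(f"{weights_gib:.2f} GiB")          # ~1.40 GiB of weights alone
```

On a phone with 6GB of RAM, that's nearly a quarter of total memory before the OS and apps get a byte — which is why the extra 2GB matters.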
The A18 chip, along with thermal design optimizations, helps the iPhone 16 achieve 30 percent faster sustained gaming performance, according to Apple. And just like the A17 Pro last year, the new chip supports hardware accelerated ray tracing, which helps it deliver more realistic lighting in some titles. Apple also revealed that Honor of Kings: World will be coming to iPhones next year.
Apple's A18 Pro goes a step further than the A18, delivering up to 15 percent faster speeds than the A17 Pro, as well as 2x faster hardware-accelerated ray tracing. Notably, the A18 Pro also uses 20 percent less power than the A17 Pro. All of that hardware isn't just meant for Apple Intelligence; it also powers the complex new photography features in the iPhone 16 Pro's cameras.
Apple
Apple's older strategy of using the previous year's chips on the iPhone and iPhone Plus made sense. Those devices didn't require the demanding camera processing of the Pro models, which were entirely geared towards power users. Apple could cut manufacturing costs and still deliver a solid user experience for iPhone owners with older chips. (Even though it debuted in 2022, the A16 chip in the iPhone 15 is still very capable today.)
But now that Apple is centering the iPhone experience around Apple Intelligence, a family-wide spec bump isn't too surprising. And even if you're not excited about Apple's AI offerings (which the company will never actually call AI), it's nice to have more RAM in the base iPhone line.
LG's stretchable displays that we first saw at CES 2023 have made a new public appearance at a fashion show, the company announced. The tech — which LG now calls Stretchable — appeared as part of clothing and bag concepts at 2025 Seoul Fashion Week.
The displays were added to the front of garments, sleeves and clutch bags designed by Korean designers Youn-Hee Park and Chung-Chung Lee. "We have been able to design future fashion concepts with new materials that have never existed before," Park said.
Stretchable displays can be pulled, bent and twisted, so they go a step further than the bending and twisting displays used in foldable smartphones. To make them more supple, LG built the substrate from a silicone similar to that used in contact lenses, with microLEDs smaller than 40 micrometers as the light source. LG notes the panels can be stretched from 12 inches to 14 inches, or about 20 percent.
Flexible wearable tech has been a much-researched feature for fashion and even things like invisibility cloaks. South Korea's government created a national project to test the displays' commercial potential for new types of wearable tech across multiple industries. The main challenge has been making the material behave more like fabric than stiff plastic, but LG seems to have at least partially cracked the problem.