For the discerning gamer who didn't rush out to grab NVIDIA's RTX 40-series GPUs over the last few years, the company's new Super cards are genuinely compelling. They all deliver far more power than before — so much so that the $599 RTX 4070 Super is actually a decent 4K gaming card. But the $999 RTX 4080 Super is also a solid deal for power-hungry gamers, since it's $200 less than the original 4080. Between those two cards, NVIDIA now has desirable upgrade options for mid-range and high-end gamers.
But what of the new RTX 4070 Ti Super? It's stuck right between its two siblings, with no clear audience. After all, if you're ready to spend $799 on a video card, stretching a bit more for the 4080 Super might make more sense for the additional power. Otherwise, you might as well just get the 4070 Super, overclock it a bit, and spend the extra $200 on another SSD or more RAM. It's a conundrum entirely of NVIDIA's making, one that might have been solved by giving the 4070 Ti Super a slight $50 discount. (And it's even more confusing when some cards, like the ASUS model we reviewed, are priced above MSRP.)
Just like the 4070 Super, the 4070 Ti Super and 4080 Super feature more CUDA cores than the original models, as well as slightly higher clock speeds. The 4070 Ti Super packs in 8,448 CUDA cores compared to 7,680 on the 4070 Ti, while the 4080 Super has 10,240 CUDA cores instead of 9,728. The 4070 Ti Super also has 4GB more VRAM, bringing it up to 16GB of GDDR6X memory, while the 4080 Super sticks with 16GB. (NVIDIA likely doesn't want to edge much closer to the beastly RTX 4090, which is now selling for hundreds more than its original $1,599 launch price.)
Given the RAM bump, I expected the RTX 4070 Ti Super to be far faster than the 4070 Ti, but it ended up scoring only 742 points higher in the 3DMark TimeSpy Extreme benchmark. It fares better in games, scoring 15fps higher in both Halo Infinite (104fps) and Cyberpunk 2077 (93fps) while playing in 4K with maxed out graphics and ray tracing settings. The 4080 Super's results were similarly muted: It hit 289 more points than the 4080 in TimeSpy Extreme, and it was 10fps higher in Cyberpunk (103fps) while playing in 4K with ray tracing.
Content creators will also appreciate the additional horsepower. The 4070 Ti Super scored 87,707 points in the Luxmark HDR benchmark and 7,424 points in Blender's test, compared to 75,997 and 7,247 on the original 4070 Ti. The 4080 Super hit 99,515 points in Luxmark HDR compared to 94,832 points from before.
These are the results you'd expect by simply throwing more CUDA cores into these cards, so I'm not exactly disappointed. But after benchmarking both GPUs and comparing their results to the 4070 Super, I'm more underwhelmed by the 4070 Ti Super. I can understand NVIDIA not wanting to make it much faster, but it seriously needs to be cheaper than the original 4070 Ti. Otherwise, like I said, the 4070 Super is far more compelling, and I wouldn't be surprised if cheaper 4070 Ti cards pop up.
We reviewed the ASUS TUF Gaming 4070 Ti Super (which is currently selling for $850), as well as NVIDIA's Founders Edition 4080 Super. The ASUS GPU sports three fans and a typical heatsink design, whereas NVIDIA's card once again uses its unique (and very effective) vapor chamber cooling setup. Both cards hovered around 75 degrees Celsius under load, but the ASUS GPU was noticeably louder thanks to those three large fans. I could still hear the 4080 Super spin up under pressure, but it wasn't nearly as loud.
If you can’t tell by now, the biggest selling point of the 4080 Super is that it's $200 less than the original. Not only is it a better deal on its own, but its lower price should (hopefully) help to reduce the cost of older 4080 cards eventually too. And if you’re not ready to shell out $999 for a video card, then the $599 4070 Super is no slouch either.
This article originally appeared on Engadget at https://www.engadget.com/nvidia-rtx-4070-ti-super-and-4080-super-review-183034039.html?src=rss
Samsung’s Galaxy S24 phones are all about AI, but how do they compare against Google’s AI tech? This week, Cherlynn and Devindra discuss what works and doesn’t about Samsung’s ambitious new smartphones, and why it may be a good thing for the Korean giant to directly compete with Google. Also, Senior Editor Karissa Bell joins to discuss the social media CEO Senate hearing, which, unsurprisingly, doesn’t really amount to much.
Listen below or subscribe on your podcast app of choice. If you've got suggestions or topics you'd like covered on the show, be sure to email us or drop a note in the comments! And be sure to check out our other podcast, Engadget News!
Topics
Samsung Galaxy S24 and S24 Ultra reviews: AI with mixed results – 0:47
Senate gathers social media CEOs over online child safety – 15:15
Graphic images of Taylor Swift on X prompt U.S. bill to let people sue over sexual deepfakes – 28:11
Universal Music Group pulls songs from TikTok during talks on a new music rights deal – 33:05
Delaware court denies Elon Musk’s “unfathomable” Tesla payday – 38:31
Neuralink claims to have implanted its first chip in a human test subject – 40:32
Google reveals new text-to-image generative AI tool, ImageFX – 41:46
Microsoft posted another blowout earnings report for Q2 of the 2024 fiscal year, with revenues of $62 billion (up 18 percent from last year) and profits of $21.9 billion (a 33 percent increase). But really, the most interesting thing about this quarter is that we finally get to see how the $68.7 billion Activision Blizzard acquisition affects the $3 trillion company. While Microsoft isn't breaking out specific numbers, it says that its overall gaming revenue increased by 49 percent, 44 points of which came from the "net impact" of the Activision deal.
Microsoft's More Personal Computing division, which includes Xbox, Surface and Windows, was up 19 percent ($16.9 billion) since last year. The company says the Activision deal accounted for 15 points of that increase. It's a huge change for a division that's been severely impacted by dwindling PC sales (which affects Windows licenses and Surfaces) and struggling Xbox consoles. PC device revenues were down 9 percent for the quarter, while Xbox hardware sales were up 3 percent.
Xbox content and services revenue is also up 61 percent since last year, 55 points of which comes from Activision. It'll be interesting to see if Microsoft can actually leverage that acquisition to help Xbox sales, or at the very least, spur on more interest in Game Pass subscriptions. (Unfortunately, we don't have any updates on how that service is doing.)
This article originally appeared on Engadget at https://www.engadget.com/microsofts-gaming-revenue-is-up-49-percent-in-q2-mostly-thanks-to-the-activision-deal-222502444.html?src=rss
Note: This review was originally published during Sundance 2024. We're reposting it because Seeking Mavis Beacon is now out in theaters.
With a healthy dose of heart and whimsy, the Sundance documentary Seeking Mavis Beacon follows two young Black women who are devoted to finding the original model for Mavis Beacon Teaches Typing. If you touched a computer during the '80s or '90s, there's a good chance that Mavis helped you get comfortable with a keyboard. Or at the very least, you might remember her from the program's original 1987 cover: a smiling, elegant Black woman dressed in a cream-colored outfit. She embodied style and professional poise — it was as if you could be just as capable as her if you bought that program.
It's no spoiler to say that "Mavis Beacon" didn't really exist – she was a marketing idea crafted by a group of white dudes from Silicon Valley. But the program's cover star was real: Her name was Renee L'Esperance, a Haitian model who was discovered while working at Saks Fifth Avenue in Los Angeles. After her image helped make Mavis Beacon Teaches Typing a success, she retreated from the spotlight, reportedly heading back to retire in the Caribbean.
Seeking Mavis Beacon
The documentary's director and writer, Jazmin Jones, as well as her collaborator, Olivia McKayla Ross, start with those basic details and set out to find L'Esperance like a pair of digital detectives. From a home base in a rundown Bay Area office – surrounded by tech ephemera, a variety of art pieces and images of influential Black women – they lay out L'Esperance's reported timeline, follow leads and even host a spiritual ceremony to try and connect with the model.
I won't say if the pair actually end up finding L'Esperance because it's the journey that makes Seeking Mavis Beacon such a joy to watch. Jones and Ross both grew up with the typing program and felt a kinship toward the character of Mavis Beacon. It was the first program to prominently feature a Black woman on the cover (a move that reportedly caused some suppliers to cut their orders), so it made the technology world seem like somewhere young Black women could actually fit in. Beacon's digital hands also appear on-screen, as if she's gently guiding your fingers to the correct letters and positioning.
To help uncover more details about the whereabouts of Mavis Beacon, Jones and Ross set up a hotline and website for anyone to submit clues. Some of those calls are featured in the film, and they make it clear that her digital presence inspired many people. The film opens with references to Beacon throughout culture, including one of my favorite bits from Abbott Elementary, where Quinta Brunson's over-achieving teacher is far too excited to spot the typing icon in a school crowd. I was reminded of my own childhood experience with Mavis Beacon Teaches Typing, spending free periods at school and idle time at home trying to get my typing speed up. By middle school, typing felt as natural as breathing. And yes, I would also have freaked out if I saw the real Beacon in person.
While the documentary doesn't seem out of place at Sundance, which is known for innovative projects, it also sometimes feels like a piece of experimental media meant for YouTube or an art show filled with impossibly cool twenty-somethings. (At one point, Ross attends a farewell ceremony for one of her friends' dead laptops, which was hosted in an art space filled with people dressed in white. That's the sort of hip weirdness that will either turn you off of this film, or endear you to it more.)
Yeleen Cohen
Jones shows us screen recordings of her own desktop, where she may be watching a TikTok alongside her notes. Instead of a full-screen video chat with another person, sometimes we just see a FaceTime window (and occasionally that reflects Jones' own image looking at the screen). Seeking Mavis Beacon tells its story in a way that digital natives will find natural, without locking itself exclusively into screens like the film Searching.
As is true for many first features, the film could use some narrative tightening. Jones and Ross's investigation stalls at several points, and we're often just left adrift as they ponder their next steps. The pair also occasionally appear too close to the story, or at least, that's how it seems when we see Jones tearing up while pleading to meet with L'Esperance.
But I'd argue that's also part of the charm of Seeking Mavis Beacon. Jones and Ross aren't some true crime podcast hosts looking to create content out of controversy. They're young women who found comfort in one of the few faces in tech that looked like them. With this film, Jones and Ross could be similarly inspirational for a new generation of underrepresented techies.
This article originally appeared on Engadget at https://www.engadget.com/entertainment/tv-movies/seeking-mavis-beacon-review-sundance-documentary-140049830.html?src=rss
A woman has a text chat with her long-dead lover. A family gets to hear a deceased elder speak again. A mother gets another chance to say goodbye to her child, who died suddenly, via a digital facsimile. This isn't a preview of the next season of Black Mirror — these are all true stories from the Sundance documentary Eternal You, a fascinating and frightening dive into tech companies using AI to digitally resurrect the dead.
It's yet another way modern AI, which includes large language models like ChatGPT and similar bespoke solutions, has the potential to transform society. And as Eternal You shows, the AI afterlife industry is already having a profound effect on its early users.
The film opens on a woman having a late night text chat with a friend: "I can't believe I'm trying this, how are you?" she asks, as if she's using the internet for the first time. "I'm okay. I'm working, I'm living. I'm... scared," her friend replies. When she asks why, they reply, "I'm not used to being dead."
Beetz Brothers Film Production
It turns out the woman, Christi Angel, is using the AI service Project December to chat with a simulation of her first love, Cameroun, who died many years ago. Angel is clearly intrigued by the technology, but as a devout Christian, she's also a bit spooked out by the prospect of raising the dead. The AI system eventually gives her some reasons to be concerned: Cameroun reveals that he's not in heaven, as she assumes. He's in hell.
"You're not in hell," she writes back. "I am in hell," the AI chatbot insists. The digital Cameroun says he's in a "dark and lonely" place where his only companions are "mostly addicts." The chatbot goes on to say he's currently haunting a treatment center and later suggests "I'll haunt you." That was enough to scare Angel and make her question why she was using this service in the first place.
While Angel was aware she was talking to a digital recreation of Cameroun, which was based on the information she provided to Project December, she interacted with the chatbot as if she was actually chatting with him on another plane of existence. That's a situation that many users of AI resurrection services will likely encounter: Your emotional response can easily overwhelm rational thinking while "speaking" with a dead loved one, even if the conversation is just occurring over text.
In the film, MIT sociologist Sherry Turkle suggests that our current understanding of how AI affects people is similar to our relationship with social media over a decade ago. That makes it a good time to ask questions about the human values and purposes it's serving, she says. If we had a clearer understanding of social media early on, maybe we could have pushed Facebook and Twitter to confront misinformation and online abuse more seriously. (Perhaps the 2016 election would have looked very different if we were aware of how other countries could weaponize social media.)
Beetz Brothers Film Production
Eternal You also introduces us to Joshua Barbeau, a freelance writer who became a bit of an online celebrity in 2021 when The San Francisco Chronicle reported on his Project December chatbot: a digital version of his ex-fiancee Jessica. At first, he used Project December to chat with pre-built bots, but he eventually realized he could use the underlying technology (GPT-3, at the time) to create one with Jessica's personality. Their conversations look natural and clearly comfort Barbeau. But we're still left wondering if chatting with a facsimile of his dead fiancee is actually helping Barbeau to process his grief. It could just as easily be seen as a crutch that he feels compelled to pay for.
It's also easy to be cynical about these tools, given what we see from their creators in the film. We meet Jason Rohrer, the founder of Project December and a former indie game designer, who comes across as a typical techno-libertarian.
"I believe in personal responsibility," he says, after also saying that he's not exactly in control of the AI models behind Project December, and right before we see him nearly crash a drone into his co-founder's face. "I believe that consenting adults can use that technology however they want and they're responsible for the results of whatever they're doing. It's not my job as the creator of the technology to prevent the technology from being released, because I'm afraid of what somebody might do with it."
But, as MIT's Turkle points out, reanimating the dead via AI introduces moral questions that engineers like Rohrer likely aren't considering. "You're dealing with something much more profound in the human spirit," she says. "Once something is constituted enough that you can project onto it, this life force. It's our desire to animate the world, which is human, which is part of our beauty. But we have to worry about it, we have to keep it in check. Because I think it's leading us down a dangerous path."
Beetz Brothers Film Production
Another service, Hereafter.ai, lets users record stories to create a digital avatar of themselves, which family members can talk to now or after they die. One woman was eager to hear her father's voice again, but when she presented the avatar to her family the reaction was mixed. Younger folks seemed intrigued, but the older generation didn't want any part of it. "I fear that sometimes we can go too far with technology," her father's sister said. "I would just love to remember him as a person who was wonderful. I don't want my brother to appear to me. I'm satisfied knowing he's at peace, he's happy, and he's enjoying the other brothers, his mother and father."
YOV, an AI company that also focuses on personal avatars, or "Versonas," wants people to have seamless communication with their dead relatives across multiple channels. But, like all of these other digital afterlife companies, it runs into the same moral dilemmas. Is it ethical to digitally resurrect someone, especially if they didn't agree to it? Is the illusion of speaking to the dead more helpful or harmful for those left behind?
The most troubling sequence in Eternal You focuses on a South Korean mother, Jang Ji-sun, who lost her young child and remains wracked with guilt about not being able to say goodbye. She ended up being the central subject in a VR documentary, Meeting You, which was broadcast in South Korea in early 2020. She went far beyond a mere text chat: Jang donned a VR headset and confronted a startlingly realistic model of her child in virtual reality. The encounter was clearly moving for Jang, and the documentary received plenty of media attention at the time.
"There's a line between the world of the living and the world of the dead," said Kim Jong-woo, the producer behind Meeting You. "By line, I mean the fact that the dead can't come back to life. But people saw the experience as crossing that line. After all, I created an experience in which the beloved seemed to have returned. Have I made some huge mistake? Have I broken the principle of humankind? I don't know... maybe to some extent."
Eternal You paints a haunting portrait of an industry that's already revving up to capitalize on grief-stricken people. That's not exactly new; psychics and people claiming to speak to the dead have been around for our entire civilization. But through AI, we now have the ability to reanimate those lost souls. While that might be helpful for some, we're clearly not ready for a world where AI resurrection is commonplace.
This article originally appeared on Engadget at https://www.engadget.com/sundance-documentary-eternal-you-shows-how-ai-companies-are-resurrecting-the-dead-153025316.html?src=rss
Apple’s Mac just turned 40 years old! This week, Devindra chats with Deputy Editor Nathan Ingraham about his Mac retrospective. We focus on how much has changed since Apple’s disastrous 2016 lineup, why the Apple Silicon chips feel so revolutionary, and look back at our earliest Mac experiences. Also, we review the Framework Laptop 16, a wonderfully modular miracle of a laptop, but one that we wish had more graphics power for gaming. (But hey, at least you can replace the GPU eventually!)
Listen below or subscribe on your podcast app of choice. If you've got suggestions or topics you'd like covered on the show, be sure to email us or drop a note in the comments! And be sure to check out our other podcast, Engadget News!
Topics
Framework Laptop 16 review: Amazingly modular, but not so great at gaming – 1:17
The Mac turns 40 – 19:27
More tech layoffs at Blizzard/Activision, Riot, eBay and others – 49:58
Apple’s Car concept is allegedly still alive – 52:44
Apple overhauls App Store rules in response to European Union regulation – 58:25
Apple's app platform is finally opening up a bit. Today, the company said that it will allow developers to utilize new in-app experiences, including streaming games, accessing mini-apps, and talking with chatbots. That means devs can create a single app that houses an easily accessible catalog of their streaming titles. Perhaps we'll finally see a usable Game Pass app from Microsoft (or even its long-awaited mobile game store).
The new in-app experiences, which also includes things like mini-games and plug-ins, will also get new discovery opportunities. Apple isn't being clear about what that means, but it could involve new sections of the App Store pointing to specific features. It wouldn't be too surprising to see a collection of apps feature chatbots, for example. Apple also says the new built-in experiences will be able to use its in-app purchase system for the first time (like easily buying a subscription to a specific mini-game or chatbot).
"The changes Apple is announcing reflect feedback from Apple’s developer community and is consistent with the App Store’s mission to provide a trusted place for users to find apps they love and developers everywhere with new capabilities to grow their businesses," the company said in a blog post. "Apps that host this content are responsible for ensuring all the software included in their app meets Apple’s high standards for user experience and safety."
This article originally appeared on Engadget at https://www.engadget.com/apple-lets-apps-feature-streaming-games-chatbots-and-other-built-in-experiences-180016453.html?src=rss
If you’re a PC hardware geek who’s been dreaming of a laptop that you can upgrade far beyond the life cycle of a typical machine, Framework's modular notebooks must seem like a miracle. The American company has a straightforward pitch: What if your laptop could be nearly as customizable as a desktop, with the ability to swap components out for repairs and upgrades? What if we could put an end to disposable hardware? We were intrigued by Framework's original 13-inch notebook and its Chromebook variant, despite some rough edges and a basic design. Now, with the Framework Laptop 16, the company is targeting the most demanding and (arguably) hardest group of PC users to please: Gamers.
Framework has already proved it can build compelling modular laptops, but can the Laptop 16 cram in powerful graphics, a fast display and other components to keep up with the likes of Alienware, Razer and ASUS? Sort of, it turns out — and there are plenty of other tradeoffs for living the modular laptop dream. Hardware quirks abound, battery life is mediocre and it still looks like a totally generic machine. But how many other notebooks could let you completely upgrade your CPU or GPU in a few years? Who else offers a customizable keyboard setup? In those respects, the Framework 16 stands alone.
You'll also have to pay dearly for its unique features. The Framework Laptop 16 starts at $1,399 for its DIY Edition, which includes a Ryzen 7 7840HS chip, but RAM, storage and an OS will cost extra. (You could also bring your own hardware, if you happen to have all those components lying around). The pre-built "Performance" model goes for $1,699 with the same Ryzen chip, 16GB of RAM, 512GB of storage and Windows 11 Home. The highest-end "Overkill" edition starts at $2,099 with a Ryzen 9 7940HS, 32GB of RAM and a 1TB SSD. Oh, and if you want the dedicated Radeon RX 7700S GPU, that's an additional $400 for every model.
I just wanted you to have those numbers in mind as we dive into what the Laptop 16 gets right, because for true PC tinkerers, those high prices could be worth it. The device's singular personality was clear the instant I opened it: I saw a machine with a fairly typical display, the usual wrist rest area with a touchpad, and a big gaping hole where the keyboard was supposed to be. I've come across hundreds (probably thousands) of laptops in my time, but this was one of the rare times I felt genuinely surprised. Underneath the metal Mid Plate where the keyboard was supposed to be, I could see the internals of the Framework 16 peeking through, just tempting me to get my hands dirty (and my knuckles inevitably scraped up).
Photo by Devindra Hardawar/Engadget
After opening the two side locks on the wrist rest, I slid the two side spacers off. Then, ever so carefully, I pulled back on the touchpad to detach it from the case. That's when I learned that I didn't have to be too gentle with the Laptop 16. All of the components are built for removal. With the lower panels gone, I had full access to the metal barrier protecting the rest of the machine’s internals.
At that point, I realized it paid to read Framework's online documentation, as things quickly got more complicated. It stated that I needed to remove the cable with the number one next to it, and then loosen 16 screws spread across the Mid Plate. Thankfully, the screws are captive, so I didn't have to worry about losing them as I would during a desktop build.
Then, I was treated to a wondrous sight: A laptop with a completely open mainboard, featuring components I could easily reach without much effort. There's a large battery at the bottom, a wireless networking card at the top left, SSD slots in the middle and two RAM slots off to the side. QR codes are nestled alongside the parts, which direct you to online help documentation. The last time I saw so many easily reachable components was on the failed Alienware Area 51M, another dreamy modular laptop, but that was quietly killed after a few years. (Dell was sued by Area 51M customers who felt misled about its upgradability, though that ultimately didn't amount to much.)
And yes, I know other large gaming laptops like the Razer Blade 16 also let you easily access their RAM and SSDs. But those machines don't have the modular ambitions of the Framework Laptop 16. I could see the Ryzen 7840HS module within reach, and also easily swap out my review unit's Radeon RX 7700S graphics. That GPU, by the way, is completely optional. You can order the Laptop 16 with a slimmer expansion bay instead, which helps to cool the Ryzen chip's Radeon 780M graphics. Or you could have both modules and swap them out as needed. Simply having the option to do so is revolutionary.
The Radeon 7700S GPU is contained within a module that sticks out from the rear of the Laptop 16. A more powerful video card could potentially stick out further, while a more efficient one could end up being smaller. The key is that the choice could be entirely yours (I'm hedging a bit here, because Framework and AMD still haven't committed to the availability of future GPU upgrades). The GPU module also makes a big difference when it comes to weight: The Laptop 16 clocks in at 5.3 pounds with the graphics card attached, whereas it’s just 4.6 pounds with the standard expansion bay.
Photo by Devindra Hardawar/Engadget
Looking at the Framework Laptop 16 splayed out on my workbench, all I could see was possibility. The possibility of doubling my RAM in a couple of years to run local AI models, upgrading the CPU for a major power upgrade, and replacing the battery on my own after far too many charge cycles. Framework is selling a dream of hope. I had my doubts when the company launched, especially after seeing how badly Dell botched the Area 51M. (Fun fact: Frank Azor, the Alienware co-founder who spearheaded that machine's launch, is now AMD's chief gaming architect. He left Dell before the company failed to live up to its upgradability promises.)
But now that Framework has several products under its belt, and it's managed to deliver a truly replaceable mobile GPU where others have failed, I find myself rooting for this little hardware company that's daring to do something different. (Okay, sure, it also raised $27 million in VC funding, but hardware is a difficult and expensive thing to get right!)
Even if you're not eager to get new components in a few years, the Framework Laptop 16's modularity also allows you to easily customize it for your needs. As I reassembled the machine, a process that took around three minutes, I wanted to make my setup look different from a typical laptop. So I slapped the RGB keyboard module on the left side of the Mid Plate (it landed with a genuinely satisfying magnetic thunk) and aligned the trackpad right below it. To the right of the keyboard, I installed a customizable button module (you can also order a standard Numpad, if you'd like), and metal spacers on the right of the trackpad.
Photo by Devindra Hardawar/Engadget
With the top of the machine configured, I also had to figure out which ports I wanted to equip along the sides of the Laptop 16. Framework handles that process brilliantly: The computer has three expansion bays along each side, all of which lead to USB-C connections at the end. The expansion cards are just USB-C dongles connecting to your typical ports, including USB Type A ($9), Type C ($9), a headphone jack ($19) and HDMI ($19). Our review unit came with a handful of cards, so I slapped on two USB-C ports on the left (which also handle charging), USB A on both sides, as well as HDMI and 3.5mm on the right (because the legend will never die).
If I was configuring my own machine, I'd also opt for Ethernet ($39) and MicroSD ($19). The cards sit flush with the Laptop 16 once they're installed, and are very secure once you enable the locks on the bottom of the case. They're so easy to swap out, I wouldn't be surprised if Framework owners end up switching between them on the fly. You can never have too many ports, after all.
Photo by Devindra Hardawar/Engadget
While I appreciated the simple customizability of the ports, charging was a bit annoying. Framework's documentation points out that only certain expansion slots can be used for USB-C charging. There's also a USB-C port on the back of the GPU module, which I was disappointed to learn couldn't actually charge the Laptop 16. The company told me that USB-C port is only meant for accessories and additional displays. Still, it would have been nice to have rear charging support just to hide the cable from view.
Once I had everything locked into place, this ugly duckling of a laptop started to look like a gaming swan. The RGB keyboard jolted to life when I hit power. I had no idea what I was going to do with the programmable keyboard, but I could see it potentially being useful while podcasting (and certainly if I was a game streamer). But I also realized that nothing is permanent about the Laptop 16.
I learned quickly that I wasn't a fan of typing for too long on a left-aligned keyboard, so I yanked everything out and center-aligned the keyboard and trackpad instead. Instead of blank metal spacers around the keyboard, I installed some customizable LED modules, which basically exist to look pretty. That took me just two minutes. The keyboard, by the way, is wonderful to type on, with 1.5mm of key travel and a soft landing that easily dampens my heavy typing. The trackpad is also smooth to the touch and has a responsive click. It's so great that I have to wonder how some Windows laptops still ship with frustrating touchpads — I'm looking at you, ZenBook 14 OLED.
There's so much to love about the Framework Laptop 16 that I was genuinely bummed to discover it's a fairly mediocre gaming machine, at least for its high price. Across multiple games and benchmarks, it fell in line with laptops sporting NVIDIA's RTX 4060 GPU, a card typically found in systems starting around $1,000 (and sometimes less). Framework isn't completely out of line, though: Razer still sells the Blade 16 for $2,500 (down from $2,699). Remember, you're paying for the magic of customizability, not just raw performance.
Our review unit included the Radeon GPU module, the Ryzen 7 chip, 16GB of RAM and a 512GB SSD, which would all cost at least $2,144 to configure. (That doesn't include the cost of expansion cards or additional input modules.) For that amount of money, I really would have liked to see more than 61fps on average while playing Halo Infinite in 1440p with Ultra graphics settings. In Cyberpunk, I hit 53fps on average with maxed out graphics and mid-range ray tracing settings. Both games fared better in 1080p — 85fps in Halo and 76fps in Cyberpunk with the same settings — but still, those are numbers I'd typically only put up with in a budget gaming laptop.
As for benchmarks, the Framework Laptop 16 scored 200 points less than the Razer Blade 18 with an RTX 4060 in 3DMark's TimeSpy Extreme. And as usual, the AMD GPU still lagged behind in the Port Royal ray tracing demo. Still, the Laptop 16 held up decently in the broader PCMark 10 benchmark, which tests productivity apps and not just gaming. The Framework machine hit a score of 8,129, putting it alongside some of the fastest machines we saw last year (it even beat out the Blade 18, which was running a beefy Intel i9-13950HX CPU).
While I would have liked to see higher numbers across the board, the Framework Laptop 16's 16-inch screen was at least a joy to behold throughout my testing. It's an LED panel running at 2,560 by 1,600 pixels with a 165Hz refresh rate, a respectable 500 nits of brightness and 100-percent DCI-P3 color gamut coverage. The display made the neon-soaked world of Cyberpunk pop more than usual, though it certainly didn't have the extra brightness of MiniLED screens or the eye-searing contrast of OLED panels. At the risk of repeating myself, the beauty of this screen is that you can yank it off the laptop in a few minutes and replace it if your kid damages it, or if Framework releases new modules. (Again, big if there.)
The Framework's left speaker.
Photo by Devindra Hardawar/Engadget
Personally, I’d also eagerly swap out the Laptop 16’s 3-watt speakers the instant Framework offers upgrades. They’re serviceable, but given what Apple and Dell offer these days, they feel almost insulting. Music sounds far too tinny, and they can barely even convey the faux drama of a typical movie trailer. I’m sure most people would use headphones while gaming, but if you’re the sort of person who relies on your laptop speakers for music, I beg you to consider other options.
I’d also recommend some sort of noise-blocking solution that can overpower the Laptop 16’s fans. While I was gaming and benchmarking the system, I could swear it was about to lift off like my DJI drone. The fans are louder than any gaming laptop I’ve encountered over the past few years, but at least they did their job: CPU temps stayed around 80 degrees Celsius under load, while the GPU typically stayed under 70C.
Since it’s a huge gaming laptop, I didn’t expect much battery life from the Framework Laptop 16, and I was right: It lasted for four hours and five minutes in the PCMark 10 “Modern Office” battery benchmark. I saw similar results while writing this review, and as you’d expect, it lasted around two hours playing a demanding game like Halo Infinite.
Photo by Devindra Hardawar/Engadget
Much like the original Framework notebook, the Laptop 16 is meant for a niche group of PC users, those who prioritize customizability and upgradability at all costs. If you’re a gamer trying to get the most frames for your dollar, this isn’t really the machine for you (consider these budget gaming PCs, or wait to see how we feel about the Zephyrus G14 in our review). But if you want a notebook that could last you for the next decade, and don’t mind so-so gaming performance, the Laptop 16 could be the notebook of your dreams.
This article originally appeared on Engadget at https://www.engadget.com/framework-laptop-16-review-modular-wonder-mediocre-gaming-laptop-150026910.html?src=rss
No, NVIDIA's mid-range RTX 40-series GPUs aren't getting any cheaper, but at least the new RTX 4070 Super packs in a lot more performance for $599. We called the original RTX 4070 the "1,440p gaming leader," and that still holds for the Super. It's so much faster, especially when it comes to ray tracing, that it edges close to the $799 RTX 4070 Ti (due to be replaced by its own Super variant, as well). And together with the power of DLSS 3 upscaling, the 4070 Super is a far more capable 4K gaming card.
So what makes the RTX 4070 Super so special? Raw power, basically. It features 7,168 CUDA cores, compared to 5,888 on the 4070 and 7,680 on the 4070 Ti. Its base clock speed is a bit higher than before (1.98GHz compared to the 4070's 1.92GHz), but it has the same 2.48GHz boost clock and 12GB of GDDR6X VRAM as the original.
The difference between the RTX 4070 Super and the plain model was immediately obvious. On my desktop, powered by a Ryzen 9 7900X with 32GB of RAM, I was able to run Cyberpunk 2077 in 4K with Ultra graphics and DLSS at an average of 78fps. The RTX 4070 sometimes struggled to stay above 60fps at those settings. NVIDIA’s new GPU showed its limits in Cyberpunk's RT Overdrive mode (which enables intensive real-time path tracing), where I only saw 51fps on average while using DLSS and frame generation. (CD Projekt says that mode is meant for the RTX 4070 Ti and up, or the RTX 3090 at 1080p/30fps.)
While the original RTX 4070 was a card that could occasionally let you game in 4K, the 4070 Super makes that a possibility far more often (so long as you can use DLSS). Of course, you'll need to have reasonable expectations (you’re not getting 4K/120fps) and ideally a G-Sync monitor to smooth out performance.
| | 3DMark TimeSpy Extreme | Port Royal (Ray Tracing) | Cyberpunk | Blender |
|---|---|---|---|---|
| NVIDIA RTX 4070 Super | 9,830 | 12,938 / 60 fps | 1440p RT Overdrive DLSS: 157 fps | 6,177 |
| NVIDIA RTX 4070 | 8,610 | 11,195 / 52 fps | 1440p RT DLSS: 120 fps | 6,020 |
| NVIDIA RTX 4070 Ti | 10,624 | 14,163 / 66 fps | 1440p RT DLSS: 135 fps | 7,247 |
| AMD Radeon RX 7900 XT | 11,688 | 13,247 / 61 fps | 1440p FSR RT: 114 fps | 3,516 |
When it comes to 1,440p gaming, the RTX 4070 Super is truly a superstar. In Cyberpunk's Overdrive ray tracing mode with Ultra graphics settings, I saw an average of 157fps — almost enough to satisfy the demands of a 165Hz 1,440p monitor. To my eye, the whole experience looked far smoother than the 4K Overdrive results and, as usual, I found it hard to tell the difference between 4K and 1,440p textures during actual gameplay.
Similarly, I'd rather keep the 160fps/1,440p average I saw in Halo Infinite with maxed out graphics, than the 83fps I reached in 4K. That game doesn't get an assist from DLSS, either, so there's no upscaling magic going on in those numbers.
Across most of our benchmarks, the RTX 4070 Super landed smack dab between the 4070 and 4070 Ti. In 3DMark TimeSpy Extreme, for example, the new GPU scored 9,830 points, compared to 8,610 on the 4070 and 10,624 on the 4070 Ti. In some cases, like the Port Royal ray tracing benchmark, it leaned far closer to the 4070 Ti (which also bodes well for the 4070 Super's overclocking potential). NVIDIA's advanced cooling setup on its "Founders Edition" cards also continues to work wonders: The 4070 Super idled at around 40C and typically maxed out at 66C under heavy load.
Photo by Devindra Hardawar/Engadget
The RTX 4070 Super is clearly a big step forward from the original card, and a far better value for $599. It's a solid upgrade if you're running a 20-series NVIDIA GPU and even some of the lower-end 30-series options. The value should hopefully trickle downhill, as well: The original 4070 now sells for $550 on NVIDIA's website and used models are on eBay for well below that.
While we’ll continue to long for the days when “mid-range” described a $300 GPU, NVIDIA is giving gamers more of a reason to shell out for the $599 RTX 4070 Super. It’ll satisfy all of your 1,440p gaming needs — and it’s ready to deliver decent 4K performance, as well.
This article originally appeared on Engadget at https://www.engadget.com/nvidia-rtx-4070-super-review-a-1440p-powerhouse-for-599-160025855.html?src=rss