Orbital AI data centers could work, but they might ruin Earth in the process

At the start of the month, Elon Musk announced that two of his companies — SpaceX and xAI — were merging, and would jointly launch a constellation of 1 million satellites to operate as orbital data centers. Musk's reputation might suggest otherwise, but according to experts, such a plan isn't a complete fantasy. However, if executed at the scale suggested, some of them believe it would have devastating effects on the environment and the sustainability of low Earth orbit.

Musk and others argue that putting data centers in space is practical given how much more efficient solar panels are outside Earth's atmosphere. In space, there are no clouds or weather events to obscure the sun, and in the correct orbit, solar panels can collect sunlight through much of the day. Citing declining rocket launch costs and the rising price of powering AI data centers on Earth, Musk has said that within three years space will be the cheapest place to generate AI compute power.

Ahead of the billionaire's announcement, SpaceX filed an eight-page application with the Federal Communications Commission detailing the plan. The company hopes to deposit the satellites in this massive cluster at altitudes ranging from 500km to 2,000km. They would communicate with one another and SpaceX's Starlink constellation using laser "optical links." Those Starlink satellites would then transmit inference requests to and from Earth. To power the entire effort, SpaceX has proposed putting the new constellation in sun-synchronous orbit, meaning the spacecraft would fly along the dividing line that separates the day and night sides of the planet.

Almost immediately, the plan was greeted with skepticism. How would SpaceX, for instance, cool millions of GPUs in space? At first glance, that might seem like a weird point to get hung up on — much of space being around -450 degrees Fahrenheit — but the reality is more complicated. In the near vacuum of space, the only way to dissipate heat is to slowly radiate it out, and in direct sunlight, objects can easily overheat. As one commenter on Hacker News succinctly put it, "a satellite is, if nothing else, a fantastic thermos."
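
A rough sense of the problem's scale comes from the Stefan-Boltzmann law, which governs how much heat a surface can radiate into a vacuum. The short Python sketch below sizes a radiator under illustrative assumptions (1 megawatt of waste heat, an emissivity of 0.9 and a radiator held at 350 Kelvin; none of these figures come from SpaceX's filing, and absorbed sunlight is ignored):

    # Back-of-the-envelope radiator sizing via the Stefan-Boltzmann law.
    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

    def radiator_area_m2(waste_heat_w: float, temp_k: float, emissivity: float = 0.9) -> float:
        """Area needed to radiate waste_heat_w into space at temperature temp_k."""
        return waste_heat_w / (emissivity * SIGMA * temp_k**4)

    print(f"{radiator_area_m2(1_000_000, 350.0):,.0f} m^2 per megawatt")  # ~1,306 m^2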

Scott Manley, who, before he created one of the most popular space-focused channels on YouTube, was a software engineer and studied computational physics and astronomy, argues SpaceX has already solved that problem at a smaller scale with Starlink. He points to the company's latest V3 model, which has about 30 square meters of solar panels. "They have a bunch of electronics in the middle, which are taking that power and doing stuff with it. Now, some of that power is being beamed away as radio waves, but there's a lot of thermal power that's being generated and then having to be dissipated. So they already have a platform that's running electronics off of power, and so it's not a massive leap to turn into something doing compute."

Kevin Hicks, a former NASA systems engineer who worked on the Curiosity rover mission, is more skeptical. "Satellites with the primary goal of processing large amounts of compute requests would generate more heat than pretty much any other type of satellite," he said. "Cooling them is another aspect of the design which is theoretically possible but would require a ton of extra work and complexity, and I have doubts about the durability of such a cooling system."  

What about radiation then? There's a reason NASA relies on ancient hardware like the PowerPC 750 CPU found inside the Perseverance rover: Older chips feature larger transistors, making them more resilient to bit flips — errors in processing caused most often by cosmic radiation — that might scramble a computation. "Binary ones and zeroes are about the presence or absence of electrons, and the amount of charge required to represent a 'one' goes down as the transistors get smaller and smaller," explains Benjamin Lee, professor of computer and information science at the University of Pennsylvania. Space is full of energized particles traveling at incredible velocities, and the latest GPUs are built on the smallest, most advanced processing nodes to create transistor-dense silicon. Not a great combination.
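
To see why that combination is so risky, consider a toy Python example. Flipping a single exponent bit in a 32-bit float, a format widely used to store neural network weights, doesn't nudge the value; it changes it by dozens of orders of magnitude:

    # Flip one bit of a float32's IEEE-754 encoding and observe the damage.
    import struct

    def flip_bit(value: float, bit: int) -> float:
        """Return value with the given bit of its float32 encoding inverted."""
        (as_int,) = struct.unpack("<I", struct.pack("<f", value))
        (flipped,) = struct.unpack("<f", struct.pack("<I", as_int ^ (1 << bit)))
        return flipped

    print(flip_bit(0.5, 30))  # flips the top exponent bit: 0.5 becomes ~1.7e+38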

"My concern about radiation is that we don't know how many bit flips will occur when you deploy the most advanced chips and hundreds of gigabytes of memory up there," said Professor Lee, pointing to preliminary research by Google on the subject. As part of Project Suncatcher, its own effort to explore the viability of space-based data centers, the company put one of its Trillium TPUs in front of a proton beam to bombard it with radiation. It found the silicon was "surprisingly radiation-hard for space applications." 

While those results were promising, Professor Lee points out we just don't know how resilient GPUs are to radiation at this scale. "Even though modern computer architectures can detect and sometimes correct for those errors, having to do that again and again will slow down or add overhead to space-based computation," he said.   

Space engineer Andrew McCalip, who's done a deep dive on the economics of orbital data centers, is more optimistic, pointing to the natural resilience of AI models. "They don't require 100 percent perfect error-free runs. They're inherently very noisy, very stochastic," he explains, adding that part of the training for modern AI systems involves "injecting random noise into different layers."   

Even if SpaceX could harden its GPUs against radiation, the company would still lose satellites to GPUs that break down. If you know anything about data centers here on Earth, it's that they require constant maintenance. Components like SSDs and GPUs die all the time. Musk has claimed SpaceX's AI satellites would require "little" in the way of operating or maintenance costs. That's only true if you accept the narrowest possible interpretation of what maintaining a fleet of AI satellites would entail.

"I think that there's no case in which repair makes sense. It's a fly till you die scenario," says McCalip. From an economic perspective, McCalip argues the projected death rate of GPUs in space represents "one of the biggest uncertainties" of the orbital data center model. McCalip's put that number at nine percent on the basis of a study Meta published following the release of its Llama 3 model (which, incidentally, measured hardware failures on Earth.) But the reality is no one knows what the attrition rate of those chips will be until they're in space. 

Orbital data centers also likely wouldn't be a direct replacement for their terrestrial counterparts. SpaceX's application specifically mentions inference as the primary use case for its new constellation. Inference is the practical side of running an AI system. It sees a model apply its learning to data it hasn't seen before, like a prompt you write in ChatGPT, to make predictions and generate content. In other words, AI models would still need to be trained on Earth, and it's not clear that the process could be offloaded to a constellation of satellites. "My initial thinking is that computations that require a lot of coordination, like AI training, may end up being tricky to get right at scale up there," says Professor Lee.     

In 1978, a pair of NASA scientists proposed a scenario in which low Earth orbit could become so dense with space junk that collisions between those objects would begin to cascade. That scenario is known as Kessler syndrome.

One estimate from satellite tracking website Orbiting Now puts the number of objects in orbit around the planet at approximately 15,600. Another estimate from NASA suggests there are 45,000 human-made objects orbiting Earth. No matter the number, what's currently in orbit represents a fraction of the 1 million additional satellites Musk wants to launch.  

According to Aaron Boley, professor of physics and astronomy at the University of British Columbia and co-director of the Outer Space Institute, forward-looking modeling of the orbital environment above 700 kilometers — where part of SpaceX's proposed cluster would live — suggests that area of space is already showing signs of Kessler syndrome.

While it takes less time for debris to clear in low Earth orbit, Professor Boley says there's already enough material in that region of space that a major collision could set off a cascading effect. Debris could, in a worst-case scenario, take a decade to clear up. In turn, that could lead to disruptions in global communications, climate monitoring missions and more.

"You could get to the point where you're just launching material in, and you could ask yourself how many satellites can I afford to lose? Can you reconstitute your constellation faster than you're losing parts of it because of debris?" says Boley. "That's a horrible future in terms of the environmental perspective" In particular, it would limit opportunities for humans to fly into low Earth orbit. "Could you operate in it? Yeah, but it would come with higher and higher costs," adds Boley. 

"The entire world is struggling with the problem of how we safely fly multiple mega constellations," says Richard DalBello, who previously ran the Traffic Coordination System for Space (TraCSS) at the US Department of Commerce. Right now, there is no common global space situational awareness (SSA) system, and government and satellite operators are using uncoordinated national and commercial systems that are likely producing different results. At the start of the year, SpaceX lowered the orbit of thousands of Starlink satellites after one of them nearly collided with a Chinese satellite. 

SpaceX has its own in-house SSA system called Stargaze, which it uses to fly its more than 7,000 Starlink satellites. According to DalBello, competing operators can receive SSA data from SpaceX, but to do so they must share their satellite position information. "Assuming data sharing, it is likely Stargaze can make an important contribution to spaceflight safety," says DalBello. "SpaceX is likely to have success with US and other commercial operators, but without the assistance of the federal government, other governments — particularly China — will likely be unwilling to share their satellite and SSA data."

According to DalBello, the Biden administration was unable to make meaningful progress on the next-generation TraCSS system, in part because Congress was initially reluctant to fund the program. Meanwhile, the current Trump administration hasn't shown interest in advancing the work that began during the president's first term.  

Even if the regulatory situation suddenly changes and the world's governments agree on an international SSA system, SpaceX launching 1 million satellites along the day-night terminator would see the company effectively monopolize one of the Earth's most valuable and important orbits. Professor Boley argues we should view our planet's orbits as a resource that belongs to everyone. "Every time you put a satellite up, you use part of that resource. Now someone else can't use it." 

And as Hicks points out, even a single cascade of colliding satellites would prevent that space from being used for scientific endeavors. "You would have to wait years for that debris to slowly come back into the atmosphere and burn up. In the meantime, that debris is taking up space that could be used for climate monitoring missions or any other types of missions that governments want to launch."   

Separately, the constant churn of Starship launches and re-entry of dead satellites would have a potentially dire impact on our planet's atmosphere. "We're not prepared for it," Boley flatly says of the latter. "We're not prepared for what's happening now, and what's happening now is already potentially bad." 

According to Musk's "basic math," SpaceX could add 100 gigawatts of AI compute capacity annually by launching a million tons of satellites per year. McCalip estimates a 100-gigawatt buildout alone would necessitate about 25,000 Starship flights.
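
Those two figures alone imply a staggering cadence. Using only the numbers quoted above:

    # Launch-cadence arithmetic from Musk's and McCalip's figures.
    tons_per_year = 1_000_000
    flights_per_year = 25_000
    print(f"{tons_per_year / flights_per_year:.0f} tons of payload per flight")  # 40 tons
    print(f"{flights_per_year / 365:.1f} launches per day, every day")           # ~68.5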

Many of the metals found in satellites, including aluminum, magnesium and lithium, in combination with the exhaust rockets release into the atmosphere, can have complicated effects on the health of the planet. For instance, they can affect polar cloud formations, which in turn can facilitate ozone layer destruction through the chemical reactions that occur on their surfaces. According to Boley, the problem is we just don't know how severe those environmental factors could become at the scale Musk has proposed, and SpaceX has provided us with precious few details on its mitigation plans. All it has said is that its plan would "achieve transformative cost and energy efficiency while significantly reducing the environmental impact associated with terrestrial data centers."   

Even if SpaceX does go out of its way to mitigate the atmospheric effects of constant rocket flights, those spacecraft still need to be manufactured here on Earth. In one of his previous roles, Hicks studied rocket emissions and found the supply chains needed to build rockets produce an "order of magnitude" more carbon emissions than the rockets themselves.

SpaceX plans to fly its new satellites in a sun-synchronous orbit, meaning for much of the year, they'll be sunlit. Each new Starlink generation has been larger and heavier than the one before it, with SpaceX stating in a recent filing that its upcoming V3 model could weigh up to 2,000 kilograms, up from the 575 kilograms of the V2 Mini Optimized. While we don't know the exact dimensions of the company's still-hypothetical AI satellites, they will almost certainly be bigger than their Starlink counterparts. 

SpaceX has done more than most space operators to reduce the brightness of its satellites, but Professor Boley says he expects that this new constellation will be "strikingly bright" when moving through the night sky. In aggregate, he estimates they will almost certainly be harmful to scientific research here on Earth, limiting what terrestrial observatories can see.  

"You're going to see them with the naked eye. You're going to see them with cameras. It's going to be like living near an airport where you see all these things flying over just after sunset and the next couple of hours after sunset," says Manley. "I don't know if I want to have my entire sunset be just a band of satellites constantly shooting overhead."

There are good reasons to make some spacecraft capable of doing AI inference. For instance, Professor Lee suggests it would make orbital imaging satellites more useful, as those spacecraft could do on-site analysis, instead of sending high-resolution files over long distances, saving time in the process. But the dose, as they say, makes the poison.

"There's a lot of excitement about the many possibilities that can be brought to society and humanity through continued access to space, but the promise of prosperity is not permission to be reckless," he says. "At this moment, we're allowing that excitement to overtake that more measured progression [...] those impacts don't just impact outer space but Earth as well." 

This article originally appeared on Engadget at https://www.engadget.com/ai/orbital-ai-data-centers-could-work-but-they-might-ruin-earth-in-the-process-170000099.html?src=rss

The best cheap phones for 2026

A few years ago, it may have been fashionable to spend $1,000 on the latest flagship smartphone, but for most people, that’s neither practical nor necessary. You don't even have to spend $500 today to get a decent handset, whether it’s a refurbished iPhone or an affordable Android phone, as there are plenty of decent options as low as $160.

However, navigating the budget phone market can be tricky; options that look good on paper may not hold up in practice, and some devices will end up costing you more when you consider many come with restrictive storage. While we spend most of our time reviewing mid- to high-end handsets at Engadget, we've tested a number of the latest budget-friendly phones on the market to see which ones cut it as the best cheap phones you can get right now.

For this guide, our top picks cost between $100 and $300. Anything less and you might as well go buy a dumb phone instead. Since they’re meant to be more affordable than flagship phones and even midrange handsets, budget smartphones involve compromises; the cheaper a device, the lower your expectations around specs, performance and experience should be. For that reason, the best advice I can give is to spend as much as you can afford. In this price range, even $50 or $100 more can get you a dramatically better product.

Second, you should know what you want most from a phone. When buying a budget smartphone, you may need to sacrifice a decent main camera for long battery life, or trade a high-resolution display for a faster CPU. That’s just what comes with the territory, but knowing your priorities will make it easier to find the right phone.

It’s also worth noting some features can be hard to find on cheaper handsets. For instance, you won’t need to search far for a device with all-day battery life — but if you want a phone with excellent camera quality, you’re better off shelling out for one of the recommendations in our midrange smartphone guide, which all come in at $600 or less.

Wireless charging and waterproofing also aren’t easy to find in this price range and forget about the fastest chipset. On the bright side, most of our recommendations come with headphone jacks, so you won’t need to buy wireless headphones.

iOS is also off the table, since, following the discontinuation of the iPhone SE, the $599 iPhone 16e is now the most affordable offering from Apple. That leaves Android as the only option in the under-$300 price range. Thankfully today, there’s little to complain about Google’s operating system – and you may even prefer it to iOS.

Lastly, keep in mind most Android manufacturers typically offer far less robust software features and support for their budget devices. In some cases, your new phone may only receive one major software update and a year or two of security patches beyond that. That applies to the OnePlus and Motorola recommendations on our list.

If you’d like to keep your phone for as long as possible, Samsung has the best software policy of any Android manufacturer in the budget space, offering at least four years of security updates on all of its devices. Recently, it even began offering six years of support on the $200 A16 5G, which we recommend below. That said, if software support (or device longevity overall) is your main focus, consider spending a bit more on the $500 Google Pixel 9a, or even the previous-gen Pixel 8a, which has planned software updates through mid-2031.

This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/best-cheap-phones-130017793.html?src=rss

The next Metal Gear Solid remaster collection arrives this summer

Volume two of the Metal Gear Solid: Master Collection will arrive on August 27, publisher Konami announced today during Sony’s latest State of Play presentation. The bundle will feature 2008’s Metal Gear Solid 4: Guns of the Patriots, the HD remaster of 2010’s Metal Gear Solid: Peace Walker and a selection of bonus content, including Metal Gear: Ghost Babel, which was originally released for Game Boy Color in 2000. All told, that’s a smaller selection of games than Konami made available with Vol. 1 of the Master Collection, but Metal Gear fans will be excited nonetheless, if only for the fact it will mark the first time MGS4 will be officially playable on a platform other than the PlayStation 3.

That it has taken Konami nearly two decades to release the conclusion of Solid Snake's story on more systems has to do with the nature of the game as a PS3 exclusive. MGS4 took extensive advantage of the console's unique Cell architecture, a fact that made it a difficult (and expensive) proposition to port to more recent x86-based systems. In recent years, it's been possible to emulate the game on a powerful PC, but not everyone has that kind of hardware.

Metal Gear Solid: Master Collection Vol.2 will be available on PS5, Xbox Series X/S, PC, Nintendo Switch and Nintendo Switch 2.

Update, February 12, 6:30PM ET: This story was updated after publish to add details about Metal Gear Solid: Master Collection Vol.2’s launch platforms.

This article originally appeared on Engadget at https://www.engadget.com/gaming/playstation/the-next-metal-gear-solid-remaster-collection-arrives-this-summer-231711005.html?src=rss

How to buy a GPU in 2026

One of the toughest parts of any new computer build or upgrade is finding the right video card. In a gaming PC, the GPU is easily the most important part, and you can limit your experience by going with the wrong model. The buying process can be frustrating, especially right now with memory shortages leading to higher prices. In this guide, we'll help you navigate the market and find the right GPU for your needs.

The first question to ask yourself is what kind of games you want to play. Competitive shooters like Valorant, Overwatch and Marvel Rivals were designed to run on older hardware. As such, even entry-level GPUs like the GeForce RTX 5060 can push those games at 120 frames per second and above at 1080p (more on why that's important in a moment).

By contrast, if you want to play modern, single-player games with ray tracing and other graphical extras, you'll need a more powerful GPU. Just how much more powerful will depend on the resolution of your monitor.

A 1440p or QHD monitor has 78 percent more pixels than a 1080p screen, and a 4K display has more than twice as many pixels as a QHD panel. In short, running a game at 4K, especially at anything above 60 frames per second, is demanding, and most GPUs will need to use upscaling techniques like NVIDIA's Deep Learning Super Sampling (DLSS) and AMD's FidelityFX Super Resolution (FSR) to push new games at high refresh rates.
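
For reference, the pixel math behind those comparisons works out as follows:

    # Pixel counts for common gaming resolutions.
    resolutions = {"1080p": (1920, 1080), "1440p (QHD)": (2560, 1440), "4K (UHD)": (3840, 2160)}
    for name, (w, h) in resolutions.items():
        print(f"{name}: {w * h:,} pixels")
    # 1080p: 2,073,600 | QHD: 3,686,400 (about 78 percent more) | 4K: 8,294,400 (2.25x QHD)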

On the subject of resolution, it doesn't make sense to spend a lot of money on a 4K monitor only to pair it with an inexpensive GPU. That's a recipe for a bad experience. As you're shopping for a new video card, you should think about the resolution and frame rate at which you want to play your games. If you're in the market for both a GPU and display, be sure to check out our guide to the best gaming monitors.

If your budget allows, a good bet is to buy a midrange card that can comfortably render all but the most demanding games at 1440p and at least 144 frames per second. Put another way, you want a GPU that can saturate a monitor at its native resolution and refresh rate in as many games as possible. That will give you the smoothest possible experience in terms of motion clarity, and allow you to dabble in both competitive shooters and the latest single-player games as the mood strikes you.

Intel Arc B580 label view (Photo by Devindra Hardawar/Engadget)

One of the confusing aspects of the GPU industry is the number of companies involved. What you need to know is that there are three main players: AMD, Intel and NVIDIA. They design the cards you can buy, but delegate the manufacturing of them to so-called add-in board (AIB) partners like ASUS, XFX, Gigabyte and others.

As you can imagine, this creates some headaches. The most annoying of which is that AMD, Intel and NVIDIA will often set recommended prices for their graphic cards, only for their partners to sell their versions of those GPUs for more than the manufacturer's suggested retail price (MSRP). For example, NVIDIA's website lists the RTX 5070 with a starting price of $549. On Newegg, there are no new 5070s listed at that price. The only models anywhere close to $549 are refurbished and open box specials. If you want one that comes sealed, that will cost you at least $630.

As for what company you should buy your new GPU from, before 2025, NVIDIA was the undisputed king of the market. Specific GeForce cards may not have offered the best rasterization performance in their price range, but between their performance in games with ray tracing and the fact NVIDIA was ahead on features like DLSS, an RTX GPU was a safe bet.

However, with this year's RTX 50 series release (and excluding models like the RTX 5080 and 5090 where there's no competition), it's safe to say NVIDIA missed the mark this generation. If you're in the market for an entry- or mid-level GPU, AMD and Intel offer better value, with cards that come with enough VRAM for now and into the future. That said, there are still a few reasons you might consider an NVIDIA GPU, starting with ray tracing.

For decades, developers have used rasterization techniques to approximate how light behaves in the real world, and the results have been commendable. But if you know what to look for, it's easy to see where the illusion falls apart. For that reason, real-time ray tracing has been a goal of the industry for years, and in 2018 it became a reality with NVIDIA's first RTX cards.

In some games, effects like ray-traced reflections and global illumination are transformational. Unfortunately, those features are expensive to run, often coming at a significant frame rate drop without upscaling. Since ray tracing was optional in many games before 2025, you could save money by buying an AMD GPU. For example, even if the RX 7800 XT was worse at ray tracing than the RTX 4070, the former was often cheaper to buy, had more onboard VRAM and offered as good or better rasterization performance in many games.

However, you can't ignore ray tracing performance anymore. We're starting to see releases like Doom: The Dark Ages where the tech is an integral part of a game's rendering pipeline, and more are likely to follow in the future. Thankfully, AMD's newest cards are much better in that regard, though you'll still get an edge running an NVIDIA model. For that reason, if ray tracing is important to you, NVIDIA cards are still the way to go.

If you're new to the world of PC gaming, it can be tricky to wrap your head around refresh rates. In short, the higher the refresh rate of a monitor, the more times it can update the image it displays on screen every second, thereby producing a smoother moving picture.

For example, moving elements on a monitor with a 240Hz refresh rate will look better than on one with a 120Hz refresh rate. However, that's dependent on your GPU being able to consistently render a game at the appropriate frame rates. In the case of a 120Hz monitor, you want a GPU with enough headroom to drive most games at 120 fps. Realistically, most video cards won't be able to achieve that in every game, but it's a good baseline to aim for when shopping for a new GPU.
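
Another way to think about refresh rate is as a per-frame rendering budget, which the quick calculation below spells out:

    # Time budget per frame at common refresh rates.
    for hz in (60, 120, 144, 240):
        print(f"{hz}Hz -> {1000 / hz:.1f} ms to render each frame")
    # 60Hz -> 16.7 ms | 120Hz -> 8.3 ms | 144Hz -> 6.9 ms | 240Hz -> 4.2 ms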

Since the release of NVIDIA's RTX 40-series GPUs, the company has offered a feature called frame generation. As the name suggests, it allows NVIDIA's latest video cards to generate an additional frame for every frame they render normally. With the 50-series, NVIDIA has since begun offering multi-frame generation, which gives those GPUs the ability to generate up to three additional frames for every rendered frame. AMD has its own take on the tech, as does Intel, though NVIDIA's offering is considered superior to both due to how it handles frame pacing.
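
The math behind those multipliers is straightforward, as the sketch below shows; note that only the natively rendered frames reflect your inputs:

    # Output frame rate with frame generation: native frames plus AI-generated ones.
    def output_fps(native_fps: float, generated_per_rendered: int) -> float:
        return native_fps * (1 + generated_per_rendered)

    print(output_fps(60, 1))  # 120 fps with single frame generation (2x)
    print(output_fps(60, 3))  # 240 fps with multi-frame generation (4x)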

Frame generation is nice to have, but it's not the silver bullet it might seem. Enabling it will increase system latency, reducing how responsive your games feel. Somewhat unintuitively, high-end GPUs also benefit more from the tech than their entry-level counterparts since they can naturally render more frames. For that reason, it's best to think of frame generation as a way to get the most out of a high refresh rate display.

I've mentioned DLSS a few times already. Alongside FSR and Intel XeSS, DLSS is an example of what's known as an image reconstruction technology. More and more, native rendering is going out of fashion in game design. With ray tracing and other modern effects enabled, even the most powerful GPUs can struggle to render a game at 1440p or 4K and a playable framerate. That’s why many developers will turn to DLSS, FSR or XeSS to eke out additional performance by upscaling a lower resolution image to QHD or UHD.

Upscaling in games is nothing new. For example, the PS4 Pro used a checkerboard technique to output games in 4K. What’s different now is how modern GPUs go about it. With DLSS, NVIDIA pioneered an approach that uses machine learning to recreate an image at a higher resolution, and in the process, addressed some of the pitfalls of past upscaling methods. If you're sensitive to these sorts of things, there's still blur and shimmer with DLSS, FSR and XeSS, but it's much less pronounced and can lead to significant performance gains.
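
To make the idea concrete, here's the internal resolution a GPU would actually render for a 4K output under the per-axis scale factors commonly cited for DLSS-style quality modes (treat the exact ratios as illustrative; they vary by vendor and version):

    # Internal render resolution for a 3840x2160 output at common upscaling ratios.
    modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 0.333}
    for mode, scale in modes.items():
        print(f"{mode}: {round(3840 * scale)}x{round(2160 * scale)}")
    # Quality mode draws roughly a QHD image and reconstructs the rest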

NVIDIA later added single and multi-frame generation to DLSS. DLSS is only available on NVIDIA cards and, following the recent release of DLSS 4.5, is widely considered to offer the best image quality. That's another reason why you might choose an NVIDIA card over one of its competitors.

However, if you decide to go with an AMD GPU, don't feel like you're missing out. The company recently released FSR 4. While it's not quite on par with DLSS 4 and 4.5 in terms of support and image quality, it's a major leap over FSR 3 and FSR 2.

While on the subject of DLSS, I'll also mention NVIDIA Reflex. It's a latency-reducing technology NVIDIA introduced in 2020. AMD has its own version called Radeon Anti-Lag, but here again Team Green has a slight edge. If you're serious about competitive games, Reflex can significantly reduce input lag, which will make it easier to nail your shots in Counter-Strike 2, Valorant and other shooters.

Previously, one of the reasons to pick an NVIDIA GPU over the competition was the company's solid track record of driver support. With one of the company's video cards, you were less likely to run into stability issues and games failing to launch. At the start of 2025, NVIDIA's drivers were abysmal, but the company has since corrected course.

As you're comparing different GPUs, especially those in the same tier, pay close attention to the amount of VRAM they offer. Many modern games will eat up as much VRAM as a GPU can offer, and if your card has a low amount, such as 8GB, you're likely to run into a performance bottleneck.

If your budget allows for it, always go for the model with more VRAM. Consider, for instance, the difference between the $379 RTX 5060 Ti 8GB and $429 RTX 5060 Ti 16GB. Spending an extra $50 is going to be a lot for some people, but it's the difference between a card that is only adequate for many recent releases and one that will last you for a few years. In many cases, more VRAM is better.

A slight caveat to this is when comparing models that have different memory bandwidths. A GPU with faster access to its memory can outperform one that simply has more of it. Here, you'll want to read reviews of the models you're comparing to see how they perform in different games.
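
Bandwidth itself comes from a simple formula: the per-pin data rate multiplied by the bus width. The two configurations below are illustrative rather than specific products, but they show how a wider bus can outweigh faster memory chips:

    # Peak memory bandwidth: data rate (Gbps per pin) x bus width (bits) / 8 bits per byte.
    def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
        return data_rate_gbps * bus_width_bits / 8

    print(bandwidth_gb_s(28, 128))  # 448 GB/s: fast memory on a narrow 128-bit bus
    print(bandwidth_gb_s(18, 256))  # 576 GB/s: slower memory on a wide 256-bit bus wins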

Modern GPUs are big. Most new cards will take up at least two PCI slots on the back of your motherboard. They can also vary dramatically in length, depending on the number of fans the AIB has added to cool the PCB. To be safe, be sure to check the length of the card you want to buy against the maximum clearance listed by your case manufacturer. If you have a radiator at the front of your case, you will also need to factor the size of that in your measurements. The last thing you want is to buy a card that doesn't fit in your case.

Lastly, be sure to check the recommended power supply for the card you want. As a rule of thumb, unless you know what you're doing, it's best to just stick with the manufacturer's recommendation. For instance, NVIDIA suggests pairing the RTX 5070 Ti with a 750 watt PSU. So if you're currently running a 650 watt unit, you'll need to factor in the price of a PSU upgrade with your new GPU.

NVIDIA RTX 5060 Ti (Devindra Hardawar for Engadget)

It depends. If you can find a deal on an old RTX 40-series GPU, then yes. NVIDIA's RTX 50-series cards don't offer greatly improved performance over their predecessors, and with most models selling for more than their suggested retail price, it's not the best time to buy a new NVIDIA card.

That said, I suspect finding a good deal on a used GPU will be difficult. Most people will know the value of what they have, and considering the current market, will probably try to get as much as they can for their old card.

You may find better deals on older AMD and Intel GPUs, but I think you're better off spending more now on a new model from one of those companies since the generational gains offered by their latest cards are much more impressive. Simply put, the 9070 XT and B580 are two of the best cards you can buy right now.

Anything older than a card from NVIDIA's 40 series or AMD's RX 6000 family is not worth considering. Unless your budget is extremely tight or you mostly play older games, you're much better off spending more to buy a new card that will last you longer.

If you've read up to this point, you're probably wondering if it's even worth buying a GPU right now. The answer is (unsurprisingly) complicated. There are a handful of great cards like the Radeon RX 9060 XT and 9070 that are absolutely worth it. The problem is finding any GPU at a price approaching those set by AMD, Intel or NVIDIA.

The AI boom, and in particular actions by OpenAI, have led to memory shortages. In turn, those shortages have caused the price of consumer GPUs, SSDs and RAM kits to skyrocket in recent months. As of our latest update to this guide, some models like the GeForce RTX 5070 Ti are selling for hundreds of dollars above MSRP.

As such, if you own a relatively recent GPU, you're probably best off trying to hold onto your current card until things settle down. But if your GPU isn't cutting it anymore, you face a difficult decision: overpay now, or wait and potentially pay even more later.

To make that decision easier, I've been maintaining a separate guide that lists a selection of GPU models you can buy close to MSRP. My goal is to update that article at least once per month, so be sure to check back often.

Entry-level (1080p)

As we mentioned above, if you're only aiming to play basic competitive shooters like Valorant and Overwatch 2 in 1080p, an entry-level GPU may be all you need. While 1080p isn't an ideal resolution when it comes to sharpness, many gamers prefer it since it's easier to reach higher framerates. And it also helps that 1080p gaming monitors, like the AOC 24G15N 24-inch we recommend, tend to offer speedy refresh rates for between $100 and $200. When you're zipping through matches, you likely won't have time to take a breath and appreciate the detail from higher resolutions.

Here are our recommendations for entry-level video cards:

  • AMD Radeon RX 9060 XT 8GB: Surprisingly enough, you can actually find this AMD GPU for $300. While you'll have to live with 8GB of VRAM, that's more than enough for 1080p gaming, and it also has the benefit of FSR 4 upscaling.

  • AMD Radeon RX 7600: While it's a last-gen card, the RX 7600 is still powerful enough to handle basic shooters.

While entry-level cards can dabble with 1440p gaming, it's worth stepping up to something a bit more powerful if you actually want to achieve higher refresh rates. For most gamers, 1440p is the best balance between sharpness and high frame rates. It looks noticeably better than 1080p, and doesn't require the horsepower overhead of 4K. (And there's a good chance you won't really see a visual difference with the jump to 4K.)

Here are our recommendations for midrange GPUs:

  • NVIDIA RTX 5060 Ti: Forget the disappointing RTX 5070, the 5060 Ti delivers excellent 1080p and 1440p performance. And best of all, you can still find it under $500. (Read our NVIDIA RTX 5060 Ti review.)

  • AMD Radeon RX 9060 XT 16GB: A step up from the 8GB model we recommend above. The 16GB 9060 XT offers excellent performance across many of the latest games, and is less expensive than the 5060 Ti.

  • AMD Radeon RX 9070: AMD surprised us all with the Radeon RX 9070 and 9070 XT, two midrange cards that offered similar power to and more VRAM than NVIDIA's more expensive cards. While you won't see the RX 9070 for its $550 launch price today, you can still snag one for a slight premium at $650. (Check out our AMD Radeon RX 9070 and 9070 XT review.)

If you want the most of what modern PC games have to offer, including 4K and all of the benefits of ray tracing, then be ready to spend big bucks on a high-end GPU. If you're going this route, though, be sure you're also gaming on a high-end monitor that befits these powerful GPUs.

Here are our recommendations for premium GPUs:

  • NVIDIA RTX 5070 Ti: The RTX 5070 Ti surprised me with excellent 4K gaming performance for a launch price that was well below the RTX 5080. It's the best overall NVIDIA card if you want to play in 4K at 120Hz or beyond, but it's also the hardest to find at MSRP. (Check out our NVIDIA RTX 5070 Ti review.)

  • AMD Radeon RX 9070 XT: I already mentioned the RX 9070 XT. With shortages of the 5070 Ti, it's the best GPU you can buy now without paying a ridiculous premium. (Check out our AMD Radeon RX 9070 and 9070 XT review.)

  • NVIDIA RTX 5080: If the RTX 5070 Ti isn't enough for you, the RTX 5080's additional power should suit your fancy. Just be prepared to pay around $1,500 for it, a 50 percent jump from its $999 launch price.

This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/how-to-buy-a-gpu-160100017.html?src=rss

Apple just made Xcode better for vibe coding

Apple has just released Xcode 26.3, and it's a big step forward in terms of the company's support of coding agents. The new release expands on the AI features the company introduced with Xcode 26 at WWDC 2025 to give systems like Claude and ChatGPT more robust access to its in-house IDE. 

With the update, Apple says Claude and OpenAI's Codex "can search documentation, explore file structures, update project settings, and verify their work visually by capturing Xcode Previews and iterating through builds and fixes." This is in contrast to earlier releases of Xcode 26 where those same agents were limited in what they could see of a developer's Xcode environment, restricting their utility. According to Apple, the change will give users tools they can use to streamline their processes and work more efficiently than before.

Developers can add Claude and Codex to their Xcode terminal from the Intelligence section of the app's settings menu. Once a provider is selected, the interface allows users to also pick their preferred model. So if you like the outputs of, say, GPT 5.1 over GPT 5.2, you can use the older system.

The tighter integration with Claude and Codex was made possible by Model Context Protocol (MCP) servers Apple has deployed. MCP is a technology Anthropic debuted in fall 2024 to make it easier for large language models like Claude to share data with third-party tools and systems. Since its introduction, MCP has become an industry standard — with OpenAI, for instance, adopting the protocol last year to facilitate its own set of connections. 

Apple says it worked directly with Anthropic and OpenAI to optimize token usage through Xcode, but the company’s adoption of MCP means developers will be able to add any coding agent that supports the protocol to their terminal in the future. Xcode 26.3 is available to download for all members of the Apple Developer Program starting today, with Mac App Store availability “coming soon.”

This article originally appeared on Engadget at https://www.engadget.com/ai/apple-just-made-xcode-better-for-vibe-coding-195653049.html?src=rss

OpenAI brings its Codex coding app to Mac, with new multi-agent abilities included

Since last spring, OpenAI has offered Codex. What started life as the company's response to Claude Code is becoming something more sophisticated with the release of a new dedicated macOS app. In its most basic form, Codex is a programming agent capable of writing code for users, but now it can also manage multiple AI assistants that work together to complete more complex tasks.

OpenAI gives an example of how this could work in practice. The company used Codex to create a Mario Kart-like racing game, complete with a selection of different playable cars, eight tracks and a collection of powerups players can use against the competition. For a single AI agent, generating a game from scratch, with all the needed visual assets, would be a tough ask, but Codex was able to complete the task because it could delegate the work of making the game to different models with complementary capabilities. 

For example, it turned to GPT Image for the visual assets, while a separate model simultaneously coded the web game. "It took on the roles of designer, game developer and QA tester to validate its work by actually playing the game," OpenAI says of the process. 

If that sounds complicated, OpenAI has tried to make it more approachable with a section of the app titled Skills. The feature bundles “instructions, resources, and scripts so Codex can reliably connect to tools, run workflows, and complete tasks according to your team’s preferences," the company explains. "The Codex app includes a dedicated interface to create and manage skills. You can explicitly ask Codex to use specific skills, or let it automatically use them based on the task at hand."

As you might imagine, Codex can also automate repetitive tasks. A dedicated Automations section of the app allows you to schedule tasks, which the software will complete in the background. "At OpenAI, we’ve been using Automations to handle the repetitive but important tasks, like daily issue triage, finding and summarizing CI failures, generating daily release briefs, checking for bugs, and more," the company said. 

The release of the Codex macOS app comes as AI startups explore what a group of AI agents working in parallel can accomplish. At the start of the year, Anysphere, the company behind Cursor, found it was possible to build a working web browser from scratch using such an approach, though it did encounter problems along the way. 

For a limited time, OpenAI is making Codex available to ChatGPT Free and Go users so they can see what's possible with this new software. At the same time, the company is doubling rates for Plus and Pro subscribers.

This article originally appeared on Engadget at https://www.engadget.com/ai/openai-brings-its-codex-coding-app-to-mac-with-new-multi-agent-abilities-included-183103262.html?src=rss

NASA used Claude to plot a route for its Perseverance rover on Mars

Since 2021, NASA's Perseverance rover has achieved a number of historic milestones, including sending back the first audio recordings from Mars. Now, nearly five years after landing on the Red Planet, it just achieved another feat. This past December, Perseverance successfully completed a route through a section of the Jezero crater plotted by Anthropic's Claude chatbot, marking the first time NASA has used a large language model to pilot the car-sized robot.    

Between December 8 and 10, Perseverance drove approximately 400 meters (about 437 yards) along a route Claude mapped out through a field of rocks on the Martian surface. As you might imagine, using an AI model to plot a course for Perseverance wasn't as simple as inputting a single prompt.

As NASA explains, routing Perseverance is no easy task, even for a human. "Every rover drive needs to be carefully planned, lest the machine slide, tip, spin its wheels, or get beached," NASA said. "So ever since the rover landed, its human operators have painstakingly laid out waypoints — they call it a 'breadcrumb trail' — for it to follow, using a combination of images taken from space and the rover’s onboard cameras." 

To get Claude to complete the task, NASA had to first provide Claude Code, Anthropic's programming agent, with the "years" of contextual data from the rover before the model could begin writing a route for Perseverance. Claude then went about the mapping process methodically, stringing together waypoints from ten-meter segments it would later critique and iterate on.  
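
NASA hasn't published the underlying tooling, but conceptually the loop resembles the Python sketch below. Every function here is hypothetical and stands in for a model call; it only illustrates the propose-critique-iterate shape of the process:

    # Conceptual sketch of segment-by-segment route planning (all names hypothetical).
    SEGMENT_M = 10  # Claude reportedly worked in roughly ten-meter segments

    def propose_segment(position_m: int, goal_m: int) -> int:  # stand-in for a model proposal
        return position_m + min(SEGMENT_M, goal_m - position_m)

    def critique_segment(start_m: int, end_m: int) -> bool:    # stand-in for model self-review
        return True  # a real critique would flag slopes, sand and rocks, prompting revision

    def plan_route(start_m: int = 0, goal_m: int = 400) -> list[int]:
        waypoints = [start_m]
        while waypoints[-1] < goal_m:
            candidate = propose_segment(waypoints[-1], goal_m)
            if critique_segment(waypoints[-1], candidate):  # otherwise: revise and retry
                waypoints.append(candidate)
        return waypoints

    print(len(plan_route()) - 1, "segments for a ~400 m drive")  # 40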

This being NASA we're talking about, engineers from the agency's Jet Propulsion Laboratory (JPL) made sure to double check the model's work before sending it to Perseverance. The JPL team ran Claude's waypoints through a simulation they use every day to confirm the accuracy of commands sent to the rover. In the end, NASA says it only had to make "minor changes" to Claude's route, with one tweak coming as a result of the fact the team had access to ground-level images Claude hadn't seen in its planning process.  

"The engineers estimate that using Claude in this way will cut the route-planning time in half, and make the journeys more consistent," NASA said. "Less time spent doing tedious manual planning — and less time spent training — allows the rover’s operators to fit in even more drives, collect even more scientific data, and do even more analysis. It means, in short, that we’ll learn much more about Mars."

While the productivity gains offered by AI are often overstated, in the case of NASA, any tool that could allow its scientists to be more efficient is sure to be welcome. Over the summer, the agency lost about 4,000 employees – accounting for about 20 percent of its workforce – due to Trump administration cuts. Going into 2026, the president had proposed gutting the agency's science budget by nearly half before Congress ultimately rejected that plan in early January. Still, even with its funding preserved just below 2025 levels, the agency has a tough road ahead. It's being asked to return to the Moon with less than half the workforce it had during the height of the Apollo program.     

For Anthropic, meanwhile, this is a major feat. You may recall last spring Claude couldn't even beat Pokémon Red. In less than a year, the company's models have gone from struggling to navigate a simple 8-bit Game Boy game to successfully plotting a course for a rover on a distant planet. NASA is excited about the possibility of future collaborations, saying "autonomous AI systems could help probes explore ever more distant parts of the solar system."

This article originally appeared on Engadget at https://www.engadget.com/ai/nasa-used-claude-to-plot-a-route-for-its-perseverance-rover-on-mars-203150701.html?src=rss

Google’s Project Genie lets you create your own 3D interactive worlds

This past summer, Google DeepMind debuted Genie 3. It’s what’s known as a world model, an AI system capable of generating images and reacting as the user moves through the environment the software is simulating. At the time, DeepMind positioned Genie 3 as a tool for training AI agents. Now, it’s making the model available to people outside of Google to try with Project Genie.

To start, you’ll need Google’s $250 per month AI Ultra plan to check out Project Genie. You’ll also need to live in the US and be 18 years or older. At launch, Project Genie offers three different modes of interaction: World Sketching, exploration and remixing. The first sees Google’s Nano Banana Pro model generating the source image Genie 3 will use to create the world you will later explore. At this stage, you can describe your character, define the camera perspective — be it first-person, third-person or isometric — and how you want to explore the world Genie 3 is about to generate. Before you can jump into the model’s creation, Nano Banana Pro will “sketch” what you’re about to see so you can make tweaks. It’s also possible to write your own prompts for worlds others have used Genie to generate.

One thing to keep in mind is that Genie 3 is not a game engine. While its outputs can look game-like, and it can simulate physical interactions, there aren’t traditional game mechanics here. Generations are also limited to 60 seconds, as is the presentation, which is capped at 24 frames per second and 720p. Still, if you’re an AI Ultra subscriber, this is a cool opportunity to see the bleeding edge of what DeepMind has been working on over the past couple of years.

This article originally appeared on Engadget at https://www.engadget.com/ai/googles-project-genie-lets-you-create-your-own-3d-interactive-worlds-183646428.html?src=rss