Machine learning pioneers, including the ‘Godfather of AI,’ are awarded the Nobel Prize in Physics

Two scientists have been awarded the Nobel Prize in Physics “for foundational discoveries and inventions that enable machine learning with artificial neural networks.” John Hopfield, an emeritus professor at Princeton University, devised an associative memory that can store and reconstruct images and other types of patterns in data. Geoffrey Hinton, who has been dubbed the "Godfather of AI," pioneered a way to autonomously find properties in data, leading to the ability to identify certain elements in pictures.
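
For readers who want a concrete sense of what an "associative memory" does, here is a minimal, illustrative sketch of a Hopfield-style network in Python. The network size, patterns and update rule are invented for demonstration rather than taken from Hopfield's work or the Nobel citation; the point is simply that a weight matrix built from stored patterns can reconstruct a complete pattern from a corrupted cue.

```python
# Toy Hopfield-style associative memory: store binary patterns in a weight
# matrix via Hebbian learning, then recover a stored pattern from a noisy cue.
# All sizes and patterns here are illustrative choices.
import numpy as np

def train(patterns):
    """Build the weight matrix as the averaged outer products of the patterns."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0)  # no self-connections
    return w / len(patterns)

def recall(w, state, steps=10):
    """Repeatedly update all units until the network settles on a stored pattern."""
    for _ in range(steps):
        state = np.sign(w @ state)
        state[state == 0] = 1
    return state

# Store two 8-unit patterns of +1/-1 values, then recover one from a corrupted cue.
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
w = train(patterns)
noisy = patterns[0].copy()
noisy[:2] *= -1           # flip two units to simulate a degraded input
print(recall(w, noisy))   # recovers the first stored pattern
```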

"This year’s physics laureates’ breakthroughs stand on the foundations of physical science. They have showed a completely new way for us to use computers to aid and to guide us to tackle many of the challenges our society face," the committee wrote on X. "Thanks to their work humanity now has a new item in its toolbox, which we can choose to use for good purposes. Machine learning based on artificial neural networks is currently revolutionizing science, engineering and daily life."

However, Hinton has grown concerned about machine learning and its potential impact on society. He was part of Google's deep-learning artificial intelligence team (Google Brain, which merged with DeepMind last year) for many years before resigning in May 2023 so he could "freely speak out about the risks of AI." At the time, he expressed concern about generative AI spurring a tsunami of misinformation and having the potential to wipe out jobs, along with the possibility of fully autonomous weapons emerging.

Hinton acknowledged that machine learning and AI will likely improve health care, but “it’s going to exceed people in intellectual ability. We have no experience of what it’s like to have things smarter than us,” he told reporters, according to The New York Times. That said, Hinton, a Turing Award winner and professor of computer science at the University of Toronto, was “flabbergasted” to learn that he had become a Nobel Prize laureate.

This article originally appeared on Engadget at https://www.engadget.com/ai/machine-learning-pioneers-including-the-godfather-of-ai-are-awarded-the-nobel-prize-in-physics-132124417.html?src=rss

Microsoft may have finally made quantum computing useful

The dream of quantum computing has always been exciting: What if we could build a machine working at the quantum level that could tackle complex calculations exponentially faster than a computer limited by classical physics? But despite iterative hardware announcements from IBM, Google and others, quantum computers still aren't being used for any practical purposes. That might change with today's announcement from Microsoft and Quantinuum, which say they've developed the most error-free quantum computing system yet.

While classical computers and electronics rely on binary bits as their basic unit of information (they can be either on or off), quantum computers work with qubits, which can exist in a superposition of two states at the same time. The trouble with qubits is that they're prone to error, which is the main reason today's quantum computers (known as Noisy Intermediate Scale Quantum [NISQ] computers) are just used for research and experimentation.
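
To make the noise problem concrete, here is a toy Python sketch that isn't tied to any vendor's hardware: a qubit in an equal superposition collapses randomly to 0 or 1 when measured, and even a small assumed per-gate error rate makes deep circuits unreliable.

```python
# Toy model of a single qubit and of accumulating gate error. The per-gate error
# rate below is an assumption for illustration, not a figure from the article.
import numpy as np

rng = np.random.default_rng(0)

# |psi> = (|0> + |1>) / sqrt(2): an equal superposition of the two basis states.
psi = np.array([1.0, 1.0]) / np.sqrt(2)
probs = np.abs(psi) ** 2                      # Born rule: measurement probabilities
print(rng.choice([0, 1], size=10, p=probs))   # each measurement gives a random 0 or 1

# Probability that a circuit of `depth` gates finishes with no error at all,
# assuming each gate independently fails with probability `per_gate_error`.
per_gate_error = 0.001
for depth in (10, 1_000, 100_000):
    print(depth, (1 - per_gate_error) ** depth)
```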

Microsoft's solution was to group physical qubits into virtual qubits, which lets it run error diagnostics and correction without destroying the underlying quantum states, all on Quantinuum's hardware. The result was an error rate 800 times better than relying on physical qubits alone: Microsoft claims it ran more than 14,000 experiments without any errors.
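
The actual Microsoft/Quantinuum scheme is a genuine quantum error-correcting code and far more involved than anything shown here, but as a loose classical analogy, the sketch below illustrates why grouping several unreliable "physical" units into one "logical" unit helps: a majority vote over noisy copies fails far less often than any single copy. The error rate and group sizes are assumptions for illustration only.

```python
# Classical analogy for logical qubits: majority voting over noisy copies.
# The physical error rate below is invented for illustration.
import random

def logical_error_rate(physical_error, copies, trials=200_000):
    """Monte Carlo estimate of how often a majority vote over `copies` noisy bits is wrong."""
    errors = 0
    for _ in range(trials):
        flips = sum(random.random() < physical_error for _ in range(copies))
        if flips > copies // 2:    # majority of the copies corrupted -> logical error
            errors += 1
    return errors / trials

p = 0.01  # assumed per-copy error rate
for copies in (1, 3, 5, 7):
    print(copies, logical_error_rate(p, copies))
# The logical error rate drops steeply as more physical bits back each logical bit.
```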

According to Jason Zander, EVP of Microsoft's Strategic Missions and Technologies division, this achievement could finally bring us to "Level 2 Resilient" quantum computing, which would be reliable enough for practical applications.

"The task at hand for the entire quantum ecosystem is to increase the fidelity of qubits and enable fault-tolerant quantum computing so that we can use a quantum machine to unlock solutions to previously intractable problems," Zander wrote in a blog post today. "In short, we need to transition to reliable logical qubits — created by combining multiple physical qubits together into logical ones to protect against noise and sustain a long (i.e., resilient) computation."

Microsoft's announcement is a "strong result," according to Aram Harrow, a professor of physics at MIT focusing on quantum information and computing. "The Quantinuum system has impressive error rates and control, so it was plausible that they could do an experiment like this, but it's encouraging to see that it worked," he said in an e-mail to Engadget. "Hopefully they'll be able to keep maintaining or even improving the error rate as they scale up."

Researchers will be able to get a taste of Microsoft's reliable quantum computing in the next few months, when it arrives as a private preview via Azure Quantum Elements. The goal is to push even further to Level 3 quantum supercomputing, which will theoretically be able to tackle incredibly complex issues like climate change and exotic drug research. It's unclear how long it'll take to actually reach that point, but for now, at least we're one step closer to practical quantum computing.

"Getting to a large-scale fault-tolerant quantum computer is still going to be a long road," Professor Harrow wrote. "This is an important step for this hardware platform. Along with the progress on neutral atoms, it means that the cold atom platforms are doing very well relative to their superconducting qubit competitors."

This article originally appeared on Engadget at https://www.engadget.com/microsoft-may-have-finally-made-quantum-computing-useful-164501302.html?src=rss

Apple Silicon has a hardware-level exploit that could leak private data

A team of university security researchers has found a chip-level exploit in Apple Silicon Macs. The group says the flaw can bypass the computer’s encryption and access its security keys, exposing the Mac’s private data to hackers. The silver lining is that the exploit requires circumventing Apple’s Gatekeeper protections, installing a malicious app and then letting the software run for as long as 10 hours (along with a host of other complex conditions), which reduces the odds you’ll have to worry about the threat in the real world.

The exploit originates in a part of Apple’s M-series chips called Data Memory-Dependent Prefetchers (DMPs). DMPs make the processors more efficient by preemptively caching data: they treat data patterns as directions, using them to guess what information they will need to access next. This cuts down on wait times and helps produce the “seriously fast” reactions often used to describe Apple Silicon.

The researchers discovered that attackers can use the DMP to bypass encryption. “Through new reverse engineering, we find that the DMP activates on behalf of potentially any program, and attempts to dereference any data brought into cache that resembles a pointer,” the researchers wrote. (“Pointers” are addresses or directions signaling where to find specific data.) “This behavior places a significant amount of program data at risk.”
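
To make that class of attack a little more concrete, here is a toy Python model of the side channel being described; the addresses are made up and a Python set stands in for the cache. It is a conceptual illustration only, not exploit code and not how GoFetch itself works: the point is that if a prefetcher dereferences anything pointer-shaped, then whether a pointer-shaped value appears in a victim's data becomes observable to an attacker sharing the cache.

```python
# Toy model of a pointer-chasing prefetcher leaking secret-dependent data.
# Addresses, thresholds and the "cache" are all invented for illustration.
cache = set()

def dmp_prefetch(value):
    """Toy DMP: if a value looks like a plausible address, fetch (cache) it."""
    if 0x1000_0000 <= value <= 0x7fff_ffff:
        cache.add(value)

def victim_processes(secret_bit):
    # The victim's intermediate data contains a pointer-shaped value only when
    # secret_bit is 1 -- the attacker never reads the secret directly.
    data = 0x2000_0000 if secret_bit else 0x42
    dmp_prefetch(data)

def attacker_probe():
    # A cache hit ("fast access") reveals that the pointer-shaped value was present.
    return 1 if 0x2000_0000 in cache else 0

for bit in (0, 1):
    cache.clear()
    victim_processes(bit)
    print(bit, attacker_probe())   # the attacker recovers each secret bit
```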

“This paper shows that the security threat from DMPs is significantly worse than previously thought and demonstrates the first end-to-end attacks on security-critical software using the Apple m-series DMP,” the group wrote.

The researchers named the attack GoFetch, and they created an app that can access a Mac’s secure data without even requiring root access. Ars Technica Security Editor Dan Goodin explains, “M-series chips are divided into what are known as clusters. The M1, for example, has two clusters: one containing four efficiency cores and the other four performance cores. As long as the GoFetch app and the targeted cryptography app are running on the same performance cluster—even when on separate cores within that cluster — GoFetch can mine enough secrets to leak a secret key.”

The details are highly technical, but Ars Technica’s write-up is worth a read if you want to venture much further into the weeds.

But there are two key takeaways for the layperson: Apple can’t do much to fix existing chips with software updates (at least without significantly slowing down Apple Silicon’s trademark performance), and as long as you have Apple’s Gatekeeper turned on (the default), you’re unlikely to install malicious apps in the first place. Gatekeeper only allows apps from the Mac App Store and non-App Store installations from Apple-registered developers. (You may want to be extra cautious when manually approving apps from unregistered developers in macOS security settings.) If you don’t install malicious apps outside those confines, the odds appear quite low this will ever affect your M-series Mac.

This article originally appeared on Engadget at https://www.engadget.com/apple-silicon-has-a-hardware-level-exploit-that-could-leak-private-data-174741269.html?src=rss

The Morning After: Apple explains how third-party app stores will work in Europe

Apple is making major changes to the App Store in Europe in response to new European Union laws. Beginning in March, Apple will allow users in the EU to download apps and make purchases from outside its App Store. These changes are already being stress-tested in the iOS 17.4 beta.

Developers will be able to take payments and distribute apps from outside the App Store for the first time. Apple will still enforce a review process for apps that don’t come through its store, but it will be “focused on platform integrity and protecting users” from things like malware. The company warns it has less chance of addressing other risks like scams, abuse and harmful content.

Apple is also changing its commission structure, so developers will pay 17 percent on subscriptions and in-app purchases, reducing the fee to 10 percent for “most developers” after the first year. The company is tacking on a new three percent “payment processing” fee for transactions through its store, and there’s a new €0.50 “core technology fee” for all app downloads after the first million installations.
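
To get a feel for the arithmetic, here is a rough Python sketch based on the fee structure as summarized above: 17 percent commission (10 percent after the first year for most developers), a three percent payment-processing fee when Apple handles the transaction, and €0.50 per download beyond the first million. The revenue and install figures are hypothetical, and the real-world terms have more conditions than this captures.

```python
# Rough estimate of Apple's new EU fees for a hypothetical developer, using the
# rates summarized in the article. Figures below are invented examples.
def apple_eu_fees(revenue_eur, installs, first_year=True, apple_processes_payment=True):
    commission = revenue_eur * (0.17 if first_year else 0.10)
    processing = revenue_eur * 0.03 if apple_processes_payment else 0.0
    core_tech = max(installs - 1_000_000, 0) * 0.50   # EUR 0.50 per install past 1M
    return commission + processing + core_tech

# Hypothetical big app: EUR 5M in in-app revenue and 3M installs in its first year.
print(apple_eu_fees(5_000_000, 3_000_000))                # 2,000,000.0
# A smaller developer under the install threshold pays only the percentage fees.
print(apple_eu_fees(100_000, 200_000, first_year=False))  # 13,000.0
```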

That’s a lot of new money numbers to process, and it could shake out differently for different developers. Apple says the new fee structure will result in most developers paying the company less, since the core technology fee will have the greatest impact on larger developers.

This all means that yes, Fortnite is returning.

— Mat Smith

The biggest stories you might have missed

The FTC is investigating Microsoft, Amazon and Alphabet’s investments into AI startups

Budget retailer Newegg just started selling refurbished electronics

NASA’s Ingenuity Helicopter has flown on Mars for the final time

MIT researchers have developed a rapid 3D-printing technique that uses liquid metal

You can get these reports delivered daily direct to your inbox. Subscribe right here!

Microsoft launches its metaverse-styled virtual meeting platform

Mesh is a place for your avatars to float around.

Microsoft has announced the launch of Mesh, a feature that lets employees’ avatars meet in the same virtual space, even if the actual people are spread out. The virtual connection platform runs through Microsoft Teams. Currently, Microsoft’s Mesh is only available on desktop PCs and Meta Quest VR devices (if employees want a more immersive experience). Microsoft is offering a six-month free trial to anyone with a business or enterprise plan. But no legs, it seems.

Continue reading.

The Ray-Ban Meta smart glasses’ new AI powers are impressive

And worrying.

When we first reviewed the Ray-Ban Meta smart glasses, multimodal AI wasn’t ready. The feature enables the glasses to respond to queries based on what you’re looking at. Meta has now made multimodal search available for “early access.” Multimodal search is impressive, if not entirely useful yet. But Meta AI’s grasp of real-time information is shaky at best.

We tried asking it to help pick out clothes, like Mark Zuckerberg did in a recent Instagram post, and were underwhelmed. Then again, it may work best for a guy who famously wore the exact same shirt every day for years.

Continue reading.

Elon Musk confirms new low-cost Tesla model

Coming in 2025.

Elon Musk has confirmed a “next-generation low-cost” Tesla EV is in the works, saying on an earnings call yesterday that he’s “optimistic” it’ll arrive in the second half of 2025. He also promised “a revolutionary manufacturing system” for the vehicle. Reuters reported that the new vehicle would be a small crossover called Redwood. Musk previously stated the automaker is working on two new EV models that could sell up to five million units per year, combined.

Musk said the company’s new manufacturing technique will be “very hard to copy” because “you have to copy the machine that makes the machine that makes the machine... manufacturing inception.”

I just audibly groaned reading that.

Continue reading. 

Japan’s lunar spacecraft landed upside down on the moon

It collected some data before shutting down.

(Image: JAXA)

This picture just makes me sad.

Continue reading.

This article originally appeared on Engadget at https://www.engadget.com/the-morning-after-apple-explains-how-third-party-app-stores-will-work-in-europe-121528606.html?src=rss