Nikon ZR review: A highly capable cinema camera at a reasonable price

Video used to be an afterthought for Nikon, but since the company purchased RED last year, content creators are now high on its priority list. A perfect example of that is Nikon’s new $2,200 ZR: a full-frame mirrorless model that stands up against dedicated cinema cameras for a fraction of the price.

It’s the first consumer camera to capture video using RED’s 12-bit RAW format, but unlike RED’s Hollywood cameras, it has a fast and accurate autofocus system. It also comes with a huge display, pro video monitoring tools, in-body stabilization and 32-bit float internal audio recording. After shooting a short film that tested its capabilities, I can confirm that the Nikon ZR offers incredible video quality at this price.

While a bit lighter than Nikon’s Z6 III, the 1.19-pound (540-gram) ZR feels solid. It has a boxy design like Sony’s FX2 but a much smaller grip, because it’s designed to be rigged up for cinema shooting with cages and handles. However, unlike the FX2, which has multiple 1/4-inch mounting threads for such rigging, the ZR unfortunately has only one, on the bottom.

Unlike the FX2, the ZR lacks an electronic viewfinder, but it more than makes up for that with its huge 4-inch display — the largest I’ve ever seen on a mirrorless camera. At 1,000 nits, it’s bright enough to shoot on sunny days, extremely sharp (3.07 million dots) and flips out for vloggers. All of that makes it a perfect primary display for checking the image and controlling the camera.

Nikon has nailed the ZR’s handling, too. While it’s not covered with buttons and dials like some models, it does have two shooting dials to control exposure and a joystick for autofocus. There’s also a camera/video switch, two record buttons, a power switch and five customizable buttons. Many of Nikon’s lenses come with control rings as well, so extra manual control is available.

The menu button is unusual: you press once for the quick menu and hold to see the full menu. Given the large number of settings, I would advise anyone buying this camera to learn all the important adjustments, then customize the controls to avoid wading through dense menus while shooting.

Another unique feature is in the battery compartment. There’s a single fast CFexpress slot to handle RAW video, plus a microSD slot for proxies. The lack of a second CFexpress slot or fast SD card slot for backup isn’t ideal for a professional camera, though.

Finally, the ZR runs on the same EN-EL15c batteries as other Nikon mirrorless cameras. They allow 90 minutes of HD shooting on a charge, or 390 photos per CIPA standards. That’s mediocre, so if you’re planning long shoots, stock up on batteries.

The Nikon ZR has the largest selection of RAW video settings I’ve seen. The centerpiece is RED’s lightweight R3D NE codec (designed by RED for Nikon), paired with RED’s Log3G10 log format. It also supports Nikon’s N-RAW, ProRes/ProRes RAW and H.265, with resolutions ranging from 6K at up to 60 fps to 4K at 120 fps and 1080p at 240 fps. Despite the smallish body, it can capture 6K RAW video continuously for 125 minutes without overheating.

The 24MP sensor uses a dual native ISO system (800 and 6,400), providing a nice range for indoor and outdoor shooting. The company claims 15+ stops of dynamic range, which is more than just about any other mirrorless camera. Other key video features include five-axis in-body stabilization with seven stops of shake reduction, waveform and vectorscope monitoring and a false color display for judging exposure.

To test the camera’s features and video quality, I shot a short film in indoor low light, outdoor daylight and mixed lighting between the two. I also shot handheld (including running with it) to test the stabilization. I primarily captured in R3D RAW, along with Nikon’s N-RAW, at the native 800 and 6,400 ISOs to maximize dynamic range. (You can take 24MP photos with this camera, but I’m focusing on video since that’s what it’s mainly designed for.)

To avoid seeing a flat log profile while shooting, you’ll need to apply a look-up table (LUT) designed for RED cameras, like "Achromic," "Bleach" or "Caustic." These are only for in-camera previews and aren’t baked into the video, but you can apply the same LUTs later in Adobe Premiere or DaVinci Resolve to get an identical look.

With such a high native ISO, I was able to shoot inside with a single studio light. Video quality was outstanding with little noise in shadow regions, even after boosting black levels in post. Meanwhile, the RED R3D codec and Log3G10 gave me extra latitude to reveal shadow detail and dial down highlights when I shot the subject against a bright window.

When you use the R3D codec, exposure is strictly manual, with no ability to set auto shutter speed (shutter angle) or f-stop. So for a scene with varying light, I used Nikon’s N-RAW to see if it would hold correct exposure from the beginning of the shot to the end. It did a good job, with no noticeable jumps.

Video in sunlight at ISO 800 was also sharp with accurate colors after downscaling from 6K to 4K in DaVinci Resolve. ISO 800 is a relatively high native setting, though, and the ZR doesn’t have a built-in ND filter to reduce exposure. That means you’ll need to buy ND filters for outdoor shooting, or the high shutter speeds will result in choppy video.
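The ND requirement comes down to simple stop arithmetic. Under the sunny 16 rule of thumb, full sun at f/16 and ISO 800 meters around a 1/800s shutter, while the cinematic 180-degree shutter at 24 fps calls for a 1/48s exposure, so a filter has to absorb the difference. A back-of-the-envelope sketch (the metering values are rule-of-thumb assumptions, not Nikon specs):

```python
import math

# Sunny 16 rule of thumb: in full sun at f/16, shutter speed ≈ 1/ISO.
iso = 800
sunny16_shutter = 1 / iso     # 1/800 s in bright daylight

# Cinema convention: 180-degree shutter at 24 fps = 1/48 s exposure.
target_shutter = 1 / 48

# Stops of light an ND filter must cut to allow 1/48 s at the same aperture.
stops_needed = math.log2(target_shutter / sunny16_shutter)
print(round(stops_needed, 1))  # about 4 stops, i.e. roughly an ND16 filter
```

An ND16 filter cuts four stops, which is why it’s a common starting point for daylight shooting at a high base ISO.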

Cinema cameras from Arri and RED are manual-focus only, and Blackmagic Design’s offer only basic autofocus. But the ZR is a Nikon camera, and it has the best AF system I’ve seen on any of the company’s models, consistently nailing focus even with moving subjects. You can also automatically track vehicles, birds and other animals. At the same time, the ZR handles manual focus well, thanks to a built-in display that’s big enough to check focus accurately and Nikon’s focus peaking setting with three levels of sensitivity.

In-body stabilization on the ZR wasn’t up to par with Panasonic’s S1 II, however. Video was smooth for handheld shooting if I panned the camera gently, but all my running and walking shots showed noticeable camera shake. That said, the ZR at least has in-body stabilization, unlike most cinema cameras, and most filmmakers will use a gimbal for running shots, regardless of which camera they use. (Note that the rattling you hear when the ZR is turned off is the sensor, which floats by design.)

Finally, I was able to capture good audio quality via an external microphone without any clipping worries thanks to the Nikon ZR’s 32-bit float internal audio capture. The company also touts directional capture using its built-in mics, but as with any such system, audio quality isn’t high enough for production use.
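The no-clipping claim follows from how 32-bit float stores samples: levels above full scale survive capture and can be trimmed back down in post, while integer formats discard them at record time. A minimal NumPy illustration (made-up sample values, not Nikon’s actual audio pipeline):

```python
import numpy as np

# A recording that peaks at 4x full scale (+12 dB too hot).
signal = np.array([0.5, 2.0, 4.0, -3.0], dtype=np.float32)

# Integer capture clips everything above full scale at record time...
clipped = np.clip(signal, -1.0, 1.0)
# ...so reducing gain afterward can't bring the waveform back.
restored_int = clipped / 4.0        # peaks stay flattened at 0.25

# Float capture stores the overs intact, so a simple -12 dB trim in post
# recovers the original waveform shape exactly.
restored_float = signal / 4.0

print(restored_int)    # flattened peaks
print(restored_float)  # original shape preserved
```

The same trim that leaves the integer version permanently squashed restores the float version perfectly, which is why 32-bit float recording makes gain-setting mistakes far less costly.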

With the ZR, Nikon has shown that it’s finally catching up to, and even surpassing, its rivals for content creation. Whether you’re doing social media, YouTube, documentaries or even film production, this camera is versatile and powerful with few compromises. Its video quality and ease of use even beat models that cost double or triple the price.

The ZR’s primary competition comes from low-end cinema cameras, particularly Sony’s $2,998 FX2 and Canon’s $3,899 EOS R5 C. While more expensive, both come with an electronic viewfinder that the ZR lacks, and the R5 C can shoot up to 8K video. Another option is Blackmagic Design’s Pyxis 6K, but it only offers basic autofocus and lacks in-body stabilization.

Compared to those options, Nikon’s ZR delivers better dynamic range thanks to the inclusion of RED’s R3D RAW codec. It also comes with an excellent autofocus system and decent in-body stabilization. If you’re a creator looking to get the best video quality for the money without losing those niceties, I’d highly recommend the ZR.

This article originally appeared on Engadget at https://www.engadget.com/cameras/nikon-zr-review-a-highly-capable-cinema-camera-at-a-reasonable-price-152634311.html?src=rss

Amazon rolls out a find-a-scene Alexa+ feature for Prime Video

Amazon is rolling out a new Alexa+ feature on Fire TV that can take you to a specific moment in a given movie on Prime Video based on a natural language voice command. The company says that, when you describe a certain scene, quote or character action, Alexa+ can start playing that part of the film. The company previewed this feature at its Devices and Services event in September.

According to Amazon, you can say something like “Jump to the card scene in Love Actually” or “Jump to the Ozdust ballroom scene in Wicked with Glinda” to quickly get to that moment. Alexa+ can apparently figure out which movie you're referring to if you don't say the title. So if you say, for instance, “Jump to the scene when John McClane says ‘come out to the coast, we’ll get together, have a few laughs,’” Prime Video will start playing that bit in Die Hard where McClane is in an air duct.

To make this work, Alexa+ uses "visual understanding" and captions to determine what's happening in each scene so it can take you to the one you're looking for. It's all processed through the X-Ray feature in Prime Video. As with Alexa+, it's built on Amazon Bedrock and it harnesses large language models such as Amazon Nova and Anthropic Claude.

Alexa+ has indexed tens of thousands of scenes across thousands of movies on Prime Video so far, including many that you can purchase or rent. Amazon plans to expand this feature to more films and scenes, as well as TV shows, in the near future. 

While it’s pretty interesting from a tech perspective how Amazon is able to make this work, I’d be interested to know how many people actually end up using it. This isn’t how most people who genuinely love cinema watch movies — why not just start at the beginning of a film and take it from there? Besides, if you really want to watch a specific scene, YouTube exists.

This article originally appeared on Engadget at https://www.engadget.com/ai/amazon-rolls-out-a-find-a-scene-alexa-feature-for-prime-video-150557530.html?src=rss

Amazon halts AI anime dub ‘beta’ after widespread ridicule

Amazon appears to have quietly removed its terrible AI-generated English dubs for several anime shows currently streaming on Prime Video, following widespread ridicule from viewers and industry professionals. AI dubs were recently added to Banana Fish, No Game, No Life and Vinland Saga, where they were labeled "AI beta" in the Languages section of the app.

Since these shows previously offered only English subtitles, adding a dub for viewers who prefer one could have been seen as a win for Amazon. But it quickly became clear that the dubs were really quite bad, completely devoid of emotion or convincing intonation in dramatic moments. Particularly awful clips of the AI English dub for Banana Fish soon started circulating on social media, and the National Association of Voice Actors released a statement branding the dubs "AI slop."

In his own statement, voice actor Daman Mills called the AI-generated dub for Banana Fish a "massive insult to us as performers." In a post on X, which at the time of writing has been liked 14,000 times, he said: "Voice Actors deserve the same level of respect as on camera performers. Anime already pays talent very little. Dub production costs shouldn’t make a dent in these companies’ pocket books. Using AI for a dub of a show that released nearly 8 YEARS AGO AND HAD NO RUSHED SCHEDULE just spits in our faces, has infuriated the consumer, and completely destroys the art." 

Decision-makers at Amazon apparently noted the backlash: as Mills and others acknowledged yesterday, the English dub options no longer show up. An AI-generated Spanish dub for Vinland Saga appears to have survived the silent cull, but otherwise it’s back to Japanese-language audio and subtitles for the other shows.

The company is clearly committed to introducing more AI to Prime Video — along with its various other services — despite this latest public shaming. It launched an "AI-aided" dubbing program for Prime Video earlier this year, piloting English and Latin American Spanish dubs in 12 licensed series and movies on Prime Video in March. Last month, it also introduced video recaps that summarize shows’ "most pertinent plot points" using generative AI. The feature is currently in beta for select English language Prime Original shows in the US.

This article originally appeared on Engadget at https://www.engadget.com/entertainment/amazon-halts-ai-anime-dub-beta-after-widespread-ridicule-141501051.html?src=rss

Superhuman (formerly Grammarly) has some AI updates for its Superhuman Mail app

Superhuman, the AI-powered mail app, is heading in a more agentic direction with its latest update. Its "write with AI" feature, which you could previously activate when drafting an email, now works across your inbox, calendar and the web. This means it can pull in information from other emails or research a topic online. The AI will think for as long as it needs before responding to a prompt and will open its Ask AI tool if it needs clarification.

Ask AI now lives in a left sidebar on desktop, so it’s always accessible should you need to draft a note, ask a question or quickly schedule a meeting without digging around in your emails. You can also check your Ask AI history on iOS and desktop for previous conversations. Write with AI is now available on Android as well, which will soon gain the other new features.

Superhuman Mail
Superhuman

Superhuman was acquired by Grammarly earlier this year, and the latter recently rebranded so that all of its AI apps now sit under the Superhuman umbrella. The mail service seems aimed primarily at businesses rather than consumers: the most advanced versions of Write with AI and Ask AI are included in the Business and Enterprise plans, while the standard version of Write with AI is rolled into the Starter plan for desktop and mobile.

Superhuman is promising further agentic updates in the near future.

This article originally appeared on Engadget at https://www.engadget.com/ai/superhuman-formerly-grammarly-has-some-ai-updates-for-its-superhuman-mail-app-140017716.html?src=rss

Spotify Wrapped 2025 is here and now it’s a competition?

It's that time of year again, when all of our favorite streaming platforms start dropping personalized lists of what we've been consuming. Spotify Wrapped is perhaps the biggest of the bunch and it's available for perusal right now.

As always, users can access Wrapped to find their most listened-to genres, artists, songs, albums and podcasts from the past year. That information is shareable via social media if you want random bald eagle avatars to comment on your music taste. New this year is an interactive feature called Wrapped Party.

The tool in action.
Spotify

This is a game of sorts: Spotify says it "turns your listening data into a live competition." Wrapped Party hands out awards for things like listening to smaller artists and obsessing over a particular one, in addition to total minutes streamed. Finally, friends can settle the age-old debate of "who listens to music more."

Spotify Wrapped is also about the platform itself, so we have plenty of little tidbits from the global user base. Bad Bunny was named the most streamed artist in the world, just ahead of his Super Bowl performance that internet bozos have turned into a controversy for some reason. This is the fourth time he's come out on top in the past five years. He also had the most popular album of the year.

He wins an award.
Spotify

The global top song is something of a surprise, as it's not Bad Bunny or even Taylor Swift. It's the Lady Gaga and Bruno Mars duet "Die With a Smile." The top podcast is, as always, The Joe Rogan Experience. At least Spotify is getting what it paid for with Rogan.

If you don't use Spotify for whatever reason, other major streaming platforms offer something similar to Wrapped. Apple Music has Replay and Amazon Music has Delivered. Even YouTube got in on the act this year, unveiling a recap for video watchers.

This article originally appeared on Engadget at https://www.engadget.com/entertainment/music/spotify-wrapped-2025-is-here-and-now-its-a-competition-130052418.html?src=rss

Uber is launching robotaxis in Dallas

Uber has made a big push to offer robotaxis as an option for its rideshare services in more markets this year. Starting today, the company is offering autonomous vehicles as an option for customers in Dallas. The move is in partnership with Avride.

At the start, the AVs providing rides will have a person in the front seat, but Uber plans to offer fully driverless operation "in the future." The company will start with a small fleet of Avride's Hyundai Ioniq 5 vehicles, but it plans to eventually have hundreds of these AVs working in Dallas. Riders can set their preferences in the Uber app to increase their chances of being paired with a robotaxi. If someone is assigned an AV for their ride, they will have the option to switch to a traditional rideshare driver.

Uber started a partnership with Avride in October 2024, but the rideshare company has cast a wide net for collaborators. It has also worked to bring robotaxis to markets with Waymo in Austin and Atlanta, with Lucid in the Bay Area, with WeRide in Abu Dhabi, and with Momenta in Europe.

This article originally appeared on Engadget at https://www.engadget.com/transportation/uber-is-launching-robotaxis-in-dallas-120000411.html?src=rss

How to watch the ‘Christmas in Rockefeller Center’ tree lighting special tonight

It's time for the annual Rockefeller Christmas tree lighting! The Christmas in Rockefeller Center tree lighting special will air tonight, Dec. 3 from 8-10 PM ET — though coverage will start an hour prior, at 7 PM ET. The Voice and Happy's Place star Reba McEntire will host and perform at the Rockefeller Tree lighting, which will also feature performances from Halle Bailey, Michael Bublé, Kristin Chenoweth, Laufey, the Radio City Rockettes and more. Here’s how to tune into the 2025 Rockefeller Tree lighting.

The Rockefeller Center Christmas tree will be lit on Wednesday, Dec. 3, 2025.

Coverage of the 2025 Rockefeller Tree lighting will start at 7 PM ET. The official Christmas in Rockefeller Center tree lighting special will air from 8-10 PM ET.

The Christmas in Rockefeller Center special will air on NBC and stream on Peacock.

Reba McEntire will host NBC's annual holiday special, and perform throughout the evening.

Alongside Reba McEntire, the tree lighting ceremony special will feature performances from Marc Anthony, Halle Bailey, Michael Bublé, Kristin Chenoweth, Laufey, New Edition, Brad Paisley, Carly Pearce, Gwen Stefani and the Radio City Rockettes, who are celebrating their 100th anniversary this year.

This article originally appeared on Engadget at https://www.engadget.com/entertainment/streaming/how-to-watch-the-christmas-in-rockefeller-center-tree-lighting-special-tonight-111504725.html?src=rss

How to use Magnifier on a MacBook to zoom in on faraway text

One of the iPhone’s many accessibility features is something Apple calls "Magnifier," which uses the smartphone's cameras to magnify and identify objects in the world around you. For Global Accessibility Awareness Day in May this year, Apple brought Magnifier to the Mac, opening up even more places the assistive tool can be used, like classroom or work environments where you might already have a MacBook pulled out.

Magnifier requires macOS 26 Tahoe and can work with a built-in webcam, a connected third-party camera or an iPhone via Apple's Continuity feature. Provided your MacBook can run Apple’s latest software update, it’s a natural fit for zooming in on a whiteboard at the back of a large lecture hall or getting a closer look at documents on a desk in front of you. You can use the app both to capture an individual image you want to refer to later and to analyze text in a live video feed. But where to begin? Here’s how to set up and use Magnifier on your Mac.
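Since the app only ships with macOS 26 Tahoe, it’s worth confirming your OS version first (About This Mac, or `sw_vers -productVersion` in Terminal). A tiny sketch of that check — the helper function is hypothetical, not an Apple API:

```python
def supports_magnifier(product_version: str) -> bool:
    """Magnifier for Mac ships with macOS 26 Tahoe and later."""
    major = int(product_version.split(".")[0])
    return major >= 26

# Feed it the output of `sw_vers -productVersion`:
print(supports_magnifier("26.0.1"))  # True  -- Tahoe
print(supports_magnifier("15.7"))    # False -- Sequoia, no Magnifier
```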

How to use Magnifier to identify and display text

A MacBook using Magnifier and a connected iPhone to identify and format text from a book.
Apple

Magnifier's most powerful feature uses the MacBook's machine learning capabilities to identify, display and format text that your camera captures. This works with text your camera can see in the room around you, and things it captures via macOS' Desk View feature. For example, to view documents on your desk:

  1. Open Magnifier.

  2. Click on the Camera section in Magnifier's menu bar and then select your Desk View camera from the dropdown menu.

  3. Click on the Reader icon (a simple illustration of a document) near the top-right of your Magnifier window.

  4. Click on the sidebar menu icon to access settings to format text.

Apple gives you options to change the color, font and background of text Magnifier identifies, among other customization options. If you'd prefer to capture faraway text, you can position a webcam or iPhone camera facing away from you and swap to it via the Camera section in Magnifier's menu bar.

You can also listen to any text Magnifier has identified by clicking the Play button in the top-right corner of Magnifier's reader mode. Clicking the Pause button pauses playback, the Skip Forward and Skip Backward buttons skip through lines of text, and if you want to adjust playback speed, you can click the 1x button and pick a speed from the dropdown menu.

How to use Magnifier to zoom in on yourself

A screenshot of the macOS Magnifier app zoomed in on a face.
Magnifier can identify text, but it also works as a way to get a zoomed in view of your own face.
Ian Carlos Campbell for Engadget

By default, Magnifier uses your MacBook's built-in webcam, which means you'll see a view of yourself and whatever's behind you if you don't have another camera selected. That's not useful for reading faraway text, but it is handy if you're applying makeup, putting in contacts or doing anything else where you need a detailed view of your face.

In my tests, Magnifier worked best with my MacBook's built-in webcam or an iPhone. When I tried a third-party webcam from Logitech, the live camera feed was noticeably laggy. Your mileage may vary, but if you run into issues with your own webcam, it's worth trying the built-in one to see if that helps. You can swap between cameras and zoom in on your camera feed inside the Magnifier app:

  1. Open Magnifier.

  2. In the top menu bar, select Camera and then click on the camera you'd like to use in the dropdown menu.

  3. Use the slider in the top center of the Magnifier window to zoom in on yourself.

You can see a live feed of your zoomed-in view in Magnifier's main window. If you click the Camera button in the bottom-left corner of the app, you can also snap a photo to review later. Any photos you capture will appear in Magnifier's left sidebar menu. Clicking on them lets you view them, zoom in on them and adjust their appearance (Brightness, Contrast and other visual settings) via the Image section in Magnifier's menu bar.

This article originally appeared on Engadget at https://www.engadget.com/computing/laptops/how-to-use-magnifier-on-a-macbook-to-zoom-in-on-faraway-text-080100677.html?src=rss

Google Discover is testing AI-generated headlines and they aren’t good

Artificial intelligence is showing up everywhere in Google's services these days, whether or not people want it, and sometimes in places where it doesn't make a lick of sense. The latest trial from Google appears to be giving articles the AI treatment in Google Discover. The Verge noticed that some articles were being displayed in Google Discover with AI-generated headlines different from the ones on the original posts. And to the surprise of absolutely no one, some of these headlines are misleading or flat-out wrong.

For instance, one rewritten headline claimed "Steam Machine price revealed," but the Ars Technica article's actual headline was "Valve's Steam Machine looks like a console, but don’t expect it to be priced like one." No price has been shared for the hardware, either in that post or elsewhere from Valve. In our own explorations, Engadget staff found that Discover was also pairing original headlines with AI-generated summaries. In both cases, the content is tagged as "Generated with AI, which can make mistakes." But it sure would be nice if the company just didn't use AI at all in this situation and avoided the mistakes entirely.

The instances The Verge found were apparently "a small UI experiment for a subset of Discover users," Google rep Mallory Deleon told the publication. "We are testing a new design that changes the placement of existing headlines to make topic details easier to digest before they explore links from across the web." That sounds innocuous enough, but Google has a history of hostility toward online media in its frequent role as middleman between publishers and readers. Web publishers have made multiple attempts over the years to get compensation from Google for displaying portions of their content, and in at least two instances, Google has responded by cutting those sources from search results and later claiming that showing news doesn't do much for the bottom line of its ad business.

For those of you who do in fact want more AI in your Google Search experience, you're in luck. AI Mode, the chatbot that's already been called outright "theft" by the News Media Alliance, is getting an even more symbiotic integration into the mobile search platform. Google Search's Vice President of Product Robby Stein posted yesterday on X that the company is testing having AI Mode accessible on the same screen as an AI Overview rather than the two services existing in separate tabs. 

This article originally appeared on Engadget at https://www.engadget.com/ai/google-discover-is-testing-ai-generated-headlines-and-they-arent-good-234700720.html?src=rss

Instacart sues New York City over minimum pay, tipping laws

You can tell a lot about a company by what it's willing to sue over. Take Instacart, which just filed a lawsuit against New York City. Its beef? Five new city laws, set to take effect in January, that would require Instacart to pay workers more and give customers a tipping option of at least 10 percent.

Reuters reports that Instacart's suit targets Local Law 124, which mandates that grocery delivery workers receive the same minimum pay as restaurant delivery workers. It also challenges Local Law 107, which mandates tipping options of 10 percent or higher (or a place to enter one manually), and takes aim at other laws requiring extra recordkeeping and disclosures. The new rules are set to take effect on January 26.

As is typical of companies griping about regulations that hurt their bottom lines, Instacart framed the issue as a noble fight for what's right. "When a law threatens to harm shoppers, consumers, and local grocers — and especially when it does so unlawfully — we have a responsibility to act," the company proclaimed in a blog post. "This legal challenge is about standing up for fairness, for the independence that tens of thousands of New York grocery delivery workers rely on and for affordable access to groceries for the people who need it most."

Instacart's suit reportedly claims that Congress banned state and local governments from regulating prices on platforms such as its own. It also alleges that New York's state legislature "has long taken charge" of minimum pay, and that the US Constitution doesn't allow states and cities to discriminate against out-of-state companies.

The company warns that everyone will lose if it's forced to comply. Should the laws take effect, "Instacart will be forced to restructure its platform, restrict shoppers' access to work, disrupt relationships with consumers and retailers and suffer constitutional injuries with no adequate legal remedy," it claimed in the filing.

Instacart CEO Chris Rogers, elevated to the post in May, has an estimated net worth of at least $28.6 million. His predecessor, Fidji Simo, who chairs the board and is now with OpenAI, is reportedly worth around $72.7 million. If NYC’s minimum pay laws will be as catastrophic as Instacart claims, maybe they could chip in to help.

This article originally appeared on Engadget at https://www.engadget.com/big-tech/instacart-sues-new-york-city-over-minimum-pay-tipping-laws-220205207.html?src=rss