DoorDash can import grocery lists from iOS’ Reminders app

Though I do love walking through a supermarket and picking out my own food, I will admit that, come winter, I often turn to delivery apps to get my groceries. DoorDash, one of the many delivery apps on the market, has launched a new feature that could make this process even more seamless, allowing iOS users to import their grocery lists from Reminders into the app.

To take advantage of this, you can copy your list from Reminders or import it directly in the DoorDash app. While you're shopping inside a store, a box should appear on the page that says "Got a grocery list?" From there, you can tap import and choose which list you want to sync based on its title and a preview of its items. DoorDash will then show you options based on your list. If you wrote onions, for example, you can scroll through the different onions for sale, with your next item and its options listed below.

DoorDash is also unveiling other changes, such as letting you add items from multiple stores to an order before placing it. The company has offered DoubleDash since 2021, but that only allowed you to include items from nearby stores after placing the original order.

This article originally appeared on Engadget at https://www.engadget.com/apps/doordash-can-import-grocery-lists-from-ios-reminders-app-140020164.html?src=rss

How Collaborative Tools are Revolutionizing the Design Pipeline: An Interview with KeyShot

The journey of creating a product doesn’t end at design—it’s where it begins. KeyShot, a trusted name in product visualization and rendering, is evolving that journey with its innovative Product Design-to-Market Suite. Imagine a world where designers, developers, and marketers don’t work in silos but move together in perfect sync. That’s the vision KeyShot is bringing to life, and it’s already shaking up workflows for companies big and small.

We sat down with Garin Gardiner, Product Director of KeyShot Hub, to uncover how this suite is solving challenges designers didn’t even know had solutions. From effortless collaboration to smarter asset management, KeyShot isn’t just keeping up with the demands of the design world—it’s rewriting the rules. Dive into this conversation to explore how KeyShot is empowering creators to dream big and deliver faster.

Click Here to Download Now: the whitepaper offers an in-depth look at how this new framework can transform your business.

Yanko Design: What specific areas in the product design process does KeyShot’s new Product Design-to-Market Suite address? How does this optimize a business’ workflow in ways that older versions of KeyShot didn’t?

Garin Gardiner: Our flagship product, KeyShot Studio, is primarily geared towards the individual designer. It was the first scientifically accurate rendering engine, now used in over two-thirds of Fortune 500 companies, with thousands of customers around the world. We’ve always worked closely with our customers to keep Studio relevant to their needs, and over two decades of development, we learned about other significant needs related to the design process, team workflows and business logistics. We saw a huge opportunity to help – and to revolutionize the way products are brought to market.

We’re introducing a concept called Product Design-to-Market, which is a holistic strategy that connects the many departments involved in product creation and market delivery. You can think of it as bridging the product design and go-to-market processes. Instead of working in silos, we’re encouraging a smooth exchange of information and assets across design, development and marketing teams. The result is faster iteration, better alignment, and a seamless transition from first sketch to market delivery.

Of course, you need the right tools to make this vision a reality. Our Product Design-to-Market Suite, which includes KeyShot Studio, also provides comprehensive design team support through KeyShot Hub and connects teams to the management and distribution of marketing assets through KeyShot Dock.

Yanko Design: How have early adopters responded to KeyShot Hub’s collaboration capabilities, and can you share how it has improved their design process?

Garin Gardiner: It is amazing how nearly every customer we’ve talked to, when asked how they’re navigating team workflows, says they struggle to manage a central repository where their team can find the core items they use frequently. When individuals can’t find what they’re looking for, they often create duplicates, and a lot of time is wasted that way. Hub provides that central repository, so everyone has access to the current version of a file and no duplicates are necessary. Plus, changes to the file are automatically tracked, and you can easily revert to a previous version.

Another favorite is the shared material library in Hub. Customers say being able to work from the same material library makes a huge difference. If a material gets modified, the entire team will automatically get the latest and greatest the next time they use a material. They are also able to tag it for easier searchability, so they aren’t creating duplicate materials, like they often do today.

Hub’s related assets feature is really resonating with customers. When you apply materials to a scene and save it to the Hub, you are able to see all those materials linked to the scene in the Hub for a quick CMF view of your scene.

Tagging is another feature customers appreciate. When saving a rendering to the Hub it will automatically attach tags – Model Sets, Camera, Studio, Environment, Image Style, Colorway, and Materials. These tags can then be used to search for renderings. Searches can be saved for later re-use by all members of the team. Our customers care a lot about their CMF – it’s a key aspect of what they do. They can also manually update tags if they prefer.
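
For a concrete sense of how that kind of tag-driven search can work, here is a minimal conceptual sketch. It is not KeyShot Hub's API, and every name in it (RenderingIndex, add, search, the example scene IDs) is hypothetical, but it shows how auto-attached tags such as Camera or Colorway can power reusable, shareable queries over a team's renderings.

```python
# Conceptual sketch only -- not KeyShot Hub's implementation or API.
# Demonstrates tag-indexed search with reusable (saved) queries.
from collections import defaultdict

class RenderingIndex:
    def __init__(self):
        self._by_tag = defaultdict(set)   # "Key:Value" tag -> set of rendering ids

    def add(self, rendering_id: str, tags: dict[str, str]) -> None:
        """Index a rendering under tags like {'Camera': 'Front', 'Colorway': 'Slate'}."""
        for key, value in tags.items():
            self._by_tag[f"{key}:{value}"].add(rendering_id)

    def search(self, **criteria: str) -> set[str]:
        """Return ids matching every given tag, e.g. search(Camera='Front')."""
        matches = [self._by_tag[f"{k}:{v}"] for k, v in criteria.items()]
        return set.intersection(*matches) if matches else set()

index = RenderingIndex()
index.add("scene_001_v3", {"Camera": "Front", "Environment": "Studio A", "Colorway": "Slate"})
index.add("scene_002_v1", {"Camera": "Front", "Environment": "Outdoor", "Colorway": "Sand"})

saved_search = {"Camera": "Front", "Colorway": "Slate"}   # a reusable query the whole team can share
print(index.search(**saved_search))                        # {'scene_001_v3'}
```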

Customers are also loving the side-by-side comparison between versions. You can select two versions and compare them in real time using a dynamic slider; it’s really helpful for spotting differences between versions, especially when those differences are in small details. Our customers create a lot of versions of the same rendering, and being able to compare them side by side is helpful.

These are all features that Hub users say address the team and workflow challenges they’re facing today. Ultimately, it’s all about saving time and enabling easy collaboration, so designers can focus on their craft rather than administrative tasks. And you can see how everything works in a full demo of Hub available on YouTube.

Yanko Design: What developments in other industries are providing inspiration for KeyShot as it paves the way forward with its new Product Design-to-Market Suite?

Garin Gardiner: There’s certainly movement toward breaking down silos and supporting cross-collaboration. We have seen how companies like Microsoft have enabled richer collaboration using the cloud through their Teams platform. We have also seen design tools like Fusion transform how their customers work with Fusion Team.

These developments were part of what inspired us to offer a purpose-built Product Design-to-Market Suite to better support our customers. Now KeyShot provides speedy and intuitive rendering, support for design team workflows, and support for marketing.

Yanko Design: We’re very excited about KeyShot Dock’s enhanced Digital Asset Management system! How do you envision it helping companies better organize and distribute their 3D assets across marketing and sales channels?

Garin Gardiner: Right now, marketing teams are typically responsible for generating their own images and animations, separate from product design. They budget for product visuals and often make them from scratch, spending time and money on photography and design work. But they could be saving time and money by repurposing the 3D renderings already produced by design teams, which make it easy to create an endless number of marketing-worthy product visuals. CAD models and KeyShot scenes can be stored in KeyShot Dock, providing a connection between marketing and product design and empowering marketing to use those assets across go-to-market channels.

Our customers tell us that 3D visuals are much more effective than 2D images or product photography; 3D visuals lead to higher conversions and lower return rates.

Customers can expect regular updates to Dock. Over time, we are looking to enable viewing 3D interactive files like GLBs and even the possibility of generating on-demand 3D viewables from CAD models like SolidWorks, STEP and more.

Yanko Design: How do you see technologies like AI and machine learning influencing the future of 3D rendering and Digital Asset Management, and will KeyShot incorporate these innovations?

Garin Gardiner: We’re considering how to incorporate AI into our tools in a way that adds value to users. While generative AI can provide impressive results in image generation, we still believe that accurate rendering – down to highly detailed materials and brand elements – will require physics-based rendering. However, we are analyzing how AI can help our customers achieve greater efficiency in their workflows or increase the speed and quality of rendering, through processes like sampling light rays used by rendering algorithms or denoising rendered images.
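
For a rough sense of why those two levers matter, here is a minimal, generic sketch. It is not KeyShot's renderer and involves no AI; the toy "scene," sample counts and box filter below are all illustrative assumptions. It shows how a brute-force Monte Carlo pixel estimate only gets less noisy as sample counts grow, which is exactly the cost that smarter sampling and denoisers aim to cut.

```python
# Toy illustration: Monte Carlo pixel estimation plus a naive box-filter denoise.
# Not a physically based renderer -- the "scene" is faked as noise around 0.5.
import numpy as np

rng = np.random.default_rng(0)

def estimate_pixel(n_samples: int) -> float:
    """Average n_samples random light contributions for one pixel."""
    samples = 0.5 + 0.2 * rng.standard_normal(n_samples)
    return samples.mean()

# Noise (spread of the estimate) shrinks only slowly as samples increase.
for n in (8, 64, 512):
    estimates = [estimate_pixel(n) for _ in range(1000)]
    print(f"{n:4d} samples -> std dev {np.std(estimates):.4f}")

def box_denoise(image: np.ndarray, radius: int = 1) -> np.ndarray:
    """Replace each pixel with the mean of its (2*radius+1)^2 neighborhood."""
    padded = np.pad(image, radius, mode="edge")
    out = np.zeros_like(image)
    h, w = image.shape
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += padded[radius + dy: radius + dy + h,
                          radius + dx: radius + dx + w]
    return out / (2 * radius + 1) ** 2

noisy = 0.5 + 0.2 * rng.standard_normal((64, 64))   # a low-sample "render"
print("noise before:", noisy.std(), "after:", box_denoise(noisy).std())
```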

On the marketing side, AI has the potential to make it faster and easier for teams to generate 2D renderings as a replacement for physical photography. Imagine feeding AI with 100% accurate product data and using it to generate creative environments around accurate renderings.

These are all possibilities we’re looking at right now. AI has so much potential to provide creative and logistical support – it’s all about making the most of it.

Image Credits: Silvester Kössler

Click Here to Download Now: the whitepaper offers an in-depth look at how this new framework can transform your business.

The post How Collaborative Tools are Revolutionizing the Design Pipeline: An Interview with KeyShot first appeared on Yanko Design.

Meta is testing custom feeds for Threads

As the competition between Bluesky and Threads heats up, Meta is adding a new feature to Threads that will likely look familiar to Bluesky users: custom feeds. The Meta-owned service is starting to test a feature that allows users to pin topic-based feeds to the home screen of the app.

The change will give people additional feeds beyond the algorithmic “for you” feed, which will remain the default view, and their “following” feed. Users can add custom feeds by searching a keyword like “skincare,” then tapping the “...” menu and selecting “create new feed.” These feeds can be further customized by adding specific profiles of people whose posts you want to see in that feed. Users can add up to 128 custom feeds in the app, a Meta spokesperson said, though the feature is still only a test for now, so not all users have access to it just yet.

The feature is similar in many ways to Bluesky’s custom feeds, which that company introduced last year. But while there are dozens of user-created algorithmic feeds in Bluesky’s app, making a new one is still a technical process. Meta’s version of the feature is more straightforward. It could also address some users’ complaints about Threads’ main algorithmic feed.

The latest Threads feature comes as Bluesky has had a particularly good month. Though the service is still far smaller than Threads, which has more than 275 million users, Bluesky, which has just under 17 million users at the time of this writing, has been gaining momentum. The decentralized service added a million new users in the week following the election, and added another million new sign-ups in a single day this week. That’s striking considering Threads has also been growing by about a million users a day, according to a recent post from Instagram chief Adam Mosseri. If Bluesky is able to sustain that level of growth for very long, Meta may feel even more pressure to borrow some ideas from its smaller rival.

This article originally appeared on Engadget at https://www.engadget.com/social-media/meta-is-testing-custom-feeds-for-threads-183948414.html?src=rss

Google now offers a standalone Gemini app on iPhone

Google now offers a dedicated Gemini AI app on iPhone. First spotted by MacRumors, the free software is available to download in Australia, India, the US and the UK following a soft launch in the Philippines earlier this week.

Before today, iPhone users could access Gemini through the Google app, though there were some notable limitations. For instance, the dedicated app includes Google’s Gemini Live feature, which allows users to interact with the AI agent from their iPhone’s Dynamic Island and Lock Screen. As a result, you don’t need to have the app open on your phone’s screen to use Gemini. The software is free to download — though a Gemini Advanced subscription is necessary to use every available feature. Gemini Advanced is included in Google’s One AI Premium plan, which starts at $19 per month.

The app is compatible with iPhones running iOS 16 and later, meaning people with older devices such as the iPhone 8 and iPhone X can use the AI agent. I’ll note here that the oldest iPhone that can run Apple Intelligence is the iPhone 15 Pro. Of course, that’s not exactly a fair comparison; Apple designed its suite of AI features to rely primarily on on-device processing, and when a query requires more computational horsepower, it goes through the company’s Private Cloud Compute framework.

Either way, it’s not surprising to see Google bring a dedicated Gemini app to iPhone. Ahead of WWDC 2024, Apple had reportedly been in talks with the company to integrate the AI agent directly into its devices.

This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/google-now-offers-a-standalone-gemini-app-on-iphone-160025513.html?src=rss

Apple’s AI-infused Final Cut Pro 11 is now available

With its biggest update to Final Cut Pro (FCP) in years, Apple may be re-embracing the professional video creator crowd it has neglected since the launch of FCP X in 2011. The company finally unveiled that app’s successor, Final Cut Pro 11 (FCP 11), an update that leans heavily on AI tools. At the same time, it introduced spatial video editing for producing content for the Vision Pro headset.

The key AI feature is Magnetic Mask, which lets you cut out people and other moving subjects, then stylize them or put them in another location altogether. "This powerful and precise automatic analysis provides additional flexibility to customize backgrounds and environments," Apple wrote. "Editors can also combine Magnetic Mask with color correction and video effects, allowing them to precisely control and stylize each project."

Image Credits: Apple

The other key new AI feature is Transcribe to Captions, which automatically analyzes interviews and other timeline audio, transcribes it and places the captions directly on the timeline — effectively automating the entire process. That feature uses an Apple-trained large language model (LLM) designed to transcribe spoken audio, the company said.
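
Apple says the feature runs on its own Apple-trained model inside the app. As a rough illustration of the general idea only (speech in, timestamped caption segments out), a sketch using the open-source openai-whisper package might look like the following; it assumes whisper and ffmpeg are installed and that "interview.wav" stands in for your timeline audio, and it is in no way Apple's pipeline.

```python
# Illustration of automatic captioning in general, not Apple's Transcribe to
# Captions. Assumes the openai-whisper package and ffmpeg are installed.
import whisper

def audio_to_captions(audio_path: str) -> list[dict]:
    """Transcribe an audio file and return caption segments with timecodes."""
    model = whisper.load_model("base")      # small general-purpose speech model
    result = model.transcribe(audio_path)
    return [
        {"start": seg["start"], "end": seg["end"], "text": seg["text"].strip()}
        for seg in result["segments"]
    ]

if __name__ == "__main__":
    for cap in audio_to_captions("interview.wav"):   # hypothetical file name
        # A timeline-aware editor would place each caption at cap["start"].
        print(f'{cap["start"]:7.2f}s - {cap["end"]:7.2f}s  {cap["text"]}')
```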

Final Cut Pro 11 also joins other pro editing apps like Premiere Pro and DaVinci Resolve in offering VR/AR video editing. "Spatial video editing" allows users to import and edit AR/VR video directly in the app, while adding effects, color correction and more. Footage can be captured on an iPhone 15 Pro or any iPhone 16 model, as well as Canon's R7 mirrorless camera paired with the new RF-S 7.8mm F/4 lens. Users can choose from different viewing modes to preview left- and right-eye angles, or bring their edit directly into Apple Vision Pro to get a 3D preview.

Apple also unveiled Final Cut Pro for iPad 2.1, further optimizing it for Apple silicon. The app also offers enhancements to the "light and color" feature that let you quickly improve the color, contrast and overall look of your video. And finally, the company released a new version of Final Cut Camera, which includes the ability to shoot in compact but high-quality HEVC files with Apple Log, rather than using storage-gobbling ProRes. 

As a professional tool, Final Cut Pro 11 is still missing features found in Resolve and Premiere Pro like text-based editing and certain advanced color correction tools. Still, the new version and features will no doubt be welcomed by FCP diehards. It's now available to download for $299 for new users (following a free 90-day trial) and is free to existing Final Cut Pro owners. 

This article originally appeared on Engadget at https://www.engadget.com/apps/apples-ai-infused-final-cut-pro-11-is-now-available-140030992.html?src=rss

The Resident Evil 2 remake will shuffle its way to Apple devices in December

Now you’ll be able to play one of the greatest zombie survival games of all time on your iPhone or iPad. Capcom’s Resident Evil 2 remake is headed to the App Store and Mac App Store on December 10.

The game won’t be available on every Apple device. You’ll need any iPhone 16 model, an iPhone 15 Pro or an iPad or Mac with the M1 chip or later. You’ll also be able to try a small portion of the game before purchasing the full experience. The game comes with “universal purchase” and “cross-progression” for all your eligible devices, according to a Capcom statement.

The Resident Evil 2 remake will offer advanced controls for touchscreens and the Mac version. Both Leon and Claire will also have “a new Auto Fire feature” so you can unload your clip into whatever’s shuffling towards you.

This is just one of four Resident Evil games available for Apple devices and computers. Capcom and Apple have released versions of Resident Evil 7: Biohazard and Resident Evil 4 on the App Store, and Resident Evil Village on the App Store and Mac App Store.

This article originally appeared on Engadget at https://www.engadget.com/gaming/the-resident-evil-2-remake-will-shuffle-its-way-to-apple-devices-in-december-234511380.html?src=rss

Bluesky surges to 15 million users after getting a million sign-ups in one week

Bluesky may still be the underdog in the race for alternatives to X, but the once Twitter-affiliated service is gaining momentum. The app just passed the 15 million user mark after adding more than a million new users over the last week, the company said in an update.

While Bluesky is still considerably smaller than Threads, which with 275 million users is its biggest rival, there are signs that Threads users have been increasingly curious about the upstart. “Bluesky” has been a trending topic on Threads in recent days and an in-app search suggestion shows there are more than 19,000 posts about “Bluesky.” Bluesky itself has also made a push to win over Threads users in recent weeks by posting regularly on the Meta-owned service.

That effort seems to be working. A month ago, Engadget noted, the service had just under 9 million users. Its mobile app also has the top spot in Apple’s App Store, followed by Threads and ChatGPT. Its recent success also seems to be driven, at least in part, by frustration with Elon Musk and X following the US presidential election.

A recent report from web analytics company SimilarWeb found that “more than 115,000 US web visitors deactivated their accounts,” on November 7, “more than on any previous day of Elon Musk’s tenure.” The report also noted that “web traffic and daily active users for Bluesky increased dramatically in the week before the election, and then again after election day,” with Bluesky at points seeing more web traffic than Threads. (Threads’ mobile usage, however, is still “far ahead” of Bluesky.)

Traffic for Threads and Bluesky, according to SimilarWeb. Image Credits: SimilarWeb

“In the US, Bluesky got more web visits than Threads in the immediate aftermath of the election,” the report notes. “For context, it’s important to note that both services are app centric, even though they support a web user interface.”

For its part, Bluesky seems intent on distinguishing itself from its larger, billionaire-controlled rivals. The company, which began as an internal project at Twitter before spinning off into an independent entity, has experimented with novel features like custom feeds, user-created moderation services and “starter packs” for new users.

“You're probably used to being trapped in a single algorithm controlled by a small group of people, that's no longer the case,” Bluesky’s COO Rose Wang shared in a video aimed at new users Tuesday. “On Bluesky, there are about 50,000 different feeds … these feeds provide a cozy corner for you to meet people with similar interests. And you can actually make friends again, because you're no longer tied to a dominant algorithm that promotes either the most polarizing posts and/or the biggest brands, and that's the mandate of Bluesky.”

This article originally appeared on Engadget at https://www.engadget.com/social-media/bluesky-surges-to-15-million-users-after-getting-a-million-sign-ups-in-one-week-224213573.html?src=rss

Channel 4 in the UK now has a dedicated app for Apple Vision Pro

The initial buzz for Apple’s mixed-reality headset has died down, but new apps and experiences are still arriving for consumers who plunked down $3,500. The UK broadcaster Channel 4 just dropped a dedicated streaming app for the headset, which lets users watch stuff in “ground-breaking cinema-style.”

Channel 4 is the first UK broadcaster to take this step. The app leverages the tech inside the headset to overlay streaming content on the real world, which allows for a “full-screen viewing experience” of stuff like The Great British Bake Off and Taskmaster, in addition to multi-screen view.

Speaking of Taskmaster, the broadcaster also announced an environment based on the comedy game show. Environments on the AVP transform the world around the user, so people can watch Taskmaster while sitting in a room inspired by Taskmaster (cue that Xzibit Yo Dawg meme.) Other streaming apps have their own environments. Paramount+ offers one based on SpongeBob Squarepants and Disney+ now includes one set in Iceland.

This app doesn’t feature access to the recently-released Taskmaster VR experience. That one’s still tied to Steam VR and Meta Quest. By most accounts, it’s a pretty bad game, so the Vision Pro isn’t missing much.

This article originally appeared on Engadget at https://www.engadget.com/ar-vr/channel-4-in-the-uk-now-has-a-dedicated-app-for-apple-vision-pro-200027166.html?src=rss

Signal makes it easier to start group video calls

Signal users may be familiar with the problem of creating group chats just for a group call, but that’s about to become a thing of the past. You can now share a call link and let up to 50 people hop in, all in the span of a few seconds. The days of selecting contacts one by one are over.

Now, all you have to do is create a call link after going to the Calls tab and send it to whomever you want. The link is also reusable, which is convenient if you have fixed call times. Participants can raise their hands and send emojis. Hosts can set the room up so people must be approved before joining the conversation.

Based on Signal’s blog post, the new group call experience closely resembles Zoom. Those who care about privacy but want a conference call-like experience may find the new update helpful. Like Zoom, the desktop app offers more options.

Besides Zoom, these features will be familiar to frequent users of Microsoft Teams, Google Meet and WhatsApp. Raising hands is found on all three platforms as a non-verbal way to signal the speaker. WhatsApp has a lower participant limit of 32 people following an update in June, and while it lacks a raise hands function, certain gestures can send emojis for all to see. Many apps with group call functionality are adopting similar features.

These new features are available on Android, iOS, Windows and macOS. If you don’t see them yet, we recommend updating your Signal app.

This article originally appeared on Engadget at https://www.engadget.com/apps/signal-makes-it-easier-to-start-group-video-calls-153519653.html?src=rss