Slack AI will generate transcripts and notes from huddles

Salesforce has rolled out some new AI features for its business-focused Slack chat app designed to take over mundane chores like transcription. 

A key new feature is Slack AI huddle notes to "capture key takeaways and action items so users can focus on the work at hand," the company wrote. This looks like a more powerful version of a previous Slack AI feature that recaps channel highlights and generates summaries for threads in a single click. 

When invited to a huddle, Slack AI creates a transcript based on real-time audio and messages shared in the thread. It can also organize the notes into a canvas, complete with citations, action items and shared files. Everyone invited to the huddle can then view the notes later, even those who couldn't attend.

Slack also updated its AI search feature so that it can surface results tailored to each user based on the files and apps they've added to Slack, including canvases, transcripts from clips, documents from connected apps, Google files and more.

Another timesaver is the new AI Workflow Builder that helps automate tasks. For instance, users can enter a prompt like "send a welcome message to teammates that join a channel" and Slack AI and Workflow Builder will generate that functionality with no programming required.

Also arriving in the latest update are Slack templates, pre-configured for specific use cases like managing a project, collecting feedback and triaging help requests. The new Slack AI features are now available as a paid add-on for all subscription plans, and Slack templates will roll out in October 2024. 

This article originally appeared on Engadget at https://www.engadget.com/ai/slack-ai-will-generate-transcripts-and-notes-from-huddles-120026621.html?src=rss

iPads will support third-party app stores in Europe starting September 16

Apple has revealed it will allow iPad users in the EU to install third-party app stores on their tablets (without having to sideload them) starting on September 16. You'll need to install iPadOS 18, which will be available broadly on Monday, to do so.

Back in April, the European Commission designated iPadOS as a "core platform service," meaning that like iOS, the App Store and Safari, the operating system is subject to stricter rules under the bloc's Digital Markets Act. As TechCrunch notes, Apple had six months to update iPadOS so that it complied with the DMA, which included opening up the platform to third-party app marketplaces.

Epic Games has already pledged to bring its app marketplace to iPadOS, meaning that folks in the EU should be able to play Fortnite and Fall Guys natively on compatible iPads in the near future. Several other third-party app stores have arrived on iOS in the EU since Apple added official support in March.

While the likes of AltStore PAL and the Epic Games Store aren't subject to Apple's usual app review policies, the company notarizes them for security purposes. The developers of third-party app marketplaces also need to pay a Core Technology Fee to Apple once they meet certain thresholds (the EU opened an investigation into this fee in March).

One other key change arriving with the rollout of iPadOS 18 is under the surface, but it may ultimately change how EU users browse the web on their iPads. Apple will allow third-party browsers to use their own engines on iPadOS instead of requiring them to employ Apple's WebKit. This means that the likes of Mozilla and Google will be able to offer iPad versions of Firefox and Chrome that run on their own tech.

This article originally appeared on Engadget at https://www.engadget.com/mobile/tablets/ipads-will-support-third-party-app-stores-in-europe-starting-september-16-180414833.html?src=rss

Elgato’s latest Stream Deck is a $900 rackmount unit for pros

Elgato has introduced the Stream Deck Studio, a new version of its creative control tech that's firmly targeting professional broadcasters. This 19-inch rackmount console has 32 LCD keys and two rotary dials. The $900 price tag shows that this is not an entry-level purchase.

The company collaborated with broadcast software specialist Bitfocus on the Stream Deck Studio. The device can run the Companion software that works on other Stream Deck models, but also supports the company's new Buttons software. The Buttons app allows for additional interface customization designed specifically for the Stream Deck Studio.

Elgato has been expanding its Stream Deck line, which began life as a simple sidekick for livestreamers, to reach a broader range of users. For instance, it introduced an Adobe Photoshop integration aimed at visual artists. This push to reach more pro-tier customers could put Elgato into more frequent competition with rival brands like Loupedeck, which Logitech acquired last year, along with established broadcast brands like Blackmagic.

This article originally appeared on Engadget at https://www.engadget.com/computing/accessories/elgatos-latest-stream-deck-is-a-900-rackmount-unit-for-pros-215003305.html?src=rss

I don’t get why Apple’s multitrack Voice Memos require an iPhone 16 Pro

Apple’s recent iPhone event brought some nifty ideas, from the camera button to a reinvention of Google Lens and beyond. The company also announced that it's bringing simple multitrack recording to Voice Memos. This was particularly exciting for me since, well, I use Voice Memos a lot. I have nearly 500 of these little recordings that were made during the lifetime of my iPhone 14 Pro and thousands more in the cloud. You never know when you’ll need a random tune you hummed while waiting for the subway in 2013. 

So this feature felt tailor-made for me. I write songs. I play guitar. I do everything that lady in the commercial does, including opening the fridge late at night for no real reason.

[Image: A lady in front of a fridge (Apple)]

Then reality hit. This isn’t a software update that will hit all iPhone models. It’s tied to the ultra-premium iPhone 16 Pro, which starts at a cool $1,000. I don’t really want to upgrade right now, so the dream of singing over an acoustic guitar track right on the Voice Memos app is dead on arrival.

Why is this particular feature walled behind the iPhone 16 Pro? It’s a simple multitrack recording function. From the ad, it looks like the app can’t even layer more than two tracks at a time. This can’t exactly be taxing that A18 Pro chip, especially when the phone can also handle 4K/120 FPS video recording in Dolby Vision. 

Pro Tools, a popular digital audio workstation, was first introduced in 1991, two years before Intel released the Pentium chip. Computers of that era had no trouble layering tracks. For a bit of reference, last year's A17 Pro chip had around 19 billion transistors; an original Pentium had around three million. In other words, a modern smartphone chip packs roughly 6,300 times as many transistors as the processor in a 1993 Pentium PC, and that's before you account for how much faster each of those transistors switches.
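For the curious, here's the back-of-the-envelope arithmetic behind that figure, using the ballpark transistor counts above (a rough proxy for capability, not a direct performance measurement):

$$\frac{19 \times 10^{9}\ \text{transistors (A17 Pro)}}{3 \times 10^{6}\ \text{transistors (original Pentium)}} \approx 6{,}300$$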

So let us layer tracks on Voice Memos, Apple! It can't be that complicated. I’ve been using dedicated multitrack apps ever since the iPhone 3G. Apple throws GarageBand in with every iPhone. Both GarageBand and third-party recording apps have a place, sure, but nothing beats the quickness and ease of use of Voice Memos. It’d sure be great to be able to make a quick-and-dirty acoustic demo of a song and send it out to someone without having to navigate a fairly complicated interface.

[Image: App in front of a refrigerator (Apple)]

Yeah. I see the elephant in the room. There’s a part of the ad that I’ve been avoiding. The woman records the vocal layer over the guitar track without wearing headphones. She just sang into the phone while standing in front of that refrigerator. Now, that’s something old-school Pentiums could not do. There’s some microphone placement wizardry going on there, along with machine learning algorithms that reduce unwanted ambient noise. The iPhone 16 Pro has a brand-new microphone array, so I get that older models might not be able to handle this particular part of the equation.

But who cares? That’s a really neat feature. It’s also completely unnecessary. If you’re reading this, you are likely already wearing earbuds/headphones or have some within reach. Record the first track without the headphones. Record the secondary layer while wearing headphones. That’s it. Problem solved. You can even do it in front of the refrigerator.

Also, both the base-level iPhone 16 and the Pro support Audio Mix, which lets people adjust sound levels from various sources after capturing video. This is done without the new Studio Mics on the iPhone 16 Pro and seems to reduce ambient noise in a similar way. So it's possible that there's a software solution here to handle even that elephant in the room. After all, the company credits "powerful machine learning algorithms" for this tech — if it can erase environmental wind noise, surely it can handle music playing in the background?

So I am once again asking Apple to let the rest of us play around with multitrack recording on Voice Memos. There’s no reason every older iPhone model couldn’t compute its way to a simple guitar/vocal two-track WAV file. Pop the feature into a software update. I hear there’s one for iOS 18 coming really soon, and another for Apple Intelligence after that.

This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/i-dont-get-why-apples-multitrack-voice-memos-require-an-iphone-16-pro-175134621.html?src=rss

Chrome’s latest safety update will be more proactive about protecting you

Chrome is getting a series of safety updates that could improve your security while browsing online. In a release, Google announced the new features, which include protecting against abusive notifications, limiting site permissions and reviewing extensions.   

Safety Check, Chrome's security monitor, will now run continuously in the background so it can take protective steps more readily. The tool will tell you what it's doing, which should include revoking permissions from sites you no longer visit and from sites Google Safe Browsing believes tricked you into granting permission in the first place. It will also flag notifications it deems unwanted and alert you to anything that requires attention, such as security problems. Plus, Safety Check on your desktop should warn you about any Chrome extensions that might pose a risk.

Google is also changing how long site permissions last in Chrome on desktop and Android. A new option lets you grant mic or camera access for one time only instead of indefinitely; the site will have to ask for your permission again on the next use. Plus, Google is expanding the one-tap ability to unsubscribe from a site's notifications in Chrome beyond Pixel devices to more Android phones.
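For context, websites don't manage these grants themselves: a page requests access through the standard getUserMedia web API, and the browser decides how long any approval persists. Here's a minimal sketch of what that request looks like from a site's side; with one-time grants, this same code simply triggers Chrome's prompt again on the next visit instead of reusing a permanent approval:

```typescript
// Minimal sketch of a page requesting mic and camera access.
// Chrome intercepts this call and shows its permission prompt; the
// "this time only" vs. "always" choice happens entirely in the browser
// and is invisible to the site's code.
async function startCapture(): Promise<MediaStream | null> {
  try {
    return await navigator.mediaDevices.getUserMedia({
      audio: true, // microphone
      video: true, // camera
    });
  } catch (err) {
    // The user declined the prompt, or no capture device is available.
    console.warn("Capture unavailable:", err);
    return null;
  }
}
```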

This article originally appeared on Engadget at https://www.engadget.com/chromes-latest-safety-update-will-be-more-proactive-about-protecting-you-160046221.html?src=rss

Apple turned the Voice Memos app into a Makeshift Recording Studio for Artists

Just like the company practically redefined the calculator with Math Notes for the iPad Pro, Apple’s turned a humble voice-note app into a blessing for musicians who use it to record samples, hooks, and lyric ideas.

Apple’s latest update to the Voice Memos app is bound to capture the attention of musicians, especially those who often rely on their iPhone for quick recordings of new sparks of inspiration. With iOS 18 and the iPhone 16, Apple has introduced key features that greatly enhance the functionality of this simple recording app, making it a more useful tool for creative professionals.

The ability to layer tracks within Voice Memos turns the otherwise basic app into a ‘sonic doodle-pad’ for creating layered multitrack compositions. Previously, musicians would have to open a separate digital audio workstation (DAW) to layer vocals and instruments together, which required time and technical know-how. Now, with this update, musicians can record an instrumental track, such as guitar or piano, and then layer vocals over it without leaving the app. This simple feature is particularly useful for singer-songwriters, allowing them to develop ideas more organically without the distraction of switching between apps.

Being limited to two tracks may sound restrictive for professional production, but for quick idea generation and song structure building, it’s a practical improvement. Musicians can also mix the two tracks within the app, adjusting volume levels to ensure that vocals and instrumentals are well-balanced.

Advanced processing isolates the vocal from the background sound, delivering a clean, professional result without requiring additional apps or headphones. This integration offers a simple, intuitive way for musicians to build multi-layered recordings directly on their phones, a leap forward for on-the-go production.

The iPhone 16 lineup also introduces new audio processing technologies powered by Apple’s A18 chip. With this boost in processing power, the devices can handle real-time audio adjustments more efficiently. Apple’s new AI software, Apple Intelligence, plays a role in optimizing the sound recording experience, ensuring every track captured through Voice Memos or other apps benefits from smart noise reduction and dynamic range adjustments. While these features may not be as obvious to casual users, they represent significant improvements for anyone serious about audio quality.

Another helpful addition is transcription, which converts voice recordings into text. For songwriters, this feature can make the creative process smoother by providing a way to quickly view and edit lyrics. Rather than having to manually type out or remember lyrics after a recording session, users can now see their words appear directly in the app. The transcription function is easy to use; after recording, users tap on the three dots next to their recording and choose the “View Transcript” option. They can then make edits to specific sections, replacing only the parts they want to change. This integration of audio and text simplifies the workflow for lyricists, allowing them to focus more on refining their craft and less on the technicalities of documenting ideas.

I wouldn’t be surprised if this feature saw further innovation over the years. Sure, Apple’s added more mics for better recording chops, and a dual-track ability to the voice-notes feature. A year from now, the company could turn it into a full multitrack app with the ability to cut, edit and loop samples. Given that the iPhone doesn’t have a Logic Pro app, expanding Voice Memos into a makeshift DAW sounds wonderful. Who knows, Apple could revamp GarageBand too, allowing you to make entire album demos right on your smartphone. Sounds too good to be true, sure, but who knows what the future holds?

The new iPhones are set to hit the market on September 20, with prices starting at $799 for the base model and $999 for the Pro. It’s unclear whether the new Voice Memos features will come to older iPhones, since they rely on the A18 chip. Given that Apple Intelligence will be made available to the iPhone 15 Pro and 15 Pro Max, one could assume that at least last year’s flagship Pro models should get this new set of app-based features.

The post Apple turned the Voice Memos app into a Makeshift Recording Studio for Artists first appeared on Yanko Design.

Bluesky now lets you upload videos, but there are some caveats

It’s easy to forget that there’s another social network besides Threads for people tired of Elon Musk’s totally normal X platform. Bluesky is a fine alternative, as it definitely “feels” like Twitter. However, it has been lacking some of the features that made Twitter such an internet hotspot back in the day. Now a big one has arrived: the company announced that users can upload video content.

There are some caveats. First of all, the videos have to be under a minute long. That’s a significant hurdle for just about every kind of content other than TikTok-style shorts; for comparison, Meta’s Threads allows five-minute videos. Also, videos autoplay by default, though that can be changed in the settings. Finally, there’s a hard limit of 25 video uploads per user each day, though the company says it could tweak that in the future.

The platform supports most of the major video file types, including .mp4, .mpeg, .webm and .mov files. Users can also attach subtitles to each video, which is a nice little bonus. There are some guardrails in place to protect against “spam and abuse”: only users who have verified their email address can upload videos, and illegal content will be “purged” from the infrastructure. There’s also a way to submit reports to the moderation team. Additionally, each video will be scanned for CSAM by Hive and Thorn.
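Bluesky hasn't published the client-side logic behind these checks, so the following is purely an illustrative sketch, but a hypothetical pre-upload validation against the stated limits (supported containers, sub-60-second duration) could be built from standard web APIs:

```typescript
// Hypothetical pre-upload checks mirroring Bluesky's stated limits.
// This is an illustrative sketch, not the platform's actual code.
const ALLOWED_EXTENSIONS = [".mp4", ".mpeg", ".webm", ".mov"];

function hasSupportedFormat(fileName: string): boolean {
  return ALLOWED_EXTENSIONS.some((ext) => fileName.toLowerCase().endsWith(ext));
}

function isUnderOneMinute(file: File): Promise<boolean> {
  return new Promise((resolve) => {
    // Load only the metadata so we can read the duration without decoding.
    const video = document.createElement("video");
    video.preload = "metadata";
    video.onloadedmetadata = () => {
      URL.revokeObjectURL(video.src);
      resolve(video.duration < 60); // videos must be under a minute
    };
    video.onerror = () => resolve(false);
    video.src = URL.createObjectURL(file);
  });
}
```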

Update to version 1.91 of the mobile app to get started; the feature also works via the desktop client. Not every user will be able to access it right away, as version 1.91 is rolling out gradually to “ensure a smooth experience.”

Bluesky recently added direct messages into the mix, which is something Threads doesn’t have. The platform may be a distant third, when compared to X and Threads, but it’s certainly growing. A massive influx of Brazilian users recently joined the social media site after X was banned in the country.

This article originally appeared on Engadget at https://www.engadget.com/social-media/bluesky-now-lets-you-upload-videos-but-there-are-some-caveats-185702403.html?src=rss

iPhone 16 Pro hands-on: How does a faux camera control button feel so real?

Apple's latest attempt to slightly differentiate the iPhone 16 series is... a faux button it's calling Camera Control. But unlike last year's new button, this one doesn't actually physically depress, and uses a mix of sensors and haptic feedback to simulate the sensation of movement. And in my brief hands-on right after Apple's iPhone 16 launch event, I have to say I actually thought it was a real button. 

Editors' Note: After some investigation, it turns out that at least part of the camera control is a real button that physically depresses. You'll feel the actual movement when you push all the way down, but the half-press is what's simulated by the iPhone 16's haptic feedback. It does a remarkably good job of simulating a two-stage button.

I got a quick look at the iPhone 16 Pro here in Apple Park, along with a deep walkthrough of the new camera control and its corresponding interface. When I first picked up the iPhone 16 Pro Max, it seemed sleeker and thinner than my iPhone 15 Pro Max, which was nice. My fingers were then drawn to the new "button," which sits inside a surrounding groove that makes it easier to find by touch.

From the home screen, I pressed down on the camera control and the camera app quickly opened. The Apple rep guarding these phones encouraged me to push the camera control with varying pressures, as a lighter touch changed the dial that popped up onscreen next to where the button sat. I dragged my finger on this surface, and the digital knob moved along with me, although I at first found the direction of the movement slightly counterintuitive. I am, however, one of those gamers who needs to invert the controls when looking around and navigating any environment, so that might be just me.

There is no way to reverse the direction of the camera control's swipe gesture, but you can tweak the settings to adjust pressure sensitivity. When the Apple rep asked me to push harder on the control, I was shocked at what felt like a real button moving below my fingertip. I confirmed again with the Apple rep that this was not a mechanical button that actually moves, and was met with reassuring nods. Next to me, fellow reviewer Brian Tong echoed my sentiment that the camera control feels remarkably like an actual button.

[Image: iPhone 16 Pro (Cherlynn Low for Engadget)]

Aside from marveling at the physical sensation on the iPhone 16 Pro, I also took a closer look at the changes to the interface. When I first pressed lightly on the camera control, a selection of options came up, allowing me to select Exposure, Zoom, Camera, Styles and Tone. Pressing down harder on any of these locked in that mode, and a different dial with more markings came up; swiping on the sensor moved the wheel. In the Camera mode, I was able to quickly switch between the ultrawide, main and zoom options, similar to how the viewfinder currently operates. If you prefer to use the existing interface to switch cameras, you can still do so.

When you pick the Styles option, you'll swipe between the new Photographic Styles that Apple introduced this year. In each of these, you can tap an icon on the top right of the app to edit them with the new touchpad-ish interface. Dragging your finger around this square at the bottom will adjust color temperature and hue settings. You can also make changes to the Photographic Style in your picture after it's been taken, so you don't have to worry too much about not liking the way something looks. 

[Image: iPhone 16 Pro (Cherlynn Low for Engadget)]

I also got to hold the iPhone 16 Pro Max in portrait mode and take a selfie. At first, my thumb was placed too high on the device's edge, and pressing down did nothing. I shifted the phone in my hand slightly, which felt a little precarious, then found the camera control and quickly took a shot. I'm not sure of the position here just yet, but it feels like something I'll figure out in time. 

Some of the camera improvements on the iPhone 16 Pro involve new video editing features, but I didn't get a chance to record 4K/120 FPS footage yet. I did get to peek at the updated video-editing interface, which has a tab on the side for Audio Mix, a tool that lets you isolate the voices of people on camera or make the shot sound like it was recorded inside a studio. It's all so very cinematic. I don't know that I believe people can shoot IMAX-friendly films on any iPhone ever, but the idea that you can is certainly intriguing.

Part of the reason I found the iPhone 16 Pro slightly sleeker than its predecessor likely has to do with its display. It now measures 6.9 inches, compared to its predecessor's 6.7-inch screen, yet Apple has managed to keep the handset at the same size as before by shaving the bezels down even further. It's not something you'll notice without putting the two devices side by side and really scrutinizing the borders, but it makes a small difference in making this year's Pro Max feel new.

Whether that makes a meaningful difference in maneuvering the phone or reading more content at once is something I'll wait to judge until I can scroll Reddit for hours on my couch. I'd also need more time to see whether Apple Intelligence and the new A18 Pro chip improve the iPhone 16 Pro experience and battery life. I know I'm super stoked for the update to the Voice Memos app and will be loudly singing into my iPhone 16 Pro whenever I get the chance. If you want the most comprehensive review from an aspiring singer, definitely come back to check out our full impressions soon. If not, well, you have been warned.

Update, September 09 2024, 8:04PM ET: This story has been updated to clarify that there is a real button in Camera Control, and that the half-step is what's simulated by haptic feedback.


This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/iphone-16-pro-hands-on-how-does-a-faux-camera-control-button-feel-so-real-191406863.html?src=rss

Apple invents its own version of Google Lens called Visual Intelligence

Apple has introduced a new feature called Visual Intelligence with the iPhone 16, which appears to be the company's answer to Google Lens. Unveiled during its September 2024 event, Visual Intelligence aims to help users interact with the world around them in smarter ways.

The feature is activated by a new touch-sensitive control on the right side of the device called Camera Control. With a click, Visual Intelligence can identify objects, provide information and offer actions based on what you point it at. For instance, aiming it at a restaurant will pull up menus, hours or ratings, while snapping a flyer for an event can add it directly to your calendar. Point it at a dog to quickly identify the breed, or click on a product to search for where to buy it online.

Later this year, Camera Control will also serve as a gateway into third-party tools with specific domain expertise, according to Apple's press release. For instance, users will be able to leverage Google for product searches or tap into ChatGPT for problem-solving, all while maintaining control over when and how these tools are accessed and what information is shared. Apple emphasized that the feature is designed with privacy in mind, meaning the company doesn’t have access to the specifics of what users are identifying or searching.

Apple says Visual Intelligence accomplishes this in part by processing data on the device itself, ensuring that the company does not know what you clicked on.


This article originally appeared on Engadget at https://www.engadget.com/ai/apple-invents-its-own-version-of-google-lens-called-visual-intelligence-180647182.html?src=rss

Genmoji and image-generation tools for iPhone reportedly delayed until iOS 18.2

Many of Apple Intelligence’s most anticipated features will arrive in a trickle well after the release of iOS 18, and according to Bloomberg’s Mark Gurman, it could be December before the iPhone will offer things like AI-generated images and custom emoji. Apple Intelligence is expected to make its debut with iOS 18.1, which Gurman has previously reported will likely come sometime in October. Genmoji and the upcoming image-generation tool, Image Playground, reportedly won’t be among its first features. Instead, Gurman predicts they’ll ship with iOS 18.2, which he says is slated for December.

Apple showed off Genmoji and Image Playground during its June event. With Genmoji, users will be able to create custom emoji from a prompt or make emoji of real people based on their photos. Image Playground, on the other hand, will let users generate images in three styles: Animation, Illustration and Sketch. It’ll be offered as a standalone app and as a built-in tool in other apps, including Messages.

All of these features will eventually be available for the iPhone 16 line, which will be unveiled on Monday, Sept. 9 at Apple’s “It’s Glowtime” event, as well as other recent iPhone models. Apple Intelligence will also bring ChatGPT integration, message summarization, a smarter Siri and more.

This article originally appeared on Engadget at https://www.engadget.com/mobile/genmoji-and-image-generation-tools-for-iphone-reportedly-delayed-until-ios-182-152526073.html?src=rss