Apple reportedly tested a blood glucose monitoring app

Apple is reportedly still working on glucose management — this time through software. Bloomberg’s Mark Gurman says the company tested an app this year for pre-diabetic people, helping them manage their diet and lifestyle. Apple is said not to have plans to launch the app to consumers, but it could play a part in future health products.

The company reportedly tested the app internally, with employees confirmed through a blood test to be at risk of developing Type 2 diabetes. The subjects “actively monitored their blood sugar via various devices available on the market,” logging corresponding glucose changes. The app would then note correlations between dietary changes and blood sugar levels (for example, “don’t eat the pasta”).
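The diet-to-glucose correlation described above is easy to picture in code. The sketch below is purely illustrative: the data shape, the threshold and the food names are invented for the example and say nothing about how Apple's app actually works.

```python
# Purely illustrative: flag foods associated with large average
# post-meal glucose rises, the kind of correlation the app reportedly
# surfaced. All data, names and the threshold are invented.

def flag_spike_foods(meal_log, spike_threshold=40):
    """Return foods whose average post-meal glucose rise (mg/dL)
    exceeds spike_threshold, sorted alphabetically."""
    rises = {}
    for food, rise in meal_log:
        rises.setdefault(food, []).append(rise)
    return sorted(
        food for food, values in rises.items()
        if sum(values) / len(values) > spike_threshold
    )

meal_log = [
    ("pasta", 55), ("pasta", 62),  # large spikes
    ("salad", 8), ("salad", 12),   # mild response
    ("rice", 45), ("rice", 30),    # averages 37.5, under threshold
]
print(flag_spike_foods(meal_log))  # prints ['pasta']
```

A real app would of course work from timestamped sensor readings rather than hand-labeled "rises," but the "don't eat the pasta" insight reduces to a correlation like this one.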

Gurman says Apple paused the test to focus on other health features. Bloomberg notes that the Apple Health app currently lacks meal tracking, something rival services offer. The publication also says Apple could eventually offer deeper third-party glucose tracking integration into its products.

The study reportedly wasn’t directly related to Apple’s 15-year quest to offer non-invasive blood glucose monitoring, something that’s seemingly regurgitated in Apple Watch rumors every cycle. The company’s current hardware prototype is reportedly an iPhone-sized wearable device that uses lasers to shoot light into the skin. Gurman claims Apple’s first consumer-facing version — whether in the Apple Watch or some other form — will likely only notify users if they may be pre-diabetic. Providing specific glucose levels would have to come in later iterations.

This article originally appeared on Engadget at https://www.engadget.com/big-tech/apple-reportedly-tested-a-blood-glucose-monitoring-app-204241266.html?src=rss

Cash App users can claim thousands of dollars in a data breach settlement

Heads up if you’ve had a Cash App account over the last six years or so: you may now be able to claim thousands of dollars as a result of a class-action settlement. The company proposed the $15 million settlement earlier this year following two security incidents. If you're eligible to make a claim, you only have a few weeks to do so.

The first related breach took place in December 2021 when, according to Cash App, a former employee downloaded reports containing information on more than 8 million users. This included their full names, brokerage account numbers and, in some cases, the holdings and value of investment portfolios. Cash App disclosed the incident in April 2022.

The consolidated class-action complaint alleged that Cash App and parent company Block failed to enact sufficient security measures to prevent another data breach. This involved Cash App’s person-to-person payment services. According to the plaintiffs, “an unauthorized user accessed certain Cash App accounts in 2023 using recycled phone numbers." The complaint contended that Cash App and Block mishandled complaints related to both breaches and fraudulent transactions.

Cash App and Block have denied any wrongdoing, The New York Times reports. They say the settlement is not an admission of liability.

You may be eligible to make a claim if you had a Cash App account between August 23, 2018 and August 20 of this year. The settlement will cover up to $2,500 of out-of-pocket costs stemming from the breaches, as well as up to three hours' worth of lost time at $25 per hour. Those who have sustained a monetary loss and haven’t yet been reimbursed can file a claim for that too.

If you plan to file a claim through the settlement website, you’ll need to do so by 2AM ET on November 19. A final court hearing in the case is set for December 16.

This article originally appeared on Engadget at https://www.engadget.com/big-tech/cash-app-users-can-claim-thousands-of-dollars-in-a-data-breach-settlement-194520756.html?src=rss

Apple updates its beta testing service TestFlight with redesigned invites and more

Apple’s beta testing service TestFlight just got a fairly substantial update, according to a report by TechCrunch. The refresh gives developers much more control over who can join a beta and how new features are shared. It also lets beta testers get more information about an app before they dive in.

Developers will be able to set all kinds of criteria for who can or cannot access a beta. This should help devs narrow test groups to specific audiences, like those using a particular device or OS version. TestFlight caps betas at 10,000 testers by default, so criteria should help reserve spots for the intended audience. Developers can also now lower that cap to whatever number they want.
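Conceptually, the new criteria act as a filter plus a cap on who can claim a beta slot. The toy sketch below is an analogy only: the field names and rule format are invented, and real TestFlight criteria are configured in App Store Connect, not written as code.

```python
# Toy analogy for criteria-based beta gating: keep testers who match
# device/OS rules, up to an invite cap. Field names and the rule
# format are invented for illustration; this is not a TestFlight API.

def filter_testers(testers, min_os, allowed_devices, cap):
    matched = [
        t for t in testers
        if t["device"] in allowed_devices and t["os"] >= min_os
    ]
    return matched[:cap]

testers = [
    {"name": "ana", "device": "iPhone", "os": (18, 1)},
    {"name": "ben", "device": "iPad", "os": (18, 0)},    # wrong device
    {"name": "cam", "device": "iPhone", "os": (17, 6)},  # OS too old
]
result = filter_testers(testers, min_os=(18, 0),
                        allowed_devices={"iPhone"}, cap=100)
print([t["name"] for t in result])  # prints ['ana']
```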

The update also gives the invites themselves more polish, as they can now highlight new features and content. Apple says beta builds of apps that have already been approved for publication can now include screenshots and the app category along with the invite.

App creators will also be able to view metrics on the success of a beta invite, including how many people viewed it, how many opted in and why people declined.

As for users, beta invites can include a feedback field. This is for people to let the developer know why they chose not to download an app.

This article originally appeared on Engadget at https://www.engadget.com/apps/apple-updates-its-beta-testing-service-testflight-with-redesigned-invites-and-more-185002704.html?src=rss

Google Photos will show when images have been modified with AI

Big tech firms have been adding AI tools across their software offerings over the past year. But as it becomes ever easier to manipulate images and video with generative AI, there's been a second wave of companion policies meant to better inform people when that technology has been applied to content. Google is the latest to follow the trend.

After Google debuted tools like the Magic Editor last spring and brought AI to its video editor last month, Google Photos will begin labeling visual content that has been modified with AI. Google was already tagging AI-modified images with corresponding metadata, but now a plain-language statement will accompany edited photos. In the example the company shared in its blog post, there is a section at the bottom of the image details screen titled "AI Info," which lists the AI tool used to edit the image. It will also state when an image has been modified with generative AI or when an image is a composite of several photos made without generative AI, such as with the Best Take feature. The new language will appear in Google Photos beginning next week.
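Disclosures like this are commonly embedded in a file's IPTC/XMP metadata, where the IPTC DigitalSourceType vocabulary includes values such as "trainedAlgorithmicMedia" for AI-generated or AI-edited content. Whether Google Photos writes that exact value is an assumption here; the sketch below just does a naive byte scan for the marker rather than properly parsing the XMP packet.

```python
# Hedged sketch: naively scan a file's raw bytes for the IPTC
# DigitalSourceType marker "trainedAlgorithmicMedia", one standard way
# AI involvement is disclosed in image metadata. Whether Google Photos
# writes this exact value is an assumption for illustration; a real
# tool would parse the XMP packet with a proper metadata library.

def looks_ai_labeled(path):
    with open(path, "rb") as f:
        return b"trainedAlgorithmicMedia" in f.read()
```

A substring scan like this can only suggest the presence of a disclosure; production code should use a real metadata parser to read the field in context.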

This article originally appeared on Engadget at https://www.engadget.com/ai/google-photos-will-show-when-images-have-been-modified-with-ai-180007494.html?src=rss

Google Calendar’s web client finally gets a dark mode

Google Calendar’s web client just got a fairly significant refresh that brings the app more in line with Google’s Material Design 3. There are updated buttons, dialogs and sidebars to make the whole thing “more modern and accessible.”

The interface typography got a custom-designed refresh that adds “highly-legible typefaces” to ensure a “fresh feel” while remaining “legible and crisp.” Google Calendar now also lets users toggle between light mode and dark mode, to help reduce both battery usage and eye strain.

The redesign in action. (Image: Google)

These updates apply to the “entire calendar web experience.” This includes the task list view, which is nice. However, Google has warned that the update could impact the experience of installed Chrome extensions that are active when using Calendar. The company recommends “contacting the developers of those extensions to report any potential issues.”

The redesign begins rolling out today, but could take 15 days or longer to reach every user. You know the drill. It’s available to all Google Workspace customers and personal account holders.

This is just the latest update to the Google Calendar experience. The company recently released an app for Wear OS. It also launched Google Essentials, an all-in-one Windows app that bundles the company’s entire suite of services (including Calendar).

This article originally appeared on Engadget at https://www.engadget.com/apps/google-calendars-web-client-finally-gets-a-dark-mode-174055306.html?src=rss

Snapchat’s camera is getting a shortcut on the iPhone lock screen

Snapchat users will soon be able to launch the app’s camera directly from the iPhone lock screen, thanks to an app update and the magic of iOS 18. The latest iPhone operating system update allows people to swap out the flashlight and camera on the lock screen for a diverse array of other useful tools. These tools will now include the Snapchat camera.

Tapping the button will launch Snapchat’s “Camera Only” mode. This will, technically, let you create a Snap, but there’s a major caveat. The lock screen allows users to take photos, but not add filters or post anything. For that, you’ll have to unlock the phone with Face ID, Touch ID or a passcode and open the actual app.

The feature will be available via an app update sometime this week. It also requires iOS 18.0 or the forthcoming iOS 18.1. To get started, tap and hold the Lock Screen and tap Customize. Follow the prompts to remove the default options and add the Snapchat camera.

Phones running iOS 18 offer another little update for Snapchat users. The volume buttons can now be used to capture a Snap while the app is open. Just press either volume button to take a photo or hold one down to record a video.

This article originally appeared on Engadget at https://www.engadget.com/apps/snapchats-camera-is-getting-a-shortcut-on-the-iphone-lock-screen-130039644.html?src=rss

Apple Intelligence expands in iOS 18.2 developer beta, adding Genmoji, Visual Intelligence and ChatGPT

The Apple Intelligence rollout has been slow, staggered and steady since the company first unveiled its take on AI at WWDC this year. It continues today with the release of the latest developer betas for iOS 18, iPadOS 18 and macOS Sequoia. The updates in iOS 18.2, iPadOS 18.2 and macOS Sequoia (15.2) bring long-awaited features like Genmoji, Image Playground, Visual Intelligence and ChatGPT integration for those running the preview software, as well as Image Wand for iPads and more writing tools.

This follows the announcement that iOS 18.1 would be available as a stable release to the public next week, which would bring things like writing tools, notification summaries and Apple's hearing test to the masses. 

That marks the first chance for people who haven't opted into beta software to check out Apple Intelligence, which the company has widely touted as the headline feature for the devices it launched this year. The iPhone 16 series, for example, was billed as designed for Apple Intelligence, though the phones launched without those features.

Now that the next set of tools is ready for developers to test, it seems we're weeks away from them arriving for the public. For those already on the developer beta, the update will land automatically. As always, a word of caution: beta software is meant for testing new features and often checking for compatibility problems. It can be buggy, so always back up your data before installing previews. In this case, you'll also need an Apple developer account to get access.

Today's update brings Genmoji, which lets you create custom emoji from your keyboard. You'll go to the emoji keyboard, tap the Genmoji button next to the description or search input field, then enter what you want to create. Apple Intelligence will generate a few options, which you can swipe through before selecting one to send. You'll be able to use them as Tapback reactions to other people's messages, too. Plus, you can make Genmoji based on pictures of your friends, creating more accurate Memoji of them. Since these are all presented in emoji style, there won't be the risk of mistaking them for real pictures.

Apple is also releasing a Genmoji API today so third-party messaging apps can read and render Genmoji, and folks you text on WhatsApp or Telegram can see your hot new gym rat emoji.

Other previously announced features like Image Playground and Image Wand are also available today. The former is both a standalone app and something you can access from the Messages app via the plus button. If you go through Messages, the system will quickly generate some suggestions based on your conversations. You can also type descriptions or select photos from your gallery as a reference, and the system will serve up an image you can then tweak. To prevent confusion, only two art styles are available: Animation and Illustration. You won't be able to render photorealistic pictures of people.

Image Wand will also be arriving today as an update to the Apple Pencil tool palette, helping to turn your cruddy sketches into more-polished works of art.

As announced at WWDC, Apple is bringing ChatGPT to Siri and Writing Tools, and each time your request might be well-served by OpenAI's tools, the system will suggest heading there. For example, if you ask Siri to generate an itinerary, a workout routine or even a meal plan, the assistant might say it needs to use ChatGPT to do so and ask for your permission. You can choose to have the system ask you each time it goes to GPT or surface these requests less often. 

It's worth reiterating that you don't need a ChatGPT account to use these tools, and Apple has its own agreement with OpenAI so that when you use the latter's services, your data like your IP address won't be stored or used to train models. However, if you do connect your ChatGPT account, your content will be covered by OpenAI's policies.

Elsewhere, Apple Intelligence will also let you compose with ChatGPT within Writing Tools, which is where you'll find things like Rewrite, Summarize and Proofread. That area is also getting a new tool in the developer beta called "Describe your change." This is basically a command bar that lets you tell Apple Intelligence exactly what you want done to your writing. "Make it sound more enthusiastic," for example, or "Check this for grammar errors." It should make getting the AI to edit your work a bit easier, since you won't have to go to the individual sections for Proofread or Summarize. You can also ask it to do things like "Turn this into a poem."

Finally, if you have an iPhone 16 or iPhone 16 Pro and are running the developer beta, you'll be able to try out Visual Intelligence. That lets you point your camera at things around you and get answers for things like math problems in your textbook or the menu of a restaurant you pass on your commute. It can tap third-party services like Google and ChatGPT, too.

Outside of the iPhone 16 series, you'll need a compatible device to check out any Apple Intelligence features. That means an iPhone 15 Pro or newer, or an M-series iPad or MacBook.

This article originally appeared on Engadget at https://www.engadget.com/ai/apple-intelligence-expands-in-ios-182-developer-beta-adding-genmoji-visual-intelligence-and-chatgpt-170920932.html?src=rss

Adobe Fresco’s previously paywalled features are now free for everyone

Adobe Fresco is Adobe’s painting app, designed to compete with the likes of Clip Studio Paint and Procreate. It launched almost five years ago for $10 a year, which was reasonable, but Procreate’s one-time $13 purchase included many powerful features, making it the go-to option for artists who wanted to draw on an iPad. Now, Adobe is making Fresco completely free to use, letting everyone access functions that used to be locked behind a paywall.

Adobe’s Fresco FAQ page shows what the paid plans offered. Fresco has had a free plan from the beginning, but those who didn’t pay were missing out on more than a thousand brushes, premium shapes and the ability to import custom brushes. Now, all of those features are free for every user.

For the uninitiated, Adobe Fresco is available on iPhone, iPad, Windows PCs and Windows tablets. Not every model will run the app, but you can check this list to see if your device is compatible. Most recent devices should be able to use it, though — support goes all the way back to iPhones and iPads Apple released nearly 10 years ago, including the iPhone 6 and every iPad Pro.

By making Fresco completely free, Adobe may be trying to fend off the competition now that those rival apps cost more. Fresco has unique functions like motion presets to instantly animate drawings, and artwork mirroring, as The Verge notes. Now that it’s free to download, anyone interested can grab it from the Apple App Store or the Adobe website.

This article originally appeared on Engadget at https://www.engadget.com/apps/adobe-frescos-previously-paywalled-features-are-now-free-for-everyone-141956420.html?src=rss