ASRock AI QuickSet Software Tool with OpenVINO launched

Intel Arc GPU AI QuickSet Software

ASRock has unveiled the latest iteration of its AI QuickSet software tool, now featuring support for Intel Arc A-Series graphics cards. This innovative tool streamlines the process of downloading, installing, and configuring a wide array of artificial intelligence (AI) applications on systems running Microsoft Windows 10/11 and Canonical Ubuntu Linux. By integrating Intel Arc GPUs, […]

The post ASRock AI QuickSet Software Tool with OpenVINO launched appeared first on Geeky Gadgets.

Apple brings eye-tracking to recent iPhones and iPads

Ahead of Global Accessibility Awareness Day this week, Apple is issuing its typical annual set of announcements around its assistive features. Many of these are useful for people with disabilities, but also have broader applications as well. For instance, Personal Voice, which was released last year, helps preserve someone's speaking voice. It can be helpful to those who are at risk of losing their voice or have other reasons for wanting to retain their own vocal signature for loved ones in their absence. Today, Apple is bringing eye-tracking support to recent models of iPhones and iPads, as well as customizable vocal shortcuts, music haptics, vehicle motion cues and more. 

The most intriguing feature of the set is the ability to use the front-facing camera on iPhones or iPads (at least those with the A12 chip or later) to navigate the software without additional hardware or accessories. With this enabled, people can look at their screen to move through elements like apps and menus, then linger on an item to select it. 

That pause to select is something Apple calls Dwell Control, which has already been available elsewhere in the company's ecosystem like in Mac's accessibility settings. The setup and calibration process should only take a few seconds, and on-device AI is at work to understand your gaze. It'll also work with third-party apps from launch, since it's a layer in the OS like Assistive Touch. Since Apple already supported eye-tracking in iOS and iPadOS with eye-detection devices connected, the news today is the ability to do so without extra hardware.
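The dwell mechanic described above is easy to picture in code. Below is a minimal, hypothetical sketch of dwell-based selection — hold your gaze near one spot long enough and a "select" fires. The class name, radius, and timing values are all illustrative assumptions, not Apple's implementation.

```python
import math

class DwellSelector:
    """Toy sketch of dwell-based selection: if the gaze point stays
    within a small radius for `dwell_time` seconds, a selection fires.
    All names and thresholds here are illustrative, not Apple's."""

    def __init__(self, radius=30.0, dwell_time=1.0):
        self.radius = radius          # pixels the gaze may wander
        self.dwell_time = dwell_time  # seconds to hold before selecting
        self._anchor = None           # where the current dwell started
        self._elapsed = 0.0

    def update(self, gaze_xy, dt):
        """Feed one gaze sample; returns True when a selection fires."""
        if self._anchor is None:
            self._anchor, self._elapsed = gaze_xy, 0.0
            return False
        if math.dist(self._anchor, gaze_xy) > self.radius:
            # Gaze moved on: restart the dwell at the new position.
            self._anchor, self._elapsed = gaze_xy, 0.0
            return False
        self._elapsed += dt
        if self._elapsed >= self.dwell_time:
            self._anchor, self._elapsed = None, 0.0
            return True
        return False

selector = DwellSelector()
# Five samples 0.25 s apart, all hovering near (100, 100):
samples = [(100, 100), (102, 99), (101, 101), (100, 102), (99, 100)]
fired = [selector.update(s, 0.25) for s in samples]
print(fired)  # [False, False, False, False, True]
```

The real feature presumably layers noise filtering and per-element hit testing on top, but the core idea — restart the timer whenever the gaze wanders too far — is the same.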

Apple is also working on improving the accessibility of its voice-based controls on iPhones and iPads. It again uses on-device AI to create personalized models for each person setting up a new vocal shortcut. You can set up a command for a single word or phrase, or even an utterance (like "Oy!" perhaps). Siri will understand these and perform your designated shortcut or task. You can have these launch apps or run a series of actions that you define in the Shortcuts app, and once set up, you won't have to first ask Siri to be ready. 

Another improvement coming to vocal interactions is "Listen for Atypical Speech," which has iPhones and iPads use on-device machine learning to recognize speech patterns and customize their voice recognition around your unique way of vocalizing. This sounds similar to Google's Project Relate, which is also designed to help technology better understand those with speech impairments or atypical speech.

To build these tools, Apple worked with the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign. The institute is also collaborating with other tech giants like Google and Amazon to further development in this space across their products.

For those who are deaf or hard of hearing, Apple is bringing haptics to music players on iPhone, starting with millions of songs on its own Music app. When enabled, music haptics will play taps, textures and specialized vibrations in tandem with the audio to bring a new layer of sensation. It'll be available as an API so developers can bring greater accessibility to their apps, too. 
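One plausible way to derive taps from a song is to track the audio's amplitude envelope and emit a vibration intensity per short frame. Apple exposes the real thing to developers as an API (on iOS that would be Swift and Core Haptics); the sketch below is only a language-agnostic illustration of the envelope-to-haptics idea, with made-up frame sizes and thresholds.

```python
def haptic_envelope(samples, frame=4, floor=0.05):
    """Toy sketch of deriving a haptic track from audio: take the peak
    amplitude in each short frame and emit a vibration intensity in
    [0, 1], gating out near-silence. Frame size and floor are
    illustrative values, not Apple's."""
    track = []
    for i in range(0, len(samples), frame):
        peak = max(abs(s) for s in samples[i:i + frame])
        track.append(round(peak, 2) if peak >= floor else 0.0)
    return track

# A made-up snippet of normalized audio samples:
audio = [0.0, 0.1, 0.9, 0.4, 0.02, 0.01, 0.03, 0.0, 0.5, 0.5]
print(haptic_envelope(audio))  # [0.9, 0.0, 0.5]
```

A loud frame becomes a strong tap, near-silence becomes nothing, which is roughly the "vibrations in tandem with the audio" behavior described above.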

Drivers with disabilities need better systems in their cars, and Apple is addressing some of the issues with its updates to CarPlay. Voice control and color filters are coming to the interface for vehicles, making it easier to control apps by talking and for those with visual impairments to see menus or alerts. To that end, CarPlay is also getting bold and large text support, as well as sound recognition for noises like sirens or honks. When the system identifies such a sound, it will display an alert at the bottom of the screen to let you know what it heard. This works similarly to Apple's existing sound recognition feature in other devices like the iPhone.

[Image: A graphic demonstrating Vehicle Motion Cues on an iPhone, with a drawing of a car with two arrows on either side of its rear. Credit: Apple]

For those who get motion sickness while using their iPhones or iPads in moving vehicles, a new feature called Vehicle Motion Cues might alleviate some of that discomfort. Since motion sickness is based on a sensory conflict from looking at stationary content while being in a moving vehicle, the new feature is meant to better align the conflicting senses through onscreen dots. When enabled, these dots will line the four edges of your screen and sway in response to the motion it detects. If the car moves forward or accelerates, the dots will sway backwards as if in reaction to the increase in speed in that direction.
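The counter-sway behavior can be sketched in a few lines: take an accelerometer reading and offset the dots in the opposite direction, clamped to a small range. The gain and clamp values below are made-up illustrative numbers, not anything Apple has published.

```python
def dot_offsets(accel_x, accel_y, gain=4.0, max_px=12.0):
    """Toy sketch of Vehicle Motion Cues: onscreen dots sway opposite
    to the vehicle's acceleration so the visual motion agrees with what
    the inner ear feels. Gain and clamp are illustrative values."""
    def clamp(v):
        return max(-max_px, min(max_px, v))
    # Forward acceleration (+x) pushes the dots backwards (-x), etc.
    return clamp(-gain * accel_x), clamp(-gain * accel_y)

print(dot_offsets(2.0, 1.0))    # gentle acceleration: (-8.0, -4.0)
print(dot_offsets(-5.0, -1.0))  # hard braking, x clamped: (12.0, 4.0)
```

The clamp matters: the cues are meant to be a subtle peripheral signal, not a distraction that chases every bump.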

There are plenty more features coming to the company's suite of products, including Live Captions in visionOS, a new Reader mode in Magnifier, support for multi-line braille and a virtual trackpad for those who use Assistive Touch. It's not yet clear when all of these announced updates will roll out, though Apple has historically made these features available in upcoming versions of iOS. With its developer conference WWDC just a few weeks away, many of today's tools will likely be officially released with the next version of iOS.

This article originally appeared on Engadget at https://www.engadget.com/apple-brings-eye-tracking-to-recent-iphones-and-ipads-140012990.html?src=rss


Google’s Android OS to Get New AI Features

Android

Google’s Android operating system has come a long way since its inception, and the latest AI-powered updates are set to transform the way users interact with their mobile devices. By integrating advanced artificial intelligence features directly into the operating system, Google aims to make everyday tasks more intuitive and efficient, ultimately enhancing the overall user […]

The post Google’s Android OS to Get New AI Features appeared first on Geeky Gadgets.

Chaos V-Ray 6.1 free hardware benchmarking tool released

free benchmarking tool released

The V-Ray 6 Benchmark, a free standalone application from Chaos, has emerged as a catalyst for professionals seeking to evaluate the rendering speeds of their hardware. This powerful tool allows users to quickly and efficiently assess the capabilities of leading CPUs and GPUs, making it an essential resource for anyone involved in the field of […]

The post Chaos V-Ray 6.1 free hardware benchmarking tool released appeared first on Geeky Gadgets.

Data Scientist vs AI Engineer what’s the difference?

Data Scientist vs AI Engineer

In the rapidly evolving world of technology, two roles have gained significant prominence of late: Data Scientist and AI Engineer. While these positions share some similarities, they also have distinct responsibilities, skill sets, and work environments. Understanding the differences and overlaps between these roles is crucial for organizations looking to leverage data and artificial intelligence […]

The post Data Scientist vs AI Engineer what’s the difference? appeared first on Geeky Gadgets.


Phone-sized mini PC lets you take your computer and your work anywhere

Our smartphones have become so powerful that, in theory, they have hardware equivalent to entry-level laptops from a few years back. But despite all that silicon muscle, they can’t really replace our everyday PCs mostly because of the operating system that’s used on most of these computers: Windows. Small, portable, and inconspicuous computers have always been a dream for both users and business owners, whether it’s for working on the go or setting up kiosks, security systems, or space-efficient workstations. This small brick tries to deliver exactly that, giving you the flexibility you need for any kind of computing in almost any context in a size that’s no larger than high-capacity power banks.

Designer: Minisforum

With the popularity of the Apple Mac Mini and, now, the Mac Studio, mini PCs have become more visible in the market. These desktop alternatives, however, are still meant to sit on a table or a shelf despite their small sizes. And while these small computers offer enough power for some content creation or light gaming, their desk-bound designs close the door on use cases that call for something less tied down to a table.

The Minisforum S100 is a small, sleek box that you might easily mistake for a power bank. Ironically, it doesn’t actually have its own battery but is a mini PC that you can bring along with you or install in the narrowest of spaces. Despite its small size, it boasts a complete set of standard connectivity options, including Wi-Fi 6, Bluetooth 5.2, 2.5 Gbps Ethernet, USB-A ports, HDMI, and 65W USB-C.

The latter two are what make this design so portable and flexible. On a typical desk, you can connect it to a monitor that supports USB-C Power Delivery so that you don’t even have to plug the S100 into an outlet. You can connect another monitor via HDMI for a dual-screen setup and increased productivity. And when you’re done for the day, you can pick up the palm-sized mini PC and go, maybe even work or chill in a cafe by connecting an external display and a power bank. The Ethernet port also supports Power over Ethernet (PoE), so you can use the S100 as the brains behind a camera security system without plugging it in at all.

Despite the flexibility that the Minisforum S100 offers, its performance is hampered a bit by the quad-core Intel N100 processor that runs the show, definitely not the best among the chip maker’s “mobile” processors. Its lower-power operation, however, does allow the mini PC to sip rather than chug electricity and keep thermals equally low. You definitely won’t be running heavy applications, but for $189, a Windows 11 computer you can easily slip into your bag or even your pocket might actually be worth the price.

The post Phone-sized mini PC lets you take your computer and your work anywhere first appeared on Yanko Design.

Sony Xperia 10 VI Smartphone Unveiled

Sony Xperia 10 VI

The Sony Xperia 10 VI is a mid-range smartphone that offers an impressive array of features and capabilities, making it an excellent choice for everyday use. With its blend of performance, durability, and user-friendly features, the Xperia 10 VI is designed to cater to the needs of a wide range of users, from tech enthusiasts […]

The post Sony Xperia 10 VI Smartphone Unveiled appeared first on Geeky Gadgets.

Threads gets its own fact-checking program

This might come as a shock to you, but the things people put on social media aren't always truthful — really blew your mind there, right? Because of this, it can be challenging for people to know what's real without context or expertise in a specific area. That's part of why many platforms use a fact-checking team to keep an eye (or, often, to at least look like they're keeping an eye) on what's getting shared. Now, Threads is getting its own fact-checking program, Adam Mosseri, head of Instagram and de facto person in charge at Threads, announced. He first shared the company's plans to do so in December. 

Mosseri stated that Threads "recently" made it so that Meta's third-party fact-checkers could review and rate any inaccurate content on the platform. Before the shift, Meta was having fact-checks conducted on Facebook and Instagram and then matching "near-identical false content" that users shared on Threads. However, there's no indication of exactly when the program started or if it's global.
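Meta hasn't published how it matches "near-identical false content" across platforms, but a common approach to near-duplicate text detection is shingle overlap. The sketch below is purely an illustrative assumption: word trigrams compared with Jaccard similarity, with a made-up threshold.

```python
def shingles(text, n=3):
    """Word n-grams after light normalization (lowercase, strip punctuation)."""
    words = "".join(c.lower() if c.isalnum() or c.isspace() else " "
                    for c in text).split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def near_identical(a, b, threshold=0.6):
    """Toy sketch of matching near-identical posts: Jaccard similarity
    over word trigrams. Meta's real system is not public; the threshold
    and n-gram size here are illustrative."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return False
    return len(sa & sb) / len(sa | sb) >= threshold

post = "Scientists confirm the moon is made of cheese, sources say!"
copy = "Scientists confirm the moon is made of cheese sources say"
other = "A balanced diet includes plenty of fresh vegetables."
print(near_identical(post, copy))   # True
print(near_identical(post, other))  # False
```

Fuzzy matching like this catches lightly edited reposts of a debunked claim, which is presumably why checks made on Facebook or Instagram could carry over to Threads before the platform had reviewers of its own.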

Then there's the matter of seeing how effective it really can be. Facebook and Instagram already had these dedicated fact-checkers, yet misinformation has run rampant across the platforms. Ahead of the 2024 Presidential election — and as ongoing elections and conflicts happen worldwide — is it too much to ask for some hardcore fact-checking from social media companies?

This article originally appeared on Engadget at https://www.engadget.com/threads-gets-its-own-fact-checking-program-130013115.html?src=rss
