The ambitious expansion will reportedly take the form of a new chip manufacturing facility, a packaging site and a research and development space, all located in or near Taylor, Texas, where Samsung built its existing semiconductor plant. That facility isn't operational yet, but it's slated to begin producing “crucial logic chips” later this year. For the geographically challenged, Taylor is around a 40-minute drive from Austin.
If this actually happens, it’ll be a huge win for the Biden administration. One of the main goals of the CHIPS Act, after all, is to lure global chipmakers to build on US soil. To that end, Washington plans on awarding more than $6 billion to Samsung as further incentive to keep things running in the good ole USA.
The CHIPS Act has allowed the federal government to award funding and offer loans to tech companies to encourage domestic chip production. Back in February, the multinational semiconductor company GlobalFoundries received a $1.5 billion grant to help pay for a major US expansion, in addition to a $1.6 billion loan. It plans on building a new fabrication facility in Malta, New York, which will manufacture chips for the automotive, aerospace, defense and AI industries.
More recently, Intel received the largest CHIPS grant to date, snagging up to $8.5 billion to continue various US-based operations. The current plan is for Intel to use that money to build plants that manufacture leading-edge semiconductor chips meant for use in AI and other advanced applications. The company's building two new fabrication facilities in Arizona and two in Ohio. Additionally, it's going to use the money to modernize two pre-existing fabs in New Mexico and expand one location in Oregon. All told, Intel plans to invest $100 billion in US-based chip manufacturing, and the various projects are expected to create 20,000 construction jobs and 10,000 manufacturing jobs.
President Joe Biden signed the CHIPS and Science Act into law back in 2022 to foster domestic semiconductor research and manufacturing and to lessen America's reliance on Chinese suppliers. It sets aside $52 billion in tax credits and funding for firms to expand stateside production.
Meta says that its current approach to labeling AI-generated content is too narrow and that it will soon apply a "Made with AI" badge to a broader range of videos, audio and images. Starting in May, it will append the label to media when it detects industry-standard AI image indicators or when users acknowledge that they’re uploading AI-generated content. The company may also apply the label to posts that fact-checkers flag, though it's likely to downrank content that's been identified as false or altered.
The company announced the measure in the wake of an Oversight Board decision regarding a video that was maliciously edited to depict President Joe Biden touching his granddaughter inappropriately. The Oversight Board agreed with Meta's decision not to take down the video from Facebook as it didn't violate the company's rules regarding manipulated media. However, the board suggested that Meta should “reconsider this policy quickly, given the number of elections in 2024.”
Meta says it agrees with the board's "recommendation that providing transparency and additional context is now the better way to address manipulated media and avoid the risk of unnecessarily restricting freedom of speech, so we’ll keep this content on our platforms so we can add labels and context." The company added that, in July, it will stop taking down content purely based on violations of its manipulated video policy. "This timeline gives people time to understand the self-disclosure process before we stop removing the smaller subset of manipulated media," Meta's vice president of content policy Monika Bickert wrote in a blog post.
Meta had been applying an “Imagined with AI” label to photorealistic images that users whip up using the Meta AI tool. The updated policy goes beyond the Oversight Board's labeling recommendations, Meta says. "If we determine that digitally-created or altered images, video or audio create a particularly high risk of materially deceiving the public on a matter of importance, we may add a more prominent label so people have more information and context," Bickert wrote.
While the company generally believes that transparency and allowing appropriately labeled AI-generated photos, images and audio to remain on its platforms is the best way forward, it will still delete material that breaks the rules. "We will remove content, regardless of whether it is created by AI or a person, if it violates our policies against voter interference, bullying and harassment, violence and incitement, or any other policy in our Community Standards," Bickert noted.
The Oversight Board told Engadget in a statement that it was pleased Meta took its recommendations on board. It added that it would review the company's implementation of them in a transparency report down the line.
"While it is always important to find ways to preserve freedom of expression while protecting against demonstrable offline harm, it is especially critical to do so in the context of such an important year for elections," the board said. "As such, we are pleased that Meta will begin labeling a wider range of video, audio and image content as 'Made with AI' when they detect AI image indicators or when people indicate they have uploaded AI content. This will provide people with greater context and transparency for more types of manipulated media, while also removing posts which violate Meta’s rules in other ways."
Update 4/5 12:55PM ET: Added comment from The Oversight Board.
We highlight Samsung's Galaxy Tab A9+ in our Android tablet buying guide for those who just want a competent slate for as little money as possible. If that describes you, take note: The 11-inch device is now on sale for $170 at several retailers, including Amazon, Walmart and Best Buy. This deal has technically been available for a couple of weeks, but it still represents the lowest price we've tracked. For reference, Samsung typically sells the tablet for $220. Both the Graphite and Silver finishes are discounted.
Do note, though, that this price applies to the base model, which includes 4GB of RAM and 64GB of storage. The latter is expandable with a microSD card, but the limited memory can cause some stutters if you push the tablet beyond casual streaming and web browsing. Samsung sells a higher-spec model with 8GB of RAM and 128GB of storage: That one will be a better buy for gaming and more involved use, and it's also $50 off at $220, another all-time low.
As my colleague Sam Rutherford notes in our buying guide, the Galaxy Tab A9+ isn't likely to wow you in any one area, but it covers the basics well. Its 11-inch LCD display is well-sized and has a faster-than-usual 90Hz refresh rate, which helps the UI feel smooth to scroll through. The screen has a wide 16:10 aspect ratio, so it's better suited to landscape mode than the 4:3 display on an iPad. The metal and plastic frame is slick for the price, while the 7,040mAh battery should hold up for a day or two of casual use. And though no Android tablet really nails the software experience, most people should find Samsung's One UI to be cleaner than something like Amazon's Fire OS. The company says it'll provide OS updates through the eventual Android 16 and security updates through October 2027.
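If you're wondering how that 16:10 ratio actually plays out, a little geometry makes it concrete. Here's a back-of-the-envelope sketch in Python (the 4:3 comparison is assumed to share the Tab A9+'s 11-inch diagonal, which real iPads don't exactly match) that converts a diagonal and an aspect ratio into width, height and area:

```python
import math

def panel_dimensions(diagonal_in, ratio_w, ratio_h):
    """Return (width, height) in inches for a given diagonal and aspect ratio."""
    # A w:h rectangle's diagonal scales with sqrt(w^2 + h^2).
    scale = diagonal_in / math.hypot(ratio_w, ratio_h)
    return ratio_w * scale, ratio_h * scale

for name, (w, h) in {
    "16:10 (Galaxy Tab A9+)": panel_dimensions(11, 16, 10),
    "4:3 (iPad-style, same diagonal)": panel_dimensions(11, 4, 3),
}.items():
    print(f"{name}: {w:.1f} x {h:.1f} in, {w * h:.1f} sq in")
```

Run it and the 16:10 panel comes out around 9.3 x 5.8 inches versus 8.8 x 6.6 for 4:3: the wider screen gives up a little total area in exchange for the extra width that makes landscape video feel more natural.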
That said, this is still a cheap tablet. The Galaxy Tab A9+'s Snapdragon 695 chip is speedy enough but no powerhouse, and its charging speed tops out at a relatively meager 15W. There's no fingerprint reader, included stylus or formal water-resistance rating, either. If you're not beholden to Android, one of Apple's iPads will still be more well-rounded (though we expect to see new models arrive in the coming weeks). Still, at these prices, the Galaxy Tab A9+ is a solid buy if you're on a tighter budget.
The Apple Vision Pro is an impressive piece of hardware, and the eye-tracking/hand gesture input combo is fantastic for navigating menus and the like. It’s not so great for gaming. There haven't been many easy ways to connect a third-party controller for playing iPad or cloud games. This is changing, however, as accessory manufacturer 8BitDo just announced Vision Pro compatibility for a number of its controllers.
These accessories are officially supported by Apple, so they should work as soon as you make a Bluetooth connection. No muss and no fuss. All told, eight devices got the Apple seal of approval here. One such gadget is the company’s Ultimate Bluetooth Controller, which we basically called the perfect gamepad for PC.
Other compatible devices include various iterations of the SN30 Pro controller, the Lite 2 and the NES-inspired N30 Pro 2. The integration isn’t just for game controllers, as 8BitDo also announced AVP compatibility for its Retro Mechanical Keyboard. Of course, the Vision Pro works out of the box with most Bluetooth keyboards.
The controller support is the bigger news, though, since media consumption is one of the best parts of the Vision Pro experience, and video games fall squarely in that category. Just about every iPad title works on the device. If playing Cut the Rope on a giant virtual screen doesn’t do it for you, the headset also integrates with Xbox Cloud Gaming and Nvidia GeForce Now for access to AAA titles.
8BitDo announced official controller support for Apple devices last year, though this was primarily for smartphones, tablets and Mac computers. The integration was thanks to new controller firmware and Apple's recent iOS 16.3, iPadOS 16.3, tvOS 16.3 and macOS 13.2 updates. It looks like all of the accessories that work with iPhones and iPads also work with the Vision Pro.
Having a fancy webcam is all well and good, but another thing you might need to seriously upgrade the quality of your video calls and livestreams is a decent key light. It will illuminate your face to help you stand out from the background and help the camera discern your features more clearly. You don’t need to break the bank to get a decent key light either. Logitech’s Litra Beam is currently $10 off at $90. That’s only $5 more than the lowest price we’ve seen for it.
The Litra Beam looks a bit like an LED reading lamp and it would be a fairly stylish addition to many setups. It has a three-way adjustable stand, allowing you to tweak the height, tilt and rotation as needed, while its ability to run on either USB or AC power gives you more placement options.
The device uses TrueSoft tech, which, according to Logitech, provides "balanced, full-spectrum LED light with cinematic color accuracy for a natural, radiant look across all skin tones." A frameless diffuser helps mitigate harsh shadows, according to the company.
You'll be able to adjust the Litra Beam's brightness, color temperature, presets and other settings through the Logitech G Hub desktop app, which also allows you to manage multiple lights at once. In addition, the key light has five physical buttons on the rear for quick switching between brightness and color temperature settings.
On Monday, April 8, a total solar eclipse will be visible across a swath of North America, from Mexico’s Pacific coast to the easternmost reaches of Canada. And in those few minutes of daytime darkness, all sorts of interesting phenomena are known to occur — phenomena NASA would like our help measuring.
During a total solar eclipse, temperatures may drop and winds may slow down or change their course. Animals have been observed to behave unusually — you might hear crickets start their evening chatter a few hours early. Even radio communications can be disrupted due to changes in the ionosphere while the sun’s light is blocked. And, the sun’s corona — its outermost atmosphere — will come into view, presenting scientists (and those of us helping them) with a rare opportunity to study this layer that’s normally invisible to the naked eye.
NASA has lots of research efforts planned for the eclipse, and has sponsored a handful of citizen science campaigns that anyone can take part in if they’re in or near the path of totality, or the areas where people on the ground can watch the sun become completely obscured by the moon. The path of totality crosses parts of 13 US states: Texas, Oklahoma, Arkansas, Missouri, Illinois, Kentucky, Indiana, Ohio, Pennsylvania, New York, Vermont, New Hampshire and Maine. It’s an event of some significance; the next time a total solar eclipse passes over that much of the contiguous US won’t be until 2045.
All you’ll need to join in is equipment you already own, like a smartphone, and a few minutes set aside before the eclipse to go through the training materials.
Help measure the shape of the sun
One such citizen science project is SunSketcher, a concerted effort to measure the true shape of the sun. While the sun is closer to being a perfect sphere than other celestial bodies that have been observed, it’s still technically an oblate spheroid, being a smidge wider along its equator. The SunSketcher team plans to get a more precise measurement by crowd-sourcing observations of Baily's Beads, or the little spots of sunlight that peek out from behind the moon at certain points in the eclipse.
The Baily’s Bead effect is “the last piece of the sun seen before totality and the first to appear after totality,” NASA explained in a blog post. “For a few seconds, these glimmers of light look like beads along the moon’s edge.” They’re visible thanks to the uneven topographical features on the lunar surface.
You’ll need to download the free SunSketcher app, which is available for iOS and Android on the App Store and Google Play Store. Then, a few minutes before totality (the exact time is location-dependent), put your phone on Do Not Disturb, hit “Start” in the app and prop up the phone in a place where it has a good view of the sun. After that, leave it be until the eclipse is over — the app will automatically take pictures of Baily’s Beads as they show up.
There’s a tutorial on the SunSketcher website if you want to familiarize yourself with the process beforehand. When it’s all said and done, the pictures will be uploaded to SunSketcher’s server. They’ll eventually be combined with observations from all over to “create an evolving pattern of beads” that may be able to shed better light on the size and shape of the sun.
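To put that "oblate spheroid" claim in perspective, the sun's equatorial and polar radii are thought to differ by only around 10 kilometers out of roughly 696,000, which is part of why pinning down the true figure takes a continent's worth of observations. Here's a minimal sketch of the flattening calculation, using those approximate, illustrative values (refining numbers like these is exactly what SunSketcher is after):

```python
# Approximate, illustrative values: the sun's equatorial radius is about
# 696,000 km, and the equator-pole difference is thought to be on the
# order of 10 km. SunSketcher aims to pin down figures like these.
R_EQUATORIAL_KM = 696_000.0
R_POLAR_KM = R_EQUATORIAL_KM - 10.0  # assumed ~10 km difference

# Flattening of an oblate spheroid: f = (a - b) / a
flattening = (R_EQUATORIAL_KM - R_POLAR_KM) / R_EQUATORIAL_KM
print(f"flattening ~ {flattening:.1e}")  # ~1.4e-05, i.e. nearly a perfect sphere
```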
The SunSketcher images probably won’t blow you away, so if you’re hoping to get some great pictures of the eclipse, you’ll want to have another camera on hand for that (with the appropriate filters to protect your eyes and the device’s sensors).
Record changes in your surroundings
Eclipse-watchers can also use their smartphones to record the environmental changes that take place when the sun dips behind the moon as part of a challenge run by the Global Learning and Observations to Benefit the Environment (GLOBE) program. You’ll need an air temperature thermometer as well for this task, and you can start logging observations in the days before the eclipse if you feel like being extra thorough.
Temperatures at the surface can, in some cases, drop as much as 10 degrees Fahrenheit during a total solar eclipse, according to NASA. And certain types of clouds have been observed to dissipate during these brief cooldowns, resulting in unexpectedly clear skies in the moments before totality. Data collected with the help of citizen scientists during the 2017 total solar eclipse showed that areas with heavier cloud cover experienced a less extreme drop in surface temperatures.
To participate this time around, download the GLOBE Observer app from the App Store or Google Play Store, and then open the GLOBE Eclipse tool from the in-app menu. There, you’ll be able to jot down your temperature measurements, take photos of the sky to record any changes in cloud cover and make notes about the wind conditions. Plan to dedicate a few hours to this one; NASA asks that you include observations from 1-2 hours before and after the eclipse in addition to what you record during it. “You will measure temperature every 5-10 minutes and clouds every 15-30 minutes or whenever you see change,” NASA says.
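To get a feel for what that cadence adds up to, here's a minimal sketch that lays out an observation schedule at the relaxed end of NASA's suggested intervals. The 1:40PM totality time is a made-up placeholder; look up the real time for your location:

```python
from datetime import datetime, timedelta

# Hypothetical local totality time; substitute the real one for your spot.
TOTALITY = datetime(2024, 4, 8, 13, 40)

WINDOW_START = TOTALITY - timedelta(hours=2)
WINDOW_END = TOTALITY + timedelta(hours=2)

def schedule(interval_minutes):
    """Yield observation times across the window at a fixed interval."""
    t = WINDOW_START
    while t <= WINDOW_END:
        yield t
        t += timedelta(minutes=interval_minutes)

temps = list(schedule(10))   # temperature every 10 minutes
clouds = list(schedule(30))  # cloud observations every 30 minutes
print(f"{len(temps)} temperature readings, {len(clouds)} cloud checks")
```

Even at that relaxed pace, a two-hours-before to two-hours-after window works out to 25 temperature readings and 9 cloud checks, so set your reminders accordingly.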
You can keep using the GLOBE Observer app for citizen science beyond eclipse day, too. There are programs running all year round for recording observations of things like clouds, land use, mosquito habitats and tree heights. The eclipse tool, though, is only available when there’s an eclipse happening.
Listen to the sounds of wildlife
Observations going back nearly 100 years have added support to the idea that total solar eclipses temporarily throw some animals out of whack. Inspired by a 1935 study that gathered observations on animal behavior during an eclipse three years prior, the Eclipse Soundscapes Project is inviting members of the public to take note of what they hear before, during and after totality, and share their findings.
To be an Observer for the project, it’s recommended that you first sign up on the website and go through the brief training materials so you can get a sense of what type of information the project is looking for. The website also has printable field notes pages you can use to record your observations on eclipse day. Start taking notes at least 10 minutes before totality. After the eclipse is over, fill out the webform to submit your observations along with your latitude and longitude.
If you happen to have an AudioMoth acoustic monitoring device and a spare microSD card lying around, you can go a step further and record the actual sounds of the environment during the eclipse as a Data Collector. You’ll need to set everything up early — the project says to do it on Saturday, April 6 before noon — and let it record until at least 5PM local time on April 10. At that point, you can turn it off, submit your notes online and mail in the SD card. All of the details for submission can be found on the project’s website.
Take photos of the solar corona
The Eclipse Megamovie 2024 is an initiative designed to study the sun’s corona and plasma plumes from locations in the path of totality, building off of a previous campaign from the 2017 total solar eclipse. It’s already selected a team of 100 Science Team Alpha Recruits (STARs) who underwent training and were given 3D-printed tracking mounts for their cameras to shoot the best possible images. But, the project will still be accepting photo submissions from any enthusiasts who have a DSLR (and a solar filter) and want to participate.
The Photography Guide is pretty exhaustive, so don’t wait until eclipse day to start figuring out your setup. You’ll be able to submit your photos after the eclipse through a form on the website.
However you choose to spend the eclipse, whether you’re collecting data for a citizen science mission or just planning to kick back and observe, make sure you have everything in place well ahead of time. While the partial eclipse phases will last over an hour, totality will be over and done in about 3.5 to 4.5 minutes, depending on where you’re watching from. You wouldn’t want to miss out on some of that time because you were fumbling with your camera.
Totality will start shortly after 11AM local time (2PM ET) for western Mexico, moving northeastward over the subsequent two-or-so hours before exiting land near Newfoundland, Canada around 5:30PM local time. There will still be something to see for people outside the path of totality, too. Most of the US will be treated to a partial eclipse that day. You can find out exactly when the eclipse will be visible from your location with this tool on NASA’s website, along with the percentage of sun coverage you can expect to witness.