One of these concept lunar vehicles could join NASA’s Artemis V astronauts on the moon

Three companies are vying for the opportunity to send their own lunar vehicle to the moon to support NASA’s upcoming Artemis missions. The space agency announced this week that it has chosen Intuitive Machines, Lunar Outpost and Venturi Astrolab to develop their lunar terrain vehicles (LTVs) in a feasibility study over the next year. After that, only one is expected to be selected for a demonstration mission, in which the vehicle will be completed and sent to the moon for performance and safety tests. NASA plans to use the LTV starting with the Artemis V crew, which is projected to launch in early 2030.

The LTV that eventually heads to the moon’s south pole needs to function as both a crewed and uncrewed vehicle, serving sometimes as a mode of transportation for astronauts and other times as a remotely operated explorer. NASA says it’ll contract the chosen vehicle for lunar services through 2039, with all the LTV-related task orders carrying a potential combined value of up to $4.6 billion. The selected company will also be able to use its LTV for commercial activities in its downtime.

Image: Lunar Outpost's Lunar Dawn LTV concept in a rendering, driving on the moon (Lunar Outpost)
Image: Venturi Astrolab's concept lunar terrain vehicle, Flex, pictured alongside renderings of a solar-powered rover and lander on the moon (Astrolab)

Intuitive Machines, which will be developing an LTV called the Moon Racer, has already bagged multiple contracts with NASA as part of the Commercial Lunar Payload Services (CLPS) program, and in February its first lander, Odysseus, achieved the first commercial moon landing. Venturi Astrolab will be developing a vehicle it’s dubbed Flex, while Lunar Outpost will be working on an LTV called Lunar Dawn. All three vehicles must be able to support a crew of two astronauts and withstand the extreme conditions of the lunar south pole.

“We will use the LTV to travel to locations we might not otherwise be able to reach on foot, increasing our ability to explore and make new scientific discoveries,” said Jacob Bleacher, chief exploration scientist at NASA.

This article originally appeared on Engadget at https://www.engadget.com/one-of-these-concept-lunar-vehicles-could-join-nasas-artemis-v-astronauts-on-the-moon-202448277.html?src=rss

Apple’s second-generation AirPods Pro are back on sale for $190

Apple’s second-generation AirPods Pro have dipped to under $200 in a deal from Amazon. The AirPods Pro, which normally cost $250, are $60 off right now, bringing the price down to just $190. That’s the same price we saw during Amazon’s Big Spring Sale. The AirPods Pro offer a number of premium features over the standard AirPods, including active noise cancellation for when you want to shut out the world, and an impressive transparency mode for when you want to hear your surroundings.

The second-generation AirPods Pro came out in 2022 and brought Apple’s H2 chip to the earbuds for a notable performance boost. They offer Adaptive Audio, which automatically switches between Active Noise Cancellation and Transparency Mode based on what’s going on around you. And with Conversation Awareness, they can lower media volume when you start speaking and make other people's voices easier to hear.

We gave this version of the AirPods Pro a review score of 88, and they’re one of our picks for the best wireless earbuds on the market. The second-generation AirPods Pro are dust, sweat and water resistant, so they should hold up well for workouts, and they get better battery life than the previous generation: about six hours with features like ANC enabled, and as much as 30 hours with the charging case. Apple says popping the AirPods Pro in the case for five minutes will give you an hour of additional listening or talking time.
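
As a quick back-of-the-envelope check on those quoted figures, here's a minimal sketch; the numbers are Apple's as cited above, and the arithmetic is purely illustrative:

```python
# Sanity math on the battery figures quoted above (Apple's numbers;
# the arithmetic here is just illustrative).
BUDS_HOURS = 6       # listening time per charge with ANC enabled
TOTAL_HOURS = 30     # listening time including recharges from the case

case_recharges = (TOTAL_HOURS - BUDS_HOURS) / BUDS_HOURS
print(f"The case holds roughly {case_recharges:.0f} full recharges")  # -> 4

# Five minutes in the case buys about an hour of playback:
print(f"~{60 / 5:.0f} minutes of listening per minute of charging")   # -> 12
```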

AirPods Pro also offer Personalized Spatial Audio with head tracking for more immersive listening while you’re watching TV or movies. This generation also introduced touch controls: you can adjust the volume by swiping on the stem, though the gesture might take some getting used to.

Follow @EngadgetDeals on Twitter and subscribe to the Engadget Deals newsletter for the latest tech deals and buying advice.

This article originally appeared on Engadget at https://www.engadget.com/apples-second-generation-airpods-pro-are-back-on-sale-for-190-142626914.html?src=rss

Best Buy’s Geek Squad agents say they were hit by mass layoffs this week

Geek Squad agents have been flooding Reddit with images of their badges and posts about “going sleeper” after Best Buy reportedly conducted mass layoffs this week. A former employee who spoke to 404 Media said they were sent an email instructing them to work from home on Wednesday and were then called individually to be told the news about their jobs. Some, per 404 Media’s sources and numerous Reddit posts, were longtime Geek Squad agents who had been with the company for more than 10 or even 20 years. Best Buy has not yet responded to Engadget’s request for comment.

There has been an outpouring of support for the laid-off workers on the unofficial Geek Squad subreddit, where many have lamented the loss of jobs they’d dedicated much of their lives to and noted that things had been heading in a concerning direction in the lead-up. Some commented that their hours had dwindled in recent months, with one former employee telling 404 Media it’s been “a struggle to get by.”

Best Buy conducted mass layoffs affecting employees at its retail stores just last spring, and as The Verge reports, CEO Corie Barry indicated during the company’s February earnings call that more layoffs were coming in 2024 as Best Buy shifts resources toward AI and other areas.

This article originally appeared on Engadget at https://www.engadget.com/best-buys-geek-squad-agents-say-they-were-hit-by-mass-layoffs-this-week-185720480.html?src=rss

OpenAI and Google reportedly used transcriptions of YouTube videos to train their AI models

OpenAI and Google trained their AI models on text transcribed from YouTube videos, potentially violating creators’ copyrights, according to The New York Times. The report, which describes the lengths OpenAI, Google and Meta have gone to in order to maximize the amount of data they can feed to their AIs, cites numerous people with knowledge of the companies’ practices. It comes just days after YouTube CEO Neal Mohan said in an interview with Bloomberg Originals that OpenAI’s alleged use of YouTube videos to train Sora, its new text-to-video generator, would violate the platform’s policies.

According to the NYT, OpenAI used its Whisper speech recognition tool to transcribe more than one million hours of YouTube videos, which were then used to train GPT-4. The Information previously reported that OpenAI had used YouTube videos and podcasts to train its AI systems. OpenAI president Greg Brockman was reportedly among the people who collected those videos. Google spokesperson Matt Bryant told the NYT that “unauthorized scraping or downloading of YouTube content” is against the company’s rules, and that Google was unaware of any such use by OpenAI.

The report, however, claims there were people at Google who knew but did not take action against OpenAI because Google was using YouTube videos to train its own AI models. Google told NYT it only does so with videos from creators who have agreed to this. Engadget has reached out to Google and OpenAI for comment.

The NYT report also claims Google asked a team to tweak its privacy policy in June 2023 to more broadly cover its use of publicly available content, including Google Docs and Google Sheets, to train its AI models and products. The changes, which Google says were made for clarity's sake, were published in July. Bryant told NYT that this type of data is only used with the permission of users who opt into Google’s experimental features tests, and that the company “did not start training on additional types of data based on this language change.” The change added Bard as an example of what that data might be used for. 

Correction, April 6, 2024, 3:45PM ET: This story originally stated that Google updated its privacy policy in June 2022. The policy update was actually made in 2023. We apologize for the error.

This article originally appeared on Engadget at https://www.engadget.com/openai-and-google-reportedly-used-transcriptions-of-youtube-videos-to-train-their-ai-models-163531073.html?src=rss

NASA will be studying the total solar eclipse. Here’s how you can help

On Monday, April 8, a total solar eclipse will be visible across a swath of North America, from Mexico’s Pacific coast to the easternmost reaches of Canada. And in those few minutes of daytime darkness, all sorts of interesting phenomena are known to occur — phenomena NASA would like our help measuring.

During a total solar eclipse, temperatures may drop and winds may slow down or change course. Animals have been observed behaving unusually — you might hear crickets start their evening chatter a few hours early. Even radio communications can be disrupted by changes in the ionosphere while the sun’s light is blocked. And the sun’s corona — its outermost atmosphere — will come into view, presenting scientists (and those of us helping them) with a rare opportunity to study a layer that’s normally invisible to the naked eye.

NASA has lots of research efforts planned for the eclipse, and it has sponsored a handful of citizen science campaigns that anyone can take part in if they’re in or near the path of totality, or the area where people on the ground can watch the sun become completely obscured by the moon. The path of totality crosses parts of 13 US states: Texas, Oklahoma, Arkansas, Missouri, Illinois, Kentucky, Indiana, Ohio, Pennsylvania, New York, Vermont, New Hampshire and Maine. It’s an event of some significance; the next time a total solar eclipse passes over that much of the contiguous US won’t be until 2045.

All you’ll need to join in is equipment you already own, like a smartphone, and a few minutes set aside before the eclipse to go through the training materials.

Image: A map showing the path of totality across the United States (NASA's Scientific Visualization Studio)

Help measure the shape of the sun

One such citizen science project is SunSketcher, a concerted effort to measure the true shape of the sun. While the sun is closer to a perfect sphere than other celestial bodies that have been observed, it’s still technically an oblate spheroid, a smidge wider along its equator than from pole to pole. The SunSketcher team plans to get a more precise measurement by crowdsourcing observations of Baily's Beads, the little spots of sunlight that peek out from behind the moon at certain points in the eclipse.
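
For a rough sense of just how small that flattening is, here's a minimal sketch using commonly cited approximations for the sun's radius and oblateness (these constants come from outside this article):

```python
# Rough scale of the sun's oblateness. The constants below are commonly
# cited approximations, not figures from this article.
R_EQ_KM = 695_700        # nominal solar radius in km (IAU value)
FLATTENING = 9e-6        # approximate solar flattening, (r_eq - r_pol) / r_eq

delta_km = R_EQ_KM * FLATTENING
print(f"Equator-to-pole radius difference: ~{delta_km:.0f} km")  # ~6 km
# For comparison, Earth's flattening is about 1/298, or roughly 21 km.
```

In other words, detecting the difference means pinning the solar radius down to a few parts per million, which is why the project wants so many coordinated observations.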

The Baily’s Bead effect is “the last piece of the sun seen before totality and the first to appear after totality,” NASA explained in a blog post. “For a few seconds, these glimmers of light look like beads along the moon’s edge.” They’re visible thanks to the uneven topographical features on the lunar surface.

You’ll need to download the free SunSketcher app, which is available for iOS and Android on the App Store and Google Play Store. Then, a few minutes before totality (the exact time is location-dependent), put your phone on Do Not Disturb, hit “Start” in the app and prop up the phone in a place where it has a good view of the sun. After that, leave it be until the eclipse is over — the app will automatically take pictures of Baily’s Beads as they show up.

There’s a tutorial on the SunSketcher website if you want to familiarize yourself with the process beforehand. When all is said and done, the pictures will be uploaded to SunSketcher’s server. They’ll eventually be combined with observations from all over to “create an evolving pattern of beads” that may shed more light on the size and shape of the sun.

The SunSketcher images probably won’t blow you away, so if you’re hoping to get some great pictures of the eclipse, you’ll want to have another camera on hand for that (with the appropriate filters to protect your eyes and the device’s sensors).

Image: The Baily's Beads effect, seen as the moon makes its final move over the sun during the total solar eclipse of August 21, 2017, above Madras, Oregon (NASA / Aubrey Gemignani)

Record changes in your surroundings

Eclipse-watchers can also use their smartphones to record the environmental changes that take place when the sun dips behind the moon as part of a challenge run by Global Learning and Observations to Benefit the Environment (Globe). You’ll need an air temperature thermometer as well for this task, and can start logging observations in the days before the eclipse if you feel like being extra thorough.

Temperatures at the surface can, in some cases, drop as much as 10 degrees Fahrenheit during a total solar eclipse, according to NASA. And certain types of clouds have been observed to dissipate during these brief cooldowns, resulting in unexpectedly clear skies in the moments before totality. Data collected with the help of citizen scientists during the 2017 total solar eclipse showed that areas with heavier cloud cover experienced a less extreme drop in surface temperatures.

To participate this time around, download the Globe Observer app from the App Store or Google Play Store, and then open the Globe Eclipse tool from the in-app menu. There, you’ll be able to jot down your temperature measurements, take photos of the sky to record any changes in cloud cover and make notes about the wind conditions. Plan to dedicate a few hours to this one — NASA asks that you include observations from one to two hours before and after the eclipse in addition to what you record during the eclipse itself. “You will measure temperature every 5-10 minutes and clouds every 15-30 minutes or whenever you see change,” NASA says.

You can keep using the Globe Observer app for citizen science beyond eclipse day, too. There are programs running all year round for recording observations of things like clouds, land use, mosquito habitats and tree heights. The eclipse tool, though, is only available when there’s an eclipse happening.

Listen to the sounds of wildlife

Observations going back nearly 100 years have added support to the idea that total solar eclipses temporarily throw some animals out of whack. Inspired by a 1935 study that gathered observations on animal behavior during an eclipse three years prior, the Eclipse Soundscapes Project is inviting members of the public to take note of what they hear before, during and after totality, and share their findings.

To be an Observer for the project, it’s recommended that you first sign up on the website and go through the brief training materials to get a sense of what type of information the project is looking for. The website also has printable field notes pages you can use to record your observations on eclipse day. You should start taking notes at least 10 minutes before totality. Only after the eclipse is over will you need to fill out the webform to submit your observations along with your latitude and longitude.

If you happen to have an AudioMoth acoustic monitoring device and a spare microSD card lying around, you can go a step further and record the actual sounds of the environment during the eclipse as a Data Collector. You’ll need to set everything up early — the project says to do it on Saturday, April 6 before noon — and let it record until at least 5PM local time on April 10. At that point, you can turn it off, submit your notes online and mail in the SD card. All of the details for submission can be found on the project’s website.

Image: A chart showing what time the eclipse will begin and end in 13 cities across the US (NASA)

Take photos of the solar corona

The Eclipse Megamovie 2024 is an initiative designed to study the sun’s corona and plasma plumes from locations in the path of totality, building on a previous campaign from the 2017 total solar eclipse. It has already selected a team of 100 Science Team Alpha Recruits (STARs), who underwent training and were given 3D-printed tracking mounts for their cameras to shoot the best possible images. But the project will still accept photo submissions from any enthusiast who has a DSLR (and a solar filter) and wants to participate.

The Photography Guide is pretty exhaustive, so don’t wait until eclipse day to start figuring out your setup. You’ll be able to submit your photos after the eclipse through a form on the website.

However you choose to spend the eclipse, whether you’re collecting data for a citizen science mission or just planning to kick back and observe, make sure you have everything in place well ahead of time. While the partial eclipse phases will last over an hour, totality will be over and done in about 3.5 to 4.5 minutes, depending on where you’re watching from. You wouldn’t want to miss out on some of that time because you were fumbling with your camera.

Totality will start shortly after 11AM local time (2PM ET) in western Mexico, moving northeastward over the next two hours or so before exiting land near Newfoundland, Canada around 5:30PM local time. There will still be something to see for people outside the path of totality, too: most of the US will be treated to a partial eclipse that day. You can find out exactly when the eclipse will be visible from your location with this tool on NASA’s website, along with the percentage of sun coverage you can expect to witness.
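
If you'd rather estimate the timing yourself, here's a minimal sketch using the third-party skyfield library. This is an illustration, not NASA's tool; the observer coordinates (Dallas, TX) and the sampling window are assumptions:

```python
# Approximate the moment of maximum eclipse for one location by finding
# when the sun-moon angular separation is smallest. A rough sketch using
# the skyfield library; install with `pip install skyfield`.
from skyfield.api import load, wgs84

ts = load.timescale()
eph = load('de421.bsp')  # JPL ephemeris, downloaded on first run
earth, sun, moon = eph['earth'], eph['sun'], eph['moon']

# Assumed observer: Dallas, TX, which sits in the path of totality.
observer = earth + wgs84.latlon(32.7767, -96.7970)

# Sample once per minute from 17:00 to 20:00 UTC on April 8, 2024.
times = ts.utc(2024, 4, 8, 17, range(0, 180))
sun_pos = observer.at(times).observe(sun).apparent()
moon_pos = observer.at(times).observe(moon).apparent()
separation = sun_pos.separation_from(moon_pos).degrees

i = separation.argmin()
print(f"Closest alignment: {times[i].utc_strftime('%H:%M UTC')} "
      f"(separation {separation[i]:.3f} deg)")
```

The exact start and end of totality also depend on the apparent sizes of the sun and moon from your spot, so NASA's tool remains the better bet for precise contact times.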

This article originally appeared on Engadget at https://www.engadget.com/nasa-will-be-studying-the-total-solar-eclipse-heres-how-you-can-help-140011076.html?src=rss
