NASA will be studying the total solar eclipse. Here’s how you can help

On Monday, April 8, a total solar eclipse will be visible across a swath of North America, from Mexico’s Pacific coast to the easternmost reaches of Canada. And in those few minutes of daytime darkness, all sorts of interesting phenomena are known to occur — phenomena NASA would like our help measuring.

During a total solar eclipse, temperatures may drop and winds may slow down or change their course. Animals have been observed to behave unusually — you might hear crickets start their evening chatter a few hours early. Even radio communications can be disrupted due to changes in the ionosphere while the sun’s light is blocked. And, the sun’s corona — its outermost atmosphere — will come into view, presenting scientists (and those of us helping them) with a rare opportunity to study this layer that’s normally invisible to the naked eye.

NASA has lots of research efforts planned for the eclipse, and has sponsored a handful of citizen science campaigns that anyone can take part in if they’re in or near the path of totality, the corridor where people on the ground can watch the sun become completely obscured by the moon. The path of totality crosses parts of 13 US states: Texas, Oklahoma, Arkansas, Missouri, Illinois, Kentucky, Indiana, Ohio, Pennsylvania, New York, Vermont, New Hampshire and Maine. It’s an event of some significance; the next time a total solar eclipse passes over that much of the contiguous US won’t be until 2045.

All you’ll need to join in is equipment you already own, like a smartphone, and a few minutes set aside before the eclipse to go through the training materials.

A map showing the path of totality across the United States
NASA's Scientific Visualization Studio

Help measure the shape of the sun

One such citizen science project is SunSketcher, a concerted effort to measure the true shape of the sun. While the sun is closer to a perfect sphere than most other observed celestial bodies, it’s still technically an oblate spheroid, a smidge wider along its equator. The SunSketcher team plans to get a more precise measurement by crowdsourcing observations of Baily’s Beads, the little spots of sunlight that peek out from behind the moon at certain points in the eclipse.

The Baily’s Bead effect is “the last piece of the sun seen before totality and the first to appear after totality,” NASA explained in a blog post. “For a few seconds, these glimmers of light look like beads along the moon’s edge.” They’re visible thanks to the uneven topographical features on the lunar surface.

You’ll need to download the free SunSketcher app, which is available for iOS and Android on the App Store and Google Play Store. Then, a few minutes before totality (the exact time is location-dependent), put your phone on Do Not Disturb, hit “Start” in the app and prop up the phone in a place where it has a good view of the sun. After that, leave it be until the eclipse is over — the app will automatically take pictures of Baily’s Beads as they show up.

There’s a tutorial on the SunSketcher website if you want to familiarize yourself with the process beforehand. When it’s all said and done, the pictures will be uploaded to SunSketcher’s server. They’ll eventually be combined with observations from all over to “create an evolving pattern of beads” that may be able to shed better light on the size and shape of the sun.

The SunSketcher images probably won’t blow you away, so if you’re hoping to get some great pictures of the eclipse, you’ll want to have another camera on hand for that (with the appropriate filters to protect your eyes and the device’s sensors).

The Baily’s Beads effect is seen as the moon makes its final move over the sun during the total solar eclipse on Monday, August 21, 2017 above Madras, Oregon. A total solar eclipse swept across a narrow portion of the contiguous United States from Lincoln Beach, Oregon to Charleston, South Carolina. A partial solar eclipse was visible across the entire North American continent along with parts of South America, Africa, and Europe.
NASA / Aubrey Gemignani

Record changes in your surroundings

Eclipse-watchers can also use their smartphones to record the environmental changes that take place when the sun dips behind the moon, as part of a challenge run by the Global Learning and Observations to Benefit the Environment (GLOBE) program. You’ll also need an air temperature thermometer for this task, and you can start logging observations in the days before the eclipse if you feel like being extra thorough.

Temperatures at the surface can, in some cases, drop as much as 10 degrees Fahrenheit during a total solar eclipse, according to NASA. And certain types of clouds have been observed to dissipate during these brief cooldowns, resulting in unexpectedly clear skies in the moments before totality. Data collected with the help of citizen scientists during the 2017 total solar eclipse showed that areas with heavier cloud cover experienced a less extreme drop in surface temperatures.

To participate this time around, download the GLOBE Observer app from the App Store or Google Play Store, and then open the GLOBE Eclipse tool from the in-app menu. There, you’ll be able to jot down your temperature measurements, take photos of the sky to record any changes in cloud cover, and make notes about the wind conditions. Plan to dedicate a few hours to this one — NASA asks that you include observations from 1-2 hours before and after the eclipse in addition to what you’ll record during totality. “You will measure temperature every 5-10 minutes and clouds every 15-30 minutes or whenever you see change,” NASA says.
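If it helps to plan that cadence out ahead of time, here’s a small illustrative sketch (not an official GLOBE tool; the 10- and 30-minute intervals and the two-hour window are just points within NASA’s suggested ranges) that prints a reminder schedule around your local totality time:

```python
# Illustrative only: a reminder schedule for GLOBE eclipse observations,
# assuming temperature readings every 10 minutes and cloud photos every
# 30 minutes, from 2 hours before totality to 2 hours after.
from datetime import datetime, timedelta

def schedule(totality: datetime, hours_before: int = 2, hours_after: int = 2):
    """Return (time string, observation kinds) pairs around totality."""
    start = totality - timedelta(hours=hours_before)
    end = totality + timedelta(hours=hours_after)
    events = []
    t = start
    while t <= end:
        kinds = ["temperature"]
        # Cloud photos land on every third temperature reading (every 30 min)
        if (t - start) % timedelta(minutes=30) == timedelta(0):
            kinds.append("clouds")
        events.append((t.strftime("%H:%M"), ", ".join(kinds)))
        t += timedelta(minutes=10)
    return events

# Example with a hypothetical 2:27PM local totality time
for time_str, kinds in schedule(datetime(2024, 4, 8, 14, 27)):
    print(time_str, kinds)
```

Swap in the totality time for your own location, which you can look up with NASA’s eclipse tool.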

You can keep using the GLOBE Observer app for citizen science beyond eclipse day, too. There are programs running all year round for recording observations of things like clouds, land use, mosquito habitats and tree heights. The eclipse tool, though, is only available when there’s an eclipse happening.

Listen to the sounds of wildlife

Observations going back nearly 100 years have added support to the idea that total solar eclipses temporarily throw some animals out of whack. Inspired by a 1935 study that gathered observations on animal behavior during an eclipse three years prior, the Eclipse Soundscapes Project is inviting members of the public to take note of what they hear before, during and after totality, and share their findings.

To be an Observer for the project, it’s recommended that you first sign up on the website and go through the brief training materials to get a sense of what type of information the project is looking for. The website also has printable field notes pages you can use to record your observations on eclipse day. Start taking notes at least 10 minutes before totality. Once the eclipse is over, fill out the webform to submit your observations along with your latitude and longitude.

If you happen to have an AudioMoth acoustic monitoring device and a spare microSD card lying around, you can go a step further and record the actual sounds of the environment during the eclipse as a Data Collector. You’ll need to set everything up early — the project says to do it on Saturday, April 6 before noon — and let it record until at least 5PM local time on April 10. At that point, you can turn it off, submit your notes online and mail in the SD card. All of the details for submission can be found on the project’s website.

A chart showing what time the eclipse will begin and end in 13 cities across the US
NASA

Take photos of the solar corona

The Eclipse Megamovie 2024 is an initiative designed to study the sun’s corona and plasma plumes from locations in the path of totality, building off of a previous campaign from the 2017 total solar eclipse. It has already selected a team of 100 Science Team Alpha Recruits (STARs) who underwent training and were given 3D-printed tracking mounts for their cameras to shoot the best possible images. But the project will still accept photo submissions from any enthusiasts who have a DSLR (and a solar filter) and want to participate.

The Photography Guide is pretty exhaustive, so don’t wait until eclipse day to start figuring out your setup. You’ll be able to submit your photos after the eclipse through a form on the website.

However you choose to spend the eclipse, whether you’re collecting data for a citizen science mission or just planning to kick back and observe, make sure you have everything in place well ahead of time. While the partial eclipse phases will last over an hour, totality will be over and done in about 3.5-4.5 minutes depending on where you’re watching from. You wouldn’t want to miss out on some of that time because you were fumbling with your camera.

Totality will start shortly after 11AM local time (2PM ET) for western Mexico, moving northeastward over the subsequent two-or-so hours before exiting land near Newfoundland, Canada around 5:30PM local time. There will still be something to see for people outside the path of totality, too. Most of the US will be treated to a partial eclipse that day. You can find out exactly when the eclipse will be visible from your location with this tool on NASA’s website, along with the percentage of sun coverage you can expect to witness.

This article originally appeared on Engadget at https://www.engadget.com/nasa-will-be-studying-the-total-solar-eclipse-heres-how-you-can-help-140011076.html?src=rss


Glow-in-the-dark petunias could usher in a new trend in indoor gardening

Indoor gardening and houseplants gained momentum around 2-3 years ago as people sought ways to cope with boredom while cooped up at home. Since then, it has become fashionable to raise greens inside homes, whether for food, aesthetics, or both. But as captivating as green living things may look during the day, their aesthetic value drops off when you can no longer see them at night or in the dark. Of course, you could buy one of those hi-tech planters with built-in lights, but that costs money, not just for the product but also for the electricity it consumes. It would be enchanting if the plants could glow on their own, and that’s exactly the marvel these glowing petunias bring to the table, literally.

Designer: Light Bio

There are some things that naturally glow in the dark, and, no, we’re not just talking fireflies and iridescent rocks. Bioluminescent organisms occur in nature more often than you might expect, except they aren’t exactly the type of thing you’d proudly display in a pot on your shelf or coffee table. But what if you could give that same magical ability to indoor plants and flowers? You’d probably be the talk of your friends and the town for as long as the plant is alive.

The Firefly Petunia is exactly that: a new, regulation-approved breed of the popular garden flower that, if you haven’t caught on yet, glows in the dark. This isn’t the first attempt to breed a bioluminescent houseplant, but it seems to be on track to be the most successful to date. Unlike previous experiments, the research behind it first spliced genes from a glowing mushroom into a tobacco plant, to great success. Of course, you wouldn’t want to grow tobacco inside your home, so it’s a good thing that petunias are a close and, more importantly, compatible cousin.

What makes the Firefly Petunia even more special is that it requires no extra care or steps to make it glow since it’s all part of the plant’s growing process. Simply make sure that it gets enough sunlight during the day, which is something you should be doing anyway, and then watch it light up in the dark of night. The bioluminescence can even be an indicator of the plant’s health, because parts that are growing faster, like flower buds, also glow the brightest. When the plant starts to dim, it’s time to check its condition or prune dead parts.

This glow-in-the-dark flower is just the first step in the company’s grand plan, which includes making the petunias glow in colors other than plain white. Research is also underway to extend the capability to other species, so it might only be a matter of time before we see all kinds of plants and flowers glowing in the dark, turning your home into a magical garden every night.

The post Glow-in-the-dark petunias could usher in a new trend in indoor gardening first appeared on Yanko Design.

The Morning After: NASA has to make a time zone for the Moon

The White House has published a policy memo asking NASA to create a new time standard for the Moon by 2026. Coordinated Lunar Time (LTC) will establish an official time reference to help guide future lunar missions. The US, China, Japan, India and Russia have space missions to the Moon planned or completed.

NASA (and the White House) aren’t the only ones trying. The European Space Agency is also trying to make a time zone outside of Earth’s… zone.

Given the Moon’s weaker gravity, time moves slightly faster there. “The same clock we have on Earth would move at a different rate on the Moon,” NASA space communications and navigation chief Kevin Coggins told Reuters.

You saw Interstellar, right? Er, just like that. Exactly like that. No further questions.

— Mat Smith

The biggest stories you might have missed

Meta’s AI image generator struggles to create images of couples of different races

Our favorite cheap smartphone is on sale for $250 right now

OnePlus rolls out its own version of Google’s Magic Eraser

How to watch (and record) the solar eclipse on April 8

​​You can get these reports delivered daily direct to your inbox. Subscribe right here!

Microsoft may have finally made quantum computing useful

The most error-free quantum solution yet, apparently.

What if we could build a machine working at the quantum level that could tackle complex calculations exponentially faster than a computer limited by classical physics? Despite all the heady dreams of quantum computing and press releases from IBM and Google, it’s still a what-if. Microsoft now says it’s developed the most error-free quantum computing system yet, with Quantinuum. It’s not a thing I can condense into a single paragraph. You… saw Interstellar, right?

Continue reading.

Stability AI’s audio generator can now create three-minute ‘songs’

Still not that good, though.

Stability AI just unveiled Stable Audio 2.0, an upgraded version of its music-generation platform. With this system, you can use your own text to create up to three minutes of audio, which is roughly the length of a song. You can hone the results by choosing a genre or even uploading audio to inspire the algo. It’s fun — try it out. Just don’t add vocals, trust me.

Continue reading.

Bloomberg says Apple is developing personal robots now

EVs schmee vees.

Apple, hunting for its next iPhone / Apple Watch / Vision Pro (maybe?), might be trying to get into robots. According to Bloomberg’s Mark Gurman, one area the company is exploring is personal robotics — and it started looking at electric vehicles too. The report says Apple has started working on a mobile robot to follow users around their home and has already developed a table-top device that uses a robot to move a screen around.

Continue reading.

Another Matrix movie is happening.

Not like this.

Warner Bros.

Whoa.

Continue reading.

This article originally appeared on Engadget at https://www.engadget.com/the-morning-after-nasa-has-to-make-a-time-zone-for-the-moon-111554408.html?src=rss


The White House tells NASA to create a new time zone for the Moon

On Tuesday, The White House published a policy memo directing NASA to create a new time standard for the Moon by 2026. Coordinated Lunar Time (LTC) will establish an official time reference to help guide future lunar missions. It arrives as a 21st-century space race emerges between (at least) the US, China, Japan, India and Russia.

The memo directs NASA to work with the Departments of Commerce, Defense, State, and Transportation to plan a strategy to put LTC into practice by December 31, 2026. International cooperation will also play a role, especially with signees of the Artemis Accords. Established in 2020, the accords are a set of common principles governing space exploration and operations, signed by a growing list of (currently) 37 countries. China and Russia are not part of that group.

“As NASA, private companies, and space agencies around the world launch missions to the Moon, Mars, and beyond, it’s important that we establish celestial time standards for safety and accuracy,” OSTP Deputy Director for National Security Steve Welby wrote in a White House press release. “A consistent definition of time among operators in space is critical to successful space situational awareness capabilities, navigation, and communications, all of which are foundational to enable interoperability across the U.S. government and with international partners.”

Einstein’s theories of relativity dictate that time changes relative to speed and gravity. Given the Moon’s weaker gravity (and movement differences between it and Earth), time moves slightly faster there. So an Earth-based clock on the lunar surface would appear to gain an average of 58.7 microseconds per Earth day. As the US and other countries plan Moon missions to research, explore and (eventually) build bases for permanent residence, using a single standard will help them synchronize technology and missions requiring precise timing.
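As a quick sanity check on that figure (purely illustrative arithmetic, not anything from NASA’s memo), you can see how those microseconds pile up:

```python
# Back-of-the-envelope: cumulative drift of an Earth-referenced clock
# on the lunar surface, using the 58.7 microseconds/day figure above.
DRIFT_US_PER_DAY = 58.7  # microseconds gained per Earth day

def drift_seconds(days: float) -> float:
    """Total drift in seconds after the given number of Earth days."""
    return DRIFT_US_PER_DAY * 1e-6 * days

print(f"After one year: {drift_seconds(365.25) * 1e3:.1f} ms")  # ~21.4 ms
print(f"Earth days to drift a full second: {1 / (DRIFT_US_PER_DAY * 1e-6):,.0f}")
```

That works out to roughly 21 milliseconds a year. Tiny as that sounds, navigation and communications systems depend on nanosecond-level timing, which is why a shared standard matters.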

“The same clock that we have on Earth would move at a different rate on the moon,” NASA space communications and navigation chief Kevin Coggins told Reuters. “Think of the atomic clocks at the U.S. Naval Observatory (in Washington). They’re the heartbeat of the nation, synchronizing everything. You’re going to want a heartbeat on the moon.”

Photo of the Moon, captured by NASA, in exquisite detail.
NASA

The White House wants LTC to coordinate with Coordinated Universal Time (UTC), the standard by which all of Earth’s time zones are measured. Its memo says it wants the new time zone to enable accurate navigation and scientific endeavors. It also wants LTC to maintain resilience if it loses contact with Earth while providing scalability for space environments “beyond the Earth-Moon system.”

NASA’s Artemis program aims to send crewed missions back to the Moon for the first time since the Apollo missions of the 1960s and 70s. The space agency said in January that Artemis 2, which will fly around the Moon with four people onboard, is now set for a September 2025 launch. Artemis 3, which plans to put humans back on the Moon’s surface, is now scheduled for 2026.

In addition to the US, China aims to put astronauts on the Moon before 2030 as the world’s two foremost global superpowers take their race to space. Although no other countries have announced crewed missions to the lunar surface, India (which put a module and rover on the Moon’s South Pole last year), Russia (its mission around the same time didn’t go so well), the United Arab Emirates, Japan, South Korea and private companies have all demonstrated lunar ambitions in recent years.

In addition to enabling further scientific exploration, technological establishment and resource mining, the Moon could serve as a critical stop on the way to Mars. It could test technologies and provide fuel and supply needs for eventual human missions to the Red Planet.

This article originally appeared on Engadget at https://www.engadget.com/the-white-house-tells-nasa-to-create-a-new-time-zone-for-the-moon-193957377.html?src=rss

The White House tells NASA to create a new time zone for the Moon

On Tuesday, The White House published a policy memo directing NASA to create a new time standard for the Moon by 2026. Coordinated Lunar Time (LTC) will establish an official time reference to help guide future lunar missions. It arrives as a 21st-century space race emerges between (at least) the US, China, Japan, India and Russia.

The memo directs NASA to work with the Departments of Commerce, Defense, State, and Transportation to develop a strategy for putting LTC into practice by December 31, 2026. International cooperation will also play a role, especially with signatories of the Artemis Accords. Established in 2020, the Accords are a set of common principles governing space exploration and operations, currently signed by 37 countries. China and Russia are not part of that group.

“As NASA, private companies, and space agencies around the world launch missions to the Moon, Mars, and beyond, it’s important that we establish celestial time standards for safety and accuracy,” OSTP Deputy Director for National Security Steve Welby wrote in a White House press release. “A consistent definition of time among operators in space is critical to successful space situational awareness capabilities, navigation, and communications, all of which are foundational to enable interoperability across the U.S. government and with international partners.”

Einstein’s theories of relativity dictate that the passage of time varies with speed and gravity. Because of the Moon’s weaker gravity (and its motion relative to Earth), time moves slightly faster there: a clock on the lunar surface would appear to gain an average of 58.7 microseconds per Earth day compared with one on Earth. As the US and other countries plan Moon missions to research, explore and (eventually) build bases for permanent residence, a single standard will help them synchronize technology and missions that require precise timing.
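
To get a feel for the scale of that drift, here is a quick back-of-envelope calculation. The 58.7 microseconds/day figure is the only input taken from the article; everything else is simple arithmetic:

```python
# Back-of-envelope: cumulative drift of a lunar clock vs. an Earth clock,
# using the ~58.7 microseconds/day figure cited in the memo coverage.

DRIFT_US_PER_DAY = 58.7  # microseconds gained per Earth day on the Moon

def cumulative_drift_seconds(days: float) -> float:
    """Total drift in seconds after the given number of Earth days."""
    return DRIFT_US_PER_DAY * 1e-6 * days

# After one year the clocks disagree by roughly 21 milliseconds --
# imperceptible to humans, but large for navigation, where a radio
# signal travels about 300 km per millisecond.
for label, days in [("1 day", 1), ("1 year", 365.25), ("10 years", 3652.5)]:
    print(f"{label:>8}: {cumulative_drift_seconds(days) * 1e3:.3f} ms")
```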

“The same clock that we have on Earth would move at a different rate on the moon,” NASA space communications and navigation chief Kevin Coggins told Reuters. “Think of the atomic clocks at the U.S. Naval Observatory (in Washington). They’re the heartbeat of the nation, synchronizing everything. You’re going to want a heartbeat on the moon.”

Photo of the Moon, captured by NASA, in exquisite detail. (Image: NASA)

The White House wants LTC to be coordinated with Coordinated Universal Time (UTC), the standard by which all of Earth’s time zones are measured. Its memo says the new standard should enable accurate navigation and scientific endeavors, remain resilient if contact with Earth is lost, and scale to space environments “beyond the Earth-Moon system.”

NASA’s Artemis program aims to send crewed missions back to the Moon for the first time since the Apollo missions of the 1960s and 70s. The space agency said in January that Artemis 2, which will fly around the Moon with four people onboard, is now set for a September 2025 launch. Artemis 3, which plans to put humans back on the Moon’s surface, is now scheduled for 2026.

In addition to the US, China aims to put astronauts on the Moon before 2030 as the world’s two foremost superpowers extend their rivalry into space. Although no other countries have announced crewed missions to the lunar surface, India (which landed a module and rover near the Moon’s south pole last year), Russia (whose lander crashed around the same time), the United Arab Emirates, Japan, South Korea and private companies have all demonstrated lunar ambitions in recent years.

In addition to enabling further scientific exploration, technology development and resource extraction, the Moon could serve as a critical stop on the way to Mars, both as a proving ground for new technologies and as a source of fuel and supplies for eventual human missions to the Red Planet.

This article originally appeared on Engadget at https://www.engadget.com/the-white-house-tells-nasa-to-create-a-new-time-zone-for-the-moon-193957377.html?src=rss

Microsoft may have finally made quantum computing useful

The dream of quantum computing has always been exciting: What if we could build a machine working at the quantum level that could tackle complex calculations exponentially faster than a computer limited by classical physics? But despite iterative hardware announcements from IBM, Google and others, quantum computers still aren't used for any practical purpose. That might change with today's announcement from Microsoft and Quantinuum, which say they've developed the most error-free quantum computing system yet.

While classical computers and electronics rely on binary bits as their basic unit of information (they can be either on or off), quantum computers work with qubits, which can exist in a superposition of two states at the same time. The trouble with qubits is that they're prone to error, which is the main reason today's quantum computers (known as Noisy Intermediate Scale Quantum [NISQ] computers) are just used for research and experimentation.
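
As a toy illustration of that difference (a purely mathematical sketch, not tied to any real quantum hardware or library), a single qubit can be modeled as a two-element complex vector whose squared amplitudes give the probabilities of each measurement outcome:

```python
import math

# Toy model of one qubit: a normalized pair of complex amplitudes.
# |0> = (1, 0), |1> = (0, 1); a superposition mixes both.
def measure_probabilities(amp0: complex, amp1: complex) -> tuple[float, float]:
    """Probabilities of reading 0 or 1 when the qubit is measured."""
    norm = abs(amp0) ** 2 + abs(amp1) ** 2
    return abs(amp0) ** 2 / norm, abs(amp1) ** 2 / norm

# An equal superposition: both outcomes are equally likely, unlike a
# classical bit, which is definitely one value or the other.
p0, p1 = measure_probabilities(1 / math.sqrt(2), 1 / math.sqrt(2))
print(p0, p1)
```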

Microsoft's solution was to group physical qubits into virtual qubits, which allows it to apply error diagnostics and correction without destroying them, and run it all over Quantinuum's hardware. The result was an error rate that was 800 times better than relying on physical qubits alone. Microsoft claims it was able to run more than 14,000 experiments without any errors.
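
Microsoft hasn't published its scheme in detail here, but the core idea of combining unreliable physical bits into a more reliable logical one can be sketched with the classical three-bit repetition code. This is a deliberately simplified analogy with an assumed, illustrative error rate; real quantum error correction is far more involved:

```python
import random

random.seed(42)
P_ERR = 0.01  # assumed per-physical-bit error probability (illustrative)

def noisy_copy(bit: int) -> int:
    """Flip the bit with probability P_ERR, simulating a physical error."""
    return bit ^ (random.random() < P_ERR)

def logical_read(bit: int) -> int:
    """Encode one logical bit as three physical copies; decode by majority vote."""
    copies = [noisy_copy(bit) for _ in range(3)]
    return 1 if sum(copies) >= 2 else 0

# The logical bit fails only if two or more copies flip: roughly
# 3 * P_ERR**2 = 0.0003 versus 0.01 for a bare bit, a ~30x improvement.
trials = 100_000
failures = sum(logical_read(0) != 0 for _ in range(trials))
print(f"logical error rate ~ {failures / trials:.5f}")
```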

According to Jason Zander, EVP of Microsoft's Strategic Missions and Technologies division, this achievement could finally bring us to "Level 2 Resilient" quantum computing, which would be reliable enough for practical applications.

"The task at hand for the entire quantum ecosystem is to increase the fidelity of qubits and enable fault-tolerant quantum computing so that we can use a quantum machine to unlock solutions to previously intractable problems," Zander wrote in a blog post today. "In short, we need to transition to reliable logical qubits — created by combining multiple physical qubits together into logical ones to protect against noise and sustain a long (i.e., resilient) computation."

Microsoft's announcement is a "strong result," according to Aram Harrow, a professor of physics at MIT focusing on quantum information and computing. "The Quantinuum system has impressive error rates and control, so it was plausible that they could do an experiment like this, but it's encouraging to see that it worked," he said in an e-mail to Engadget. "Hopefully they'll be able to keep maintaining or even improving the error rate as they scale up."

Microsoft Quantum Computing. (Image: Microsoft)

Researchers will be able to get a taste of Microsoft's reliable quantum computing via Azure Quantum Elements in the next few months, when it will be available as a private preview. The goal is to push even further to Level 3 quantum supercomputing, which would theoretically be able to tackle incredibly complex problems like climate change and exotic drug research. It's unclear how long it'll take to actually reach that point, but for now, at least, we're one step closer to practical quantum computing.

"Getting to a large-scale fault-tolerant quantum computer is still going to be a long road," Professor Harrow wrote. "This is an important step for this hardware platform. Along with the progress on neutral atoms, it means that the cold atom platforms are doing very well relative to their superconducting qubit competitors."

This article originally appeared on Engadget at https://www.engadget.com/microsoft-may-have-finally-made-quantum-computing-useful-164501302.html?src=rss

This camera captures 156.3 trillion frames per second

Scientists have created a blazing-fast scientific camera that shoots images at an encoding rate of 156.3 terahertz (THz) to individual pixels — equivalent to 156.3 trillion frames per second. Dubbed SCARF (swept-coded aperture real-time femtophotography), the research-grade camera could lead to breakthroughs in fields studying micro-events that come and go too quickly for today’s most expensive scientific sensors.

SCARF has successfully captured ultrafast events like absorption in a semiconductor and the demagnetization of a metal alloy. The research could open new frontiers in areas as diverse as shock wave mechanics or developing more effective medicine.

Leading the research team was Professor Jinyang Liang of Canada’s Institut national de la recherche scientifique (INRS). He’s a globally recognized pioneer in ultrafast photography who built on his breakthroughs from a separate study six years ago. The current research was published in Nature, summarized in a press release from INRS and first reported by Science Daily.

Professor Liang and his team positioned their research as a fresh take on ultrafast cameras. These systems typically use a sequential approach: capture frames one at a time, then piece them together to observe the objects in motion. But that approach has limitations. “For example, phenomena such as femtosecond laser ablation, shock-wave interaction with living cells, and optical chaos cannot be studied this way,” Liang said.

Components of a research-grade camera spread in a row on a scientific table. (Image: SCARF / Institut national de la recherche scientifique)

The new camera builds on Liang’s previous research to upend traditional ultrafast camera logic. “SCARF overcomes these challenges,” INRS communication officer Julie Robert wrote in a statement. “Its imaging modality enables ultrafast sweeping of a static coded aperture while not shearing the ultrafast phenomenon. This provides full-sequence encoding rates of up to 156.3 THz to individual pixels on a camera with a charge-coupled device (CCD). These results can be obtained in a single shot at tunable frame rates and spatial scales in both reflection and transmission modes.”

In extremely simplified terms, that means the camera uses a computational imaging approach to capture spatial information by letting light enter its sensor at slightly different times. Not having to process the spatial data in real time is part of what frees the camera to capture those extremely quick “chirped” laser pulses at up to 156.3 trillion times per second. The raw image data can then be processed by a computer algorithm that decodes the time-staggered inputs, transforming each of the trillions of frames into a complete picture.
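
A loose software analogy (invented here for illustration; it does not reflect SCARF's actual optics) is a sensor whose pixels each sample a fast-changing signal at a slightly different, known time offset, so that a single readout can later be reassembled into a time sequence:

```python
import math

# Toy model: one "exposure" in which each pixel samples a fast-varying
# signal at its own known time offset; decoding sorts the samples back
# into a timeline.
N_PIXELS = 8
DT = 0.125  # assumed per-pixel time stagger (arbitrary units)

def fast_signal(t: float) -> float:
    """Some rapidly changing event we want to reconstruct."""
    return math.sin(2 * math.pi * t)

# Encode: a single readout captures N samples, each at a staggered time.
readout = [fast_signal(i * DT) for i in range(N_PIXELS)]

# Decode: because the per-pixel offsets are known, each sample can be
# paired with its timestamp to recover the signal's time evolution.
frames = [(i * DT, value) for i, value in enumerate(readout)]
for t, v in frames:
    print(f"t={t:.3f}  signal={v:+.3f}")
```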

Remarkably, it did so “using off-the-shelf and passive optical components,” as the paper describes. The team characterizes SCARF as low-cost, with low power consumption and high measurement quality compared to existing techniques.

Although SCARF is focused more on research than consumers, the team is already working with two companies, Axis Photonique and Few-Cycle, to develop commercial versions, presumably for peers at other universities and scientific institutions.

For a more technical explanation of the camera and its potential applications, you can view the full paper in Nature.

This article originally appeared on Engadget at https://www.engadget.com/this-camera-captures-1563-trillion-frames-per-second-184651322.html?src=rss