Webb telescope images show an unprecedented and ‘chaotic’ view of the center of our galaxy

The James Webb telescope is back with some more gorgeous images. This time, the telescope eyed the center of the Milky Way galaxy, shining a light on the densest part of our surrounding environs in “unprecedented detail.” Specifically, the images are sourced from a star-forming region called Sagittarius C, or Sgr C for short.

This area is about 300 light-years from the galaxy’s supermassive black hole, Sagittarius A*, and over 25,000 light-years from a little blue rock called Earth. All told, the region boasts over 500,000 stars and various clusters of protostars, which are stars that are still forming and gaining mass. The end result? A stunning cloud of chaos, especially when compared to our own decidedly sparse region of space.

As a matter of fact, the galactic center is “the most extreme environment” in the Milky Way, according to University of Virginia professor Jonathan Tan, who assisted the observation team. No data on this region had ever been captured at this “level of resolution and sensitivity” until now, thanks to the power of the Webb telescope.

At the center of everything is a massive protostar weighing more than 30 times the mass of our sun. This actually makes the area seem less populated than it is, as the protostar blocks light from the stars behind it, so not even Webb can see everything in the region. What you’re looking at, then, is a conservative estimate of just how crowded the area is. It’s like the Times Square of space, only without a Guy Fieri restaurant (for now).

James Webb telescope image.
NASA, ESA, CSA, STScI, and S. Crowe (University of Virginia).

The data provided by these images will allow researchers to put current theories of star formation to “their most rigorous test.” To that end, Webb’s NIRCam (Near-Infrared Camera) instrument captured large-scale emission imagery from ionized hydrogen, the blue on the lower side of the image. This is likely the result of young and massive stars releasing energetic photons, but the vast size of the region came as a surprise to researchers, warranting further study.

The observation team’s principal investigator, Samuel Crowe, said that the research enabled by these and forthcoming images will allow scientists to understand the nature of massive stars, which is akin to “learning the origin story of much of the universe.”

This is obviously not the first interesting image produced by the James Webb telescope. We’ve seen stars born in the Virgo constellation, water around a comet in the main asteroid belt and a fairly off-putting view of the Pillars of Creation, among others. It’s seen things you people wouldn't believe and, luckily, it won’t all be lost like tears in the rain, thanks to the internet and the fact that Webb’s still out there.

This article originally appeared on Engadget at https://www.engadget.com/webb-telescope-images-show-an-unprecedented-and-chaotic-view-of-the-center-of-our-galaxy-185912370.html?src=rss

SpaceX loses another Starship and Super Heavy rocket in double explosion during test

SpaceX's second test flight of its Starship spacecraft — which it hopes will one day ferry humans to the moon and Mars — ended in an explosion Saturday morning minutes after taking off from the company's spaceport in Boca Chica, Texas. Starship launched just after 8AM ET atop a Super Heavy rocket, the largest rocket in the world. 

Moments after completing stage separation, when the Super Heavy booster detached itself from Starship, the rocket's first stage exploded. Starship, however, continued on for several more minutes, surpassing the flight time of its predecessor. A faint explosion could be seen in the livestream around the 8-minute mark, and hosts confirmed soon after that they'd lost contact with the craft. 

Unlike in its first test, which came to an end about 24 miles above Earth's surface, Starship was able to reach space this time around. At the time of its explosion, the livestream's tracker clocked it at an altitude of about 92 miles.

Today’s flight was also SpaceX’s first attempt at its new separation technique, called “hot staging,” in which Starship’s engines fire up before the craft detaches from the still-firing first stage. Starship completed the maneuver and was already well clear when Super Heavy exploded. SpaceX will now have to figure out tweaks to its booster to help it withstand future hot-staging attempts.

But, as with the last test that ended in an explosion, SpaceX is still billing it all as a success. Kate Tice, one of the livestream's hosts and a quality engineering manager for SpaceX, said it was “an incredibly successful day, even though we did have a RUD — or rapid unscheduled disassembly — of both the Super Heavy booster and the ship. We got so much data and that will all help to improve for our next flight.”

This article originally appeared on Engadget at https://www.engadget.com/spacex-loses-another-starship-after-rocket-explodes-during-test-flight-143503845.html?src=rss

Watch SpaceX’s Starship lift off for its second fully integrated test flight

At 8AM ET today, SpaceX will open a 20-minute launch window for Starship's second-ever fully integrated test flight. If everything goes well during the pre-flight procedures, and if the weather cooperates, then we'll see the company's spacecraft make another attempt to reach space. SpaceX completed Starship's first fully integrated launch in April. While it was considered a success, the company wasn't able to meet all its objectives and had to intentionally blow up the spacecraft after its two stages failed to separate. 

As a result of that incident, the Federal Aviation Administration (FAA) grounded Starship while authorities conducted an investigation. They found that the explosion scattered debris across 385 acres of land, caused pulverized concrete to rain down on areas up to 6.5 miles northwest of the pad site and started a wildfire at Boca Chica State Park. The FAA required SpaceX to take 63 corrective actions before it would give the company clearance to fly its reusable spacecraft again. 

SpaceX said that this flight will debut several changes implemented in response to what happened during Starship's first test flight. They include a new hot-stage separation system, a new electronic Thrust Vector Control (TVC) system for the Super Heavy's Raptor engines, reinforcements to the pad foundation and a water-cooled steel flame deflector.

The company's live broadcast of the launch starts at 7:24AM ET on its website and on X. If Starship's stages can successfully separate this time around, its upper stage will fly across the planet before splashing down off a Hawaiian coast.

This article originally appeared on Engadget at https://www.engadget.com/watch-spacexs-starship-lift-off-for-its-second-fully-integrated-test-flight-121559318.html?src=rss

MIT tests new ingestible sensor that records your breathing through your intestines

MIT researchers have developed an ingestible capsule that can monitor vital signs, including heart rate and breathing patterns, from within a patient’s GI tract. The scientists say the novel device could also be used to detect signs of respiratory depression during an opioid overdose. Giovanni Traverso, an associate professor of mechanical engineering at MIT who has been working on a range of ingestible sensors, told Engadget that the device will be especially useful for sleep studies.

Conventionally, sleep studies require patients to be hooked up to a number of sensors and devices. In labs and in at-home studies, sensors can be attached to a patient’s scalp, temples, chest and lungs with wires. A patient may also wear a nasal cannula, chest belt and pulse oximeter which can connect to a portable monitor. “As you can imagine, trying to sleep with all of this machinery can be challenging,” Traverso told Engadget.

Clear pill tab
MIT

This trial, which used a capsule made by Celero Systems, a start-up led by MIT and Harvard researchers, marks the first time this ingestible sensor technology was tested in humans. Aside from the start-up and MIT, the research was spearheaded by experts at West Virginia University and other hospital affiliates.

The capsule contains two small batteries and a wireless antenna that transmits data. The vitamin-capsule-sized sensor traveled through the gastrointestinal tract while researchers collected its signals, beginning in the stomach. The participants stayed at a sleep lab overnight while the device recorded respiration, heart rate, temperature and gastric motility; it even detected sleep apnea in one of the patients during the trial. The findings suggest the ingestible measured health metrics on par with the medical-grade diagnostic equipment at the sleep center, which could eliminate the need for patients to stay overnight at a lab hooked onto an array of sensors and devices to be diagnosed with specific sleep disorders.

Importantly, MIT says there were no adverse effects reported from ingesting the capsule. The capsule typically passes through a patient within a day or so, though that short internal shelf life may also limit its usefulness as a monitoring device. Traverso told Engadget that he aims for a future Celero capsule to include a mechanism that allows it to sit in a patient’s stomach for a week.

Dr. Ali Rezai, the executive chair of the West Virginia University Rockefeller Neuroscience Institute, said the device has huge potential to create a new pathway for providers to identify, from a patient’s vitals, when that patient is overdosing. In the future, researchers even anticipate that such devices could carry drugs internally: overdose reversal agents, such as nalmefene, could be slowly administered if the sensor records that a person’s breathing has slowed or stopped. More data from the studies will be made available in the coming months.

This article originally appeared on Engadget at https://www.engadget.com/mit-tests-new-ingestible-sensor-that-record-your-breathing-through-your-intestines-224823353.html?src=rss

SpaceX prepares for Starship’s second test flight after securing FAA clearance

SpaceX aims to send Starship to space for its second test flight on November 17, now that the Federal Aviation Administration (FAA) has given it the clearance to do so. The company completed its next-generation spacecraft's first fully integrated launch in April, but it wasn't able to meet all its objectives, including having its upper stage fly across our planet before re-entering the atmosphere and splashing down in the ocean near Hawaii. SpaceX had to intentionally blow up the vehicle after an onboard fire prevented its two stages from separating. 

According to federal agencies, debris from the rocket explosion was found across 385 acres of land on SpaceX's facility and at Boca Chica State Park. The incident caused a wildfire to break out on 3.5 acres of state park land and led to a "plume cloud of pulverized concrete that deposited material up to 6.5 miles northwest of the pad site." The FAA grounded Starship until SpaceX took dozens of corrective actions, including a vehicle redesign to prevent leaks and fires. As Space.com notes, the agency finished its safety review in September, but it still had to work with the US Fish and Wildlife Service (USFWS) to complete an updated environmental review of the spacecraft. 

For now, the FAA has given SpaceX the license to fly Starship for one flight. The company will open the spacecraft's two-hour launch window at 8AM EST on November 17, and if all goes well, Starship will fly across the planet and splash down off a Hawaiian coast as planned. Starship, of course, has to keep acing test flights before it can go into service. The fully reusable spacecraft represents SpaceX's future, since the company plans to use it for missions to geosynchronous orbit, the moon and Mars. 

This article originally appeared on Engadget at https://www.engadget.com/spacex-prepares-for-starships-second-test-flight-after-securing-faa-clearance-035159364.html?src=rss

Researchers use magnetic fields for non-invasive blood glucose monitoring

Synex Medical, a Toronto-based biotech research firm backed by Sam Altman (the CEO of OpenAI), has developed a tool that can measure your blood glucose levels without a finger prick. It uses a combination of low-field magnets and low-frequency radio waves to directly measure blood sugar levels non-invasively when a user inserts a finger into the device.

The tool uses magnetic resonance spectroscopy (MRS), which is similar to an MRI. Jamie Near, an associate professor at the University of Toronto who specializes in MRS technology, told Engadget that, “[an] MRI uses magnetic fields to make images of the distribution of hydrogen protons in water that is abundant in our body tissues. In MRS, the same basic principles are used to detect other chemicals that contain hydrogen.” When a user’s fingertip is placed inside the magnetic field, the resonance frequency of a specific molecule, in this case glucose, is measured in parts per million. While this project focused on glucose, MRS could be used to measure other metabolites, according to Synex, including lactate, ketones and amino acids.
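The “parts per million” figure is the standard way MRS reports a chemical shift: a molecule’s resonance frequency offset from a reference, normalized by the reference frequency so the value doesn’t depend on the scanner’s field strength. As a rough illustration only (the frequencies below are made-up toy numbers, not Synex’s actual operating parameters):

```python
def chemical_shift_ppm(f_sample_hz: float, f_ref_hz: float) -> float:
    """Frequency offset from a reference, expressed in parts per million.

    Dividing by the reference frequency makes the shift independent of
    the magnet's field strength, which is why MRS peaks are quoted in
    ppm rather than Hz.
    """
    return (f_sample_hz - f_ref_hz) / f_ref_hz * 1e6

# Toy example: a peak 500 Hz above a 100 MHz reference sits at 5 ppm.
print(chemical_shift_ppm(100_000_500.0, 100_000_000.0))
```

This is why the same molecule shows up at the same ppm value on a hospital-sized magnet and a benchtop device, even though the raw frequencies differ enormously.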

Synex Medical diagnostic
Synex Medical

Matthew Rosen, a Harvard physicist whose research spans fundamental physics to bioimaging in the field of MRI, told Engadget that he thinks the device is “clever” and “a great idea.” Magnetic resonance is a common technique for the chemical analysis of compounds, but traditional systems operate at high magnetic fields and are very expensive.

Synex found a way to get clear readings from low magnetic fields. “They’ve overcome the challenges really by developing a method that has high sensitivity and high specificity,” Rosen says. “Honestly, I have been doing magnetic resonance for thirty years. I never thought people could do glucose with a benchtop machine… you could do it with a big machine no problem.”

Professor Andre Simpson, a researcher and center director at the University of Toronto, also told Engadget that he thinks Synex’s device is the “real deal.” “MRI machines can fit an entire human body and have been used to target molecule concentrations in the brain through localized spectroscopy,” he explained. “Synex has shrunk this technology to measure concentrations in a finger. I have reviewed their white paper and seen the instrument work.” Simpson called Synex’s ability to fit MRS technology into a small box an engineering feat.

As of now, there are no commercially available devices that can measure blood glucose non-invasively. While there are continuous glucose monitors on the market that use microneedles, which are minimally invasive, there is still a risk of infection.

But there is competition in the space for no-prick diagnostic tools. Know Labs is trying to get approval for a portable glucose monitor that relies on custom-made Bio-RFID sensing technology, which uses radio waves to detect blood glucose levels in the palm of your hand. When the Know Labs device was tested against a Dexcom G6 continuous glucose monitor in a study, its palm sensor’s blood glucose readings were “within threshold” only 46 percent of the time. While the readings are technically in accordance with FDA accuracy limits for a new blood glucose monitor, Know Labs is still working out kinks through scientific research before it can begin FDA clinical trials.

Another start-up, German company DiaMonTech, is developing a pocket-sized diagnostic device, still being tested and fine-tuned, that measures glucose through “photothermal detection.” It uses mid-infrared lasers that essentially scan the tissue fluid at the fingertip to detect glucose molecules. CNBC and Bloomberg reported that even Apple has been “quietly developing” a sensor that can check your blood sugar levels through its wearables, though the company has never confirmed it. Synex founder and CEO Ben Nashman told Engadget that the company would eventually like to develop a wearable, but that further miniaturization is needed before it can bring a commercial product to market.

Rosen says he isn't sure how the sensor technology can be retrofitted for smartwatches or wearables just yet. But he can imagine a world where these tools complement blood-based diagnostics. “Is it good enough for clinical use? I have to leave that for what clinicians have to say.”

Update, November 16 2023, 10:59 AM ET: This story has been updated to clarify that a comment from the company was made by the CEO of Synex and not a company representative. 

This article originally appeared on Engadget at https://www.engadget.com/researchers-use-magnetic-fields-for-non-invasive-blood-glucose-monitoring-215052628.html?src=rss

NASA can’t talk to its Mars robots for two weeks because the sun is in the way

NASA’s Mars exploration robots will be on their own for the next two weeks while the space agency waits out a natural phenomenon that will prevent normal communications. Mars and Earth have reached positions in their orbits that put them on opposite sides of the sun, in an alignment known as solar conjunction. During this time, NASA says it’s risky to try and send commands to its instruments on Mars because interference from the sun could have a detrimental effect.

To prevent any issues, NASA is taking a planned break from giving orders until the planets move into more suitable positions. The pause started on Saturday and will go on until November 25. A Mars solar conjunction occurs every two years, and while the rovers will be able to send basic health updates home throughout most of the period, they’ll go completely silent for the two days when the sun blocks Mars entirely. 

That means the Perseverance and Curiosity rovers, the Ingenuity helicopter, the Mars Reconnaissance Orbiter, and the Odyssey and MAVEN orbiters will be left to their own devices for a little while. Their onboard instruments will continue to gather data for their respective missions, but won’t send this information back to Earth until the blackout ends.

This article originally appeared on Engadget at https://www.engadget.com/nasa-cant-talk-to-its-mars-robots-for-two-weeks-because-the-sun-is-in-the-way-213022922.html?src=rss

A neural network can map large icebergs 10,000 times faster than humans

One of the major benefits of certain artificial intelligence models is that they can speed up menial or time-consuming tasks — and not just to whip up terrible "art" based on a brief text input. University of Leeds researchers have unveiled a neural network that they claim can map the outline of a large iceberg in just 0.01 seconds.

Scientists are able to track the locations of large icebergs manually. After all, one that was included in this study was the size of Singapore when it broke off from Antarctica a decade ago. But it's not feasible to manually track changes in icebergs' area and thickness — or how much water and nutrients they're releasing into seas.

"Giant icebergs are important components of the Antarctic environment," Anne Braakmann-Folgmann, lead author of a paper on the neural network, told the European Space Agency. "They impact ocean physics, chemistry, biology and, of course, maritime operations. Therefore, it is crucial to locate icebergs and monitor their extent, to quantify how much meltwater they release into the ocean.”

Until now, manual mapping has proven to be more accurate than automated approaches, but it can take a human analyst several minutes to outline a single iceberg. That can rapidly become a time- and labor-intensive process when multiple icebergs are concerned.

The researchers trained an algorithm called U-net using imagery captured by the ESA's Copernicus Sentinel-1 Earth-monitoring satellites. The algorithm was tested on seven icebergs; the smallest had roughly the same area as Bern, Switzerland, and the largest approximately the same area as Hong Kong.

With 99 percent accuracy, the new model is said to surpass previous attempts at automation, which often struggled to distinguish icebergs from sea ice and other features. It's also 10,000 times faster than humans at mapping icebergs.
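Once a network like U-net produces a per-pixel iceberg mask, turning that outline into the area estimate researchers care about is just a pixel count scaled by ground resolution. A minimal sketch, using a hypothetical pixel size (actual Sentinel-1 pixel spacing depends on the acquisition mode):

```python
import numpy as np

def iceberg_area_km2(mask, pixel_size_m=100.0):
    """Area implied by a binary segmentation mask.

    mask: 2-D array of 0/1 values, e.g. a thresholded network output.
    pixel_size_m: ground size of one pixel in metres (a hypothetical
    value for illustration, not the study's actual resolution).
    """
    n_pixels = int(np.count_nonzero(mask))
    return n_pixels * pixel_size_m**2 / 1e6  # m^2 -> km^2

# Toy 4x4 mask with six "iceberg" pixels: 6 * (100 m)^2 = 0.06 km^2.
mask = np.array([
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
])
print(iceberg_area_km2(mask))
```

Automating the mask is the hard part the network solves; the area, and changes in it between satellite passes, then fall out of simple arithmetic like this.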

"Being able to map iceberg extent automatically with enhanced speed and accuracy will enable us to observe changes in iceberg area for several giant icebergs more easily and paves the way for an operational application," Dr. Braakmann-Folgmann said.

This article originally appeared on Engadget at https://www.engadget.com/a-neural-network-can-map-large-icebergs-10000-times-faster-than-humans-212855550.html?src=rss

ESA releases stunning first images from Euclid, its ‘dark universe detective’

The European Space Agency (ESA) has released the first images from its Euclid space telescope — a spacecraft peering 10 billion years into the past to create the largest 3D map of the universe yet. From the distinctive Horsehead Nebula (pictured above) to a “hidden” spiral galaxy that looks much like the Milky Way, Euclid is giving us the clearest look yet at both known and previously unseen objects speckling enormous swathes of the sky.

Euclid is investigating the “dark” universe, searching for signs of how dark energy and dark matter have influenced the evolution of the cosmos. It’ll observe one-third of the sky over the next six years, studying billions of galaxies with its 4-foot-wide telescope, visible-wavelength camera and near-infrared camera/spectrometer. Euclid launched in July 2023, and while its official science mission doesn't start until early 2024, it’s already blowing scientists away with its early observations.

Perseus cluster of galaxies as seen by the Euclid spacecraft
ESA

Euclid’s observation of the Perseus Cluster (above), which sits 240 million light-years away, is the most detailed ever, showing not just the 1,000 galaxies in the cluster itself but roughly 100,000 others that lie farther away, according to ESA. The space telescope also caught a look at a Milky Way-like spiral galaxy dubbed IC 342 (below), or the “Hidden Galaxy,” so nicknamed because it sits behind the crowded, dusty plane of our own galaxy and is normally hard to see clearly.

Euclid spacecraft's view of the spiral galaxy IC 342
ESA

Euclid is able to observe huge portions of the sky, and it's the only telescope in operation able to image certain objects like globular clusters in their entirety in just one shot, according to ESA. Globular clusters like NGC 6397, pictured below, contain hundreds of thousands of gravity-bound stars. Euclid's observation of the cluster is unmatched in its level of detail, ESA says.

The spacecraft is able to see objects that have been too faint for others to observe. Its detailed observation of the well-known Horsehead Nebula, a stellar nursery in the Orion constellation, for example, could reveal young stars and planets that have previously gone undetected.

Euclid spacecraft's view of the Globular cluster NGC 6397
ESA
Euclid spacecraft's view of the irregular galaxy NGC 6822
ESA

Euclid also observed the dwarf galaxy NGC 6822 (pictured above), which sits just 1.6 million light-years away. This small, ancient galaxy could hold clues about how galaxies like our own came to be. It's only the beginning for Euclid, but it's already helping to unlock more information about the objects in our surrounding universe, both near and far. 

“We have never seen astronomical images like this before, containing so much detail,” said René Laureijs, ESA’s Euclid Project Scientist, of the first batch of images. “They are even more beautiful and sharp than we could have hoped for, showing us many previously unseen features in well-known areas of the nearby universe.”

This article originally appeared on Engadget at https://www.engadget.com/esa-releases-stunning-first-images-from-euclid-its-dark-universe-detective-203948971.html?src=rss