Researchers use magnetic fields for non-invasive blood glucose monitoring

Synex Medical, a Toronto-based biotech research firm backed by Sam Altman (the CEO of OpenAI), has developed a tool that can measure your blood glucose levels without a finger prick. It uses a combination of low-field magnets and low-frequency radio waves to directly measure blood sugar levels non-invasively when a user inserts a finger into the device.

The tool uses magnetic resonance spectroscopy (MRS), which is similar to an MRI. Jamie Near, an Associate Professor at the University of Toronto who specializes in MRS research, told Engadget that “[an] MRI uses magnetic fields to make images of the distribution of hydrogen protons in water that is abundant in our body tissues. In MRS, the same basic principles are used to detect other chemicals that contain hydrogen.” When a user’s fingertip is placed inside the magnetic field, the resonance frequency of a specific molecule, in this case glucose, is measured in parts per million. While the focus of this project was glucose, MRS could also be used to measure other metabolites, including lactate, ketones and amino acids, according to Synex.
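For a sense of what “measured in parts per million” means here: in MRS, a molecule’s hydrogen protons resonate at a frequency slightly offset from a reference signal (such as water), and that offset, divided by the reference frequency and scaled by a million, is the chemical shift. Here is a minimal Python sketch of the arithmetic; the field strength and offset below are illustrative assumptions, not Synex’s actual operating parameters:

```python
# Illustrative only: how a chemical shift in parts per million (ppm)
# relates to measured resonance frequencies in MRS. The field strength
# and offset are assumed values, not Synex's specs.

GYROMAGNETIC_RATIO_H = 42.577e6  # Hz per tesla, for hydrogen (1H)

def chemical_shift_ppm(f_signal_hz: float, f_reference_hz: float) -> float:
    """Chemical shift of a signal relative to a reference, in ppm."""
    return (f_signal_hz - f_reference_hz) / f_reference_hz * 1e6

b0_tesla = 0.1                             # a low-field benchtop magnet (assumed)
f_water = GYROMAGNETIC_RATIO_H * b0_tesla  # reference: water protons, ~4.26 MHz
f_peak = f_water * (1 - 1.2e-6)            # a peak offset by 1.2 ppm (illustrative)

print(f"{chemical_shift_ppm(f_peak, f_water):.2f} ppm")  # -> -1.20
```

The useful property: the shift in ppm is independent of the magnet’s field strength, which is part of why a chemical “fingerprint” characterized at high field can still be sought on a low-field benchtop device.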

[Image: Synex Medical diagnostic device. Credit: Synex Medical]

Matthew Rosen, a Harvard physicist whose research spans from fundamental physics to bioimaging in the field of MRI, told Engadget that he thinks the device is “clever” and “a great idea.” Magnetic resonance is a common technique for the chemical analysis of compounds; however, traditional systems operate at high magnetic fields and are very expensive.

Synex found a way to get clear readings from low magnetic fields. “They’ve overcome the challenges really by developing a method that has high sensitivity and high specificity,” Rosen says. “Honestly, I have been doing magnetic resonance for thirty years. I never thought people could do glucose with a benchtop machine… you could do it with a big machine no problem.”

Professor Andre Simpson, a researcher and center director at the University of Toronto, also told Engadget that he thinks Synex’s device is the “real deal.” “MRI machines can fit an entire human body and have been used to target molecule concentrations in the brain through localized spectroscopy,” he explained. “Synex has shrunk this technology to measure concentrations in a finger. I have reviewed their white paper and seen the instrument work.” Simpson said Synex’s ability to retrofit MRS technology into a small box is an engineering feat.

As of now, there are no commercially available devices that can measure blood glucose non-invasively. While there are continuous glucose monitors on the market that use microneedles, which are minimally invasive, there is still a risk of infection.

But there is competition in the space for no-prick diagnostic tools. Know Labs is trying to get approval for a portable glucose monitor that relies on a custom-made Bio-RFID sensing technology, which uses radio waves to detect blood glucose levels in the palm of your hand. When the Know Labs device was tested against a Dexcom G6 continuous glucose monitor in a study, readings from its palm sensor were “within threshold” only 46 percent of the time. While the readings are technically in accordance with FDA accuracy limits for a new blood glucose monitor, Know Labs is still working out kinks through scientific research before it can begin FDA clinical trials.
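The article doesn’t specify how the study defined “within threshold,” but metrics of this kind typically count the share of sensor readings that land within some relative band of a paired reference reading. A hedged sketch, assuming a ±15 percent band purely for illustration:

```python
# Sketch of a "percent within threshold" agreement metric. The ±15% band
# is an assumption for illustration; the study's actual threshold
# definition is not given in this article.

def pct_within_threshold(readings, references, rel_band=0.15):
    """Percent of readings within ±rel_band of the paired reference value."""
    hits = sum(
        abs(reading - ref) <= rel_band * ref
        for reading, ref in zip(readings, references)
    )
    return 100.0 * hits / len(readings)

# Hypothetical paired glucose readings in mg/dL: sensor vs. reference monitor.
sensor = [102, 140, 88, 215, 130]
reference = [110, 120, 90, 180, 128]
print(f"{pct_within_threshold(sensor, reference):.0f}% within threshold")  # 60%
```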

Another start-up, the German company DiaMonTech, is developing a pocket-sized diagnostic device, still being tested and fine-tuned, that measures glucose through “photothermal detection.” It uses mid-infrared lasers that essentially scan the tissue fluid at the fingertip to detect glucose molecules. CNBC and Bloomberg reported that even Apple has been “quietly developing” a sensor that can check your blood sugar levels through its wearables, though the company has never confirmed this. Synex founder and CEO Ben Nashman told Engadget that the company would eventually like to develop a wearable, but further miniaturization is needed before it can bring a commercial product to market.

Rosen says he isn't sure how the sensor technology can be retrofitted for smartwatches or wearables just yet. But he can imagine a world where these tools complement blood-based diagnostics. “Is it good enough for clinical use? I have to leave that for what clinicians have to say.”

Update, November 16 2023, 10:59 AM ET: This story has been updated to clarify that a comment from the company was made by the CEO of Synex and not a company representative. 

NASA can’t talk to its Mars robots for two weeks because the sun is in the way

NASA’s Mars exploration robots will be on their own for the next two weeks while the space agency waits out a natural phenomenon that will prevent normal communications. Mars and Earth have reached positions in their orbits that put them on opposite sides of the sun, in an alignment known as solar conjunction. During this time, NASA says it’s risky to try to send commands to its instruments on Mars because charged particles around the sun could corrupt the signals, with potentially detrimental effects.

To prevent any issues, NASA is taking a planned break from giving orders until the planets move into more suitable positions. The pause started on Saturday and will go on until November 25. A Mars solar conjunction occurs every two years, and while the rovers will be able to send basic health updates home throughout most of the period, they’ll go completely silent for the two days when the sun blocks Mars entirely. 
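For the curious, the geometry is easy to check yourself: compute the apparent angular separation between Mars and the sun as seen from Earth, and conjunction is when that angle gets small. A minimal sketch using the astropy library; the two-degree cutoff is an assumption for illustration, not NASA’s official commanding-moratorium criterion:

```python
# Rough check of solar conjunction: how close does Mars appear to the sun
# from Earth? Requires astropy. The 2-degree cutoff is an assumed
# illustration, not NASA's actual moratorium rule.
from astropy.coordinates import get_body
from astropy.time import Time

def mars_sun_separation_deg(when: Time) -> float:
    """Apparent Mars-sun angular separation as seen from Earth, in degrees."""
    mars = get_body("mars", when)
    sun = get_body("sun", when)
    return mars.separation(sun).deg

t = Time("2023-11-18")  # around the middle of the 2023 conjunction window
sep = mars_sun_separation_deg(t)
print(f"Mars-sun separation on {t.iso[:10]}: {sep:.2f} degrees")
if sep < 2.0:
    print("Deep in solar conjunction -- commanding would be risky.")
```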

That means the Perseverance and Curiosity rovers, the Ingenuity helicopter, the Mars Reconnaissance Orbiter, and the Odyssey and MAVEN orbiters will be left to their own devices for a little while. Their onboard instruments will continue to gather data for their respective missions, but won’t send this information back to Earth until the blackout ends.

A neural network can map large icebergs 10,000 times faster than humans

One of the major benefits of certain artificial intelligence models is that they can speed up menial or time-consuming tasks — and not just to whip up terrible "art" based on a brief text input. University of Leeds researchers have unveiled a neural network that they claim can map an outline of a large iceberg in just 0.01 seconds.

Scientists are able to track the locations of large icebergs manually. After all, one that was included in this study was the size of Singapore when it broke off from Antarctica a decade ago. But it's not feasible to manually track changes in icebergs' area and thickness — or how much water and nutrients they're releasing into seas.

"Giant icebergs are important components of the Antarctic environment," Anne Braakmann-Folgmann, lead author of a paper on the neural network, told the European Space Agency. "They impact ocean physics, chemistry, biology and, of course, maritime operations. Therefore, it is crucial to locate icebergs and monitor their extent, to quantify how much meltwater they release into the ocean.”

Until now, manual mapping has proven more accurate than automated approaches, but it can take a human analyst several minutes to outline a single iceberg. That quickly becomes a time- and labor-intensive process when many icebergs are involved.

The researchers trained a neural network based on the U-Net architecture using imagery captured by the ESA's Copernicus Sentinel-1 Earth-monitoring satellites. The algorithm was tested on seven icebergs. The smallest had an area roughly the size of Bern, Switzerland, and the largest approximately that of Hong Kong.
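U-Net is an encoder-decoder design: convolutions progressively shrink the image while extracting features, then a decoder upsamples back to full resolution, with “skip connections” carrying fine spatial detail across. The sketch below is a generic, minimal PyTorch version of that idea for one-channel SAR tiles; it is not the Leeds team’s published architecture or training setup:

```python
# A minimal U-Net-style encoder-decoder in PyTorch for binary segmentation
# (iceberg vs. background) of single-channel SAR imagery. A generic sketch
# of the architecture family, not the Leeds team's published model.
import torch
import torch.nn as nn

def conv_block(in_ch: int, out_ch: int) -> nn.Sequential:
    """Two 3x3 convolutions with ReLU, the basic U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(1, 16)    # Sentinel-1 SAR: one channel in
        self.enc2 = conv_block(16, 32)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(32, 64)
        self.up2 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec2 = conv_block(64, 32)   # 32 upsampled + 32 from skip
        self.up1 = nn.ConvTranspose2d(32, 16, 2, stride=2)
        self.dec1 = conv_block(32, 16)   # 16 upsampled + 16 from skip
        self.head = nn.Conv2d(16, 1, 1)  # per-pixel iceberg logit

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)

# One 256x256 SAR tile in, one per-pixel mask of logits out.
model = TinyUNet()
mask_logits = model(torch.randn(1, 1, 256, 256))
print(mask_logits.shape)  # torch.Size([1, 1, 256, 256])
```

In a real pipeline, the output logits would be thresholded into a binary iceberg mask, and the network trained against manually drawn outlines, presumably the same kind of outlines human analysts produce today.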

With 99 percent accuracy, the new model is said to surpass previous attempts at automation, which often struggled to tell the difference between icebergs, sea ice and other features. It's also 10,000 times faster than humans at mapping icebergs.

"Being able to map iceberg extent automatically with enhanced speed and accuracy will enable us to observe changes in iceberg area for several giant icebergs more easily and paves the way for an operational application," Dr. Braakmann-Folgmann said.

ESA releases stunning first images from Euclid, its ‘dark universe detective’

The European Space Agency (ESA) has released the first images from its Euclid space telescope — a spacecraft peering 10 billion years into the past to create the largest 3D map of the universe yet. From the distinctive Horsehead Nebula (pictured above) to a “hidden” spiral galaxy that looks much like the Milky Way, Euclid is giving us the clearest look yet at both known and previously unseen objects speckling enormous swathes of the sky.

Euclid is investigating the “dark” universe, searching for signs of how dark energy and dark matter have influenced the evolution of the cosmos. It’ll observe one-third of the sky over the next six years, studying billions of galaxies with its 4-foot-wide telescope, visible-wavelength camera and near-infrared camera/spectrometer. Euclid launched in July 2023, and while its official science mission doesn't start until early 2024, it’s already blowing scientists away with its early observations.

[Image: Perseus cluster of galaxies as seen by the Euclid spacecraft. Credit: ESA]

Euclid’s observation of the Perseus Cluster (above), which sits 240 million light-years away, is the most detailed ever, showing not just the 1,000 galaxies in the cluster itself but roughly 100,000 others that lie farther away, according to ESA. The space telescope also caught a look at a Milky-Way-like spiral galaxy dubbed IC 342 (below), or the “Hidden Galaxy,” nicknamed as such because it sits behind the crowded disk of our own galaxy and is normally hard to see clearly.

[Image: Euclid spacecraft's view of the spiral galaxy IC 342. Credit: ESA]

Euclid is able to observe huge portions of the sky, and it's the only telescope in operation able to image certain objects like globular clusters in their entirety in just one shot, according to ESA. Globular clusters like NGC 6397, pictured below, contain hundreds of thousands of gravity-bound stars. Euclid's observation of the cluster is unmatched in its level of detail, ESA says.

The spacecraft is able to see objects that have been too faint for others to observe. Its detailed observation of the well-known Horsehead Nebula, a stellar nursery in the Orion constellation, for example, could reveal young stars and planets that have previously gone undetected.

[Image: Euclid spacecraft's view of the globular cluster NGC 6397. Credit: ESA]
[Image: Euclid spacecraft's view of the irregular galaxy NGC 6822. Credit: ESA]

Euclid also observed the dwarf galaxy NGC 6822 (pictured above), which sits just 1.6 million light-years away. This small, ancient galaxy could hold clues about how galaxies like our own came to be. It's only the beginning for Euclid, but it's already helping to unlock more information about the objects in our surrounding universe, both near and far.

“We have never seen astronomical images like this before, containing so much detail,” said René Laureijs, ESA’s Euclid Project Scientist, of the first batch of images. “They are even more beautiful and sharp than we could have hoped for, showing us many previously unseen features in well-known areas of the nearby universe.”

NASA discovered that an asteroid named Dinky actually has its own moon

NASA’s Lucy spacecraft, launched in 2021 to explore the Trojan asteroids trapped near Jupiter, has made an interesting discovery. The spacecraft found an asteroid, nicknamed Dinky, that actually has a smaller asteroid orbiting it, as originally reported by Scientific American. That’s right: it’s basically an asteroid with its own moon, a nesting doll of cosmic curiosity.

The technical term here is a binary asteroid pair, and Dinky, whose real name is Dinkinesh, was spotted by Lucy during a quick flyby. That’s when the spacecraft noticed the smaller “moon” orbiting it.

“A binary was certainly a possibility,” Jessica Sunshine, a planetary scientist at the University of Maryland, told Scientific American. “But it was not expected, and it’s really cool.”

As a matter of fact, the flyby itself wasn’t supposed to find anything of note. It was simply a trial run for the team to hone its skills before investigating the aforementioned Trojan asteroids orbiting the sun ahead of and behind Jupiter. The team wanted to make sure Lucy’s tracking system could lock onto a space rock and keep it in view, even with both objects moving extremely fast. Guess what? It worked. Hal Levison, a planetary scientist at the Southwest Research Institute and principal investigator of the Lucy mission, said that the test was “amazingly successful.”

As for Dinky and its, uh, even dinkier satellite, NASA scientists still have a long way to go with their investigation, as only about one-third of the relevant data has been beamed down to Earth. NASA has released a series of images showing Dinky and its pseudo-moon, but no actual data as of yet.

Even just from these images, however, you can tell a lot about these two celestial bodies. There’s a visible equatorial ridge on the main body of Dinky, aka Dinkinesh, and a secondary ridgeline branching off from it. The parent asteroid is covered in craters, likely the result of numerous hits by other asteroids. Levison says that there are more images to come of the secondary satellite and that these pictures suggest the junior asteroid has some “interesting” stuff going on. He goes on to say that its shape is “really bizarre.”

Binary asteroid pairs are not rare, as researchers have found that around 15 percent of near-Earth asteroids boast a cute lil orbital companion. NASA and affiliated researchers are still waiting for more data on the pair, including color images and spectroscopy that should shed some more light on the two asteroids. Levison says “there’s a lot of cool stuff to come.”

In the meantime, Lucy will continue on its original mission to investigate those mysterious Trojan asteroids near Jupiter. It’s scheduled to fly by another main-belt asteroid in 2025 before reaching the first of the Trojans in 2027.

A commercial spaceplane capable of orbital flight is ready for NASA testing

NASA will soon start testing what is dubbed the world’s first commercial spaceplane capable of orbital flight, which will eventually be used to resupply the International Space Station. The agency is set to take delivery of Sierra Space’s first Dream Chaser, which should provide an alternative to SpaceX spacecraft for trips to the ISS.

In the coming weeks, the spaceplane (which is currently at Sierra Space’s facility in Colorado) will make its way to a NASA test site in Ohio. The agency will put the vehicle, which has been named Tenacity, through its paces for between one and three months. According to Ars Technica, NASA will conduct vibration, acoustic and temperature tests to ensure Tenacity can survive the rigors of a rocket launch. NASA engineers, along with government and contractor teams, are running tests to make sure it's safe for Tenacity to approach the ISS.

All going well, Tenacity is scheduled to make its first trip to space in April on the second flight of United Launch Alliance's Vulcan rocket. The rocket has yet to make its own first test flight, which is currently expected to happen in December. However, given how things tend to go with spaceflight, delays are always a possibility on both fronts.

The spaceplane has foldable wings, which allow it to fit inside the rocket's payload fairing. On its first mission, Tenacity is scheduled to stay at the ISS for 45 days. Afterward, it will return to Earth at the former space shuttle landing strip at the Kennedy Space Center in Florida rather than dropping into the ocean as many spacecraft tend to do. Sierra says the spacecraft is capable of landing on any compatible commercial runway.

“Plunging into the ocean is awful," Sierra Space CEO Tom Vice told Ars Technica. "Landing on a runway is really nice." The company claims Dream Chaser can bring cargo back to Earth at less than 1.5 Gs, which is important for protecting sensitive payloads. The spaceplane will be capable of taking up to 12,000 pounds of cargo to the ISS and bringing around 4,000 pounds back to terra firma. Sierra plans for its Dream Chaser fleet to eventually be capable of taking humans to low-Earth orbit too.

As things stand, SpaceX is the only company that operates fully certified spacecraft for NASA missions. Boeing also won a contract to develop a capsule for NASA back in 2014, but Starliner has yet to transport any astronauts to the ISS. Sierra Nevada (from which Sierra Space was spun out in 2021) previously competed with those businesses for NASA commercial crew program contracts, but it lost out. However, after the company retooled Dream Chaser to focus on cargo operations for the time being, NASA chose Sierra to join its stable of cargo transportation providers in 2016.

Dream Chaser's first trip to the ISS has been a long time coming. It was originally planned for 2019, but the project was beset by delays. COVID-19 compounded those, constricting supply chains for key parts Sierra Space needed before the company brought more of its construction work in-house. The company is now aiming to have a second, human-rated version of Dream Chaser ready in the 2026 timeframe.

NASA has long been interested in using spaceplanes, dating back to the agency's early days, and it seems closer than ever to being able to use such vehicles. Virgin Galactic (which just carried out its fifth commercial flight on Thursday) uses spaceplanes for tourist and research flights, but its vehicle is only capable of suborbital operations. With Dream Chaser, Sierra has loftier goals.

NASA is launching a free streaming service with live shows and original series

NASA has announced a new streaming service called NASA+ that’s set to hit most major platforms next week. It’ll be completely free, with no subscription requirements, and you won’t be forced to sit through ads. NASA+ will be available starting November 8.

The space agency previously teased the release of its upcoming streaming service over the summer as it more broadly revamped its digital presence. At the time, it said NASA+ would be available on the NASA iOS and Android apps, and streaming players including Roku, Apple TV and Fire TV. You’ll also be able to watch it on the web. 

There aren’t too many details out just yet about the content itself, but NASA says its family-friendly programming “embeds you into our missions” with live coverage and original video series. NASA already has its own broadcast network, NASA TV, and the new streaming service seems to be an expansion of that. But we’ll know more when it officially launches next Wednesday.

HTC is sending VR headsets to the ISS to help cheer up lonely astronauts

Whether it's for a tour of the International Space Station (ISS) or a battle with Darth Vader, most VR enthusiasts are looking to get off this planet and into the great beyond. HTC, however, is sending VR headsets to the ISS to give lonely astronauts something to do besides staring into the star-riddled abyss.

The company partnered with XRHealth and engineering firm Nord Space to send HTC VIVE Focus 3 headsets to the ISS as part of an ongoing effort to improve the mental health of astronauts in the midst of long assignments on the station. The headsets are pre-loaded with software specifically designed to meet the mental health needs of literal space cadets, so they aren’t just for playing Walkabout Mini Golf during the off hours (though that’s not a bad idea).

The headsets feature new camera tracking tech that was specially developed and adapted to work in microgravity, including eye-tracking sensors to better assess the mental health status of astronauts. These sensors are coupled with software intended to “maintain mental health while in orbit.” The headsets have also been optimized to stabilize alignment and, as such, reduce the chances of motion sickness. Can you imagine free-floating vomit in space?

Danish astronaut Andreas Mogensen will be the first ISS crew member to use the VR headset for preventative mental health care during his six-month mission as commander of the space station. HTC notes that astronauts are often isolated for “months and years at a time” while stationed in space. 

This leads to the question of internet connectivity. After all, Mogensen and his fellow astronauts would likely want to connect with family and friends while wearing their brand-new VR headsets. Playing Population: One by yourself is not exactly satisfying.

The internet used to be really slow on the ISS, with speeds resembling a dial-up connection to AOL in 1995. However, recent upgrades have boosted speeds to around 600 megabits per second (Mbps) on the station. For comparison, the average download speed in the US is about 135 Mbps, so we’d actually be the bottleneck in this scenario, not the astronauts. The ISS connection should handle even the most data-hungry VR applications.

These souped-up Vive Focus 3 headsets are heading up to the space station shortly, though there’s no arrival date yet. It’s worth noting that it took some massive feats of engineering to even get these headsets to work in microgravity, as so many aspects of a VR headset depend on normal Earth gravity.

NYU is developing 3D streaming video tech with the help of its dance department

NYU is launching a project to spur the development of immersive 3D video for dance education — and perhaps other areas. Boosted by a $1.2 million four-year grant from the National Science Foundation, the undertaking will try to make Point-Cloud Video (PCV) tech viable for streaming.

A point cloud is a set of data points in a 3D space representing the surface of a subject or environment. NYU says Point-Cloud Video, which strings together point-cloud frames into a moving scene, has been under development for the last decade. However, it’s typically too data-intensive for practical purposes, requiring bandwidth far beyond the capabilities of today’s connected devices.
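A back-of-envelope calculation shows why. The point count, per-point size and frame rate below are illustrative assumptions rather than figures from the NYU project, but they give a sense of scale:

```python
# Why raw point-cloud video is so heavy: illustrative numbers only,
# not figures from the NYU project.

points_per_frame = 1_000_000   # a dense full-body capture (assumed)
bytes_per_point = 12 + 3       # xyz as three float32s + 8-bit RGB
fps = 30                       # typical video frame rate

bits_per_second = points_per_frame * bytes_per_point * 8 * fps
print(f"Raw stream: {bits_per_second / 1e9:.1f} Gbps")  # -> 3.6 Gbps
```

At roughly 3.6 Gbps uncompressed, a single performer's stream would swamp a typical home connection many times over, which is why bandwidth, latency and power efficiency are the project's core targets.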

The researchers plan to address those obstacles by “reducing bandwidth consumption and delivery latency, and increasing power consumption efficiency so that PCVs can be streamed far more easily,” according to an NYU Engineering blog post published Monday. Project leader Yong Liu, an NYU electrical and computer engineering professor, believes modern breakthroughs make that possible. “With recent advances in the key enabling technologies, we are now at the verge of completing the puzzle of teleporting holograms of real-world humans, creatures and objects through the global Internet,” Liu wrote on Monday. 

ChatGPT maker OpenAI launched a model last year that can create 3D point clouds from text prompts. Engadget reached out to the project leader to clarify whether it or other generative AI tools are part of the process, and we’ll update this article if we hear back.

The team will test the technology with the NYU Tisch School of the Arts and the Mark Morris Dance Group’s Dance Center. Dancers from both organizations will perform on a volumetric capture stage. The team will stream their movements live and on-demand, offering educational content for aspiring dancers looking to study from high-level performers — and allowing engineers to test and tweak their PCV technology.

The researchers envision the work opening doors to more advanced VR and mixed reality streaming content. “The success of the proposed research will contribute towards wide deployment of high quality and robust PCV streaming systems that facilitate immersive augmented, virtual and mixed reality experience and create new opportunities in many domains, including education, business, healthcare and entertainment,” Liu said.

“Point-Cloud Video holds tremendous potential to transform a range of industries, and I’m excited that the research team at NYU Tandon prioritized dance education to reap those benefits early,” said Jelena Kovačević, NYU Tandon Dean.
