Stanford’s ‘accelerator on a chip’ could revolutionize medical care

When the SLAC National Accelerator Laboratory at Stanford first opened its doors in 1966, it had already earned the distinction of housing the world's longest linear accelerator: a 3.2-kilometer monstrosity buried 25 feet under the gently rolling hil...

Why buy the cow when you can biofabricate the milk for free?

Thought to have been Lord Krishna's favorite animal, the cow has achieved a uniquely sacred status in India. Its slaughter is prohibited throughout most of the country, beef consumption is largely outlawed as well, and woe betide the unlucky soul accused o...

Plastic-plucking robots are the future of recycling

We are living in the Age of Plastic. In 2015, the world's industries created 448 million tons of it -- twice as much as in 1998 -- and the rate of production is only accelerating. However, our recycling efforts have not matched pace. In fact,...

ICYMI: Submersible sticky situations and elongating elastomer electrodes

Today on In Case You Missed It: Researchers from Purdue University and the Office of Naval Research teamed up to develop a new kind of glue that even works underwater. The synthetic compound is derived from proteins used by mussels to keep themse...

The Eye-Sync system can diagnose concussions in one minute

Concussions are no joke -- just ask Cam Newton -- but a new diagnostic system developed in conjunction with Stanford University could revolutionize the way these head injuries are diagnosed. The Eye-Sync, from SyncThink, uses a modified...

Stanford researchers ‘cool’ sunlight to improve solar cell efficiency

A team of researchers from Stanford University has devised an ingenious means of boosting the efficiency of solar panels by exploiting a fundamental physics phenomenon. Solar panels lose efficiency as they heat up. Just as the top of your head rad...

Biological Drones Could Explore the Surface of Mars


Growing drones once they reach the Red Planet's surface would be far more convenient than shipping all the hardware there and assembling it on site -- or at least that's what a team of enterprising students is trying to prove.

The thought that future drones might be developed in Petri dishes instead of hardware factories is a bit scary, and could bring some of David Cronenberg’s movies to mind. However, this approach has its advantages, as evidenced by a team of students from Stanford University, Spelman College, and Brown University. Their biological drone was an entry in the 2014 International Genetically Engineered Machine competition, and chances are that a version of that drone could one day fly across Mars.

The iGEM team collaborated with Lynn Rothschild, a synthetic biologist at the NASA Ames Research Center in California, to develop a drone using not much besides fungi and plants. Given the biological material involved, one might be concerned about the fungal colony that could form in the event of a crash. However, the "components" of the drone are already dead by the time it's airborne, so no one should worry about that.

“These are lightweight, cheap, and won’t litter the environment. It’s about as big a concern as leaving your sweater outside,” claimed Rothschild, emphasizing only a few of the bio-UAV’s advantages.

Of course, there is still some technology involved: the bio-drone must first be designed in 3D modelling software. The design file is then sent to biomaterials company Ecovative Design, which uses vacuforming to create an 8-inch square of fungal mycelium. The drone's hard structure is made of straw and dead leaves placed in the mold. "The biomaterial gets inoculated with fungus, then fungus grows throughout all the material in the mold," pointed out Eli Block, one of the team's members. "Before where it was kind of a loose material, after growing for a few weeks, it was a single solid chunk."

The bio-drone’s chassis is actually made using two such molds, as Rothschild explained: “So it looks like a dried sandwich, and it’s the weight and feel of Styrofoam.”

Block also emphasized the importance of having the bio-drone sterilized: “The point of it is you’re not flying anything that could introduce negative organisms into the environment. Also you have this new biomaterial that you don’t want to get eaten by mold. So you don’t want it to break down immediately.”
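The fabrication process Block and Rothschild describe is essentially a short pipeline: design the airframe, form a mold, pack it with substrate, inoculate it, let the mycelium grow into a solid piece, then sterilize the halves and join them. Here is a minimal Python sketch of that pipeline; the step ordering follows the article, but the durations (and the idea of expressing it as code at all) are purely illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class FabricationStep:
    name: str
    duration_days: float  # rough, illustrative durations only


# Hypothetical pipeline mirroring the article's description; the durations
# are assumptions, not Ecovative's or the iGEM team's actual process.
BIO_DRONE_PIPELINE = [
    FabricationStep("design airframe in 3D modelling software", 0),
    FabricationStep("vacuform an 8-inch square mold", 1),
    FabricationStep("pack mold with straw and dead leaves", 0.5),
    FabricationStep("inoculate substrate with fungal mycelium", 0.5),
    FabricationStep("let mycelium grow into a single solid chunk", 21),
    FabricationStep("sterilize the two halves and join the chassis", 2),
]

def total_lead_time(pipeline):
    """Sum the (illustrative) durations to estimate end-to-end lead time."""
    return sum(step.duration_days for step in pipeline)

if __name__ == "__main__":
    for step in BIO_DRONE_PIPELINE:
        print(f"- {step.name} (~{step.duration_days} days)")
    print(f"Estimated lead time: ~{total_lead_time(BIO_DRONE_PIPELINE)} days")
```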

Needless to say, the drones will need a bit more than that to withstand the Red Planet's harsh atmospheric conditions. To that end, the iGEM team has managed to create radiation- and extreme-temperature-resistant bacteria by inserting genes from "extremophile" bacteria into E. coli.

“Let’s say you have certain cells that can sense carbon dioxide levels or radiation, and do some kind of color readout. You could be sensing things in parallel, without bulky sensors on your drone,” said Block, pointing out that genetically-modified biological sensors could even be used to replace conventional sensor hardware.
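If engineered cells really did report carbon dioxide or radiation levels as a color readout, reading them back on the drone could amount to pointing a camera at the cell patch and mapping its hue to a value. The sketch below assumes hypothetical patch names, hue ranges, and labels; none of it reflects the team's actual reporter constructs.

```python
# Hypothetical mapping from the hue of an engineered cell patch to a reading.
# Patch names, hue ranges, and labels are illustrative assumptions only.
HUE_CALIBRATION = {
    "co2_patch":       [(0, 60, "low"), (60, 180, "moderate"), (180, 360, "high")],
    "radiation_patch": [(0, 120, "background"), (120, 240, "elevated"), (240, 360, "danger")],
}

def read_patch(patch_name: str, hue_degrees: float) -> str:
    """Translate a measured hue (0-360 degrees) into a qualitative reading."""
    for low, high, label in HUE_CALIBRATION[patch_name]:
        if low <= hue_degrees < high:
            return label
    raise ValueError(f"hue {hue_degrees} out of range for {patch_name}")

# Several patches can be "read" in parallel from a single camera frame,
# which is the point Block makes about avoiding bulky sensor hardware.
frame_hues = {"co2_patch": 200.0, "radiation_patch": 95.0}
readings = {name: read_patch(name, hue) for name, hue in frame_hues.items()}
print(readings)  # {'co2_patch': 'high', 'radiation_patch': 'background'}
```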

Last but not least, the team intends to waterproof the bio-UAVs, and it turned to paper wasps to achieve this. "Paper wasps are an organism that when they build their nests they actually use cellulose, plant cellulose," explains Jotthe Kannappan, a junior at Stanford and a member of the iGEM team. "They chew bark on trees, spit it out, and something in that saliva makes cellulose really waterproof and really thermal-resistant. So our experiment was to figure out what protein in the saliva does that."

The team even thought of a way to keep the radiation-resistant cells from interacting with organisms in the environment: "It was changing what some of the genetic code stood for," says Rothschild. "It's like you took a book and every time you saw the word 'today,' you read the word 'blue.' The idea is that if this crashed somewhere and there were living cells on it, they would be speaking a different language than other cells in the environment."
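Rothschild's "today reads as blue" analogy is essentially a substitution table: the recoded cells assign a different meaning to part of the genetic code than wild-type cells do, so escaped material can't be read correctly by anything else. A toy Python version of the analogy (words standing in for codons, with an invented mapping purely for illustration) makes the point:

```python
# Toy analogy for genetic recoding: the engineered "dialect" reassigns the
# meaning of one word, the way a recoded organism reassigns a codon.
RECODED_DIALECT = {"today": "blue"}  # invented mapping, per Rothschild's analogy

def read_as(dialect: dict, sentence: str) -> str:
    """Read a sentence, substituting any word the dialect reassigns."""
    return " ".join(dialect.get(word, word) for word in sentence.split())

message = "today we fly today we land"
print(read_as({}, message))               # wild-type reading: unchanged
print(read_as(RECODED_DIALECT, message))  # recoded reading: 'blue we fly blue we land'
```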

This is a very ambitious project, but one that should become a reality if we are to go to Mars anytime soon.


Game Controller Measures Player’s Emotions, Adapts the Game to Them


Engineers from Stanford University have invented a way of making games even more interactive than an Oculus Rift VR headset does. Their method involves a game controller that reads the player's emotions and adapts the game accordingly.

The game controller that Stanford University researchers developed was first presented at CES 2014, but it looks like the team has made some progress since then. It's a pity that the developers of this innovative device have not yet found a name for it. Still, that doesn't take away from the revolution this game controller may start if it ever hits the market. Of course, there would have to be some support from game developers as well; without that, the controller does little more than measure frustration levels.

Corey McCall, leader of the game controller project and a PhD candidate at Stanford University, explained how the device's built-in physiological sensors create a more interactive experience: "By measuring those outputs, we can understand what's happening in the brain almost instantaneously."

According to McCall, the game controller could also serve as a form of parental control: "We can also control the game for children. If parents are concerned that their children are getting too wrapped up in the game, we can tone it down or remind them that it's time for a healthy break."

From a gamer's viewpoint, however, this may not end well! Consider the possible scenarios: if players enjoy a game, they will keep playing it; if the game seems impossible to finish, or a particular boss seems impossible to kill, they will inevitably get frustrated, and that frustration is often exactly what drives them to try over and over until they succeed.
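To put McCall's idea in concrete terms: the controller samples physiological signals, turns them into an estimate of arousal or frustration, and the game nudges its difficulty toward a target band. The sketch below is only a guess at how such a loop could work; the signal names, thresholds, and adjustment rule are assumptions, not details of the Stanford prototype.

```python
def estimate_frustration(heart_rate_bpm: float, skin_conductance: float) -> float:
    """Rough, made-up arousal score in [0, 1] from two physiological signals."""
    hr_term = min(max((heart_rate_bpm - 60) / 60, 0.0), 1.0)  # 60-120 bpm -> 0-1
    sc_term = min(max(skin_conductance / 10.0, 0.0), 1.0)     # 0-10 microsiemens -> 0-1
    return 0.5 * hr_term + 0.5 * sc_term

def adjust_difficulty(current: float, frustration: float,
                      target: float = 0.5, gain: float = 0.2) -> float:
    """Nudge difficulty down when the player is more frustrated than the target band."""
    new_difficulty = current - gain * (frustration - target)
    return min(max(new_difficulty, 0.0), 1.0)

# One tick of the (hypothetical) control loop
difficulty = 0.7
frustration = estimate_frustration(heart_rate_bpm=105, skin_conductance=7.5)
difficulty = adjust_difficulty(difficulty, frustration)
print(f"frustration={frustration:.2f}, new difficulty={difficulty:.2f}")
```

In a real game the same loop could just as easily run in the other direction, ramping difficulty back up when the player looks bored rather than frustrated.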

Below is a video, posted a few days ago on Stanford University's YouTube channel, that shows how the emotion-reading game controller is supposed to look and work.

At this point, I believe it would be much easier to make this controller compatible with Android and iOS smartphones, tablets and consoles, as mobile games are probably easier to adapt to physiological reactions. Either way, it's great to see researchers finding new ways to make games more interesting. Creating games that adapt to the way we play them and to the way we react to them might change people's perception of dumb NPCs that perform the same actions all the time.


EyeGo Accessories Enable Eye Exams With Nothing More Than a Smartphone


Even though it's not the best idea in the world to diagnose yourself, or to have a family member do it at home, the EyeGo adapters for smartphones could raise the alarm that pushes people to go see an ophthalmologist.

An ophthalmologist's office includes all sorts of devices that can be used to examine someone's eyes in great detail. Not only can a prescription in diopters be determined, but so can the thickness of the cornea, among many other things. In rural areas and developing countries, such thorough examinations often cannot be performed, either because there is no trained personnel to do them or because the necessary equipment is unaffordable. In such cases, the EyeGo adapters for smartphones could be used to perform eye exams quickly and easily.

Assistant professor of ophthalmology Dr. Robert Chang and ophthalmology resident Dr. David Myung developed two EyeGo adapters that attach easily to any smartphone with a decent camera. Why two adapters? It's quite simple, actually. One of them is used for analyzing the cornea (the front surface of the eye), while the other focuses the light on the retina (the back of the eye). While not exactly professional equipment, the EyeGo system does a great job of flagging potential eye problems in a preliminary exam.

As the two developers said, the EyeGo adapters are meant to "make it easy for anyone with minimal training to take a picture of the eye and share it securely with other health practitioners or store it in the patient's electronic record." It's quite obvious that these smartphone adapters show ophthalmology in a new light, so I can only hope the two researchers continue their work.
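As a rough illustration of the workflow the developers describe -- capture an image through one of the two adapters, label which part of the eye it shows, and hand it off for review or storage in a patient record -- here is a small Python sketch. The record fields and the upload stub are hypothetical; EyeGo itself is an optical attachment and does not ship any such API.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# The two adapters described in the article: one images the cornea
# (front of the eye), the other focuses light on the retina (back of the eye).
ADAPTERS = {"anterior": "cornea", "posterior": "retina"}

@dataclass
class EyeExamImage:
    patient_id: str
    adapter: str      # "anterior" or "posterior"
    image_path: str   # photo taken with the phone camera through the adapter
    captured_at: str

def capture_exam_image(patient_id: str, adapter: str, image_path: str) -> EyeExamImage:
    """Package a smartphone photo with the metadata a reviewer would need."""
    if adapter not in ADAPTERS:
        raise ValueError(f"unknown adapter {adapter!r}")
    return EyeExamImage(patient_id, adapter, image_path,
                        datetime.now(timezone.utc).isoformat())

def upload_to_record(image: EyeExamImage) -> None:
    """Stub for sending the image to a patient's electronic record (hypothetical)."""
    print(f"Uploading {ADAPTERS[image.adapter]} image for patient {image.patient_id}...")

upload_to_record(capture_exam_image("patient-0042", "posterior", "retina_001.jpg"))
```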

When I was 4 years old, my grandparents discovered that the centers of my eyes looked clouded, after noticing that I was standing really close to the TV to see whatever was on. There hadn't been any cases of cataracts or severe eye disease in the family before mine, so nothing suggested that my eyes should be checked periodically. In that respect, I regret being born at a time when medical technology wasn't as widespread as it is today. While my problem couldn't have been avoided, as it was genetic, it could have been discovered earlier, which probably would have made things a bit better for me.
