Tag Archives: NSF
Feds give Google OK to test Project Wing drone deliveries
Science fund lets kids learn 3D printing, gene modification
ICYMI: Space junk reboot, biological machine v2 and more
Gravitational waves are our window into the early universe
Science confirms that gravitational waves exist
University of Illinois’ Blue Waters supercomputer now running around the clock
Things got a tad hairy for the University of Illinois at Urbana-Champaign's Blue Waters supercomputer when IBM halted work on it in 2011, but with funding from the National Science Foundation, the one-petaflop system is now crunching numbers 24/7. The behemoth resides within the National Center for Supercomputing Applications (NCSA) and is composed of 237 Cray XE6 cabinets and 32 of the XK7 variety. Its 22,640 XE6 compute nodes each pack two AMD Opteron 6276 Interlagos processors clocked at 2.3 GHz or higher, while the XK7 cabinets add NVIDIA GK110 Kepler GPU accelerators. At peak performance, the rig can churn out 11.61 quadrillion calculations per second. According to the NCSA, all that horsepower earns Blue Waters the title of the most powerful supercomputer on a university campus. Now that it's cranking away around the clock, it'll be used in projects investigating everything from how viruses infect cells to weather prediction.
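Those two performance figures aren't contradictory: 11.61 quadrillion calculations per second is the theoretical peak (11.61 petaflops), while the one-petaflop number describes sustained, real-world throughput. A quick back-of-envelope sketch (the ratio calculation is ours, not the NCSA's):

```python
# Convert the article's "11.61 quadrillion calculations per second"
# into the usual supercomputing unit, petaflops (1e15 flops).
PETA = 1e15
peak_calcs_per_sec = 11.61e15  # 11.61 quadrillion, per the article

peak_petaflops = peak_calcs_per_sec / PETA
print(f"Peak: {peak_petaflops:.2f} petaflops")

# The one-petaflop figure is sustained performance -- under a tenth
# of theoretical peak, which is typical for real workloads.
sustained_petaflops = 1.0
print(f"Sustained/peak ratio: {sustained_petaflops / peak_petaflops:.2%}")
```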
Earth’s largest telescope gets to work in Chile after 30 years of planning
Nestled within the Chilean Andes, the new Atacama Large Millimeter/submillimeter Array (ALMA) is now open for space-staring business. The biggest, most complex telescope project to date, ALMA will be able to peer into the deeper reaches of space with "unprecedented power", according to astronaut Chris Hadfield. Covering around half of the universe's light spectrum, between infrared and radio waves, the new telescope should be able to detect distant planets, black holes and other intergalactic notables.
The Chilean desert's lack of humidity was a big reason for the telescope's placement, 16,400 feet above sea level, which aids the scope's precision. But it's a global project, with the US contributing $500 million, the NSF's biggest single investment to date. From Japan, Fujitsu's contribution to exploring the final frontier consists of 35 PRIMERGY x86 servers, tied together with a dedicated (astronomy-centric) computational unit. The supercomputer will process 512 billion telescope samples per second, which ought to be more than enough to unlock a few more secrets of the cosmos.
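To put that sample rate in perspective, here's a rough sketch of the raw data bandwidth it implies. The article doesn't say how many bits each sample carries, so the 2-bit figure below is purely an illustrative assumption:

```python
# Back-of-envelope: raw input bandwidth implied by ALMA's quoted
# 512 billion samples per second.
samples_per_sec = 512e9     # from the article
bits_per_sample = 2         # ASSUMPTION for illustration, not from the article

bytes_per_sec = samples_per_sec * bits_per_sample / 8
print(f"~{bytes_per_sec / 1e9:.0f} GB/s of raw sample data")  # ~128 GB/s
```

Even under that conservative assumption, the correlator has to swallow on the order of a hundred gigabytes every second.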
Via: PopSci
Researchers propose à la carte internet services, overhaul for web infrastructure
A quintet of researchers funded by the National Science Foundation has envisioned a new internet architecture, one where features could be purchased à la carte. The proposed framework would allow users to fine-tune their experience by choosing from a variety of connection services. Let's say, for example, that a customer's connection is fine for browsing the web, but it doesn't pass muster for streaming content -- a service dedicated to video delivery could be added to close the gap. "Ultimately, this should make the internet more flexible and efficient, and will drive innovation among service providers to cater to user needs," paper co-author Rudra Dutta told The Abstract. A piecemeal next-gen web is no easy feat, however, as it would require revamping the web's infrastructure with new protocols for choosing particular features, completing payments and monitoring network performance. The group's rough blueprint will be presented at a conference next week, but you can thumb through their short paper at the source.
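The à la carte idea can be pictured as a subscription that services are bolted onto one at a time. This is a minimal sketch of that model only; the service names, prices and API below are invented, and the researchers' actual protocols for feature selection, payment and monitoring are not described here:

```python
# Toy model of a la carte connection services (all names/prices hypothetical).
from dataclasses import dataclass, field


@dataclass
class Service:
    name: str
    monthly_price: float


@dataclass
class Subscription:
    services: list = field(default_factory=list)

    def add(self, service: Service) -> None:
        """Bolt an extra network service onto the base connection."""
        self.services.append(service)

    def monthly_total(self) -> float:
        return sum(s.monthly_price for s in self.services)


# A customer whose base link is fine for browsing adds a dedicated
# video-delivery service to close the streaming gap.
plan = Subscription()
plan.add(Service("basic-browsing", 20.0))
plan.add(Service("video-delivery", 5.0))
print(f"Monthly bill: ${plan.monthly_total():.2f}")  # Monthly bill: $25.00
```

The appeal for providers is that each feature becomes an independently priced, independently provisioned product rather than part of one monolithic pipe.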
Filed under: Internet
Researchers propose à la carte internet services, overhaul for web infrastructure originally appeared on Engadget on Sat, 11 Aug 2012 07:11:00 EDT.