GitHub will store all of its public open source code in an Arctic vault

Let's face it, there are a lot of things that could bring about the end of the world as we know it -- heightened political tensions, climate change, even an asteroid. In the event that things go FUBAR, what will happen to the masses upon masses of da...

Tap Strap 2 adds gesture control to any Bluetooth-enabled device

Tap made a name for itself with its futuristic wearable keyboards; now it's introduced a new Minority Report-style feature guaranteed to make you feel like you're in a sci-fi movie. The Tap Strap 2's new AirMouse feature lets you control any Bluetoot...

GNU founder Richard Stallman resigns from MIT, Free Software Foundation

After reports revealed the lengths some at MIT went to in order to accept donations from convicted sex offender and sex trafficker Jeffrey Epstein, MIT Media Lab director Joi Ito resigned. Now, computer scientist Richard Stallman, founder of the GNU ope...

Microsoft releases its first preview of PowerToys for Windows 10

If you've been a PC user since the days of Windows 95 and Windows XP, then you may recognize the name PowerToys from a set of Microsoft-developed system utilities. After a few generations on the shelf, the concept has returned, and now the first prev...

Google, Intel and Microsoft form data protection consortium

It's common to secure data when it's sitting still or flying to its destination, but not so much when you're actually using it -- there's still a risk someone could peek at your content while you work. Industry heavyweights might help keep your info se...
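
To see the gap the consortium wants to close, here's a minimal Python sketch, assuming the third-party cryptography package is installed: a record is encrypted at rest on disk and would stay encrypted in transit, but it has to be decrypted into ordinary process memory the moment you actually compute on it.

```python
# Minimal sketch of why "data in use" is the weak link, assuming the
# third-party cryptography package is installed (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

# At rest: the record is stored encrypted on disk.
record = b"salary=120000"
with open("record.bin", "wb") as f:
    f.write(cipher.encrypt(record))

# In transit: the same ciphertext could travel over TLS to another service.

# In use: today's typical stack decrypts the data in order to work with it,
# leaving plaintext sitting in regular process memory while it's processed.
# That is the exposure the consortium's confidential-computing push targets.
with open("record.bin", "rb") as f:
    plaintext = cipher.decrypt(f.read())
print(plaintext)
```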

Facebook releases tools to flag harmful content on GitHub

Facebook wants to rid the internet of garbage. But it can't do that alone. So today, it's making two of its photo- and video-flagging technologies open-source and available on GitHub. It hopes the algorithms will help others find and remove harmful c...
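
For a rough sense of how hash-based flagging works in general, here's a simplified average-hash sketch in Python. To be clear, this is not the code Facebook published; it assumes the Pillow imaging library plus a hypothetical upload.jpg, and it only illustrates the basic idea of fingerprinting an upload and comparing it to a shared list of known-harmful fingerprints.

```python
# Simplified average-hash matching: an illustration of hash-based photo
# flagging in general, not the algorithms Facebook open-sourced.
from PIL import Image  # assumes Pillow is installed

def average_hash(path, size=8):
    """Downscale to a size x size grayscale grid and set one bit per pixel
    that is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming(a, b):
    """Count differing bits between two hashes; a small distance means the
    images are near-duplicates."""
    return bin(a ^ b).count("1")

# A platform could flag an upload whose hash lands close to any hash on a
# shared list of known-harmful content. The values below are placeholders.
known_bad_hashes = {0x0F0F0F0F0F0F0F0F}
upload_hash = average_hash("upload.jpg")  # hypothetical file
flagged = any(hamming(upload_hash, h) <= 10 for h in known_bad_hashes)
```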

Google pushes for an official web crawler standard

One of the cornerstones of Google's business (and really, the web at large) is the robots.txt file that sites use to exclude some of their content from the search engine's web crawler, Googlebot. It minimizes pointless indexing and sometimes keeps s...
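
If you're curious what honoring robots.txt looks like in practice, Python's standard library already ships a parser; the sketch below, using a hypothetical example.com URL, shows the yes-or-no question the file answers for a crawler like Googlebot.

```python
# Sketch of how a well-behaved crawler consults robots.txt before fetching a
# page, using Python's standard-library parser. The URLs are hypothetical.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")
rp.read()  # download and parse the site's robots.txt

# can_fetch() answers the question robots.txt exists to settle:
# may this user agent crawl this path?
print(rp.can_fetch("Googlebot", "https://example.com/private/report.html"))
print(rp.can_fetch("Googlebot", "https://example.com/index.html"))
```

Because the file format has never been an official standard, parsers like this one and Googlebot's own can disagree on edge cases, which is the kind of inconsistency Google's proposal is meant to iron out.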