Intruders ‘borrowed’ Tesla’s public cloud for cryptocurrency mining

Tesla isn't immune to the plague of cryptocurrency mining hijacks, it seems. Security researchers at RedLock have reported that intruders gained access to Tesla's Kubernetes console (where it deploys and manages containerized apps) without needing a...

FedEx left sensitive customer data exposed on unsecured server

It seems like there's no end to the data breach stories. Uber covered its problem up, then had to answer to Congress. Equifax's initial response to its massive data exposure added its own security issue. Federal employees were even found stealing d...

Thousands of Amazon S3 data stores left unsecured due to misconfiguration

Security researcher Will Vandevanter made a rather disturbing discovery: thousands of Amazon S3 data "buckets" were improperly configured and left exposed to prying eyes. Vandevanter started his probe by generating URLs using the names of major companies and sites that use Amazon's cloud storage service. In the end he uncovered 12,328 of the so-called buckets -- 1,951 of which were visible to the public. Those buckets were home to some 126 billion files containing everything from personal data hosted by a social networking service and sales records to video game source code and even unencrypted database backups. By default, S3 buckets are set to private, which means these stores of potentially sensitive data had to be flipped to public manually -- most likely by accident. Amazon has responded to the discovery by alerting users who might have inadvertently made their files publicly accessible. If you've got an S3 account of your own, now would be an excellent time to double-check your settings. And if you're looking for more details of Vandevanter's research, hit up the source link.
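If you want to run that kind of check on your own account, here's a minimal sketch (ours, not Vandevanter's tooling) using the boto3 Python library: it lists your buckets and flags any whose ACL grants access to the global "AllUsers" group, which is what makes a bucket publicly readable. It assumes boto3 is installed and AWS credentials are already configured.

```python
import boto3

# URI that S3 uses for the "everyone on the internet" group in bucket ACLs.
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

s3 = boto3.client("s3")

# Walk every bucket in the account and report whether its ACL grants
# anything (read, list, etc.) to the public AllUsers group.
for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    acl = s3.get_bucket_acl(Bucket=name)
    is_public = any(
        grant.get("Grantee", {}).get("URI") == ALL_USERS
        for grant in acl["Grants"]
    )
    print(f"{name}: {'PUBLIC' if is_public else 'private'}")
```

Keep in mind that ACLs are only one way a bucket can end up public -- a bucket policy can do the same -- so treat this as a quick first pass rather than a full audit.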

Via: The Verge

Source: Help Net Security

Google Compute Engine brings Linux virtual machines ‘at Google scale’

As anticipated, Google has just launched its cloud service for businesses, Google Compute Engine, at Google I/O 2012. Starting today, Urs Holzle announced, "anyone with large-scale computing needs" can access the infrastructure and efficiency of Google's datacenters. The company is promising both performance and stability -- Amazon EC2, they're coming for you -- and claiming "this is how infrastructure as a service is supposed to work". It's also promising "50 percent more computes per dollar" than competitors. Beta testers will be on hand at later meetings to give their impressions of the service, if you want to know how running your apps on 700,000 (and counting) cores feels. During the presentation we got a demo of a genome app, and we're sure that if we understood what was going on, it would have been impressive. Hit the source links below for more details on "computing without limits" or to sign up for a test yourself.
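If you're curious what talking to Compute Engine from code looks like, here's a minimal sketch using the google-cloud-compute Python client (a much later library than what shipped at this launch, so take it purely as an illustration); the project ID and zone below are placeholders, not anything Google announced.

```python
from google.cloud import compute_v1

# List the Compute Engine instances in one zone of a project.
# "my-project" and "us-central1-a" are placeholder values.
client = compute_v1.InstancesClient()

for instance in client.list(project="my-project", zone="us-central1-a"):
    print(instance.name, instance.status, instance.machine_type)
```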

Check the live blog for more details as they're revealed.

Check out our full coverage of the Google I/O 2012 developer conference at our event hub!

Source: Google Developers Blog, Google Compute Engine