10 Mobile App Frameworks Speeding Up Development


Developing a mobile app has become as easy as developing a website with the use of app frameworks. Today, anyone familiar with HTML, CSS, and JavaScript can create apps for Android and iOS....

Apple is Looking for Maps Web UI Designer to Work on Secret Project


A brand new top-secret project is in the works at Apple, which recently placed a wanted ad in its job listings. According to the advertisement, Apple was offering anyone qualified...

Rogers LTE hits 18 new regions, delivers speedy data in Saskatoon


Rogers promised that October 1st would be a grand day for its LTE expansion plans, and we're now learning that it might have been underpromising to overdeliver later. The carrier just flicked the 4G switch for 18 cities and regions, or eight more territories than it had promised just two weeks ago. Most of the coverage still focuses on the southern tip of Ontario, including London, the Oshawa area and RIM's hometown of Waterloo, but there's a much more trans-Canada bent to the official deployment. Western cities like Saskatoon and Victoria now fit into Rogers' LTE map beyond a previously announced Edmonton, while the Quebec rollout is going past Quebec City to include Sherbrooke and Trois-Rivières. All told, the one day of growth is enough to supply Rogers LTE to almost 60 percent of Canada's population -- a convenient figure when one of the year's more important LTE smartphones just became available less than two weeks prior.

[Thanks, Jon]


Rogers LTE hits 18 new regions, delivers speedy data in Saskatoon originally appeared on Engadget on Tue, 02 Oct 2012 02:33:00 EDT. Please see our terms for use of feeds.

Permalink  |  Source: Rogers RedBoard

Google bots learning to read webpages like humans, one step closer to knowing everything


Google just launched its Knowledge Graph, a tool intended to deliver more accurate information by analyzing the way users search. Of course, with a desire to provide better search results comes a need for improved site-reading capabilities. JavaScript and AJAX have traditionally put a wrench in Google bots' journey through a webpage, but it looks like the search engine has developed some smarter specimens. While digging through Apache logs, a developer spotted evidence that bots now execute the JavaScript they encounter -- and rather than just mining for URLs, the crawlers seem to be mimicking how users click on objects to activate them. That means bots can dig deeper into the web, accessing databases and other content that wasn't previously indexable. Looks like Google is one step closer to success on its quest to know everything.
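To see why executing JavaScript matters, here's a minimal hypothetical sketch (the functions, markup, and URL are invented for illustration, not Google's actual crawler code). A click handler assembles its target URL at runtime, so a crawler that only mines static markup for `href` attributes finds nothing, while one that runs the handler discovers the dynamic endpoint:

```javascript
// Hypothetical page fragment: the "Load comments" button has no href;
// its content is only reachable by running JavaScript.
const html = '<button id="more">Load comments</button>';

// The handler builds its target URL at runtime, so the URL never
// appears anywhere in the static markup.
function onMoreClicked() {
  const page = 1 + 1; // assembled dynamically
  return '/comments?page=' + page;
}

// Old-style crawling: scan the markup for literal href URLs.
function mineUrls(markup) {
  return [...markup.matchAll(/href="([^"]+)"/g)].map(m => m[1]);
}

// JS-executing crawling: also invoke click handlers, as a user would,
// and record the requests they would trigger.
function crawlWithJs(markup, handlers) {
  return [...mineUrls(markup), ...handlers.map(h => h())];
}

console.log(mineUrls(html));                     // [] -- nothing indexable
console.log(crawlWithJs(html, [onMoreClicked])); // ['/comments?page=2']
```

The dynamically built `/comments?page=2` stands in for the database-backed content the article describes: invisible to URL mining, but reachable once the crawler mimics a user's click.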

Google bots learning to read webpages like humans, one step closer to knowing everything originally appeared on Engadget on Thu, 17 May 2012 00:36:00 EDT. Please see our terms for use of feeds.

Permalink  |  Via: Ars Technica  |  Source: swapped.cc/blog