Snapchat is adding new location tracking abilities to its parental control features. The changes will give parents new visibility into their children’s Snap Map settings and allow them to keep tabs on their whereabouts.
The new features, which will be available “over the coming weeks,” will be added to Snapchat’s Family Center, the app’s portal for parental control features. With the updates, parents will be able to request to view their child’s location or share their own. Parents can also opt to receive “travel notifications” when their child leaves specific places, like their school or home.
Separately, Family Center, which already allows parents to keep tabs on who their children are chatting with, will also allow them to see who their teen has shared their location with in the app’s Snap Map.
That feature could help address criticism the company has faced over the role its app’s location-sharing abilities have played in serious safety issues. Snapchat’s location sharing has come under particular scrutiny from safety advocates, who allege it has enabled teens to connect with strangers, including drug dealers and potential predators. The feature was called out in a lawsuit brought by New Mexico’s Attorney General earlier this year over alleged safety lapses at the company.
In its latest update, Snap notes that it bars all users from sharing their location info with users who aren’t already their friends. And the company says it plans to push additional reminders to users about their Snap Map settings “prompting them to be extra thoughtful about their” choices.
This article originally appeared on Engadget at https://www.engadget.com/social-media/snapchat-will-let-parents-track-their-kids-through-family-center-130004215.html?src=rss
Bluesky may still be the underdog in the race for alternatives to X, but the once Twitter-affiliated service is gaining momentum. The app just passed the 15 million user mark after adding more than a million new users over the last week, the company said in an update.
While Bluesky is still considerably smaller than Threads, which with 275 million users is its biggest rival, there are signs that Threads users have been increasingly curious about the upstart. “Bluesky” has been a trending topic on Threads in recent days and an in-app search suggestion shows there are more than 19,000 posts about “Bluesky.” Bluesky itself has also made a push to win over Threads users in recent weeks by posting regularly on the Meta-owned service.
That effort seems to be working. A month ago, Engadget noted, the service had just under 9 million users. Its mobile app also has the top spot in Apple’s App Store, followed by Threads and ChatGPT. Its recent success also seems to be driven, at least in part, by frustration with Elon Musk and X following the US presidential election.
A recent report from web analytics company SimilarWeb found that “more than 115,000 US web visitors deactivated their accounts,” on November 7, “more than on any previous day of Elon Musk’s tenure.” The report also noted that “web traffic and daily active users for Bluesky increased dramatically in the week before the election, and then again after election day,” with Bluesky at points seeing more web traffic than Threads. (Threads’ mobile usage, however, is still “far ahead” of Bluesky.)
“In the US, Bluesky got more web visits than Threads in the immediate aftermath of the election,” the report notes. “For context, it’s important to note that both services are app centric, even though they support a web user interface.”
For its part, Bluesky seems intent on distinguishing itself from its larger, billionaire-controlled rivals. The company, which began as an internal project at Twitter before spinning off into an independent entity, has experimented with novel features like custom feeds, user-created moderation services and “starter packs” for new users.
“You're probably used to being trapped in a single algorithm controlled by a small group of people, that's no longer the case,” Bluesky’s COO Rose Wang shared in a video aimed at new users Tuesday. “On Bluesky, there are about 50,000 different feeds … these feeds provide a cozy corner for you to meet people with similar interests. And you can actually make friends again, because you're no longer tied to a dominant algorithm that promotes either the most polarizing posts and or the biggest brands, and that's the mandate of Bluesky.”
This article originally appeared on Engadget at https://www.engadget.com/social-media/bluesky-surges-to-15-million-users-after-getting-a-million-sign-ups-in-one-week-224213573.html?src=rss
Threads could start getting ads much sooner than Meta has let on. The company is now planning to bring ads to its newest app “early next year,” with the first ads arriving in January 2025, according to a new report in The Information.
That suggests Meta is looking to start making money on the rapidly growing service far sooner than its executives have previously indicated. In August, when the app reached 200 million users, Mark Zuckerberg said Threads could become the company’s next billion-user service. He said making money off the app would be a "multi-year" effort.
“All these new products, we ship them, and then there's a multi-year time horizon between scaling them and then scaling them into not just consumer experiences but very large businesses,” Zuckerberg said. In the company’s most recent earnings call, Meta CFO Susan Li said the company doesn’t “expect Threads to be a meaningful driver of 2025 revenue at this time.”
According to The Information, Meta is planning a slow rollout for ads on Threads. The company will start with “a small number” of advertisers in January. It’s unclear how quickly the effort may expand. "Since our priority is to build consumer value first and foremost, there are no ads or monetization features currently on Threads," a Meta spokesperson said in a statement.
Meta’s reported plans highlight just how quickly the service has grown in recent months. Threads has 275 million monthly users and is seeing more than 1 million new sign-ups a day, according to Zuckerberg. That makes it by far the largest of the X alternatives that have sprung up over the last couple of years.
Bluesky, another popular Twitter-like service, has also seen significant growth recently, adding a million new users in the last week, the company said Tuesday. With 15 million users, it is still much smaller than Threads. Like Threads, it currently has no advertising, and the company has said it plans to experiment with subscription-based features.
Update November 13, 2024, 2 PM ET: Added a statement from a Meta spokesperson.
This article originally appeared on Engadget at https://www.engadget.com/social-media/meta-will-reportedly-bring-ads-to-threads-as-soon-as-january-183044211.html?src=rss
President-elect Donald Trump has named Elon Musk as the leader of a new “Department of Government Efficiency” that will “dismantle Government Bureaucracy, slash excess regulations, cut wasteful expenditures, and restructure Federal Agencies.” The Tesla CEO and owner of X will spearhead the effort along with former presidential candidate Vivek Ramaswamy, Trump announced in a statement on Truth Social.
The scope of the role isn’t exactly clear. Trump’s press release said that “the Department of Government Efficiency will provide advice and guidance from outside of Government, and will partner with the White House and Office of Management & Budget to drive large scale structural reform, and create an entrepreneurial approach to Government never seen before.” It also stated that “their work will conclude no later than July 4, 2026.”
Musk shared the news on X, but didn’t indicate how the role might impact his obligations at his various other companies. He did, however, joke about potential "merch" for the operation. Musk, who poured millions of dollars into a super PAC boosting Trump’s campaign, has previously talked about his desire to work with Trump to cut government spending. "Republican politicians have dreamed about the objectives of 'DOGE' for a very long time," Trump's statement said.
This article originally appeared on Engadget at https://www.engadget.com/big-tech/elon-musk-will-lead-a-new-department-of-government-efficiency-donald-trump-says-015521217.html?src=rss
Canada has ordered TikTok to shut down its operations in the country, citing unspecified “national security risks” posed by the company and its parent ByteDance. With the move, TikTok will be forced to “wind up” all business in the country, though the Canadian government stopped short of banning the app.
“The government is taking action to address the specific national security risks related to ByteDance Ltd.’s operations in Canada through the establishment of TikTok Technology Canada, Inc,” Canada’s Minister of Innovation, Science and Industry François-Philippe Champagne said in a statement. “The decision was based on the information and evidence collected over the course of the review and on the advice of Canada’s security and intelligence community and other government partners.”
Canada’s crackdown on TikTok follows a “multi-step national security review process” by its intelligence agencies, the government said in a statement. As the CBC points out, the country previously banned the app from official government devices. The move also comes several months after the United States passed a law that could ban the app stateside. US lawmakers have likewise cited national security concerns and the app’s ties to China. TikTok has mounted an extensive legal challenge to the law.
In a statement, a TikTok spokesperson said the company would challenge Canada’s order as well. "Shutting down TikTok’s Canadian offices and destroying hundreds of well-paying local jobs is not in anyone's best interest, and today's shutdown order will do just that,” the spokesperson said. “We will challenge this order in court. The TikTok platform will remain available for creators to find an audience, explore new interests and for businesses to thrive."
This article originally appeared on Engadget at https://www.engadget.com/big-tech/canada-orders-tiktok-to-shut-down-its-business-operations-in-the-country-due-to-national-security-risks-002615440.html?src=rss
New details are continuing to surface about the hacking of US telecom companies by a China-linked group that targeted US officials and campaign staffers. Now, The Wall Street Journal reports that the hackers’ access was even greater than what’s been previously reported, and that the communications of “potentially thousands of Americans” may have been impacted.
Last week, The New York Times reported that FBI investigators suspected call logs and SMS messages had been accessed by the hacking group, known as “Salt Typhoon.” The group reportedly targeted the phones of diplomats and government officials, as well as people associated with both presidential campaigns.
Now, The WSJ is reporting that the hackers, who were “likely” working for a Chinese intelligence agency, spent “eight months or more” in US telecom infrastructure, and that they may have been able to scoop up the data of thousands of people who were in contact with the targeted individuals.
The Journal confirms earlier reports that the hackers “limited their targets to several dozen select, high-value political and national-security figures.” But the hackers, who reportedly exploited routers used by telecom firms, had “the ability to access the phone data of virtually any American who is a customer of a compromised carrier — a group that includes AT&T and Verizon.” Both AT&T and Verizon declined to comment on the report.
This article originally appeared on Engadget at https://www.engadget.com/cybersecurity/new-report-details-vast-spying-by-china-linked-telecom-hackers-010347224.html?src=rss
Meta is opening up its Llama AI models to government agencies and contractors working on national security, the company said in an update. The group includes more than a dozen private sector companies that partner with the US government, including Amazon Web Services, Oracle and Microsoft, as well as defense contractors like Palantir and Lockheed Martin.
Mark Zuckerberg hinted at the move last week during Meta’s earnings call, when he said the company was “working with the public sector to adopt Llama across the US government.” Now, Meta is offering more details about the extent of that work.
Oracle, for example, is “building on Llama to synthesize aircraft maintenance documents so technicians can more quickly and accurately diagnose problems, speeding up repair time and getting critical aircraft back in service.” Amazon Web Services and Microsoft, according to Meta, are “using Llama to support governments by hosting our models on their secure cloud solutions for sensitive data.”
Meta is also providing similar access to Llama to governments and contractors in the UK, Canada, Australia and New Zealand, Bloomberg reported. In a blog post, Meta’s President of Global Affairs, Nick Clegg, suggested the partnerships will help the US compete with China in the global arms race over artificial intelligence. “We believe it is in both America and the wider democratic world’s interest for American open source models to excel and succeed over models from China and elsewhere,” he wrote. “As an American company, and one that owes its success in no small part to the entrepreneurial spirit and democratic values the United States upholds, Meta wants to play its part to support the safety, security and economic prosperity of America – and of its closest allies too.”
This article originally appeared on Engadget at https://www.engadget.com/ai/meta-opens-its-llama-ai-models-to-government-agencies-for-national-security-182355077.html?src=rss
If you’re excited, or even just a little curious, about the future of augmented reality, Meta’s Orion prototype makes the most compelling case yet for the technology.
For Meta, Orion is about more than finally making AR glasses a reality. It’s also the company’s best shot at becoming less dependent on Apple and Google’s app stores, and the rules that come with them. If Orion succeeds, then maybe we won’t need smartphones for much at all. Glasses, Zuckerberg has speculated, might eventually become “the main way we do computing.”
At the moment, it’s still way too early to know if Zuckerberg’s bet will actually pay off. Orion is, for now, still a prototype. Meta hasn’t said when it might become widely available or how much it might cost. That's partly because the company, which has already poured tens of billions of dollars into AR and VR research, still needs to figure out how to make Orion significantly more affordable than the $10,000 it reportedly costs to make the current version. It also needs to refine Orion’s hardware and software. And, perhaps most importantly, the company will eventually need to persuade its vast user base that AI-infused, eye-tracking glasses offer a better way to navigate the world.
Still, Meta has been eager to show off Orion since its reveal at Connect. And, after recently getting a chance to try out Orion for myself, it’s easy to see why: Orion is the most impressive AR hardware I’ve seen.
Meta’s first AR glasses
Meta has clearly gone to great lengths to make its AR glasses look, well, normal. While Snap has been mocked for its oversized Spectacles, Orion’s shape and size are closer to a traditional pair of frames.
Even so, they’re still noticeably wide and chunky. The thick black frames, which house an array of cameras, sensors and custom silicon, may work on some face shapes, but I don’t think they are particularly flattering. And while they look less cartoonish than Snap’s AR Spectacles, I’m pretty sure I’d still get some funny looks if I walked around with them in public. At 98 grams, the glasses were noticeably bulkier than my typical prescription lenses, but never felt heavy.
In addition to the actual glasses, Orion relies on two other pieces of kit: a 182-gram “wireless compute puck,” which needs to stay near the glasses, and an electromyography (EMG) wristband that allows you to control the AR interface with a series of hand gestures. The puck I saw was equipped with its own cameras and sensors, but Meta told me it has since simplified the remote control-shaped device so that it’s mainly used for connectivity and processing.
When I first saw the three-piece Orion setup at Connect, my first thought was that it was an interesting compromise in order to keep the glasses smaller. But after trying it all together, it really doesn’t feel like a compromise at all.
The glasses were a bit wider than my face. (Photo: Karissa Bell for Engadget)
You control Orion’s interface through a combination of eye tracking and gestures. After a quick calibration the first time you put the glasses on, you can navigate the AR apps and menus by glancing around the interface and tapping your thumb and index finger together. Meta has been experimenting with wrist-based neural interfaces for years, and Orion’s EMG wristband is the result of that work. The band, which feels like little more than a fabric watch band, uses sensors to detect the electrical signals that occur with even subtle movements of your wrist and fingers. Meta then uses machine learning to decode those signals and send them to the glasses.
That may sound complicated, but I was surprised by how intuitive the navigation felt. The combination of quick gestures and eye tracking felt much more precise than hand tracking controls I’ve used in VR. And while Orion also has hand-tracking abilities, it feels much more natural to quickly tap your fingers together than to extend your hands out in front of your face.
What it’s like to use Orion
Meta walked me through a number of demos meant to show off Orion’s capabilities. I asked Meta AI to generate an image, and to come up with recipes based on a handful of ingredients on a shelf in front of me. The latter is a trick I’ve also tried with the Ray-Ban Meta Smart Glasses, except with Orion, Meta AI was also able to project the recipe steps onto the wall in front of me.
I also answered a couple of video calls, including one from a surprisingly lifelike Codec Avatar. I watched a YouTube video, scrolled Instagram Reels, and dictated a response to an incoming message. If you’ve used mixed reality headsets, much of this will sound familiar, and a lot of it wasn’t that different from what you can do in VR headsets.
The magic of AR, though, is that everything you see is overlaid onto the world around you and your surroundings are always fully visible. I particularly appreciated this when I got to the gaming portion of the walkthrough. I played a few rounds of a Meta-created game called Stargazer, where players control a retro-looking spacecraft by moving their head to avoid incoming obstacles while shooting enemies with finger tap gestures. Throughout that game, and a subsequent round of AR Pong, I was able to easily keep up a conversation with the people around me while I played. As someone who easily gets motion sick from VR gaming, I appreciated that I never felt disoriented or less aware of my surroundings.
Orion’s displays rely on silicon carbide lenses, micro-LED projectors and waveguides. The actual lenses are clear, though they can dim depending on your environment. One of the most impressive aspects is the 70-degree field of view. It was noticeably wider and more immersive than what I experienced with Snap’s AR Spectacles, which have a 46-degree field of view. At one point, I had three windows open in one multitasking view: Instagram Reels, a video call and a messaging inbox. And while I was definitely aware of the outer limits of the display, I could easily see all three windows without physically moving my head or adjusting my position. It’s still not the all-encompassing AR of sci-fi flicks, but it was wide enough I never struggled to keep the AR content in view.
What was slightly disappointing, though, was the resolution of Orion’s visuals. At 13 pixels per degree, the colors all seemed somewhat muted and projected text was noticeably fuzzy. None of it was difficult to make out, but it was much less vivid than what I saw on Snap’s AR Spectacles, which have a 37 pixels per degree resolution.
Meta’s VP of Wearable Devices, Ming Hua, told me that one of the company’s top priorities is to increase the brightness and resolution of Orion’s displays. She said that there’s already a version of the prototype with twice the pixel density, so there’s good reason to believe this will improve over time. She’s also optimistic that Meta will eventually be able to bring down the costs of its AR tech, eventually reducing it to something “similar to a high end phone.”
What does it mean?
Leaving my demo at Meta’s headquarters, I was reminded of the first time I tried out a prototype of the wireless VR headset that would eventually become known as Quest, back in 2016. Called Santa Cruz at the time, it was immediately obvious, even to an infrequent VR user, that the wireless, room-tracking headset was the future of the company’s VR business. Now, it’s almost hard to believe there was a time when Meta’s headsets weren’t fully untethered.
Orion has the potential to be much bigger. Now, Meta isn’t just trying to create a more convenient form factor for mixed reality hobbyists and gamers. It’s offering a glimpse into how it views the future, and what our lives might look like when we’re no longer tethered to our phones.
For now, Orion is still just that: a glimpse. It’s far more complex than anything the company has attempted with VR. Meta still has a lot of work to do before that AR-enabled future can be a reality. But the prototype shows that much of that vision is closer than we think.
This article originally appeared on Engadget at https://www.engadget.com/ar-vr/metas-orion-prototype-offers-a-glimpse-into-our-ar-future-123038066.html?src=rss
Last month at Meta Connect, Mark Zuckerberg said that Meta AI was “on track” to become the most-used generative AI assistant in the world. The company has now passed a significant milestone toward that goal, with Meta AI passing the 500 million user mark, Zuckerberg revealed during the company’s latest earnings call.
The half-billion user mark comes barely a year after the social network first launched its AI assistant last fall. Zuckerberg said the company still expects Meta AI to become the “most-used” assistant by the end of 2024, though he’s never specified how the company is measuring that metric.
Meta’s assistant isn’t the only AI tool that’s boosting the company’s business. Zuckerberg said that AI improvements in its feed and video recommendations have led to an 8 percent increase in time spent on Facebook and a 5 percent increase for Instagram this year. Advertisers are also taking advantage of the company’s AI tools, he said, with more than 15 million ads created with generative AI in the last month alone. “We believe that there's a lot more upside here,” Zuckerberg said.
Outside of AI, Meta’s Threads app also continues to surge. The service now has “almost 275 million” monthly users, according to Zuckerberg. “It's been growing more than a million sign ups per day,” Zuckerberg said, adding that “engagement is growing too.”
This article originally appeared on Engadget at https://www.engadget.com/social-media/meta-ai-has-more-than-500-million-users-220353427.html?src=rss