Nissan’s interactive robots team up to make in-car parenting a breeze and favorite activity for babies onboard

Experts suggest that taking a fussy infant out for a drive can lull them to sleep and calm them down. But strapped into a rear-facing car seat, a child may not always be at their happiest, and you may not have all the control you'd like. To the rescue, Japanese auto giant Nissan has designed cute robots that promise to make in-car parenting a breeze and a favorite activity for your little one.

This is being made possible by Nissan through Iruyo: The Intelligent Puppet. The automaker has collaborated with compatriot Akachan Honpo, a baby goods retailer, to make Iruyo, a furry babysitter. It comes with a built-in range of sensors and communication devices, so parents get an affordable toy robot they can safely have around their baby.

Designer: Nissan x Akachan Honpo

To cut to the chase, Iruyo: The Intelligent Puppet is the moniker for a set of two similar-looking, state-of-the-art robotic companions that differ visually only in size and come in thoughtfully fashioned red with subtler pink and white colors. The little one, aptly called Baby Iruyo, is designed to sit near the driver – essentially in the cup holder – while the larger Iruyo rests on the backseat, facing the child in their car seat.

Baby Iruyo is programmed to understand voice commands, though only in Japanese for now. So, when the child gets cranky in the backseat, the parent can speak a consoling command such as “I’m here” or “play peek-a-boo,” and the little Iruyo relays it to the larger Iruyo facing the child, which then makes suitable gestures to keep the kid entertained.
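
Nissan hasn’t published technical details, but conceptually the relay boils down to the front unit recognizing a short Japanese phrase and forwarding a matching gesture command to the rear puppet. Here is a minimal sketch of that flow; the phrases, gestures, and transport function are all hypothetical, not Nissan’s actual design.

```python
# Hypothetical sketch of the Iruyo relay: the cup-holder unit maps a recognized
# parent phrase to a gesture and forwards it to the rear-seat puppet.
# The phrase-to-gesture table and send_to_rear() transport are assumptions.

PHRASE_TO_GESTURE = {
    "いるよ": "wave_hands",          # "I'm here"
    "いないいないばあ": "peekaboo",   # "play peek-a-boo"
}

def relay_command(recognized_phrase: str, send_to_rear) -> bool:
    """Forward a gesture to the rear Iruyo if the phrase is recognized."""
    gesture = PHRASE_TO_GESTURE.get(recognized_phrase.strip())
    if gesture is None:
        return False  # unknown phrase: do nothing rather than guess
    send_to_rear({"gesture": gesture})
    return True

# Example: relay_command("いるよ", send_to_rear=print) prints {'gesture': 'wave_hands'}
```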

Iruyo, in either avatar, is only a concept for now, but in early research the companion robot held the attention of 90 percent of babies, and half of those babies even showed improved moods with the robot by their side. This data is encouraging the makers to continue with trials and general experience sessions. One such event is slated for the Akachan Honpo store in LaLaport Yokohama on February 10 and 11.

The post Nissan’s interactive robots team up to make in-car parenting a breeze and favorite activity for babies onboard first appeared on Yanko Design.

Senate tells social media CEOs they have ‘blood on their hands’ for failing to protect children

The CEOs of Meta, Snap, Discord, X and TikTok testified at a high-stakes Senate Judiciary Committee hearing on child exploitation online. During the hearing, Mark Zuckerberg, Evan Spiegel, Jason Citron, Linda Yaccarino and Shou Chew spent nearly four hours being grilled by lawmakers about their records on child safety. 

The hearing was the first time Spiegel, Citron and Yaccarino testified to Congress. Notably, all three were subpoenaed by the committee after refusing to appear voluntarily, according to lawmakers. Judiciary Committee Chair Senator Dick Durbin noted that Citron “only accepted services of his subpoena after US Marshals were sent to Discord’s headquarters at taxpayers’ expense.”

The hearing room was filled with parents of children who had been victims of online exploitation on social media. Many members of the audience silently held up photos of their children as the CEOs entered the room, and Durbin kicked off the hearing with a somber video featuring victims of child exploitation and their parents.

“Discord has been used to groom, abduct and abuse children,” Durbin said. “Meta’s Instagram helped connect and promote a network of pedophiles. Snapchat’s disappearing messages have been co-opted by criminals who financially extort young victims. TikTok has become a quote platform of choice for predators to access, engage and groom children for abuse. And the prevalence of CSAM on X has grown as the company has gutted its trust and safety workforce.”

During the hearing, many of the senators shared personal stories of parents whose children had died by suicide after being exploited online. "Mr. Zuckerberg, you and the companies before us — I know you don't mean it to be so — but you have blood on your hands," Senator Lindsey Graham said in his opening remarks. The audience applauded. 

While years of similar hearings have so far failed to produce any new laws, there is growing bipartisan support in Congress for new safety regulations. As Tech Policy Press points out, there are currently more than half a dozen bills dealing with children's online safety that have been proposed by senators. These include the Kids Online Safety Act (KOSA), which would require platforms to create more parental control and safety features and submit to independent audits, and COPPA 2.0 (the Children and Teens' Online Privacy Protection Act), a revised version of the 1998 Children's Online Privacy Protection Act, which would bar companies from collecting or monetizing children’s data without consent.

Senators have also proposed a number of bills to address child exploitation, including the EARN IT Act, currently in its third iteration since 2020, and the STOP CSAM Act. None of these have advanced to the Senate floor for a vote. Many of these bills have faced intense lobbying from the tech industry, though some companies in attendance said they were open to some bills and some aspects of the legislation.

Spiegel said that Snap supports KOSA. Yaccarino said X supports the STOP CSAM Act. Chew and Citron both declined to specifically endorse the bills they were asked about, but said they were open to more discussions.

Zuckerberg suggested a different approach, saying he supported age verification and parental control requirements at the app store level, which would effectively shift the burden to Apple and Google. "Apple already requires parental consent when a child does a payment with an app, so it should be pretty trivial to pass a law that requires them to make it so parents have control anytime a child downloads an app,” Zuckerberg said.

Meta has come under increased pressure in recent months following a lawsuit from 41 states for harming teens’ mental health. Court documents from the suit allege that Meta turned a blind eye to children under 13 using its service, did little to stop adults from sexually harassing teens on Facebook and that Zuckerberg personally intervened to stop an effort to ban plastic surgery filters on Instagram.

Unsurprisingly, Zuckerberg came under particular scrutiny during the hearing. In one awkward exchange, Senator Graham asked Zuckerberg if the parents of a child who died by suicide after falling victim to a sextortion scheme should be able to sue Meta. Zuckerberg, looking uncomfortable, paused and said “I think that they can sue us.”

Later, Senator Josh Hawley pressed the Meta founder on whether he would personally apologize to the parents in the hearing room. Zuckerberg stood up and faced the audience. "I’m sorry for everything you have all been through," he said. "No one should go through the things that your families have suffered and this is why we invest so much and we are going to continue doing industry-wide efforts to make sure no one has to go through the things your families have had to suffer."

Spiegel was also asked to directly address parents. “Mr. Spiegel, there are a number of parents who have children who have been able to access illegal drugs on your platform, what do you say to those parents,” Senator Laphonza Butler asked the Snap founder. “I’m so sorry,” he said.

As with many past hearings featuring tech CEOs, some lawmakers strayed off topic. Multiple senators pressed Chew on TikTok’s relationship with China, as well as its handling of content moderation during the Israel-Hamas war. Senator Tom Cotton repeatedly asked TikTok's CEO about his citizenship (Chew is Singaporean). 

There were also some bizarre moments, like when Senator John Kennedy asked Spiegel if he knew the meaning of “yada yada yada” (Spiegel claimed he was “not familiar” with the phrase). “Can we agree … what you do is what you believe and everything else is just cottage cheese,” Kennedy asked.

During the hearing, many of the companies touted their existing safety features and parental controls (Meta launched several updates in the lead-up to the hearing). Yaccarino, who repeatedly claimed that X was a “brand new company,” said X was considering adding parental controls. “Being a 14-month-old company we have reprioritized child protection and safety measures,” she said. “And we have just begun to talk about and discuss how we can enhance those with parental controls.”

In the US, the National Suicide Prevention Lifeline is 1-800-273-8255 or you can simply dial 988. Crisis Text Line can be reached by texting HOME to 741741 (US), 686868 (Canada), or 85258 (UK). Wikipedia maintains a list of crisis lines for people outside of those countries.

This article originally appeared on Engadget at https://www.engadget.com/senate-tells-social-media-ceos-they-have-blood-on-their-hands-for-failing-to-protect-children-170411884.html?src=rss

Proposed California bill would let parents block algorithmic social feeds for children

California will float a pair of bills designed to protect children from social media addiction and preserve their private data. The Protecting Youth from Social Media Addiction Act (SB 976) and California Children’s Data Privacy Act (AB 1949) were introduced Monday by the state’s Attorney General Rob Bonta, State Senator Nancy Skinner and Assemblymember Buffy Wicks. The proposed legislation follows a CA child safety bill that was set to go into effect this year but is now on hold.

SB 976 could give parents the power to remove addictive algorithmic feeds from their children’s social channels. If passed, it would allow parents of children under 18 to choose between the default algorithmic feed — typically designed to create profitable addictions — and a less habit-forming chronological one. It would also let parents block all social media notifications and prevent their kids from accessing social platforms during nighttime and school hours.

 “Social media companies have designed their platforms to addict users, especially our kids. Countless studies show that once a young person has a social media addiction, they experience higher rates of depression, anxiety, and low self-esteem,” California Senator Nancy Skinner (D-Berkeley) wrote in a press release. “We’ve waited long enough for social media companies to act. SB 976 is needed now to establish sensible guardrails so parents can protect their kids from these preventable harms.”

Image: California AG Rob Bonta, State Senator Nancy Skinner and Assemblymember Buffy Wicks standing at a podium in a classroom (The Office of Nancy Skinner)

Meanwhile, AB 1949 would attempt to strengthen data privacy for CA children under 18. The bill’s language gives the state’s consumers the right to know what personal information social companies collect and sell and allows them to prevent the sale of their children’s data to third parties. Any exceptions would require “informed consent,” which must be from a parent for children under 13.

In addition, AB 1949 would close loopholes in the California Consumer Privacy Act (CCPA) that fail to protect the data of 17-year-olds effectively. The CCPA reserves its most robust protections for those under 16.

“This bill is a crucial step in our work to close the gaps in our privacy laws that have allowed tech giants to exploit and monetize our kids’ sensitive data with impunity,” wrote Wicks (D-Oakland).

The bills may be timed to coincide with a US Senate hearing (with five Big Tech CEOs in tow) on Wednesday covering children’s online safety. In addition, California is part of a 41-state coalition that sued Meta in October for harming children’s mental health. The Wall Street Journal reported in 2021 that internal Meta (Facebook at the time) documents described “tweens” as “a valuable but untapped audience.”

This article originally appeared on Engadget at https://www.engadget.com/proposed-california-bill-would-let-parents-block-algorithmic-social-feeds-for-children-220132956.html?src=rss

TikTok gives parents even more control over what their teens see

TikTok has recently faced scrutiny over child safety issues in the US and elsewhere due to its youth-skewing userbase and reams of inappropriate content on the platform. Now, the company (owned by China's ByteDance) has announced that it's giving parents more control over what their teens can see. It's adding new content filtering controls to its "Family Pairing" feature, letting parents filter out videos containing specific words or hashtags — while still keeping kids in the loop. 

TikTok introduced Family Pairing back in 2020 as a way to let parents link directly to their kids' accounts then remotely disable direct messages, set screen time limits and enable a "restricted content" mode. And last year, it added a tool that automatically filters out videos with words or hashtags users may not want to see in their For You or Following feeds. 

The new controls essentially combine those two features, giving parents the option to remotely filter out videos with specific words or hashtags from their kids' For You or Following feeds. "We're bringing this [content filtering] tool to Family Pairing to empower caregivers to help reduce the likelihood of their teen viewing content they may uniquely find jarring," TikTok wrote.
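
TikTok hasn't documented how the filter works internally, but the behavior described — hiding videos whose captions or hashtags match a caregiver's keyword list — amounts to a simple match-and-exclude pass over candidate feed items. Below is a rough sketch under that assumption; the data shapes and function names are illustrative, not TikTok's API.

```python
# Illustrative keyword/hashtag filter in the spirit of Family Pairing's
# content filtering. Video dicts and field names are assumptions for the sketch.

def filter_feed(videos, blocked_keywords):
    """Drop videos whose caption or hashtags contain any blocked keyword."""
    blocked = {kw.lower().lstrip("#") for kw in blocked_keywords}

    def is_blocked(video):
        caption = video.get("caption", "").lower()
        hashtags = {tag.lower().lstrip("#") for tag in video.get("hashtags", [])}
        return any(kw in caption for kw in blocked) or bool(blocked & hashtags)

    return [v for v in videos if not is_blocked(v)]

# Example:
# filter_feed([{"caption": "fun dance", "hashtags": ["#dance"]}], ["diet"])
# keeps the video; adding "dance" to the blocked list would remove it.
```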

Image: TikTok's latest tool gives parents more control over what their teens can see (TikTok)

At the same time, kids will be alerted to their parents' selected filters and can choose not to opt-in, the company told Sky News. "By default, teens can view the keywords their caregiver has added and we believe this transparency can also help to prompt conversations about online boundaries and safety," the company wrote. "We also wanted to make sure we respect young people's right to participate."

At the same time, TikTok announced that it will form a global Youth Council later this year. The aim, it said, will be to "listen to the experiences of those who directly use our platform and be better positioned to make changes to create the safest possible experience for our community."

TikTok has been criticized for exposing children to videos showing self-harm, eating disorders and other inappropriate content, often disguised by slightly altered hashtags designed to bypass moderation. The company is facing new content regulations in the UK via the Online Safety Bill, and US lawmakers are working on a Kids Online Safety Act that would force social media companies like TikTok to add online safeguards for children. TikTok was recently banned in Montana, but the company is suing the state on the grounds that the ban violates the First Amendment and other laws. 

This article originally appeared on Engadget at https://www.engadget.com/tiktok-gives-parents-even-more-control-over-what-their-teens-see-093558339.html?src=rss

Senators reintroduce COPPA 2.0 bill to tighten child safety online

Yet more senators are trying to resurrect legislation aimed at protecting kids' online privacy. Senators Bill Cassidy and Ed Markey have reintroduced a "COPPA 2.0" (Children and Teens' Online Privacy Protection Act) bill that would expand and revise the 1998 law to deal with the modern internet, particularly social media.

COPPA 2.0 would bar companies from gathering personal data from teens aged 13 to 16 without their consent. It would ban all targeted advertising to children and teens, and create a "bill of rights" that limits personal info gathering for marketing purposes. The measure would also require a button to let kids and parents delete personal data when it's "technologically feasible."

The sequel potentially makes it easier to take action in the first place. Where COPPA requires direct knowledge that companies are collecting data from kids under 13, 2.0 would cover apps and services that are "reasonably likely" to have children as users. The Federal Trade Commission, meanwhile, would have to establish a division committed to regulating youth marketing and privacy.

Cassidy and Markey portray the bill as necessary to tackle a "mental health crisis" where tech giants allegedly play a role. The politicians argue that social networks amplify teens' negative feelings, pointing to Facebook's own research as evidence.

Social networks have tried to clamp down on misuses of child data. Meta's Facebook and Instagram have limited ad targeting for teens, for instance. However, there have also been concerns that online platforms haven't gone far enough. On top of earlier calls for bans on ad targeting, states like Arkansas and Utah have already passed laws respectively requiring age verification and parental permission for social media. Another Senate bill, the Protecting Kids on Social Media Act, would require parents' approval across the US.

Whether or not COPPA 2.0 makes it to the President's desk for signature isn't clear. The first attempt got stuck in committee ahead of the current Congress session. It also comes right as other senators are making attempts to revive the EARN IT Act (aimed at curbing child sexual abuse material) and the Kids Online Safety Act (meant to fight toxic online content as a whole). All three reintroductions are bipartisan, but they'll need considerably stronger support in the Senate, plus successful equivalents in the House, to become law.

This article originally appeared on Engadget at https://www.engadget.com/senators-reintroduce-coppa-20-bill-to-tighten-child-safety-online-165043087.html?src=rss

Sorry, but you still have to push this $3,800 electric-assist stroller

Non-parents may not believe it, but pushing a pram around can be a fairly strenuous task, especially when the terrain gets rough. It’s a full-body workout to push two kids under four in my old Uppababy Vista, which weighed the same as an iceberg and had the turning circle of the Titanic. To remedy this, Canadian startup GlüxKind has developed an electrically assisted stroller that’ll make pushing easier, and can even drive itself, albeit only when your kid isn’t on board.

The GlüxKind Ella is the brainchild of Anne Hunger and Kevin Huang, a couple who were less than whelmed when looking for a stroller for their own daughter. They decided to build their own device by strapping an electric skateboard to a regular stroller, and started developing their product from there. The device has three modes, the first of which is to add electric assist to the wheels as you’re pushing it around.

Trying this in an admittedly limited demo at CES, it feels very much like the sort of power boost you get with an e-bike. You still have to push this thing around, but you only have to make a fairly meager level of effort before the motor kicks in and helps you out. As well as easier forward motion, you’ll also find turning to be a lot snappier than you may expect, useful too when you’re trying to maneuver your rugrat in tight spaces. It’ll also prove useful when going uphill, or if you’re carrying lots of groceries in Ella’s surprisingly large cargo space.

I’m told that the battery will last for around eight hours of mixed use, and you’ll need to charge it at the end of every day, more or less.

You can also set the pram to rock your baby to sleep, moving backwards and forwards by about a foot. This, I’m sure, will be a godsend to parents who are otherwise praying for divine intervention at 3am as their precious child refuses to sleep. I’m aware that there are some safety caveats about using such a feature on a regular basis, but being able to call on the feature in a pinch will surely be an instant-sell to some harangued parents.

The last mode, and the most eye-catching, is self-driving, where the stroller will drive ahead of you by a couple of feet. It’ll maintain power when going uphill, and brake so it stays close to you when you’re going down the other side. But crucially, the system is designed to not work if you put your kid in the seat and expect the pram to do all of the work. A weight sensor in the bassinet and stroller chair will block the function if it detects the presence of a child.
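
GlüxKind hasn't said how the interlock is implemented, but the logic it describes is a straightforward safety gate: self-driving only engages when both weight sensors read empty. A toy sketch of that check follows; the threshold value and function names are assumptions, not GlüxKind specs.

```python
# Toy sketch of a weight-sensor interlock for the self-driving mode.
# The 1.0 kg threshold and sensor interface are assumptions, not GlüxKind specs.

EMPTY_THRESHOLD_KG = 1.0  # anything heavier is treated as an occupant

def self_drive_allowed(bassinet_kg: float, seat_kg: float) -> bool:
    """Allow hands-free driving only when both positions read empty."""
    return bassinet_kg < EMPTY_THRESHOLD_KG and seat_kg < EMPTY_THRESHOLD_KG

# Example: self_drive_allowed(0.2, 4.8) -> False, so the stroller stays in
# push-assist mode while the child is on board.
```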

A product like this is, understandably, going to be at the higher end of the price scale, and when it hits Kickstarter this spring, the first 100 units will set you back $3,800. Once that early bird special is done with, the price is likely to climb a little higher, but for that you’ll also get built-in GPS so you can track where your pram is if you’ve asked friends and family to babysit. GlüxKind also has plans to build out a community feature to find and connect like-minded parents — the sort who are also prepared to spend north of four grand on a self-driving stroller.

Snapchat Family Center shows parents their children’s friends list

Snapchat has launched a parental control portal that allows parents to keep an eye on who their young teenagers have been chatting with. The new in-app feature called Family Center shows parents their kids' friends list, as well as who they've messaged in the last seven days. Take note that parents can only see who their teens have been talking to, but they won't be able to read their chat history. Snap says the center was designed to "reflect the way... parents engage with their teens in the real world" in that they know (for the most part) who their kids have been hanging out with but don't listen in on their conversations.

In addition, parents can confidentially report accounts they think might be violating Snap's rules straight from the Family Center. Back in January, Snapchat changed its friend recommendation feature following calls for increased safety on the app by making it harder for adults to connect with teen users: In particular, it stopped showing accounts owned by 13-to-17-year-old users in Quick Add. Teens also can't have public profiles and have to be mutual friends to be able to communicate with each other. Plus, their accounts will only show up in search results under certain circumstances, such as if the one searching has a mutual friend with them.

Snap promised last year to launch new parental controls and other features designed to protect underage users on its service. The company revealed its plans in a hearing wherein lawmakers put the pressure on social networks and apps that cater to teens, such as Snapchat and TikTok, to do more to protect children on their platforms. 

Family Center is completely voluntary, and teens can always leave the portal if they want — they'll even be given the choice to accept or ignore a parent's invitation to join. And since the feature was made for underage teens, users who turn 18 will automatically be removed from the tool.

The company plans to roll out more features for the Family Center on top of what it already has. In the coming weeks, it will let parents easily see the newest friends their teens have added. And over the next months, Snap will add content controls for parents, as well as the ability for teens to notify their parents whenever they report an account or a piece of content.

Instagram is getting ‘parental supervision’ features

Meta is introducing new “parental supervision” features for Instagram and virtual reality. The update will be available first for Instagram, which has faced a wave of scrutiny for its impact on teens and children, with new parental controls coming to Quest headsets over the next few months.

On Instagram, the controls will be part of a new “Family Center,” where parents can set time limits and access information about their teen’s activity on the app. For now, parents will be able to see a list of accounts their teen is following, as well as which accounts follow them. Parents will also be notified if their teen reports another user.

Notably, the update is for now only available in the United States, and parents will only be able to access the parental control features if the teens “initiate supervision” within the app themselves. Teens will also need to approve any requests for supervision. “Over the next few months we’ll add additional features, including letting parents set the hours during which their teen can use Instagram, and the ability for more than one parent to supervise a teen’s account,” Instagram Head Adam Mosseri writes in a blog post.

Image: Parents will be able to access account information of their teens (Instagram)

The new features, which were first promised back in December, arrive after Instagram was forced to “pause” work on a dedicated app for kids younger than 13 after a whistleblower disclosed internal research documenting Instagram’s impact on teens’ mental health. The disclosures prompted lawmakers to push Meta to end work on Instagram Kids entirely. So far, Meta executives have declined to do so.

Mosseri said the company also plans to add similar parental control features to its Quest headsets so parents can also set limits on their children’s activities in virtual reality. Those features, which won’t launch for a few more months, will enable parents to restrict VR content rated for ages 13 and up and set other limits on VR purchases. Meta is also working on a “Parent Dashboard” for the Oculus app so parents can keep tabs on what their children are watching and how much time they are spending in VR.