What we watched: Bluey’s joyful finales

It’s a strange way to recommend a comedy to say it makes you weep, but somehow Bluey, a comedy for kids, feels more real and more truthful than anything else on TV. I see so much of myself in Bandit’s triumphs and failures as he tries to parent his two daughters. I nod along to all of his unsuccessful parenting tactics that, I’ll admit, I’ve also tried on my own two kids. And then, at the end of so many episodes, I’ll realize that the front of my t-shirt is wet because I’ve been crying.

There can’t be many people unfamiliar with Bluey, the biggest kids’ TV series on the planet, if not the biggest series overall. Each seven-minute episode is a slice-of-life sitcom about the Heelers, a family of anthropomorphic dogs living in Brisbane, Australia. Bluey and her younger sister Bingo live with parents Bandit and Chilli. The show started out focused on the playtimes the kids would have with each other or their parents. But it quickly sprawled out to create a rich world in the vein of The Simpsons, with a whole city’s worth of storylines. It can now regularly relegate the Heelers to the background to focus on the show’s deep cast of characters.

It closed out its third season with last Sunday’s “The Sign,” a (comparatively) epic 28-minute episode, and this week with “Surprise,” a sweet little postscript. The former’s long running time was described as a dry run for any potential Bluey movie, and it wraps up a number of the show’s storylines. It focuses on a wedding taking place at the Heelers’ home in the shadow of the family’s plan to relocate to another city. I won’t spoil too much beyond saying “The Sign” is a story about the bigness of change and how that affects parents and kids alike. Much of it focuses on Bandit’s decision to move for a better-paid job and the way that impacts Chilli and the two girls. It’s a complicated issue, especially because it highlights that parents often just want to do what’s best for their kids.

This is a screencap from 'Ghostbasket' but there was no way I was going to pass up an opportunity to post a picture of Bluey and Bingo as their granny characters.
Ludo Studio

“Surprise,” meanwhile, focuses more on the mundane struggle of Bandit trying to play two different games with his daughters at the same time. Much as Bluey wants to be just seven minutes of silly fun, it can’t quite help but be honest about the emotional and physical labor of parenting. All Bandit wants to do is sit down and watch sport on the TV but his daughters won’t allow him that luxury. He’s chased around the house, forced to pretend to teach a tennis ball to ride a bike and then pelted with ping pong balls fired from a toy launcher. (Bluey’s happy to highlight how often Bandit will get hit in the groin as a consequence of whatever game the girls are playing.)

The payoff to all of that effort comes in the final half minute of the episode, which is when I started sobbing. As much as it may be pitched as a palate cleanser after the scale and emotional heft of the previous episode, the final moments offer a real (if pleasant) punch to the gut. I can’t help but see plenty of parallels between Bluey’s life and that of my own (similarly aged) daughter, and I feel a lot of kinship with Bandit as well. If I’m one one-hundredth as good a parent as this silly cartoon dog who often gets it wrong, then I’ll feel like I’ve done a good job.

There’s been speculation that this third season may be the end for Bluey. Bloomberg reported on the uncertainty around creator Joe Brumm’s future with the show, although producer Sam Moor has said it will continue in some form. Any delay would also risk the child actors, who remain anonymous for their own safety, aging out of being able to play their roles. But in many ways, Bluey can’t not continue given that the show is now a multibillion-dollar cash cow for the BBC, which owns a big chunk of the show’s rights.

I don’t want to say goodbye to Bluey and the Heelers, and I’d prefer they kept the cast as-is and let the girls grow up on screen alongside Bandit and Chilli. That, to me, would be an honest thing to do, rather than indulging in the fakery that dogs so many TV shows facing this problem. But if they have to go, I’ll choose to remember Bluey’s three perfect seasons through the highs and lows of parenting.

This article originally appeared on Engadget at https://www.engadget.com/what-we-watched-blueys-joyful-finales-161527282.html?src=rss

Tinder is making it easier to share date details with family and friends

Tinder has revealed a feature that both helps users share their excitement about a date with loved ones and acts as a safety tool. The Share My Date feature lets users share details about a planned date with a single link.

The URL points to a page with details including the location, date and time of the rendezvous, along with a photo of your match and a link to their profile. The page can include some notes too, and you can edit your date plans so those you share the link with always have the most up-to-date info. Dates can be set in the app up to 30 days in advance, and if you’re lucky enough to have a bunch of matches you make IRL plans with, you can create an unlimited number of dates and share those with your loved ones.

Tinder says that around 51 percent of users under 30 already share date details with their friends, while 19 percent of users do so with their mom. It's always a good idea to let someone know where and when you're going on a date and details about the person you're meeting up with, just to be safe. Share My Date could simplify the process a bit. Back in 2020, Match.com debuted a date check-in feature that let users send details about their date to emergency contacts if things weren't going well.

Tinder will roll out Share My Date over the coming months. It'll be available in the US, UK, Australia, Canada, Singapore, India, Ireland, Germany, France, Spain, Japan, Brazil, Switzerland, Mexico, Netherlands, Italy, Korea, Vietnam and Thailand.

This article originally appeared on Engadget at https://www.engadget.com/tinder-is-making-it-easier-to-share-date-details-with-family-and-friends-040105977.html?src=rss

Senate tells social media CEOs they have ‘blood on their hands’ for failing to protect children

The CEOs of Meta, Snap, Discord, X and TikTok testified at a high-stakes Senate Judiciary Committee hearing on child exploitation online. During the hearing, Mark Zuckerberg, Evan Spiegel, Jason Citron, Linda Yaccarino and Shou Chew spent nearly four hours being grilled by lawmakers about their records on child safety. 

The hearing was the first time Spiegel, Citron and Yaccarino testified to Congress. Notably, all three were subpoenaed by the committee after refusing to appear voluntarily, according to lawmakers. Judiciary Committee Chair Senator Dick Durbin noted that Citron “only accepted service of his subpoena after US Marshals were sent to Discord’s headquarters at taxpayers’ expense.”

The hearing room was filled with parents of children who had been victims of online exploitation on social media. Many members of the audience silently held up photos of their children as the CEOs entered the room, and Durbin kicked off the hearing with a somber video featuring victims of child exploitation and their parents.

“Discord has been used to groom, abduct and abuse children,” Durbin said. “Meta’s Instagram helped connect and promote a network of pedophiles. Snapchat’s disappearing messages have been co-opted by criminals who financially extort young victims. TikTok has become a quote platform of choice for predators to access, engage and groom children for abuse. And the prevalence of CSAM on X has grown as the company has gutted its trust and safety workforce.”

During the hearing, many of the senators shared personal stories of parents whose children had died by suicide after being exploited online. "Mr. Zuckerberg, you and the companies before us — I know you don't mean it to be so — but you have blood on your hands," Senator Lindsey Graham said in his opening remarks. The audience applauded. 

While years of similar hearings have so far failed to produce any new laws, there is growing bipartisan support in Congress for new safety regulations. As Tech Policy Press points out, there are currently more than half a dozen bills dealing with children's online safety that have been proposed by senators. These include the Kids Online Safety Act (KOSA), which would require platforms to create more parental control and safety features and submit to independent audits, and COPPA 2.0, a revised version of the 1998 Children and Teens' Online Privacy Protection Act, which would bar companies from collecting or monetizing children’s data without consent.

Senators have also proposed a number of bills to address child exploitation, including the EARN IT Act, currently in its third iteration since 2020, and the STOP CSAM Act. None of these have advanced to the Senate floor for a vote. Many of these bills have faced intense lobbying from the tech industry, though some companies in attendance said they were open to some bills and some aspects of the legislation.

Spiegel said that Snap supports KOSA. Yaccarino said X supports the STOP CSAM Act. Chew and Citron both declined to specifically endorse the bills they were asked about, but said they were open to more discussions.

Zuckerberg suggested a different approach, saying he supported age verification and parental control requirements at the app store level, which would effectively shift the burden to Apple and Google. "Apple already requires parental consent when a child does a payment with an app, so it should be pretty trivial to pass a law that requires them to make it so parents have control anytime a child downloads an app,” Zuckerberg said.

Meta has come under increased pressure in recent months following a lawsuit from 41 states alleging the company harmed teens’ mental health. Court documents from the suit allege that Meta turned a blind eye to children under 13 using its service, did little to stop adults from sexually harassing teens on Facebook and that Zuckerberg personally intervened to stop an effort to ban plastic surgery filters on Instagram.

Unsurprisingly, Zuckerberg came under particular scrutiny during the hearing. In one awkward exchange, Senator Graham asked Zuckerberg if the parents of a child who died by suicide after falling victim to a sextortion scheme should be able to sue Meta. Zuckerberg, looking uncomfortable, paused and said “I think that they can sue us.”

Later, Senator Josh Hawley pressed the Meta founder on whether he would personally apologize to the parents in the hearing room. Zuckerberg stood up and faced the audience. "I’m sorry for everything you have all been through," he said. "No one should go through the things that your families have suffered and this is why we invest so much and we are going to continue doing industry-wide efforts to make sure no one has to go through the things your families have had to suffer."

Spiegel was also asked to directly address parents. “Mr. Spiegel, there are a number of parents who have children who have been able to access illegal drugs on your platform. What do you say to those parents?” Senator Laphonza Butler asked the Snap founder. “I’m so sorry,” he said.

As with many past hearings featuring tech CEOs, some lawmakers strayed off topic. Multiple senators pressed Chew on TikTok’s relationship with China, as well as its handling of content moderation during the Israel-Hamas war. Senator Tom Cotton repeatedly asked TikTok's CEO about his citizenship (Chew is Singaporean). 

There were also some bizarre moments, like when Senator John Kennedy asked Spiegel if he knew the meaning of “yada yada yada” (Spiegel claimed he was “not familiar” with the phrase). “Can we agree … what you do is what you believe and everything else is just cottage cheese,” Kennedy asked.

During the hearing, many of the companies touted their existing safety features and parental controls (Meta launched several updates in the lead-up to the hearing). Yaccarino, who repeatedly claimed that X was a “brand new company,” said X was considering adding parental controls. “Being a 14-month-old company, we have reprioritized child protection and safety measures,” she said. “And we have just begun to talk about and discuss how we can enhance those with parental controls.”

In the US, the National Suicide Prevention Lifeline is 1-800-273-8255 or you can simply dial 988. Crisis Text Line can be reached by texting HOME to 741741 (US), 686868 (Canada), or 85258 (UK). Wikipedia maintains a list of crisis lines for people outside of those countries.

This article originally appeared on Engadget at https://www.engadget.com/senate-tells-social-media-ceos-they-have-blood-on-their-hands-for-failing-to-protect-children-170411884.html?src=rss

Proposed California bill would let parents block algorithmic social feeds for children

California will float a pair of bills designed to protect children from social media addiction and preserve their private data. The Protecting Youth from Social Media Addiction Act (SB 976) and California Children’s Data Privacy Act (AB 1949) were introduced Monday by the state’s Attorney General Rob Bonta, State Senator Nancy Skinner and Assemblymember Buffy Wicks. The proposed legislation follows a CA child safety bill that was set to go into effect this year but is now on hold.

SB 976 could give parents the power to remove addictive algorithmic feeds from their children’s social channels. If passed, it would allow parents of children under 18 to choose between the default algorithmic feed — typically designed to create profitable addictions — and a less habit-forming chronological one. It would also let parents block all social media notifications and prevent their kids from accessing social platforms during nighttime and school hours.

“Social media companies have designed their platforms to addict users, especially our kids. Countless studies show that once a young person has a social media addiction, they experience higher rates of depression, anxiety, and low self-esteem,” California Senator Nancy Skinner (D-Berkeley) wrote in a press release. “We’ve waited long enough for social media companies to act. SB 976 is needed now to establish sensible guardrails so parents can protect their kids from these preventable harms.”

L to R: California AG Rob Bonta, CA State Senator Nancy Skinner and Assemblymember Buffy Wicks standing at a podium in a classroom.
L to R: California AG Rob Bonta, State Senator Nancy Skinner and Assemblymember Buffy Wicks
The Office of Nancy Skinner

Meanwhile, AB 1949 would attempt to strengthen data privacy for CA children under 18. The bill’s language gives the state’s consumers the right to know what personal information social companies collect and sell and allows them to prevent the sale of their children’s data to third parties. Any exceptions would require “informed consent,” which must be from a parent for children under 13.

In addition, AB 1949 would close loopholes in the California Consumer Privacy Act (CCPA) that fail to protect the data of 17-year-olds effectively. The CCPA reserves its most robust protections for those under 16.

“This bill is a crucial step in our work to close the gaps in our privacy laws that have allowed tech giants to exploit and monetize our kids’ sensitive data with impunity,” wrote Wicks (D-Oakland).

The bills may be timed to coincide with a US Senate hearing (with five Big Tech CEOs in tow) on Wednesday covering children’s online safety. In addition, California is part of a 41-state coalition that sued Meta in October for harming children’s mental health. The Wall Street Journal reported in 2021 that internal Meta (Facebook at the time) documents described “tweens” as “a valuable but untapped audience.”

This article originally appeared on Engadget at https://www.engadget.com/proposed-california-bill-would-let-parents-block-algorithmic-social-feeds-for-children-220132956.html?src=rss

Sundance documentary Eternal You shows how AI companies are ‘resurrecting’ the dead

A woman has a text chat with her long-dead lover. A family gets to hear a deceased elder speak again. A mother gets another chance to say goodbye to her child, who died suddenly, via a digital facsimile. This isn't a preview of the next season of Black Mirror — these are all true stories from the Sundance documentary Eternal You, a fascinating and frightening dive into tech companies using AI to digitally resurrect the dead.

It's yet another way modern AI, which includes large language models like ChatGPT and similar bespoke solutions, has the potential to transform society. And as Eternal You shows, the AI afterlife industry is already having a profound effect on its early users.

The film opens on a woman having a late night text chat with a friend: "I can't believe I'm trying this, how are you?" she asks, as if she's using the internet for the first time. "I'm okay. I'm working, I'm living. I'm... scared," her friend replies. When she asks why, they reply, "I'm not used to being dead."

Christi Angel, speaking to an AI model of her friend via Project December, in the documentary Eternal You
Beetz Brothers Film Production

It turns out the woman, Christi Angel, is using the AI service Project December to chat with a simulation of her first love, Cameroun, who died many years ago. Angel is clearly intrigued by the technology, but as a devout Christian, she's also a bit spooked by the prospect of raising the dead. The AI system eventually gives her some reasons to be concerned: Cameroun reveals that he's not in heaven, as she assumes. He's in hell.

"You're not in hell," she writes back. "I am in hell," the AI chatbot insists. The digital Cameroun says he's in a "dark and lonely" place where his only companions are "mostly addicts." The chatbot goes on to say he's currently haunting a treatment center and later suggests "I'll haunt you." That was enough to scare Angel and make her question why she was using this service in the first place.

While Angel was aware she was talking to a digital recreation of Cameroun, one based on the information she provided to Project December, she interacted with the chatbot as if she were actually chatting with him on another plane of existence. That's a situation many users of AI resurrection services will likely encounter: Your emotional response can easily overwhelm rationality while "speaking" with a dead loved one, even if the conversation is just occurring over text.

In the film, MIT sociologist Sherry Turkle suggests that our current understanding of how AI affects people is similar to our relationship with social media over a decade ago. That makes it a good time to ask questions about the human values and purposes it's serving, she says. If we had a clearer understanding of social media early on, maybe we could have pushed Facebook and Twitter to confront misinformation and online abuse more seriously. (Perhaps the 2016 election would have looked very different if we were aware of how other countries could weaponize social media.)

A series of cameras captures a person's expression in Eternal You
Beetz Brothers Film Production

Eternal You also introduces us to Joshua Barbeau, a freelance writer who became a bit of an online celebrity in 2021 when The San Francisco Chronicle reported on his Project December chatbot: a digital version of his ex-fiancee Jessica. At first, he used Project December to chat with pre-built bots, but he eventually realized he could use the underlying technology (GPT-3, at the time) to create one with Jessica's personality. Their conversations look natural and clearly comfort Barbeau. But we're still left wondering if chatting with a facsimile of his dead fiancee is actually helping Barbeau to process his grief. It could just as easily be seen as a crutch that he feels compelled to pay for.

It's also easy to be cynical about these tools, given what we see from their creators in the film. We meet Jason Rohrer, the founder of Project December and a former indie game designer, who comes across as a typical techno-libertarian.

"I believe in personal responsibility," he says, after admitting that he's not exactly in control of the AI models behind Project December, and right before we see him nearly crash a drone into his co-founder's face. "I believe that consenting adults can use that technology however they want and they're responsible for the results of whatever they're doing. It's not my job as the creator of the technology to prevent the technology from being released, because I'm afraid of what somebody might do with it."

But, as MIT's Turkle points out, reanimating the dead via AI introduces moral questions that engineers like Rohrer likely aren't considering. "You're dealing with something much more profound in the human spirit," she says. "Once something is constituted enough that you can project onto it, this life force. It's our desire to animate the world, which is human, which is part of our beauty. But we have to worry about it, we have to keep it in check. Because I think it's leading us down a dangerous path."

Creating virtual models in Eternal You
Beetz Brothers Film Production

Another service, Hereafter.ai, lets users record stories to create a digital avatar of themselves, which family members can talk to now or after they die. One woman was eager to hear her father's voice again, but when she presented the avatar to her family, the reaction was mixed. Younger folks seemed intrigued, but the older generation didn't want any part of it. "I fear that sometimes we can go too far with technology," her father's sister said. "I would just love to remember him as a person who was wonderful. I don't want my brother to appear to me. I'm satisfied knowing he's at peace, he's happy, and he's enjoying the other brothers, his mother and father."

YOV, an AI company that also focuses on personal avatars, or "Versonas," wants people to have seamless communication with their dead relatives across multiple channels. But, like all of these other digital afterlife companies, it runs into the same moral dilemmas. Is it ethical to digitally resurrect someone, especially if they didn't agree to it? Is the illusion of speaking to the dead more helpful or harmful for those left behind?

The most troubling sequence in Eternal You focuses on a South Korean mother, Jang Ji-sun, who lost her young child and remains wracked with guilt about not being able to say goodbye. She ended up being the central subject in a VR documentary, Meeting You, which was broadcast in South Korea in early 2020. She went far beyond a mere text chat: Jang donned a VR headset and confronted a startlingly realistic model of her child in virtual reality. The encounter was clearly moving for Jang, and the documentary received plenty of media attention at the time.

"There's a line between the world of the living and the world of the dead," said Kim Jong-woo, the producer behind Meeting You. "By line, I mean the fact that the dead can't come back to life. But people saw the experience as crossing that line. After all, I created an experience in which the beloved seemed to have returned. Have I made some huge mistake? Have I broken the principle of humankind? I don't know... maybe to some extent."

Eternal You paints a haunting portrait of an industry that's already revving up to capitalize on grief-stricken people. That's not exactly new; psychics and mediums claiming to speak to the dead have been around for as long as civilization itself. But through AI, we now have the ability to reanimate those lost souls. While that might be helpful for some, we're clearly not ready for a world where AI resurrection is commonplace.

This article originally appeared on Engadget at https://www.engadget.com/sundance-documentary-eternal-you-shows-how-ai-companies-are-resurrecting-the-dead-153025316.html?src=rss

Instagram will start telling night owl teens to close the app and go to sleep

Instagram has revealed its latest mindfulness feature targeted at teens. When a younger user scrolls for more than 10 minutes in the likes of Reels or their direct messages, the app will suggest that they close the app and get to bed.

These "Nighttime Nudges" will automatically appear on teens' accounts and it won't be possible to switch them off. Instagram didn't specify whether the feature will be enabled for all teenagers or only under-18s. 

The idea, according to Instagram, is to nudge teens who aren't already using features such as Take a Break to close the app for the night. "We want teens to leave Instagram feeling like the time they spend on the app is meaningful and intentional, and we know sleep is particularly important for young people," Instagram said.

The new tool follows other features Instagram has rolled out to help teens and their parents manage time spent on the app. Along with Take a Break and parental supervision features, this includes the likes of Quiet Mode. The latter enables teens to mute notifications, automatically reply to messages and let their friends and followers know that they're unavailable and doing something else, such as studying or sleeping.

This article originally appeared on Engadget at https://www.engadget.com/instagram-will-start-telling-night-owl-teens-to-close-the-app-and-go-to-sleep-152600078.html?src=rss