YouTube, Snapchat, TikTok Face Senate Scrutiny Over Protecting Children
It's not just Facebook that's under a microscope for the public health risks it may pose — YouTube, Snapchat and TikTok faced aggressive questions from a Senate subcommittee yesterday. Meanwhile, The Washington Post reports Facebook prioritized "angry" emoji reaction posts in news feeds.
The Washington Post:
TikTok, Snap, YouTube Defend How They Protect Kids Online In Congressional Hearing
TikTok, Snapchat and YouTube, all social media sites popular with teens and young adults, faced a barrage of questions and accusations Tuesday from lawmakers who want the companies to do more to protect children online. Executives from all three companies committed to sharing internal research on how their products affect kids — an issue that has come to the forefront in the past several weeks as tens of thousands of pages of Facebook’s internal documents have been revealed by a whistleblower. (Lerman and Lima, 10/26)
TechCrunch:
In Hearing With Snap, TikTok And YouTube, Lawmakers Tout New Rules To Protect Kids Online
The hearing, held by the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security, managed to stay on topic about half of the time. The committee’s Republican members were keen to steer their rare time with a TikTok executive toward questions about privacy concerns over the company’s relationship with the Chinese government. Diversions notwithstanding, a few of the hearing’s more useful moments saw the three policy leads pressed to answer yes/no questions about specific policy proposals crawling through Congress. The hearing featured testimony from Snap VP of Global Public Policy Jennifer Stout, TikTok’s VP and Head of Public Policy Michael Beckerman and Leslie Miller, who leads government affairs and public policy at YouTube. (Hatmaker, 10/26)
NPR:
YouTube, Snapchat And TikTok Child Safety Hearing: 4 Key Takeaways
Lawmakers in the Senate hammered representatives from Snapchat, TikTok and YouTube on Tuesday, in a combative hearing about whether the tech giants do enough to keep children safe online. It marked the first time Snapchat and TikTok have landed in the hot seat in Washington, D.C., and for nearly four hours lawmakers pressed the officials about how the apps have been misused to promote bullying, worsen eating disorders and help teens buy dangerous drugs or engage in reckless behavior. (Allyn, 10/26)
In updates on the firestorm surrounding Facebook —
AP:
America 'On Fire': Facebook Watched As Trump Ignited Hate
Leaked Facebook documents provide a first-hand look at how Trump’s social media posts ignited more anger in an already deeply divided country that was eventually lit “on fire” with reports of hate speech and violence across the platform. Facebook’s own internal, automated controls, meant to catch posts that violate rules, predicted with almost 90% certainty that Trump’s message broke the tech company’s rules against inciting violence. Yet the tech giant didn’t take any action on the post. (Seitz, 10/27)
The Washington Post:
Facebook Prioritized ‘Angry’ Emoji Reaction Posts In News Feeds
Five years ago, Facebook gave its users five new ways to react to a post in their news feed beyond the iconic “like” thumbs-up: “love,” “haha,” “wow,” “sad” and “angry.” Behind the scenes, Facebook programmed the algorithm that decides what people see in their news feeds to use the reaction emoji as signals to push more emotional and provocative content — including content likely to make them angry. Starting in 2017, Facebook’s ranking algorithm treated emoji reactions as five times more valuable than “likes,” internal documents reveal. The theory was simple: Posts that prompted lots of reaction emoji tended to keep users more engaged, and keeping users engaged was the key to Facebook’s business. (Merrill and Oremus, 10/26)
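To make the reported weighting concrete, here is a minimal, purely illustrative sketch of how a ranking signal might count emoji reactions as five times more valuable than likes. Only the 5x weight comes from the reporting above; the function name, structure, and everything else are assumptions for illustration, not Facebook's actual code.

```python
# Illustrative sketch of a weighted engagement signal (assumption:
# only the 5x emoji-vs-like weight is taken from the reporting above).

REACTION_WEIGHTS = {
    "like": 1.0,
    "love": 5.0,
    "haha": 5.0,
    "wow": 5.0,
    "sad": 5.0,
    "angry": 5.0,
}

def engagement_score(reaction_counts: dict) -> float:
    """Sum a post's reactions, weighting each type per REACTION_WEIGHTS."""
    return sum(REACTION_WEIGHTS.get(kind, 0.0) * count
               for kind, count in reaction_counts.items())

# Example: 100 likes plus 20 "angry" reactions scores the same as 200 likes,
# so posts that provoke emotional reactions rank higher in such a scheme.
print(engagement_score({"like": 100, "angry": 20}))  # 200.0
print(engagement_score({"like": 200}))               # 200.0
```

Under a scheme like this, emotionally provocative posts need far fewer interactions to outrank neutral ones, which is the dynamic the internal documents describe.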