The heads of YouTube, Snap and TikTok took their turn answering to Washington.
WASHINGTON – Lawmakers have been hammering Facebook for weeks over how they say the platform is harming its younger users. But they showed on Tuesday that their concerns about data privacy, damaging posts and transparency extend to other major web services as well.
During a hearing that lasted more than three hours, a bipartisan group of senators told executives from YouTube, Snap and TikTok that they feared the companies’ software could steer young people toward inappropriate posts, that the companies mismanaged consumer data, and that they did not do enough to spot dangerous content on their platforms. Lawmakers repeatedly said their staffs had found harmful content – including posts related to self-harm and pornography – in the companies’ products, sometimes while logged in as teenage users.
Senator Richard Blumenthal, Democrat of Connecticut, opened the hearing by accusing the companies of doing everything they could to attract young people to their products.
“All you do is add users, especially children, and keep them on your apps for longer,” said Mr. Blumenthal, who leads the Senate Commerce Committee subcommittee that held the hearing.
The tough questions reflect growing pressure on the nation’s largest social media companies to protect the children who use their products from content that exposes them to violence or danger, or that lowers their self-esteem. The pressure has risen sharply in the past two weeks, after Frances Haugen, the former Facebook product manager who leaked thousands of pages of internal documents, told the subcommittee how the company knew its products made some teenagers feel worse about themselves.
Lawmakers have increasingly discussed legislation to better protect children online. A group of House lawmakers has proposed a bill that would open platforms to litigation if their algorithms amplify content linked to serious harm. And Mr. Blumenthal suggested on Tuesday that U.S. officials could pass a children’s design code similar to the one that recently took effect in Britain, which applies new rules to how businesses use children’s data.
Any new rule would have to make it through a gridlocked Congress. But child safety proposals do not necessarily face the partisan divisions that can thwart other attempts to regulate the tech giants.
“This is one of the few areas where Congress can actually do anything, and where there is a bipartisan consensus,” said Nu Wexler, a former communications staffer for tech companies and lawmakers in Washington. “For lawmakers, in some ways child safety is the path of least resistance.”
The companies sent executives with political experience to answer questions. TikTok was represented by Michael Beckerman, its head of public policy for the Americas, who previously ran a leading lobbying group for internet companies. Leslie Miller, YouTube’s vice president for government affairs and public policy and a former Democratic political aide, appeared on behalf of the video site. Snap, the parent company of Snapchat, sent Jennifer Stout, its vice president of global public policy and a former deputy chief of staff to John Kerry.
The companies were quick to try to distance themselves from one another, arguing that they were already taking significant steps to protect child users.
Ms. Stout called Snapchat an “antidote to social media” and drew distinctions between Snapchat and Instagram. She said her company’s app focused on connecting people who already knew each other in real life, rather than feeding them a constant stream of content from strangers. And she said it focused on privacy, deleting pictures and messages by default.
She also noted that Snapchat moderates the public content it promotes more heavily than other social media companies do. Human moderators review content from publishers before promoting it on Discover, the public section of Snapchat that contains news and entertainment, Ms. Stout said. Content on Spotlight, Snap’s creator program that promotes users’ videos, is screened by artificial intelligence before being distributed, and is reviewed by human moderators before it can be viewed by more than 25 people, she added.
Mr. Beckerman said TikTok was different from other platforms, which focus more on direct communication between users.
“This is uplifting and entertaining content,” he said. “People love it.”
He said policymakers should look at systems that verify whether users are old enough to use a product, suggesting that legislation should include age-verification language that applies “across apps.”
Lawmakers also pressed Mr. Beckerman over whether TikTok’s Chinese ownership could expose consumer data to Beijing. Critics have long argued that the company would be forced to hand over Americans’ data to the Chinese government if asked.
“Access controls for our data are done by our U.S. teams,” Mr. Beckerman said. “And as independent researchers and independent experts have pointed out, the data TikTok has on the app is not of national security importance and is of low sensitivity.”
Senators repeatedly tried to get the companies to commit to greater transparency so researchers could investigate the health and safety of their platforms, as well as to support elements of potential privacy legislation.
YouTube’s Ms. Miller declined to be pinned down in a series of exchanges with senators. When Mr. Blumenthal asked whether the companies would allow independent researchers access to their algorithms, data sets and data privacy practices, Ms. Miller replied, “It would depend on the details, but we are always looking to partner with experts in these important areas.”
Mr. Blumenthal countered that YouTube’s response “certainly indicates a strong reluctance, or even resistance, to providing access.”
Likewise, Ms. Miller seemed reluctant to commit to aspects of potential privacy legislation, such as a proposal to update the Children’s Online Privacy Protection Act. Specifically, she declined to say whether YouTube would support a ban on targeted advertising to children or restrictions on adding “likes” or comments to videos – even though Ms. Miller said the company already did not allow such features on children’s content.
The companies often argued that they were already taking the kinds of actions that future laws might require.
“We believe regulation is necessary, but given the speed at which technology develops and the speed at which regulation can be implemented, regulation alone cannot do the job,” Ms. Stout said.
Lawmakers resisted the executives’ efforts to portray their employers as exceptions to concerns about children’s safety online.
“I understand from your testimony that your defense is this: we are not Facebook,” Mr. Blumenthal said. “Being different from Facebook is no defense. That bar is in the gutter.”