The legislation would impose stricter privacy rules and safeguards for children on the internet and social media, but concerns about free speech and fierce industry lobbying pose challenges.
The Senate on Tuesday passed bipartisan legislation to impose sweeping safety and privacy requirements for children and teens on social media and other technology platforms, voting overwhelmingly to send the measure to the House, where its fate was uncertain.
Passage of the measure, which has been the subject of a dogged advocacy campaign by parents who say their children lost their lives because of something they found or saw on social media, marked a rare bipartisan achievement at a time of deep polarization in Congress.
Despite the lopsided support among Republicans and Democrats, the package faces a fierce lobbying effort by technology companies that are resisting new regulation, and deep skepticism among free speech advocates who argue that it would chill individual expression and potentially harm some of those whom the bill aims to protect.
The vote was 91 to 3; the House is on summer recess until September. The legislation is the product of years of work by lawmakers and parents to overhaul digital privacy and safety laws as social networking sites, digital gaming and other online platforms increasingly dominate children’s and teens’ lives.
“I used to run marathons, and I feel like I used to feel halfway through: ‘Wow, I got this far, and I’m feeling good,’” Senator Richard Blumenthal, Democrat of Connecticut and one of the chief sponsors of the legislation, said in an interview on Tuesday. “The second half can be the toughest in some ways, but also the easiest psychologically because you see the end in sight.”
The centerpiece of the legislation would create a “duty of care” for social networking platforms, mandating that they protect minors from mental health disorders, abuse, sexual exploitation and other harms. Companies could be held liable for failing to filter out content, or to limit features, that could lead to those harms.