Mayor Eric Adams says it’s a public health hazard. Gov. Kathy Hochul calls it “poison.” Attorney General Letitia James claims it’s a “crisis.”
In recent weeks, some of New York’s top elected officials have used their bully pulpits to take aim at what, for them, has become a common enemy: social media and its effect on kids.
“We cannot stand by and let big tech monetize our children’s privacy and jeopardize their mental health,” Adams said in his State of the City address last week.
Adams, Hochul and James — whose offices all maintain active and large social-media presences of their own — are part of a growing national trend of state- and city-level officials pushing for laws that target how major online platforms like TikTok, Instagram and Facebook interact with children.
Lawmakers across 35 states and Puerto Rico introduced legislation last year spurred by concern over social media’s effect on youth mental health, according to the National Conference of State Legislatures. Of those, 12 states adopted measures of varying scope, including New Jersey, which launched a commission to study the issue.
Now, New York is on the verge of joining them, with Hochul and James pushing a pair of measures that would restrict social media platforms from collecting data from minors and exposing them to addictive algorithms. And in New York City, Adams’ administration issued a public health advisory last week warning parents not to give their kids access to smartphones or other devices that can access social media until at least age 14.
Adams has gone as far as to compare the harms of social media to the harms of tobacco, with his office publishing a graphic showing a cartoonish pack of cigarettes, each one labeled with a different online platform.
The mayor’s office blasted out the graphic across its social media accounts, naturally.
Hochul, meanwhile, has pointed to the mass shooting at a Buffalo supermarket in 2022, where an 18-year-old gunman who had encountered racist theories on social media platforms and message boards killed 10 Black people.
“When you think about the algorithms on social media that are targeted toward these young people when their minds are still young and developing — their views aren’t cemented yet, but this is where they’re getting influences,” Hochul said last week during an event at the John Jay College of Criminal Justice.
She continued: “It’s poison. It’s poison.”
The push to regulate social media companies comes on the heels of state and local officials urging tech giants to take down videos of people subway surfing following a handful of deaths last year. A 14-year-old died attempting to ride on top of an F train in Brooklyn earlier this month, prompting local lawmakers to once again urge social media companies to monitor their platforms for subway surfing videos and remove them.
In Albany, Hochul and James have joined forces with Brooklyn Sen. Andrew Gounardes and Queens Assemblymember Nily Rozic to push a pair of social media-focused bills.
One of them, the Stop Addictive Feed Exploitation (SAFE) for Kids Act, would prohibit social media companies from subjecting users under the age of 18 to an addictive, algorithm-based feed of posts — unless their parent or guardian specifically consents to it. Instead, minors would be shown a simple, chronological feed with posts only from accounts they follow.
The legislation would also prohibit platforms from pinging a minor’s device with notifications between midnight and 6 a.m., and provide an option for users to block their own access during those hours.
“If a young user wants to have that algorithmic experience or the addictive experience, they have to opt into that with parental consent,” Gounardes said during a Zoom press conference on Friday. “But at the default, you have to show a straight chronological feed of who that user chooses to follow — their family, their friends, the Taylor Swift fan page, whoever, whatever.”
The other bill, known as the New York Child Data Protection Act, would prohibit platforms from collecting and selling the data of users under the age of 18 for advertising purposes. If enacted, those users could opt into allowing data collection — though those under the age of 13 would need parental consent.
Under both bills, the penalty is up to $5,000 per violation. The state attorney general would be able to pursue violations, or the parent or guardian of a minor could sue.
Tech companies and their trade groups are already pushing back, trying to get ahead of any pending regulations.
In recent months, Meta — the parent company of Facebook and Instagram — launched an ad campaign highlighting the push for greater parental controls on its platform. The campaign directs users to a website where the company advocates for a federal bill that would require app stores — not the social media platforms themselves — to verify a user’s age when downloading an app.
Tech:NYC is an organization that represents the city’s tech industry and includes members like Meta and Google, which owns YouTube. In October, the organization’s president and CEO, Julie Samuels, said the pair of bills in Albany would “inadvertently risk” First Amendment and user privacy rights.
“In most cases, identity verification requires users to share multiple government documents, potentially putting their privacy at risk,” Samuels said in a statement. “Before moving forward with these proposals, it is essential that the attorney general find a common best practice for verifying users’ ages that preserves user privacy.”
Similar data-privacy legislation in California was struck down in the federal courts, with a trade group representing Meta, Google, Amazon and TikTok arguing that the state’s data-privacy measures for children violate free-speech rights and the commerce clause of the U.S. Constitution. The case is currently on appeal after a lower court blocked the law from taking effect.
But James and the bill sponsors said they took a different approach from other states when crafting their bills.
The New York legislation doesn’t focus on the content on the social media platforms themselves. Nor does it take the more extreme approach the state of Montana took, when it tried to block TikTok altogether before the courts stepped in. Instead, the New York legislation focuses on how content is presented to users — and provides ways for their parents to opt in to the approaches the platforms prefer.
Still, James said she’s anticipating legal challenges from tech giants if the Legislature approves the bills.
“We are confident that these will pass constitutional muster,” she said during the Zoom press conference. “We are confident that there will be challenges, but we will be victorious at the end of the day because our objective is to protect children — because unfortunately, the federal government has failed to do so.”
Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project, opposes the SAFE for Kids Act. He argues there’s no effective way to verify a user’s age online without forcing them to give up personal information.
“We’ve been trying to police age on the internet since the dawn of the web,” he said. “We’ve seen failed efforts to do this since the 90s, if not earlier. And yet it keeps being the same approach: People try to bake more surveillance into the Internet, track our identities more closely and claim that that’s going to be what keeps us safe online.”
Hochul included the two New York bills in her state budget proposal earlier this month. She and state lawmakers have until March 31 to get a state budget in place for the fiscal year that begins the next day.