New York Gov. Kathy Hochul on Thursday signed a bill that would allow parents to block their children from getting social media posts suggested by a platform’s algorithm, a move to limit feeds critics argue are addictive.
Under the legislation, feeds on apps like TikTok and Instagram would be limited for people under age 18 to posts from accounts they follow, rather than content suggested by an automated algorithm. It would also block platforms from sending minors notifications on suggested posts between midnight and 6 a.m.
Both provisions could be turned off if a minor gets what the bill defines as “verifiable parental consent.”
The law does not take effect immediately. State Attorney General Letitia James is now tasked with crafting rules to determine mechanisms for verifying a user’s age and parental consent. After the rules are finalized, social media companies will have 180 days to implement the regulations.
“We can protect our kids. We can tell the companies that you are not allowed to do this, you don’t have a right to do this, that parents should have say over their children’s lives and their health, not you,” Hochul, a Democrat, said at a bill signing ceremony in Manhattan.
The signing is the first step in what is expected to be a drawn-out rulemaking process, likely followed by a lawsuit from social media companies seeking to block the law.
NetChoice, a tech industry trade group that includes X and Meta, has criticized the legislation as unconstitutional.
“This is an assault on free speech and the open internet by the State of New York,” Carl Szabo, vice president and general counsel of NetChoice, said in a statement. “New York has created a way for the government to track what sites people visit and their online activity by forcing websites to censor all content unless visitors provide an ID to verify their age.”
Most of the biggest social media platforms send users a steady stream of suggested videos, photographs and other content, using automated systems to predict what will keep users entertained and engaged for as long as possible. The algorithms use a variety of factors to curate that content, including what a user has clicked on before and the interests of other people with similar preferences.
The bill marks the latest attempt by a state to regulate social media amid concerns over how children interact with the platforms.
California Gov. Gavin Newsom this week announced plans to work with the Legislature on a bill to restrict smartphone usage for students during the school day, though he didn’t provide exact details on what the proposal would include. Newsom in 2019 signed a bill allowing school districts to limit or ban smartphones while at school.
There hasn’t been broad legislation on the subject at the federal level, but it is a common point of discussion in Washington. This week the U.S. surgeon general called on Congress to put warning labels on social media platforms similar to those on cigarettes, citing mental health dangers for children using the sites.
With pressure mounting, some tech companies have set up parental controls on their platforms. Last year, Meta, the parent company of Facebook and Instagram, created tools that allowed parents to set time limits on the apps for children.
The New York legislation, introduced last October, faced major pushback from the tech industry as it moved through the Legislature.
“Social media platforms manipulate what our children see online to keep them on the platforms as long as possible,” said James, a Democrat who pushed for the bill. “The more time young people spend on social media, the more they are at risk of developing serious mental health concerns.”
—Anthony Izaguirre, Associated Press