As concerns grow about the harmful effects of social media on teens, platforms such as Snapchat, TikTok and Instagram are rolling out measures they say will make their services safer and more age-appropriate. But those changes rarely address the underlying problem: recommendation algorithms that surface content capable of pulling anyone, not just teenagers, toward harmful material.
The tools in question do help to some extent. For example, they prevent strangers from sending messages to minors. But they share serious drawbacks, starting with the fact that teens can get around them by lying about their age. The platforms also leave much of the monitoring to parents, and they do little or nothing to screen the inappropriate or harmful content served up by algorithms, content that can affect teens' mental and physical well-being.
“These platforms are aware that their algorithms can sometimes amplify harmful content, and they do not take steps to stop it,” said Irene Lee, a privacy consultant at the nonprofit Common Sense Media. The more time teens spend online, the more they engage, and the more addicted they become, the more money the platform makes, she said. “I don’t think they have much incentive to change that.”
As an example, consider Snapchat, which on Tuesday introduced “Family Center,” a new set of parental controls that lets parents see whom their kids are messaging, though not the content of those messages. One catch: both parents and kids have to sign up for the service.
Nona Farahnik Yadgar, director of platform policy and social impact at Snap, says it’s like when parents want to know who their child is dating.
If kids go to a friend’s house or meet friends at a mall, she said, parents ask, “Who are you going to see?” or “How do you know each other?” The new tool, she said, “allows parents to have these kinds of conversations while preserving teens’ privacy and autonomy.”
These conversations are important, experts say. In an ideal world, parents would talk frequently with their children about the dangers posed by social networks and the Internet. But according to Josh Golin, CEO of the children's online watchdog Fairplay, many kids use a surprising number of platforms, all of them constantly evolving, and few parents have the time or expertise to keep up with that world.
“It would be better for the platform to take action than to add a burden to an already overburdened parent,” Golin said.
He said the new controls also fail to address many of Snapchat’s problems, from the fact that kids can lie about their age to the “compulsive use” the platform encourages. Nor does it help that messages disappear after a short time, which can facilitate cyberbullying.
Farahnik Yadgar said Snapchat has taken “strong measures” to keep children under 13 from claiming to be older. Users caught lying about their age are immediately banned from the platform.
Teens over 13 who exaggerate their age, on the other hand, are given a chance to correct it.
Detecting those lies isn’t easy, but the platform has several ways to do it. If a user’s friends are all younger, that user may have exaggerated their age. The company uses artificial intelligence to detect such anomalies. The interests a user expresses can also reveal their actual age. And parents may catch their kids lying if they try to activate the parental controls and cannot find their child when they enter the child’s real age.
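The friend-age heuristic described above can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the function name, the threshold values and the data shapes are hypothetical, not Snapchat's actual system.

```python
from statistics import median

def flag_possible_age_inflation(declared_age, friend_ages,
                                gap_threshold=3, min_friends=5):
    """Heuristic check: flag a user whose declared age is well above
    the median age of their friends. A flag is a signal for review,
    not proof that the age was inflated."""
    if len(friend_ages) < min_friends:
        return False  # too few friends to draw any conclusion
    return declared_age - median(friend_ages) >= gap_threshold

# A user claiming to be 17 whose friends are all 12-13 gets flagged:
print(flag_possible_age_inflation(17, [12, 13, 12, 13, 12]))  # True

# A 13-year-old among other 12-14-year-olds does not:
print(flag_possible_age_inflation(13, [12, 13, 14, 13, 12]))  # False
```

In practice such a signal would be only one feature among many (stated interests, behavior patterns) feeding a larger anomaly-detection model, as the article suggests.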
In March, a coalition of state attorneys general launched a nationwide investigation into TikTok to study the platform’s potentially harmful effects on young people’s mental health.
TikTok is the most popular app among American teens, according to a report released Wednesday by the Pew Research Center, which found that 67% of U.S. teens use the platform. The company says it promotes age-appropriate experiences and notes that some features, such as direct messaging, are not available to younger users. It also says it offers tools that let parents monitor how much time a teen spends on the platform and what they see.
But some say those measures don’t go far enough.
“It’s still easy for kids to get around these controls and do whatever they want,” said Lee of Common Sense Media.
According to Pew, Instagram, owned by Facebook parent company Meta, is the second most popular app among teens: 62% use it, followed by Snapchat at 59%. Only 32% said they use Facebook, down from 71% in 2014-2015.
Last year, former Facebook employee Frances Haugen revealed that the company knew its algorithms were contributing to mental health problems among many teens who use Instagram, especially girls.
That revelation prompted the company to take several steps, but Lee said they “skirt the issue without attacking its roots.”