Parents To Facebook: Don't Make A Kid-Only Instagram, Just A Better Instagram
Social media companies prohibit kids under 13 from signing up because of federal privacy law. But parents like Danielle Hawkins can tell you a different story.
"She got on Instagram and Snapchat without my approval when she was about 12," Hawkins, a mom of four who lives near Detroit, said of her eldest daughter.
The tech companies are well aware of this problem. Facebook CEO Mark Zuckerberg told a congressional hearing in March that his company knows kids get around the age limits on apps like Instagram, the photo-sharing network Facebook owns.
"There [are] clearly a large number of people under the age of 13 who would want to use a service like Instagram," he said.
Now, Facebook is working on a solution for underage kids: "We're exploring having a service for Instagram that allows under 13s on, because we worry that kids may find ways to try to lie and evade some of our systems," Zuckerberg told lawmakers. "But if we create a safe system that has appropriate parent controls, then we might be able to get people into using that instead."
The project, which Facebook calls Instagram Youth, would likely give parents the ability to monitor and limit what their kids do on the app. Facebook hasn't made public any concrete details or timeline, but that hasn't eased the criticism.
Parents say struggles with social apps start at an early age
Parents say Zuckerberg is right: many kids are going on social media, despite the age-13 limit set by apps like Instagram, Snapchat and TikTok.
Charity White-Voth, a mom in San Diego, said the struggle began long before her daughter's 13th birthday.
"I was the last holdout of her friends' parents around Snapchat," she said. Her daughter told her all her friends were using the app best known for disappearing messages.
"She was not joking," White-Voth said. "They were on it and they were using it. And I was like, 'I just don't feel comfortable. I don't think it's the right thing to do.'"
She relented once her daughter turned 13, but she still worries that her daughter is too young to appreciate that what she posts online will be on the internet forever.
"I worry about her being 13, having poor impulse control, hormones are raging ... just that inability to think long term," White-Voth said. "I worry about sending something out that's inappropriate, that somehow is going to get screenshot by somebody else."
Another source of unease for many parents is the focus on likes, followers and selfies that is especially pronounced on visual platforms like Instagram, TikTok and Snapchat.
"Body image, who you are, how accepted you are, is a very big part of becoming a teenager," said Hawkins, the Detroit area mom. "Being able to have people on another side of a screen ... tell you who you are or how good you are? You really can't comprehend what that actually does to the psyche."
Her oldest daughter, who signed up for Instagram and Snapchat last year at age 12, is no longer allowed to use social media.
"We had to pull the reins on it. We just realized that it really wasn't beneficial to her education, to her emotional state," Hawkins said.
Growing concern social media use may be linked to mental health problems
These worries about the role that screen time in general, and social media in particular, play in kids' well-being are well-founded, said Blythe Winslow, co-founder of Everyschool.org, a nonprofit that advises schools on how to use technology.
"Kids have more anxiety and depression ... Empathy is on the decline. Creativity is on the decline. Suicide rates in kids ages 10 to 14 have tripled" between 2007 and 2017, she said, referring to a 2019 report from the Centers for Disease Control and Prevention.
"Parents fear that social media might be linked to a lot of those problems," she said.
As the mom of two tween girls, Winslow knows first-hand how hard these choices are for parents.
"My 11-year-old has been gunning for social media probably since she was eight or nine years old," she said. "Most of her friends have TikTok, and they love TikTok."
Researchers say the risks to kids from being on platforms where they can interact with adults are urgent. A recent report from the nonprofit Thorn, which builds technology to defend children from online sexual abuse, found more than a third of kids ages 9-12 said they had a "potentially harmful online experience" with someone they believed was 18 or older. Nineteen percent reported having an online sexual interaction with someone they believed to be an adult.
Many are skeptical a social media network just for kids would keep out adults with bad intentions.
"If you build a community for children, adults that really want to get into that will figure out how to get into it as well," said Julie Cordua, Thorn's CEO.
Critics seize on fears to pressure Facebook
These fears are not just unsettling parents. They're fueling a backlash to Instagram Youth from child safety advocates, members of Congress, and 44 attorneys general, who are urging Facebook to scrap the idea entirely.
Critics cite worries about online predators, links to depression, body image concerns and fears for kids' privacy. And, they say, Facebook just doesn't have a good track record when it comes to protecting users.
Jim Steyer, chief executive of the advocacy group Common Sense Media, describes Facebook's tactics as "the classic brand marketing approach, which is, hook kids as early as possible" — which benefits its business by ensuring a pipeline of users.
"One, you get their loyalty from cradle to grave. And two, if you're lucky, you get their parents to come with them," he said.
Facebook says Instagram Youth is still in the early stages and that it's prioritizing safety and privacy.
"As every parent knows, kids are already online. We want to improve this situation by delivering experiences that give parents visibility and control over what their kids are doing," Facebook spokesperson Liza Crenshaw said in a statement.
"We will develop these experiences in consultation with experts in child development, child safety and mental health, and privacy advocates. We also look forward to working with legislators and regulators. In addition, we commit to not showing ads in any Instagram experience we develop for people under the age of 13," she said.
"We're not asking social media to parent our kids"
Some parents NPR spoke with said they would be interested in letting their kids use a version of the app with more limited content and the ability to monitor what they're up to.
But San Diego dad Buyung Santoso says his kids, ages 11 and 13, wouldn't go for that.
"My daughter said yesterday that she didn't think that it was going to work," he said, "because kids can do whatever they want, regardless of whether you need permission or not."
Critics say that instead of creating new apps for children, tech companies should concentrate on making their existing products safer for the kids they know are already using them.
"The most important thing platforms can do is not close their eyes to it, but deeply recognize how their platforms will be abused and build for that — to make that most vulnerable user safe," said Cordua, the Thorn CEO.
Titania Jordan works at Atlanta software company Bark, which helps parents monitor their kids' online activity, and is mom to a 12-year-old son who loves TikTok and Snapchat.
"We're not asking social media to parent our kids," she said. "It's not their job. Just don't make our job harder."
Editor's note: Facebook is among NPR's financial supporters. TikTok helps fund NPR-produced videos from Planet Money that appear on the social media platform.
Copyright 2021 NPR. To see more, visit https://www.npr.org.