Facebook parent company Meta releases new parental controls for Instagram
NPR's Leila Fadel talks to Antigone Davis, global head of safety at Meta, about company changes intended to address problems with the way teenagers use its platforms.
Transcript
LEILA FADEL, HOST:
It's been a year of intense public scrutiny for Facebook's parent company, Meta, including a Wall Street Journal investigation, a whistleblower and a congressional inquiry into why the company failed to act on its own internal research showing teen users attribute mental health problems to its platforms. The company is releasing a set of tools today that it says will improve things. They include allowing parents to approve downloads their kids make in the company's virtual reality platform, allowing parents to see how much time their children are spending using virtual reality headsets and letting parents invite their kids to allow parental supervision over their Instagram accounts. I asked Meta's head of safety, Antigone Davis, if these things will improve teens' mental health.
ANTIGONE DAVIS: One of the tools that we actually are launching is called Nudge. And essentially, what that does is if we see a teen is engaging with content for an extended period of time, we actually will nudge them to look at different content. And this was actually developed with experts.
FADEL: Even though Instagram is supposed to be just for 13 and over, a lot of kids under 13 are on these platforms, and they're not supposed to be. Is there progress at making sure that that doesn't happen?
DAVIS: Yeah, so we do have specific safeguards in place. There's a screen that pops up when you're setting up an account. We actually allow reporting. We also are developing AI to better identify people who are under the age of 13.
FADEL: Because the age verification - I mean, you can lie.
DAVIS: You can. And there's no - you know, there really is no one panacea for solving that problem. And it's a problem that the industry faces. And we're trying to come up with multiple ways to address that issue.
FADEL: Meta paused work on Instagram Youth for kids under 13. And a lot of people think an Instagram just for young people under 13 would only perpetuate more danger for kids. And yet Meta has insisted it does plan to create this platform. Why?
DAVIS: I'm not sure I would categorize it as insisting, but what I would say is this. You mentioned earlier that young people try to get onto technologies under the age of 13. They do. And what is really important is that whatever experience a child is having, it's actually age-appropriate, and it's built with them in mind.
FADEL: Yeah, but a lot of kids will try to get into their parents' alcohol cabinet or try a cigarette, but do we create a young version of that for them?
DAVIS: Well, I wouldn't compare those two things. I think these technologies offer a lot for young people in terms of education, information, the ability to make social connections, the ability to develop social skills, the ability to have fun in ways that just don't fit within that comparison. But I do think that when you think about other things like riding a bicycle, we actually try to take kids along on that journey and help prepare them for using these technologies. Will they look different when a child is under the age of 13? Certainly, they should.
FADEL: I think the reason I talked about it in that way is that even for adults, there are dangers when it comes to using social media. Adults report feeling addicted. It's also linked to mental health struggles in adults. So is it responsible to continue to court younger and younger users onto a platform that can be dangerous even for adults?
DAVIS: What I would say is that this isn't an issue of courting younger users. This is really an issue of trying to understand where people and families are at. Anything that we do, we do with expert guidance. So you can count on us to be thinking and listening to researchers in this area and the boundaries that the research reveals.
FADEL: I'm going to just push back on that a little bit because one of the big concerns last year when internal research was leaked to The Wall Street Journal was that the company didn't act on it at the time. In that research, a sample of teens linked body image issues, eating disorders and suicidal thoughts to Instagram. And yet the company didn't act. So has something changed since then when it comes to health and safety and internal research and what it shows?
DAVIS: So let me start by saying that has not been my experience at Meta. My experience has been that we are constantly looking at these issues and that we are constantly developing policy and product changes that reflect what we learn. That's why we do this kind of research, to actually develop and improve our products.
FADEL: Meta's head of safety, Antigone Davis, thanks for being on the program.
DAVIS: Thank you.
RACHEL MARTIN, HOST:
And a note here - Meta pays NPR to license NPR content. Transcript provided by NPR, Copyright NPR.