WASHINGTON—U.S. Senator Chris Murphy (D-Conn.) on Thursday spoke at a U.S. Senate Health, Education, Labor, and Pensions Committee hearing exploring the causes of the youth mental health crisis and potential solutions.

Murphy highlighted recent investments to address the mental health crisis, including more than $15 billion from the Bipartisan Safer Communities Act (BSCA): “I'm so proud of this Congress, and many on this committee for passing the BSCA. And I'm glad to get an update today on how that money is being impactful to help our kids. It's part of a trend. We're spending more money today on mental health than ever before. We've gone from about $155 billion in 2010 to $238 billion in 2020. And it's still not enough.”

On getting at the roots of this mental health crisis, including the urgency of addressing social media’s role: “I think this hearing is so important, because I think we've come to the conclusion that unless you get at the root causes of unhappiness and isolation and loneliness, there is almost no amount of money that can make up for that inattention. I wanted to continue this conversation about social media, in part, because I think it's the most immediate public policy concern that this Congress can tackle, and because I think there's easy agreement between Republicans and Democrats, but not because I think it's the whole story. I think, frankly, we probably spend more time than we should talking about this narrow but very important problem that our kids are facing. But I think we can do something about it, so let's not lose the opportunity.”

In a question to Surgeon General Vivek Murthy, Murphy asked how Congress should tackle this issue: “As you look at the sort of cornucopia of policy options that Congress is looking at here, what direction would you point us in to best protect our kids? What are the tactics that the social media companies are using that are most disturbing to you?”

In April, Murphy along with U.S. Senators Brian Schatz (D-Hawai‘i), Tom Cotton (R-Ark.), and Katie Britt (R-Ala.) introduced new legislation to help protect children from the harmful impacts of social media. The Protecting Kids on Social Media Act would set a minimum age of 13 to use social media apps and would require parental consent for 13- through 17-year-olds. The bill would also prevent social media companies from feeding content using algorithms to users under the age of 18.

A full transcript of Murphy’s exchange with Surgeon General Vivek Murthy:

MURPHY: “Thank you very much, Mr. Chairman. Thank you for holding this very important hearing. I'm so proud of this Congress, and many on this committee for passing the Bipartisan Safer Communities Act. And I'm glad to get an update today on how that money is being impactful to help our kids. It's part of a trend. We're spending more money today on mental health than ever before. We've gone from about $155 billion in 2010 to $238 billion in 2020. And it's still not enough.

“But I think this hearing is so important, because I think we've come to the conclusion that unless you get at the root causes of unhappiness and isolation and loneliness, there is almost no amount of money that can make up for that inattention.

“So, Dr. Murthy, I wanted to continue this conversation about social media. In part, because I think it's the most immediate public policy concern that this Congress can tackle, and because I think there's easy agreement between Republicans and Democrats, but not because I think it's the whole story. I think, frankly, we probably spend more time than we should talking about this narrow but very important problem that our kids are facing. But I think we can do something about it, so let's not lose the opportunity.

“I know the administration hasn't endorsed any specific policy proposals, but just give us your take on which direction we should head. So Senator Schatz and Cotton and Britt and I have a piece of legislation that says you got to do age verification and make sure that you can't get on before you're 13, you can't use algorithm boosting on younger kids, and parents have to have a say. Other legislation says you should have a standard that applies to social media companies so that they're only putting healthy content online.

“As you look at the sort of cornucopia of policy options that Congress is looking at here, what direction would you point us in to best protect our kids? What are the tactics that the social media companies are using that are most disturbing to you? And I'll just give you maybe the statistic that is most worrying to me. There's a recent study that shows within two minutes of establishing a TikTok account, a teenager can be fed information glorifying suicide. Within four minutes of establishing a TikTok account, a teenager can be sent content celebrating eating disorders. That's how quickly really damaging, really dangerous content can get to kids who are in crisis, who are in trouble. And it just compels us in this Congress to do something about it. So give us a little bit of advice.”

MURTHY: “Well, thanks, Senator. And I just want to also appreciate your leadership on the issue of loneliness and isolation. It's been an honor to work together with you on addressing this. And I agree with you that there are multiple drivers of the mental health crisis; social media is one of them. And I agree, it's an important one for us to address.

“A few things I would say in terms of avenues to focus on, recognizing that social media has been around for almost two decades. During that time there has been a lot of evolution in the platforms, a lot of different ways in which it is affecting our kids’ lives. And so there's actually a lot to do here if we truly want to make them safe.

“I think one certainly is around enforcement concerning age. Many platforms established 13 as the age at which kids can use social media; by the way, that age of 13 is not based on health grounds, although many people think it is. But 40% of kids eight through 12 are on social media. So whatever rules the platforms have in place are very poorly enforced.

“I think the second area is that we need to protect the privacy, the data privacy, of kids. Kids at this point do not have sufficient control over their data, and neither do their parents. This data is often used to direct ads to them, and other content is driven by the algorithms. We need to give kids and parents control over that data.

“The third is actually around data transparency. Companies are not fully disclosing the data they have about the health impacts of their platforms to kids, to parents, to researchers, and to the public at large. This is what researchers tell us all the time. And without that transparency, we don't even know the full extent of the problem or which kids are most affected, and so it's hard to target interventions.

“But finally, I would say when it comes to safety standards, this is a place where I do think we can build models off of other products where we've established safety standards, whether it’s for cars, car seats, medication, baby formula, whatever you want to consider as a parallel. But the bottom line is we need to have standards that push companies to assure us and to be able to demonstrate through data that they are not exposing our kids to harmful content, that they are not in fact allowing kids to be bullied and harassed online, particularly by strangers, and also that they're not promoting the use of features that lead to excessive use. We know that kids are at a vulnerable stage of development here, and they're still pretty susceptible to some of these features. We can't have companies taking advantage of those.

“So these are some areas that I think are essential for us. And I would be happy to work with you and other members of this committee, as you develop legislation around this because I think this can't come soon enough.”

MURPHY: “I appreciate our partnership. I appreciate your guidance to this committee. The only thing we cannot afford to do is to stand pat. There's a consensus here that we can find with your help.”

###