The Surgeon General says that social media companies have a role to play in protecting the mental health of children, since the misuse of their platforms appears to contribute to the problem.
The statement came in an advisory report on what Surgeon General Vivek H. Murthy called an alarming increase in certain mental health challenges among young people, and on what can be done to stem that troubling tide.
He signaled that Big Tech platforms, when not “deployed” safely and responsibly, can contribute to the mental health problem by “pitting us against each other, reinforcing negative behaviors like bullying and exclusion, and undermining the safe and supportive environments young people need and deserve.”
He suggested that the pandemic has put even greater urgency on the need for action.
Under the heading of “What Social Media, Video Gaming, and Other Technology Companies Can Do,” the report echoed many of the issues being discussed Wednesday (Dec. 8) in a Senate subcommittee hearing with Instagram head Adam Mosseri.
In social media’s “digital public spaces,” said the report, “there can be tension between what’s best for the technology company and what’s best for the individual user or for society. Business models are often built around maximizing user engagement as opposed to safeguarding users’ health and ensuring that users engage with one another in safe and healthy ways. This translates to technology companies focusing on maximizing time spent, not time well spent.”
It cites researchers asserting that Big Tech can expose kids to “bullying, contribute to obesity and eating disorders, trade off with sleep, encourage children to negatively compare themselves to others, and lead to depression, anxiety, and self-harm.”
And while it concedes there are benefits to online activity, such as connecting with friends, learning new things, or accessing healthcare, it says there is also a “clear need” to better understand how technology affects the users most at risk.
The report offered advice on how tech companies can help, including a call to “step up and take responsibility for creating a safe digital environment for children and youth,” and, at minimum, to provide “much more transparency” about their products, including allowing outside researchers to access data. Legislators from both parties have asked Meta (formerly Facebook) to do just that following whistleblower Frances Haugen’s public release of some internal research.
Among the other recommendations: 1) provide user-friendly tools for healthy online interactions, 2) limit exposure to harmful online content, 3) allow users to opt out of harmful content, and 4) actively promote content that supports mental health and wellbeing as well as equitable access to that content.