Content

Wednesday, May 1, 2024

Meta's Effect of Content Control

By: Jose Galvan


The author's social media tab. While self-perceived as bare bones, more apps become "necessary" for navigating socializing online.


CONTENT WARNING: ARTICLE CONTAINS DISCUSSIONS ON TOPICS OF SELF-HARM AND SUICIDE

Back in January, Meta, the parent company of the social media apps Instagram and Facebook, started hiding content about eating disorders, suicide, self-harm, and similar topics from teenagers' accounts.

While Meta has previously strived to avoid recommending age-inappropriate content to these accounts, it is now pushing to block this kind of content entirely to keep teenagers from being exposed to such sensitive topics. The problems with this regulation arise on several fronts: effectiveness, timing, and the question of censorship.

An immediate problem when it comes to effectiveness is the likelihood of people lying about their age when signing up for these apps. While certain states have attempted to implement age checks for things like pornographic websites, there still has yet to be an effective and universal way to verify age (you can't ID a 14-year-old who doesn't even have an ID). As such, a restriction like this runs heavily on the honor code, and as news of its implementation circulates, more young people are bound to lie to avoid the content restrictions. In interviews with students, admittedly a small sample, all admitted to lying about their age when signing up for social media, at least initially.

"Even back in middle school, I was like 20 years old on all the apps," says Jagger Clark, one of the students interviewed, "it's just what everyone did, I thought you weren't able to join unless you were an adult.". Clark went on to talk about the effects that unrestricted internet and social media access has on someone so young. "You would scroll past like funny memes or something and then it's just a video of someone who just hanged themselves" Clark recalls. While appreciating the steps that Meta is taking, Clark doesn't believe they'll be too effective without intervention and support from parents. 

That's where another important factor comes in for Meta, and subsequently other social media platforms: the dependence on home life, such as parental intervention, to really make this content restriction work. However, parents don't seem too likely to want to work with Meta on this, given how long it took to implement such regulation. In an interview with the Associated Press, Josh Golin, executive director of the children's online advocacy group Fairplay, criticized the time it took to implement the content restrictions, calling it "an incredible slap in the face to parents who have lost their kids to online harm on Instagram."

This is a sentiment that rang very similar for another interviewed student. Danny Huynh, a student on the Armstrong Campus, was surprised to learn that Meta can restrict content at all. While not using social media as much themselves, they admit to having had it since middle school, and to being another student who lied about their age on the platforms. "I mean, with all that you see that just shouldn't be on the platform in the first place, I genuinely thought they just couldn't restrict content like that," Huynh says. Huynh also questioned whether this was only going into effect, and being pushed harder, because of the U.S. government's increased interest in regulating social media apps, specifically citing the congressional hearings of both Mark Zuckerberg of Meta and Shou Chew of TikTok.

A big issue that comes up, especially when framing this as an American discussion, is the idea of censorship in this country. As we have consistently strived to be the "land of the free" with certain unalienable rights, freedom of speech among them, censorship will always have a tough time getting through, no matter the purpose behind it. What follows is the difficulty of drawing the line between censorship that protects young minds and censorship that silences education about suicide and self-harm.

"People should know about this kind of stuff, not to inspire them to do it, but to get some form of understanding of what people go through," says Ariel Johnson, another student here at Georgia Southern. "I understand that some people may not be as mentally strong to view those kinds of topics, but that shouldn't people who are from looking into it and seeing what people go through.". Johnson further expressed the importance of education of these topics as a measure to prevent them from happening, though acknowledging problems like the Netflix show 13 Reasons Why as a motivator for many teenage suicides.

This problem is intrinsically multifaceted, with many nuanced issues around censorship, even when it is meant to protect younger minds. Whether Meta is doing the wrong thing now, or the right thing too late, only time will tell.