

Kids on the web… how are they protected?


February 4, 2022

Social media isn’t designed for young users, yet apps like TikTok, Instagram, and YouTube keep growing in popularity among an incredibly young audience. Whether it’s ads that aren’t always kid-friendly or the dangers of interacting with strangers, the internet isn’t a safe space for young minds. Making a change is crucial for the safety of future generations, but is building social media apps designed specifically for children a step in the right direction?

Many parents and children argue that social media is all fun and games. It gives users a way to connect with others, usually based on shared interests, and is seen as a space where they can upload content and build their digital footprint.

Facebook is currently working on a version of Instagram targeted at an audience that doesn’t meet the age requirement: children under 13. In a statement, Adam Mosseri, the Head of Instagram, explained, “We started this project to address an important problem seen across our industry: kids are getting phones younger and younger, misrepresenting their age, and downloading apps that are meant for those 13 or older.” As technology becomes more accessible, children are building online social lives at younger ages than ever before. Whether they’re playing Fortnite in a lobby or hanging out on a Roblox server, they are online and unknowingly facing harm.

Instagram Kids plans to rely on parental supervision tools and focus on creating “age-appropriate experiences” for tweens. The app is planned to require a parent’s permission to create an account, carry no ads, and give parents the ability to see their child’s activity on the app (followers, chats, and more).

Many criticize the social media apps that already cater to children. Over the years, Google’s YouTube Kids has made headlines for showing young children violent and sexual cartoons. People have learned to manipulate the app to get inappropriate content onto kids’ screens. YouTube Kids has since stated that after a video gets flagged, it is manually reviewed and that “any videos that don’t belong in the app” are removed within hours.

How will the major companies respond to the call to protect children on the internet? (Wikipedia Commons)

Facebook’s own Messenger Kids describes itself as a “safer app for kids to connect, communicate and play with family and friends.” However, safety is never guaranteed, hence the word “safer.” Messenger Kids has also faced its own set of problems, most notably how easy it is to slip inappropriate content past its filters. Such content is said to be removed within a few hours, and the offending account is disabled.

Although the support teams of these kid-focused apps are said to be quick and efficient, there is little consideration for the children who have already been exposed to such content. The apps’ filters cannot prevent these occurrences entirely: strangers intent on showing children inappropriate content will find a way around any obstacle. The lack of consequences is what makes this so common.

The proposal for Instagram Kids has faced backlash from lawmakers. Senators Ed Markey and Richard Blumenthal, along with U.S. Representatives Kathy Castor and Lori Trahan, attended a hearing on Instagram’s impact on youth in May of 2021. Senator Markey is known for prioritizing the safety of children online and was one of the original sponsors of the Children’s Online Privacy Protection Act (COPPA), enacted in 1998. That act restricts websites and online services from collecting personal information from children under 13 without parental consent. The lawmakers stated:

“Facebook has a clear record of failing to protect children on its platforms. When it comes to putting people before profits, Facebook has forfeited the benefit of the doubt, and we strongly urge Facebook to abandon its plans to launch a version of Instagram for kids.”

Lawmakers also expressed concern that certain content could affect a child’s health. A child’s mind is too impressionable to face the internet alone. Whether it’s eating disorders, self-harm, or dangerous stunts, seeing such content can encourage imitation and end in unnecessary trauma.

After facing backlash, development of Instagram Kids has been on hold since September of 2021. Facebook claims to be reworking the app’s design to make it more child-appropriate and to close any possible loopholes. It is unknown when the final version will be released, but time will tell once the company tries to win over lawmakers again.

Simply put, there are too many downsides to creating social media platforms designed for children. The potential harm to a child’s mental health and privacy is too much for child safety experts and lawmakers to accept. While it is important to recognize that young children make up a large share of the online audience, we shouldn’t leave room for a single flaw in an app’s design that could put a child in harm’s way.


About the Writer
Paola Cortazar, Opinions Editor

Paola Cortazar is a senior at Marco Island Academy and the Opinion Editor for The Wave. She enjoys working hard to get good grades in all of her classes...
