Opinion: Beware the dangers of social media
December 21, 2022
Sitting in your classroom, or maybe your office, possibly even relaxing at home…Think about the people around you, the things you’ve told them, or deliberately kept from them. Would you let them go through your phone? From text messages to bank information to everything you buy? Would you be comfortable with your boss seeing it, or your significant other, or someone you didn’t even know? The scary truth is that companies already can, most notably Facebook, Instagram, and other social media sites.
For the past 20 years, organizations have made a concentrated push to focus their advertising on social media, from the Army running its own quasi-recruitment ads to companies buying user information in bulk, making it easier for these organizations to target you directly.
Throughout this scramble among social media platforms to profit from your personal information, TikTok has generally stayed on top of the competition, making its way into most of our days one way or another.
The past few years haven’t been the cleanest for TikTok, with multiple viral challenges such as the Kia challenge, the Orbeez challenge, and the Tide Pod challenge, each with its own stories of young kids involved in crime, sometimes leading to their own deaths or the deaths of those around them.
Even though most of these challenges take place on its platform, TikTok itself rarely takes responsibility. Poor monitoring and lax moderation allow short videos of horrific scenes, or challenges meant to harm others, to spread across the platform.
Thus far ByteDance, TikTok’s parent company, hasn’t released a statement about these challenges or a plan for how it will keep people safe on its app.
As a growing adult, I’m nervous for myself, my friends, my brother, and those around him. I’m worried that someone I know will fall into the trap of one of these challenges. These sites spend the majority of their resources coaxing millions into consuming hours of content, content they themselves don’t want to regulate or take responsibility for, spreading thousands of harmful narratives and lies.
The scariest part is that when I get up and look around, I see it working.