Social Media: A Battleground for Humanity Vs. Technology

Anoushka Sarup
3 min read · Dec 20, 2020

I opened Snapchat this morning and was pleasantly surprised to find that I’d taken enough photos to generate my year-in-review for 2020. Although many social media apps now have a similar feature, I’ve always found Snapchat’s version interesting because of how it categorizes content to paint the illusion of a truly fulfilling year.

Using software that analyzes the images and text in your photos and videos, Snapchat can identify which ones are selfies, which were taken outdoors, what time they were taken, and so on. It then uses that information to present you with a “year of laughs”, “early mornings”, “sharpening those knife skills”, and more. That’s pretty impressive, but even Snapchat’s technology has its limits, as seen in Twitter user @leahclarkxx’s post, where the photo used for “a year of flexing green thumbs” was of her Nana’s burial.

A screenshot of Twitter user @leahclarkxx’s post. (Source: Twitter)
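
To make that mechanism concrete, here is a minimal sketch of how such categorization might work, assuming a pipeline that attaches labels and a timestamp to each photo and then buckets photos into themes. The Photo structure, the label names, and the theme rules are all invented for illustration; this is not Snapchat’s actual system.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Photo:
    """A single photo plus the machine-generated tags a platform might attach to it."""
    taken_at: datetime
    labels: set[str]  # e.g. {"selfie", "smile", "plant"} -- invented label names


# Invented theme rules: a theme matches when any of its trigger labels appears.
THEMES = {
    "a year of laughs": {"smile", "laughing"},
    "early mornings": set(),  # filled by the time-of-day rule below
    "flexing green thumbs": {"plant", "garden"},
}


def build_year_in_review(photos: list[Photo]) -> dict[str, list[Photo]]:
    """Bucket photos into themed reels using only labels and timestamps."""
    reels: dict[str, list[Photo]] = {theme: [] for theme in THEMES}
    for photo in photos:
        # Time-based rule: anything shot before 7 a.m. counts as an early morning.
        if photo.taken_at.hour < 7:
            reels["early mornings"].append(photo)
        for theme, triggers in THEMES.items():
            if triggers & photo.labels:
                reels[theme].append(photo)
    # Nothing here asks what a photo means to its owner: a funeral snapped in a
    # garden carries the "plant" label just as cheerfully as a houseplant does.
    return reels
```

A pipeline like this has no notion of what a photo means to the person who took it, which is exactly where things go wrong.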

And this isn’t the first time we’ve encountered the insensitivity of technology. In 2014, a story went viral about Facebook’s Year in Review showing a grieving father a photo of his beloved daughter, who had passed away that year (his ‘top post’), surrounded by clip-art partygoers. It was tactless, cruel, and done without his consent. It was also six years ago.

But if you scroll through the comments under Leah’s Twitter post, it’s clear that nothing has changed: social media is still as harsh as ever. Sure, these features might work great for the vast majority of people, but they fail to take into account those who’ve experienced losses or hardships. And in a year like 2020, the least one would’ve expected is for these companies to show a little sensitivity.

Which brings us to the big question: are these features even for their users? Even if we set this year aside and assume that only a minority of people have had years bad enough for this content to upset them, why shouldn’t these apps account for that minority’s feelings? Isn’t the aim of this software to make the platform’s users happy and enrich their experience? Unless, of course, the aim is just to increase engagement as social media companies try to wrestle users away from one another in pursuit of the greatest reach and power of influence.

But that’s what’s so sad, isn’t it? Platforms that thrive solely on the people who use them refuse to care about those same people. We’ve been reduced to clicks and faceless troves of data as our needs and desires become secondary to these companies’ expansion and income. The technology being designed for us doesn’t have us at its core, and we’re left at the mercy of algorithms and code that have neither heart nor mind to consider the feelings of users.

The need of the hour is to encode ethics into our software: involve a human aspect, design empathetically. Even if we don’t yet have AI that can decide whether a person would want to see content of this nature, we can at least put measures in place to ask for the user’s consent before showing it (which a few platforms, thankfully, have started doing). That stopgap, however, doesn’t mean our goal shouldn’t still be to develop technology that abides by certain ethics.
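
As one concrete (and entirely hypothetical) example of such a measure, a memories feature could stay off until the user explicitly opts in, and could honour a user-maintained exclusion list before anything is resurfaced. The names below are assumptions for illustration, not any platform’s real API.

```python
from dataclasses import dataclass, field


@dataclass
class UserPreferences:
    """Hypothetical per-user settings for an automated memories feature."""
    memories_opt_in: bool = False  # off by default: consent must be explicit
    hidden_people: set[str] = field(default_factory=set)
    hidden_dates: set[str] = field(default_factory=set)  # e.g. {"2020-08-02"}


def should_show_memory(prefs: UserPreferences, people: set[str], date: str) -> bool:
    """Show a memory only if the user opted in and nothing in it is on an exclusion list."""
    if not prefs.memories_opt_in:
        return False
    if people & prefs.hidden_people:
        return False
    if date in prefs.hidden_dates:
        return False
    return True


# Example: a user who opted in but asked never to be reminded of a particular date.
prefs = UserPreferences(memories_opt_in=True, hidden_dates={"2020-08-02"})
print(should_show_memory(prefs, people={"Nana"}, date="2020-08-02"))  # False
```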

Social media was built upon fundamentally human concepts such as forming connections, sharing moments, and documenting emotions. So to see it being ruled by heartless algorithms and AI is disheartening. Humanity versus technology: who will prevail in this social media game?
