Facebook, Soylent Green, and you


You'll take my Soylent Green from my cold, dead hand!  Wait... no.  OK, you'll pry my social media feed from my cold, dead hand!  There we go.  That's where I was going.  I think.  Anyway, there once was a movie called Soylent Green.  It existed entirely for the Twilight Zone, pre-M. Night Shama... fuckitwhatever twist ending.  The world is overpopulated, and people are kept fed, in part, by the consumption of processed human corpses, repackaged as "Soylent Green."  Hence the revelation at the end, with glorious overacting by Chuck-Moses.  Nobody ever watches the movie, because the movie sucked.  It's just a meme now.  Memes.  And that brings us to... Facebook.  Which, like the movie, sucks.  So hey, change the name!  Whatever.

So Facebook sucks.  Over the last few weeks, we have had a sequence of news stories about not just the effects of Facebook and Instagram (owned by Zucky and bride of Zucky), but how much internal research Facebook has had about their own effects.  So, come on!  Get 'em!  They're an evil corporation imposing evil on us!  R...ight?

Yet Facebook is not a thing imposed on the world by some douche named, "Mark Zuckerberg."  OK, is Mark Zuckerberg a douche?  Yes.  I can't dispute that.  We can go Heathers on this.  Why is he such a megabitch?  Because he can be.  But the rest of that statement?  False.  Facebook is like Soylent Green.  It's people.  It's you.  (Not me.  I'm not on Facebook, nor any social media.  Blogspot doesn't count, since nobody reads it.  Also, it's owned by Google, not Facebook, which is roughly 3% less evil than Zuckerberg, so there.)  Where was I?  Oh, right.  Facebook is people.  It's you.

Amid all of the pearl-clutching and oh-my-god, I-can't-believe-Facebook-did-that, let's remember that nobody makes anyone use Facebook.  Nobody makes anyone believe anything on Facebook.  If you use Facebook, that's on you.  If you are gullible enough to believe something you read on Facebook, that speaks to your own credulity.

There are several complications.  The most morally troubling relates to kids.  Facebook has internal research showing that teenaged girls have higher rates of depression, anxiety and body image issues associated with Instagram, and gee, why would that be?  That's the tip of the iceberg with kids, and yet even there, one may ask the extent to which parents have a responsibility.  Does this exonerate the construction of algorithms shown to harm?  No, but it does temper the matter.

With respect to adults, politics and misinformation, though-- my normal realm (and oy fucking vey)-- the key point about Soylent Green is that what you are consuming is the product of people.  Mostly living people, unless we're talking about accounts that stay up after people die, and that could turn this post way more morbid than I intended when I started writing, and yeah, tomorrow is Halloween, but let's move on.  The point is that some wackadoo writes or forwards or links or whatever a crazy conspiracy theory on Facebook.  What next?

Let's take two lies (no truth)-- a leftie lie and a righty lie.  The righty lie is that Trump won, but the 2020 election was stolen from him with unprecedented levels of voter fraud.  Total lie.  If you follow Facebook links associated with this lie, you will go down a rabbit hole that eventually leads to support for violent insurrection, like January 6.  The lefty lie for this post will be that Michael Brown died trying to surrender to Darren Wilson.  Yes, this is a lie, debunked by Obama's DoJ, under Eric Holder.  There is still a high likelihood that you didn't know this, if you live within Facebook's left-wing bubble.  Can this lead to support for crazy, violent shit?  Yes.  See Osterweil, Vicky.  Then, as a corrective, go read the actual, fucking DoJ Report.  This is Obama's DoJ.  Eric Holder.  You've been lied to.  You've been bamboozled.  Go read.  I gave you the link.

But the thing is, these lies get circulated, not by Mark Zuckerberg, nor this evil AI called, "Facebook," but by people.  People who just heard a thing and believed it without checking their facts, and then passed it on.  The result is that we observe echo chambers.  But echo chambers aren't new!  If you are in an echo chamber, you selected it.  And a person, or people, not Facebook, feed you information/misinformation.

A big part of this is laziness.  And that's what some of the recent research demonstrated.  A few clicks without any intentionality, and you'll wind up in the hole of a rabbit, and wow, that sounded wrong with just a slight rephrasing.  The misinformation/echo chamber thing?  You need to work actively to avoid it when Facebook's algorithm operates as it does.  And yes, that's true for you lefties too.  Yet even when you are in an echo chamber of misinformation, the misinformation and echoes are coming from... those like you.  And that's the problem.

The algorithms do what you do naturally-- select for similarity.  Most people would rather not read stuff with which they disagree.  Trumpists don't want to read that no, really, Trump lost and it's all a fucking lie by a megalomaniacal wannabe-dictator.  BLM activists don't want to read that no, really, "hands-up-don't-shoot," the foundational incident in Ferguson, was a lie.  The algorithms magnify your natural tendencies.  And wow do they magnify those tendencies.  It is astonishing, even to me, how quickly they'll send you spiraling.
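That "select for similarity" mechanism is simple enough to sketch in a few lines. This is a toy illustration only — nothing resembling Facebook's actual ranking code, and every name and number in it is made up — but it shows how ranking by match-to-prior-engagement makes a one-sided history produce an even more one-sided feed:

```python
# Toy sketch of similarity-based feed ranking (hypothetical, NOT
# Facebook's real algorithm): score each post by how closely its tags
# match what the user already engages with, then serve the top matches.
# The "magnify" effect falls out for free: the more one-sided your
# history, the more one-sided your feed.

def similarity(post_tags, user_history):
    """Fraction of a post's tags the user has already engaged with."""
    if not post_tags:
        return 0.0
    return len(post_tags & user_history) / len(post_tags)

def rank_feed(posts, user_history, k=3):
    """Return the k posts most similar to the user's history."""
    return sorted(posts,
                  key=lambda p: similarity(p["tags"], user_history),
                  reverse=True)[:k]

# Made-up example data: a user whose history is all one flavor.
posts = [
    {"id": 1, "tags": {"election", "fraud"}},
    {"id": 2, "tags": {"election", "courts"}},
    {"id": 3, "tags": {"recipes"}},
    {"id": 4, "tags": {"fraud", "conspiracy"}},
]
user_history = {"election", "fraud", "conspiracy"}

feed = rank_feed(posts, user_history, k=2)
print([p["id"] for p in feed])  # the posts matching prior engagement float to the top
```

Note that nothing in the sketch "decides" anything about truth or falsity; it just hands you more of whatever you already clicked on. That's the whole point of the paragraph above.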

But the misinformation is being given to you by people who range from sloppy ideologues to filthy liars, and those who believe what they are reading are making what is, at this point, such an obvious error that it boggles the mind.  Or at least, my mind.

And here we are, talking about Facebook, as though Facebook is some AI, making choices.  It isn't.  What the algorithms are doing is connecting people who make cognitive errors to other people who make cognitive errors, thereby magnifying the societal deterioration caused by a lack of cognitive rigor.  Root problem:  cognitive error.

Facebook, like Soylent Green, is people.  It's you.  It doesn't do anything.  You do.  When lies and misinformation spread on Facebook, that means people are spreading lies and misinformation, and people are making the cognitive error, first of letting themselves exist within information bubbles, and then of believing without fact-checking when they are existing within such bubbles.

Facebook is an instrument that magnifies the negative consequences of your worst tendencies, but it acts through your worst tendencies.  Scratch that.  Did you see what I just did?  I wrote that it acts.  It doesn't act.  It doesn't do fuck-all.  Facebook is a platform.  A venue.  A media outlet through which people act badly and magnify their own worst tendencies.  The company has found that its ad revenues and data sales go up when it uses algorithms that magnify cognitive errors even more, but the errors are already there.  Because Facebook isn't a thing.  Facebook is you.  Facebook is people.  Every bit of ugliness in Facebook is the ugliness of humanity.

And with that, there is only one musical choice for the morning.  My man, Frank.  What's the ugliest part of your body?  Some say your nose, some say your toes, but I think it's your mind.  The Mothers of Invention, "What's The Ugliest Part Of Your Body," from We're Only In It For The Money.  Some days, I nail the musical selection.
