
Can Social Media Redeem Social "Science"?


I originally wrote this post one year ago today. And being the dedicated science stoner that I am, I forgot to publish it. I think I confused myself when it kept getting longer and longer and turned into two separate posts. But I just came across it again and it's still very fascinating and cool and kind of a little scary. So here ya go, Curmudgeons, read on about some kinda crazy human behavior shit!

I've said it before and I'll say it again: the social sciences ain't science! They're wimpy wannabe sciences for scientists who can't science. They're the erectile dysfunction of academia; that's why they're called 'soft' science. And one of my biggest gripes with the social sciences is sample size. Psychologists and sociologists will often present "studies" that come from a ridiculously small number of people under variable conditions, as if it were possible to derive any kind of meaning from a small sample under subjective, easily manipulated, highly variable circumstances. The real problem is that the general public sees these fancy, sciencey-looking "studies" and takes their "findings" seriously. The famous Stanford Prison Experiment comes to mind, wherein 24 college students appear to have taken to sadism and torture in their roles as guards and prisoners, indicating that human beings are sadistic by nature when given the opportunity. The Lord of the Flies hypothesis. Or, just as easily, these impressionable, white, middle-class undergrads could have been playing to the expectations of their professor and doing exactly what he implicitly wanted them to do: the ultimate in confirmation bias. And we're supposed to take away profound insights about all of humanity from this little weekend role-playing exercise. Yet this wishy-washy gobbledegook of a "study", with no real data, is commonly cited in the media as one of the great understandings of human nature. And it's a bunch of horse shit! The general public reads this kind of tripe and thinks they're getting valuable "scientific" insight into the human condition, when they're actually getting the overzealous ego of a charismatic professor who really wants to get published. These ridiculously small sample sizes and easily manipulated outcomes are why sociology, psychology, and philosophy are such eye-rolling, face-palming jokes of disciplines to real scientists.

But then came Facebook...

Facebook has 1.3 billion regular users, all interacting through the exact same interface. Controlled conditions, MASSIVE sample size. And they tend to be dicks to each other.

A few years ago, when the general public really began to discover the pleasure of posting, tagging, and liking embarrassing photos on social media, Facebook suddenly found they had a problem: complainers. Over the week between Christmas and New Year's 2011, more pictures were uploaded to Facebook than exist on Flickr over the entirety of its existence, including now. That's a lot of freakin' Christmas pics! And along with all of those pics come all of those complaints. Shit tons of them. Over a million in that one week. Facebook would need thousands and thousands of people just to go through and organize the complaints (which, in fact, they do have now... but this was in 2011). The sudden flood of over a million complaints forced them to sit back and figure out what to do, how to organize them. The vast majority of complaints were miscategorized: pictures of moms with little babies reported for harassment, families in matching Christmas sweaters reported for nudity, pictures of puppies reported as hate speech, etc.

What they needed was a way to get people to accurately categorize their complaints. Why don't you like that picture? Why is that puppy 'hate speech'?

Turns out, in almost every picture reported, the person complaining was in the picture. The mom with the little baby reported for harassment was someone's ex; the puppy reported as hate speech was a shared dog from a couple that had just broken up. These complaints were just "the natural detritus of human dramas," according to Andrew Zolli, who reported this story for Radiolab. And that human detritus was miscategorized because of how Facebook was set up: people would just click whatever options were there in order to get to the next screen. So Facebook did an experiment.

They added a step: a little box that asked "How does the photo make you feel?" Now, generally, science doesn't concern itself with subjective, malleable topics like how something makes you "feel." Far too squishy and susceptible to interpretation to be considered factual, hard data. But the engineers at Facebook found something interesting. They gave people a list of options for how the photo made them feel: embarrassing, upsetting, saddening, bad photo, etc., which hugely increased the accuracy of categorization; suddenly 'embarrassing' accounted for 50% of the complaints. They also had an option for 'other'. When people clicked 'other' they often typed in "it's embarrassing," even though embarrassing was already on the list. So Facebook added the modifier 'it's' to the list: it's embarrassing, it's upsetting, it's saddening, it's a bad photo, etc. Immediately the share of photos categorized as embarrassing went from 50% to 78%. A 28-percentage-point change in social behavior, simply by adding the modifier 'it's'. Language affects outcomes in a predictable manner. For millions and millions of people.
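If you want to picture what measuring that looks like, here's a minimal sketch of the tallying side of a wording experiment like this. Everything in it is invented for illustration: the variant names, the simulated click rates (pinned to the 50% and 78% figures above). It's emphatically not Facebook's code or data.

```python
# Hypothetical sketch: same option list, two phrasings, measure how often
# people pick the "embarrassing" option. All numbers are made up.
from collections import Counter
import random

VARIANTS = {
    "plain": ["embarrassing", "upsetting", "saddening", "bad photo", "other"],
    "its":   ["it's embarrassing", "it's upsetting", "it's saddening",
              "it's a bad photo", "other"],
}

def simulate_report(variant: str) -> str:
    """Stand-in for a real user's click; rates loosely echo the post
    (50% pick 'embarrassing' with plain labels, 78% with the "it's" labels)."""
    rate = 0.50 if variant == "plain" else 0.78
    options = VARIANTS[variant]
    return options[0] if random.random() < rate else random.choice(options[1:])

def run_experiment(n_users: int = 100_000) -> None:
    for variant in VARIANTS:
        picks = Counter(simulate_report(variant) for _ in range(n_users))
        embarrassed = picks[VARIANTS[variant][0]]
        print(f"{variant:>5}: {embarrassed / n_users:.1%} chose 'embarrassing'")

run_experiment()
```

At a hundred thousand users per variant, noise basically vanishes; that's the whole point of the sample sizes we're about to talk about.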

So that's kind of cool: how you word something affects how people respond. No huge breakthrough here. But wait, there's more. And bear with me, I know this is getting long-winded. It gets interesting.

Just getting the complaints categorized didn't solve their problem. The real problem was that Facebook couldn't just start taking people's pictures down willy-nilly. There's nothing in Facebook's terms of service that says you can't post embarrassing pictures of people, and taking pictures down would violate the rights of the person posting the picture. What they needed to do was get people to figure it out amongst themselves. So they crafted a sentence.

To get people to actually interact with each other and deal with it themselves, they gave them the option of messaging the person who posted the photo. Twenty percent of people responded. Next they gave them a pre-written message along the lines of "Hey, I didn't like this photo. Take it down." Response immediately went up to 50 percent. Next they added modifiers like 'please' or 'I'm sorry, but...' or directly used the person's name, like "Hey Rueben, I'm sorry, but would you please take that down..." etc. And what they found was kind of amazing; real data: simply adding the person's name increases response by 7 percent, "would you please" works 4 percent better than "would you mind," saying 'sorry' or 'please' significantly reduced responses, and so on. They were able to tweak and modify, tweak and modify, using real-time responses from hundreds of thousands of people, until they had the exact right sentence that most often got the desired response.
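For the curious, here are the bones of what that tweak-and-measure loop might look like: rank candidate templates by response rate and sanity-check that a gap isn't just noise. The templates and counts below are invented stand-ins, and the whole pipeline is guesswork on my part, not Facebook's actual system.

```python
# Hedged sketch of ranking message templates by response rate.
# (template, times shown, times the recipient responded) -- fake data.
from math import sqrt

results = [
    ("Hey, I didn't like this photo. Take it down.",        50_000, 25_000),
    ("Hey {name}, would you please take that photo down?",  50_000, 28_100),
    ("Hey {name}, would you mind taking that photo down?",  50_000, 26_300),
]

def rate(shown: int, responded: int) -> float:
    return responded / shown

def z_score(shown_a, resp_a, shown_b, resp_b) -> float:
    """Two-proportion z-test: is variant A's response rate really higher than B's?"""
    p_pool = (resp_a + resp_b) / (shown_a + shown_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / shown_a + 1 / shown_b))
    return (rate(shown_a, resp_a) - rate(shown_b, resp_b)) / se

best = max(results, key=lambda r: rate(r[1], r[2]))
print("best template:", best[0], f"({rate(best[1], best[2]):.1%})")

# With samples this large, even a few-point gap is wildly significant:
_, shown_a, resp_a = results[1]
_, shown_b, resp_b = results[2]
print("z =", round(z_score(shown_a, resp_a, shown_b, resp_b), 1))  # ~11
```

That z of roughly 11 is the kind of number a psychologist with 24 undergrads will never, ever see.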

Okay, again, no biggie, right? Facebook spent oodles of money having a bunch of nerds sit around and compose the most statistically effective sentence. So what? Well, this was just the beginning.

They decided to bring in some real scientists: neuroscientists, data scientists, computer scientists and such, as well as some psychologists and social scientists, to help analyze their findings. The researchers were blown away by the massive sample sizes and the crazy, statistically significant, predictable outcomes. "I'm just stunned and humbled at the numbers we get in these studies," said Emiliana Simon-Thomas, a neuroscientist from Berkeley brought in for the study. Even the psychologists recognized the significance of what they were now capable of analyzing: real-time data involving hundreds of thousands or even millions of people of all different races, sexes, settings, and backgrounds, millions of diverse data points providing increasingly accurate predictive outcomes. It kind of gives you a science boner. Social science may just have a chance at going legit.

The realization that how things are presented can modify the behavior of millions of people in predictable ways began to get applied to everything, and the likelihood that you are currently a research subject in at least ten social science experiments is hovering right around 100 percent. So of course there's Fox News and the conspiracy theorists.

You may have heard the story of how Facebook manipulated users' news feeds to see how they responded. By purposely putting more positive stories in your feed, Facebook can get you to post significantly more positive posts yourself, and vice versa. It wasn't a huge effect, but the newsies went apeshit... for a day or two, saying that Facebook might be killing people with their emotional manipulation and these kinds of experiments amount to mass mind control, blah blah blah. The fear mongers at Fox had a field day.
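Mechanically, the unsettling part is how mundane it is. Here's a toy cartoon of the idea: score each story's sentiment and let the chosen mood float to the top. The word lists are a stand-in I made up (the actual study used LIWC word counts and omitted posts from feeds rather than reordering them), so treat this as an illustration, not the experiment.

```python
# Toy feed-skewing sketch. POSITIVE/NEGATIVE word lists are invented;
# real sentiment scoring is far more involved than bag-of-words matching.
POSITIVE = {"love", "great", "happy", "awesome", "congrats"}
NEGATIVE = {"hate", "awful", "sad", "angry", "terrible"}

def sentiment(post: str) -> int:
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def skew_feed(posts: list[str], mood: str = "positive") -> list[str]:
    """Reorder (not delete) posts so the chosen mood floats to the top."""
    sign = 1 if mood == "positive" else -1
    return sorted(posts, key=lambda p: sign * sentiment(p), reverse=True)

feed = ["I hate Mondays, this is awful",
        "Congrats on the new job, so happy for you!",
        "Lunch was fine I guess"]
print(skew_feed(feed, "positive"))
```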

Where it gets really interesting, and what the fair and balanced, objective reporters at Fox may really be afraid of, is when Facebook can modify your offline behavior.

During the 2012 elections, Facebook decided to see if they could increase voter turnout. Simply by placing a button on everyone's page that said "I voted" and showing a few of your friends who had clicked the button, Facebook was able to increase voter turnout by 2 percent. Which is a good thing, right? Not if you're a Republican. High voter turnout generally means Republicans lose, which is why Republicans have put so much effort into voter suppression the last few years. Increased voter turnout is a good thing, but what if some future Facebook decides they have a political agenda? Two percent of voters is huge when you're talking about the number of people on Facebook, and the ability to predictably manipulate people's behavior could mean that a future Facebook could manipulate the outcome of a presidential election. That kind of concentrated power is disturbing, to say the least.
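To put a rough number on "huge," here's some back-of-envelope arithmetic. Every input except the 2 percent nudge from the story is my own assumption, not a figure from the study.

```python
# Back-of-envelope math on why 2% is scary at Facebook scale.
# us_fb_users and eligible_voters are rough assumptions for illustration.
us_fb_users = 160_000_000   # assumed US user base at the time
eligible_voters = 0.75      # assumed share old enough to vote
nudge = 0.02                # the 2% turnout bump from the story

extra_votes = us_fb_users * eligible_voters * nudge
print(f"~{extra_votes:,.0f} extra votes")  # ~2,400,000
```

A couple million votes is more than the margin of victory in several swing states combined. That's the scale we're casually talking about.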

Yes, this is a paranoid, dystopian future vision, and I do believe Facebook's intentions are benign. According to Arturo Bejar, who headed up this research, "In the work that we are talking about, all of the work that we do begins with a person asking us for help." They are responding to "the natural detritus of human dramas" we talked about before. His group is called the Trust Engineers because they are trying to engineer trust between people. Users come to them with their interpersonal problems, and they study human behavioral responses to engineer a platform where people respond positively to each other and deal with their problems on their own. People have a positive experience and it's less work for Facebook. Basically, people stop being dicks to each other online and Facebook makes more money.

In the past I've written about how soft sciences like social science, psychology, and philosophy are just fancily worded wannabe sciences. But in this day and age of mass social media, huge sample sizes are actually possible, social interactions are actually measurable, and human behavior is actually predictable. The Science Curmudgeon may just have to eat his words. Oops! Look out Facebook, you may have just scienced!

