Facebook Monitoring: How Much Should You Care?

Jean Dion • January 06, 2014

On an average day, I'd bet that hundreds or even thousands of research papers are written, published and promoted. It's very likely that few of them are ever seen, much less debated. But sometimes, someone writes something so interesting and so unusual that it sparks widespread interest, even if people don't really take the time to read the entire study in question. That's what happened to two people who conducted a study about self-censorship and its role in the life of the average Facebook user. These researchers probably thought they were onto something grand and remarkably helpful for future marketing companies, but their research seems to have freaked out many, many people.

I decided to try to get to the bottom of it, to help you to understand how much you might need to worry about your privacy and your reputation as you merrily chat (or don't chat) with your friends on the blue-and-white behemoth. Here's what I found out.

A Simple Study

The original study, which was published by the Association for the Advancement of Artificial Intelligence, is less than 10 pages long, and the entirety of the thing is available online. For people who read such research all day long, it's pretty straightforward. The researchers outline what they were trying to prove, what they did and what they found. And there's some interesting stuff here. Over 5 million people were included in the study, and they weren't notified of their inclusion. Their names weren't attached to the data, and neither was the specific information the person put on Facebook.

However, a specific amount of information was collected about each participant, including data the researchers put under the helpful heading of "demographic."

That data included:

  • Gender
  • Age
  • Political affiliation
  • Group member count
  • Days since joining Facebook

In addition, information about the person's number of friends, age of friends, political affiliations of friends and density of social interactions was collected. If a person in the study typed something into a box, like a personal wall, the wall of a friend or the wall of a group, and then chose not to post that information within a specific period of time, the researchers got a ping about the event, and they could cross-reference it with all of the other data they'd gathered. This kind of data mining could, ostensibly, provide the company with a significant amount of information about what might keep people away from Facebook.
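To make that mechanism concrete, here's a minimal sketch of what such an event record might look like. This is a hypothetical simulation, not Facebook's actual code: the field names, the record layout and the `record_entry` function are all my assumptions. The one point it illustrates is from the study itself: only metadata and a yes/no flag get logged, never the words the person typed.

```python
from dataclasses import dataclass

@dataclass
class CensorshipEvent:
    """What the researchers could log: metadata only, never the text."""
    user_age: int
    user_gender: str
    friend_count: int
    box_type: str        # e.g. "own_wall", "friend_wall", "group_wall"
    self_censored: bool  # typed something, then never posted it

def record_entry(profile: dict, box_type: str, typed_text: str,
                 posted: bool) -> CensorshipEvent:
    # The text is inspected only to confirm something was typed;
    # it is deliberately not stored anywhere in the event.
    return CensorshipEvent(
        user_age=profile["age"],
        user_gender=profile["gender"],
        friend_count=profile["friend_count"],
        box_type=box_type,
        self_censored=bool(typed_text.strip()) and not posted,
    )

# A user drafts a comment on a friend's wall, then deletes it unposted:
profile = {"age": 34, "gender": "f", "friend_count": 212}
event = record_entry(profile, "friend_wall", "I can't believe my boss...",
                     posted=False)
print(event.self_censored)  # True: the ping fires, but the words are gone
```

Nothing in the resulting record could reproduce what the person almost said; it only reveals that someone of a certain age, gender and network size backed out of posting.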

Are they afraid to speak out when their friends are different? Do they refuse to share with people older or younger than they are? Do large networks foster more sharing? It's the kind of stuff marketers would salivate over.

The People Are Angry

While the study might seem reasonable enough to tech-heads, many bloggers and columnists were up in arms about the whole idea. Slate seems to have broken the story, and in that piece, the author suggests that this sort of behavior is reminiscent of recent NSA surveillance scandals. She worries about how much the company has access to, and she ends the piece with a quote from Dave Eggers: "All that happens must be known." Other bloggers jumped on board, including Ars Technica, which covered the piece under the headline, "Facebook is tracking what you don't do on Facebook."

The piece covers the research in great detail, but again, it ends on a somewhat ominous note, suggesting that the site should "be warned." The writers of blogs like this don't seem to be interested in what this specific study does and how it works. Instead, they seem worried about the mere fact that Facebook could keep track of the things you do on the site, even if you don't pull the trigger and make your actions public. Their concern sometimes gets a little overblown, as a casual reader of these articles might walk away thinking that Facebook execs are reading their words, right now, even if they don't post them.

Words of Reason

Alarmism is easy to cultivate, but some writers have exercised a little caution. Techno bloggers (and the Daily Kos gets a big mention here) think that much of this hysteria is just, well, hysteria. They note that all websites collect some kind of data on users, and that collecting information about boxes users click on is as easy as entering one simple line of code into the gobbledygook that runs the site. For bloggers in this camp, there's no real reason for alarm, as the company isn't storing the text itself anywhere, and the anonymous nature of the study suggests that no one person's privacy was violated. Since the data collected didn't even concern the information that people did or did not post, there's no real privacy breach at all. It's just code. Just numbers. It just suggests a vague type of activity.

The underlying message: Calm down already.

The Middle Way

I'd love to say that I don't find this study creepy. The fact remains, however, that I do have some problems with the way this whole thing went down. I'm a little worried about the fact that Facebook programmers collected this information without disclosing the fact that they would do so. I know they don't have to make these disclosures, but I'd love to think that openness and transparency would rule the way the company works, especially since so many Americans really are freaked out about their reputations, their privacy and corporate spying. Why not be open?

Secondly, I'm worried that unpublished information could, at some point, be collected. Yes, none was collected this time. Yes, no one was outed this time. But isn't it at least possible?

Consider this: What if local authorities wanted to find out about the number of people who typed in certain phrases in their dialogue boxes, even if they chose not to publish them at the last minute? What about employers who might like to know about the number of employees who were going to post about their drunken weekend escapades, and then pulled back and deleted the data? To me, this study seems to suggest that those two scenarios are at least possible, even if they're not happening right now.

So yes, I am a little alarmed. I'm not picketing, and I'm certainly not suggesting that people should stay away from Facebook altogether. But I feel compelled to suggest that everyone think carefully about what they'd like to publish. Say the words aloud before you type them, and if they still sound decent, go ahead and post them.

The bottom line: Self-censorship on the site is likely safe, for now. But in the future? I'm not sure. That's why it's best to start building good habits now.