When you agreed to Facebook's terms and conditions, did you know you were agreeing to become a subject in a psychology experiment? This weekend, we learned that Facebook permitted an academic research team to conduct an experiment on a huge number of its users back in 2012.
The researchers adjusted the contents of Facebook timelines for nearly 700,000 users so that either positive or negative news dominated. They found that positive news spread positive responses, and negative news spread negative responses.
Some thought leaders dismiss the resulting criticism of Facebook, either because they expected worse or because they think it's no problem. But plenty of others have been crying foul. Facebook's users were considered to have consented to the research because of a line in Facebook's terms permitting research, but many feel that the lack of any form of informed consent -- not even a check box agreeing to be a research subject -- crossed a line.
I don't think this particular problem is an inherent consequence of Facebook's business model: an amoral corporation collecting personal data. It would be entirely possible to have a data-harvesting business yet still stop short of explicitly performing psychological experiments on the user base. It would also probably have been feasible to devise a study that did not tinker with the contents of people's timelines yet still showed the effects of positive or negative trends.
Like most people around me, I'm well aware of how Facebook hopes to monetize use of the system. That's why I avoid using Facebook to share much that matters deeply to me -- I use it casually to keep up with friends and family. I still use it, though, because I assume every company has boundaries it won't cross, both because staying within them is the right thing to do and because the business impact of crossing them can be very negative.
What this particular incident tells us loudly about Facebook is that it has no boundaries when it comes to the ethics of manipulating users. In fact, the company seems surprised anyone would even think ethics were relevant here. That discovery surprises even those of us familiar with Facebook's business model.
I'm especially struck by this official Facebook response:
This research was conducted for a single week in 2012 and none of the data used was associated with a specific person's Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives, and all data is stored securely.
The media spokesperson was clearly worried that Facebook was being accused of violating privacy or security in some way, and had no sense at all that this might be an ethical issue of epic proportions. This is the problem.
I've long thought that to know the character of a corporation you need only study the character of its founders. Yes, I've been warned before not to treat "The Social Network" as a documentary, but Facebook really does seem to embody the character of Zuckerberg that movie portrays: supersmart, capable of empathy, yet occasionally ruthless.
I know plenty of smart, responsible people who work there, yet they still allow this sort of thing to happen. No matter how much Facebook parades its concern about security and privacy, incidents like this serve to underline that Facebook is Zuckerberg playing Hot-Or-Not with the whole Internet for profit.
This story, "Facebook's big problem: Ethical blindness," was originally published at InfoWorld.com. Get the first word on what the important tech news really means with the InfoWorld Tech Watch blog. For the latest developments in business technology news, follow InfoWorld.com on Twitter.