Monday, June 30, 2014
Facebook manipulated emotions of 689 thousand users
We all know emotions are contagious. Being around "happy" people will make you feel good; being surrounded by depressed people, not so much (see cases of mass hysteria for extreme scenarios). But does the same thing happen remotely, through social networks? That's precisely what Facebook wanted to find out, using over 689 thousand of its users as lab rats in the process.
Many (most?) of these thousands of people won't be happy to learn they've been treated as guinea pigs, but the fact is, each and every Facebook user agreed to become one when they swiftly checked the "I agree" box while registering for the service. Among all the other stuff no one ever reads, there is a clause stating your info can be used for testing and research.
... One could still argue that deliberate manipulation goes a bit beyond "research"... but that's for the courts to decide, if/when someone decides to sue Facebook over this.
The test was quite simple. For a week, Facebook slightly tweaked what showed up in these users' news feeds: some users saw slightly more "positive" posts; others saw slightly more "negative" ones. Then Facebook measured how they reacted. And indeed, although the effect was very small (about 1 word in 1,000), users were statistically influenced by it.
But one shouldn't forget that these kinds of "manipulations" aren't exclusive to any single service. All kinds of media "manipulate" us (or at least try to), as do shops and stores - by the way they place their products - and even websites. Most sites continuously run so-called A/B tests, where they present slightly different versions to a small subset of their users, to figure out which one results in more ad clicks or purchases.
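To give an idea of how simple this kind of testing is to set up, here is a minimal sketch in Python of how a site might split users into A/B groups. All the names (the experiment label, the variants) are illustrative, not taken from any real service; the point is just that hashing a user ID gives each user a stable, seemingly random group assignment with no stored state:

```python
import hashlib

def ab_bucket(user_id: str, experiment: str = "feed-tweak",
              variants=("A", "B")) -> str:
    """Deterministically assign a user to an experiment variant.

    Hashing (experiment + user_id) means the same user always lands
    in the same bucket across visits, while the split over many
    users is roughly even.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Same user, same bucket, every time:
print(ab_bucket("user-42"), ab_bucket("user-42"))
```

The site then serves each bucket a different version of the page and compares metrics (clicks, purchases, or - in Facebook's case - the emotional tone of what users post) between the groups.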
So... I think the best way to deal with this is to be conscious of the "manipulative" world we live in, and not try to pin it on any single service.