There is an adage that says if you're not paying for the product then chances are, you are the product.
It's a reference to the one-sided bargain for personal information that powers so many social businesses - the way that "infonopolies" as I call them exploit the knowledge they accumulate about us.
Now it's been revealed that we're even lower than product: we're lab rats.
Facebook data scientist Adam Kramer, with collaborators from UCSF and Cornell, reported on a study in which they tested how Facebook users respond psychologically to positive versus negative posts.
Their experimental technique is at once ingenious and shocking.
They took the real-life posts of nearly 700,000 Facebook members and manipulated them, turning them slightly up- or down-beat.
Then Kramer et al. measured the emotional tone in how people reading those posts reacted in their own feeds. See "Experimental evidence of massive-scale emotional contagion through social networks", Adam Kramer, Jamie Guillory & Jeffrey Hancock, Proceedings of the National Academy of Sciences, vol. 111, no. 24, 17 June 2014.
The resulting scandal has been well-reported by many, including Kashmir Hill in Forbes, whose blog post nicely covers how the affair has unfolded, and includes a response by Adam Kramer himself.
Plenty has been written already about the dodgy (or non-existent) ethics approval, and the entirely contemptible claim that users gave "informed consent" to have their data "used" for research in this way.
I want to draw attention here to Adam Kramer's unvarnished description of their motives.
His response to the furore (provided by Hill in her blog) is, as she puts it, tone deaf.
Kramer makes no attempt whatsoever at a serious scientific justification for this experiment:
"The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product ... [We] were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."
That is, this large-scale psychological experiment was conducted simply for product development.
Some apologists have countered that social network feeds are manipulated all the time, notably by advertisers, to produce emotional responses.
Now that's interesting, because for their A/B experiment, Kramer and his colleagues took great pains to make sure the subjects were unaware of the manipulation.
After all, the results would be meaningless if people knew what they were reading had been emotionally manipulated.
In contrast, the ad industry has always insisted that today's digital consumers are super savvy, and that they know the difference between advertising and real life.
Advertising is therefore styled as just a bit of harmless fun.
But I think this positioning is a self-serving mythology, crafted by people who are increasingly expert at covertly manipulating perceptions and who are now armed with our dishonestly collected data.
To read more from Steve Wilson of Constellation Research, visit his blog.