Facebook: stats on positive and negative posts

By letatcest on Monday 30 June 2014 13:50 - Comments (7)
Categories: computers, technology, science, Views: 4,617

Last Friday I noticed a post on Scoop.it by the Dutch TV programme 'Tegenlicht'. The headline read "Facebook tinkered with users’ feeds for a massive psychology experiment". Possibly nice material for the science section of a large Dutch news website. Research is research.

I quickly skimmed the article on Avclub.com that Scoop.it pointed to. Maybe interesting. Maybe not.

Click. The article itself, on PNAS. I am not a statistician, but I learned that when p < 0.05, a result is statistically significant.*
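For fellow non-statisticians, a minimal sketch in Python (all numbers are made up for illustration; nothing here comes from the paper) of why a 'significant' p-value says little by itself once the samples get huge:

import numpy as np
from scipy import stats

# Two hypothetical groups: emotional words per 1,000 words posted.
# The assumed difference between the group means is deliberately tiny.
rng = np.random.default_rng(42)
control = rng.normal(loc=50.0, scale=20.0, size=1_000_000)
treated = rng.normal(loc=50.2, scale=20.0, size=1_000_000)

t, p = stats.ttest_ind(treated, control)
print(f"t = {t:.1f}, p = {p:.2g}, 'significant' at 0.05: {p < 0.05}")
# With a million observations per group, this minuscule shift comes out
# highly 'significant', even though the effect size (Cohen's d = 0.01
# here) is negligible for any individual user.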

Scrolling through the article, I see some significant values, but the conclusion reads:

"Although these data provide, to our knowledge, some of the first experimental evidence to support the controversial claims that emotions can spread throughout a network, the effect sizes from the manipulations are small (as small as d = 0.001). These effects nonetheless matter given that the manipulation of the independent variable (presence of emotion in the News Feed) was minimal whereas the dependent variable (people’s emotional expressions) is difficult to influence given the range of daily experiences that influence mood (10). More importantly, given the massive scale of social networks such as Facebook, even small effects can have large aggregated consequences (14, 15): For example, the well-documented connection between emotions and physical well-being suggests the importance of these findings for public health. Online messages influence our experience of emotions, which may affect a variety of offline behaviors. And after all, an effect size of d = 0.001 at Facebook’s scale is not negligible: In early 2013, this would have corresponded to hundreds of thousands of emotion expressions in status updates per day."

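To get a feel for what d = 0.001 means at Facebook's scale, a back-of-the-envelope sketch (every number below is my own assumption for illustration, not a figure from the paper):

# Cohen's d = (mean_treated - mean_control) / pooled_sd, so the shift
# in the mean per post is d times the standard deviation.
d = 0.001                        # effect size reported in the paper
sd_emotion_words_per_post = 1.0  # assumed std dev of emotional words per post
posts_per_day = 500_000_000      # assumed Facebook-scale posting volume

shift_per_post = d * sd_emotion_words_per_post
extra_per_day = shift_per_post * posts_per_day
print(f"~{extra_per_day:,.0f} extra emotion expressions per day")
# -> ~500,000: invisible for any single user, yet "hundreds of thousands"
# of emotion expressions per day in aggregate, as the authors note.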
And, even more important in my case, the publication date:

Published online before print June 2, 2014, doi:10.1073/pnas.1320040111
PNAS June 17, 2014, vol. 111, no. 24, 8788–8790

This means it had been floating around the Internet since the second of June. Ah well, let's just skip it; I probably just missed it.

Not. Everybody missed it. And in my humble opinion, it shouldn't have attracted the amount of attention it has received since this weekend. It's interesting to see how things work behind the scenes of one of the largest empires Earth has ever seen, but this is just a glimpse, and a tiny one.

Why is the world so surprised? The surprise makes no sense at all. It would be surprising if companies that collect data and can manipulate it did not use it, for better or worse.

In this case, the results were published. Openly.

The question that should be addressed is this: why is it not mandatory for companies that aren't 'companies' in the old sense of the word (because they serve the world the way public institutions would) to list their experiments openly? All experiments. Just like universities and other knowledge institutions (should) do.

Open Access.
(image: http://tweakers.net/ext/f/8rf6gdGJElqSQzFjIO2Wdx8F/medium.png)

* Or at least it should be, provided all the other variables are in order, etc. etc.


Comments


By Tweakers user 318025, Monday 30 June 2014 18:49

Darn it, here I am on a Dutch technology site and I still have to read English :(

By Tweakers user geekeep, Monday 30 June 2014 19:10

> And in my humble opinion, it shouldn't have attracted the amount of attention it has received since this weekend. It's interesting to see how things work behind the scenes of one of the largest empires Earth has ever seen, but this is just a glimpse, and a tiny one.
> Why is the world so surprised? The surprise makes no sense at all. It would be surprising if companies that collect data and can manipulate it did not use it, for better or worse.

It should cause a lot of attention and surprise.

As you said, FB is one of the largest companies (until the social media bubble pops...) and for that reason should take responsibility for the actions it chooses. Especially if this manipulation of data has emotional consequences. It is quite possible they are playing with human lives; this should have gone before an ethics board just like any other scientific experiment. In addition, participants should have been informed about this kind of experiment. Facebook's agreements (or terms of service) do not constitute consent to being a guinea pig in social experiments that mess with your emotions.

It would have been a totally different case if Facebook had only used data that was already there. I'm not against using data for other purposes, but I do not agree with manipulating data for emotional (unconscious) experiments.
If we accept this kind of experiment, it would mean the beginning of a very scary future (as we say in Dutch: 'het hek is van de dam').

By Tweakers user letatcest, Monday 30 June 2014 19:34

@Diqiu-Long: every now and then I blog in English; see the little flag at the top right ;)

@Geekeep: (un)knowingly, the amount of manipulated data is already enormous (at FB and any other online company). There is no 'objective' data. Also see this post: http://www.newrepublic.co...ok-digital-gerrymandering

This kind of 'experiment' may even have started off as just a 'simple' test to see what would happen if... We don't know and we'll never know. Every 'test' of the FB news feed could be labelled a 'scientific' test. The line may even be too thin.

The final paragraph is the thing I would like to see a real discussion about.

Technically, the 'emotional consequences' were almost nil, or at least not really measurable. The only thing 'visible' was the use of one emotion-associated word more or less per thousand (1,000) words.

Maybe my initial focus was too much on the research itself, apart from the final paragraph.

People make a big fuss about one tiny thing whilst forgetting what's really at play...

(I would recommend some interesting books (which don't have the answer either), like Who Owns the Future? by Jaron Lanier and The Daily You by Joseph Turow (on advertising only).)

By Tweakers user geekeep, Monday 30 June 2014 19:59

> This kind of 'experiment' may even have started off as just a 'simple' test to see what would happen if... We don't know and we'll never know. Every 'test' of the FB news feed could be labelled a 'scientific' test. The line may even be too thin.

There should be a clear difference between pseudo-scientific (FB) and real scientific experiments. I don't see any reason why the experiment was published in a reputable journal such as PNAS. If we allowed every 'test' and 'result' from every company to be published in official scientific papers, the line would indeed become too thin. The problem is that those 'tests' cannot be verified by anyone whatsoever, lacking a real scientific foundation.

> The final paragraph is the thing I would like to see a real discussion about.

Open data? I can be brief about this: some things are better not shared. Not everyone has the intelligence or reasoning skills to cope with any kind of data. The potential misinterpretation of results might have unwanted consequences.

> Technically, the 'emotional consequences' were almost nil, or at least not really measurable. The only thing 'visible' was the use of one emotion-associated word more or less per thousand (1,000) words.

The measured emotional consequences were low, yes. But apart from the questionable accuracy of those measurements in the first place, FB is not able to look beyond the posts themselves, i.e. at the real-life consequences.

> People make a big fuss about one tiny thing whilst forgetting what's really at play...

What's really at play is agreeing to deliberately mess with people's emotions without any informed consent. While this is already done on a large scale in advertising, it should not be part of real scientific research.
We should keep in mind the experiments on the human mind/psyche in 1930s Germany, which were relatively innocent at first but were continued during the Second World War. We all know how that ended...

[Comment edited on Monday 30 June 2014 23:05]


By Tweakers user letatcest, Tuesday 1 July 2014 09:34

Perhaps another interesting take on FB and the whole affair:

http://arstechnica.com/bu...n-users-arent-all-bad/#p3

By Tweakers user geekeep, Wednesday 2 July 2014 00:46

That blog post makes clear what I have been trying to say. There are too many snags in this study to accept it as sound:
- psychologically manipulative research, being unethical / done without informed consent
- some researchers suspect the study was partly supported with government money (and thus circumvented legislation)
- the paper may possibly be retracted by PNAS
- Facebook is big and can therefore also cause big (unintended) consequences
- it is not the first time FB has antagonised its users with unwanted practices

Facebook supposedly wants to filter the flood of posts for its users (because of the sheer volume). Why not let users themselves choose what is interesting/relevant, based on likes/frequent contact/views? I don't see how the emotional slant of a post helps in filtering posts for the user.

By the way, I fully agree with the best comment on that blog:
"A/B testing is, in of itself, not immoral or unethical.

A/B testing designed to manipulate the psychological and emotional state of the test subject, in a potentially deleterious manner, without express informed consent? Boy is that ever immoral. "

By Tweakers user letatcest, Wednesday 2 July 2014 09:46

One more, then: a comparison of many of the opinion pieces about it:

https://ksj.mit.edu/track...mproper-informed-consent/

Comments are closed