Chris Horrie reviews and summarizes aspects of Zeynep Tufekci's new book on the political impact of social media, Twitter and Tear Gas. Horrie discusses Tufekci's analysis of the way Facebook builds profiles using machine learning and conducts extensive real-time experiments in its unceasing attempts to manipulate user behavior.
Surveillance capitalism – the business model of Facebook and Twitter – consists of capturing our attention and then getting us to click on ads.
Enormous brain-power is devoted to increasing the average amount of spending by a fraction. This is highly scientific, with lots of real-time and dynamic testing of how behavior can be influenced.
Machine learning is employed, and the tests are automated and, to an increasing extent, write themselves. These experiments won't work, though, unless there is data – the raw material of this new economic engine.
We have a business model which is set up to figure out exactly how to press our buttons, and it does this by using an enormous amount of data that is captured asymmetrically. You don't get to see what they have on you, and it can come from third parties.
These enormous datasets can be used to deduce things about us that we would never disclose. When you have so much data from so many sources, you can use “computational inference” to figure out, with a degree of probability, who is a trouble-maker, or who is depressed and might be on a manic swing. You can figure these things out even if people do not disclose them – or do not know them about themselves. This is a perfect set-up for authoritarians.
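To make the idea of “computational inference” concrete, here is a minimal, hypothetical sketch – not Facebook's actual system, and with all page names, labels and numbers invented – of how a trait someone never disclosed can be scored probabilistically from behavioral signals such as page likes:

```python
# Hypothetical toy sketch, NOT Facebook's real system: all page names,
# labels and numbers below are invented for illustration only.

# Training data: users who DID disclose the trait (True/False),
# paired with the set of pages they liked.
training = [
    ({"page_a", "page_b"}, True),
    ({"page_a", "page_c"}, True),
    ({"page_d", "page_e"}, False),
    ({"page_c", "page_d"}, False),
]

def infer_probability(likes, training):
    """Naive-Bayes-style estimate of P(trait | likes), with add-one smoothing."""
    pos = [l for l, label in training if label]
    neg = [l for l, label in training if not label]
    p_true, p_false = len(pos) / len(training), len(neg) / len(training)
    for page in likes:
        # How likely is liking this page, given each value of the trait?
        p_true *= (sum(page in l for l in pos) + 1) / (len(pos) + 2)
        p_false *= (sum(page in l for l in neg) + 1) / (len(neg) + 2)
    return p_true / (p_true + p_false)

# A user who never disclosed the trait, but whose likes overlap with
# those of users who did:
score = infer_probability({"page_a", "page_c"}, training)
print(f"inferred probability: {score:.2f}")  # → 0.75
```

With four training users the numbers are meaningless, of course; the point is only that an undisclosed trait is scored from overlap with users who did disclose it, and that the accuracy of such inferences grows with the volume and variety of data held.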
It makes surveillance much easier. At stage one of mass surveillance – phone-tapping, for example – it really was impossible for the authorities to listen to everything. Once a dynamic profile starts to be built up, however, it is easier to target persons of interest and follow them more thoroughly. The act of profiling and targeting itself produces more data, and more inference and pattern-testing for the machine learning model.
You can easily build a model which enables people to be targeted with messages around their fears and anxieties, and these can be linked to life events. This is happening in politics, but not openly. It is happening person by person, and it is invisible to the person being targeted, except for the fact that messages are being received. It is entirely invisible to the public in general, and so, in the case of political positions, they can't be rebutted or denied. “It is happening in the dark. You don’t see what other people are seeing”.
An irony is that the ultra-liberal culture of Silicon Valley is building a machine that can be used by authoritarians. They think they will never lose control, but the lesson of history is that groups like this always do.
TRUMP AND SOCIAL MEDIA
His output ranged from distortion to outright fake news. More important were the comments and further exaggerations made by his supporters in the ‘echo-chamber’. Unless you were a Trump supporter – or rather, unless the machine learning had profiled you as one – you would never see these comments; you would have to hunt them down. This is because FB wants your experience of being online to be pleasant, and so the things you read, from all sources, are likely to agree with you. You will be confirmed in your view, and the criticism will be filtered out.
SUGAR AND SALT
FB gives users a diet of sugar and salt. The sugar is cute material – cuddly animals, birthday reminders and happy news – and the salt, which can be equally enjoyable, is outrage – things we are angry about, or can feel self-righteous about. Both poles attract attention. They just feed us that, and I think that’s really destructive, especially as it is just a way to make money. So you could just be a spammer and figure out ‘hey, I can just feed people fake news about Hillary Clinton’. This would be the equivalent of lots of salt, and it can change your taste buds so that ordinary, useful news tastes unpalatable. You crave the salt, but what remains of more responsible mainstream media won’t give it to you, so you find mainstream media repellent. The algorithm allows you to make a lot of money from doing that.
OBAMA CAMPAIGN STARTED IT
The Obama campaign was the first major political operation to use Facebook effectively, and to lead with it. Tufekci was worried about the asymmetric aspect of gathering the data. In 2016 Ted Cruz’s data people ended up being Donald Trump’s data people. They claimed, according to Tufekci, that they had created individual psychological profiles based on FB likes. We can guess with high probability where you are on the scale of the five major personality traits – extrovert versus introvert, and so on – and we can guess your sexual orientation and religious affiliation even if you never disclose them. We know that some people will vote for more authoritarian politics if they are scared. Others are antagonized by scare-mongering.
The problem is that if you put out scaremongering material aimed at the general public – or even at the fairly narrow demographics you get with newspaper or TV advertising – the votes you gain by spending money on scaremongering can be offset, at least to some degree, by the voters who are mobilized by dislike of the campaign. What if you could go online and target only an audience that is influenced by fear-mongering? That would be much more efficient.
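The efficiency argument reduces to a few lines of arithmetic. In this invented illustration (every number is made up), a broadcast scare ad reaches everyone – including the voters it antagonizes – while a micro-targeted version is shown only to those the profiling model predicts will respond positively:

```python
# Invented numbers, for illustration only: each voter's predicted response
# to a scare ad is +1 (mobilized for the campaign), -1 (antagonized into
# voting against it), or 0 (unmoved).
electorate = [+1] * 300 + [-1] * 250 + [0] * 450

# Broadcast (TV / newspaper): everyone sees the ad, so gains are offset
# by the backlash from antagonized voters.
broadcast_net = sum(electorate)                     # 300 - 250 = 50

# Micro-targeted: the ad is shown only to voters the model predicts
# will respond positively, so there is no backlash to offset the gains.
targeted_net = sum(v for v in electorate if v > 0)  # 300

print(broadcast_net, targeted_net)  # → 50 300
```

Same message, same per-impression cost, six times the net effect – and the same selectivity works in reverse when the goal is de-mobilizing a group rather than mobilizing one.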
The Trump campaign claims that it used FB to “de-mobilize” certain demographics who might have voted against him. They were not trying to persuade these groups to vote for him. The message was “Hillary Clinton is just as bad; stay at home; they are all as bad as each other”. One of the target demographics was black men in Philadelphia. They set out to de-mobilize that group. What did they tell them? We don’t know, but FB knows. Did they tell them things that were correct?
The census data from the 2016 US presidential election is in now, and the most notable factor is the decline in the black American vote – down very considerably from 2012. There are many plausible reasons why this might be, but it could be this: we could have a world in which large sections of the population were psychologically profiled and then targeted through FB dark ads in a way that pushed their buttons.