PART 2: the Cambridge Analytica harvesting of Facebook user data – some more background, Aldous Huxley saw it all before, and the European Union sounds off


From Europe: reflections, ruminations, musings and reactions regarding art, culture, media, politics, science, technology and travel

Gregory P. Bufithis

 



For Part 1 (my initial thoughts) click here

22 March 2018 (Rome, Italy) – I am old enough to remember that businesses that make money by collecting and selling detailed records of private lives were once plainly described as “surveillance companies.” Their rebranding as “social media” has to be the most successful deception since the Department of War became the Department of Defense.

Ah, those youthful and charismatic male visionaries with their signature casual wardrobes: the open-necked blue shirt, the black polo-neck, the marled gray T-shirt and hoodie. We fell for it. These founders won immense public trust with their emergent technologies, from home computing to social media to the new frontier … artificial intelligence. Their companies seemed to grow organically within the flourishing ecology of the open Internet, with their attention-getting platforms.

But no worries!! I watched Mark Zuckerberg’s interview last night — his first surfacing since this stuff broke. In a nutshell:

“Everything is fine! Mostly fixed! Facebook’s great! All good in the hoodie!”

Phew!

How this particular chapter all started

Briefly (enough timelines have been published so you can get more details if you search):

  • In 2013, Facebook allowed an outside researcher (Aleksandr Kogan of Cambridge University) to develop an app on the platform that paid users a small sum to answer questions and download the app, which then harvested private information from their profiles and from their friends’ profiles.
  • Facebook permitted such data mining at the time. It is doubtful many users knew what was happening or read the fine print of the terms of service.
  • Kogan then provided the data — on some 50 million people — to Cambridge Analytica, a private firm founded by Stephen K. Bannon, the conservative political operative, and Robert Mercer, the wealthy financier, and to another firm.
  • This transfer of data to a third party broke Facebook’s internal policies. In 2015, Facebook found out, removed the app and demanded the data be destroyed. It received a “certification” from Cambridge Analytica that the data had been destroyed. But Facebook never audited Cambridge Analytica’s servers, even though an audit had been suggested at the time given the company’s already murky history.
  • The data was not deleted and we would come to learn that Donald Trump’s 2016 presidential campaign used it, including tests of anti-establishment messages such as “deep state” and “drain the swamp.”
  • It now appears the transfer violated assurances about user privacy that Facebook gave the Federal Trade Commission in a 2011 settlement. Other pundits are better versed than I am to discuss those aspects.

And let’s put a stop to a falsehood Cambridge Analytica has been putting out across social media. Their statement:

“Obama’s 2008 campaign was famously data-driven, pioneered microtargeting in 2012, talking to people specifically based on the issues they care about.”

NO. There is enough detail out there. Obama’s campaign used publicly available data. They did not trick people into giving up their data when they thought they were interacting with their friends.

It was demographics, voting records, and shoe leather — asking people what issues were important to them in the campaign — not fake “quizzes” on Facebook. I know, because I have read the detailed analysis after Obama was elected, and I know two of the data scientists who collected that data.

Alexander Nix, CEO

And one more point: we have been told that Cambridge Analytica suspended its CEO, Alexander Nix, after the UK’s Channel 4 aired its exposé this week. In secretly filmed footage, Nix claimed to have run the “digital campaign” for Trump’s election team, to have helped secure the reality telly tycoon tens of thousands of votes to win three crucial states, and to have used encrypted webmail like ProtonMail to avoid congressional investigations. The company statement:

“In the view of the Board, Mr Nix’s recent comments secretly recorded by Channel 4 and other allegations do not represent the values or operations of the firm and his suspension reflects the seriousness with which we view this violation. We have asked Dr Alexander Tayler to serve as acting CEO while an independent investigation is launched to review those comments and allegations.”

Tayler is the chief data scientist at SCL Group, Cambridge Analytica’s parent company, and at the related outfit SCL Elections. SCL Group and CA are weirdly intertwined. Nix is still a director of both SCL Group and SCL Elections. Cambridge Analytica is effectively a Delaware shell company that may or may not have any real employees, so Nix’s “suspension” from CA may not mean anything at all. As I noted in a previous post, Cambridge Analytica was set up by former top Trump advisor Stephen Bannon and rich Republican donor Robert Mercer, who had apparently sunk at least $15m into the biz.

And it gets better. Nix is a director of a recently started Brit company, Emerdata, along with Rebekah and Jennifer Mercer – Robert’s daughters – and others. Emerdata is … surprise, surprise!! … a “data processing” business. And so the cycle continues.

The Wunderkids

And it’s not that we weren’t warned. Many pundits were poking behind the social media curtain as early as 2010 and telling us exactly what was happening, especially about “The Zuck” and Peter Thiel and their “make-money-at-all-costs” endeavors at Facebook. If you have the time and inclination, pick up any (or all) of these books. I plowed through them last summer:

  • The Attention Merchants: From the Daily Newspaper to Social Media, How Our Time and Attention Is Harvested and Sold by Tim Wu
  • Chaos Monkeys: Inside the Silicon Valley Money Machine by Antonio García Martínez
  • Move Fast and Break Things: How Facebook, Google and Amazon have Cornered Culture and What It Means for All of Us by Jonathan Taplin

Yes, capturing and reselling attention has been the basic model for a large number of modern businesses, from posters in late 19th-century Paris, through the invention of mass-market newspapers that made their money not through circulation but through ad sales, to the modern industries of advertising and ad-funded TV.

Of course, Cambridge Analytica is neither the first nor the only company to engage in electoral dark arts. As long as there have been elections, there have been schemers and manipulators eager to help fix the results. In a way, Cambridge Analytica’s business model is arguably just a supercharged version of something political parties have done for years – identifying potential supporters, compiling detailed pictures of what makes them tick, then tailoring messages to different groups depending on what they want to hear.

But what is different this time around is the wholesale effort to use psychology to organise and exploit the human mind’s skew to fear, to distraction. Whether this aggressive mining and use of data has marked a “tipping point” … well, I have my doubts.

And to that effect, Jesse Eisenberg’s portrait of Mark Zuckerberg in The Social Network is misleading, as Antonio García Martínez, a former Facebook manager, argues in Chaos Monkeys, his caustic book about his time at the company. In the movie Zuckerberg is a highly credible character, a computer genius located somewhere on the autistic spectrum with minimal to non-existent social skills.

But that’s not what the man is really like. In real life, Zuckerberg was studying for a degree with a double concentration in computer science and – this is the part people tend to forget – psychology. Most people have a limited sense of how other people’s minds work; autists, it has been said, lack a “theory of mind”. That description does not apply to Zuckerberg. He was very well aware of how people’s minds work, and in particular of the social dynamics of popularity and status.

And Silicon Valley billionaire Peter Thiel certainly “got it”. On that point, The Social Network has it right: Thiel’s $500,000 investment in 2004 was crucial to the success of the company. But there was a particular reason Facebook caught Thiel’s eye, says Jonathan Taplin in his book. It was rooted in a byway of intellectual history. In the course of his studies at Stanford – Thiel majored in philosophy – he became interested in the ideas of the US-based French philosopher René Girard, as set out in his most influential book, Things Hidden since the Foundation of the World. Girard’s big idea was something he called “mimetic desire”. Human beings are born with a need for food and shelter. Once these fundamental necessities of life have been acquired, we look around us at what other people are doing, and wanting, and we copy them. And envy them. And prejudice develops. In Thiel’s summary, the idea is “that imitation is at the root of all behavior and people will pursue it, be distracted by it”.

Yes, I am sure the intent of these male Wunderkids was to show netizens a well-rounded, balanced view of the world, through links, ads and posts in news feeds, social networks and similar sites.

But those with other intents realized the platform could also be used to create addicts, and feed them a diet of slanted information that taps into their prejudices, fuels their fears, and manipulates their knowledge and feelings to political or commercial ends. All of this was made possible by using their personal and private information, feeding on their fear, their prejudice, their distraction.

Hmm. Where have I heard that before? Ah, yes, Aldous Huxley in The Divine Within, published in 1950.

Last year I gave the keynote address at the Digital Investigations Conference in Zurich and I used Huxley in a slide, taking the position that he was more germane to the times than Orwell.


European Data Protection Supervisor opinion

Giovanni Buttarelli, the European Data Protection Supervisor, issued a well-timed opinion this week on the effects of, and antidotes to, websites microtargeting people using their intimate information. He noted the feeding of slanted information that taps into people’s prejudices, fuels their fears, and manipulates their knowledge and feelings to political or commercial ends. All of this is possible using their personal and private information.

NOTE: Yesterday, Article 29 Working Party Chairwoman Andrea Jelinek said the collection of EU data protection authorities was also launching an investigation of revelations involving Facebook and Cambridge Analytica.

Here’s a key quote from Buttarelli’s statement:

A personalised, microtargeted online environment creates ‘filter-bubbles’ where people are exposed to ‘more-of-the-same’ information and encounter fewer opinions, resulting in increased political and ideological polarisation. It increases the pervasiveness and persuasiveness of false stories and conspiracies. Research suggests that the manipulation of people’s newsfeed or search results could influence their voting behaviour.
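It is worth making that mechanism concrete. Below is a minimal sketch, entirely my own and not drawn from the opinion or from any platform’s actual code, of how an engagement-driven ranker produces “more-of-the-same”:

```python
# Filter-bubble feedback loop in miniature (illustrative only).
from collections import Counter

def rank_feed(candidates, click_history):
    """Float items from already-clicked topics to the top of the feed."""
    affinity = Counter(item["topic"] for item in click_history)
    return sorted(candidates, key=lambda item: affinity[item["topic"]], reverse=True)

history = [{"topic": "immigration"}, {"topic": "immigration"}, {"topic": "crime"}]
candidates = [{"topic": "science"}, {"topic": "immigration"}, {"topic": "crime"}]
print(rank_feed(candidates, history))
# Whatever gets clicked next is appended to click_history, so each pass
# narrows the feed further -- the polarisation loop Buttarelli describes.
```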

Running to 31 pages, Buttarelli’s missive was not thrown together overnight in response to the Cambridge Analytica bombshell. It had long been in progress and just appears to be conveniently timed, with quick, last-minute references to the latest reports alongside older allegations already in the report.

NOTE: many politicians and pundits seem to have forgotten that Cambridge Analytica data harvesting was first revealed by a Guardian investigation back in 2015. I certainly forgot.

Thanks to the weekend’s revelations, though, an EU document that may have otherwise languished on some hidden website will no doubt now have a greater impact, as policymakers are already coming down hard on Facebook.

The opinion describes Facebook and its peers as gatekeepers to the internet who have gained disproportionately from the boom in digital advertising that has sucked up every Tom, Dick and Harry’s personal information and fed it into all manner of data-crunching machines and algorithms.

Buttarelli argued that while the tech giants have enjoyed bigger market caps than any other firms in recorded history as their business models took off over the past two decades, the effect on individuals has been markedly different. For users, he said, the rise of social platforms has been a double-edged sword: communities are more connected but increasingly divided by online manipulation and viral outrage.

Meanwhile, increased surveillance by governments and companies is having a chilling effect on people’s willingness to express themselves freely, to such an extent that it risks damaging the “health of democracy.” Although misinformation and manipulation are as old as humankind – particularly in political campaigning – Buttarelli said that rapid digitization had changed the nature of the issue:

In the past, political parties would have had limited amounts of data on their constituents; now there’s a treasure trove of information to be gleaned from scraping social media sites.

This extraordinary reach of online platforms – which are bigger than TV or newspapers, and in some countries the sole entry point to the internet – offers an easy target for people seeking to use the system for malicious ends. The ability to use technology to manipulate people and the information they see has created the aforementioned filter bubbles, which polarize opinions and increase the pervasiveness and persuasiveness of false stories.

Interesting point by Buttarelli. He said part of the solution must be data protection, “perhaps a bigger part than expected. But no single player or sector can do this alone. Regulators need to work across sectors – data protection, consumer rights and anti-trust – while rigorously enforcing those rules”.

The text is littered with references to the underhand – perceived or otherwise – practices employed by all those who have been complicit in generating the state of affairs, which Buttarelli argued risks weakening the “social glue” that, in his view, underpins democracy. These run from dense privacy policies to product design that bakes in addictive qualities, such as auto-playing videos and constant alerts and notifications.

At the same time, he pointed the finger at others, saying that existing efforts to tackle fake news have focused too much on exposing the source of information while “neglecting the accountability of players in the ecosystem who profit from harmful behaviour”.

But his most telling comment:

Online manipulation is a symptom of the opacity and lack of accountability in the digital ecosystem and will only get worse as more people and things are connected. And my fear is artificial intelligence will further blur the lines of accountability.

On this latter point I am in full agreement. Part of my trip to Rome this week included meetings with software developers who know the “secret sauce” of how this social media coding works. As they told me, the AI in development … some of it already developed … provides privacy-eliminating surveillance and (almost) undetectable profiling the likes of which we have never seen.

Coming up in this series: Parts 3 – 6

In Part 3

I will examine how the internet’s aristocracy arose with the commercialization of the web, starting 20 years ago. To the hectic leaders of Silicon Valley, two decades seems like geologic time – but in the evolving social and political cultures of our species, two decades is the blink of an eye. We have not come close to absorbing the implications of these potent, purposefully disruptive technologies.

And let me kill off a comment I hear at every conference: “Technology isn’t the problem. It’s our technology habits.” WRONG! If you do not “get” that these “media” technologies are designed to instill those very technology habits, you don’t “get” the equation. Our vulnerabilities arise from commerce – specifically, the tech-enabled exploitation of personal data by businesses – that we allowed to be cultivated. The algorithms of clickbait, echo chambers and adtech have their basis in B.F. Skinner’s rat-in-the-box experiments, as sketched below. Russia is just one example of how anybody can take advantage of these systems. The alt-right is another example.
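A toy illustration of that Skinner connection, entirely my own and not any platform’s actual code: under a variable-ratio schedule the reward arrives unpredictably, which in Skinner’s experiments produced the most persistent lever-pressing. Pull-to-refresh and notifications run on the same intermittent-reward pattern.

```python
# Variable-ratio reinforcement in miniature (illustrative only).
import random

def check_feed(reward_probability: float = 0.3) -> bool:
    """Each check of the feed pays off at random, like a slot machine pull."""
    return random.random() < reward_probability

random.seed(42)
checks = [check_feed() for _ in range(20)]
print(f"{sum(checks)} rewarding checks out of {len(checks)}")
# The unpredictability of the payoff, not its average size, is what
# keeps the subject -- rat or user -- coming back.
```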

In Part 4

The code and process at play. There was a big shift in monetization in 2012, when internet traffic began to switch away from desktop computers towards mobile devices. If you do most of your online reading on a desktop, you are in a minority. The switch was a potential disaster for all businesses that relied on internet advertising, because people don’t much like mobile ads and are far less likely to click on them than on desktop ads. In other words, although general internet traffic was increasing rapidly, the traffic was becoming proportionately less valuable because the growth was coming from mobile. If the trend were to continue, every internet business that depended on people clicking links – i.e. pretty much all of them, but especially the giants like Google and Facebook – would be worth much less money. The arithmetic below makes the point concrete.
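Here is that arithmetic as a back-of-the-envelope sketch, with made-up but directionally plausible click-through rates (the real figures are not something I can cite):

```python
# Illustrative numbers only: total page views grow, but because the
# growth is on mobile (far lower click-through rate), total clicks --
# and hence ad revenue -- fall.
def total_clicks(desktop_views: int, mobile_views: int,
                 desktop_ctr: float = 0.002, mobile_ctr: float = 0.0005) -> float:
    return desktop_views * desktop_ctr + mobile_views * mobile_ctr

before = total_clicks(desktop_views=100_000_000, mobile_views=20_000_000)
after = total_clicks(desktop_views=60_000_000, mobile_views=120_000_000)
print(before, after)  # 210000.0 180000.0: 50% more traffic, ~14% fewer clicks
```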

Facebook solved the problem by means of a technique called “onboarding”. Simple example: the home furnishings store Bed Bath & Beyond wants to give me a coupon. They can snail mail it to me:

Gregory Bufithis
xxxxxxx
xxxxxxx
Paris, France

If they want to reach me on my mobile device, my name there is:

38400000-8cf0-11bd-b23e-10b96e40000d

That’s my quasi-immutable device ID, broadcast hundreds of times a day on mobile ad exchanges.

On my laptop, my name is this:

07J6yJPMB9juTowar.AWXGQnGPA1MCmThgb9wN4vLoUpg.BUUtWg.rg.FTN.0.AWUxZtUf

Though it may not be obvious, each of these keys is associated with a wealth of our personal behaviour data: every website we’ve been to, many things we’ve bought in physical stores, and every app we’ve used and what we did there. Facebook already had a huge amount of information about people, their social networks, and their professed likes and dislikes. It simply “graphed” everything it had for every “location address” – a rough sketch of which follows. In Part 4 I will go into more detail.
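As a rough sketch of how that “graphing” works (hypothetical field names and a simplified matching step; the real pipelines are proprietary), onboarding joins hashed offline identity to the online IDs above:

```python
# A minimal, hypothetical sketch of "onboarding": offline identity (name
# plus postal address, exchanged in hashed form) is matched to the IDs
# the same person carries online, so a mailing-list entry becomes an
# ad-targetable profile. Nothing here is Facebook's actual API.
import hashlib

def hash_pii(name: str, postal_address: str) -> str:
    """Normalize and hash offline identity, as match partners typically do."""
    key = f"{name.strip().lower()}|{postal_address.strip().lower()}"
    return hashlib.sha256(key.encode("utf-8")).hexdigest()

# The platform's side of the identity graph: hashed PII -> online IDs.
identity_graph = {
    hash_pii("Gregory Bufithis", "Paris, France"): {
        "mobile_ad_id": "38400000-8cf0-11bd-b23e-10b96e40000d",
        "browser_token": "07J6yJPMB9juTowar...",  # truncated
    },
}

# An advertiser uploads its (hashed) mailing list; matches come back as
# IDs it can bid on in the mobile ad exchanges.
print(identity_graph.get(hash_pii("Gregory Bufithis", "Paris, France")))
```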

In Part 5

A look at why we give up our privacy to Facebook and other sites so willingly, and some of the psychology at work. Cambridge Analytica found success and wealth by tapping into a rich seam of public anger. Christopher Wylie, the source for the UK investigation piece that exploded this past weekend, summed up the mission:

“We exploited Facebook to harvest millions of people’s profiles. And built models to exploit what we knew about them and target their inner demons.”

As Wylie said in several interviews, Cambridge Analytica had “found a high level of alienation among young, white Americans with a conservative bent” well before Trump declared his intentions for the presidency. And he said:

“We also had focus group data for the 2014 midterms, and analysed the hell out of it. These voters responded to calls for “building a new wall” to block the entry of illegal immigrants, to reforms intended to “drain the swamp” of Washington’s entrenched political community and to thinly veiled forms of racism toward African Americans called “race realism” … all of this well before Trump announced his campaign, but information in the hands of Bannon and many others”.

And more foreboding:

“The only foreign thing we tested was Putin, and it turns out, there’s a lot of Americans who really like this idea of a really strong authoritarian leader and people were quite defensive in focus groups of Putin’s invasion of Crimea.”

A host of American political scientists are genuinely worried about this streak of American “authoritarian” feeling, which was weaponized by the Trump campaign, and Cambridge Analytica.

But the problem with Facebook is not *just* the loss of your privacy and the fact that it can be used as a totalitarian panopticon. The more worrying issue, in my opinion, is its use of digital information consumption as a psychological control vector. I will discuss that in this part of the series.

In Part 6

My wrap-up of my key points. And why a “solution” to social media manipulation will always escape us.


To subscribe to my personal blog click on the following link:
 
 
 
 
You can email me at [email protected].

 
