Search engine manipulation effect

The search engine manipulation effect (SEME) is a term coined by psychologist Robert Epstein in 2015 to describe the hypothesized shift in consumer and voting preferences produced by the way search engines rank their results. Unlike search engine optimization, in which advocates, websites, and businesses seek to improve their placement in a search engine's rankings, SEME concerns the conduct of the search engine companies themselves. According to Epstein, search engine companies manipulate consumer and voter sentiment on a massive scale, and do so to ensure that their favored candidates win. Epstein's research indicates that such manipulations can shift the voting preferences of undecided voters by 20 percent or more, by up to 80 percent in some demographic groups, and could change the outcome of over 25 percent of national elections.[1][2][3]

In response to the allegations, Google denied re-ranking search results to manipulate user sentiment or tweaking rankings specifically for elections or political candidates.[4]

Scenarios

At least three scenarios offer the potential to shape or decide elections. The management of a search engine could pick a candidate and adjust search rankings accordingly. Alternatively, a rogue employee with sufficient authority or hacking skills could surreptitiously adjust the rankings. Finally, since rankings influence preferences even in the absence of deliberate manipulation, a candidate's ability to raise his or her ranking through traditional search engine optimization would itself influence voter preferences; sheer prominence alone could substantially increase support for a candidate.[2]

Experiments

Five experiments were conducted with more than 4,500 participants in two countries. The experiments were randomized (subjects were randomly assigned to groups), controlled (they included groups with and without interventions), counterbalanced (critical details, such as names, were presented to half the participants in one order and to half in the opposite order) and double-blind (neither the subjects nor anyone who interacted with them knew the hypotheses or group assignments). The results were replicated four times.[2]
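
Such an assignment scheme can be illustrated with a short simulation sketch in Python; the group labels, the two-way presentation order, and the per-participant seeding below are illustrative assumptions rather than details taken from the published studies.

    import random

    # Illustrative sketch of randomized, counterbalanced assignment.
    # Group labels and the alternating presentation order are assumptions.
    GROUPS = ["favor_candidate_a", "favor_candidate_b", "neutral"]

    def assign_participant(participant_id):
        """Assign a bias condition at random and counterbalance candidate order."""
        rng = random.Random(participant_id)   # per-participant seed for reproducibility
        condition = rng.choice(GROUPS)        # randomized group assignment
        order = ["Candidate A", "Candidate B"]
        if participant_id % 2 == 1:           # half the participants see the reverse order
            order.reverse()
        return {"id": participant_id, "condition": condition, "order": order}

    for pid in range(6):
        print(assign_participant(pid))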

United States

In experiments conducted in the United States, the proportion of people who favored any candidate rose by between 37 and 63 percent after a single search session.[2]

Participants were randomly assigned to one of three groups in which search rankings favored either Candidate A, Candidate B or neither candidate. Participants were given brief descriptions of each candidate and then asked how much they liked and trusted each candidate and whom they would vote for. Then they were allowed up to 15 minutes to conduct online research on the candidates using a manipulated search engine. Each group had access to the same 30 search results—each linking to real web pages from a past election. Only the ordering of the results differed in the three groups. People could click freely on any result or shift between any of five different results pages.[2]
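
A minimal sketch of this setup might look like the following, where every group draws from the same pool of 30 results and only the ordering differs; the per-result "slant" score is an assumption for illustration, not part of the published experiment.

    # Sketch only: "slant" is an assumed score, with positive values favoring
    # Candidate A and negative values favoring Candidate B.
    def rank_for_group(results, group):
        """Reorder the shared result pool according to the group's bias condition."""
        if group == "favor_candidate_a":
            return sorted(results, key=lambda r: -r["slant"])  # A-favoring pages first
        if group == "favor_candidate_b":
            return sorted(results, key=lambda r: r["slant"])   # B-favoring pages first
        return list(results)                                   # neutral group keeps the original order

    def paginate(results, per_page=6):
        """Split the 30 results into five pages of six, as in the experiments."""
        return [results[i:i + per_page] for i in range(0, len(results), per_page)]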

After the search, opinions on all measures shifted in the direction of the candidate favored by the rankings. Trust, liking and voting preferences all shifted predictably.[5] Of the participants who were unaware of the ranking bias, 36 percent shifted toward the highest-ranked candidate, along with 45 percent of those who were aware of the bias.[2]

Slightly reducing the bias on the first page of search results – specifically, by including one result that favored the other candidate in the third or fourth position – masked the manipulation so that few or even no subjects noticed the bias, while still triggering the preference change.[6]
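
A rough sketch of such a masking tweak could be expressed as follows; the zero-based position indices and the choice of which opposing item to promote are illustrative assumptions.

    import random

    def mask_bias(biased_ranking, opposing_result):
        """Insert a single opposing result at the third or fourth slot of a biased first page."""
        masked = [r for r in biased_ranking if r != opposing_result]
        slot = random.choice([2, 3])      # zero-based index: third or fourth position
        masked.insert(slot, opposing_result)
        return masked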

Later research suggested that, around the world, search rankings influence virtually all issues on which people are initially undecided. Search results that favor one point of view tip the opinions of those who are undecided on the issue. In another experiment, biased search results shifted people's opinions about the value of fracking by 33.9 percent.[6]

India

A second experiment involved 2,000 eligible, undecided voters throughout India during the 2014 Lok Sabha election. The subjects were familiar with the candidates and were being bombarded with campaign rhetoric. Search rankings could boost the proportion of people favoring any candidate by more than 20 percent, and by more than 60 percent in some demographic groups.[2]

United Kingdom

A UK experiment was conducted with nearly 4,000 people just before the 2015 national election to examine ways to prevent manipulation. Randomizing the rankings or including alerts that identified bias had some suppressive effects.[2]

2016 U.S. presidential election

Epstein had previously been involved in a dispute with Google over his website, and afterward posted opinion pieces and essays sharply criticizing Google. He claimed that Google was using its influence to ensure that Hillary Clinton was elected in the 2016 United States presidential election.[6]

References

  1. Crain, Matthew; Nadler, Anthony (2019). "Political Manipulation and Internet Advertising Infrastructure". Journal of Information Policy. 9: 370–410. doi:10.5325/jinfopoli.9.2019.0370. ISSN 2381-5892. JSTOR 10.5325/jinfopoli.9.2019.0370. S2CID 214217187.
  2. Epstein, Robert (August 19, 2015). "How Google Could Rig the 2016 Election". Politico.com. Retrieved 2015-08-24.
  3. Epstein, Robert; Robertson, Ronald E. (2015-08-18). "The search engine manipulation effect (SEME) and its possible impact on the outcomes of elections". Proceedings of the National Academy of Sciences. 112 (33): E4512–E4521. Bibcode:2015PNAS..112E4512E. doi:10.1073/pnas.1419828112. ISSN 0027-8424. PMC 4547273. PMID 26243876.
  4. "A Flawed Elections Conspiracy Theory". POLITICO Magazine. Retrieved 2016-04-02.
  5. "Suchmaschinenoptimierung" [Search engine optimization] (in German). 6 October 2018.
  6. "How the internet flips elections and alters our thoughts — Robert Epstein — Aeon Essays". Aeon. Retrieved 2016-02-28.
