FTC Report: Google Purposely Demoted Competing Shopping Sites

Details from an FTC investigation into Google on antitrust accusations have emerged, suggesting the search giant specifically worked to keep competitors out of its top results.


Google deliberately blacklisted competing shopping search sites, despite the company’s past claims that it doesn’t target competitors this way, according to new details of the US Federal Trade Commission’s antitrust investigation into Google, uncovered by the Wall Street Journal.

The WSJ obtained a 160-page FTC staff report from the investigation, which concluded that the FTC should move ahead with an antitrust lawsuit on several fronts. Our own summary of the WSJ story can be found on our sister site, Search Engine Land. FTC commissioners ultimately chose to settle with Google in 2013 rather than pursue legal action.

Among the tidbits from the report: Google was happy that comScore undercounted its market share. That sounds dramatic, but it’s less so given that this is well known within the industry and that Google’s Eric Schmidt himself effectively admitted Google had a monopoly in search.

For me, the most important and damning thing to come out of what the WSJ has revealed from the report is the idea that Google deliberately blacklisted competing shopping sites.

This emerges in a sidebar to the WSJ’s main piece, called “How Google Skewed Search Results.” Most of that piece is devoted to how Google would insert its own vertical search listings, such as shopping or local results. From the story:

Google would “automatically boost” its own sites for certain specialized searches that otherwise would favor rivals, the FTC found. If a comparison-shopping site was supposed to rank highly, Google Shopping was placed above it. When Yelp was deemed relevant to a user’s search query, Google Local would pop up on top of the results page, the staff wrote.

That sounds terrible, on the face of it, unless you understand that this is a practice many search engines have used to present vertical search results. If you really want to understand more about that, read these:

In short, Google wasn’t doing anything that rivals weren’t also doing. But how about the fact that Google was using a special ranking process for its own search results, as the FTC staff report found:

But Marissa Mayer, who was then a Google vice president, said Google didn’t use click-through rates to determine the ranking for its own specialized-search sites, because they would rank too low, according to the staff report.

That was revealed back when Universal Search first launched and was explained at the time as stemming from the need to weight Google’s vertical listings, which weren’t ordinarily shown in regular web search results. From my story on Universal Search in 2008:

Mayer said that the first result in Google might get clicked on more than a third of the time out of all the clicks on that page. In fact, Google’s search quality team makes it a goal to try and make that first result as relevant as possible. In contrast, the OneBox insertions get clicked on less despite getting prime placement on the page, since they aren’t as relevant.

Universal Search aims to fix this through blending. With Universal Search, Google will hit a range of its vertical search engines, then decide if the relevancy of a result from book search is higher than a match from web page search. (FYI, Infoseek got a patent related to this type of blending back in 1997; See also: Google’s Universal Search Patent Application & Assigned Patents from Infoseek from SEO By The Sea).
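The blending idea described above can be sketched in a few lines. This is a minimal, hypothetical illustration of scoring results from different verticals on a shared relevance scale and merging them into one list; the names and data structures are mine, not Google’s actual implementation.

```python
# Hypothetical sketch of Universal Search-style blending: web and vertical
# results (books, news, local, ...) carry relevance scores on a comparable
# scale, and the blended page is ordered by that shared score.
def blend(web_results, vertical_results):
    """Each result is a (score, source, title) tuple; higher score = more relevant."""
    merged = web_results + vertical_results
    # Sort the combined pool so a strong book-search match can outrank
    # a weaker web-page match, and vice versa.
    return sorted(merged, key=lambda r: r[0], reverse=True)
```

The hard part in practice, as the Mayer quotes suggest, is calibrating scores from different engines so they really are comparable; the merge itself is trivial.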

Overall, what the FTC staff seemed to find so damning about Google actually reads like standard practice to those who know how search engines operate and why they operate the way they do. But not this part:

While Google promoted its own results, it sometimes demoted rivals, the FTC staff found. For example, Google compiled a list of comparison-shopping sites that compete with Google Shopping and “demoted them from the top 10 web results,” staff wrote. According to the report, Google users in tests didn’t like the changes; only after Google tweaked its search algorithm at least four times, and changed the ranking criteria, did the new results get “slightly positive” feedback, the staff said.

That’s alarming. Google has consistently said over the years in various ways, at various times, that it doesn’t blacklist particular web sites for competitive reasons. Yes, sites might get manually penalized or hit with algorithmic penalties for violating Google’s spam guidelines. But no, Google has said it doesn’t try to wipe out Yelp or Bing or Expedia just because they compete with it in search areas.

The WSJ’s article about the FTC staff report suggests otherwise: that Google actively decided to demote competing sites and did so even though its own searchers didn’t like it.

It would be helpful to know more from the actual report, to better understand exactly how the FTC staff reached this conclusion. I don’t have that, but I have asked Google to see if it can comment.

Postscript (8:45pm ET): I’ve talked with the WSJ reporter on the story, Rolfe Winkler, about the demotion section. Apparently, a scan of the actual footnote covering this is on the WSJ site, but I haven’t been able to locate it yet. It sounds like Google may have been doing a test that didn’t go out to all searchers at the time, one aimed at improving the “diversity” of its search results in 2007.

One of the big problems at that time was that searches commonly led back to pages of search results at other sites, which was a pretty bad user experience. Google even issued a guideline against it at the time. That guideline remains, though Google pretty much seems to ignore it these days.

Postscript 2 (9:30pm ET): I’ve found where the WSJ has posted some of the original report it obtained. The section about demoting competing sites is covered in footnote 154. The context for what footnote 154 covers isn’t provided. The footnote says:

Although Google originally sought to demote all comparison shopping websites, after Google raters provided negative feedback to such a widespread demotion, Google implemented the current iteration of its so-called “diversity” algorithm …. Google claimed the goal of this algorithm was to “increase the diversity of Google’s search results for product related queries.”

It seems that in 2006 and 2007, Google decided for some reason that it wanted to demote comparison shopping sites as part of a “diversity” effort. My guess is that it thought listing actual merchants and product pages would be better. However, Google was also making moves at that time to give its then-“Froogle” shopping search engine more visibility. It was something that Google raters, people Google hires to evaluate the quality of its search results, didn’t like.

The footnote continues to describe some of the experimentation:

Initially, Google compiled a list of target comparison shopping sites and demoted them from the top 10 web results, but users preferred comparison shopping sites to the merchant sites that were often boosted by the demotion.

Again, it’s unclear whether Google did this with live search results that everyone saw or in a test environment that only raters used. There are suggestions in the footnote that it was a limited test. Regardless, it remains alarming that Google specifically targeted competitors with an algorithm designed to demote them, whether that was done in the name of “diversity” or not.

The footnote continues by describing various experiments Google tried in order to demote the competing search engines in a way its own quality raters would find acceptable. In the end, it found it could demote shopping search engines only if it ensured at least two were listed in the top results:

Google again changed its algorithm to demote CSEs [comparison shopping engines] only if more than two appeared in the top 10 results, and then, only demoting those beyond the top two. With this change, Google finally got a slightly positive rating in its “diversity test” from its raters. Google finally launched this algorithm change in July 2007.
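The final rule the footnote describes is concrete enough to sketch: if more than two CSEs appear in the top 10, demote only those beyond the top two. This is a hypothetical illustration of that logic as the footnote states it; the function name, the push-to-bottom demotion, and the test data are all my assumptions, not anything from the report.

```python
# Hypothetical sketch of the demotion rule in footnote 154: demote
# comparison shopping engines (CSEs) only if more than two appear in the
# top 10 results, and then only those beyond the top two.
def apply_diversity_demotion(results, is_cse):
    """results: ranked list of result IDs; is_cse: predicate for CSE results."""
    top10 = results[:10]
    cses_in_top10 = [r for r in top10 if is_cse(r)]
    if len(cses_in_top10) <= 2:
        return results  # two or fewer CSEs: leave the ranking untouched
    to_demote = set(cses_in_top10[2:])  # CSEs beyond the top two
    kept = [r for r in results if r not in to_demote]
    # Demoted CSEs are pushed to the bottom, preserving their relative order.
    return kept + [r for r in results if r in to_demote]
```

Note how the rule guarantees at least two CSEs survive in the top results, which matches the report’s account of what finally earned a “slightly positive” rating.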

Postscript 3 (1:40am ET, March 19): Google has now sent me some background information, which I’ll summarize as follows (it didn’t provide a statement I can actually quote, just information offered to explain the situation from its point of view).

Google said that in 2007, aggregator/comparison sites were dominating the results for product searches and that these sites were often low-quality. Even when quality was good, Google still felt it was a search issue when people clicked from its search results only to land on another list of search results.

Google says it was a desire to address these user experience issues that motivated the experimentation, not an effort to push its own shopping results. The goal, Google says, was to come up with the right mix of shopping aggregators and actual merchant sites.

Eventually, Google says, it found a mix that its raters liked, one that retained the best aggregators while providing overall result diversity. That’s what was kept. It stressed that this type of diversity is something it aims for in other instances, such as, say, a search for “Birdman,” where people might want theaters and showtimes, a trailer, info on actors, as well as news.

Google says the shopping algorithm change was limited to English-language searches within the US. Those outside the US, such as in the EU, or those searching in languages other than English anywhere, didn’t get it.

Google also said that the FTC addressed the experiment in its closing statement, saying it was part of the changes that “could reasonably be viewed as improving the overall quality of Google’s search results because the first search page now presented the user with a greater diversity of websites.”




About the author

Danny Sullivan
Contributor
Danny Sullivan was a journalist and analyst who covered the digital and search marketing space from 1996 through 2017. He was also a cofounder of Third Door Media, which publishes Search Engine Land, MarTech, and produces the SMX: Search Marketing Expo and MarTech events. He retired from journalism and Third Door Media in June 2017. You can learn more about him on his personal site & blog. He can also be found on Facebook and Twitter.
