Don’t fall for the ‘magic keyword’ trap on Amazon

When you spread your budget across too many keywords, gathering the data needed to intelligently shift spend toward the terms that convert into meaningful sales becomes nearly impossible.


For particularly popular, lucrative search terms on Amazon, moving up organically toward the top of the first page of results is a long-term challenge, and buying ads on those terms can be prohibitively expensive, perhaps even unprofitable. This has led many sellers into the “magic keyword” trap: attaching an enormous number of keywords to their product listing, putting some budget behind ads to win a top placement if and when those terms are searched, and hoping to find a diamond in the rough. The problem? The number of possible search terms on Amazon is essentially limitless, but an analysis showed that most, at current bid levels, are likely to receive fewer than one click per day on average. Spreading your budget across so many keywords with little to no volume makes it nearly impossible to gather enough data to intelligently shift spend toward the terms that will actually result in meaningful sales. Across Amazon, less than 1% of search terms generate an average of three or more clicks per day.

This data came to light thanks to Alin Constandache, a colleague of mine and lead researcher at Teikametrics, who kept running into issues while attempting to train conversion rate models for our network of Amazon advertisers. Briefly stated, Alin tests and trains new models by selecting a random sample of keyword and click data, then splitting it at random into a training set and a holdout set for out-of-sample testing.
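As a rough illustration of that kind of split (not Teikametrics’ actual pipeline), here is a minimal Python sketch that divides a keyword-level performance table into a training set and a holdout set at random. The table and its column names are hypothetical.

```python
import pandas as pd

# Hypothetical keyword-level performance data; column names are illustrative only.
data = pd.DataFrame({
    "keyword": ["wireless earbuds", "bluetooth headphones", "usb c cable", "yoga mat"],
    "clicks": [412, 37, 3, 0],
    "orders": [28, 2, 0, 0],
})

# Hold out 20% of keywords at random for out-of-sample testing.
holdout = data.sample(frac=0.2, random_state=42)
train = data.drop(holdout.index)

# A conversion rate model would be fit on `train` and evaluated on `holdout`;
# when most keywords have little or no click data, those holdout estimates
# become unreliable.
print(len(train), "training rows;", len(holdout), "holdout rows")
```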

This approach had worked well for Alin in other applications, but in April 2019 the conversion rate models he built all performed poorly on the holdout data, and all in exactly the same way. Digging in, Alin analyzed all keyword and associated performance data for the first four months of 2019, which amounted to roughly 6.5 million keywords across several thousand advertisers. The result was clear, as seen below.

[Graph 1: Average daily clicks across all keywords analyzed]

Roughly 60% of all keywords received no clicks at all over the four-month period, and 97% averaged fewer than one click per day over the same span. The broader trend held even among keywords that had been included in active campaigns for 30 days or more:

[Graph 2: Average daily clicks among keywords active in campaigns for 30 days or more]

More than 30% of these more active keywords still received fewer than one click per day on average. As you might guess, a deeper analysis showed that these low-volume keywords also contributed a similarly minuscule share of revenue to the larger campaign: less than 0.001% on average. Only when a keyword captured an average of at least three clicks per day did it drive a share of campaign revenue commensurate with the amount being paid to place the ad. These are the keywords sellers should focus on, because they drive enough volume for algorithms to be reliably trained and benchmarked against, helping sellers make better decisions over the near and long term. Yet only 31,714 keywords averaged three or more clicks per day, just 0.5% of all keywords studied.
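For readers who want to run this kind of check on their own campaign data, the following is a minimal sketch. It assumes a daily keyword report with one row per keyword per day; the file name and column names are assumptions, and the three-clicks-per-day cutoff is the benchmark from the analysis above.

```python
import pandas as pd

# Hypothetical daily keyword report: one row per keyword per day,
# including zero-click days. Not a specific Amazon export format.
report = pd.read_csv("keyword_daily_report.csv", parse_dates=["date"])

# Average clicks per day over the period each keyword was observed.
avg_daily_clicks = (
    report.groupby("keyword")["clicks"]
    .mean()
    .rename("avg_clicks_per_day")
)

# Keywords averaging at least three clicks per day carry enough volume
# to reliably train and benchmark bidding decisions against.
focus_keywords = avg_daily_clicks[avg_daily_clicks >= 3].sort_values(ascending=False)
print(f"{len(focus_keywords)} of {len(avg_daily_clicks)} keywords clear the 3-clicks/day bar")
```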

There are plenty of services, both free and paid, for identifying these meaningful keywords to spend against. The bottom line from this study is that it is worth marketers investing in that work upfront rather than throwing a pile of keywords into their product backend on Amazon and hoping to strike gold.

More specifically, to improve performance on Amazon and reduce the time wasted on low-volume keywords, marketers also need a threshold for when to remove a given keyword. As a basic example, you could remove keywords that have been active on a product for at least 30 days but have averaged fewer than 0.1 clicks per day. Doing so would have a negligible impact on overall revenue while letting you focus your budget and time on the keywords that are driving meaningful sales.
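Here is a sketch of how that pruning rule could look in practice. The thresholds are the example numbers above; the table and its column names are assumptions rather than any particular advertising report schema.

```python
import pandas as pd

# Hypothetical keyword summary: days each keyword has been active and
# total clicks over that span. Column names are illustrative only.
keywords = pd.DataFrame({
    "keyword": ["noise cancelling earbuds", "earbuds for small ears", "best gift for runners"],
    "days_active": [45, 62, 30],
    "total_clicks": [210, 4, 1],
})

keywords["avg_clicks_per_day"] = keywords["total_clicks"] / keywords["days_active"]

# Example rule: prune keywords active 30+ days that average under 0.1 clicks/day.
to_remove = keywords[(keywords["days_active"] >= 30) & (keywords["avg_clicks_per_day"] < 0.1)]
to_keep = keywords.drop(to_remove.index)

print("Prune:", to_remove["keyword"].tolist())
print("Keep:", to_keep["keyword"].tolist())
```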


Opinions expressed in this article are those of the guest author and not necessarily MarTech.


About the author

Andrew Waber
Contributor
Andrew Waber is the director of insights at retail optimization platform (ROP) provider Teikametrics. In his current role, Andrew manages the analysis, editorial direction and strategy for Teikametrics' reporting on online retail advertising and the larger online retail marketplace. Prior to his time at Teikametrics, Andrew served as the manager of data insights and media relations at Salsify, the manager of market insights and media relations for advertising automation software provider Nanigans, and as the market analyst and lead author of reports for Chitika Insights, the research arm of the Chitika online ad network. Andrew's commentary on online trends has been quoted by the New York Times, Re/Code and The Guardian, among other outlets.
