Voice, video, Shopping Graph: A marketer’s guide to Google I/O announcements

Also, improvements to in-app deep linking capabilities.


Google’s I/O developer conference, held virtually this year, of course, just wrapped. The keynote is still available on demand, but in case you can’t spare two hours to watch it, here are our selected highlights from the product announcements.

LaMDA’s open-ended voice conversations. Sundar Pichai, CEO of Google’s parent company Alphabet, previewed a new conversational model called LaMDA, or “Language Model for Dialogue Applications,” at the event on Tuesday. The new language model is designed to carry on an open-ended conversation with a human user without repeating information. LaMDA is still in early-phase research, with no rollout dates announced.

LaMDA was trained on dialogue, and Google has put an emphasis on training it to produce sensible and specific responses, instead of generic replies like “that’s nice” or “I don’t know,” which may still be appropriate but are less satisfying for users. At I/O, LaMDA was shown personifying the planet Pluto and a paper airplane. The conversations were Q&A-style exchanges between the user and LaMDA, but LaMDA went beyond providing direct, Google Assistant-like answers; instead, it offered nuanced responses that some might even consider witty.

Google has not offered any details on how it might include LaMDA in any of its other products, but it is easy to imagine LaMDA helping users find the products they’re looking for or sift through local business reviews, for example.

MUM the multi-tasker. Google SVP Prabhakar Raghavan showcased a new technology called Multitask Unified Model (MUM). Like BERT, it’s built on a transformer architecture, but Google says it’s 1,000 times more powerful and capable of multitasking to connect information for users in new ways. The company is currently running internal pilot programs with MUM, although no public rollout date was announced.

On stage at I/O, Raghavan provided some examples of the tasks that MUM can handle at the same time:

  • Acquire deep knowledge of the world;
  • Understand and generate language;
  • Be trained across 75 languages; and
  • Understand multiple modalities (enabling it to understand multiple forms of information, like images, text and video).

Raghavan used the query “I’ve hiked Mt. Adams and now want to hike Mt. Fuji next fall, what should I do differently to prepare?” as an example of a question that present-day search engines would struggle to return relevant results for. In the simulated search leveraging MUM, Google could highlight the differences and similarities between the two mountains and surface articles on the right equipment for hiking Mt. Fuji.

Prabhakar Raghavan at Google I/O

Helping video search rankings with key moments. John Mueller, Google Search Advocate, spoke about new video structured data that can help your videos gain more visibility in Google Search. The two new types, Clip markup and Seek markup, help Google identify key moments in videos and surface them as clips in Google Search.

Google Key Moments
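Google’s developer documentation describes these as additions to a page’s existing VideoObject structured data: Clip markup nests Clip objects to mark the moments yourself, while Seek markup adds a SeekToAction that tells Google how to link to an arbitrary timestamp so it can pick out key moments automatically. The JSON-LD below is an illustrative sketch with placeholder URLs and timings, showing both approaches in one snippet for brevity (in practice you would typically implement one or the other):

  {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "How to prepare for a long hike",
    "description": "Gear, training and planning tips for a multi-day hike.",
    "thumbnailUrl": "https://example.com/hike-thumbnail.jpg",
    "uploadDate": "2021-05-18",
    "contentUrl": "https://example.com/videos/hike-prep.mp4",
    "hasPart": [{
      "@type": "Clip",
      "name": "Choosing the right boots",
      "startOffset": 30,
      "endOffset": 95,
      "url": "https://example.com/videos/hike-prep?t=30"
    }],
    "potentialAction": {
      "@type": "SeekToAction",
      "target": "https://example.com/videos/hike-prep?t={seek_to_second_number}",
      "startOffset-input": "required name=seek_to_second_number"
    }
  }

The usual caveat applies: the URLs in the markup must actually let viewers jump to those timestamps on your page, or Google won’t show the clips.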

Shopping Graph and a partnership with Shopify. Bill Ready, the company’s president of commerce and payments, revealed details about its Shopping Graph, the real-time dataset that connects shoppers with billions of product listings from merchants all across the internet.

“Building on the Knowledge Graph, the Shopping Graph brings together information from websites, prices, reviews, videos and, most importantly, the product data we receive from brands and retailers directly,” Ready said. The AI-enhanced model works in real time and is designed to show users relevant listings as they shop across Google.

Somewhat like Google’s Knowledge Graph, the Shopping Graph connects information about entities and affects what can appear in search results. The difference is where the data comes from: Knowledge Graph information is gathered from a variety of sources rather than submitted directly to Google, whereas product information can be submitted directly via Merchant Center or Manufacturer Center.
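Google hasn’t published a technical spec for the Shopping Graph itself, but the product data it draws from across the web largely arrives in two familiar ways: product feeds uploaded to Merchant Center (or Manufacturer Center) and schema.org Product markup on merchants’ own pages. As a rough illustration with made-up values, the on-page markup looks something like this:

  {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Trail Running Shoe",
    "image": "https://example.com/images/trail-shoe.jpg",
    "description": "Lightweight trail running shoe with a grippy outsole.",
    "sku": "TRS-001",
    "brand": { "@type": "Brand", "name": "ExampleBrand" },
    "offers": {
      "@type": "Offer",
      "url": "https://example.com/products/trs-001",
      "price": "89.99",
      "priceCurrency": "USD",
      "availability": "https://schema.org/InStock"
    },
    "aggregateRating": {
      "@type": "AggregateRating",
      "ratingValue": "4.6",
      "reviewCount": "132"
    }
  }

A Merchant Center feed carries essentially the same attributes (id, title, price, availability, link, image link), which is why keeping feeds and on-page markup consistent matters as more Google surfaces draw on this data.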

Google has also partnered with Shopify to enable the platform’s 1.7 million merchants to show their products across Google Search, Shopping, Image search and YouTube. “With just a few clicks, these retailers can sign up to appear across Google’s one billion shopping journeys each day, from Search to Maps, Images to Lens and YouTube,” Ready said.

Improvements to deep linking implementation. Deep linking can help app developers and marketers surface the most relevant in-app pages to their users. Deep-linked experiences improve ad performance, potentially doubling conversion rates, according to Google’s own data. However, advertiser feedback indicated that implementing deep links often requires coordinating multiple internal teams, aligning on shared KPIs and prioritizing the necessary app updates.

In response, Google announced the deep link validator and impact calculator at Google I/O this week: “Marketers can use these tools in Google Ads to see which types of deep links they have, how to fix ones that aren’t working and estimate the ROI opportunity of implementing deep links.”

Along with easier implementation of deep linking initiatives, Google announced data-driven attribution (DDA) for deep-linked campaigns to help surface which ad interactions and channels drive the most conversions, helping marketers improve their campaigns further.

Why we care. This is a compelling mix of sophisticated, future-directed AI design (the LaMDA and MUM pilots might be game-changers one day) and developments of immediate utility for marketers. It’s particularly notable that the boom in ecommerce during the pandemic has focused Google’s attention on a space where its primary use had historically been research, not discovery and purchase.

Reporting by Carolyn Lyden, Barry Schwartz and George Nguyen.


About the author

Kim Davis
Staff
Kim Davis is currently editor at large at MarTech. Born in London, but a New Yorker for almost three decades, Kim started covering enterprise software ten years ago. His experience encompasses SaaS for the enterprise, digital ad data-driven urban planning, and applications of SaaS, digital technology, and data in the marketing space. He first wrote about marketing technology as editor of Haymarket’s The Hub, a dedicated marketing tech website, which subsequently became a channel on the established direct marketing brand DMN. Kim joined DMN proper in 2016 as a senior editor, becoming Executive Editor and then Editor-in-Chief, a position he held until January 2020. Shortly thereafter he joined Third Door Media as Editorial Director at MarTech.

Kim was Associate Editor at a New York Times hyper-local news site, The Local: East Village, and has previously worked as an editor of an academic publication, and as a music journalist. He has written hundreds of New York restaurant reviews for a personal blog, and has been an occasional guest contributor to Eater.
