Google I/O 2024: What was announced and what did we learn?


On Tuesday, 14th May 2024, Google held its annual I/O keynote, and it’s safe to say that all things AI took centre stage. AI Overviews dominated the announcements and, along with multi-step reasoning, Lens search with video and AI organised search results, highlighted where our digital lives are headed in this new world of AI. Although Google has previously used this keynote to announce new hardware, this year the focus was on AI software updates and on how Google is aiming to command dominance over the current AI boom. 

So, let’s break down what our experts felt were the most important announcements and what we learnt from them. 

The key announcements  

Each year, the Google I/O conference gives us a glimpse into the developments, tests, and new products that Google is planning to launch. This year’s conference was highly anticipated, given that SGE (Search Generative Experience), which aims to fundamentally change how search results are delivered through AI, was launched last year. 

New features and formats being tested and announced by Google are nothing new; SERPs are constantly evolving and focused on delivering the best results. So here are the proposed tests that were announced this year:  

AI Overviews: Is SGE dead? 

Yes, and no. ‘AI Overviews’ is what Google is now calling what was previously its Search Generative Experience. Google explained that AI Overviews answer search queries using Google Gemini (which was previously Bard, Google’s AI model alternative to ChatGPT). The results will feature an AI-generated snippet based on information from around the web. 

It is unclear from the examples shared by Google how prominently the sources of information will be highlighted in AI Overviews. Content was said to be pulled from an array of mediums, including social media, Q&A sites, discussion boards, forums, videos, and images.  

It’s now live in the US and is expected to roll out across 200+ markets throughout 2024. 

Multi-step reasoning  

What is Google’s multi-step reasoning? This was announced as a ‘coming soon’ feature that will bring multi-step reasoning capabilities to Google Search. It will break bigger questions down into parts, figuring out which problems to solve and in what order. The aim is that research that might previously have taken hours will be done in seconds. As with AI Overviews, it will launch in the US first.  

Planning capabilities  

This will enable users to plan activities alongside Google, directly in Google Search. There will be the ability to collaborate with others, export results and amend plans based on your searches. 

The results are not just search results; they are plans built with you in the search interface. Searches for activities such as meal planning, date nights, workouts and holiday planning will likely be the most impacted.  

For example, you can search for “create a 3-day meal plan for a group that's easy to prepare,” and you'll get a starting point with a wide range of recipes from across the web. 

It’s meant to be coming to Search Labs later in 2024.  

AI organised search results  

Search results will be grouped into categories generated by AI, giving the user different options and ideas based on their search. According to Google, this will feature a ‘wide range of perspectives and content types’. 

This will likely have an impact on zero-click searches, as Google is aiming to provide the user with rich information directly in the SERPs. It’s not yet available in the Search Labs environment. The roll-out will start with dining and recipes, followed by films, music, books, hotels, shopping and more.  

Lens search with video 

This new feature will allow users to record a video and ask a question based on what they have recorded. AI will interpret the video content and generate an AI-based result to answer the user’s query. It’s meant to be available soon in the Search Labs test environment, in English in the US, with a wider roll-out expected in 2024.  

Advanced image recognition  

This feature will allow users to perform more in-depth searches of their own photos using natural language, with AI able to find patterns in image content and recognise familiar faces and other facets of imagery to match user queries. For example, you can give the tool your licence plate number, and it will use context to find your car in your photo library. Google Photos software engineer Jeremy Selier says the feature currently doesn’t collect data on your photos. 

Although not specified by Google, in future this may be used in Image Search or become available across other platforms to enhance shopping and commerce experiences. 

How will the new search experience look, and what does this mean for search?  

Google stated that AI Overviews impressions will be reported in Google Search Console. However, the AI Overviews data will be grouped with normal search impressions, so there will be no way to distinguish which search format generated an impression. 

Google also revealed that click-through rates (CTRs) on AI Overview link cards are higher than CTRs across normal web results. However, this will be hard to verify, given that the data in Search Console isn’t being split out.  

AI Overviews will also only be featured on results that need more complex answers, and where value can be added beyond a traditional search result. They are expected to be featured across medical, financial and health queries. However, Google is not sharing what percentage of search results will be AI generated.  

Google also highlighted that AI Overviews would only launch where they will not ‘hurt’ its ad revenues. Previews of new experiences throughout the event showed ads being pushed below the fold by the generative AI responses.  

It’s still not clear how this will impact ads and whether this will be visible within Google Ads – as it will be within Search Console.  

So, what are the main takeaways for brands?  

We see the main takeaways for brands in relation to Search to be threefold:  

  1. Search queries will change - Queries are likely to become conversational and topic-focused. Brands should continue to monitor the Search Engine Results Pages (SERPs) to see where AI Overviews are being rolled out. At Dentsu, we are working with our keyword tracking and tool providers to highlight AI Overview queries. We’ll be focusing on topical ownership and delivering signals that search engines can interpret.
  2. CTR will be impacted - The change is likely to deliver an increase in zero-click searches, which have already been on the rise. To adapt, we will be focusing on impressions in our reporting and looking to replicate searches in the US or Search Labs to learn ahead of the global roll-out.
  3. Matching content format to the search query will be crucial - It’s clear that visual, rich results are being prioritised, so content needs to align to this. We’ll be focusing on AI organised results and delivering an integrated approach to our clients’ marketing strategies. 

In terms of how clients can optimise towards the new features announced at Google I/O, our Head of Technical SEO, Mario La Malfa, has said:  

“While we can’t know exactly what factors Google will use to deem content worthy of being included in AI Overviews, we know that it will still need to be relevant to our audience. When it comes to assets, we know that they will have to be marked up correctly to be easily understood, and catalogued to be readily accessible to Gemini. 

At the risk of repeating ourselves, it will be ever more important to meet customers at each moment of their user journey, matching their intent for each of the stacked questions in a single query, or for each section of the AI-built search result. 

The pages on our websites should helpfully answer potential questions from the users, the category pages and filter options should match groupings that make sense to our target audience, and of course the product pages should be informative and be marked up correctly to seamlessly fit into product grids. 

What is fundamental in the new world of AI organised search results is to optimise the journeys within the websites to mimic those created by Google for our audience.” 
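
To make the markup point above concrete, product pages are typically annotated with schema.org Product structured data (often as JSON-LD) so that search engines can reliably read product details into rich results and product grids. The sketch below is purely illustrative: the product values and URL are hypothetical, and the snippet shows one common way to generate that markup rather than a prescription from Google or a description of our own tooling.

```typescript
// Minimal sketch: generating schema.org Product JSON-LD for a product page.
// All product values below are hypothetical; a real page would populate them
// from the product catalogue (CMS/PIM) at render time.
interface ProductMarkup {
  "@context": "https://schema.org";
  "@type": "Product";
  name: string;
  description: string;
  image: string[];
  offers: {
    "@type": "Offer";
    price: string;
    priceCurrency: string;
    availability: string;
  };
}

const exampleProduct: ProductMarkup = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Example Trail Running Shoe", // hypothetical product
  description: "Lightweight trail running shoe with a cushioned sole.",
  image: ["https://www.example.com/images/trail-shoe.jpg"], // hypothetical URL
  offers: {
    "@type": "Offer",
    price: "89.99",
    priceCurrency: "GBP",
    availability: "https://schema.org/InStock",
  },
};

// Serialise and embed in the page head as a JSON-LD script tag, which is
// how search engines typically read the data for rich results and product grids.
const jsonLd = `<script type="application/ld+json">${JSON.stringify(exampleProduct)}</script>`;
console.log(jsonLd);
```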

In summary, whilst specific outcomes for Google’s latest features are still largely speculative, we did learn a lot about what the future of search is going to look like, with innovation (particularly in relation to AI) leading the way. If your brand needs help moving into this new world of search, then don’t hesitate to get in touch.  

If you’re a or brand, take a look at our latest thinking on AI Overviews (previously SGE). For more information around Ad Spend, our Global Ad Spend Forecasts report can be downloaded here.