Search has evolved continuously since the first engines appeared in the 1990s.
The first era of search engine optimization (SEO) could be described as “quantity over quality.” In those early days, websites ranked in Google search based on how many times a target keyword was repeated on a page and how many links pointed to the site, regardless of whether those links were trustworthy or naturally earned.
As search algorithms evolved to prioritize quality, though, so did SEO. This is when content – real content, not invisible keywords – and naturally earned backlinks gained priority.
After that, mobile-friendliness and usability ruled the roost, followed closely by the demonstration of trust and authority in the form of E-E-A-T signals (experience, expertise, authoritativeness, and trustworthiness).
And today, AI and generative search are all the rage.
Search is the space race of our day, but the goal has always been the same: connecting people with information. That is why people search, isn’t it? We want to satisfy our innate human need for answers. The act of searching is as old as humanity. We’ve always been searchers – first through storytelling and exploration, then through libraries, encyclopedias, and print publications. As technology has evolved, so have the ways and modes by which we search.
Enter: The era of multimodal search.
“We can really see, as we enter this new era of search, that you’ll be able to find exactly what you’re looking for by combining images, sounds, text and speech, organized in a way that makes sense to you and that ultimately helps you make sense of the world.”
– Cathy Edwards, Google VP and GM of Search, Google’s 2022 ‘On Search’ event
What is multimodal search?
“Multimodality” refers to multiple data types combined into one, such as a blog post containing text, audio, and visuals. When we apply this concept to search, we are referring to a search mechanism that can reference and process different inputs or formats at the same time.
One real-world application of this is Google Lens, powered by Google’s AI model, the Multitask Unified Model (MUM). The introduction of MUM in 2021 transformed search: now you can take a photo and run a Google search against that photo.
This functionality (though not powered by MUM) can also be seen elsewhere, such as in apps that identify plants and wildlife from your photos.
Multimodal searching mimics how most humans process information: by combining different modalities. It therefore represents an important step in the advancement of AI and the ways people connect to information.
How can life-sciences marketers respond?
Multimodal search will only continue to expand, so it’s important for pharma brands to tap into its power to future-proof their ability to connect with their target audiences.
Here are five practical, achievable ways to leverage multimodal search in the life-sciences space:
- Increase visibility for “point-and-ask” visual queries (such as those done through Google Lens) by providing images and videos of disease symptoms and disease presentation.
- Provide simply formatted information to increase your eligibility to show in Google results, such as the “Things to know” section. This can include how-tos, tips, step-by-step instructions, or quick facts – ways to help a reader dive into a topic.
- Mark up websites – including text, images, and videos – with schema. This helps connect pieces of information and modalities so MUM (and other similar AI models) can better discover and process your content.
- Leverage multimedia content. Since multimodal search is all about combining different content formats, you will benefit from presenting content in a variety of formats, including video, audio, image, and text.
- Combine great SEO with great content strategy. In many ways, content is still king. But it isn’t enough to just drop content onto your website. Content needs to be thoughtfully and semantically related, authoritative, and mapped against the user journey. In this way, we can increase the number of content touchpoints and ensure that we are answering questions along each point of the user’s journey – something that search engines have a vested interest in, as younger searchers turn to social search for middle- and bottom-of-funnel information.
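To make the schema point above concrete, here is a minimal sketch of what JSON-LD markup for a symptom-explainer video might look like. The page name, URLs, and dates are hypothetical placeholders, and the properties your own pages need will depend on the content type (schema.org documents the full vocabulary). Python is used here only to build and print the JSON-LD object that would be embedded in the page’s HTML inside a `<script type="application/ld+json">` tag.

```python
import json

# Hypothetical example: JSON-LD markup for a disease-symptom explainer video.
# All names, dates, and URLs below are placeholders, not real pages.
video_schema = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Recognizing Early Symptoms of Condition X",
    "description": "A short explainer showing how Condition X typically presents.",
    "thumbnailUrl": "https://example.com/thumbnails/condition-x.jpg",
    "uploadDate": "2024-01-15",
    "contentUrl": "https://example.com/videos/condition-x-symptoms.mp4",
}

# Serialize to the JSON-LD string that would sit inside the
# <script type="application/ld+json"> tag in the page's <head>.
json_ld = json.dumps(video_schema, indent=2)
print(json_ld)
```

Marking up a video this way, alongside matching markup for the surrounding article text and images, is one practical way to connect modalities so that AI models can associate your text, imagery, and video as parts of the same piece of content.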