Artificial intelligence is spurring startups like Perplexity.ai to challenge Google’s dominance in search, which they see as more vulnerable than at any point in the product’s 25-year history.
Google could have copied Perplexity, a new kind of search engine that answers queries in paragraph form and cites source material with research-style footnotes.
Instead, Google’s new generative AI-powered search engine, unveiled Wednesday, preserves much of its traditional style (and, therefore, its core business model), complete with a list of links and plenty of advertising. It will also live separately from core search, inside “Google Labs,” which Google bills as an experiment.
“That shows they’re not brave enough to revamp search yet,” said Perplexity co-founder and CEO Aravind Srinivas in an interview. When he saw Google’s new products, his nervousness turned into relief.
Large language models (LLMs), which power services like ChatGPT, have created an internet search conundrum for ad-dependent companies like Google. Users are turning to chatbots for a single answer instead of dozens of search results and the advertising that accompanies them.
“Soon enough, the traditional 10 blue links on a search engine results page will seem as archaic as dial-up internet and rotary phones,” Sridhar Ramaswamy, former head of Google search and co-founder of Neeva, a new search engine powered by AI, told Semafor.
Despite Google’s dominance in search, investors are interested in startups taking it head on — something that hasn’t been true since the 1990s. At a recent tech gathering in San Francisco, Srinivas was surrounded by venture capitalists from some of the top firms. And last month, Perplexity closed a $25.6 million funding round led by New Enterprise Associates.
To create an AI-powered Google competitor, Perplexity didn’t have to hire an army of AI researchers. It only had to write a prompt, albeit a more complex one than most people type into ChatGPT.
But the company’s secret sauce isn’t the prompt itself; it’s the user interface. In essence, OpenAI is enabling anyone to create their own customized internet search engine, and Perplexity is building a business around that.
One thing to note: Startups powered by OpenAI’s technology can instruct GPT to do a Google search and anchor the answer to real web sites. In essence, Google’s own page ranking system can be used against it by would-be rivals.
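The anchoring approach described above can be sketched in a few lines. This is a hypothetical illustration, not Perplexity’s actual code: the function below formats retrieved search snippets into a prompt that instructs an LLM to answer only from those sources, citing each with research-style footnotes. The function name, prompt wording, and example URLs are all assumptions for illustration.

```python
def build_anchored_prompt(question, snippets):
    """Format retrieved snippets into a citation-grounded prompt.

    `snippets` is a list of (url, text) pairs, e.g. fetched from a
    search API before the LLM is called. Numbering the sources lets
    the model cite them with footnote-style markers like [1].
    """
    sources = "\n".join(
        f"[{i + 1}] {url}\n{text}"
        for i, (url, text) in enumerate(snippets)
    )
    return (
        "Answer the question using ONLY the sources below. "
        "Cite each claim with its source number, e.g. [1].\n\n"
        f"Sources:\n{sources}\n\n"
        f"Question: {question}\nAnswer:"
    )

# The resulting string would then be sent to a chat-completion API;
# because the model is told to cite the numbered sources, its answer
# stays tethered to real web pages rather than free-floating text.
prompt = build_anchored_prompt(
    "Who co-founded Perplexity?",
    [("https://example.com/article",
      "Perplexity was co-founded by CEO Aravind Srinivas.")],
)
```

Because the heavy lifting happens inside the hosted model, a wrapper like this is most of what a startup needs to ship a grounded search product.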
Srinivas said that as of Thursday, the company had 2.9 million monthly active users and 1 million queries per day.
Don’t bet against Google’s ability to continue to dominate internet search. If it sees users fleeing from its core product, it can quickly pivot because it is sitting on some of the best fundamental artificial intelligence technology in the industry.
Search is still Google’s game to lose.
The more radical question raised by Google’s search conundrum — and one that would probably get the company in trouble with Wall Street — is whether it’s open to cannibalizing its business, and the ad dollars that go with it, by building a search engine reimagined from the ground up.
Protecting the search ad business model, but allowing competitors to build better consumer experiences, is not a winning strategy in the long term.
The explosion of businesses built atop LLMs has barely begun. Perplexity shows just how easy and inexpensive it is to spin up new products that grab eyeballs.
Google also has several opportunities to leverage its AI know-how to boost revenue in other areas, reducing its dependence on search ads. Here are a few key areas:
Waymo: Google has self-driving cars roaming around San Francisco, Phoenix, and Austin and will, at some point, begin scaling that business. Now it’s a matter of timing before the service fundamentally changes urban transportation.
Life sciences: Google’s DeepMind upended biology with its AlphaFold discovery. Parent company Alphabet is now launching into the rapidly growing, $80 billion drug discovery industry with Isomorphic Labs. Google has long been building its health industry expertise with other bets, including Verily and Calico Labs.
Cloud computing: Google’s cloud division has only a small portion of the overall $500 billion market. Google is already starting to gain market share and the pie is growing quickly. It became clear at its event Wednesday that it intends to go after new customers by flooding its product offerings with AI enhancements that make its competitors look slow and stodgy.
Room for Disagreement
David Rodnitzky, founder of 3Q Digital and a veteran of the search ads industry, said he believes Google Search is facing an existential threat from LLMs. In an email to Semafor, he wrote: “Google has the technical power to build great AI, and they can use this AI to make their search results better.”
“But it is hard to see how they can maintain their revenue per search if they start to direct users to ‘an answer’ instead of ‘many answers.’ From what I can tell, they are trying to integrate Bard into the [search engine results page] in the hopes that users will continue to use the Google search bar. This enables them to monetize results the way they always have — by placing ads above and below the results. The problem is, this is just not as user friendly as ChatGPT.”
The View From Japan
Google is training LLMs on specific content written in different languages. So a query in Japanese won’t just be like a translated version of ChatGPT; it will be tailored to Japanese culture. Google’s international approach could give it a global advantage over smaller and less resourced competitors. Fun fact: Japanese LLMs should be faster than English versions because Japanese requires fewer “tokens,” a sequence of characters grouped together for semantic processing.
- You might think Google was caught by surprise by the LLM revolution, but the company knew this day would come. It has been discussing a radical change to search using LLMs for at least two years, as this paper from Google Research shows. “Users want to engage with a domain expert, but often turn to an information retrieval system, such as a search engine, instead,” the paper says. This MIT Technology Review article explains it nicely.