The Future of SEO in a Post-ChatGPT World


SEO is the growth engine that powers many companies, so it’s worth paying particular attention to any new technology that might change the rules of search.

While SEO isn’t dying, it is undergoing substantial changes, accelerated by generative AI, and it’s worth hazarding a few educated guesses about what it might look like on the other side. 

1. Massive Increase in Competition for All Search Results

Generative AI will dramatically increase the amount of SEO content published each day. 

It’s difficult to argue otherwise. The time and skill required to research and write a search-optimized article have plummeted to almost zero: a couple of sentences of direction, a handful of copy-pasted examples, and a few clicks in a freemium SaaS product. That’s likely to have a big impact on the competitiveness of most SERPs (search engine results pages). 

Publishing Frequency and Content Length Trend Upwards

Publishing frequency is a powerful growth lever, but for most of SEO’s history, the mechanical process of writing has been the bottleneck. Now, that bottleneck is virtually gone, and publishing twenty articles each month — or forty or a thousand — is much more approachable.

Using ChatGPT, it is, in fact, almost as easy to write a 5,000-word article as a 500-word one, so we can also expect the average length of published content to continue its upward trend. We will see more companies creating more search content targeted at the same number of keywords. Much of this content will languish at zero pageviews — but not all.

Original research and graphic by Orbit Media, crude additions by me

Companies Will Be Less Discerning With the Keywords They Target

When it costs a lot of time and energy to write an article, it makes sense to be discerning about the articles you create, focusing your energy on the highest traffic, lowest competition, or most product-relevant keywords. 

But generative AI removes this constraint and creates an incentive to publish first, think later. Companies will be less discerning and more willing to tackle any keyword with a passing relevance to their product. As we’ve written before:

“Why bother with strategy when you can create hundreds of blog posts at a time? Why bother with prioritization when you can target every keyword? Why bother with competitor analysis when you can just copy your competitors in a fraction of the time?”

The Search Singularity: How to Win in the Era of Infinite Content

Programmatic SEO Becomes Commonplace

Some companies will use AI as a creative co-pilot. Others will entrust it with first drafts, leaving reviewing and editing to a human. Others will use AI content as an exploratory tool, publishing hundreds of so-so articles and improving those that show the best performance.

But the impact of these use cases will be dwarfed by the impact of programmatic SEO.

For publishers everywhere, there has always been a trade-off between content volume and content quality: publishing a thousand articles in a month usually requires a huge compromise on the relevance and uniqueness of the content. With GPT-4, that constraint may no longer exist. Add in GPT-4’s ability to write and troubleshoot Python scripts for web scraping, and it becomes clear that the barriers to entry for programmatic SEO are virtually zero. 
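To make the "virtually zero barriers to entry" point concrete, here is a minimal sketch of a programmatic SEO pipeline: expanding one page template across a keyword dataset. The template, tool names, and `draft_article` stub are hypothetical illustrations; in practice, the drafting step would call an LLM API such as GPT-4.

```python
# Minimal programmatic SEO sketch: one template, many keyword pairs.
# draft_article is a hypothetical stand-in for an LLM call that writes
# a full draft from a title.

TEMPLATE = "{tool_a} vs. {tool_b}: Which CRM Is Right for You?"

def draft_article(title: str) -> dict:
    """Stand-in for an LLM call; returns a page stub with a URL-safe slug."""
    slug = title.lower().replace(" ", "-").replace(":", "").replace(".", "").replace("?", "")
    return {"title": title, "slug": slug, "body": f"(AI-generated draft for '{title}')"}

def generate_pages(pairs):
    """Produce one page per keyword combination -- hundreds scale the same way."""
    return [draft_article(TEMPLATE.format(tool_a=a, tool_b=b)) for a, b in pairs]

pages = generate_pages([("Salesforce", "HubSpot"), ("Pipedrive", "Zoho")])
print(pages[0]["title"])  # Salesforce vs. HubSpot: Which CRM Is Right for You?
```

Swap the two-item list for a scraped dataset of thousands of rows and the economics of the trade-off described above change completely.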

2. Google Prioritizes Off-Page Ranking Factors

Much of the content published with AI will be some flavor of copycat content: a rehashing of the same core ideas found in the existing content on a given topic.

Generative AI exacerbates a well-established problem: marketers creating content based on the same tools and the same source material. GPT-4 is trained on a huge corpus of writing, but for many topics (like “top tips for CRM adoption”), generative AI will be drawing on the same tired skyscraper articles as everyone else. The problem is worsened by generative AI’s inability to create new information: it can’t collect new data, have personal experiences, or perform credible industry analysis.

This will create a huge increase in articles that are virtually indistinguishable from one another. That’s bad for the user experience, making it harder to surface new information, and bad for Google, curtailing the user journey. Google will be incentivized to find better ways to differentiate between near-identical content.

In practice, that could mean:

Off-Page Ranking Factors Become More Important

In the presence of copycat content — where the words on the page across competing articles are largely the same — Google could place greater emphasis on off-page ranking factors, like backlinks, as a means of differentiating between similar content. Big, recognizable brands with established backlink moats could become even harder to challenge.

As Risto Rehemägi, co-founder at ContentGecko, shared on LinkedIn, user signals might also see an uptick in relevance:

“As the web expands, it’s increasingly costly to maintain this index and interlinking data. Authority calculation will likely shift from domains to actual authors (social media will benefit from this). So I’d rather put my money on user-signals — are the users happy with this result?”

Content Authorship Emphasized

Bylined content may also become more important as Google puts more emphasis on the author of the content and less on the content itself. We can already see the early signs of this change in Google’s recently updated quality rater guidelines:

“Now to better assess our results, E-A-T is gaining an E: experience. Does content also demonstrate that it was produced with some degree of experience, such as with actual use of a product, having actually visited a place or communicating what a person experienced? There are some situations where really what you value most is content produced by someone who has first-hand, life experience on the topic at hand.”

We will see more situations where “getting the right answer from the wrong person isn’t good enough” — Google will differentiate between similar articles based on the expertise of the authors who wrote them. We may even see a return to prominence for publishers — trusted entities that can vouch for the authenticity of the content. 

Information Gain Prioritized

One of Google’s patents already offers a speculative solution to the problem of copycat content: information gain scores, rewarding articles for bringing new information to the discussion and penalizing those that don’t.

This possibility paints an optimistic picture: instead of the current system that rewards companies for “skyscraper”-ing existing search results, brands could be rewarded for conducting research and deviating from a given SERP’s status quo. In practice, that could entail:

  • Creating content that complements and builds on existing articles by providing a practical next step, elaborating on key ideas, or offering more depth and detail
  • Experimenting with risky framings and angles by addressing unserved intent, filling in missing information, challenging differing opinions, or correcting Google's comprehension
  • Incorporating original research into content through personal perspectives or customer surveys or adding quotes from subject matter experts to create an information moat
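As a rough illustration of the idea behind information gain, here is a sketch that scores a draft’s novelty against the existing articles on a SERP using word overlap (Jaccard similarity). This is an illustrative stand-in only: Google’s actual scoring method is not public, and the example articles are invented.

```python
# Crude novelty check: compare a draft against existing top-ranking articles
# using Jaccard word overlap. A stand-in for the patent's information gain
# score, which is not publicly documented.

def tokens(text: str) -> set:
    return set(text.lower().split())

def novelty_score(draft: str, corpus: list) -> float:
    """1.0 = entirely new vocabulary; 0.0 = identical to an existing article."""
    overlaps = [
        len(tokens(draft) & tokens(doc)) / len(tokens(draft) | tokens(doc))
        for doc in corpus
    ]
    return 1.0 - max(overlaps)

serp = [
    "top tips for crm adoption train your team",
    "crm adoption tips train your sales team",
]
copycat = "top tips for crm adoption train your sales team"
original = "our survey of 200 sales leaders on crm adoption failure rates"

print(novelty_score(copycat, serp) < novelty_score(original, serp))  # True
```

The copycat draft reuses the SERP’s vocabulary and scores low; the survey-based draft introduces new information and scores high — exactly the behavior an information gain signal would reward.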

Read more: The Winner Doesn’t Take It All: ‘Information Gain’ and the New Future of SEO

3. Massively Reduced Returns From Search

In the past decade, SEO and content marketing have crossed the chasm and become de facto marketing channels for virtually all growth-minded companies. That growth has created a gradual but very noticeable diminishing of returns. The more popular a tactic, the greater the competition, and the harder it becomes to earn outsized results for any period of time. Based on our experience working with hundreds of SaaS companies, SEO today is harder than it was yesterday.

It’s possible that generative AI and ChatGPT represent a tipping point that changes SEO’s value equation even further, making it significantly harder to earn great results.

More Zero-Click Searches

Google’s Bard and Bing’s Chat show one possible future for search engines in a post-ChatGPT world. Users can interact with the search engine through fluid, natural conversation; the AI model can pore through existing search results for specific information and then — crucially — synthesize it all into a brand-new conversational text response, answering the question right there in the search results.

This is wonderful for searchers and terrible for publishers: instead of directing searchers to your website, rewarding you for your investment in content creation, search engines could use your hard work to answer queries directly in the search results.

Citations (like the image below) represent one possible solution, but the question remains: why bother letting search engines index your content if they will never send traffic to your website?

Source: Google explains why Bard rarely lists citations and links to content creators

Greater Fragmentation of Search

An uptick in interest in Bing (the search engine saw a waitlist of over 1 million people on the back of its AI-powered search announcement) is a small example of the impact LLMs could have on the search ecosystem. There’s also the possibility that LLMs will create a wider fragmentation of search, generating a host of niche engines trained on specific datasets.

The modern search paradigm represented by Google is, actually, pretty bad. We have grown accustomed to using a blunt tool: it is difficult to interact with, it funnels traffic unequally to a tiny percentage of the world’s content, it’s prone to gaming, and most important of all, it’s rife with erroneous and untrusted data.

LLMs make an interesting alternative possible, as demonstrated by Dan Shipper’s Huberman bot:

  • take a niche topic (“health and wellness”),
  • find a trusted data set (the Huberman Lab’s podcast), 
  • and use LLM-fueled querying to build your own alternative search engine.
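The retrieval core of such a niche engine can be sketched in a few lines: score transcript snippets against a query and surface the best match. A production version would use embeddings and an LLM to synthesize an answer from the top passages; the snippets below are hypothetical placeholders, not real Huberman Lab transcript text.

```python
# Sketch of the retrieval step in a niche search engine: rank a trusted
# dataset's passages by word overlap with the query. Real systems would use
# embeddings plus LLM answer synthesis; the transcript lines are invented.

def best_passage(query: str, passages: list) -> str:
    """Return the passage sharing the most words with the query."""
    q = set(query.lower().split())
    return max(passages, key=lambda p: len(q & set(p.lower().split())))

transcripts = [
    "morning sunlight exposure helps anchor your circadian rhythm",
    "caffeine timing affects adenosine clearance and sleep quality",
    "zone 2 cardio improves mitochondrial function",
]

print(best_passage("when should I drink caffeine for better sleep", transcripts))
# caffeine timing affects adenosine clearance and sleep quality
```

Because the dataset is small and trusted, the engine sidesteps the spam and gaming problems described above — which is precisely what makes the fragmentation scenario plausible.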

Even if these niche search engines don’t take off in a big way, each one represents a diminishing of Google’s monopoly on search. The search of today is unlikely to look like the search of tomorrow.

Crucially, all of these trends already exist today. Generative AI is likely to accelerate the “maturation” we’re seeing in search: greater competition seeping into all but the smallest of niches, fewer uncontested keywords, and a continued consolidation of search in favor of big, established brands. SEO will require more effort for smaller returns.

So What Should I Do?

These ideas sit firmly in the realm of speculation based on second-order thinking (and you should be intensely skeptical of anyone who claims certainty). We use SEO for ourselves and our customers; it continues to generate great results, and there is still opportunity for incredible ones. It just requires a little more thought and better execution than it once did.

If you feel so inclined, opportunities abound for companies willing to ride the AI wave. It's still early: there is leverage to be found through early adoption, and many companies will build huge traffic empires on the back of this technology. Longer term, it might be worth hedging in a few ways:

  • Build team strength in areas beyond just “writing.” Content writing is a small portion of a great content marketer's total skill set. Build a team with chops in higher-leverage areas: data analysis, content distribution, industry research, technical SEO, and editorial. Hire marketing generalists willing to experiment.
  • Develop your own flavor of “information gain.” Bring original information to your content in a way that aligns with your team’s strengths: build a network of SMEs, start a yearly benchmark report, or get comfortable sharing your team’s personal experiences and insights.
  • Diversify beyond search. Imagine a worst-case scenario where traffic from SEO disappears: how would you build your business? For many companies, the answer lies in social content, community, and media marketing, so it’s worth launching exploratory efforts.

For now, experimentation is the order of the day: play with generative AI, explore weird and wonderful use cases, and discover its strengths and limitations for yourself.