Can You Protect Your Website from an SEO Heist?
A couple of months ago, an SEO consultant published a controversial Twitter/X thread explaining how he had “pulled off an SEO heist”.
The thread got significant backlash: SEOs derided his actions, journalists worried about the future of the internet, and random users complained that “Google results are terrible now” and blamed the consultant for it. Even the owner of one of the affected websites wrote a heartfelt comment criticizing the strategy.
In this post, we’ll analyze the SEO heist from a new perspective. As a company that has invested thousands of dollars in content marketing, how can you protect your website from an SEO heist? Is asking GPT-4 not to scrape your content good enough?
Let’s get to the bottom of it! But first, let’s break down the components of this “SEO heist”.
SEO Heist Essentials
This controversial “SEO heist” was actually a very simple process. It involved:
- Taking a competitor’s public sitemap
- Using it as a blueprint to create thousands of content pieces using AI
- Publishing the new content
- Profit?
Of course, if you publish thousands of relatively keyword-optimized, reliable-looking pages, Google will boost you, at least for a while. But the fake growth created by AI content doesn’t last: in this case, traffic declined as fast as it had increased. And with low-quality content, conversions are unlikely. That’s why the main pillar of our AI SEO strategy is not having AI write the content.
Here are the results of the SEO heist:
We know that this strategy isn’t successful. But that doesn’t mean that black hat SEOs and agencies won’t try it. So - how can you protect your website from being used by a competitor for an SEO heist?
Can You Protect Your Website from an SEO Heist?
The heist relied on 2 elements:
- Competitors’ sitemaps
- Competitors’ existing content
A potential way to protect your site could be to hide your sitemap and prevent AI bots from crawling it. But - is it good enough? Let’s take a closer look.
Can you hide your XML sitemap from competitors, but not from search engines?
You can give your sitemap an odd name, hide it within a subdirectory, and exclude it from your robots.txt file. Just remember to submit its URL in Google Search Console (GSC) so Google can still find it.
If you’re using Cloudflare, you can easily set up a firewall rule to restrict access to your sitemap file.
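For illustration only (the “sitemap” path fragment is an assumption; adjust it to wherever your sitemap actually lives), a Cloudflare custom rule could pair a blocking action with an expression like this:

(http.request.uri.path contains "sitemap") and (not cf.client.bot)
Action: Block

The cf.client.bot field matches Cloudflare’s list of known good bots, so crawlers like Googlebot and Bingbot can still fetch the sitemap while other visitors are blocked.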
Can you block AI crawlers from your website?
You can prevent AI crawlers from accessing your website by disallowing them on your site’s robots.txt file.
You can block GPTBot by adding:
User-agent: GPTBot
Disallow: /
And you can block Google’s Bard (via the Google-Extended token) by adding:
User-agent: Google-Extended
Disallow: /
Of course, you can fine-tune these settings if you want bots to be allowed in certain directories and excluded from others, as in the sketch below.
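As a quick sketch (the /templates/ and /blog/ directories are placeholders, not paths from any specific site), a fine-tuned robots.txt could look like this:

User-agent: GPTBot
Allow: /templates/
Disallow: /

User-agent: Google-Extended
Disallow: /blog/

With these rules, GPTBot can only crawl /templates/, while Google-Extended is kept out of /blog/ but can reach the rest of the site.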
But are there any cons to blocking AI crawlers from your site? Let’s find out.
Pros & cons of disallowing AI crawling
One thing’s for sure: finding answers through GPT-4 or Bing’s conversational search can be quicker and easier than through Google.
Google is still most people’s go-to search engine. But we’ve had clients getting organic leads through ChatGPT. Plus, we’ve had conversations with C-suite leaders who’ve told us that they look for B2B products on ChatGPT. So, when some SEOs say that the future of discoverability is conversational AI, they may be right. That may even be the case for relatively high-stakes B2B niches.
With that in mind, hiding your brand’s existence from AI crawlers may be locking yourself out of a great opportunity. That’s the biggest con. The biggest pro is that your content wouldn’t be used to produce material that can compete against your site. But one could argue that such content would be produced anyway, using your competitors’ material.
Of course, whether you should block AI crawlers from your website is a tricky issue. And it’s up to you to decide whether it’s right for your brand. We don’t have a definite answer.
But there’s another way you can protect yourself from AI-powered black hat SEO.
The Best Way to Protect Yourself Against an SEO Heist
The content collection that was subject to “the SEO heist” was a series of Excel tutorials. Most of the AI-generated Excel tutorials included hallucinations. So they were low-quality content pieces that frustrated users and provided no real answers.
But, in principle, the idea for the collection was good. “How-tos” could be the sort of actionable, straight-to-the-point content piece that AI gets right. We’re just not there yet.
The best way to protect yourself against an SEO heist is to create content that has competitive advantages that AI can’t emulate.
AI can create (relatively) scannable and SEO-optimized material. But it can’t produce nuanced, reliable answers for specific, long-tail topics. And it can’t outline content in a way that makes it fun to read and original. AI writers can’t tell evocative real-life stories or form opinions.
Additionally, fact-checking can be a great competitive advantage. Most teams producing AI content don’t even bother to fact-check their outputs. And they may be unable to because they’re not producing content at a human scale. Who can fact-check 35,000 AI-generated posts?
Quick reminder: Google’s content quality standards are summarized in the acronym E-E-A-T, which stands for Experience, Expertise, Authoritativeness & Trustworthiness.
So, we recommend you:
- Create product-led pieces that leverage your product’s unique value
- Add (relevant) anecdotes and testimonials
- Use information depth and accuracy as a competitive advantage
- Outline your posts to maximize impact & information density, not keyword placement
- Keep your users’ pain points & goals in mind
We can help you achieve that and win the SERPs with actual quality content.
Impactful Content, Written by Our Best Humans
We collaborate with founders & founding teams, helping them grow, maximize sales efficiency & improve their positioning through unique content.
Looking for a content partner that brings actual results? Let’s have a 15-minute chat.