
Why AI Crawlers Cannot See Your Single Page Application

AI crawlers like GPTBot, ClaudeBot, and PerplexityBot cannot execute JavaScript. If your website is a SPA without Server-Side Rendering, it does not exist for these systems. Learn why this is a critical problem and how to solve it.

Ranketic Research · March 20, 2026 · 7 min read

Introduction: The Invisible Wall

Imagine running a shop in a prime high-street location. The interior is modern, the products compelling, the prices fair. But the shop window is completely opaque. Pedestrians walk past and see: nothing. That is exactly what happens to your website when it runs as a Single Page Application (SPA) without Server-Side Rendering – at least from the perspective of the AI systems that increasingly determine which businesses get recommended and which do not.

This article explains why this happens, what the concrete consequences are, and what you can do about it. The analysis is based on current research findings, official documentation, and empirical analyses of billions of crawler requests.

What Is a Single Page Application?

A Single Page Application is a web application that, after the initial page load, no longer requests complete HTML documents from the server. Instead, the entire user interface is constructed by JavaScript running in the visitor's browser. Frameworks like React, Vue.js, and Angular have popularized this approach because it enables fast, fluid user experiences – similar to a desktop application.

The technical principle is straightforward: the server delivers a minimal HTML file consisting essentially of an empty container (typically <div id="root"></div>) and one or more JavaScript files. Only when the browser executes these scripts does the actual content appear. For human visitors with modern browsers, this works perfectly. For crawlers that cannot execute JavaScript, the page remains empty.
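A typical `index.html` returned by a SPA server looks something like the following (file names and the title are illustrative):

```html
<!DOCTYPE html>
<html>
  <head><title>My Shop</title></head>
  <body>
    <!-- Empty container: all visible content is injected by JavaScript -->
    <div id="root"></div>
    <script src="/assets/app.js"></script>
  </body>
</html>
```

This is the entirety of what a non-rendering crawler receives: no product names, no prices, no text at all.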

How Search Engines and AI Crawlers Process Websites

To understand the problem, one must know how different systems read websites. The differences are fundamental.

Google's Three-Phase Process

Google processes JavaScript-based websites in three phases: Crawling, Rendering, and Indexing. During crawling, the server's HTML response is retrieved. The page is then placed in a render queue, where a headless Chromium browser (the same technology as Google Chrome, just without a visible window) executes the JavaScript and produces the finished content. Only then is the page indexed.[1]

This process works but has drawbacks: the render queue can cause delays ranging from seconds to longer periods. Google itself explicitly recommends: "Server-side or pre-rendering is still a great idea because it makes your website faster for users and crawlers, and not all bots can run JavaScript."[1]

AI Crawlers: No JavaScript, No Content

The crawlers of major AI systems – including OpenAI's GPTBot, Anthropic's ClaudeBot, and PerplexityBot – operate fundamentally differently from Googlebot. They send an HTTP request, read the returned HTML response, and move on. None of these crawlers execute JavaScript.

A comprehensive analysis by Vercel, based on real-world crawler data from MERJ, Resume Library, and CV Library, confirms this unequivocally: none of the major AI crawlers render JavaScript. Although ChatGPT's crawler fetches JavaScript files (11.50% of its requests) and Claude's crawler does likewise (23.84%), these files are never executed.[2]

A separate analysis of over 500 million GPTBot requests found zero evidence of JavaScript execution.[3]
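The distinction is easy to check mechanically. Below is a minimal TypeScript sketch of the kind of heuristic one might use to decide whether a raw HTML response contains real content or just an empty app shell; the function name and the 200-character threshold are illustrative assumptions, not part of any crawler's documented behavior.

```typescript
// Heuristic check: does a raw HTML response contain meaningful visible text,
// or only an empty SPA app shell? (Sketch; threshold is an assumption.)
function looksLikeEmptyShell(rawHtml: string, minTextLength = 200): boolean {
  // Strip scripts, styles, and all tags, leaving only visible text.
  const text = rawHtml
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<style[\s\S]*?<\/style>/gi, "")
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim();
  return text.length < minTextLength;
}
```

Run against a typical client-side-rendered SPA response, such a check returns `true` — which is effectively what every non-rendering AI crawler "sees."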

Overview: Who Sees What?

| Crawler | JavaScript Execution | What It Sees on a SPA |
| --- | --- | --- |
| Googlebot | Yes (headless Chrome) | Full content after JS execution |
| Google Gemini | Yes (uses Googlebot) | Full content |
| AppleBot | Yes (browser-based) | Full content |
| GPTBot (OpenAI) | No | Empty app shell |
| ClaudeBot (Anthropic) | No | Empty app shell |
| PerplexityBot | No | Empty app shell |
| Bingbot | Limited | Partial content |
| Meta-ExternalAgent | No | Empty app shell |
| Bytespider (ByteDance) | No | Empty app shell |
| CCBot (Common Crawl) | No | Empty app shell |

Sources: Vercel Research (Dec. 2024)[2], Passionfruit Analysis (Mar. 2026)[3]

The Scale of the Problem: Numbers and Facts

Current figures illustrate the magnitude of this problem. OpenAI's GPTBot alone generated 569 million requests per month on the Vercel network, with Anthropic's ClaudeBot following at 370 million. Combined, this represents approximately 20% of Google's 4.5 billion requests during the same period.[2] GPTBot traffic is growing at 305% year-over-year, and AI bots now account for 4.2% of all HTML page requests.[3]

These numbers make clear: AI crawlers are no longer a fringe phenomenon. They represent a rapidly growing share of total web traffic – and thus an increasingly important channel for business visibility.

Split Visibility: Google vs. AI

What makes this problem particularly insidious is that it often goes unnoticed. A SPA can rank excellently on Google – because Googlebot can execute JavaScript – while being completely invisible to all AI systems. Website operators who only monitor their Google rankings will not detect the problem.

However, when someone asks ChatGPT "Which provider is the best for [your product]?", your website will never be recommended – not because your offering is poor, but because the AI has literally never seen your content. The same applies to Perplexity, Claude, and every other AI system that relies on its own crawlers.

Research confirms the correlation between rendering speed and AI visibility: pages with a First Contentful Paint under 0.4 seconds receive an average of 6.7 ChatGPT citations, compared to just 2.1 for slower pages. For a client-side rendered SPA whose content never appears in the initial HTML response, there is no First Contentful Paint from the AI crawler's perspective at all.[3]

Generative Engine Optimization: The Scientific Foundation

Academic research has already formalized the importance of AI visibility. Aggarwal et al. introduced the concept of Generative Engine Optimization (GEO) at the prestigious ACM SIGKDD conference in 2024 – a framework for improving content visibility in generative AI system responses. Their experiments demonstrated that targeted GEO strategies can boost visibility by up to 40%.[4]

Yet even the best GEO strategies are futile if the AI crawler cannot read the content in the first place. A SPA without Server-Side Rendering renders every form of Generative Engine Optimization ineffective because the fundamental prerequisite is missing: readable content in the HTML response.

The Solution: Server-Side Rendering

The good news: the problem is solvable without sacrificing the advantages of modern JavaScript frameworks. The solution is called Server-Side Rendering (SSR).

With Server-Side Rendering, HTML content is generated on the server before being sent to the browser or crawler. The visitor receives a complete HTML page with all text, headings, and metadata. JavaScript in the browser then takes over interactivity – a process known as hydration. This way, human visitors continue to enjoy the fluid SPA experience, while crawlers can immediately read the complete content.
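As a minimal illustration of the principle – not a Next.js or framework API; the data and markup are invented for this sketch – a server-rendered page can be produced as a complete HTML string before anything is sent to the client:

```typescript
// Hypothetical product data; in a real app this would come from a database or CMS.
const product = { name: "Acme Widget", price: "49 €", description: "A sturdy widget." };

// The server builds the full HTML up front, so crawlers that never execute
// JavaScript still receive all text, headings, and metadata.
function renderProductPage(p: { name: string; price: string; description: string }): string {
  return `<!DOCTYPE html>
<html>
  <head><title>${p.name}</title></head>
  <body>
    <main id="root">
      <h1>${p.name}</h1>
      <p>${p.description} Price: ${p.price}</p>
    </main>
    <!-- The client bundle can hydrate #root afterwards for interactivity. -->
    <script src="/app.js"></script>
  </body>
</html>`;
}
```

Frameworks like Next.js or Nuxt do exactly this behind the scenes, then hand the rendered markup over to the client bundle for hydration.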

Modern Frameworks with SSR Support

| Framework | SSR Solution | Base Technology |
| --- | --- | --- |
| React | Next.js | Node.js |
| Vue.js | Nuxt | Node.js / Nitro |
| Angular | Angular Universal | Node.js |
| Svelte | SvelteKit | Node.js |

Pre-Rendering as an Alternative

For teams that cannot implement a full SSR migration immediately, pre-rendering offers a lighter alternative. Static HTML snapshots of SPA pages are generated and served specifically to crawlers, while human visitors receive the dynamic JavaScript version. Services like Prerender.io automate this process. The downside: snapshots must be regenerated regularly to reflect current content.
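A pre-rendering setup typically routes requests by user agent. The TypeScript sketch below shows the idea; the token list reflects the bots' published user-agent strings, but the routing function itself is an illustrative assumption, not Prerender.io's actual middleware:

```typescript
// User-agent tokens of known non-rendering AI crawlers (per the vendors'
// published bot documentation).
const AI_CRAWLER_TOKENS = [
  "gptbot", "claudebot", "perplexitybot",
  "bytespider", "ccbot", "meta-externalagent",
];

function isAiCrawler(userAgent: string): boolean {
  const ua = userAgent.toLowerCase();
  return AI_CRAWLER_TOKENS.some((token) => ua.includes(token));
}

// Illustrative routing decision: crawlers get a static snapshot,
// human visitors get the dynamic SPA bundle.
function chooseResponse(userAgent: string): "snapshot" | "spa" {
  return isAiCrawler(userAgent) ? "snapshot" : "spa";
}
```

In practice this decision lives in a reverse proxy or edge middleware, and the token list must be kept up to date as new crawlers appear.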

A case study demonstrates the effectiveness: after implementing pre-rendering for a SPA, AI bots accounted for 47.95% of all requests – evidence of how quickly AI crawlers engage with newly accessible content.[3]

Practical Recommendations

Regardless of whether you choose SSR or pre-rendering, you should implement the following steps:

  1. Disable JavaScript and test: Open your website in a browser and disable JavaScript. If product descriptions, prices, FAQ answers, or comparison tables disappear, those pages are invisible to AI crawlers.
  2. Check the source code: Right-click on a page and select "View Page Source." If you see your actual text content, it is server-side rendered. If you see only an empty <div> and script tags, you have a problem.
  3. Verify schema markup: Ensure that structured data, canonical URLs, and meta descriptions are present in the raw HTML source – not only after JavaScript execution.
  4. Monitor server logs: Check whether GPTBot, ClaudeBot, and OAI-SearchBot receive your pages with status 200 and content-rich HTML – or just empty shells.
  5. Use Ranketic: Our analysis tool automatically detects whether your website is a SPA without SSR and shows you exactly which indicators point to it.
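Steps 2 through 4 above can be partially automated. The following TypeScript sketch audits a raw HTML string for the signals mentioned; the interface, regexes, and text-length threshold are illustrative assumptions, not a standard tool:

```typescript
// Illustrative audit of a raw (pre-JavaScript) HTML response.
interface RawHtmlAudit {
  hasTextContent: boolean;     // visible text present without JS?
  hasMetaDescription: boolean; // <meta name="description"> in raw HTML?
  hasJsonLd: boolean;          // JSON-LD structured data in raw HTML?
}

function auditRawHtml(rawHtml: string): RawHtmlAudit {
  const visibleText = rawHtml
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim();
  return {
    hasTextContent: visibleText.length > 200,
    hasMetaDescription: /<meta\s+name=["']description["']/i.test(rawHtml),
    hasJsonLd: /<script[^>]+type=["']application\/ld\+json["']/i.test(rawHtml),
  };
}
```

Feeding in the HTML your server actually returns (for example, the output of a plain HTTP fetch without a browser) gives a quick first answer to whether AI crawlers see content-rich pages or empty shells.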

Conclusion: Act Now

The question is not whether AI-powered search will become relevant – it already is. With annual growth of over 300% in AI crawler requests and the increasing integration of AI responses into users' daily lives, visibility for AI systems is becoming a decisive competitive factor.

If your website is a Single Page Application without Server-Side Rendering, it simply does not exist for these systems. Every day without a solution is a day when your competitors – whose content is readable by AI crawlers – get recommended and you do not.

The transition to SSR or pre-rendering is a one-time architectural decision with permanent returns. The sooner you act, the sooner you benefit from the growing reach of AI-powered search.


References

[1] Google Search Central: "Understand the JavaScript SEO basics", developers.google.com/search/docs/crawling-indexing/javascript/javascript-seo-basics

[2] Vercel Research: "The rise of the AI crawler", December 2024. vercel.com/blog/the-rise-of-the-ai-crawler

[3] Passionfruit: "JavaScript Rendering and AI Crawlers: Can LLMs Read Your SPA?", March 2026. getpassionfruit.com/blog/javascript-rendering-and-ai-crawlers-can-llms-read-your-spa

[4] Aggarwal, P. et al.: "GEO: Generative Engine Optimization", ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD '24), August 2024. doi.org/10.1145/3637528.3671900

[5] seoClarity: "Optimizing Single-Page Applications for SEO & AI Search", January 2026. seoclarity.net/blog/single-page-applications

