Introduction: The Invisible Wall
In short: Content that is present in the server's HTML response is readable and discoverable; content that only appears after JavaScript execution is not.
Imagine this: You have a store downtown. The location is great, the setup is modern, the offerings are convincing, the prices are fair. But the shop window is opaque. Passersby walk past and see nothing. The same thing happens to your website if it runs as a Single Page Application (SPA) without server-side rendering, at least from the perspective of AI systems. And these systems increasingly decide which companies get recommended and which do not.
This article explains why that is, what the consequences are, and what you can do about it. The presentation is based on current research results, official documentation, and empirical analyses covering billions of crawler requests.
What is a Single Page Application?
A Single Page Application is a web application that, after the initial page load, no longer requests full HTML documents from the server. Instead, JavaScript builds the entire interface in the visitor's browser. Frameworks like React, Vue.js, and Angular have popularized this approach because it enables fast, fluid experiences, similar to a desktop application.
The technical principle is simple: The server delivers a minimal HTML file that essentially consists of an empty container, typically <div id="root"></div>, plus one or more JavaScript files. Only when the browser executes these scripts does the real content appear. For human visitors with modern browsers, this works flawlessly. For crawlers that can't execute JavaScript, the page remains empty.
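What such a server returns can be sketched in a few lines. The following TypeScript/Express example is illustrative only; the file names and the root id are assumptions, not any specific framework's output:

```typescript
import express from "express";

const app = express();

// Every route returns the same near-empty shell. The real content
// only appears after the browser downloads and executes bundle.js.
app.get("*", (_req, res) => {
  res.send(`<!DOCTYPE html>
<html>
  <head><title>My Shop</title></head>
  <body>
    <div id="root"></div>
    <script src="/bundle.js"></script>
  </body>
</html>`);
});

app.listen(3000);
```

A crawler that never executes /bundle.js sees nothing but the empty container.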
How Search Engines and AI Crawlers Process Websites
To understand the problem, you need to know one thing: different systems read websites in fundamentally different ways.
Google's Three-Step Process
Google processes JavaScript-based websites in three phases: crawling, rendering, and indexing. During crawling, Google retrieves the server's HTML response and queues the page for rendering. There, a headless Chromium browser, the same technology as Google Chrome but without a visible window, executes the JavaScript and generates the finished content. Only then does Google index the page.[1]
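Conceptually, a rendering crawler behaves like the following sketch. It uses Puppeteer as a stand-in for Google's Web Rendering Service; this is an analogy for illustration, not Google's actual pipeline:

```typescript
import puppeteer from "puppeteer";

// Load a page in headless Chromium, execute its JavaScript,
// and read the DOM that results, roughly what a rendering
// crawler does before indexing.
async function renderLikeGooglebot(url: string): Promise<string> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: "networkidle0" }); // wait for JS to settle
  const renderedHtml = await page.content(); // the post-JavaScript DOM
  await browser.close();
  return renderedHtml;
}

renderLikeGooglebot("https://example.com").then((html) =>
  console.log(`${html.length} characters after rendering`)
);
```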
This process works, but it has drawbacks: the rendering queue can cause delays ranging from seconds to considerably longer. Google itself explicitly recommends: “Server-side or pre-rendering is still a great idea because it makes your website faster for users and crawlers, and not all bots can run JavaScript.”[1]
AI Crawlers: No JavaScript, No Content
The crawlers of the major AI systems, among them OpenAI's GPTBot, Anthropic's ClaudeBot, and PerplexityBot, work differently than Googlebot. They send an HTTP request, read the returned HTML response, and move on. None of these crawlers execute JavaScript.
A comprehensive analysis by Vercel, based on real data from MERJ, Resume Library, and CV Library, confirms this clearly: none of the major AI crawlers render JavaScript. ChatGPT's crawler does fetch JavaScript files (11.50% of its requests), as does Claude's (23.84%), but these files are never executed.[2]
A separate analysis of over 500 million GPTBot requests found no evidence of JavaScript execution.[3]
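You can approximate what such a crawler sees with a few lines of code. This sketch fetches a page the way a non-rendering bot does, one HTTP request and no JavaScript execution; the URL is a placeholder and the tag stripping is deliberately crude:

```typescript
// Fetch a page like a non-rendering crawler: one HTTP request,
// no JavaScript execution, no headless browser.
async function crawlLikeAnAiBot(url: string): Promise<string> {
  const res = await fetch(url, {
    headers: { "User-Agent": "GPTBot" }, // identify like an AI crawler
  });
  const html = await res.text();

  // Crude text extraction: drop scripts and styles, then all tags.
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<style[\s\S]*?<\/style>/gi, "")
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim();
}

// For a client-side-rendered SPA this typically prints little or nothing.
crawlLikeAnAiBot("https://example.com").then(console.log);
```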
Overview: Who Sees What?
Googlebot: executes JavaScript in headless Chromium and sees SPA content, after a rendering delay.
GPTBot (OpenAI): does not execute JavaScript and sees only the empty HTML shell.
ClaudeBot (Anthropic): does not execute JavaScript and sees only the empty HTML shell.
PerplexityBot: does not execute JavaScript and sees only the empty HTML shell.
Sources: Vercel Research (Dec. 2024)[2], Passionfruit Analysis (Mar. 2026)[3]
The Extent of the Problem: Numbers and Facts
Current numbers clearly show the scope of this problem. OpenAI's GPTBot alone generated 569 million requests per month on Vercel's network. Anthropic's ClaudeBot followed with 370 million. Together, this amounts to about 20% of Google's 4.5 billion requests in the same period.[2] GPTBot's volume is growing by 305% per year. AI bots now account for 4.2% of all HTML page requests.[3]
These numbers show that AI crawlers are no longer a fringe phenomenon. They represent a rapidly growing share of total web traffic and are thus an increasingly important visibility channel for companies.
Split Visibility: Google vs. AI
The tricky part of this problem is that it often goes unnoticed. An SPA can rank excellently on Google, because Googlebot can execute JavaScript, while being completely invisible to every AI system at the same time. Website operators who only monitor their Google rankings never notice.
Imagine someone asks ChatGPT “Which provider is the best for [your product]?” and your website is never recommended. Not because your offering is bad, but because the AI has literally never seen your content. The same goes for Perplexity, Claude, and any other AI system that relies on its own crawler.
Research confirms the connection between rendering speed and AI visibility: pages with a First Contentful Paint under 0.4 seconds receive an average of 6.7 citations from ChatGPT, while slower pages receive only 2.1. For a client-side-rendered SPA, the content never appears in the initial HTML response at all; from the AI crawler's perspective, there is no First Contentful Paint.[3]
Generative Engine Optimization: The Scientific Foundation
Academic research has already formalized the importance of AI visibility. Aggarwal et al. introduced the concept of Generative Engine Optimization (GEO) at the prestigious ACM SIGKDD conference in 2024: a framework for improving the visibility of content in the responses of generative AI systems. Their experiments showed that targeted GEO strategies can increase visibility by up to 40%.[4]
But even the best GEO strategies are futile if the AI crawler can't read the content in the first place. An SPA without server-side rendering makes any form of Generative Engine Optimization ineffective, because the basic prerequisite, readable content in the HTML response, is missing.
The Solution: Server-Side Rendering
The good news: the problem is solvable, and you don't have to give up the advantages of modern JavaScript frameworks. The solution is called Server-Side Rendering (SSR).
With server-side rendering, the server generates the HTML content before it is sent to the browser or crawler. The visitor receives a full HTML page with all texts, headings, and metadata; JavaScript in the browser then takes over interactivity, a process called hydration. Human visitors continue to benefit from the fluid SPA experience, while crawlers can immediately read the full content.
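As a minimal sketch, server-side rendering looks roughly like this. It uses TypeScript with Express and React's renderToString; the component, its content, and the bundle name are illustrative assumptions:

```typescript
import express from "express";
import React from "react";
import { renderToString } from "react-dom/server";

// A trivial component standing in for your real application.
function ProductPage() {
  return React.createElement("h1", null, "Acme Widget, from $9.99");
}

const app = express();

app.get("*", (_req, res) => {
  // The server renders the component to real HTML up front...
  const html = renderToString(React.createElement(ProductPage));
  // ...so crawlers and browsers both receive readable content,
  // and the client bundle later hydrates it for interactivity.
  res.send(`<!DOCTYPE html>
<html>
  <body>
    <div id="root">${html}</div>
    <script src="/client.js"></script>
  </body>
</html>`);
});

app.listen(3000);
```

In practice you rarely wire this up by hand; the frameworks below handle routing, data fetching, and hydration for you.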
Modern Frameworks with SSR Support
All major ecosystems now ship mature SSR solutions: Next.js for React, Nuxt for Vue.js, and Angular with its built-in server-side rendering (formerly Angular Universal); SvelteKit offers the same for Svelte. Each implements the hydration model described above, so adopting SSR usually means switching to the framework's standard rendering mode rather than building custom infrastructure.
Pre-Rendering as an Alternative
For teams that can't immediately migrate fully to SSR, pre-rendering offers a simpler alternative: static HTML snapshots of the SPA pages are generated and delivered specifically to crawlers, while human visitors receive the dynamic JavaScript version. Services like Prerender.io automate this process. The downside: snapshots must be updated regularly to reflect current content.
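Conceptually, the delivery logic looks like the following sketch. It uses TypeScript with Express; the bot list and snapshot paths are illustrative assumptions, and in practice services like Prerender.io handle this for you:

```typescript
import express from "express";
import { readFile } from "fs/promises";

const app = express();

// Known non-rendering crawlers (illustrative, not exhaustive).
const BOTS = /GPTBot|ClaudeBot|PerplexityBot|OAI-SearchBot/i;

app.get("*", async (req, res, next) => {
  const userAgent = req.get("User-Agent") ?? "";
  if (!BOTS.test(userAgent)) return next(); // humans get the regular SPA
  try {
    // Crawlers get a pre-rendered static snapshot of the page.
    const file = req.path === "/" ? "/index" : req.path;
    res.send(await readFile(`./snapshots${file}.html`, "utf8"));
  } catch {
    next(); // no snapshot yet: fall back to the SPA shell
  }
});

app.use(express.static("./dist")); // the normal client-side app

app.listen(3000);
```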
A case study shows the effectiveness: after pre-rendering was implemented for one SPA, AI bots accounted for 47.95% of all requests, evidence that AI crawlers pick up newly accessible content quickly.[3]
Practical Recommendations
Whether you choose SSR or pre-rendering, you should implement the following steps:
Disable JavaScript and Check: Open your website in a browser with JavaScript disabled. If product descriptions, prices, FAQ answers, or comparison tables disappear, these pages are invisible to AI crawlers.
Check Source Code: Right-click on a page and select “View Page Source.” If you see your actual content as text, it is server-rendered. If you only see an empty <div> and script tags, you have a problem. (A small script that automates this check follows after this list.)
Verify Schema Markup: Ensure that structured data, canonical URLs, and meta descriptions are present in the raw HTML source, not only after JavaScript execution.
Monitor Server Logs: Check whether GPTBot, ClaudeBot, and OAI-SearchBot receive your pages with status 200 and rich HTML content, or whether they only get empty shells.
Use Ranketic: Our analysis tool automatically detects whether your website is an SPA without SSR and shows you exactly which features indicate this.
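Steps 2 through 4 can be partially automated. The following sketch fetches the raw HTML the way different bots would and checks whether key content is present before any JavaScript runs; the URL, the phrases, and the bot names are placeholders you would replace with your own:

```typescript
// Fetch the raw HTML as different bots and check for key content
// that must be visible before any JavaScript executes.
const URL_TO_CHECK = "https://example.com"; // placeholder: your page
const MUST_CONTAIN = [
  "Acme Widget",         // placeholder: a product name from the page
  "application/ld+json", // structured data present in the raw source
];
const BOT_USER_AGENTS = ["GPTBot", "ClaudeBot", "OAI-SearchBot"];

async function main(): Promise<void> {
  for (const ua of BOT_USER_AGENTS) {
    const res = await fetch(URL_TO_CHECK, { headers: { "User-Agent": ua } });
    const html = await res.text();
    const missing = MUST_CONTAIN.filter((phrase) => !html.includes(phrase));
    console.log(
      `${ua}: status ${res.status}, ` +
        (missing.length
          ? `missing from raw HTML: ${missing.join(", ")}`
          : "all key content present in raw HTML")
    );
  }
}

main();
```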
Conclusion: Act Now
The question is not whether AI-based search will become relevant; it already is. With annual growth of over 300% in AI crawler requests, visibility for AI systems is becoming a crucial competitive factor, and AI responses are increasingly woven into users' daily lives.
If your website is a Single Page Application without server-side rendering, it simply doesn't exist for these systems. Every day without a solution is a day when competitors whose content is readable for AI crawlers get recommended and you do not.
Switching to SSR or pre-rendering is a one-time architectural decision with lasting benefits. The sooner you act, the sooner you gain reach through AI-based search.
References
[1] Google Search Central: “Understand the JavaScript SEO basics”, developers.google.com/search/docs/crawling-indexing/javascript/javascript-seo-basics
[2] Vercel Research: “The rise of the AI crawler”, December 2024. vercel.com/blog/the-rise-of-the-ai-crawler
[3] Passionfruit: “JavaScript Rendering and AI Crawlers: Can LLMs Read Your SPA?”, March 2026. getpassionfruit.com/blog/javascript-rendering-and-ai-crawlers-can-llms-read-your-spa
[4] Aggarwal, P. et al.: “GEO: Generative Engine Optimization”, ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD ’24), August 2024. doi.org/10.1145/3637528.3671900