How Websites Work

Before optimizing for AI crawlers, it helps to understand how web content travels from your server to a user's browser, and where things can go wrong for bots.

In this guide

  • The request-response cycle
  • Where rendering happens
  • Why this matters for AI crawlers
  • Overview of rendering strategies
8 min read · Prerequisite: Content Accessibility

The Request-Response Cycle

When someone visits your website, a chain of requests and responses occurs. Each step takes time and affects what crawlers see:

👤 User → (Request, ~50ms) → 🌐 Browser (Client) → (HTTP, ~100ms) → 🖥️ Server (your app) → (Query, ~50ms) → 🗄️ Database

Response travels back: Database → Server → Browser → User sees the page.

Total: ~200-500ms (or much more with JavaScript)
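The cycle above can be sketched as a toy latency model. The per-hop numbers are the illustrative figures from the diagram, not real measurements:

```python
# Toy model of the request-response cycle. Each hop is traversed
# twice: once for the request out, once for the response back.

HOPS = [
    ("User -> Browser (request)", 50),
    ("Browser -> Server (HTTP)", 100),
    ("Server -> Database (query)", 50),
]

def round_trip_ms(extra_js_ms: int = 0) -> int:
    """Sum every hop in both directions, plus any time the browser
    spends executing JavaScript before content appears."""
    one_way = sum(ms for _, ms in HOPS)
    return 2 * one_way + extra_js_ms

print(round_trip_ms())                  # plain HTML round trip
print(round_trip_ms(extra_js_ms=2000))  # the same page behind heavy JS
```

The second call shows why "or much more with JavaScript" matters: client-side work is added on top of every network hop.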

Where Does Rendering Happen?

"Rendering" means turning your content into the HTML that users (and crawlers) see. This can happen in different places:

🖥️ On the Server

HTML is built on your server before being sent. Crawlers receive complete content immediately. This is ideal for AI crawlers.

🌐 In the Browser (Client)

Server sends JavaScript code; the browser builds the HTML. Crawlers may see an empty page if they don't execute JS. This is risky for AI visibility.

🏗️ At Build Time

HTML is pre-generated when you deploy. Fast, static files served to everyone. This is excellent for AI crawlers.
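To make "at build time" concrete, here is a minimal static-generation sketch. The posts and template are made up for illustration; the point is that every page exists as finished HTML before any visitor or bot arrives:

```python
# Minimal static-site-generation (SSG) sketch: all HTML is produced
# once, at deploy time, so serving a page is just returning a file.

POSTS = [  # hypothetical content source
    {"slug": "hello", "title": "Hello", "body": "First post."},
    {"slug": "ai-crawlers", "title": "AI Crawlers", "body": "Raw HTML wins."},
]

TEMPLATE = (
    "<html><head><title>{title}</title></head>"
    "<body><h1>{title}</h1><p>{body}</p></body></html>"
)

def build_site(posts):
    """Return {path: full HTML} — what you'd write to disk at build time."""
    return {f"/{p['slug']}.html": TEMPLATE.format(**p) for p in posts}

site = build_site(POSTS)
print(sorted(site))  # every route pre-rendered before the first request
```

A crawler hitting any of these paths gets complete content with zero server-side work per request.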

Why This Matters for AI Crawlers

AI crawlers like GPTBot and ClaudeBot behave differently from human users:

Human Users

  • Wait for JavaScript to load
  • Click buttons and scroll
  • Wait seconds for content
  • Have modern browsers

AI Crawlers

  • Often skip JavaScript entirely
  • Don't click or interact
  • Have strict time budgets
  • Just fetch raw HTML
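One practical consequence: you can spot these bots in your access logs by User-Agent substring. GPTBot and ClaudeBot are the two named above; treat the marker list as a starting point you would extend yourself:

```python
# Sketch: classify a request as an AI crawler by User-Agent substring.
# The marker list is illustrative, not exhaustive.

AI_CRAWLER_MARKERS = ["GPTBot", "ClaudeBot"]

def is_ai_crawler(user_agent: str) -> bool:
    """Case-insensitive substring match against known bot names."""
    ua = user_agent.lower()
    return any(marker.lower() in ua for marker in AI_CRAWLER_MARKERS)

print(is_ai_crawler("Mozilla/5.0 AppleWebKit/537.36 (compatible; GPTBot/1.0)"))
print(is_ai_crawler("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"))
```

Segmenting your logs this way tells you how much of your traffic gets the no-JS, raw-HTML view of your site.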

The Problem

If your content only appears after JavaScript runs, many AI crawlers won't see it at all. Your pages might look complete to users but appear empty to training bots.
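You can approximate a crawler's-eye view of a page by extracting only the text present in the raw HTML, with no JavaScript execution. The two sample pages below are fabricated to show the contrast between a server-rendered page and a client-rendered shell:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text the way a no-JS crawler would see it."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False

    def handle_data(self, data):
        if not self.in_script and data.strip():
            self.chunks.append(data.strip())

def visible_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

SSR_PAGE = "<html><body><h1>Pricing</h1><p>Plans start at $10.</p></body></html>"
CSR_PAGE = '<html><body><div id="root"></div><script src="/app.js"></script></body></html>'

print(visible_text(SSR_PAGE))  # the crawler sees the content
print(visible_text(CSR_PAGE))  # empty: content only exists after JS runs
```

Running a check like this against your own pages is a quick way to find content that exists for users but not for bots.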

Rendering Strategies Overview

There are several approaches to rendering, each with trade-offs for AI visibility:

| Strategy | Where | AI Visibility | Trade-off |
| --- | --- | --- | --- |
| Static (SSG) | Build time | Excellent | Content not real-time |
| Server-Side (SSR) | Server | Excellent | Server load per request |
| Client-Side (CSR) | Browser | Poor | Invisible to most AI bots |
| Hybrid | Both | Depends | Complexity |

These choices also affect your crawl budget, which determines how much of your site AI crawlers can process efficiently.

What Should You Do?

Recommendation

Move toward static generation (SSG) or server-side rendering (SSR) for content you want AI to find. The best setup depends on your organization's constraints:

  • Blogs/marketing sites: Static generation is ideal
  • E-commerce: SSR for product pages, static for categories
  • Web apps: At minimum, ensure landing/SEO pages are server-rendered
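The bullets above amount to a simple decision rule. As a sketch (the site-type labels are just the categories from this list, not an established taxonomy):

```python
# Illustrative mapping from site type to a sensible default rendering
# strategy, following the recommendations above.

DEFAULT_STRATEGY = {
    "blog": "SSG",
    "marketing": "SSG",
    "ecommerce_product": "SSR",
    "ecommerce_category": "SSG",
    "web_app_landing": "SSR",
    "web_app_internal": "CSR",  # fine where AI visibility doesn't matter
}

def recommend(site_type: str) -> str:
    """Fall back to SSR: the safe default for AI visibility."""
    return DEFAULT_STRATEGY.get(site_type, "SSR")

print(recommend("blog"))
print(recommend("something_new"))
```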

Key Takeaway

Server-rendered HTML is what AI crawlers see.

The further you move rendering toward the server (or build time), the more visible your content becomes to AI systems. JavaScript-heavy sites risk being invisible to AI training.
