The evolution of web rendering is really a story of trade-offs.
No new strategy was invented because the old one was “bad” — each emerged because the old one failed under a new constraint: scale, UX, SEO, or performance.
Let’s walk through this the same way we’d reverse-engineer a library — by understanding what problem each approach tried to solve.
What We’re Trying to Solve
At a high level, every rendering strategy is juggling the same goals:
- Fast first paint
- Smooth navigation
- SEO-friendly HTML
- Reasonable server load
- Simple mental model for developers
Different strategies optimize for different parts of this list.
1. The Early Days: Static Files & Server Templates
The earliest web was simple.
A server hosted files like:
/index.html
/about.html
When a browser requested a page:
- The server sent the file
- The browser rendered it
Done.
No JavaScript.
No state.
No interactivity.
Enter Dynamic Data
As soon as we needed things like:
- User profiles
- Product prices
- Authenticated pages
Servers started injecting data into templates.
Think EJS, Handlebars, PHP.
How it worked
- Request hits the server
- Server fetches data
- Data is injected into an HTML template
- A complete HTML page is sent to the browser
This is the original Server-Side Rendering.
The Problem?
Every navigation caused:
- Full page reload
- CSS, JS, images re-downloaded
- Visible flicker
UX suffered badly as apps became more interactive.
2. Client-Side Rendering (CSR)
To fix full-page reloads, we moved rendering to the browser.
This is where Single Page Applications came from.
How CSR Works
- Browser requests a page
- Server responds with a tiny HTML shell
- Browser downloads a large JS bundle
JavaScript then takes over:
- Routing
- Rendering
- Data fetching
Example HTML Shell
<html>
  <body>
    <div id="root"></div>
    <script src="main.js"></script>
  </body>
</html>
The UI doesn’t exist until JavaScript runs.
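A hand-rolled sketch of what a `main.js` bundle does, with no framework (the element ID and state shape are illustrative):

```javascript
// CSR in miniature: the HTML shell ships empty, and this script
// builds the entire UI after it loads.

// Pure "render" step: turn app state into an HTML string.
function renderApp(state) {
  const items = state.todos.map(t => `<li>${t}</li>`).join('');
  return `<h1>${state.title}</h1><ul>${items}</ul>`;
}

// In the browser this runs only after the bundle has been
// downloaded, parsed, and executed -- until then, blank page.
if (typeof document !== 'undefined') {
  const root = document.getElementById('root');
  root.innerHTML = renderApp({ title: 'My Todos', todos: ['Ship CSR demo'] });
}
```

Everything the user sees is produced on their device, which is exactly why first paint waits on the bundle.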
Pros
- Extremely smooth navigation
- No full reloads
- Great for highly interactive apps
Cons
- Slow first paint (blank screen until JS loads)
- Poor SEO (crawlers see an empty div)
- Large JS bundles hurt performance on low-end devices
CSR fixed UX — but broke SEO and initial load.
3. Modern Server-Side Rendering (SSR)
Modern frameworks like Next.js tried to merge the old and the new.
The idea:
Use React, but render it on the server.
How Modern SSR Works
- Request hits the server
- React components are rendered into HTML
- HTML is sent to the browser
- JavaScript “hydrates” it and takes over
So the user sees content before JS finishes loading.
Next.js Example (Forced SSR)
// page.js
export const dynamic = "force-dynamic";

export default function Page() {
  const seconds = new Date().getSeconds();
  return <h1>Current Second: {seconds}</h1>;
}
Rendered fresh on every request.
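Under the hood, SSR boils down to “run the component on the server and serialize the result to HTML”. A framework-free sketch of that idea — the component shape is a deliberately simplified stand-in for React elements, not React’s actual API:

```javascript
// A "component" here is just a function from props to a tree of
// plain objects -- a toy stand-in for React elements.
function Page({ seconds }) {
  return { tag: 'h1', children: [`Current Second: ${seconds}`] };
}

// Serialize the tree to an HTML string, like a tiny renderToString.
function renderToHtml(node) {
  if (typeof node === 'string') return node;
  const inner = (node.children || []).map(renderToHtml).join('');
  return `<${node.tag}>${inner}</${node.tag}>`;
}

// On every request the server re-runs the component with fresh data.
function handleRequest() {
  const seconds = new Date().getSeconds();
  return renderToHtml(Page({ seconds }));
}
```

The browser receives real content immediately; hydration then attaches event handlers to the markup that is already on screen.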
Pros
- Excellent SEO
- Fast first paint
- Works well for dynamic content
Cons
- Server does work on every request
- Higher infra cost
- Slower TTFB under heavy load
SSR fixed SEO — but pushed work back to the server.
4. Static Site Generation (SSG)
Then we asked a simple question:
Why render the same page 1,000 times if it never changes?
SSG renders pages once at build time.
How SSG Works
During build:
- Fetch all required data
- Render pages into static HTML files
At runtime:
- Server just serves files
- No rendering
- No computation
Next.js Example (Static Params)
// [slug]/page.js
export async function generateStaticParams() {
  const posts = await fetch('https://api.example.com/posts')
    .then(res => res.json());

  return posts.map(post => ({
    slug: post.id.toString(),
  }));
}

export default function Post({ params }) {
  return <h1>Post ID: {params.slug}</h1>;
}
Pros
- Blazing fast
- Zero server cost per request
- Perfect for blogs, docs, landing pages
Cons
- Data becomes stale
- Any update requires a rebuild
- Doesn’t scale well for frequently changing content
SSG optimized performance — but sacrificed freshness.
5. Incremental Static Regeneration (ISR)
ISR is the compromise.
It keeps static performance but allows updates.
How ISR Works
- Pages are generated statically
- After a certain time, they expire
- The next request triggers regeneration in the background
Next.js Example (Revalidation)
// page.js
export const revalidate = 60;

export default async function Page() {
  const res = await fetch('https://api.example.com/data');
  const data = await res.json();
  return <div>{/* render data */}</div>;
}
The user never waits.
Old HTML is served while new HTML is generated.
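The serving logic is essentially a stale-while-revalidate cache. A simplified sketch — regeneration happens inline here, standing in for the real background job:

```javascript
// ISR in miniature: always serve whatever HTML is cached; if the
// entry is older than the revalidate window, regenerate it so the
// *next* request sees fresh HTML.
function createIsrCache(renderPage, revalidateMs) {
  let entry = { html: renderPage(), builtAt: Date.now() };

  return function handle(now = Date.now()) {
    const stale = now - entry.builtAt > revalidateMs;
    const html = entry.html; // answer immediately, even if stale
    if (stale) {
      // A real framework does this in the background, off the
      // request path; inline regeneration keeps the sketch simple.
      entry = { html: renderPage(), builtAt: now };
    }
    return html;
  };
}
```

Note the ordering: the first request after the window expires still gets the old page; only subsequent requests see the regenerated one.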
Pros
- Static performance
- Fresh data
- No full rebuilds
Cons
- Slightly more complex mental model
- Not suitable for real-time data
ISR gives you controlled staleness — which is often good enough.
The Mental Model (Everything in One Flow)
You can think of rendering like this:
- CSR → Render after JS loads
- SSR → Render on every request
- SSG → Render once at build time
- ISR → Render at build time + occasionally later
That’s it.
Everything else is an optimization around this axis.
Which One Should You Use?
CSR
- Internal dashboards
- Auth-heavy apps
- SEO doesn’t matter
SSR
- Highly dynamic, SEO-critical pages
- Content must always be fresh
SSG
- Docs, blogs, marketing pages
- Content rarely changes
ISR
- Blogs
- E-commerce product pages
- Content that changes, but not constantly