The traditional server model puts your application in one or two data centers and serves everyone from there. A user in Tokyo hits the same box as a user in London. Distance adds latency: every 1000km of physical separation adds roughly 10ms of round-trip time, and that time is irreducible regardless of how fast your code runs.
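The rule of thumb above is just geometry. As a rough sketch (the 200 km/ms figure for light in fiber and the 11,000 km Tokyo-to-US-east distance are approximations):

```javascript
// Back-of-the-envelope latency from distance alone.
// Light in optical fiber travels roughly 200 km per millisecond
// (about two-thirds of its speed in vacuum), and a round trip
// covers the distance twice.
function roundTripMs(distanceKm) {
  const kmPerMs = 200;
  return (2 * distanceKm) / kmPerMs;
}

console.log(roundTripMs(1000));  // → 10 (ms per 1000 km, matching the rule of thumb)
console.log(roundTripMs(11000)); // → 110 (Tokyo to a US-east data center, physics alone)
```

Real round trips are worse than this floor: routing detours, queuing, and TLS handshakes all add on top.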
Edge computing moves computation to a network of servers distributed globally — hundreds of locations instead of two. When a request arrives, it is handled by the server geographically closest to the user. A Tokyo user might see 5ms to the edge instead of 150ms to a US data center. For anything time-sensitive — authentication, personalization, A/B testing, API responses — this matters.
Three platforms make edge computing accessible to web developers today without infrastructure expertise: Cloudflare Workers, Deno Deploy, and Vercel Edge Functions. Each one is worth understanding.
The Edge Runtime: What’s Different from Node
Edge functions run in a constrained JavaScript environment based on the Web Platform APIs, not Node.js. This means:
- You have: `fetch`, `Request`, `Response`, `Headers`, `URL`, `TextEncoder`, `crypto`, `setTimeout`, Streams, WebSockets.
- You don't have: `fs`, `path`, `process`, native Node modules, or arbitrary npm packages that use them.
- Startup time is near-zero: edge functions use V8 isolates rather than full processes. Cold starts are typically under 1ms instead of the 100–500ms you see with Lambda or Cloud Run.
- Execution time is capped: most platforms limit edge functions to 30–50ms of CPU time per request. They are designed for fast request handling, not long-running compute.
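In practice this constraint is less limiting than it sounds, because the Web Platform APIs cover a lot of ground. A sketch of a handler that transforms a response body on the fly using only Web Streams globals (the upstream response here is hardcoded for illustration; in a real function it would come from `fetch`):

```javascript
// Everything below is Web Platform API — no Node imports required.
// Uppercases a text response as it streams through.
function uppercaseStream() {
  return new TransformStream({
    transform(chunk, controller) {
      controller.enqueue(chunk.toUpperCase());
    },
  });
}

async function handle(request) {
  const upstream = new Response("hello from the edge");
  const body = upstream.body
    .pipeThrough(new TextDecoderStream()) // bytes → strings
    .pipeThrough(uppercaseStream())
    .pipeThrough(new TextEncoderStream()); // strings → bytes
  return new Response(body, { headers: { "Content-Type": "text/plain" } });
}
```

Because the transformation streams, the function never buffers the whole body — useful when the CPU and memory budgets are tight.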
Cloudflare Workers
Workers was the first major edge function platform and remains the most feature-complete. A basic Worker handles a fetch event:
```javascript
export default {
  async fetch(request, env, ctx) {
    const url = new URL(request.url);
    if (url.pathname === "/api/greet") {
      const name = url.searchParams.get("name") || "world";
      return new Response(JSON.stringify({ message: `Hello, ${name}!` }), {
        headers: { "Content-Type": "application/json" }
      });
    }
    return new Response("Not found", { status: 404 });
  }
};
```
Workers runs in 300+ data centers globally. What makes the platform particularly useful for web developers is its storage ecosystem: KV (key-value store replicated globally), Durable Objects (stateful objects with strongly consistent storage, one object per key, anywhere in the world), R2 (S3-compatible object storage with no egress fees), and D1 (SQLite at the edge).
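A KV read from a Worker is a single awaited call on a namespace binding. A minimal sketch, assuming a hypothetical binding named `CONFIG_KV` (bindings are declared in `wrangler.toml`) and an illustrative `feature-flags` key:

```javascript
// Sketch: reading a feature-flag document from Workers KV.
// env.CONFIG_KV is an assumed KV namespace binding.
async function getFlags(env) {
  // { type: "json" } parses the stored value; null means the key is absent
  const flags = await env.CONFIG_KV.get("feature-flags", { type: "json" });
  return flags ?? { newCheckout: false }; // fall back to defaults
}

export default {
  async fetch(request, env) {
    const flags = await getFlags(env);
    return new Response(JSON.stringify(flags), {
      headers: { "Content-Type": "application/json" },
    });
  },
};
```

The fallback matters: KV is eventually consistent and a key can be momentarily absent at an edge location, so defaults keep the function deterministic.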
KV is ideal for caching, feature flags, or configuration that needs to be readable globally with eventual consistency. Durable Objects solve coordination problems: rate limiting, presence, collaborative editing — any case where you need a single authoritative state.
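The rate-limiting case can be sketched as a Durable Object. The class shape — `constructor(state, env)` plus a `fetch` handler backed by `state.storage` — follows the Workers API; the one-minute window and 60-request limit are illustrative assumptions:

```javascript
// Sketch: a fixed-window rate limiter as a Durable Object.
// One object instance exists per key (e.g. per user or IP), so the
// count below is the single authoritative state for that key.
export class RateLimiter {
  constructor(state, env) {
    this.storage = state.storage; // transactional, strongly consistent
  }

  async fetch(request) {
    const window = Math.floor(Date.now() / 60_000); // one-minute buckets
    const key = `count:${window}`;
    const count = (await this.storage.get(key)) ?? 0;
    if (count >= 60) {
      return new Response("Too many requests", { status: 429 });
    }
    await this.storage.put(key, count + 1);
    return new Response("OK", { status: 200 });
  }
}
```

Because all requests for a given key are routed to the same object, there is no race between edge locations incrementing the counter — the coordination problem KV cannot solve.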
Deno Deploy
Deno Deploy runs Deno scripts at the edge with the same isolation model as Workers but a different developer experience. Deno’s native TypeScript support, built-in formatter and linter, and standard library make it attractive for developers who want a more integrated toolchain:
```typescript
import { serve } from "https://deno.land/[email protected]/http/server.ts";

serve(async (req) => {
  const url = new URL(req.url);
  if (url.pathname === "/api/time") {
    return new Response(JSON.stringify({ time: new Date().toISOString() }), {
      headers: { "Content-Type": "application/json" }
    });
  }
  return new Response("Not found", { status: 404 });
}, { port: 8000 });
```
Deno Deploy connects directly to GitHub: push to main, it deploys within seconds. Every pull request gets a preview URL automatically. The developer loop from code change to live URL is among the fastest of any platform.
Storage options are more limited than Workers: Deno Deploy integrates with Deno KV (a global key-value store built on FoundationDB) and external databases via HTTP. For most stateless or lightly-stateful applications this is sufficient.
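Deno KV usage can be sketched as a per-path view counter. `Deno.openKv()` and the atomic `sum` operation are Deno KV primitives; the `["views", pathname]` key layout is an illustrative choice, and the `Deno` global is provided by the runtime on Deno Deploy:

```javascript
// Sketch: a per-path view counter on Deno KV.

// Build the structured KV key for a request path.
export function viewKey(pathname) {
  return ["views", pathname];
}

export async function countView(pathname) {
  const kv = await Deno.openKv();
  // sum() performs an atomic increment of an unsigned 64-bit counter,
  // so concurrent requests at different edge locations cannot lose updates
  await kv.atomic().sum(viewKey(pathname), 1n).commit();
  const entry = await kv.get(viewKey(pathname));
  // the stored value is a Deno.KvU64; .value unwraps the bigint count
  return entry.value ? entry.value.value : 0n;
}
```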
Vercel Edge Functions
Vercel’s edge functions are designed for the Next.js ecosystem but work with any framework via the Edge Runtime. The key use case is middleware: code that runs before every request, inspecting or modifying it before it reaches your pages or API routes:
```typescript
// middleware.ts (Next.js)
import { NextResponse } from "next/server";
import type { NextRequest } from "next/server";

export function middleware(request: NextRequest) {
  const country = request.geo?.country || "US";

  // Serve different content based on geography
  if (country === "DE") {
    return NextResponse.rewrite(new URL("/de" + request.nextUrl.pathname, request.url));
  }

  // Add security headers to every response
  const response = NextResponse.next();
  response.headers.set("X-Frame-Options", "DENY");
  response.headers.set("X-Content-Type-Options", "nosniff");
  return response;
}

export const config = {
  matcher: "/((?!_next/static|favicon.ico).*)"
};
```
Middleware runs before the cache on every request, which makes it the right place for authentication checks, redirects, header injection, and geo-routing. Moving these decisions to the edge means a user in Tokyo who hits a redirect doesn’t wait for the redirect response to travel from a US origin server — it is handled at the nearest edge node in milliseconds.
What to Run at the Edge (and What Not To)
Edge functions are not a replacement for all server-side code. Think of them as the fast lane for a specific set of operations:
- Good at the edge: Authentication checks, redirect logic, A/B testing assignment, geo-based routing, request transformation, rate limiting, security header injection, serving personalized static content from edge KV.
- Not good at the edge: database queries that need low-latency access to a single origin (your DB is still in one region; the edge doesn't help when the bottleneck is the DB round-trip), heavy compute (the CPU cap is strict), and Node.js-dependent code.
The practical pattern for most web apps: run authentication, redirects, and header logic at the edge; run database queries and business logic in a regional Node/Python/Go service; use edge KV to cache the results of expensive queries near users.
Getting Started in Five Minutes
The fastest path to a running edge function:
- Cloudflare Workers: `npm create cloudflare@latest`, pick the "Hello World" Worker, `npm run deploy`. You get a `*.workers.dev` URL immediately. Free tier includes 100,000 requests/day.
- Deno Deploy: connect a GitHub repo at dash.deno.com and point it at a `main.ts` file. Free tier includes 100,000 requests/day.
- Vercel Edge: create a Next.js app (`npx create-next-app`), add a `middleware.ts` file, deploy with `vercel`. Middleware runs at the edge automatically with no extra configuration.
All three platforms have free tiers sufficient for side projects and production traffic at modest scale. The edge is no longer infrastructure you have to negotiate, provision, or maintain. It is a fetch handler you write and push.