Building Lightning-Fast Serverless APIs with Cloudflare Workers
Anubhav Gain
2025-08-10
Cloudflare Workers change the economics of serverless computing by running your code at the edge in 330+ data centers worldwide. Unlike traditional serverless platforms that suffer from cold starts, Workers use V8 Isolates to achieve sub-millisecond startup times, and well-optimized, cache-friendly APIs can respond in under 10ms globally.
Table of Contents
- Why Cloudflare Workers?
- Project Setup
- Building a Production API
- Deployment Configuration
- Performance Optimization
- Monitoring and Analytics
- Security Best Practices
- Testing
- Cost Optimization
- Conclusion
Why Cloudflare Workers?
Performance Advantages
- Zero Cold Starts: V8 Isolates eliminate cold start latency completely
- Global Distribution: Code runs in 330+ locations automatically
- 50ms CPU Time: Sufficient for most API operations
- Sub-10ms Response Times: Achievable for cached and optimized workloads
- Automatic Scaling: Handle millions of requests without configuration
Cost Efficiency
// Cost comparison for 10M requests/month
// AWS Lambda: ~$20 + data transfer costs
// Cloudflare Workers: $5 (paid plan) or FREE (100K requests/day)
Project Setup
1. Initialize Worker Project
# Install Wrangler CLI
npm install -g wrangler

# Create new project
npm create cloudflare@latest my-api
# Select "Hello World" Worker
# Choose TypeScript for type safety

# Navigate to project
cd my-api
2. Project Structure
export interface Env {
  // KV Namespaces
  CACHE: KVNamespace;
  RATE_LIMIT: KVNamespace;

  // D1 Database
  DB: D1Database;

  // Secrets
  API_KEY: string;
  JWT_SECRET: string;

  // Optional: external error tracking (used in the monitoring section)
  SENTRY_DSN?: string;

  // Durable Objects (bound as RATE_LIMITER in wrangler.toml)
  RATE_LIMITER: DurableObjectNamespace;
}

export default {
  async fetch(
    request: Request,
    env: Env,
    ctx: ExecutionContext
  ): Promise<Response> {
    // handleRequest is a placeholder; the full router-based
    // implementation is shown later in this post
    return handleRequest(request, env, ctx);
  },
};
Building a Production API
1. Advanced Router Implementation
type Handler = (
  request: Request,
  env: Env,
  ctx: ExecutionContext,
  params: Record<string, string>
) => Promise<Response>;

class Router {
  private routes: Map<string, Map<string, Handler>> = new Map();
  private middlewares: Array<Handler> = [];

  constructor() {
    ['GET', 'POST', 'PUT', 'DELETE', 'PATCH'].forEach(method => {
      this.routes.set(method, new Map());
    });
  }

  use(middleware: Handler) {
    this.middlewares.push(middleware);
    return this;
  }

  get(path: string, handler: Handler) {
    this.routes.get('GET')?.set(path, handler);
    return this;
  }

  post(path: string, handler: Handler) {
    this.routes.get('POST')?.set(path, handler);
    return this;
  }

  put(path: string, handler: Handler) {
    this.routes.get('PUT')?.set(path, handler);
    return this;
  }

  delete(path: string, handler: Handler) {
    this.routes.get('DELETE')?.set(path, handler);
    return this;
  }

  async handle(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const url = new URL(request.url);
    const method = request.method;
    const routes = this.routes.get(method);

    if (!routes) {
      return new Response('Method Not Allowed', { status: 405 });
    }

    // Apply middlewares
    for (const middleware of this.middlewares) {
      const response = await middleware(request, env, ctx, {});
      if (response.status !== 200) return response;
    }

    // Match routes with parameters
    for (const [pattern, handler] of routes) {
      const params = this.matchRoute(pattern, url.pathname);
      if (params) {
        return handler(request, env, ctx, params);
      }
    }

    return new Response('Not Found', { status: 404 });
  }

  private matchRoute(pattern: string, path: string): Record<string, string> | null {
    const patternParts = pattern.split('/');
    const pathParts = path.split('/');

    if (patternParts.length !== pathParts.length) return null;

    const params: Record<string, string> = {};

    for (let i = 0; i < patternParts.length; i++) {
      if (patternParts[i].startsWith(':')) {
        params[patternParts[i].slice(1)] = pathParts[i];
      } else if (patternParts[i] !== pathParts[i]) {
        return null;
      }
    }

    return params;
  }
}

export default Router;
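To make the parameter matching concrete, here is a small illustrative route (the /api/users/:id endpoint is hypothetical and not part of the API built below):

const router = new Router();

// GET /api/users/42 matches the pattern and yields params = { id: '42' }
router.get('/api/users/:id', async (request, env, ctx, params) => {
  return new Response(JSON.stringify({ userId: params.id }), {
    headers: { 'Content-Type': 'application/json' },
  });
});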
2. Authentication Middleware
import jwt from '@tsndr/cloudflare-worker-jwt';

export async function authenticate(
  request: Request,
  env: Env,
  ctx: ExecutionContext,
  params: Record<string, string>
): Promise<Response> {
  const authorization = request.headers.get('Authorization');

  if (!authorization || !authorization.startsWith('Bearer ')) {
    return new Response('Unauthorized', { status: 401 });
  }

  const token = authorization.slice(7);

  try {
    const isValid = await jwt.verify(token, env.JWT_SECRET);

    if (!isValid) {
      return new Response('Invalid token', { status: 401 });
    }

    const decoded = jwt.decode(token);
    // Attach user info to request for use in handlers
    (request as any).user = decoded.payload;

    return new Response('OK', { status: 200 });
  } catch (error) {
    return new Response('Invalid token', { status: 401 });
  }
}

export async function generateToken(userId: string, env: Env): Promise<string> {
  const token = await jwt.sign({
    userId,
    exp: Math.floor(Date.now() / 1000) + 60 * 60 * 24, // 24 hours
  }, env.JWT_SECRET);

  return token;
}
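generateToken is what a login route would call after verifying credentials. A minimal sketch, registered on the router from the previous section; verifyCredentials is a hypothetical helper that checks the supplied password against users.password_hash:

// Sketch of a login route that issues a JWT (verifyCredentials is hypothetical)
router.post('/api/login', async (request, env, ctx) => {
  const { email, password } = await request.json() as { email: string; password: string };

  const user = await verifyCredentials(email, password, env.DB);
  if (!user) {
    return new Response('Invalid credentials', { status: 401 });
  }

  const token = await generateToken(String(user.id), env);
  return new Response(JSON.stringify({ token }), {
    headers: { 'Content-Type': 'application/json' },
  });
});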
3. Rate Limiting with Durable Objects
export class RateLimiter {
  private state: DurableObjectState;
  private requests: Map<string, number[]> = new Map();

  constructor(state: DurableObjectState) {
    this.state = state;
  }

  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);
    const key = url.searchParams.get('key') || 'default';
    const limit = parseInt(url.searchParams.get('limit') || '100');
    const window = parseInt(url.searchParams.get('window') || '60');

    const now = Date.now();
    const windowStart = now - window * 1000;

    // Get existing requests for this key
    let timestamps = this.requests.get(key) || [];

    // Filter out requests older than the window
    timestamps = timestamps.filter(ts => ts > windowStart);

    // Check if the limit is exceeded
    if (timestamps.length >= limit) {
      return new Response(JSON.stringify({
        allowed: false,
        remaining: 0,
        resetAt: Math.min(...timestamps) + window * 1000,
      }), {
        headers: { 'Content-Type': 'application/json' },
      });
    }

    // Record the current request
    timestamps.push(now);
    this.requests.set(key, timestamps);

    // Persist so counts survive Durable Object restarts
    await this.state.storage.put(key, timestamps);

    return new Response(JSON.stringify({
      allowed: true,
      remaining: limit - timestamps.length,
      resetAt: now + window * 1000,
    }), {
      headers: { 'Content-Type': 'application/json' },
    });
  }
}

// Middleware to use the rate limiter
export async function rateLimit(
  request: Request,
  env: Env,
  ctx: ExecutionContext
): Promise<Response> {
  const clientIp = request.headers.get('CF-Connecting-IP') || 'unknown';
  const id = env.RATE_LIMITER.idFromName(clientIp);
  const limiter = env.RATE_LIMITER.get(id);

  const rateLimitUrl = `https://rate-limiter.internal/?key=${clientIp}&limit=100&window=60`;

  const response = await limiter.fetch(rateLimitUrl);
  const result = await response.json() as {
    allowed: boolean;
    remaining: number;
    resetAt: number;
  };

  if (!result.allowed) {
    return new Response('Rate limit exceeded', {
      status: 429,
      headers: {
        'X-RateLimit-Remaining': '0',
        'X-RateLimit-Reset': new Date(result.resetAt).toISOString(),
      },
    });
  }

  return new Response('OK', { status: 200 });
}
4. Database Integration with D1
CREATE TABLE IF NOT EXISTS users (
  id INTEGER PRIMARY KEY AUTOINCREMENT,
  email TEXT UNIQUE NOT NULL,
  username TEXT UNIQUE NOT NULL,
  password_hash TEXT NOT NULL,
  created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
  updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE IF NOT EXISTS posts (
  id INTEGER PRIMARY KEY AUTOINCREMENT,
  user_id INTEGER NOT NULL,
  title TEXT NOT NULL,
  content TEXT NOT NULL,
  slug TEXT UNIQUE NOT NULL,
  published BOOLEAN DEFAULT false,
  created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
  updated_at DATETIME DEFAULT CURRENT_TIMESTAMP,
  FOREIGN KEY (user_id) REFERENCES users(id)
);

CREATE INDEX idx_posts_slug ON posts(slug);
CREATE INDEX idx_posts_user_id ON posts(user_id);
export interface Post {
  id: number;
  user_id: number;
  title: string;
  content: string;
  slug: string;
  published: number; // SQLite has no boolean type; stored as 0/1
  created_at: string;
  updated_at: string;
}

// Columns clients may update; keeps user input out of the SQL string itself
const UPDATABLE_COLUMNS = ['title', 'content', 'slug', 'published'];

export class PostModel {
  constructor(private db: D1Database) {}

  async findAll(limit = 10, offset = 0): Promise<Post[]> {
    const { results } = await this.db
      .prepare('SELECT * FROM posts WHERE published = ? ORDER BY created_at DESC LIMIT ? OFFSET ?')
      .bind(1, limit, offset) // D1 bindings take numbers, not booleans
      .all();

    return results as Post[];
  }

  async findBySlug(slug: string): Promise<Post | null> {
    const result = await this.db
      .prepare('SELECT * FROM posts WHERE slug = ? AND published = ?')
      .bind(slug, 1)
      .first();

    return result as Post | null;
  }

  async create(post: Omit<Post, 'id' | 'created_at' | 'updated_at'>): Promise<Post> {
    const { meta } = await this.db
      .prepare(
        'INSERT INTO posts (user_id, title, content, slug, published) VALUES (?, ?, ?, ?, ?)'
      )
      .bind(post.user_id, post.title, post.content, post.slug, post.published)
      .run();

    return this.findById(meta.last_row_id as number);
  }

  async update(id: number, updates: Partial<Post>): Promise<Post> {
    // Only whitelisted column names ever reach the SQL string
    const keys = Object.keys(updates).filter(key => UPDATABLE_COLUMNS.includes(key));
    if (keys.length === 0) return this.findById(id);

    const setClause = keys.map(key => `${key} = ?`).join(', ');

    await this.db
      .prepare(`UPDATE posts SET ${setClause}, updated_at = CURRENT_TIMESTAMP WHERE id = ?`)
      .bind(...keys.map(key => (updates as Record<string, unknown>)[key]), id)
      .run();

    return this.findById(id);
  }

  private async findById(id: number): Promise<Post> {
    const result = await this.db
      .prepare('SELECT * FROM posts WHERE id = ?')
      .bind(id)
      .first();

    if (!result) throw new Error('Post not found');
    return result as Post;
  }
}
5. Advanced Caching Strategy
export class CacheStrategy {
  private kv: KVNamespace;
  private cache: Cache;

  constructor(kv: KVNamespace) {
    this.kv = kv;
    this.cache = caches.default; // Cloudflare's per-colo edge cache (Cache API)
  }

  async get(key: string, request?: Request): Promise<any> {
    // Try the edge cache first
    if (request) {
      const cached = await this.cache.match(request);
      if (cached) return cached;
    }

    // Fall back to the KV store
    const kvData = await this.kv.get(key, 'json');
    if (kvData) return kvData;

    return null;
  }

  async set(
    key: string,
    value: any,
    ttl: number = 3600,
    request?: Request,
    response?: Response
  ): Promise<void> {
    // Store in KV
    await this.kv.put(key, JSON.stringify(value), {
      expirationTtl: ttl,
      metadata: {
        cachedAt: Date.now(),
        ttl,
      },
    });

    // Store in the edge cache if a request/response pair is provided
    if (request && response) {
      const cacheResponse = new Response(response.body, response);
      cacheResponse.headers.set('Cache-Control', `public, max-age=${ttl}`);
      await this.cache.put(request, cacheResponse);
    }
  }

  async invalidate(pattern: string): Promise<void> {
    // List all keys matching the prefix
    const list = await this.kv.list({ prefix: pattern });

    // Delete matching keys
    await Promise.all(
      list.keys.map(key => this.kv.delete(key.name))
    );
  }

  // Stale-while-revalidate pattern
  async swr<T>(
    key: string,
    fetcher: () => Promise<T>,
    ttl: number = 3600,
    staleTime: number = 86400
  ): Promise<T> {
    const cached = await this.kv.getWithMetadata(key, 'json');
    const meta = cached.metadata as { cachedAt?: number } | null;

    if (cached.value) {
      const age = Date.now() - (meta?.cachedAt || 0);

      // Return fresh cache
      if (age < ttl * 1000) {
        return cached.value as T;
      }

      // Return stale cache and revalidate in the background
      if (age < staleTime * 1000) {
        // Not awaited; in production, pass this promise to ctx.waitUntil()
        // so the runtime doesn't cancel it after the response is sent
        fetcher().then(fresh => this.set(key, fresh, ttl));
        return cached.value as T;
      }
    }

    // Fetch fresh data
    const fresh = await fetcher();
    await this.set(key, fresh, ttl);
    return fresh;
  }
}
Complete API Implementation
import Router from './router';
import { authenticate } from './middleware/auth';
import { rateLimit } from './rateLimiter';
import { PostModel, type Post } from './models/post';
import { CacheStrategy } from './cache/strategy';

const router = new Router();

// Apply global middlewares
router.use(rateLimit);

// Public routes
router.get('/api/posts', async (request, env, ctx) => {
  const url = new URL(request.url);
  const page = parseInt(url.searchParams.get('page') || '1');
  const limit = Math.min(parseInt(url.searchParams.get('limit') || '10'), 100);
  const offset = (page - 1) * limit;

  const cache = new CacheStrategy(env.CACHE);
  const cacheKey = `posts:${page}:${limit}`;

  // Use stale-while-revalidate
  const posts = await cache.swr(
    cacheKey,
    async () => {
      const model = new PostModel(env.DB);
      return model.findAll(limit, offset);
    },
    300,  // 5 minutes fresh
    3600  // 1 hour stale
  );

  return new Response(JSON.stringify({
    data: posts,
    page,
    limit,
  }), {
    headers: {
      'Content-Type': 'application/json',
      'Cache-Control': 'public, max-age=60',
    },
  });
});

router.get('/api/posts/:slug', async (request, env, ctx, params) => {
  const { slug } = params;

  const cache = new CacheStrategy(env.CACHE);
  const cacheKey = `post:${slug}`;

  const post = await cache.swr(
    cacheKey,
    async () => {
      const model = new PostModel(env.DB);
      return model.findBySlug(slug);
    },
    600,  // 10 minutes fresh
    7200  // 2 hours stale
  );

  if (!post) {
    return new Response('Not Found', { status: 404 });
  }

  return new Response(JSON.stringify(post), {
    headers: {
      'Content-Type': 'application/json',
      'Cache-Control': 'public, max-age=300',
    },
  });
});
// Protected routes
// Note: router.use() applies to every request (including the public GETs above),
// so authentication is checked inside each protected handler instead.
router.post('/api/posts', async (request, env, ctx, params) => {
  const auth = await authenticate(request, env, ctx, params);
  if (auth.status !== 200) return auth;

  const body = await request.json() as {
    title: string;
    content: string;
    slug: string;
    published?: boolean;
  };
  const user = (request as any).user;

  const model = new PostModel(env.DB);
  const post = await model.create({
    title: body.title,
    content: body.content,
    slug: body.slug,
    published: body.published ? 1 : 0,
    user_id: user.userId,
  });

  // Invalidate cache
  const cache = new CacheStrategy(env.CACHE);
  await cache.invalidate('posts:');

  return new Response(JSON.stringify(post), {
    status: 201,
    headers: { 'Content-Type': 'application/json' },
  });
});

router.put('/api/posts/:id', async (request, env, ctx, params) => {
  const auth = await authenticate(request, env, ctx, params);
  if (auth.status !== 200) return auth;

  const { id } = params;
  const body = (await request.json()) as Partial<Post>;

  const model = new PostModel(env.DB);
  const post = await model.update(parseInt(id), body);

  // Invalidate related caches
  const cache = new CacheStrategy(env.CACHE);
  await Promise.all([
    cache.invalidate('posts:'),
    cache.invalidate(`post:${post.slug}`),
  ]);

  return new Response(JSON.stringify(post), {
    headers: { 'Content-Type': 'application/json' },
  });
});
// Export handler
export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    try {
      // Handle CORS preflight requests
      if (request.method === 'OPTIONS') {
        return new Response(null, {
          headers: {
            'Access-Control-Allow-Origin': '*',
            'Access-Control-Allow-Methods': 'GET, POST, PUT, DELETE, OPTIONS',
            'Access-Control-Allow-Headers': 'Content-Type, Authorization',
          },
        });
      }

      const response = await router.handle(request, env, ctx);

      // Re-wrap the response so headers are always mutable, then add CORS
      const corsResponse = new Response(response.body, response);
      corsResponse.headers.set('Access-Control-Allow-Origin', '*');

      return corsResponse;
    } catch (error) {
      console.error('Error:', error);
      return new Response('Internal Server Error', { status: 500 });
    }
  },
};

// Export the Durable Object class so the runtime can bind it
export { RateLimiter } from './rateLimiter';
Deployment Configuration
1. Wrangler Configuration
name = "my-api"
main = "src/index.ts"
compatibility_date = "2025-01-01"

# Durable Objects require a migration on first deploy
[[migrations]]
tag = "v1"
new_classes = ["RateLimiter"]

[env.production]
kv_namespaces = [
  { binding = "CACHE", id = "your-cache-kv-id" },
  { binding = "RATE_LIMIT", id = "your-rate-limit-kv-id" }
]

[[env.production.d1_databases]]
binding = "DB"
database_name = "my-api-db"
database_id = "your-d1-database-id"

[[env.production.durable_objects.bindings]]
name = "RATE_LIMITER"
class_name = "RateLimiter"

[env.production.vars]
API_KEY = "your-api-key"

# Secrets (set via wrangler secret)
# JWT_SECRET = "set-via-wrangler-secret"
2. Database Setup
# Create D1 database
wrangler d1 create my-api-db

# Apply migrations
wrangler d1 execute my-api-db --file=./src/database/schema.sql

# Create KV namespaces
wrangler kv:namespace create CACHE
wrangler kv:namespace create RATE_LIMIT

# Set secrets
wrangler secret put JWT_SECRET
3. Deploy to Production
# Deploy to Cloudflare
wrangler deploy --env production

# View logs
wrangler tail --env production

# Test the API
curl https://my-api.username.workers.dev/api/posts
Performance Optimization
1. Smart Caching Headers
function setCacheHeaders(response: Response, options: {
  public?: boolean;
  maxAge?: number;
  sMaxAge?: number;
  staleWhileRevalidate?: number;
}): Response {
  const parts: string[] = [];

  if (options.public) parts.push('public');
  if (options.maxAge) parts.push(`max-age=${options.maxAge}`);
  if (options.sMaxAge) parts.push(`s-maxage=${options.sMaxAge}`);
  if (options.staleWhileRevalidate) {
    parts.push(`stale-while-revalidate=${options.staleWhileRevalidate}`);
  }

  response.headers.set('Cache-Control', parts.join(', '));
  return response;
}
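For example, a list endpoint might combine a short browser TTL with a longer shared-cache TTL (the posts variable here is illustrative):

// Example: short browser TTL, longer CDN TTL, plus stale-while-revalidate
const response = new Response(JSON.stringify({ data: posts }), {
  headers: { 'Content-Type': 'application/json' },
});

return setCacheHeaders(response, {
  public: true,
  maxAge: 60,                 // browsers: 1 minute
  sMaxAge: 300,               // shared caches: 5 minutes
  staleWhileRevalidate: 3600, // serve stale up to 1 hour while refreshing
});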
2. Request Coalescing
class RequestCoalescer {
  private pending = new Map<string, Promise<Response>>();

  async fetch(
    key: string,
    fetcher: () => Promise<Response>
  ): Promise<Response> {
    // Return the in-flight promise if an identical request is already running
    const existing = this.pending.get(key);
    if (existing) return existing.then(r => r.clone());

    // Create a new promise and share it
    const promise = fetcher();
    this.pending.set(key, promise);

    try {
      const response = await promise;
      // Return a clone so the shared response body is never consumed directly
      return response.clone();
    } finally {
      // Clean up shortly after the request completes
      setTimeout(() => this.pending.delete(key), 100);
    }
  }
}
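A usage sketch, assuming a hypothetical upstream API at api.example.com:

// Illustrative: identical concurrent fetches for the same page share one upstream request
const coalescer = new RequestCoalescer();

async function getUpstreamPosts(page: number): Promise<Response> {
  return coalescer.fetch(`posts:${page}`, () =>
    fetch(`https://api.example.com/posts?page=${page}`)
  );
}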
3. Response Compression
async function compressResponse(request: Request, response: Response): Promise<Response> {
  // Accept-Encoding is a request header, so check the incoming request
  const acceptEncoding = request.headers.get('Accept-Encoding') || '';

  if (!acceptEncoding.includes('gzip')) return response;

  const compressed = new Response(
    response.body?.pipeThrough(new CompressionStream('gzip')),
    response
  );

  compressed.headers.set('Content-Encoding', 'gzip');
  compressed.headers.delete('Content-Length');

  return compressed;
}
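Wired into the fetch handler, this would wrap the routed response. Cloudflare usually applies its own compression between the edge and the client, so this mainly matters for content types it leaves untouched; a sketch:

// Sketch: compress the routed response before returning it
const routed = await router.handle(request, env, ctx);
return compressResponse(request, routed);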
Monitoring and Analytics
1. Custom Analytics
interface Analytics {
  requestCount: number;
  errorCount: number;
  avgResponseTime: number;
  endpoints: Record<string, number>; // plain object so it survives JSON serialization
}

class AnalyticsCollector {
  async track(
    request: Request,
    response: Response,
    duration: number,
    env: Env
  ): Promise<void> {
    const url = new URL(request.url);
    const key = `analytics:${new Date().toISOString().slice(0, 10)}`;

    // Note: KV is eventually consistent, so concurrent writes can overwrite
    // each other; this is acceptable for rough, per-day analytics
    const existing = await env.CACHE.get<Analytics>(key, 'json') || {
      requestCount: 0,
      errorCount: 0,
      avgResponseTime: 0,
      endpoints: {},
    };

    existing.requestCount++;
    if (response.status >= 400) existing.errorCount++;
    existing.avgResponseTime =
      (existing.avgResponseTime * (existing.requestCount - 1) + duration) / existing.requestCount;

    const endpoint = `${request.method} ${url.pathname}`;
    existing.endpoints[endpoint] = (existing.endpoints[endpoint] || 0) + 1;

    await env.CACHE.put(key, JSON.stringify(existing), {
      expirationTtl: 86400 * 7, // Keep for 7 days
    });
  }
}
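Tracking should not add latency to the response, so a natural place to call track() is the main fetch handler, handing the promise to ctx.waitUntil(); a minimal sketch:

// Sketch: inside the main fetch handler
const start = Date.now();
const response = await router.handle(request, env, ctx);

// waitUntil keeps the Worker alive until tracking completes,
// without delaying the client response
const collector = new AnalyticsCollector();
ctx.waitUntil(collector.track(request, response, Date.now() - start, env));

return response;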
2. Error Tracking
class ErrorTracker {
  async log(error: Error, request: Request, env: Env): Promise<void> {
    const errorData = {
      message: error.message,
      stack: error.stack,
      url: request.url,
      method: request.method,
      headers: Object.fromEntries(request.headers),
      timestamp: new Date().toISOString(),
      cf: request.cf,
    };

    // Store in KV for analysis
    const key = `error:${Date.now()}:${Math.random().toString(36).substr(2, 9)}`;
    await env.CACHE.put(key, JSON.stringify(errorData), {
      expirationTtl: 86400 * 30, // 30 days
    });

    // Send to external service if configured
    if (env.SENTRY_DSN) {
      await this.sendToSentry(errorData, env.SENTRY_DSN);
    }
  }

  private async sendToSentry(data: any, dsn: string): Promise<void> {
    // Implement Sentry integration
  }
}
Security Best Practices
1. Input Validation
import { z } from 'zod';

const PostSchema = z.object({
  title: z.string().min(1).max(200),
  content: z.string().min(1).max(10000),
  slug: z.string().regex(/^[a-z0-9-]+$/),
  published: z.boolean().optional(),
});

// Simple error type carrying the zod issues for the API error response
class ValidationError extends Error {
  constructor(public issues: z.ZodIssue[]) {
    super('Validation failed');
  }
}

function validateInput<T>(schema: z.ZodSchema<T>, data: unknown): T {
  const result = schema.safeParse(data);

  if (!result.success) {
    throw new ValidationError(result.error.issues);
  }

  return result.data;
}
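Wired into the POST handler from earlier, validation runs before anything reaches the database; a sketch:

// Sketch: validate the body before it reaches the database
router.post('/api/posts', async (request, env, ctx, params) => {
  try {
    const input = validateInput(PostSchema, await request.json());
    // ...proceed to create the post with the now-typed `input`
    return new Response(JSON.stringify(input), { status: 201 });
  } catch (err) {
    if (err instanceof ValidationError) {
      return new Response(JSON.stringify({ errors: err.issues }), {
        status: 400,
        headers: { 'Content-Type': 'application/json' },
      });
    }
    throw err;
  }
});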
2. SQL Injection Prevention
// Always use parameterized queries
// NEVER concatenate user input into SQL strings

// BAD - Vulnerable to SQL injection
const query = `SELECT * FROM posts WHERE slug = '${userInput}'`;

// GOOD - Safe parameterized statement
const stmt = env.DB
  .prepare('SELECT * FROM posts WHERE slug = ?')
  .bind(userInput);
3. CORS Configuration
function configureCORS(request: Request, response: Response): Response {
  const origin = request.headers.get('Origin');
  const allowedOrigins = [
    'https://example.com',
    'https://app.example.com',
  ];

  if (origin && allowedOrigins.includes(origin)) {
    response.headers.set('Access-Control-Allow-Origin', origin);
    response.headers.set('Access-Control-Allow-Credentials', 'true');
  }

  response.headers.set(
    'Access-Control-Allow-Methods',
    'GET, POST, PUT, DELETE, OPTIONS'
  );
  response.headers.set(
    'Access-Control-Allow-Headers',
    'Content-Type, Authorization'
  );
  response.headers.set('Access-Control-Max-Age', '86400');

  return response;
}
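In the main fetch handler, this origin check would replace the blanket Access-Control-Allow-Origin: * used earlier; a minimal sketch:

// Sketch: apply origin-restricted CORS to the routed response
const routed = await router.handle(request, env, ctx);
return configureCORS(request, new Response(routed.body, routed));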
Testing
1. Unit Tests
// Using Vitest as the test runner (Jest-style globals would also work)
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import { unstable_dev } from 'wrangler';
import type { UnstableDevWorker } from 'wrangler';

describe('API Tests', () => {
  let worker: UnstableDevWorker;

  beforeAll(async () => {
    worker = await unstable_dev('src/index.ts', {
      experimental: { disableExperimentalWarning: true },
    });
  });

  afterAll(async () => {
    await worker.stop();
  });

  it('should return posts', async () => {
    const response = await worker.fetch('/api/posts');
    const data = await response.json() as { data: unknown[] };

    expect(response.status).toBe(200);
    expect(data).toHaveProperty('data');
    expect(Array.isArray(data.data)).toBe(true);
  });

  it('should require authentication for POST', async () => {
    const response = await worker.fetch('/api/posts', {
      method: 'POST',
      body: JSON.stringify({ title: 'Test' }),
    });

    expect(response.status).toBe(401);
  });
});
2. Load Testing
# Install autocannon
npm install -g autocannon

# Run load test
autocannon \
  -c 100 \
  -d 30 \
  -p 10 \
  https://my-api.username.workers.dev/api/posts

# Expected results:
# - 50,000+ requests/second
# - p99 latency < 50ms
# - Zero errors under normal load
Cost Optimization
1. Request Minimization
// Batch multiple operations into a single request
router.post('/api/batch', async (request, env) => {
  const { operations } = await request.json() as { operations: any[] };

  // executeOperation dispatches each sub-operation (sketched below)
  const results = await Promise.all(
    operations.map(op => executeOperation(op, env))
  );

  return new Response(JSON.stringify(results), {
    headers: { 'Content-Type': 'application/json' },
  });
});
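executeOperation is left to the application; one possible shape, assuming batch operations map onto the PostModel from earlier (the BatchOperation type is hypothetical):

// Hypothetical dispatcher for batched operations
interface BatchOperation {
  type: 'getPost' | 'listPosts';
  slug?: string;
  page?: number;
}

async function executeOperation(op: BatchOperation, env: Env) {
  const model = new PostModel(env.DB);

  switch (op.type) {
    case 'getPost':
      return model.findBySlug(op.slug || '');
    case 'listPosts':
      return model.findAll(10, ((op.page || 1) - 1) * 10);
    default:
      return { error: 'Unknown operation' };
  }
}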
2. KV Storage Optimization
// Use KV list operations efficiently
async function bulkDelete(prefix: string, env: Env): Promise<void> {
  let cursor: string | undefined;

  do {
    const list = await env.CACHE.list({
      prefix,
      limit: 1000,
      cursor,
    });

    await Promise.all(
      list.keys.map(key => env.CACHE.delete(key.name))
    );

    // A cursor is only returned while there are more keys to page through
    cursor = list.list_complete ? undefined : list.cursor;
  } while (cursor);
}
Conclusion
Cloudflare Workers provide a powerful platform for building globally distributed, lightning-fast APIs with minimal operational overhead. By leveraging V8 Isolates, edge computing, and Cloudflare’s global network, you can reach performance levels that are difficult to match on traditional serverless platforms.
Key Takeaways
- Zero cold starts make Workers ideal for latency-sensitive applications
- Global distribution ensures low latency for users worldwide
- Integrated storage (KV, D1, R2) provides a complete backend solution
- Cost-effective pricing model with generous free tier
- Security by default with built-in DDoS protection and WAF
Next Steps
- Implement WebSocket support using Durable Objects
- Add real-time features with Server-Sent Events
- Integrate with Cloudflare R2 for file storage
- Implement GraphQL API layer
- Add observability with Cloudflare Analytics