
Edge Computing: Cloudflare Workers vs Vercel Edge

Created: February 23, 2026 · Larry Qu · 6 min read

Introduction

Edge computing brings computation closer to users, reducing latency and improving performance. Cloudflare Workers and Vercel Edge are the two leading platforms for deploying serverless functions at the edge.

This comparison helps you choose the right platform for your needs in 2025-2026.


What Is Edge Computing?

The Basic Concept

Edge computing runs code on servers distributed globally, close to users. Instead of requests traveling to a central data center, they hit the nearest edge location, reducing latency from hundreds of milliseconds to just a few.

Key Terms

  • Edge Function: Serverless function running on edge servers
  • Cold Start: Initial delay when function starts
  • Global CDN: Content delivery network with edge compute
  • Serverless: Code runs without server management
  • Durable Objects: Stateful computing at edge (Cloudflare)
  • Edge Runtime: Lightweight JavaScript/Wasm environment

Why Edge Matters in 2025-2026

Metric           Traditional Server   Edge
Latency (avg)    100-300ms            10-50ms
Cold Start       N/A                  5-50ms
Global Scaling   Manual               Automatic
Cost             Fixed                Pay-per-use
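The latency gap in the table is largely geometry: light in fiber covers roughly 200 km per millisecond, so distance to the server sets a hard floor on round-trip time no matter how fast the code is. A back-of-envelope sketch (the 200 km/ms figure is an approximation):

```javascript
// Rough lower bound on network round-trip time from distance.
// Light in fiber travels at about 2/3 the speed of light,
// i.e. roughly 200 km per millisecond.
function minRttMs(distanceKm) {
  const fiberKmPerMs = 200;
  return (2 * distanceKm) / fiberKmPerMs; // there and back
}

minRttMs(100);  // nearby edge PoP → 1ms floor
minRttMs(8000); // cross-ocean central data center → 80ms floor
```

Real requests add routing hops, TLS handshakes, and server time on top of this floor, which is how a distant origin ends up in the 100-300ms range.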

Platform Comparison

Cloudflare Workers

// Cloudflare Worker
export default {
  async fetch(request, env, ctx) {
    const url = new URL(request.url);
    
    // A/B testing
    const bucket = Math.random() < 0.5 ? 'a' : 'b';
    
    // Forward to origin, preserving the original request headers
    const headers = new Headers(request.headers);
    headers.set('X-Experiment-Bucket', bucket);
    const response = await fetch(request.url, { headers });
    
    // Modify response
    return new Response(response.body, {
      headers: {
        ...Object.fromEntries(response.headers),
        'X-Edge-Version': '1.0'
      }
    });
  }
};

// KV storage (MY_KV binding configured in wrangler.toml)
export default {
  async fetch(request, env) {
    const value = await env.MY_KV.get('key');
    return new Response(value);
  }
};

// Durable Objects (stateful)
export class Counter {
  constructor(state, env) {
    this.state = state;
  }
  
  async increment() {
    // Persisted state lives on this.state.storage, not this.state itself
    const current = (await this.state.storage.get('count')) || 0;
    await this.state.storage.put('count', current + 1);
    return current + 1;
  }
}

Vercel Edge

// Vercel Edge Function
import { NextRequest, NextResponse } from 'next/server';

export const config = {
  runtime: 'edge',
  regions: ['iad1', 'sfo1'], // Specify regions
};

export default function handler(req: NextRequest) {
  const url = new URL(req.url);
  
  // Geolocation data
  const country = req.geo?.country || 'US';
  const city = req.geo?.city || 'San Francisco';
  
  // Rewrite to localized content
  if (country === 'JP') {
    return NextResponse.rewrite(new URL('/jp' + url.pathname, req.url));
  }
  
  return NextResponse.json({
    message: `Hello from ${city}, ${country}!`,
    edge: true,
  });
}

// Streaming response
export async function POST(req: NextRequest) {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    start(controller) {
      // Response bodies carry bytes, so encode strings before enqueueing
      controller.enqueue(encoder.encode('Hello '));
      controller.enqueue(encoder.encode('World!'));
      controller.close();
    }
  });
  
  return new Response(stream);
}

Feature Comparison

Feature        Cloudflare Workers              Vercel Edge
Languages      JavaScript, Rust, C++, Python   JavaScript, TypeScript
Max Duration   30-300 seconds                  10-50 seconds
Cold Start     ~5ms                            ~10ms
Memory         128MB                           1024MB
Free Tier      100K req/day                    100K req/day
Pricing        $5/10M req                      $0.60/1M req
Stateful       Durable Objects                 Limited
Database       D1 (SQL), KV, R2                Serverless SQL (Vercel Postgres)
Framework      Any                             Next.js focused

Performance Benchmarks

Cold Start Time

Cloudflare Workers:    ████░░░░░░░░  ~5ms
Vercel Edge:           █████░░░░░░░  ~10ms
AWS Lambda:            ████████████  ~200ms

Request Latency (Global Average)

Cloudflare Workers:    ████░░░░░░░░  15ms
Vercel Edge:           ████░░░░░░░░  18ms
AWS Lambda (us-east):  ████████░░░░  45ms
AWS Lambda (eu-west):  ████████████  120ms

Throughput

Cloudflare Workers:    ████████████  100K+ req/s/core
Vercel Edge:           ██████████░░  80K+ req/s/core

Use Cases

1. A/B Testing

Cloudflare Workers:

export default {
  async fetch(request, env, ctx) {
    const bucket = Math.random() < 0.5 ? 'control' : 'variant';
    
    // Set cookie
    const response = await fetch(request.url);
    const modified = new Response(response.body, response);
    modified.headers.set('Set-Cookie', `ab=${bucket}; Path=/; Max-Age=2592000`);
    
    return modified;
  }
}

Vercel Edge:

import { NextRequest, NextResponse } from 'next/server';

// Middleware-style: NextResponse.next() forwards the request onward
export default function handler(req: NextRequest) {
  const variant = Math.random() < 0.5 ? 'control' : 'variant';
  
  const response = NextResponse.next();
  response.cookies.set('ab', variant, { path: '/', maxAge: 2592000 });
  
  return response;
}
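Both snippets use Math.random(), which can reassign a returning user until the cookie sticks. A common refinement (a sketch, not from either platform's docs) is to hash a stable identifier such as a user or session ID, so assignment is deterministic with or without the cookie:

```javascript
// Deterministic A/B bucketing: hash a stable ID so the same user
// always lands in the same bucket. Uses a simple 31-based string
// hash for illustration; any stable hash works.
function bucketFor(userId, buckets = ['control', 'variant']) {
  let hash = 0;
  for (const ch of userId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // keep to 32 bits
  }
  return buckets[hash % buckets.length];
}
```

With this, the cookie becomes a cache of the assignment rather than its source of truth.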

2. Authentication

Cloudflare Workers:

export default {
  async fetch(request, env, ctx) {
    const token = request.headers.get('Authorization');
    
    if (!token) {
      return new Response('Unauthorized', { status: 401 });
    }
    
    // Verify JWT
    const user = await verifyToken(token, env.JWT_SECRET);
    if (!user) {
      return new Response('Invalid token', { status: 403 });
    }
    
    // Add user to context
    return new Response(`Hello, ${user.name}!`);
  }
}

async function verifyToken(token, secret) {
  // JWT verification logic
  return { name: 'John' }; // Simplified
}

Vercel Edge:

import { NextRequest, NextResponse } from 'next/server';
import { jwtVerify } from 'jose';

export default async function handler(req: NextRequest) {
  const token = req.headers.get('authorization')?.replace('Bearer ', '');
  
  if (!token) {
    return NextResponse.json({ error: 'Unauthorized' }, { status: 401 });
  }
  
  try {
    const { payload } = await jwtVerify(token, new TextEncoder().encode(process.env.JWT_SECRET));
    return NextResponse.json({ message: `Hello, ${payload.name}!` });
  } catch {
    return NextResponse.json({ error: 'Invalid token' }, { status: 403 });
  }
}
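When debugging either flow, it helps to see what a token actually carries. A JWT is three base64url segments (header, payload, signature); this sketch decodes the payload without verifying the signature, so it is for inspection only and nothing in it should be trusted until jwtVerify (or equivalent) has checked it:

```javascript
// Decode a JWT payload WITHOUT verifying the signature.
// Debugging aid only: unverified claims must never drive auth decisions.
function decodeJwtPayload(token) {
  const [, payload] = token.split('.');
  // base64url → base64, then decode (atob exists in edge runtimes and Node 16+)
  const b64 = payload.replace(/-/g, '+').replace(/_/g, '/');
  return JSON.parse(atob(b64));
}
```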

3. Real-time Features

Cloudflare Durable Objects:

export class ChatRoom {
  constructor(state, env) {
    this.state = state;
    this.sessions = new Set();
  }
  
  async fetch(request) {
    const url = new URL(request.url);
    
    if (url.pathname === '/connect') {
      // WebSocket upgrade
      return this.handleWebSocket(request);
    }
    
    return new Response('Not Found', { status: 404 });
  }
  
  handleWebSocket(request) {
    const { 0: client, 1: server } = new WebSocketPair();
    
    // Accept the server side before attaching listeners
    server.accept();
    this.sessions.add(server);
    
    server.addEventListener('message', async (event) => {
      // Broadcast to all clients
      for (const session of this.sessions) {
        session.send(event.data);
      }
    });
    
    server.addEventListener('close', () => {
      this.sessions.delete(server);
    });
    
    return new Response(null, { status: 101, webSocket: client });
  }
}

Database Integration

Cloudflare D1 (SQL)

export default {
  async fetch(request, env, ctx) {
    const url = new URL(request.url);
    
    if (url.pathname === '/api/users' && request.method === 'GET') {
      // Query D1
      const { results } = await env.DB.prepare(
        'SELECT * FROM users LIMIT 10'
      ).all();
      
      return Response.json(results);
    }
    
    if (url.pathname === '/api/users' && request.method === 'POST') {
      const { name, email } = await request.json();
      
      await env.DB.prepare(
        'INSERT INTO users (name, email) VALUES (?, ?)'
      ).bind(name, email).run();
      
      return Response.json({ success: true }, { status: 201 });
    }
    
    return new Response('Not Found', { status: 404 });
  }
}

Vercel Postgres

import { sql } from '@vercel/postgres';
import { NextRequest, NextResponse } from 'next/server';

export default async function handler(req: NextRequest) {
  if (req.method === 'GET') {
    const { rows } = await sql`
      SELECT * FROM users LIMIT 10
    `;
    return NextResponse.json(rows);
  }
  
  if (req.method === 'POST') {
    const { name, email } = await req.json();
    await sql`
      INSERT INTO users (name, email) VALUES (${name}, ${email})
    `;
    return NextResponse.json({ success: true }, { status: 201 });
  }
  
  return NextResponse.json({ error: 'Not found' }, { status: 404 });
}

Best Practices

1. Minimize Dependencies

// ❌ Bad: Large bundle
import { bigLibrary } from 'big-library';
export default { fetch: (req) => bigLibrary.process(req) }

// ✅ Good: Minimal code
export default {
  async fetch(request) {
    const data = await request.json();
    return Response.json({ processed: true });
  }
}

2. Use Caching

// Cloudflare Workers
export default {
  async fetch(request, env, ctx) {
    const url = new URL(request.url);
    
    // Cache HTML responses
    // Cache GET responses outside the API routes
    if (request.method === 'GET' && !url.pathname.startsWith('/api/')) {
      const cache = caches.default;
      const cached = await cache.match(request);
      
      if (cached) return cached;
      
      const origin = await fetch(request);
      
      // Cache for 1 hour
      const response = new Response(origin.body, origin);
      response.headers.set('Cache-Control', 'public, max-age=3600');
      ctx.waitUntil(cache.put(request, response.clone()));
      
      return response;
    }
    
    return fetch(request);
  }
}
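Caching also works better when equivalent URLs map to one cache entry. A common normalization sketch is to strip tracking parameters and sort the rest before using the URL as a key (the parameter list here is illustrative, not exhaustive):

```javascript
// Normalize a URL into a cache key: drop tracking parameters and
// sort the remainder so equivalent URLs share one cache entry.
const TRACKING_PARAMS = ['utm_source', 'utm_medium', 'utm_campaign', 'gclid', 'fbclid'];

function cacheKey(urlString) {
  const url = new URL(urlString);
  for (const p of TRACKING_PARAMS) url.searchParams.delete(p);
  url.searchParams.sort(); // stable ordering → more cache hits
  return url.toString();
}
```

The normalized string can then be passed to cache.match / cache.put in place of the raw request URL.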

3. Handle Errors Gracefully

// Vercel Edge
import { NextRequest, NextResponse } from 'next/server';

export default function handler(req: NextRequest) {
  try {
    // Your logic
    return NextResponse.json({ success: true });
  } catch (error) {
    return NextResponse.json(
      { error: 'Internal server error' },
      { status: 500 }
    );
  }
}

Pricing Comparison

Cloudflare Workers

Tier         Requests   Compute (GB-s)   Price
Free         100K/day   10               $0
Paid         10M        1M               $5/month
Enterprise   Custom     Custom           Custom

Vercel Edge

Tier         Requests   Edge Compute (GB-s)   Price
Free         100K       6                     $0
Pro          1M         150                   $20/month
Enterprise   Custom     Custom                Custom
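To make the tiers concrete, here is a back-of-envelope request-cost calculation using the per-request rates from the feature table ($5 per 10M requests for Workers, $0.60 per 1M for Vercel). Treat it as a sketch: real bills also include compute (GB-s), bandwidth, base fees, and included quotas.

```javascript
// Back-of-envelope monthly request cost. Assumes purely linear
// pricing; ignores included quotas, compute (GB-s), and base fees.
function monthlyCostUsd(requestsMillions, pricePerMillionUsd) {
  return requestsMillions * pricePerMillionUsd;
}

// At 50M requests/month:
monthlyCostUsd(50, 0.50); // Workers-style rate ($5 / 10M) → $25
monthlyCostUsd(50, 0.60); // Vercel-style rate ($0.60 / 1M) → $30
```

At high request volumes the per-request rate dominates, which is why Cloudflare tends to win on price for request-heavy workloads.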


Key Takeaways

  • Cloudflare Workers offers better pricing, Durable Objects for stateful apps
  • Vercel Edge integrates seamlessly with Next.js applications
  • Cold starts are minimal on both platforms (~5-10ms)
  • Use cases: A/B testing, auth, personalization, API routing
  • Best practices: Minimize dependencies, use caching, handle errors
  • Pricing: Cloudflare more generous on compute, Vercel simpler tier

Next Steps: Explore AI in Frontend: Browser AI and WebGPU to see the future of client-side AI.
