Introduction
Edge computing brings computation closer to data sources, reducing latency and enabling real-time processing. This article explores major edge computing platforms, architecture patterns, and production deployment strategies.
Understanding Edge Computing
What is Edge Computing?
Edge computing processes data near where it's generated rather than sending it to centralized data centers. This reduces latency and bandwidth costs and enables real-time decision making.
```mermaid
graph LR
    A[User Device] -->|"< 10ms"| B[Edge Node]
    B -->|"10-50ms"| C[Regional Edge]
    C -->|"100ms+"| D[Cloud Data Center]
    style B fill:#90EE90
    style C fill:#87CEEB
    style D fill:#FFB6C1
```
Why Edge Computing Matters
| Metric | Traditional Cloud | Edge Computing |
|---|---|---|
| Latency | 100-300ms | 5-50ms |
| Bandwidth | High costs | Reduced by 60-90% |
| Availability | Single region | Global distribution |
| Data Sovereignty | Limited control | Process locally |
| Cost | Per-request pricing | Often included in CDN |
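The bandwidth row is simple arithmetic: whatever fraction of traffic the edge terminates comes straight off the origin bill. A quick sketch (the request volume, response size, and hit rate below are illustrative numbers, not measurements):

```javascript
// Back-of-envelope: if the edge serves a fraction of requests from
// cache, origin bandwidth drops by that fraction.
function originBandwidthGB(requestsPerDay, avgResponseKB, edgeHitRate) {
  const totalGB = (requestsPerDay * avgResponseKB) / (1024 * 1024);
  return totalGB * (1 - edgeHitRate);
}

// 10M requests/day at 50KB each is ~477GB/day; an 80% edge hit
// rate leaves only ~95GB/day reaching the origin.
const origin = originBandwidthGB(10_000_000, 50, 0.8);
```

A hit rate between 0.6 and 0.9 is exactly where the "reduced by 60-90%" figure in the table comes from.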
Cloudflare Workers
Overview
Cloudflare Workers provides serverless JavaScript execution at the edge across 300+ locations worldwide.
```javascript
// Cloudflare Worker - Basic Example
export default {
  async fetch(request, env, ctx) {
    const url = new URL(request.url);
    // Route-based handling
    if (url.pathname.startsWith('/api/')) {
      return handleAPIRequest(request, ctx);
    }
    return fetch(request); // Pass through to origin
  }
};

async function handleAPIRequest(request, ctx) {
  // Cache API responses at edge
  const cache = caches.default;
  const cached = await cache.match(request);
  if (cached) {
    return cached;
  }
  const response = await fetch('https://api.example.com' + new URL(request.url).pathname);
  // Cache a copy at the edge without blocking the response
  // (TTL is governed by the response's Cache-Control headers)
  ctx.waitUntil(cache.put(request, response.clone()));
  return response;
}
```
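One refinement worth making before caching API responses is normalizing the cache key: requests that differ only in tracking parameters should share one cache entry. A minimal sketch (the parameter list is our own illustrative choice):

```javascript
// Normalize a request URL into a stable cache key by dropping
// tracking parameters that don't affect the response.
const IGNORED_PARAMS = ['utm_source', 'utm_medium', 'utm_campaign', 'fbclid'];

function normalizeCacheKey(urlString) {
  const url = new URL(urlString);
  for (const param of IGNORED_PARAMS) {
    url.searchParams.delete(param);
  }
  url.searchParams.sort(); // Stable ordering for equivalent URLs
  return url.toString();
}
```

The normalized string can then be passed to `cache.match`/`cache.put` in place of the raw request URL.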
Architecture Deep Dive
```javascript
// Advanced Cloudflare Worker Patterns

// Durable Objects for stateful edge computing
export class Counter {
  constructor(state, env) {
    this.state = state;
  }
  async increment(amount = 1) {
    // Durable Object persistence lives on state.storage
    const current = (await this.state.storage.get('count')) || 0;
    const next = current + amount;
    await this.state.storage.put('count', next);
    return { count: next };
  }
}

// KV Storage for global data
export default {
  async fetch(request, env) {
    const url = new URL(request.url);
    // Read from KV (global key-value at edge)
    if (url.pathname === '/config') {
      const config = await env.APP_CONFIG.get('site-config');
      return new Response(config, {
        headers: { 'Content-Type': 'application/json' }
      });
    }
    // Write to KV
    if (url.pathname === '/track' && request.method === 'POST') {
      const body = await request.json();
      await env.EVENTS.put(Date.now().toString(), JSON.stringify(body));
      return new Response('Tracked', { status: 201 });
    }
    return new Response('Not Found', { status: 404 });
  }
};
```
Performance Characteristics
```javascript
// Benchmark: Cloudflare Workers vs Traditional Serverless

// Cold Start Comparison
// AWS Lambda: 100-500ms
// Cloudflare Worker: 1-5ms
// Cloudflare Worker with Cache: <1ms

// Memory and CPU Limits
// Workers Free Tier: 128MB RAM, 10ms CPU time per request
// Workers Paid (Standard): 128MB RAM, up to 30s CPU time per request

// Throughput
// Free: 100,000 requests/day
// Paid: $5/month includes 10,000,000 requests
```
Production Configuration
```toml
# wrangler.toml - Cloudflare Worker Configuration
name = "production-api"
main = "src/index.js"
compatibility_date = "2024-01-15"

[[kv_namespaces]]
binding = "APP_CONFIG"
id = "your-kv-namespace-id"

[[durable_objects.bindings]]
name = "USER_SESSIONS"
class_name = "SessionManager"

[[r2_buckets]]
binding = "ASSETS"
bucket_name = "media-assets"

[env.production]
vars = { ENVIRONMENT = "production" }

[[env.production.kv_namespaces]]
binding = "APP_CONFIG"
id = "prod-config-id"
```
AWS Lambda@Edge
Overview
Lambda@Edge runs Lambda functions at CloudFront edge locations, letting you customize behavior at four points in the request lifecycle: viewer request, origin request, origin response, and viewer response.
```javascript
// Lambda@Edge - Origin Response Handler: add security headers
// (Header injection belongs on a response trigger; a viewer-request
// handler would have to generate the whole response itself.)
exports.handler = async (event) => {
  const response = event.Records[0].cf.response;
  const headers = response.headers;

  headers['strict-transport-security'] = [{
    key: 'Strict-Transport-Security',
    value: 'max-age=31536000; includeSubDomains'
  }];
  headers['content-security-policy'] = [{
    key: 'Content-Security-Policy',
    value: "default-src 'self'"
  }];
  headers['x-content-type-options'] = [{
    key: 'X-Content-Type-Options',
    value: 'nosniff'
  }];

  return response;
};
```
Lambda@Edge Use Cases
```javascript
// Lambda@Edge - A/B Testing at Edge (viewer-request trigger)
// A viewer-request handler sees only the request; the Set-Cookie that
// pins the variant is added by a paired viewer-response function.
exports.handler = async (event) => {
  const request = event.Records[0].cf.request;

  // Check for existing experiment cookie
  const cookies = request.headers.cookie || [];
  const cookieHeader = cookies[0]?.value || '';

  let variant;
  if (cookieHeader.includes('variant=a')) {
    variant = 'a';
  } else if (cookieHeader.includes('variant=b')) {
    variant = 'b';
  } else {
    // Random assignment (50/50 split)
    variant = Math.random() < 0.5 ? 'a' : 'b';
  }

  // Rewrite URI based on variant
  request.uri = request.uri.replace('/experiment/', `/${variant}/`);

  // Pass the assignment downstream so the response function can set:
  // variant=<a|b>; Path=/; Max-Age=2592000; HttpOnly; SameSite=Lax
  request.headers['x-experiment-variant'] = [{
    key: 'X-Experiment-Variant',
    value: variant
  }];

  return request;
};
```
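Random assignment depends on the cookie round-trip to stay sticky. An alternative is deterministic bucketing: hash a stable client identifier into a unit interval, so the same user always lands in the same variant with no stored state. A sketch (the hash and function names are ours, not a platform API):

```javascript
// Deterministic A/B bucketing: hash a stable identifier (e.g. a
// client ID) into [0, 1) and compare against the split point.
function hashToUnit(id) {
  let hash = 0;
  for (let i = 0; i < id.length; i++) {
    hash = (hash * 31 + id.charCodeAt(i)) >>> 0; // keep unsigned 32-bit
  }
  return hash / 0xFFFFFFFF;
}

function assignVariant(clientId, splitA = 0.5) {
  return hashToUnit(clientId) < splitA ? 'a' : 'b';
}
```

Because the assignment is a pure function of the identifier, it also survives cookie loss and works across edge locations without coordination.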
CloudFront + Lambda@Edge Architecture
```yaml
# CloudFront Distribution with Lambda@Edge (fragment; Origins and the
# AWS::Lambda::Version resources are omitted for brevity).
# Note: Lambda@Edge associations must reference a published *version*
# ARN, never $LATEST.
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  CloudFrontDistribution:
    Type: AWS::CloudFront::Distribution
    Properties:
      DistributionConfig:
        Enabled: true
        DefaultRootObject: index.html
        DefaultCacheBehavior:
          TargetOriginId: s3-origin
          ViewerProtocolPolicy: redirect-to-https
          LambdaFunctionAssociations:
            - EventType: viewer-request
              LambdaFunctionARN: !Ref RequestFunctionVersion
            - EventType: origin-response
              LambdaFunctionARN: !Ref ResponseFunctionVersion
        CacheBehaviors:
          - PathPattern: /api/*
            TargetOriginId: api-origin
            ViewerProtocolPolicy: https-only
            CachePolicyId: 4135ea2d-6df8-44a3-9df3-4b5a84be39ad  # Managed-CachingDisabled
```
Performance Comparison
| Aspect | Cloudflare Workers | Lambda@Edge |
|---|---|---|
| Execution Locations | 300+ | 200+ (CloudFront) |
| Cold Start | 1-5ms | 50-200ms |
| Memory Limit | 128MB | 128MB (viewer) / up to 10GB (origin) |
| Timeout | 30s CPU (paid) | 5s (viewer) / 30s (origin) |
| Languages | JS/TS, Rust, Python | Node.js, Python |
| Pricing | $5/mo per 10M requests | $0.60/1M + duration + CloudFront |
Edge Platform Comparison
Feature Matrix
| Feature | Cloudflare Workers | Lambda@Edge | Vercel Edge | Deno Deploy |
|---|---|---|---|---|
| Global Points | 300+ | 200+ | 35+ | 30+ |
| Cold Start | ~1ms | ~100ms | ~5ms | ~2ms |
| Database Integration | D1 (SQLite) | DynamoDB | Vercel Postgres | Deno KV |
| Object Storage | R2 | S3 | Blob | Object Wrappers |
| WebSockets | ✅ | ❌ | ❌ | ✅ |
| Min Requests | None | 1 | None | None |
| Free Tier | 100K req/day | None | 100K req/month | 100K req/month |
Cost Analysis
```python
# Monthly cost comparison (1M requests)

# Cloudflare Workers (Paid plan)
# $5/month minimum includes 10M requests, so 1M fits in the base fee
workers_cost = 5.00  # ~$0.30 per additional million beyond the quota

# Lambda@Edge (us-east-1)
# $0.00005001 per GB-second + $0.60 per 1M requests
# Assuming 128MB memory, 30ms average duration
compute_cost = 0.00005001 * 0.128 * 0.030 * 1_000_000  # ~$0.19
request_cost = 0.60
lambda_cost = compute_cost + request_cost  # ~$0.79

# Vercel Edge Functions
# ~$0.75 per 1M requests (Pro plan)
vercel_cost = 0.75

# Deno Deploy
# ~$0.50 per 1M requests (Pro)
deno_cost = 0.50
```
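The Lambda@Edge line items generalize into a small cost model; the helper below is our own sketch, with the default rates taken as assumptions from the figures above:

```javascript
// Rough monthly cost model for per-request + GB-second pricing
// (Lambda@Edge-style). Rates default to the assumed us-east-1 figures.
function edgeLambdaCost({ requests, memoryGB, avgSeconds,
                          perRequest = 0.60 / 1e6,
                          perGBSecond = 0.00005001 }) {
  const computeCost = perGBSecond * memoryGB * avgSeconds * requests;
  const requestCost = perRequest * requests;
  return computeCost + requestCost;
}

// 1M requests, 128MB, 30ms average duration -> roughly $0.79
const monthly = edgeLambdaCost({ requests: 1e6, memoryGB: 0.128, avgSeconds: 0.030 });
```

Plugging in your own traffic profile makes it easy to see where flat-fee platforms (Workers' $5 minimum) overtake pure pay-per-use.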
Production Architecture Patterns
Pattern 1: Hybrid Edge-Cloud
```mermaid
graph TB
    User[User Request] --> Edge[Edge Function]
    Edge -->|Static Content| CDN[CDN Cache]
    Edge -->|API Requests| EdgeAPI[Edge API Logic]
    EdgeAPI -->|Cache Hit| Cache[Edge Cache]
    EdgeAPI -->|Cache Miss| Cloud[Cloud Backend]
    Cloud -->|Async Processing| Queue[Message Queue]
    Queue -->|Background| Worker[Cloud Worker]
    Worker -->|Update| Cache
    Cache -->|TTL Expiry| EdgeAPI
```
```javascript
// Hybrid Architecture Implementation
// (EdgeCache, CloudClient, and the helper predicates are
// application-specific; only the routing logic is shown.)
class HybridEdgeGateway {
  constructor(edgeConfig, cloudConfig) {
    this.edgeCache = new EdgeCache(edgeConfig.cacheConfig);
    this.cloudClient = new CloudClient(cloudConfig);
  }

  async handleRequest(request) {
    const cacheKey = this.getCacheKey(request);

    // Check edge cache first
    const cached = await this.edgeCache.get(cacheKey);
    if (cached && !this.isStale(cached)) {
      return cached;
    }

    // Check if request can be served from edge
    if (this.isEdgeEligible(request)) {
      // Serve dynamically from edge
      const response = await this.serveFromEdge(request);
      await this.edgeCache.set(cacheKey, response);
      return response;
    }

    // Fall back to cloud
    return this.cloudClient.fetch(request);
  }

  isEdgeEligible(request) {
    // Only GET, no auth required, cacheable
    return request.method === 'GET' &&
      !request.headers.has('Authorization') &&
      this.isCacheable(request);
  }
}
```
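The eligibility predicate is the piece worth testing in isolation, since a wrong answer either leaks private responses into a shared cache or forfeits the latency win. A self-contained version, using the standard `Request` type available in Workers and Node 18+ (the cookie check is an extra safeguard we've added):

```javascript
// Decide whether a request is safe to serve and cache at the edge:
// idempotent method, no credentials, and no cache-busting directive.
function isEdgeEligible(request) {
  if (request.method !== 'GET') return false;
  if (request.headers.has('Authorization') ||
      request.headers.has('Cookie')) return false;
  const cacheControl = request.headers.get('Cache-Control') || '';
  return !/no-store/i.test(cacheControl);
}
```

For example, `isEdgeEligible(new Request('https://example.com/products'))` is true, while any POST or credentialed request is routed to the cloud.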
Pattern 2: Edge Caching Strategy
```javascript
// Advanced Edge Caching with Stale-While-Revalidate
export default {
  async fetch(request, env, ctx) {
    const cache = caches.default;

    // Try cache first
    let response = await cache.match(request);
    if (response) {
      // Return cached immediately, revalidate in background
      ctx.waitUntil(this.revalidate(request, cache));
      return response;
    }

    // Not cached, fetch and cache
    response = await fetch(request);
    if (response.ok) {
      const cacheResponse = response.clone();
      ctx.waitUntil(cache.put(request, cacheResponse));
    }
    return response;
  },

  async revalidate(request, cache) {
    // Fetch fresh content and update the cache; in-flight requests
    // still see the stale copy (freshness windows are governed by
    // Cache-Control: stale-while-revalidate)
    const fresh = await fetch(request);
    if (fresh.ok) {
      await cache.put(request, fresh);
    }
  }
};
```
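Whether a cached copy counts as "fresh", "stale but servable", or "expired" falls out of two Cache-Control directives. A small decision helper makes the policy explicit (this is our own sketch of the SWR state machine, not a platform API):

```javascript
// Classify a cached response's age against
// Cache-Control: max-age=<s>, stale-while-revalidate=<s>.
function swrState(ageSeconds, cacheControl) {
  const maxAge = Number((cacheControl.match(/max-age=(\d+)/) || [])[1] || 0);
  const swr = Number((cacheControl.match(/stale-while-revalidate=(\d+)/) || [])[1] || 0);
  if (ageSeconds <= maxAge) return 'fresh';            // serve from cache
  if (ageSeconds <= maxAge + swr) return 'revalidate'; // serve stale, refresh in background
  return 'expired';                                    // fetch synchronously
}
```

With `max-age=60, stale-while-revalidate=300`, a 30-second-old entry is served as-is, a 2-minute-old entry is served stale while the worker revalidates, and anything past 6 minutes blocks on a fresh fetch.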
Pattern 3: Edge Database Access
```javascript
// Edge-compatible database patterns

// 1. Read replicas at edge
class EdgeDatabaseRouter {
  constructor(config) {
    this.readReplicas = config.edgeReplicas;
    this.writer = config.primaryDB;
  }
  async read(query, params) {
    // Choose nearest replica
    const replica = this.getNearestReplica();
    return replica.query(query, params);
  }
  async write(query, params) {
    // Always write to primary
    return this.writer.query(query, params);
  }
}

// 2. SQLite at edge (Cloudflare D1)
export default {
  async fetch(request, env) {
    const db = env.DB;
    const userId = new URL(request.url).searchParams.get('id');
    // Query from edge
    const result = await db.prepare(
      'SELECT * FROM users WHERE id = ?'
    ).bind(userId).all();
    return Response.json(result);
  }
};

// 3. CRDT-based sync
class CRDTSync {
  constructor(peerId) {
    this.peerId = peerId;
    this.localState = new Map();
  }
  merge(remoteState) {
    // Merge using last-writer-wins or custom CRDT
    for (const [key, value] of remoteState) {
      if (!this.localState.has(key) ||
          value.timestamp > this.localState.get(key).timestamp) {
        this.localState.set(key, value);
      }
    }
  }
}
```
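The property that makes last-writer-wins viable at the edge is that replicas converge regardless of the order in which they exchange state. A quick self-contained check, mirroring the `merge` logic above (the `theme` key and timestamps are invented for the demo):

```javascript
// Two replicas with conflicting writes converge to the same value
// after exchanging updates, because merge keeps the newest timestamp.
function lwwMerge(local, remote) {
  const merged = new Map(local);
  for (const [key, value] of remote) {
    const existing = merged.get(key);
    if (!existing || value.timestamp > existing.timestamp) {
      merged.set(key, value);
    }
  }
  return merged;
}

const replicaA = new Map([['theme', { value: 'dark', timestamp: 100 }]]);
const replicaB = new Map([['theme', { value: 'light', timestamp: 200 }]]);

const aThenB = lwwMerge(replicaA, replicaB);
const bThenA = lwwMerge(replicaB, replicaA);
// Both merge orders pick the later write ('light')
```

Note that LWW silently drops the older write; when both versions matter, a richer CRDT (e.g. a per-key version vector) is needed.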
Performance Optimization
Reducing Cold Starts
```javascript
// 1. Module bundling for faster loading
// webpack.config.js
const path = require('path');

module.exports = {
  mode: 'production',
  entry: './src/index.js',
  output: {
    filename: 'worker.js',
    path: path.resolve(__dirname, 'dist')
  },
  optimization: {
    minimize: true,
    usedExports: true
  }
};
```
```javascript
// 2. Eager loading of critical modules
// src/index.js
import { handleApiRequest } from './api'; // Eager: on the hot path

export default {
  async fetch(request) {
    if (request.url.includes('/api/')) {
      return handleApiRequest(request); // Pre-loaded
    }
    // Lazy load the payment module only when needed
    if (request.url.includes('/pay/')) {
      const { processPayment } = await import('./payment');
      return processPayment(request);
    }
    return new Response('Not Found', { status: 404 });
  }
};
```
Request Batching
```javascript
// Batch multiple requests at edge
// (processBatch is application-supplied: it takes an array of
// requests and resolves to an array of results in the same order.)
class BatchProcessor {
  constructor(batchSize = 10, timeoutMs = 5) {
    this.batchSize = batchSize;
    this.timeoutMs = timeoutMs;
    this.pending = [];
    this.timer = null;
  }

  async add(request) {
    return new Promise((resolve) => {
      this.pending.push({ request, resolve });
      if (this.pending.length >= this.batchSize) {
        this.flush();
      } else if (!this.timer) {
        this.timer = setTimeout(() => this.flush(), this.timeoutMs);
      }
    });
  }

  async flush() {
    clearTimeout(this.timer);
    this.timer = null;
    const batch = this.pending.splice(0, this.batchSize);
    const requests = batch.map(b => b.request);
    // Batch process
    const results = await this.processBatch(requests);
    // Resolve individual promises
    batch.forEach((item, i) => {
      item.resolve(results[i]);
    });
    // Re-arm the timer if requests are still waiting
    if (this.pending.length > 0 && !this.timer) {
      this.timer = setTimeout(() => this.flush(), this.timeoutMs);
    }
  }
}
```
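A compact, runnable demo of the same queue-and-flush pattern, with a stand-in batch function (doubling each value stands in for a real bulk API call):

```javascript
// Minimal standalone demo of the batching pattern: individual calls
// are queued and handled together once the batch fills (or times out).
class DoublingBatcher {
  constructor(batchSize = 3, timeoutMs = 5) {
    this.batchSize = batchSize;
    this.timeoutMs = timeoutMs;
    this.pending = [];
    this.timer = null;
  }
  add(value) {
    return new Promise((resolve) => {
      this.pending.push({ value, resolve });
      if (this.pending.length >= this.batchSize) this.flush();
      else if (!this.timer) this.timer = setTimeout(() => this.flush(), this.timeoutMs);
    });
  }
  async flush() {
    clearTimeout(this.timer);
    this.timer = null;
    const batch = this.pending.splice(0, this.batchSize);
    // One "backend call" for the whole batch
    const results = batch.map(item => item.value * 2);
    batch.forEach((item, i) => item.resolve(results[i]));
  }
}

// Three adds, one flush: all three promises resolve together
const batcher = new DoublingBatcher(3);
Promise.all([batcher.add(1), batcher.add(2), batcher.add(3)])
  .then(results => console.log(results)); // [ 2, 4, 6 ]
```

The payoff at the edge is that N concurrent lookups become one upstream round trip, which matters most when the origin is an ocean away.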
Monitoring and Debugging
Cloudflare Workers Analytics
```javascript
// Custom metrics at edge
export default {
  async fetch(request, env, ctx) {
    const start = Date.now();
    const response = await fetch(request);
    // Record metrics without delaying the response
    ctx.waitUntil(trackMetrics(request, Date.now() - start, env));
    return response;
  }
};

async function trackMetrics(request, duration, env) {
  await env.METRICS.put(Date.now().toString(), JSON.stringify({
    path: new URL(request.url).pathname,
    duration,
    country: request.headers.get('cf-ipcountry'),
    colo: request.cf?.colo
  }));
}
```
Lambda@Edge Monitoring
```yaml
# CloudWatch Dashboard (CloudFormation)
# Note: Lambda@Edge reports metrics in the region closest to where the
# function executed, so plan for per-region views or aggregation.
Resources:
  EdgeLambdaDashboard:
    Type: AWS::CloudWatch::Dashboard
    Properties:
      DashboardName: EdgeLambdaDashboard
      DashboardBody: !Sub |
        {
          "widgets": [
            {
              "type": "text",
              "properties": { "markdown": "## Lambda@Edge Metrics" }
            },
            {
              "type": "metric",
              "properties": {
                "metrics": [
                  ["AWS/Lambda", "Invocations"],
                  ["AWS/Lambda", "Errors"],
                  ["AWS/Lambda", "Duration"],
                  ["AWS/Lambda", "Throttles"]
                ],
                "period": 300,
                "stat": "Average",
                "region": "${AWS::Region}"
              }
            }
          ]
        }
```
Security Considerations
Edge-Specific Security
```javascript
// 1. Rate limiting at edge
// (KV writes are eventually consistent, so this limiter is
// approximate; use Durable Objects for strict per-client counting.)
export class RateLimiter {
  constructor(limit = 100, windowSeconds = 60) {
    this.limit = limit;
    this.windowSeconds = windowSeconds;
  }

  async check(request, env) {
    const ip = request.headers.get('CF-Connecting-IP');
    const key = `ratelimit:${ip}`;
    const current = await env.KV.get(key);
    const count = current ? parseInt(current, 10) : 0;
    if (count >= this.limit) {
      return { allowed: false, remaining: 0 };
    }
    // Increment counter
    await env.KV.put(key, (count + 1).toString(), {
      expirationTtl: this.windowSeconds
    });
    return { allowed: true, remaining: this.limit - count - 1 };
  }
}
```
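Because the limiter only touches storage through `get`/`put`, the counting logic can be exercised locally against an in-memory stand-in. A runnable sketch (the `Map`-backed mock is ours; TTL eviction is omitted):

```javascript
// The fixed-window counting logic, exercised against an in-memory
// Map standing in for Workers KV (TTL eviction omitted).
const kv = new Map();
const mockEnv = {
  KV: {
    get: async (key) => kv.get(key) ?? null,
    put: async (key, value) => { kv.set(key, value); }
  }
};

async function check(request, env, limit = 3) {
  const key = `ratelimit:${request.ip}`;
  const count = parseInt((await env.KV.get(key)) ?? '0', 10);
  if (count >= limit) return { allowed: false, remaining: 0 };
  await env.KV.put(key, String(count + 1));
  return { allowed: true, remaining: limit - count - 1 };
}

// Four calls with limit 3: the fourth is rejected
(async () => {
  const req = { ip: '203.0.113.7' };
  const results = [];
  for (let i = 0; i < 4; i++) results.push((await check(req, mockEnv)).allowed);
  console.log(results); // [ true, true, true, false ]
})();
```

The same mock also makes the limiter's weakness visible: two edge locations incrementing concurrently can both read the same count, which is exactly the race eventual consistency allows in production.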
```javascript
// 2. Bot detection
export function isBot(request) {
  const userAgent = request.headers.get('User-Agent') || '';
  // request.cf.botManagement is populated on zones with Bot Management
  const verifiedBot = request.cf?.botManagement?.verifiedBot;
  return verifiedBot === true || /bot|crawler|spider/i.test(userAgent);
}
```
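The User-Agent heuristic can be checked without a live zone by stubbing the request shape; a plain object with `headers.get` and `cf` is enough (the stub helper below is ours):

```javascript
// Bot check restated standalone: verified-bot signal first,
// User-Agent heuristic as fallback.
function isBot(request) {
  const userAgent = request.headers.get('User-Agent') || '';
  const verifiedBot = request.cf?.botManagement?.verifiedBot;
  return verifiedBot === true || /bot|crawler|spider/i.test(userAgent);
}

// Minimal request stub for local testing
const stub = (ua, cf = {}) => ({
  headers: { get: (name) => (name === 'User-Agent' ? ua : null) },
  cf
});
```

For example, `isBot(stub('Googlebot/2.1'))` is true while a typical browser UA is not; a verified bot is flagged even with an innocuous User-Agent.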
Conclusion
Edge computing is production-ready and offers significant advantages for latency-sensitive applications. Key takeaways:
- Cloudflare Workers: Best for JavaScript-first teams, excellent cold start performance
- Lambda@Edge: Best for AWS integration, extensive customization
- Hybrid approaches: Combine edge and cloud for optimal performance
Start with static content and simple API routing, then expand to more complex edge logic as your team gains experience.
External Resources
- Cloudflare Workers Documentation
- AWS Lambda@Edge Documentation
- Edge Computing Use Cases
- Web Standards for Edge