
Redis in 2025-2026: New Features, Redis Stack, and Cloud Evolution

Introduction

Redis has undergone significant transformation in 2025-2026. What started as a simple in-memory key-value store has evolved into a comprehensive data platform powering everything from traditional caching to cutting-edge AI applications. This article explores the latest developments, new features, and the evolving Redis ecosystem.


Redis 8.0: Performance and Features

Key New Features

Redis 8.0 represents the most significant release in years, bringing substantial performance improvements and new capabilities.

1. Faster Performance

# Redis 8.0 performance improvements
# - 2x throughput in many workloads
# - Reduced latency through optimized data structures
# - Better memory efficiency

# Benchmark comparison (typical results)
# Redis 7.x: ~100K ops/sec
# Redis 8.0: ~200K+ ops/sec

2. New Commands and Capabilities

# New JSON capabilities
JSON.ARRAPPEND key $.tags "new_tag"
JSON.ARRINSERT key $.scores 0 100

# Enhanced time series
TS.ADD sensor:temp 1700000000 25.5
TS.RANGE sensor:temp 1700000000 1700100000 AGGREGATE avg 60000

# New probabilistic data structures
CMS.INITBYDIM page:views 2000 5
CMS.INCRBY page:views home 1
CMS.QUERY page:views home
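The CMS commands are backed by a Count-Min Sketch, which trades exact counts for constant memory. To build intuition for why queries can overestimate but never underestimate, here is a minimal pure-Python model (illustrative only, not Redis's implementation):

```python
import hashlib

class CountMinSketch:
    """Tiny Count-Min Sketch: depth hash rows of width counters each."""
    def __init__(self, width=256, depth=4):
        self.width, self.depth = width, depth
        self.rows = [[0] * width for _ in range(depth)]

    def _index(self, item, row):
        # One independent hash per row (seeded by the row number)
        h = hashlib.sha256(f"{row}:{item}".encode()).hexdigest()
        return int(h, 16) % self.width

    def incrby(self, item, amount=1):
        for row in range(self.depth):
            self.rows[row][self._index(item, row)] += amount

    def query(self, item):
        # Minimum across rows: collisions only inflate counters,
        # so the estimate is >= the true count, never below it.
        return min(self.rows[row][self._index(item, row)]
                   for row in range(self.depth))

cms = CountMinSketch()
cms.incrby("home", 3)
cms.incrby("about", 1)
print(cms.query("home"))  # at least 3 (never underestimates)
```

Memory stays fixed at width x depth counters no matter how many distinct items flow through, which is the whole appeal for page-view style counting.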

3. Improved ACL and Security

# Granular permissions
ACL SETUSER reader ON >password ~cached:* -@all +@read
ACL SETUSER writer ON >password ~* +@read +@write

Redis Stack: The Complete Data Platform

Redis Stack transforms Redis from a cache into a complete data platform, with modules that add search, JSON, time series, and AI capabilities.

Components Overview

Redis Stack
├── Redis Core          - Base database
├── RediSearch          - Full-text and vector search
├── RedisJSON           - JSON document storage
├── RedisTimeSeries     - Time series data
├── RedisBloom          - Probabilistic data structures
├── RedisGears          - Programmable engine
└── RedisAI             - ML model serving

Setting Up Redis Stack

# Docker deployment
docker run -d --name redis-stack \
  -p 6379:6379 \
  -p 8001:8001 \
  redis/redis-stack:latest

# Verify modules
redis-cli INFO modules
RediSearch: Full-Text and Vector Search

import numpy as np
import redis
from redis.commands.search.field import TextField, VectorField
from redis.commands.search.indexDefinition import IndexDefinition
from redis.commands.search.query import Query

# Connect to Redis Stack
r = redis.Redis(host='localhost', port=6379, decode_responses=True)

# Create index: text fields plus a 3-dimensional vector field
r.ft('idx:documents').create_index(
    [TextField('title'),
     TextField('content'),
     VectorField('embedding', 'FLAT',
                 {'TYPE': 'FLOAT32', 'DIM': 3, 'DISTANCE_METRIC': 'COSINE'})],
    definition=IndexDefinition(prefix=['doc:'])
)

# Index a document; vectors are stored as raw float32 bytes in a hash
r.hset('doc:1', mapping={
    'title': 'Redis Tutorial',
    'content': 'Learn Redis from basics to advanced',
    'embedding': np.array([0.1, 0.2, 0.3], dtype=np.float32).tobytes()
})

# Full-text search; return only the text fields
query = Query('Redis tutorial').return_fields('title', 'content')
results = r.ft('idx:documents').search(query)
for doc in results.docs:
    print(f"{doc.title}: {doc.content}")

RedisJSON: Native JSON Support

# Store and query JSON documents
r.json().set('user:100', '$', {
    'name': 'Alice',
    'age': 30,
    'address': {'city': 'NYC', 'country': 'USA'},
    'tags': ['developer', 'AI']
})

# Update nested fields
r.json().set('user:100', '$.age', 31)

# Array operations
r.json().arrappend('user:100', '$.tags', 'engineer')

# Query with JSONPath ($ paths return a list of matches)
name = r.json().get('user:100', '$.name')  # ['Alice']
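Each of these calls maps onto an ordinary nested dict/list mutation; the difference is that Redis applies it server-side without the client round-tripping the whole document. In plain Python terms, the operations above are equivalent to:

```python
user = {
    'name': 'Alice', 'age': 30,
    'address': {'city': 'NYC', 'country': 'USA'},
    'tags': ['developer', 'AI'],
}

# JSON.SET user:100 $.age 31
user['age'] = 31

# JSON.ARRAPPEND user:100 $.tags "engineer"
user['tags'].append('engineer')

# JSON.GET user:100 $.name
print(user['name'])  # Alice
```

The win is bandwidth and atomicity: the mutation happens inside Redis, so concurrent clients never see a half-updated document.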

Redis Cloud: Managed Service Evolution

Redis Cloud has evolved significantly, offering serverless options, automatic scaling, and integrated AI capabilities.

Service Tiers Comparison

Feature         Free      Pro          Enterprise
Memory          30MB      100GB+       Unlimited
Connections     30        10000+       Unlimited
Modules         Limited   Full         Full
Vector Search   No        Yes          Yes
Active-Active   No        No           Yes
Price           Free      $0-700/mo    Custom

Python Integration with Redis Cloud

import redis

# Redis Cloud connection
r = redis.Redis(
    host='redis-12345.us-east-1-1.ec2.cloud.rlrcp.com',
    port=12345,
    password='your-redis-cloud-password',
    decode_responses=True,
    ssl=True
)

# Test connection
r.ping()

# Use standard Redis commands
r.set('key', 'value')
r.get('key')

Serverless Redis

# Redis Flex / Serverless
# Pay-per-request pricing
# Automatic scaling
# Zero cold starts with smart caching

Vector Search for AI

One of the biggest developments in Redis is native vector search capabilities, making it a popular choice for AI applications.

Vector Search Implementation

import numpy as np
from redis.commands.search.field import TextField, VectorField
from redis.commands.search.indexDefinition import IndexDefinition
from redis.commands.search.query import Query

# Create vector index
r.ft('idx:embeddings').create_index(
    [VectorField('embedding',
                 'FLAT',
                 {'TYPE': 'FLOAT32', 'DIM': 384, 'DISTANCE_METRIC': 'COSINE'}),
     TextField('content'),
     TextField('metadata')],
    definition=IndexDefinition(prefix=['doc:'])
)

def create_embedding(text):
    """Generate embedding vector (using any embedding model)"""
    # Placeholder - use OpenAI, HuggingFace, etc.
    return np.random.rand(384).astype(np.float32)

# Index documents with embeddings (stored as raw float32 bytes)
for doc in documents:
    embedding = create_embedding(doc['content'])
    r.hset(f'doc:{doc["id"]}', mapping={
        'content': doc['content'],
        'metadata': doc.get('metadata', ''),
        'embedding': embedding.tobytes()
    })

# Semantic search (KNN syntax requires query dialect 2)
query_embedding = create_embedding('machine learning tutorial')
query = Query('*=>[KNN 5 @embedding $vector AS score]') \
    .sort_by('score') \
    .dialect(2)
results = r.ft('idx:embeddings').search(
    query,
    query_params={'vector': query_embedding.tobytes()}
)

# Combine vector search with filtering (hybrid query)
hybrid = Query('(@category:{tutorial} | @level:{beginner})'
               '=>[KNN 10 @embedding $vector AS score]') \
    .sort_by('score') \
    .dialect(2)
results = r.ft('idx:hybrid').search(
    hybrid,
    query_params={'vector': query_embedding.tobytes()}
)
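The COSINE metric above scores vectors by angular similarity, and a FLAT index answers KNN queries by exhaustive comparison. A brute-force sketch of that step in plain Python (illustrative; the real index is heavily optimized):

```python
import math

def cosine_distance(a, b):
    """Cosine distance as RediSearch defines it: 1 - cosine similarity."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / norm

def knn(query, docs, k):
    """Brute-force K nearest neighbours, smallest distance first."""
    scored = sorted(docs.items(), key=lambda kv: cosine_distance(query, kv[1]))
    return [doc_id for doc_id, _ in scored[:k]]

docs = {
    'doc:1': [1.0, 0.0, 0.0],
    'doc:2': [0.9, 0.1, 0.0],
    'doc:3': [0.0, 1.0, 0.0],
}
print(knn([1.0, 0.0, 0.0], docs, 2))  # ['doc:1', 'doc:2']
```

This linear scan is exact but O(N) per query, which is why larger corpora typically move from FLAT to an approximate HNSW index.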

Multi-Threaded I/O (Redis 6+)

Redis 6.0 and later versions introduced multi-threaded I/O, significantly improving throughput.

Configuration

# redis.conf
io-threads 4
io-threads-do-reads yes

# For 8+ cores
io-threads 6

# Note: Command execution remains single-threaded
# I/O threads handle network reading/writing

Performance Impact

# Expected throughput improvements
# Single-threaded Redis: ~100K ops/sec
# Multi-threaded Redis 6+: ~200-400K ops/sec

# Latency improvements
# P99 latency reduced by 30-50% under high load

Client-Side Caching

Redis 6 introduced client-side caching, reducing network round trips for frequently accessed data.

Implementation

import redis
from redis.cache import CacheConfig

# Client-side caching in redis-py (5.1+) requires the RESP3 protocol,
# which carries the server's invalidation push messages
r = redis.Redis(
    host='localhost',
    port=6379,
    protocol=3,
    cache_config=CacheConfig(),
    decode_responses=True
)

# First request - fetches from server
user = r.get('user:123')

# Subsequent requests - served from local cache
# Redis tracks cached keys and invalidates on changes
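Conceptually, the client keeps a local map of tracked keys and drops an entry whenever the server pushes an invalidation for it. A server-free sketch of that pattern (a hypothetical wrapper, not the redis-py implementation):

```python
class CachingClient:
    """Toy client-side cache: GETs fill a local map; invalidation
    messages (which Redis tracking would push) evict stale entries."""
    def __init__(self, server):
        self.server = server          # stand-in for the real connection
        self.local = {}

    def get(self, key):
        if key in self.local:         # served locally, no round trip
            return self.local[key]
        value = self.server.get(key)
        self.local[key] = value
        return value

    def on_invalidate(self, key):
        # The server pushes this when a tracked key changes anywhere
        self.local.pop(key, None)

server = {'user:123': 'alice'}   # plain dict plays the server here
c = CachingClient(server)
c.get('user:123')                # fetched once, now cached locally
server['user:123'] = 'bob'       # another client writes the key
c.on_invalidate('user:123')      # server pushes an invalidation
print(c.get('user:123'))         # 'bob' - refetched after invalidation
```

The invalidation push is what makes the pattern safe: without it, the local map would serve stale values indefinitely.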

Redis Streams Enhancements

Streams have become more mature with better consumer group support and stream processing capabilities.

Stream Processing

# Advanced stream operations
# Read with consumer groups
messages = r.xreadgroup(
    groupname='processors',
    consumername='worker-1',
    streams={'orders': '>'},
    count=10,
    block=5000
)

# Claim pending messages
claimed = r.xclaim(
    'orders',
    'processors',
    'worker-2',
    3600000,  # Min idle time in ms before a pending message can be claimed
    ['message-id-1', 'message-id-2']
)

# Stream info and monitoring
info = r.xinfo_stream('orders')
groups = r.xinfo_groups('orders')
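XCLAIM exists because a consumer can crash while its messages sit unacknowledged in the group's pending-entries list (PEL). A toy model of that bookkeeping, runnable without a server, shows the claim rule:

```python
import time

class MiniStreamGroup:
    """Toy consumer group: a pending-entries list (PEL) lets another
    worker claim messages whose owner has gone quiet, as XCLAIM does."""
    def __init__(self):
        self.pending = {}  # msg_id -> (consumer, delivery_time)

    def deliver(self, msg_id, consumer):
        self.pending[msg_id] = (consumer, time.monotonic())

    def ack(self, msg_id):
        # Acknowledged messages leave the PEL for good
        self.pending.pop(msg_id, None)

    def claim(self, msg_id, new_consumer, min_idle):
        owner, delivered = self.pending[msg_id]
        if time.monotonic() - delivered >= min_idle:
            self.pending[msg_id] = (new_consumer, time.monotonic())
            return True
        return False   # still being worked on, too soon to steal

g = MiniStreamGroup()
g.deliver('1-0', 'worker-1')
print(g.claim('1-0', 'worker-2', min_idle=0.0))  # True: idle long enough
```

The min-idle threshold is the safety valve: it prevents two live workers from processing the same message just because one is slow.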

Integration Ecosystem

Kubernetes Integration

# Kubernetes deployment
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: redis-cluster
spec:
  serviceName: redis-cluster
  replicas: 6
  selector:
    matchLabels:
      app: redis-cluster
  template:
    metadata:
      labels:
        app: redis-cluster
    spec:
      containers:
      - name: redis
        image: redis:8.0
        ports:
        - containerPort: 6379
        command:
        - redis-server
        - --cluster-enabled
        - "yes"
        - --cluster-config-file
        - /data/nodes.conf

Service Mesh Integration

# Istio/Linkerd with Redis
# Redis works seamlessly with service meshes
# Consider connection pooling for mesh environments
# Use Redis Sentinel for high availability

# Service mesh aware configuration (service name is illustrative)
from redis.cluster import RedisCluster

r = RedisCluster(
    host='redis-cluster.default.svc.cluster.local',
    port=6379,
    skip_full_coverage_check=True,
    decode_responses=True
)

Emerging Patterns

1. Edge Computing

# Redis for edge caching
# Deploy Redis at edge locations
# Reduce latency for global applications

edge_locations = {
    'us-east': 'redis-edge-us-east.example.com',
    'eu-west': 'redis-edge-eu-west.example.com',
    'ap-south': 'redis-edge-ap-south.example.com'
}

def get_redis_for_region(region):
    return redis.Redis(host=edge_locations[region], port=6379)
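In practice the region map needs a fallback so requests from an unmapped region still reach a cache. A small extension of the lookup above (hostnames are hypothetical):

```python
edge_locations = {
    'us-east': 'redis-edge-us-east.example.com',
    'eu-west': 'redis-edge-eu-west.example.com',
    'ap-south': 'redis-edge-ap-south.example.com',
}
DEFAULT_REGION = 'us-east'

def resolve_edge_host(region):
    """Pick the edge host for a region, falling back to a default
    location so unknown regions are still served."""
    return edge_locations.get(region, edge_locations[DEFAULT_REGION])

print(resolve_edge_host('eu-west'))  # the eu-west edge host
print(resolve_edge_host('sa-east'))  # unmapped region -> default host
```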

2. Real-Time ML Features

# Feature store with Redis
import json

class FeatureStore:
    def __init__(self, redis_client):
        self.redis = redis_client
    
    def store_features(self, user_id, features):
        """Store precomputed features for ML"""
        key = f"features:user:{user_id}"
        self.redis.set(key, json.dumps(features), ex=3600)
    
    def get_features(self, user_id):
        """Retrieve features for inference"""
        key = f"features:user:{user_id}"
        data = self.redis.get(key)
        return json.loads(data) if data else None

3. Graph Capabilities

# Redis for graph-like relationships
# Using sorted sets for social graphs
import time

def follow_user(follower_id, followed_id):
    # Add to follower's following list
    r.zadd(f'user:{follower_id}:following', {followed_id: time.time()})
    # Add to followed's followers list
    r.zadd(f'user:{followed_id}:followers', {follower_id: time.time()})

def get_followers(user_id, limit=100):
    return r.zrevrange(f'user:{user_id}:followers', 0, limit - 1)
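To make the sorted-set semantics concrete, here is what ZADD and ZREVRANGE are doing, modeled with plain Python dicts (scores are follow timestamps, highest first):

```python
def zadd(store, key, mapping):
    """ZADD: upsert member -> score pairs into a sorted set."""
    store.setdefault(key, {}).update(mapping)

def zrevrange(store, key, start, stop):
    """ZREVRANGE: members by descending score; stop is inclusive."""
    members = sorted(store.get(key, {}).items(),
                     key=lambda kv: kv[1], reverse=True)
    return [m for m, _ in members[start:stop + 1]]

store = {}
zadd(store, 'user:1:followers', {'alice': 100.0})
zadd(store, 'user:1:followers', {'bob': 200.0})
print(zrevrange(store, 'user:1:followers', 0, 99))  # ['bob', 'alice']
```

Because scores are timestamps, ZREVRANGE returns the most recent followers first, which is exactly the ordering a "followers" feed wants.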

Future Outlook

Expected Developments

  1. Enhanced AI Integration: Deeper integration with vector databases and LLM frameworks
  2. Serverless Scaling: More granular auto-scaling options
  3. Multi-Model Support: Better support for complex data types
  4. Improved Durability: Faster persistence and tiered memory/flash storage

Workload Mix Trends

2024: 60% cache, 25% session, 10% real-time, 5% AI
2025: 50% cache, 20% session, 15% real-time, 15% AI
2026: 40% cache, 15% session, 20% real-time, 25% AI

Conclusion

Redis has evolved far beyond its caching origins. With Redis 8.0, Redis Stack, and cloud offerings, it now serves as a versatile data platform powering modern applications including AI/ML systems. Understanding these developments is essential for developers building next-generation applications.

In the next article, we’ll explore Redis alternatives and when to consider other options for your specific use cases.
