Redis Caching Integration Solution

Author: Chuan Chen | Category: Node.js

Redis is a high-performance key-value storage system commonly used for caching, message queues, and other scenarios. Integrating Redis with Koa2 can significantly improve application response speed and concurrency capabilities. Below is a comprehensive Redis caching integration solution, including installation, configuration, basic operations, and advanced usage.

Installing Redis and Dependencies

First, you need to install the Redis server and the Node.js Redis client library. The Redis server can be downloaded and installed from the official website or quickly started using Docker:

# Start Redis using Docker
docker run --name my-redis -p 6379:6379 -d redis

Install the ioredis or redis client library in your Koa2 project:

npm install ioredis
# or
npm install redis

Basic Configuration

In Koa2, it is common to attach the Redis client instance to app.context so that every request can access it through ctx. Here is a basic configuration example:

const Koa = require('koa');
const Redis = require('ioredis');

const app = new Koa();
const redis = new Redis({
  host: '127.0.0.1',
  port: 6379,
  password: 'yourpassword', // If a password is set
  db: 0, // Select a database
});

// Attach the Redis instance to the context
app.context.redis = redis;

app.use(async (ctx) => {
  ctx.body = 'Hello Redis';
});

app.listen(3000);
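
With the client attached to app.context, every downstream middleware and route handler can reach Redis through ctx.redis. A minimal caching-middleware sketch, taking the place of the placeholder handler above (the /time route, cache key, and 30-second TTL are illustrative only):

// Serve '/time' from the Redis cache when possible, otherwise compute and cache it
app.use(async (ctx, next) => {
  if (ctx.path !== '/time') return next();

  const cached = await ctx.redis.get('cache:time');
  if (cached) {
    ctx.body = cached; // Cache hit
    return;
  }

  const payload = `Current time: ${new Date().toISOString()}`;
  await ctx.redis.set('cache:time', payload, 'EX', 30); // Cache for 30 seconds
  ctx.body = payload;
});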

Basic Cache Operations

Redis supports various data structures. Below are examples of common operations:

Strings

// Set cache
await ctx.redis.set('user:1', JSON.stringify({ name: 'Alice', age: 25 }));

// Get cache
const userData = await ctx.redis.get('user:1');
const user = JSON.parse(userData);
console.log(user); // { name: 'Alice', age: 25 }

// Set expiration time (10 seconds)
await ctx.redis.set('temp:data', 'expires soon', 'EX', 10);
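
The remaining lifetime of a key can be checked with TTL (a small supplementary example):

// Check how many seconds are left before expiration
const ttl = await ctx.redis.ttl('temp:data');
console.log(ttl); // e.g. 8; -1 means no expiration, -2 means the key does not exist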

Hashes

// Set hash fields
await ctx.redis.hset('user:2', 'name', 'Bob');
await ctx.redis.hset('user:2', 'age', '30');

// Get hash field
const userName = await ctx.redis.hget('user:2', 'name');
console.log(userName); // 'Bob'

// Get all fields
const user = await ctx.redis.hgetall('user:2');
console.log(user); // { name: 'Bob', age: '30' }
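
Numeric hash fields can also be updated atomically with HINCRBY, avoiding a read-modify-write round trip (supplementary example):

// Atomically increment the numeric 'age' field
await ctx.redis.hincrby('user:2', 'age', 1);
console.log(await ctx.redis.hget('user:2', 'age')); // '31'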

Lists

// Add elements to a list
await ctx.redis.lpush('messages', 'msg1', 'msg2');

// Get a range of list elements
const messages = await ctx.redis.lrange('messages', 0, -1);
console.log(messages); // ['msg2', 'msg1']
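
Sets and sorted sets are useful for caching as well; a sorted set, for instance, makes a simple leaderboard (the member names and scores below are illustrative):

// Add members with scores to a sorted set
await ctx.redis.zadd('leaderboard', 100, 'alice', 80, 'bob');

// Get members from highest to lowest score, including scores
const top = await ctx.redis.zrevrange('leaderboard', 0, -1, 'WITHSCORES');
console.log(top); // ['alice', '100', 'bob', '80']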

Caching Strategies

Cache Penetration

Cache penetration occurs when requests repeatedly query data that does not exist: nothing ever lands in the cache, so every such request falls through to the database. It can be mitigated by caching empty values (shown below) or with a Bloom filter (sketched after the example):

app.use(async (ctx) => {
  const cacheKey = `user:${ctx.params.id}`;
  const cached = await ctx.redis.get(cacheKey);

  if (cached !== null) {
    // Cache hit: an empty string is the placeholder for "does not exist"
    ctx.body = cached ? JSON.parse(cached) : {};
    return;
  }

  // Cache miss: query the database
  const user = await db.queryUser(ctx.params.id);
  if (!user) {
    // Cache an empty value with a short expiration time
    await ctx.redis.set(cacheKey, '', 'EX', 60);
  } else {
    await ctx.redis.set(cacheKey, JSON.stringify(user), 'EX', 3600);
  }

  ctx.body = user || {};
});
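
For the Bloom-filter approach, the IDs that actually exist are hashed into a Redis bitmap ahead of time; a request whose ID is definitely not in the filter can be rejected before touching the cache or database. A minimal sketch, assuming a pre-populated bitmap (the key name, bitmap size, and hash count are illustrative, not from the original):

const crypto = require('crypto');

const BLOOM_KEY = 'bloom:users';
const BLOOM_BITS = 1 << 20; // ~1M bits; sizing depends on expected items and error rate

// Derive k bit positions for an item from salted SHA-1 hashes
function bloomPositions(item, k = 3) {
  const positions = [];
  for (let i = 0; i < k; i++) {
    const hash = crypto.createHash('sha1').update(`${i}:${item}`).digest();
    positions.push(hash.readUInt32BE(0) % BLOOM_BITS);
  }
  return positions;
}

// Register an existing ID in the filter
async function bloomAdd(redis, item) {
  const pipeline = redis.pipeline();
  bloomPositions(item).forEach((pos) => pipeline.setbit(BLOOM_KEY, pos, 1));
  await pipeline.exec();
}

// false => definitely absent; true => possibly present
async function bloomMightContain(redis, item) {
  const pipeline = redis.pipeline();
  bloomPositions(item).forEach((pos) => pipeline.getbit(BLOOM_KEY, pos));
  const results = await pipeline.exec();
  return results.every(([, bit]) => bit === 1);
}

A request for a user ID would first call bloomMightContain and return an empty response immediately whenever it yields false.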

Cache Avalanche

Cache avalanche occurs when a large number of cached keys expire at the same time, sending a sudden burst of requests to the database. It can be avoided by staggering expiration times with random jitter:

// Set random expiration time
const randomTTL = Math.floor(Math.random() * 300) + 300; // 300-600 seconds
await ctx.redis.set('data:key', 'value', 'EX', randomTTL);
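
A small helper keeps the jitter consistent wherever values are cached (a sketch; the function name and default TTLs are illustrative):

// Cache a value with a jittered TTL so that keys do not all expire at once
async function setWithJitter(redis, key, value, baseTTL = 300, jitter = 300) {
  const ttl = baseTTL + Math.floor(Math.random() * jitter);
  await redis.set(key, value, 'EX', ttl);
}

await setWithJitter(ctx.redis, 'data:key', 'value');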

Cache Breakdown

Cache breakdown happens when a single hot key expires and a flood of concurrent requests hits the database for the same data. It can be resolved with a mutex lock so that only one request rebuilds the cache:

async function getDataWithLock(redis, key) {
  const cached = await redis.get(key);
  if (cached !== null) return JSON.parse(cached);

  const lockKey = `lock:${key}`;
  // NX: only one caller acquires the lock; EX: the lock expires even if the holder crashes
  const locked = await redis.set(lockKey, '1', 'NX', 'EX', 10);
  if (locked) {
    try {
      const data = await fetchDataFromDB(); // Fetch data from the database
      await redis.set(key, JSON.stringify(data), 'EX', 3600);
      return data;
    } finally {
      await redis.del(lockKey);
    }
  }

  // Another request holds the lock: wait briefly, then retry
  await new Promise((resolve) => setTimeout(resolve, 100));
  return getDataWithLock(redis, key);
}
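
Called from a route handler, it might look like this (the key name is an illustrative placeholder):

// The mutex guards the database when the hot key has expired
ctx.body = await getDataWithLock(ctx.redis, 'hot:item:42');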

Advanced Usage

Pub/Sub Model

Redis supports the publish/subscribe model for real-time notifications. A connection in subscriber mode can only issue subscription-related commands, which is why separate subscriber and publisher clients are created below:

// Subscribe to a channel
const subscriber = new Redis();
subscriber.subscribe('news');

subscriber.on('message', (channel, message) => {
  console.log(`Received ${message} from ${channel}`);
});

// Publish a message
const publisher = new Redis();
publisher.publish('news', 'Hello world!');

Lua Scripts

Redis can execute Lua scripts atomically, which makes them well suited to operations such as rate limiting that must read and write in a single step:

const script = `
  local key = KEYS[1]
  local limit = tonumber(ARGV[1])
  local current = tonumber(redis.call('GET', key) or 0)
  if current + 1 > limit then
    return 0
  else
    redis.call('INCR', key)
    return 1
  end
`;

// Execute the rate-limiting script
const result = await ctx.redis.eval(script, 1, 'rate:limit:user1', 10);
console.log(result); // 1 (allowed) or 0 (limit reached)
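
The script's return value can then drive a simple rate-limiting middleware (a sketch; the per-IP key and limit of 10 are illustrative). Note that the counter key above never expires as written, so a real limiter would also set an expiration inside the script to define the time window:

app.use(async (ctx, next) => {
  const allowed = await ctx.redis.eval(script, 1, `rate:limit:${ctx.ip}`, 10);
  if (!allowed) {
    ctx.status = 429; // Too Many Requests
    ctx.body = { error: 'Rate limit exceeded' };
    return;
  }
  await next();
});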

Pipelining

Pipelining allows sending multiple commands at once to reduce network overhead:

const pipeline = ctx.redis.pipeline();
pipeline.set('key1', 'value1');
pipeline.set('key2', 'value2');
pipeline.expire('key1', 60);
const results = await pipeline.exec();
// Each entry is an [err, result] pair, e.g. [[null, 'OK'], [null, 'OK'], [null, 1]]
console.log(results);

Performance Optimization

Connection Pooling

ioredis multiplexes commands over a single long-lived connection, so reusing one client across requests (as configured above) already avoids repeatedly creating and destroying connections. The client also exposes options for tuning queuing, retries, and reconnection behavior:

const Redis = require('ioredis');
const redis = new Redis({
  host: '127.0.0.1',
  port: 6379,
  password: 'yourpassword',
  db: 0,
  enableOfflineQueue: false, // Disable offline queue
  maxRetriesPerRequest: 1, // Maximum retry attempts
  reconnectOnError: (err) => {
    // Custom reconnection logic
    return err.message.includes('READONLY');
  },
});

Cluster Mode

For large-scale applications, Redis Cluster can be used:

const Redis = require('ioredis');
const cluster = new Redis.Cluster([
  { host: '127.0.0.1', port: 7000 },
  { host: '127.0.0.1', port: 7001 },
]);

app.context.redis = cluster;

Monitoring and Debugging

Slow Query Logs

Redis keeps a slow query log, which can be configured and inspected with the following commands:

# Set slow query threshold to 10 milliseconds
redis-cli config set slowlog-log-slower-than 10000
# Keep the last 100 slow queries
redis-cli config set slowlog-max-len 100
# View slow query logs
redis-cli slowlog get

Performance Testing

Use the redis-benchmark tool to test Redis performance:

redis-benchmark -h 127.0.0.1 -p 6379 -c 100 -n 100000
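
The command above runs the default benchmark suite with 100 concurrent clients (-c) and 100,000 requests (-n); specific commands can also be tested in quiet mode:

# Benchmark only SET and GET, printing a one-line summary per test
redis-benchmark -h 127.0.0.1 -p 6379 -t set,get -n 100000 -q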
