
Cache strategy and implementation plan

Author: Chuan Chen | Views: 32,396 | Category: Node.js

Caching is a crucial means of enhancing application performance, and a well-designed caching strategy can significantly reduce server load and response times. Express, as a popular Node.js framework, offers flexible caching implementation methods, including in-memory caching, distributed caching, and HTTP caching.

In-Memory Caching Implementation

In-memory caching is the simplest form of caching, suitable for small applications or development environments. Node.js's memory-cache module can quickly implement in-memory caching:

const express = require('express');
const cache = require('memory-cache');
const app = express();

// Caching middleware
const memCache = (duration) => {
  return (req, res, next) => {
    const key = '__express__' + req.originalUrl;
    const cachedContent = cache.get(key);
    
    if (cachedContent) {
      res.send(cachedContent);
      return;
    } else {
      res.sendResponse = res.send;
      res.send = (body) => {
        cache.put(key, body, duration * 1000);
        res.sendResponse(body);
      };
      next();
    }
  };
};

// Usage example
app.get('/api/data', memCache(30), (req, res) => {
  // Simulate time-consuming operation
  setTimeout(() => {
    res.json({ data: Date.now() });
  }, 1000);
});

This approach is vulnerable to memory leaks: everything lives in the Node.js process heap, so when caching large amounts of data you must set reasonable expiration times and an upper bound on cache size.
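The size bound can be sketched with a tiny capped cache (illustrative only: `BoundedCache` and its options are hypothetical names; production code would normally use a library such as `lru-cache`):

```javascript
// A minimal bounded cache sketch: caps entry count and applies a TTL,
// evicting the oldest entry when full.
class BoundedCache {
  constructor({ max = 1000, ttlMs = 30000 } = {}) {
    this.max = max;
    this.ttlMs = ttlMs;
    this.store = new Map(); // Map preserves insertion order
  }

  get(key) {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key); // lazy expiration on read
      return undefined;
    }
    return entry.value;
  }

  set(key, value) {
    if (this.store.size >= this.max && !this.store.has(key)) {
      // Evict the oldest inserted key to respect the memory bound
      const oldestKey = this.store.keys().next().value;
      this.store.delete(oldestKey);
    }
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }
}
```

Eviction here is insertion-order rather than true LRU, which keeps the sketch short; the important point is that the cache can never grow without bound.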

Redis Distributed Caching

For production environments, Redis is a more reliable choice. Below is a complete example of Express integrating Redis:

// Uses the callback-style API of redis v3; redis v4+ exposes promises natively
const redis = require('redis');
const { promisify } = require('util');
const client = redis.createClient({
  host: 'redis-server', // hostname of the Redis instance
  port: 6379
});

const getAsync = promisify(client.get).bind(client);
const setexAsync = promisify(client.setex).bind(client);

app.get('/api/users/:id', async (req, res) => {
  const cacheKey = `user_${req.params.id}`;
  
  try {
    // Attempt to fetch from cache
    const cachedData = await getAsync(cacheKey);
    if (cachedData) {
      return res.json(JSON.parse(cachedData));
    }

    // If cache miss, query the database
    const user = await User.findById(req.params.id);
    
    // Set cache (30-second expiration)
    await setexAsync(cacheKey, 30, JSON.stringify(user));
    
    res.json(user);
  } catch (err) {
    res.status(500).send(err.message);
  }
});

The Redis solution must also guard against cache avalanches (many keys expiring at the same moment), which can be mitigated with randomized expiration times or a mutex lock:

// Mutex-style lock to mitigate cache avalanches; setnx/expire are
// promisified the same way as get/setex above. Note that SETNX followed
// by EXPIRE is not atomic -- `SET key value NX EX seconds` is the safer
// single-command alternative.
const setnxAsync = promisify(client.setnx).bind(client);
const expireAsync = promisify(client.expire).bind(client);

const acquireLock = async (lockKey, expireTime = 10) => {
  const result = await setnxAsync(lockKey, 'LOCK');
  if (result === 1) {
    await expireAsync(lockKey, expireTime);
    return true;
  }
  return false;
};
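The randomized-expiration approach mentioned above can be sketched with a small jitter helper (`jitteredTtl` is an illustrative name, not from the original article):

```javascript
// Jittered TTL sketch: spread expirations over [base, base + jitter)
// so keys written together do not all expire at the same instant.
function jitteredTtl(baseSeconds, jitterSeconds) {
  return baseSeconds + Math.floor(Math.random() * jitterSeconds);
}

// Usage with the setex wrapper from the Redis example:
// await setexAsync(cacheKey, jitteredTtl(30, 10), JSON.stringify(user));
```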

HTTP Caching Strategy

Express can implement HTTP caching by setting response headers:

app.get('/static/image.jpg', (req, res) => {
  res.set({
    'Cache-Control': 'public, max-age=86400', // 1 day
    'ETag': '123456789',
    'Last-Modified': new Date('2023-01-01').toUTCString()
  });
  res.sendFile('/path/to/image.jpg');
});

For dynamic content, conditional requests can be used:

const crypto = require('crypto');

app.get('/api/news', async (req, res) => {
  const news = await getLatestNews();
  const etag = crypto.createHash('md5').update(JSON.stringify(news)).digest('hex');
  
  if (req.headers['if-none-match'] === etag) {
    return res.status(304).end();
  }
  
  res.set({
    'ETag': etag,
    'Cache-Control': 'no-cache'
  }).json(news);
});

Cache Update Strategies

Common cache update strategies should be chosen based on business scenarios:

  1. Cache Aside Pattern:
// Read
async function getProduct(id) {
  let product = await cache.get(id);
  if (!product) {
    product = await db.query(id);
    await cache.set(id, product);
  }
  return product;
}

// Update
async function updateProduct(id, data) {
  await db.update(id, data);
  await cache.del(id); // Invalidate cache
}
  2. Write Through Pattern:
async function writeThroughUpdate(id, data) {
  await cache.set(id, data); // Update cache first
  await db.update(id, data); // Then update database
}
  3. Write Behind Pattern:
const writeQueue = new Queue(); // e.g. a job queue such as bull (assumed here)

async function writeBehindUpdate(id, data) {
  await cache.set(id, data);
  writeQueue.add({ id, data }); // Asynchronously batch-write to database
}
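The write-behind idea above can be sketched without a job-queue library; `WriteBehindQueue` below is an illustrative stand-in that batches pending writes and flushes them on an interval:

```javascript
// Minimal write-behind batcher sketch: cache writes return immediately,
// while database writes are accumulated and flushed in bulk.
class WriteBehindQueue {
  constructor(flushFn, { intervalMs = 1000 } = {}) {
    this.pending = [];
    this.flushFn = flushFn; // e.g. a bulk UPDATE against the database
    this.timer = setInterval(() => this.flush(), intervalMs);
  }

  add(job) {
    this.pending.push(job);
  }

  async flush() {
    if (this.pending.length === 0) return;
    const batch = this.pending.splice(0, this.pending.length);
    await this.flushFn(batch); // one bulk write instead of N single writes
  }

  stop() {
    clearInterval(this.timer);
  }
}
```

The trade-off is durability: writes queued but not yet flushed are lost if the process crashes, which is why write-behind suits metrics and counters better than critical data.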

Cache Monitoring and Optimization

Implement cache monitoring middleware:

// statsd below is assumed to be a StatsD client (e.g. node-statsd) created elsewhere
app.use((req, res, next) => {
  const start = Date.now();
  const originalSend = res.send;
  
  res.send = function(body) {
    const duration = Date.now() - start;
    const cacheStatus = res.get('X-Cache') || 'MISS';
    
    // Record cache hit rate
    statsd.increment(`cache.${cacheStatus.toLowerCase()}`);
    statsd.timing('response_time', duration);
    
    originalSend.call(this, body);
  };
  
  next();
});

Use Redis monitoring commands:

# View cache hit rate
redis-cli info stats | grep keyspace_hits
redis-cli info stats | grep keyspace_misses

# Memory usage
redis-cli info memory
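The two counters above combine into a hit rate; a minimal helper (the `hitRate` name is illustrative) for the computation:

```javascript
// Hit rate from the counters reported by `redis-cli info stats`:
// hits / (hits + misses), guarding against division by zero.
function hitRate(keyspaceHits, keyspaceMisses) {
  const total = keyspaceHits + keyspaceMisses;
  return total === 0 ? 0 : keyspaceHits / total;
}
```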

Cache Problem Solutions

Cache Penetration Solution:

// Bloom filter implementation using the bloom-filters npm package
const { BloomFilter } = require('bloom-filters');
// create(capacity, errorRate): sized for ~1000 items at a 1% false-positive rate
const filter = BloomFilter.create(1000, 0.01);

// Add valid IDs
validIds.forEach(id => filter.add(id));

app.get('/api/items/:id', async (req, res) => {
  if (!filter.has(req.params.id)) {
    return res.status(404).send('Not Found');
  }
  // ...Normal processing logic
});
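An alternative (or complement) to the Bloom filter is to cache empty results for a short time, so repeated lookups of nonexistent IDs never reach the database. A sketch, with a plain `Map` standing in for Redis and all names illustrative:

```javascript
// Cache "not found" results under a sentinel value so a flood of requests
// for missing IDs is absorbed by the cache instead of the database.
const NULL_SENTINEL = '__NULL__';

async function getItemCached(cache, db, id) {
  const cached = cache.get(id);
  if (cached !== undefined) {
    return cached === NULL_SENTINEL ? null : cached;
  }
  const item = await db.findById(id); // may be null for a missing ID
  // Store the sentinel on a miss; a real setup would use a short TTL here
  cache.set(id, item === null ? NULL_SENTINEL : item);
  return item;
}
```

The short TTL matters: without it, an ID that later becomes valid would keep returning the cached null.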

Cache Breakdown Solution:

// Use a mutex (from the async-mutex package) so only one request rebuilds a hot key
const { Mutex } = require('async-mutex');
const mutex = new Mutex();

app.get('/api/hot-item', async (req, res) => {
  const cacheKey = 'hot_item';
  let data = await cache.get(cacheKey);
  
  if (!data) {
    const release = await mutex.acquire();
    try {
      // Double-check
      data = await cache.get(cacheKey);
      if (!data) {
        data = await fetchHotItem();
        await cache.set(cacheKey, data, 60);
      }
    } finally {
      release();
    }
  }
  
  res.json(data);
});

Multi-Level Cache Architecture

Large applications can adopt a multi-level cache architecture:

// Three-level cache example: Memory -> Redis -> CDN
// (localCache, redisClient and cdnCache are assumed to be initialized elsewhere)
app.get('/api/popular', async (req, res) => {
  // 1. Check memory cache
  const memoryKey = `popular_${req.query.page}`;
  const memoryCache = localCache.get(memoryKey);
  if (memoryCache) return res.json(memoryCache);

  // 2. Check Redis cache
  const redisKey = `popular:${req.query.page}`;
  const redisData = await redisClient.get(redisKey);
  if (redisData) {
    const parsed = JSON.parse(redisData); // parse before writing back so the memory layer stores objects, not strings
    localCache.set(memoryKey, parsed, 10); // Write back to memory cache
    return res.json(parsed);
  }

  // 3. Fallback to source query
  const data = await fetchPopularData(req.query.page);
  
  // Set multi-level caches
  localCache.set(memoryKey, data, 10); // Memory cache for 10 seconds
  await redisClient.setex(redisKey, 3600, JSON.stringify(data)); // Redis cache for 1 hour
  cdnCache.set(`/api/popular?page=${req.query.page}`, data); // CDN cache
  
  res.json(data);
});

Cache Version Control

For long-term caching, version control is necessary:

// Content hash-based version control
const path = require('path');
const fs = require('fs');
const crypto = require('crypto');

app.get('/static/:file', (req, res) => {
  const filePath = path.join(__dirname, 'static', req.params.file);
  // Reading and hashing on every request is for illustration only;
  // real deployments compute these hashes once at build time
  const fileContent = fs.readFileSync(filePath);
  const hash = crypto.createHash('sha1').update(fileContent).digest('hex').slice(0, 8);
  
  res.set({
    'Cache-Control': 'public, max-age=31536000', // 1 year
    'ETag': hash
  });
  
  res.sendFile(filePath);
});

Frontend references should use filenames with hashes:

<script src="/static/app.a1b2c3d4.js"></script>
<link href="/static/style.e5f6g7h8.css" rel="stylesheet">
