
Design of offline caching strategy

Author: Chuan Chen · Reads: 32,198 · Category: Performance Optimization

Offline caching is a key technique for improving application performance, especially in weak or unstable network environments, where a well-designed caching strategy can significantly improve the user experience. Combining preloading, dynamic updates, and intelligent expiration mechanisms reduces the number of network requests, relieves server load, and keeps data reasonably fresh.

Cache Types and Selection

Based on storage medium and lifecycle, frontend caching is mainly divided into the following types:

  1. Memory Cache: data held in JavaScript memory; the fastest to read but the shortest-lived, discarded when the page is closed.
  2. Session Storage: session-level storage, valid within the same tab for the lifetime of that session.
  3. Local Storage: persistent key-value storage that must be cleared manually.
  4. IndexedDB: structured, transactional storage suited to larger volumes of data.
  5. Service Worker Cache: caching by intercepting network requests, a core technology of PWAs.

// Cache type usage example
const cacheData = {
  // Memory cache (temporary data)
  memoryCache: new Map(),
  
  // Session storage (form state)
  saveSessionData(key, value) {
    sessionStorage.setItem(key, JSON.stringify(value));
  },
  
  // Local storage (user preferences)
  saveLocalData(key, value) {
    localStorage.setItem(key, JSON.stringify(value));
  }
};
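
The example above only touches the first three types; the Service Worker cache is demonstrated in the hybrid cache and error-handling sections below. For the fourth type, a minimal IndexedDB sketch follows; the database name 'app-cache' and store name 'responses' are illustrative, not part of any standard:

// Minimal IndexedDB helpers (illustrative names: 'app-cache' database, 'responses' store)
function openDB() {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open('app-cache', 1);
    request.onupgradeneeded = () => {
      // Runs on first open or version bump: create the object store
      request.result.createObjectStore('responses');
    };
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}

async function idbSet(key, value) {
  const db = await openDB();
  return new Promise((resolve, reject) => {
    const tx = db.transaction('responses', 'readwrite');
    tx.objectStore('responses').put(value, key);
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}

async function idbGet(key) {
  const db = await openDB();
  return new Promise((resolve, reject) => {
    const request = db.transaction('responses', 'readonly')
      .objectStore('responses').get(key);
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}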

Cache Update Strategies

Time-Based Expiration Strategy

Give each cache entry a fixed validity period (TTL), which suits data that changes on a predictable cycle:

function getWithExpiry(key) {
  const itemStr = localStorage.getItem(key);
  if (!itemStr) return null;
  
  const item = JSON.parse(itemStr);
  if (Date.now() > item.expiry) {
    localStorage.removeItem(key);
    return null;
  }
  return item.value;
}

function setWithExpiry(key, value, ttl) {
  const item = {
    value: value,
    expiry: Date.now() + ttl
  };
  localStorage.setItem(key, JSON.stringify(item));
}
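
A brief usage sketch; the /api/config endpoint and the 10-minute TTL are illustrative:

// Read-through helper: serve from localStorage while the TTL is valid, otherwise refetch
async function loadConfig() {
  const cached = getWithExpiry('app-config');
  if (cached) return cached;

  const data = await fetch('/api/config').then(r => r.json());
  setWithExpiry('app-config', data, 10 * 60 * 1000); // TTL in milliseconds
  return data;
}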

Version Control

Force cache updates through version identifiers:

const CACHE_VERSION = 'v1.2';
const getCacheKey = (key) => `${CACHE_VERSION}_${key}`;

function fetchWithCache(url) {
  const cacheKey = getCacheKey(url);
  const cached = localStorage.getItem(cacheKey);
  if (cached) return Promise.resolve(JSON.parse(cached));
  
  return fetch(url)
    .then(res => res.json())
    .then(data => {
      localStorage.setItem(cacheKey, JSON.stringify(data));
      return data;
    });
}
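
One thing the snippet leaves open is what happens to entries written under older versions; they stay in localStorage until something removes them. A small cleanup sketch, assuming every versioned key follows the `${CACHE_VERSION}_` prefix convention used above:

// Remove entries whose version prefix no longer matches CACHE_VERSION
function purgeStaleVersions() {
  for (let i = localStorage.length - 1; i >= 0; i--) {
    const key = localStorage.key(i);
    if (/^v\d+(\.\d+)*_/.test(key) && !key.startsWith(`${CACHE_VERSION}_`)) {
      localStorage.removeItem(key);
    }
  }
}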

Hybrid Cache Strategy Design

Layered Cache Solution

Build a multi-level cache system to improve hit rates:

class HybridCache {
  constructor() {
    this.memoryCache = new Map();
    this.SWSupported = 'serviceWorker' in navigator;
  }

  async get(url) {
    // First layer: Memory cache
    if (this.memoryCache.has(url)) {
      return this.memoryCache.get(url);
    }
    
    // Second layer: Service Worker cache
    if (this.SWSupported) {
      const cached = await caches.match(url);
      if (cached) {
        const data = await cached.json();
        this.memoryCache.set(url, data); // Backfill memory cache
        return data;
      }
    }
    
    // Third layer: Network request
    const freshData = await fetch(url).then(r => r.json());
    this.memoryCache.set(url, freshData);
    return freshData;
  }
}
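
HybridCache only reads from the Service Worker cache, so something else has to populate it. A minimal service worker sketch, assuming the cache is named 'dynamic-v1' (the same name used in the error-handling example later) and that only GET requests under /api/ are cached; both choices are illustrative:

// sw.js - network-first fetch handler that backfills the Cache Storage
self.addEventListener('fetch', (event) => {
  const { request } = event;
  if (request.method !== 'GET' || !request.url.includes('/api/')) return;

  event.respondWith(
    fetch(request)
      .then((response) => {
        // Store a copy so later offline reads (and HybridCache) can find it
        const copy = response.clone();
        caches.open('dynamic-v1').then((cache) => cache.put(request, copy));
        return response;
      })
      .catch(() => caches.match(request)) // Offline: fall back to whatever is cached
  );
});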

Intelligent Preloading Strategy

Warm the cache ahead of time based on predicted user behavior:

// Route-based preloading
const preloadMap = {
  '/home': ['/api/news', '/api/announcements'],
  '/products': ['/api/categories', '/api/hot-products']
};

// `router` is assumed to be a Vue Router instance; `cacheStore` is any Map-like store
router.beforeEach((to, from, next) => {
  const preloadUrls = preloadMap[to.path];
  if (preloadUrls) {
    preloadUrls.forEach(url => {
      fetch(url, { cache: 'force-cache' })
        .then(res => res.json())
        .then(data => cacheStore.set(url, data))
        .catch(() => {}); // Preloading is best-effort; ignore failures
    });
  }
  next();
});
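
Route changes are not the only useful signal; hovering over a link is another common hint of intent. A sketch along the same lines, where the `data-prefetch` attribute and the `cacheStore` object are assumptions carried over from the example above:

// Hover-based preheating: prefetch once the user shows intent to navigate
document.addEventListener('mouseover', (event) => {
  const link = event.target.closest && event.target.closest('a[data-prefetch]');
  if (!link) return;

  const url = link.dataset.prefetch; // e.g. <a href="/products" data-prefetch="/api/categories">
  if (!cacheStore.has(url)) {
    fetch(url, { cache: 'force-cache' })
      .then(res => res.json())
      .then(data => cacheStore.set(url, data))
      .catch(() => {}); // Best-effort, same as route-based preloading
  }
});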

Cache Cleanup Mechanisms

LRU-Based Cleanup

Implement the Least Recently Used (LRU) eviction algorithm:

class LRUCache {
  constructor(maxSize = 20) {
    this.cache = new Map();
    this.maxSize = maxSize;
  }

  get(key) {
    if (!this.cache.has(key)) return null;
    
    const value = this.cache.get(key);
    this.cache.delete(key);
    this.cache.set(key, value);
    return value;
  }

  set(key, value) {
    if (this.cache.has(key)) {
      this.cache.delete(key);
    } else if (this.cache.size >= this.maxSize) {
      // Delete the least recently used entry
      const oldestKey = this.cache.keys().next().value;
      this.cache.delete(oldestKey);
    }
    this.cache.set(key, value);
  }
}

Storage Space Monitoring

Dynamically adjust cache strategies based on available space:

function getStorageQuota() {
  return new Promise((resolve) => {
    if ('storage' in navigator && 'estimate' in navigator.storage) {
      navigator.storage.estimate().then(estimate => {
        const used = estimate.usage;
        const quota = estimate.quota;
        resolve({ used, quota, ratio: used / quota });
      });
    } else {
      // estimate() unavailable: crude probe of localStorage writability (note the different result shape)
      try {
        localStorage.setItem('test', 'x');
        localStorage.removeItem('test');
        resolve({ status: 'supported' });
      } catch (e) {
        resolve({ status: 'full' });
      }
    }
  });
}
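
Measuring is only half of the promise in the heading above; the other half is reacting to the numbers. One possible policy, where the 0.8 threshold and the 'dynamic-v1' cache name are illustrative:

// Trim caches once usage crosses a budget threshold
async function enforceStorageBudget() {
  const { ratio } = await getStorageQuota();
  if (typeof ratio !== 'number') return; // Fallback path returned no usable numbers

  if (ratio > 0.8) {
    // Drop every Service Worker cache except the one currently in use
    const keys = await caches.keys();
    await Promise.all(keys.filter(k => k !== 'dynamic-v1').map(k => caches.delete(k)));
  }
}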

Exception Handling and Fallback Solutions

The caching system requires robust error handling mechanisms:

async function robustCacheFetch(url, fallbackUrl) {
  try {
    // First attempt to fetch from cache
    const cachedResponse = await caches.match(url);
    if (cachedResponse) return cachedResponse.json();
    
    // Network request
    const networkResponse = await fetch(url);
    
    // Cache the new response (only successful ones; caching error responses would poison later reads)
    if (networkResponse.ok) {
      const cache = await caches.open('dynamic-v1');
      await cache.put(url, networkResponse.clone());
    }
    
    return networkResponse.json();
  } catch (error) {
    console.error('Fetch failed:', error);
    
    // Fallback solution 1: Try backup URL
    if (fallbackUrl) {
      return fetch(fallbackUrl).then(r => r.json());
    }
    
    // Fallback solution 2: Return stale cache
    const staleResponse = await caches.match(url);
    if (staleResponse) {
      return staleResponse.json();
    }
    
    // Final fallback: Return preset default value
    return { status: 'offline', data: [] };
  }
}

Performance Monitoring and Tuning

Implement cache hit rate statistics to help optimize strategies:

const cacheStats = {
  hits: 0,
  misses: 0,
  get hitRate() {
    return this.hits / (this.hits + this.misses) || 0;
  }
};

function instrumentedFetch(url) {
  const start = performance.now();
  return fetch(url).then(response => {
    // Elapsed time for the request; cache hits typically come back within a few milliseconds
    const timing = performance.now() - start;
    
    return {
      response,
      timing,
      markHit() {
        cacheStats.hits++;
      },
      markMiss() {
        cacheStats.misses++;
      }
    };
  });
}

// Usage example (fromCache is a placeholder; one possible implementation is sketched below)
async function getData(url) {
  const { response, timing, markHit, markMiss } = await instrumentedFetch(url);
  if (fromCache(response)) {
    markHit();
  } else {
    markMiss();
  }
  return response;
}
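
The usage example relies on a `fromCache` helper that the snippet does not define. One possible heuristic uses the Resource Timing API: `transferSize` is 0 when a response was served from the HTTP cache, but it is also 0 for cross-origin responses without Timing-Allow-Origin, so treat the result as approximate:

// Heuristic cache-hit detector based on the Resource Timing API
function fromCache(response) {
  const entries = performance.getEntriesByName(response.url, 'resource');
  const entry = entries[entries.length - 1];
  return !!entry && entry.transferSize === 0;
}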

Browser Compatibility Handling

Implement a unified cache interface for different browsers:

const cacheWrapper = {
  set(key, value) {
    try {
      if (window.localStorage) {
        localStorage.setItem(key, JSON.stringify(value));
        return true;
      }
    } catch (e) {
      console.warn('LocalStorage full, falling back to memory');
    }
    
    // Fallback to memory cache
    if (!this.memoryCache) this.memoryCache = {};
    this.memoryCache[key] = value;
    return false;
  },
  
  get(key) {
    // First check memory cache
    if (this.memoryCache && key in this.memoryCache) {
      return this.memoryCache[key];
    }
    
    try {
      if (window.localStorage) {
        const item = localStorage.getItem(key);
        return item ? JSON.parse(item) : null;
      }
    } catch (e) {
      console.error('LocalStorage access error');
    }
    return null;
  }
};

