
Load testing

Author: Chuan Chen · Category: Node.js

Basic Concepts of Load Testing

Load testing evaluates a system's performance under specific loads. By simulating real user behavior, it observes metrics such as system response time, throughput, and resource utilization. Node.js, as an asynchronous event-driven platform, requires special attention to event loop latency and memory leaks during load testing.

const loadTest = require('loadtest');

const options = {
    url: 'http://localhost:3000/api',
    maxRequests: 1000,       // Total number of requests to send
    concurrency: 50,         // Number of concurrent clients
    method: 'POST',
    body: {
        userId: 'test123',
        data: 'sample payload'
    },
    contentType: 'application/json'
};

loadTest.loadTest(options, (error, results) => {
    if (error) return console.error('Test failed:', error);
    console.log('Test results:', results);
});

Node.js Load Testing Tool Selection

Artillery

Artillery is a professional load testing tool that supports HTTP and WebSocket protocols. Its YAML configuration files are easy to write and maintain:

config:
  target: "http://api.example.com"
  phases:
    - duration: 60
      arrivalRate: 50
      name: "Warm up phase"
    - duration: 120
      arrivalRate: 100
      rampTo: 200
      name: "Stress test"
scenarios:
  - flow:
    - post:
        url: "/login"
        json:
          username: "user"
          password: "pass"
    - get:
        url: "/profile"

Autocannon

Autocannon is a high-performance load testing tool written in pure JavaScript, particularly suitable for testing Node.js services:

const autocannon = require('autocannon');

autocannon({
    url: 'http://localhost:3000',
    connections: 100, // Concurrent connections
    pipelining: 10,   // Pipelined requests per connection
    duration: 30      // Test duration (seconds)
}, console.log);

Test Metric Analysis

Key Performance Metrics

  1. Request Latency: P95 and P99 latencies reflect real user experience better than averages do.
  2. Throughput: RPS (requests per second) is a core metric.
  3. Error Rate: Proportion of non-2xx/3xx HTTP responses.
  4. Resource Usage: CPU, memory, and event loop latency.

// Using perf_hooks to monitor event loop delay
const { monitorEventLoopDelay } = require('perf_hooks');

const histogram = monitorEventLoopDelay({ resolution: 10 });
histogram.enable();

setTimeout(() => {
    histogram.disable();
    // percentile() returns nanoseconds; convert to milliseconds
    console.log(`P50 latency: ${(histogram.percentile(50) / 1e6).toFixed(2)} ms`);
    console.log(`P99 latency: ${(histogram.percentile(99) / 1e6).toFixed(2)} ms`);
}, 10000);
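
These percentiles can also be read straight from the load tester's output. A minimal sketch, assuming the autocannon setup shown earlier and the field names of its result object (latency values in ms, requests.average in requests/sec, non2xx and errors counters):

const autocannon = require('autocannon');

async function runAndReport() {
    const result = await autocannon({
        url: 'http://localhost:3000',
        connections: 100,
        duration: 30
    });

    // Latency percentiles and throughput
    console.log('P50 latency:', result.latency.p50, 'ms');
    console.log('P99 latency:', result.latency.p99, 'ms');
    console.log('Average RPS:', result.requests.average);

    // Error rate: non-2xx responses plus socket errors over all completed requests
    const errorRate = (result.non2xx + result.errors) / result.requests.total;
    console.log('Error rate:', (errorRate * 100).toFixed(2), '%');
}

runAndReport();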

Test Scenario Design

Progressive Load Testing

Increase the load gradually from low to high and observe how the system's behavior changes; a runner for these stages is sketched after the block:

const stages = [
    { duration: '30s', target: 50 },  // Warm-up phase
    { duration: '1m', target: 100 },  // Normal load
    { duration: '2m', target: 250 },  // Stress test
    { duration: '30s', target: 50 }   // Recovery phase
];
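
The stages array above is just data; a small runner has to step through it. A minimal sketch using autocannon, where mapping each stage's target to a connection count and the parseDuration helper are illustrative assumptions:

const autocannon = require('autocannon');

// Converts '30s' / '1m' style strings into seconds (assumed helper)
function parseDuration(str) {
    const value = parseInt(str, 10);
    return str.endsWith('m') ? value * 60 : value;
}

// Runs the stages back to back and logs P99 latency per stage
async function runStages(url, stages) {
    for (const stage of stages) {
        const result = await autocannon({
            url,
            connections: stage.target,              // illustrative mapping of target load
            duration: parseDuration(stage.duration)
        });
        console.log(`${stage.duration} @ ${stage.target} -> P99 ${result.latency.p99} ms`);
    }
}

runStages('http://localhost:3000', stages);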

Spike Testing

Simulate sudden traffic surges to test system resilience:

config:
  phases:
    - duration: 10
      arrivalRate: 5
    - duration: 1
      arrivalRate: 500  # Instant peak
    - duration: 30
      arrivalRate: 50

Node.js-Specific Optimization Points

Connection Pool Management

Database connection pool configuration directly impacts system load capacity:

const mysql = require('mysql');

const pool = mysql.createPool({
    // host/user/password/database options omitted here
    connectionLimit: 50,          // Maximum simultaneous connections (critical parameter)
    queueLimit: 1000,             // Maximum queued connection requests
    acquireTimeout: 30000,        // Connection acquisition timeout (ms)
    waitForConnections: true      // Queue requests when no connection is free
});
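
To see whether the pool itself becomes the bottleneck under load, connection acquisition time can be sampled. A minimal sketch around the pool above; the 200 ms warning threshold is an arbitrary assumption:

// Measures how long requests wait for a pooled connection
function timedQuery(sql, params, callback) {
    const start = Date.now();
    pool.getConnection((err, connection) => {
        const waitMs = Date.now() - start;
        if (waitMs > 200) console.warn(`Pool wait ${waitMs}ms - consider raising connectionLimit`);
        if (err) return callback(err);
        connection.query(sql, params, (queryErr, rows) => {
            connection.release(); // Always return the connection to the pool
            callback(queryErr, rows);
        });
    });
}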

Cluster Mode Testing

Test multi-process performance using the Node.js cluster module:

const cluster = require('cluster');
const numCPUs = require('os').cpus().length;

if (cluster.isMaster) {
    for (let i = 0; i < numCPUs; i++) {
        cluster.fork();
    }
} else {
    require('./server'); // Start the application
}

Common Issue Troubleshooting

Memory Leak Detection

Use heapdump and Chrome DevTools to analyze memory issues:

const heapdump = require('heapdump');

let dumped = false;
setInterval(() => {
    // Write a single snapshot once heap usage crosses 500 MB
    if (!dumped && process.memoryUsage().heapUsed > 500 * 1024 * 1024) {
        heapdump.writeSnapshot(`heap-${Date.now()}.heapsnapshot`);
        dumped = true;
    }
}, 5000);
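
heapdump is a native add-on; on Node.js 11.13+ the built-in v8 module can write the same .heapsnapshot format without an extra dependency. A minimal sketch, using SIGUSR2 as an arbitrary trigger:

const v8 = require('v8');

process.on('SIGUSR2', () => {
    // Writes a .heapsnapshot file to the working directory;
    // open it in Chrome DevTools > Memory and compare successive snapshots
    const file = v8.writeHeapSnapshot();
    console.log('Heap snapshot written to', file);
});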

Event Loop Blocking

Long synchronous operations can block the event loop:

// Bad example: synchronous hashing
const crypto = require('crypto');
function hashPassword(password) {
    // CPU-bound work runs on the main thread and blocks the event loop
    return crypto.createHash('sha256').update(password).digest('hex');
}

// Good example: asynchronous key derivation runs in libuv's thread pool
function hashPasswordAsync(password) {
    return new Promise((resolve, reject) => {
        // In real code, use a unique random salt per password
        crypto.pbkdf2(password, 'salt', 100000, 64, 'sha256', (err, derivedKey) => {
            if (err) return reject(err);
            resolve(derivedKey.toString('hex'));
        });
    });
}

Load Testing in Continuous Integration

Add load testing to CI/CD pipelines:

# GitHub Actions example
name: Load Test
on: [push]
jobs:
  load-test:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - run: npm install
    - run: npm install -g artillery
    - run: artillery run test.yml
      env:
        CI: true
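
Unless thresholds are configured in the Artillery file, the run step can pass even when performance regresses. One option is a small gate script that the pipeline runs once the server is up; a minimal sketch using autocannon, where the P99 budget, error-rate cap, and localhost target are assumptions:

// ci-loadtest.js - fail the build if P99 latency or error rate exceed a budget
const autocannon = require('autocannon');

const P99_BUDGET_MS = 500;   // assumed latency budget
const MAX_ERROR_RATE = 0.01; // assumed 1% error budget

async function main() {
    const result = await autocannon({ url: 'http://localhost:3000', connections: 50, duration: 30 });
    const errorRate = (result.non2xx + result.errors) / result.requests.total;

    console.log(`P99: ${result.latency.p99} ms, error rate: ${(errorRate * 100).toFixed(2)}%`);

    if (result.latency.p99 > P99_BUDGET_MS || errorRate > MAX_ERROR_RATE) {
        console.error('Load test budget exceeded, failing the build');
        process.exit(1); // non-zero exit fails the CI job
    }
}

main();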

Cloud-Based Load Testing

Use services like AWS Distributed Load Testing for large-scale testing:

const { LambdaClient, InvokeCommand } = require('@aws-sdk/client-lambda');

const client = new LambdaClient({ region: 'us-east-1' });

async function triggerLoadTest() {
    const command = new InvokeCommand({
        FunctionName: 'load-test-worker',
        InvocationType: 'Event',
        Payload: JSON.stringify({
            targetUrl: 'https://api.example.com',
            duration: 300,
            rate: 1000
        })
    });
    await client.send(command);
}
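
The load-test-worker function is assumed to exist; its handler could be as simple as running autocannon against the payload it receives. A rough sketch of such a handler, where mapping rate to connections is an assumption:

// Hypothetical handler for the 'load-test-worker' Lambda invoked above
const autocannon = require('autocannon');

exports.handler = async (event) => {
    const { targetUrl, duration, rate } = event;

    const result = await autocannon({
        url: targetUrl,
        duration,
        connections: rate // illustrative mapping of the requested rate
    });

    // With Event (fire-and-forget) invocation the return value is discarded,
    // so log the headline numbers; they land in CloudWatch Logs for aggregation
    console.log(JSON.stringify({
        p99: result.latency.p99,
        rps: result.requests.average,
        non2xx: result.non2xx
    }));
};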

Test Result Visualization

Build monitoring dashboards with Grafana and Prometheus:

const client = require('prom-client');

// Define metrics
const httpRequestDurationMs = new client.Histogram({
    name: 'http_request_duration_ms',
    help: 'HTTP request duration (ms)',
    labelNames: ['method', 'route', 'code'],
    buckets: [50, 100, 200, 300, 400, 500, 1000]
});

// Record metrics in middleware
app.use((req, res, next) => {
    const start = Date.now();
    res.on('finish', () => {
        // Histogram.startTimer() observes seconds, so time manually to record milliseconds
        httpRequestDurationMs.observe({
            method: req.method,
            route: req.route ? req.route.path : req.path, // req.route is unset for unmatched requests
            code: res.statusCode
        }, Date.now() - start);
    });
    next();
});
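
For Prometheus to scrape these numbers, the application also has to expose the registry over HTTP. A minimal sketch using prom-client's default registry (register.metrics() is async in recent prom-client versions):

// Expose collected metrics for Prometheus to scrape
app.get('/metrics', async (req, res) => {
    res.set('Content-Type', client.register.contentType);
    res.end(await client.register.metrics());
});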

Real-World Case: E-commerce API Testing

Test configuration simulating an e-commerce flash sale scenario:

config:
  target: "https://api.shop.com"
  phases:
    - duration: 300
      arrivalRate: 50
      name: "Normal traffic"
    - duration: 600
      arrivalRate: 200
      rampTo: 1000
      name: "Flash sale"
scenarios:
  - name: "Product browsing"
    flow:
      - get:
          url: "/products"
      - think: 3
      - get:
          url: "/products/{{ $randomNumber(1,1000) }}"
  - name: "Checkout process"
    flow:
      - post:
          url: "/cart"
          json:
            productId: "{{ $randomNumber(1,1000) }}"
            quantity: 1
      - think: 1
      - post:
          url: "/checkout"
          json:
            userId: "user-{{ $randomNumber(1,10000) }}"

Node.js Performance Tuning

Targeted optimization based on test results:

// HTTP server optimization
const http = require('http');
const server = http.createServer(app);
server.keepAliveTimeout = 65000;  // Keep-alive timeout, slightly above typical load balancer idle timeouts
server.headersTimeout = 70000;    // Headers timeout, must exceed keepAliveTimeout

// Use compression middleware
app.use(require('compression')({
    threshold: 1024,  // Compress only if >1KB
    filter: (req) => !req.headers['x-no-compression']
}));

// Connection reuse for outbound requests
const axios = require('axios');
const httpAgent = new http.Agent({
    keepAlive: true,
    maxSockets: 100,
    maxFreeSockets: 10,
    timeout: 60000
});
axios.defaults.httpAgent = httpAgent;
