Log Collection and Analysis Solution
The Necessity of Log Collection
Logs are critical data generated during system operation, recording application behavior, errors, and performance metrics. Effective log collection and analysis help developers locate issues quickly and optimize system performance. In Koa2 applications, logs typically include request logs, error logs, and custom business logs. Without systematic log management, troubleshooting becomes difficult and performance optimization is reduced to guesswork.
Koa2 Log Middleware Options
The Koa2 ecosystem offers several mature log middleware options:
- koa-logger: Lightweight request logging middleware
const logger = require('koa-logger')
app.use(logger())
Example output:
GET /api/users 200 12ms
- koa-morgan: Feature-rich HTTP request logger
const morgan = require('koa-morgan')
app.use(morgan('combined'))
- winston: Versatile logging library supporting multiple transport methods
const winston = require('winston')

const logger = winston.createLogger({
  transports: [
    new winston.transports.Console(),
    new winston.transports.File({ filename: 'combined.log' })
  ]
})

app.use(async (ctx, next) => {
  ctx.logger = logger
  await next()
})
Custom Log Middleware Implementation
For specific requirements, custom log middleware can be developed:
const fs = require('fs')

async function logMiddleware(ctx, next) {
  const start = Date.now()
  try {
    await next()
    const ms = Date.now() - start
    console.log(`${ctx.method} ${ctx.url} - ${ms}ms`)
    // Append to file (synchronous write shown for brevity; see the
    // performance section below for async alternatives)
    fs.appendFileSync(
      'access.log',
      `${new Date().toISOString()} ${ctx.ip} ${ctx.method} ${ctx.url} ${ctx.status} ${ms}ms\n`
    )
  } catch (err) {
    console.error(`[ERROR] ${err.stack}`)
    throw err
  }
}

app.use(logMiddleware)
Structured Log Processing
Raw text logs are not conducive to analysis; structured formats should be used:
const { createLogger, format, transports } = require('winston')

const logger = createLogger({
  format: format.combine(
    format.timestamp(),
    format.json()
  ),
  transports: [new transports.Console()]
})

// Usage example
logger.info('User login', {
  userId: 123,
  ip: '192.168.1.1',
  device: 'iOS'
})
Output:
{
  "level": "info",
  "message": "User login",
  "userId": 123,
  "ip": "192.168.1.1",
  "device": "iOS",
  "timestamp": "2023-05-20T08:30:15.123Z"
}
Log Collection System Architecture
A complete log collection system typically consists of the following components:
- Log Collection Layer: Log generation within applications
- Log Transport Layer: Filebeat/Logstash/Fluentd
- Storage Layer: Elasticsearch/MongoDB
- Analysis Layer: Kibana/Grafana
Example Filebeat configuration:
filebeat.inputs:
  - type: log
    paths:
      - /var/log/koa/*.log

output.elasticsearch:
  hosts: ["elasticsearch:9200"]
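If logs should be parsed or enriched before reaching Elasticsearch, the transport layer can route them through Logstash instead. A minimal pipeline sketch, assuming Filebeat ships JSON-formatted log lines to port 5044 (ports, hosts, and filters are illustrative):

```conf
input {
  beats {
    port => 5044
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
  }
}
```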
Special Handling for Error Logs
Error logs require additional attention and should be collected separately:
app.on('error', (err, ctx) => {
  logger.error('Server error', {
    error: err.message,
    stack: err.stack,
    // ctx may be undefined for errors raised outside a request context
    request: ctx && ctx.request
  })
  // Send to an error monitoring platform such as Sentry
  if (process.env.NODE_ENV === 'production') {
    Sentry.captureException(err)
  }
})
Log Level Strategy
Reasonable log levels aid in troubleshooting:
| Level | Usage Scenario | Example |
| --- | --- | --- |
| error | System errors | Database connection failure |
| warn | Potential issues | API request timeout |
| info | Operational status | User login success |
| debug | Debugging info | SQL query statements |
| verbose | Detailed tracing | Request header details |
Implementation example:
logger.log('error', 'DB connection failed', {
  error: err.message,
  retryCount: 3
})
Log Sampling and Degradation
Log sampling should be considered in high-traffic scenarios:
function shouldLog() {
  return Math.random() < 0.1 // 10% sampling rate
}

app.use(async (ctx, next) => {
  if (shouldLog()) {
    logger.info(`Request: ${ctx.method} ${ctx.path}`)
  }
  await next()
})
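The section title also mentions degradation, which the sampling snippet alone does not cover. One common approach is to raise the minimum log level when the process is under load, so cheap levels (debug/verbose) are dropped first. A minimal sketch, assuming load is measured as event-loop lag in milliseconds (the thresholds and helper names here are illustrative, not from any library):

```javascript
// Numeric severity ranks: lower number = more severe
const LEVELS = { error: 0, warn: 1, info: 2, debug: 3, verbose: 4 }

// Hypothetical policy: map current event-loop lag to a minimum level
function minLevelForLoad(eventLoopLagMs) {
  if (eventLoopLagMs > 200) return 'error'   // severe load: keep errors only
  if (eventLoopLagMs > 50) return 'warn'     // degraded: drop info and below
  return 'verbose'                           // healthy: log everything
}

// Emit only entries at or above the current minimum severity
function shouldEmit(level, eventLoopLagMs) {
  return LEVELS[level] <= LEVELS[minLevelForLoad(eventLoopLagMs)]
}
```

In practice the lag value would come from something like a `perf_hooks` event-loop delay monitor, and `shouldEmit` would wrap the logger call in the middleware above.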
Sensitive Information Filtering
Sensitive data in logs must be filtered:
const maskFields = ['password', 'token', 'creditCard']

function sanitize(obj) {
  const result = { ...obj }
  maskFields.forEach(field => {
    if (result[field]) {
      result[field] = '***'
    }
  })
  return result
}

logger.info('User created', sanitize(user))
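The `sanitize` function above only masks top-level fields; sensitive values nested inside sub-objects or arrays would slip through. A recursive variant is a natural extension (a sketch, not a hardened implementation; it does not handle cycles):

```javascript
const maskFields = new Set(['password', 'token', 'creditCard'])

// Recursively mask sensitive fields in nested objects and arrays
function deepSanitize(value) {
  if (Array.isArray(value)) return value.map(deepSanitize)
  if (value && typeof value === 'object') {
    const result = {}
    for (const [key, val] of Object.entries(value)) {
      result[key] = maskFields.has(key) ? '***' : deepSanitize(val)
    }
    return result
  }
  return value
}
```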
Log Performance Optimization
High-volume log writing can hurt performance; recommendations:
- Use asynchronous writing
const fs = require('fs').promises

async function writeLog(message) {
  try {
    await fs.appendFile('app.log', message + '\n')
  } catch (err) {
    console.error('Log write failed', err)
  }
}
- Batch writing
const fs = require('fs')

let logBuffer = []
// Elsewhere, push lines into the buffer instead of writing immediately:
// logBuffer.push(line)

setInterval(() => {
  if (logBuffer.length > 0) {
    fs.appendFileSync('app.log', logBuffer.join('\n') + '\n')
    logBuffer = []
  }
}, 5000)
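The two triggers can also be combined: flush when the buffer reaches a size threshold, and on a timer or at shutdown for whatever remains. A sketch with an injected write function so the flush target stays pluggable (the class and parameter names are illustrative):

```javascript
// Buffers log lines and flushes when maxSize is reached or flush() is called
class BufferedLogger {
  constructor(write, maxSize = 100) {
    this.write = write      // injected sink, e.g. a wrapper around fs.appendFile
    this.maxSize = maxSize
    this.buffer = []
  }

  log(line) {
    this.buffer.push(line)
    if (this.buffer.length >= this.maxSize) this.flush()
  }

  flush() {
    if (this.buffer.length === 0) return
    this.write(this.buffer.join('\n') + '\n')
    this.buffer = []
  }
}
```

A timer calling `flush()` every few seconds, plus a `flush()` in a shutdown hook, covers the cases where traffic is too low to hit the size threshold.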
Log Analysis Practices
Collected logs can be analyzed in the following ways:
- ELK Stack Query:
{
  "query": {
    "bool": {
      "must": [
        { "match": { "status": 500 } },
        { "range": { "timestamp": { "gte": "now-1h" } } }
      ]
    }
  }
}
- Kibana Visualization:
- Create pie charts for request status code distribution
- Draw line charts for response time trends
- Set error rate threshold alerts
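The status-code distribution chart above corresponds to a terms aggregation in Elasticsearch. A sketch, assuming log documents carry a keyword or numeric `status` field:

```json
{
  "size": 0,
  "aggs": {
    "status_codes": {
      "terms": { "field": "status" }
    }
  }
}
```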
Log and Monitoring System Integration
Integrate logging systems with monitoring platforms:
const { Counter } = require('prom-client')
const httpRequestsTotal = new Counter({
  name: 'http_requests_total',
  help: 'Total HTTP requests',
  // Note: raw paths can explode label cardinality; prefer route patterns
  labelNames: ['method', 'path', 'status']
})

app.use(async (ctx, next) => {
  try {
    await next()
  } finally {
    // Increment after next() so ctx.status reflects the final response status
    httpRequestsTotal.inc({
      method: ctx.method,
      path: ctx.path,
      status: ctx.status
    })
  }
})
Log Storage Strategy
Develop storage strategies based on business needs:
- Hot Storage: Recent 7-day logs for fast queries
- Warm Storage: 7-30 day logs, compressed storage
- Cold Storage: 30+ day logs, archived to object storage
Example lifecycle policy:
{
  "policy": {
    "phases": {
      "hot": {
        "min_age": "0d",
        "actions": {
          "rollover": {
            "max_size": "50gb",
            "max_age": "7d"
          }
        }
      },
      "delete": {
        "min_age": "90d",
        "actions": {
          "delete": {}
        }
      }
    }
  }
}
Distributed Log Tracing
Microservices architectures require distributed tracing:
const { v4: uuidv4 } = require('uuid')

app.use(async (ctx, next) => {
  const start = Date.now()
  // Reuse an incoming trace ID if present, otherwise generate one
  ctx.traceId = ctx.get('X-Trace-Id') || uuidv4()
  ctx.set('X-Trace-Id', ctx.traceId)
  logger.info('Request started', {
    traceId: ctx.traceId,
    path: ctx.path
  })
  await next()
  logger.info('Request completed', {
    traceId: ctx.traceId,
    status: ctx.status,
    duration: Date.now() - start
  })
})
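The trace ID only links services together if it is forwarded on outgoing calls. A small helper (the name `tracedHeaders` is illustrative) that copies the current trace ID onto the headers of a downstream request:

```javascript
// Merge the current request's trace ID into outgoing request headers,
// so downstream services log under the same X-Trace-Id
function tracedHeaders(ctx, headers = {}) {
  return { ...headers, 'X-Trace-Id': ctx.traceId }
}
```

Usage would look like `fetch(url, { headers: tracedHeaders(ctx, { Accept: 'application/json' }) })`, keeping every hop's logs searchable by one trace ID.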
Log Compliance and Auditing
Log processing to meet compliance requirements:
- Preserve original logs without modification
- Record operator identity information
- Implement log access control
Audit log example:
function logAudit(action, user, details) {
  logger.info('AUDIT_LOG', {
    action,
    userId: user.id,
    ip: user.ip,
    timestamp: new Date().toISOString(),
    ...details
  })
}

// Usage
logAudit('USER_DELETE', adminUser, {
  targetUserId: 123,
  reason: 'violation'
})