Batch Operations (bulkWrite)
MongoDB's bulkWrite method executes multiple write operations in a single network request, combining inserts, updates, replacements, and deletes in one batch. Compared to issuing the operations one by one, it significantly reduces network round trips and server load, which makes it well suited to processing large batches of data. Note that the batch as a whole is not atomic: each operation is applied individually (see the transaction example later for all-or-nothing semantics).
Core Concepts and Syntax
bulkWrite accepts an array of write operation objects, each representing one independent operation. The basic syntax is as follows:
db.collection.bulkWrite(
  [ <operation 1>, <operation 2>, ... ],
  {
    writeConcern: <document>,
    ordered: <boolean>
  }
)
Key parameters:
- operations: an array containing the write operations to execute
- ordered: defaults to true, meaning operations are executed sequentially and execution stops at the first error; false continues executing the remaining operations
- writeConcern: optional, sets the acknowledgment level for the writes
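To make the ordered flag concrete, here is a minimal sketch (the demo collection and _id values are made up for illustration): with ordered: true the duplicate-key error on the second insert stops the batch, while with ordered: false the remaining operation still runs.

const ops = [
  { insertOne: { document: { _id: 1, name: "first" } } },
  { insertOne: { document: { _id: 1, name: "duplicate key" } } }, // fails
  { insertOne: { document: { _id: 2, name: "second" } } }
];

try {
  db.demo.bulkWrite(ops, { ordered: true });   // stops at the error: only _id 1 is inserted
} catch (e) { /* duplicate key error reported here */ }

try {
  db.demo.bulkWrite(ops, { ordered: false });  // keeps going: _id 2 is still inserted
} catch (e) { /* the error is still reported, but the remaining operations ran */ }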
Supported Operation Types
Insert Document (insertOne)
{
  insertOne: {
    "document": { name: "Product A", price: 99 }
  }
}
Update Document (updateOne/updateMany)
Single update example:
{
  updateOne: {
    "filter": { _id: ObjectId("5f8792b...") },
    "update": { $set: { stock: 45 } },
    "upsert": true // Optional, inserts if document doesn't exist
  }
}
Batch update example:
{
  updateMany: {
    "filter": { category: "Electronics" },
    "update": { $inc: { viewCount: 1 } }
  }
}
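When upsert: true actually inserts a document, the result reports it separately from the matched/modified counts. A minimal sketch of reading those fields (the sku value is made up and assumed not to exist yet):

const result = db.products.bulkWrite([
  {
    updateOne: {
      filter: { sku: "XJ-999" },          // assumed not to exist yet
      update: { $set: { stock: 10 } },
      upsert: true
    }
  }
]);
// matchedCount stays 0 for the upserted document;
// upsertedCount / upsertedIds record the insert instead
printjson({ matched: result.matchedCount, upserted: result.upsertedCount });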
Replace Document (replaceOne)
{
  replaceOne: {
    "filter": { sku: "XJ-208" },
    "replacement": {
      sku: "XJ-208",
      name: "New Controller",
      inventory: 120
    }
  }
}
Delete Document (deleteOne/deleteMany)
// Delete single document
{
  deleteOne: { "filter": { status: "Expired" } }
}

// Batch delete
{
  deleteMany: { "filter": { createDate: { $lt: ISODate("2023-01-01") } } }
}
Practical Application Examples
E-commerce Inventory Management Scenario
const operations = [
  // Insert new product
  {
    insertOne: {
      document: {
        sku: "P-1002",
        name: "Wireless Headphones",
        stock: 200,
        price: 299
      }
    }
  },
  // Update existing product stock
  {
    updateOne: {
      filter: { sku: "P-1001" },
      update: { $inc: { stock: 50 } }
    }
  },
  // Mark out-of-stock products
  {
    updateMany: {
      filter: { stock: { $lte: 0 } },
      update: { $set: { status: "Out of Stock" } }
    }
  },
  // Delete expired promotions
  {
    deleteMany: {
      filter: {
        promoEnd: { $lt: new Date() }
      }
    }
  }
];

db.products.bulkWrite(operations, { ordered: false });
User Data Migration Case
const userUpdates = [
  // Migrate legacy system IDs.
  // Referencing the existing $legacyId value requires the aggregation
  // pipeline form of update (MongoDB 4.2+); a plain $set would store
  // the literal string "$legacyId".
  {
    updateMany: {
      filter: { legacyId: { $exists: true } },
      update: [
        { $set: { "meta.legacyId": "$legacyId", migrated: true } },
        { $unset: "legacyId" }
      ]
    }
  },
  // Standardize the phone number format (11 digits -> 3-4-4 with dashes),
  // again using a pipeline update so the new value can be computed
  // from the existing field value
  {
    updateMany: {
      filter: { phone: /^(\d{3})(\d{4})(\d{4})$/ },
      update: [
        {
          $set: {
            phone: {
              $concat: [
                { $substrCP: ["$phone", 0, 3] }, "-",
                { $substrCP: ["$phone", 3, 4] }, "-",
                { $substrCP: ["$phone", 7, 4] }
              ]
            }
          }
        }
      ]
    }
  }
];

db.users.bulkWrite(userUpdates);
Performance Optimization Strategies
Batch Size Control
// Process 100,000 records in batches
const batchSize = 1000;
for (let i = 0; i < 100000; i += batchSize) {
  const batchOps = generateOperations(i, batchSize);
  db.collection.bulkWrite(batchOps, { ordered: false });
}

function generateOperations(offset, limit) {
  // Generate specific operation logic
  return [...];
}
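As a concrete illustration, one possible shape for such a generator is sketched below, assuming a hypothetical in-memory array docs of { _id, newValue } records (the original leaves this part elided):

// Hypothetical generator: turns a slice of `docs` into updateOne operations
function generateOperations(offset, limit) {
  return docs.slice(offset, offset + limit).map(doc => ({
    updateOne: {
      filter: { _id: doc._id },
      update: { $set: { value: doc.newValue } }
    }
  }));
}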
Index Optimization Recommendations
- Ensure the fields used in filter conditions are indexed (see the example below)
- For update operations, filters on _id are the most efficient
- Avoid full collection scans with updateMany / deleteMany
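For instance, the earlier examples filter on sku and category; creating indexes on those fields (field names taken from those examples) keeps the bulk updates from scanning the whole collection:

// Support the { sku: ... } and { category: ... } filters used above
db.products.createIndex({ sku: 1 });
db.products.createIndex({ category: 1 });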
Write Concern Level Adjustment
// Reduce write acknowledgment requirements to improve throughput
db.collection.bulkWrite(ops, {
  writeConcern: { w: 1, j: false }
});
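The trade-off runs the other way as well: for batches that must survive a failover, a stronger write concern can be requested at the cost of latency, for example:

// Wait for replication to a majority of nodes before acknowledging
db.collection.bulkWrite(ops, {
  writeConcern: { w: "majority" }
});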
Error Handling Patterns
Ordered Operation Error Capture
try {
  const result = db.orders.bulkWrite([
    { insertOne: { ... } },
    { updateOne: { ... } } // Assume this fails
  ], { ordered: true });
} catch (e) {
  // Each entry in writeErrors records the index of the operation that failed
  console.error("Failed operation index:", e.writeErrors[0].index);
  console.error("Error details:", e.writeErrors[0].errmsg);
}
Unordered Operation Result Analysis
try {
  const result = db.logs.bulkWrite([...], { ordered: false });
  console.log("Successful inserts:", result.insertedCount);
  console.log("Successful updates:", result.modifiedCount);
} catch (e) {
  // With ordered: false the remaining operations still execute,
  // but any write errors surface on the thrown bulk-write error
  e.writeErrors.forEach(err => {
    console.log(`Operation ${err.index} failed:`, err.errmsg);
  });
}
Special Scenario Handling
Mixed Operation Atomicity
A bulkWrite batch is not atomic as a whole: each operation is applied independently. When a group of writes must succeed or fail together, wrap the bulkWrite in a transaction, as in this funds transfer example:
// Debit one account (only if it has sufficient balance) and credit another
const transferOps = [
  {
    updateOne: {
      filter: { _id: acc1Id, balance: { $gte: 100 } },
      update: { $inc: { balance: -100 } }
    }
  },
  {
    updateOne: {
      filter: { _id: acc2Id },
      update: { $inc: { balance: 100 } }
    }
  }
];
// Execute inside a transaction session so both updates commit or roll back together
const session = db.getMongo().startSession();
session.startTransaction();
try {
  session.getDatabase(db.getName()).accounts.bulkWrite(transferOps);
  session.commitTransaction();
} catch (e) {
  session.abortTransaction();
}
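A refinement worth noting: because the debit filter requires balance: { $gte: 100 }, an insufficient balance shows up as a matchedCount below 2 on the bulkWrite result, which can be used to abort instead of committing. A sketch under that assumption, reusing the session and transferOps from above:

const accounts = session.getDatabase(db.getName()).accounts;
session.startTransaction();
try {
  const r = accounts.bulkWrite(transferOps);
  if (r.matchedCount < 2) {
    // The debit side did not match (insufficient funds): roll back
    session.abortTransaction();
  } else {
    session.commitTransaction();
  }
} catch (e) {
  session.abortTransaction();
}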
Array Batch Update Techniques
// Add elements to array for all matching documents
{
  updateMany: {
    filter: { department: "R&D" },
    update: {
      $push: {
        projects: {
          $each: ["New Gateway System", "Data Platform"],
          $position: 0
        }
      }
    }
  }
}
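The reverse operation works the same way; a sketch using $pull to remove an element from the same array (the project name is chosen purely for illustration):

// Remove a finished project from every matching document's array
{
  updateMany: {
    filter: { department: "R&D" },
    update: { $pull: { projects: "New Gateway System" } }
  }
}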
Comparison with Alternatives
Comparison with insertMany
// insertMany only handles inserts
db.collection.insertMany([doc1, doc2, doc3]);

// bulkWrite enables mixed operations
db.collection.bulkWrite([
  { insertOne: { document: doc1 } },
  { updateOne: { ... } },
  { deleteOne: { ... } }
]);
Comparison with Looped Single Operations
// Inefficient approach (multiple network roundtrips)
for (const doc of docs) {
  db.collection.updateOne(
    { _id: doc._id },
    { $set: { value: doc.newValue } }
  );
}

// Efficient approach (single network request)
const ops = docs.map(doc => ({
  updateOne: {
    filter: { _id: doc._id },
    update: { $set: { value: doc.newValue } }
  }
}));
db.collection.bulkWrite(ops);
Monitoring and Debugging
Obtaining Operation Statistics
const result = db.inventory.bulkWrite([...]);
console.log(JSON.stringify(result, null, 2));

/* Sample output (field names as returned by db.collection.bulkWrite):
{
  "acknowledged": true,
  "insertedCount": 3,
  "matchedCount": 5,
  "modifiedCount": 5,
  "deletedCount": 2,
  "upsertedCount": 0,
  "insertedIds": { ... },
  "upsertedIds": { }
}
Using explain for Analysis
// bulkWrite itself cannot be explained; instead, explain the equivalent
// query for each filter to confirm it can use an index
const explain = db.collection.find({ sku: "P-1001" }).explain();
console.log(explain.queryPlanner.winningPlan);
Best Practice Recommendations
- Batch Size Control: Adjust based on average document size, typically 1000-5000 operations per batch
- Error Handling: Always check for write errors, whether on the result or on a thrown exception
- Retry Mechanism: Implement exponential backoff for transient network errors (see the sketch after this list)
- Schema Design: Consider the impact of document structure on batch updates
- Monitoring Metrics: Track the modifiedCount to matchedCount ratio to spot updates that match documents without actually changing them
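A minimal retry sketch with exponential backoff; it assumes that any caught error is worth retrying (a real implementation would first check that the failure is transient) and uses the shell's built-in sleep(ms):

// Retry a bulkWrite up to maxRetries times with exponential backoff
function bulkWriteWithRetry(coll, ops, maxRetries = 3) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return coll.bulkWrite(ops, { ordered: false });
    } catch (e) {
      if (attempt === maxRetries) throw e; // give up after the last attempt
      // (check for a transient/network error here before retrying)
      sleep(100 * Math.pow(2, attempt));   // 100ms, 200ms, 400ms, ...
    }
  }
}

bulkWriteWithRetry(db.products, operations);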
Real Business Cases
Log Archiving Process
const lastMonth = new Date();
lastMonth.setMonth(lastMonth.getMonth() - 1);
// Copy old logs into an archive collection first. bulkWrite targets a single
// collection and has no insertMany operation type, so the copy is a separate
// step (the archive collection name logsArchive is assumed for this example).
db.logsArchive.insertMany(
  db.logs.find({ timestamp: { $lt: lastMonth } }).toArray()
);

const logArchiveOps = [
  // Mark old logs as archived
  {
    updateMany: {
      filter: {
        timestamp: { $lt: lastMonth },
        status: { $ne: "archived" }
      },
      update: { $set: { status: "archived" } }
    }
  },
  // Clean up original collection
  {
    deleteMany: {
      filter: {
        timestamp: { $lt: lastMonth }
      }
    }
  }
];

db.logs.bulkWrite(logArchiveOps);
Comprehensive Price Adjustment
const priceAdjustments = [
  // Increase electronics prices by 10%
  {
    updateMany: {
      filter: { category: "electronics" },
      update: {
        $mul: { price: 1.1 },
        $currentDate: { lastModified: true }
      }
    }
  },
  // Reduce clearance items by 30%
  {
    updateMany: {
      filter: { clearance: true },
      update: {
        $mul: { price: 0.7 },
        $set: { promoTag: "Clearance Sale" }
      }
    }
  },
  // Delete discontinued products
  {
    deleteMany: {
      filter: {
        status: "discontinued",
        lastSold: { $lt: new Date("2022-01-01") }
      }
    }
  }
];

db.products.bulkWrite(priceAdjustments, { ordered: false });
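One caveat with $mul on prices is floating-point drift in the stored values. A sketch of the same 10% adjustment written as a pipeline update (MongoDB 4.2+) that rounds the result to two decimals with $round:

{
  updateMany: {
    filter: { category: "electronics" },
    update: [
      {
        $set: {
          // Multiply, then round to 2 decimal places in one step
          price: { $round: [{ $multiply: ["$price", 1.1] }, 2] },
          lastModified: "$$NOW"
        }
      }
    ]
  }
}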