A Node.js server is crashing due to memory exhaustion when handling large file uploads. Identify the likely cause in the code and explain how to fix it using streams.
An advanced Node.js interview question for practice.
Answer
The likely cause is that the server is buffering the entire file upload into memory before writing it to disk. If a file is larger than the memory available to the Node.js process, the server will crash. The correct approach is to use streams to process the file in chunks. The incoming request is a readable stream, and you can create a writable stream to a file. By 'piping' the request stream directly to the file stream, the data flows from the client to the disk in small chunks without ever being fully held in memory.

Example of incorrect, memory-intensive code:

```javascript
// This will crash with large files
let body = '';
req.on('data', (chunk) => { body += chunk; });
req.on('end', () => { fs.writeFileSync('upload.tmp', body); });
```

Corrected code using streams:

```javascript
const http = require('http');
const fs = require('fs');

const server = http.createServer((req, res) => {
  if (req.method === 'POST') {
    const destination = fs.createWriteStream('./upload.tmp');
    req.pipe(destination);

    destination.on('finish', () => {
      res.writeHead(200, { 'Content-Type': 'text/plain' });
      res.end('File uploaded successfully!');
    });

    req.on('error', (err) => {
      console.error('Request stream error:', err);
      res.writeHead(500, { 'Content-Type': 'text/plain' });
      res.end('Upload failed.');
    });

    // Without this handler, a disk write error would crash the process.
    destination.on('error', (err) => {
      console.error('Write stream error:', err);
      res.writeHead(500, { 'Content-Type': 'text/plain' });
      res.end('Upload failed.');
    });
  } else {
    res.end('Send a POST request to upload a file.');
  }
});

server.listen(3000);
```
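In practice, Node.js also provides the built-in `stream.pipeline` helper, which connects the streams, forwards errors from either side to a single callback, and cleans both streams up. A minimal sketch of the same server using it (the `./upload.tmp` path is just the placeholder from the answer above):

```javascript
const http = require('http');
const fs = require('fs');
const { pipeline } = require('stream');

const server = http.createServer((req, res) => {
  if (req.method === 'POST') {
    // pipeline() pipes req into the file and reports an error
    // from either stream through one callback.
    pipeline(req, fs.createWriteStream('./upload.tmp'), (err) => {
      if (err) {
        console.error('Upload failed:', err);
        res.writeHead(500, { 'Content-Type': 'text/plain' });
        res.end('Upload failed.');
        return;
      }
      res.writeHead(200, { 'Content-Type': 'text/plain' });
      res.end('File uploaded successfully!');
    });
  } else {
    res.end('Send a POST request to upload a file.');
  }
});

server.listen(3000);
```

This avoids the duplicated error handlers in the `.pipe()` version, which is why `pipeline` is generally preferred for anything beyond a quick demo.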
Explanation
Node.js streams are crucial for building memory-efficient applications. They let you process data piece by piece, as it arrives, rather than waiting for the entire payload to be available. Piping also handles backpressure automatically: if the disk cannot keep up with the network, `.pipe()` pauses the readable stream until the writable stream's internal buffer drains, so memory use stays flat regardless of file size.
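To illustrate piece-by-piece processing, here is a minimal sketch (the file names `./upload.tmp` and `./copy.tmp` are hypothetical) that copies a file through a `Transform` stream while counting bytes, so only one chunk is ever held in memory at a time:

```javascript
const fs = require('fs');
const { Transform } = require('stream');

// A Transform stream that counts bytes as they pass through,
// touching each chunk once and never buffering the whole file.
let total = 0;
const byteCounter = new Transform({
  transform(chunk, encoding, callback) {
    total += chunk.length;
    callback(null, chunk); // pass the chunk along unchanged
  }
});

fs.createReadStream('./upload.tmp')
  .pipe(byteCounter)
  .pipe(fs.createWriteStream('./copy.tmp'))
  .on('finish', () => console.log(`Copied ${total} bytes`));
```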