Description
Commit 1a7f4a9 ("fix: added support for compression") introduced a switch on `this.headers['content-encoding']` in `src/Formidable.js` (line 262). The default case references `node_stream`, which is never imported:
```js
default:
  pipe = node_stream.Transform({
    transform: function (chunk, encoding, callback) {
      callback(null, chunk);
    }
  })
```

Every request without a `Content-Encoding` header -- which is the vast majority of real multipart uploads -- hits this path and throws `ReferenceError: node_stream is not defined`. The file imports `EventEmitter` from `node:events` but nothing from `node:stream`.
Repro on current master:
```sh
$ node -e "import('./src/index.js').then(({formidable}) => { const h=require('http'); const f=formidable(); h.createServer((q,r)=>f.parse(q,(e)=>{r.end(e?e.message:'ok')})).listen(3000) })"
```
Then `curl -F file=@any.txt http://localhost:3000` -- immediate crash:

```
file:///…/src/Formidable.js:277
    pipe = node_stream.Transform({
           ^

ReferenceError: node_stream is not defined
    at IncomingForm.parse (file:///…/src/Formidable.js:277:9)
```
The compressed paths also use `require("zlib")` inside an ESM module (the package has `"type": "module"` and the file uses `import` statements). The CJS build in `dist/` would handle this, but the ESM export (`"import": { "default": "./src/index.js" }`) serves the raw source, where `require` is not defined.
Separately, the zlib decompressors are created without `maxOutputLength`:

```js
case "gzip":
  pipe = require("zlib").createGunzip();
```

A small gzip bomb (1 KB compressed, gigabytes decompressed) will spike memory before `maxTotalFileSize` fires, because the zlib stream buffers internally. Node's zlib has accepted a `{ maxOutputLength }` option since v14.5.0 specifically to cap decompressed output.
No tests were added for the compression feature, which is how the node_stream reference survived.
I can open a PR to fix all three (import, ESM require, decompression limits) if that's useful.