I am attempting to pipe a large collection of files that can include a single large file ranging in size from 500 MB to 5 GB. During data transfer, when I attempt to pipe a file larger than ~2 GB, I get the following error:
buffer.js:269
throw err;
^
RangeError [ERR_INVALID_OPT_VALUE]: The value "2246554069" is invalid for option "size"
at Function.allocUnsafe (buffer.js:291:3)
at Function.concat (buffer.js:473:23)
at bufferConcat (D:\Development\speedwise-platform-app-web\node_modules\s3-files\node_modules\concat-stream\index.js:117:17)
at ConcatStream.getBody (D:\Development\speedwise-platform-app-web\node_modules\s3-files\node_modules\concat-stream\index.js:64:42)
at ConcatStream.<anonymous> (D:\Development\speedwise-platform-app-web\node_modules\s3-files\node_modules\concat-stream\index.js:37:51)
at ConcatStream.emit (events.js:194:15)
at ConcatStream.EventEmitter.emit (domain.js:441:20)
at finishMaybe (D:\Development\speedwise-platform-app-web\node_modules\s3-files\node_modules\readable-stream\lib\_stream_writable.js:624:14)
at endWritable (D:\Development\speedwise-platform-app-web\node_modules\s3-files\node_modules\readable-stream\lib\_stream_writable.js:643:3)
at ConcatStream.Writable.end (D:\Development\speedwise-platform-app-web\node_modules\s3-files\node_modules\readable-stream\lib\_stream_writable.js:571:22)
at PassThrough.onend (_stream_readable.js:629:10)
at Object.onceWrapper (events.js:277:13)
at PassThrough.emit (events.js:194:15)
at PassThrough.EventEmitter.emit (domain.js:441:20)
at endReadableNT (_stream_readable.js:1103:12)
at process._tickCallback (internal/process/next_tick.js:63:19)
I am using s3-files as a dependency of s3-zip.
The code:
ids is an array of file references for S3, and fileLocations is an array of objects that look as follows: