# Requests fully buffered by default #1045
This is indeed something worth mentioning in the docs. Would you be interested in creating a PR? ;)

This is still true at the time of writing, and this is a major documentation issue, isn't it?

Is this going to be worked on at some point?
This seems a really strange interaction, but it's just bitten me. I'm using axios to upload large files to presigned S3 URLs.

```ts
import fs from 'node:fs';
import axios, { AxiosRequestConfig } from 'axios';

const stats = fs.statSync(file);
const stream = fs.createReadStream(file);

const config: AxiosRequestConfig = {
  onUploadProgress: (event) => progressCallback(Math.round((100 * event.loaded) / stats.size)),
  headers: { 'Content-Length': stats.size, 'Content-Type': contentType },
  validateStatus: () => true,
  // Without this, we load the whole file into memory (which can be several GB).
  maxRedirects: 0,
};

const signedUrl = await this.signPutUrl(key, stats.size);
const rsp = await axios.put(signedUrl, stream, config);
```

Without the `maxRedirects: 0` option, the whole file is buffered into memory. Am I missing anything here? This seems really odd behaviour. Would be great to have this documented.
@aza547 Yes, you need to disable redirects if you are uploading large data, since stream buffering is enabled so that the data can be resent in case of a redirect.

In the long term, we hope to handle redirects ourselves and buffer the upstream data only for a short time and up to a limited size, but for now, we have what we have.
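For anyone who needs both redirects and true streaming today, one workaround is to handle the redirect by hand: set `maxRedirects: 0`, accept the 3xx response via `validateStatus`, and retry against the `Location` header with a fresh stream. The sketch below is illustrative, not axios API; the helper name, the single-redirect policy, and the assumption of an absolute `Location` header are simplifications, and it only works when the source can be re-read (e.g. a file on disk):

```js
import fs from 'node:fs';
import axios from 'axios';

// Sketch: stream a file with redirects disabled, then follow at most one
// redirect manually, so follow-redirects never buffers the body.
async function putFileFollowingOneRedirect(file, url, headers) {
  const first = await axios.put(url, fs.createReadStream(file), {
    headers,
    maxRedirects: 0,                // keep follow-redirects out of the path
    validateStatus: (s) => s < 400, // treat 3xx as a normal response
  });
  if (first.status < 300) return first;

  // The first stream was consumed, so open a fresh one for the new target.
  // Assumes the Location header is absolute; resolve it against `url` if not.
  return axios.put(first.headers.location, fs.createReadStream(file), {
    headers,
    maxRedirects: 0,
    validateStatus: (s) => s < 300,
  });
}
```

The key point is that each attempt gets its own readable stream, so nothing has to be kept in memory between attempts.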
Closing this; it's relatively old, and @DigitalBrainJS has mentioned a future proposal to handle this.
@jasonsaayman, a future proposal does not resolve the issue. Can this stay open until this is fixed without `maxRedirects: 0`?
@realies For sure, I am trying to see what we can and cannot close, so I will gladly re-open.
I discovered that when the stream is piped through gzip compression before being passed to axios, you do not need to set `maxRedirects: 0`. This works as expected:

```js
import { Readable } from 'node:stream';
import axios from 'axios';
import zlib from 'node:zlib';

const readable = Readable.from(genData(1e9));
const gzip = zlib.createGzip();
const compressed = readable.pipe(gzip);

await axios.post('http://localhost:3000', compressed, {
  headers: {
    'Content-Type': 'text/csv',
  },
});

async function* genData(max) {
  let i = 0;
  yield 'username,password,email\n';
  while (i < max) {
    yield `user${i},password${i},user${i}@gmail.com\n`;
    i++;
  }
}
```

This required `maxRedirects: 0`:

```js
import { Readable } from 'node:stream';
import axios from 'axios';

const readable = Readable.from(genData(1e9));

await axios.post('http://localhost:3000', readable, {
  headers: {
    'Content-Type': 'text/csv',
  },
  maxRedirects: 0, // this is required; otherwise, the whole stream is buffered in memory
});

async function* genData(max) {
  let i = 0;
  yield 'username,password,email\n';
  while (i < max) {
    yield `user${i},password${i},user${i}@gmail.com\n`;
    i++;
  }
}
```
#### Summary
By default, when you make a streaming request with Axios by setting `data` to a stream, the request body is fully buffered. For example, if you use Axios to send a 10 GB file over the network from a file stream, it will consume at least 10 GB of RAM. This is due to the default behaviour of `maxRedirects`, which allows redirects, so the follow-redirects module buffers all written data in case a redirect has to be followed.

I think a docs section on streams and buffering, plus a prominent warning in the `maxRedirects` docs that setting it to any value other than 0 (or not setting it at all) causes writes to be fully buffered, would be valuable.
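To make that warning concrete, here is a minimal sketch (the localhost endpoint and the payload size are assumptions for illustration) that streams generated CSV data and prints the process's resident set size before and after; running it with and without `maxRedirects: 0` shows the buffering difference:

```js
import { Readable } from 'node:stream';
import axios from 'axios';

// Generates many CSV-ish lines without ever holding them all in memory.
async function* genData(lines) {
  for (let i = 0; i < lines; i++) yield `user${i},password${i}\n`;
}

const rssMB = () => Math.round(process.memoryUsage().rss / 1024 / 1024);

console.log(`RSS before: ${rssMB()} MB`);

await axios.post('http://localhost:3000', Readable.from(genData(3e7)), {
  headers: { 'Content-Type': 'text/csv' },
  // Comment this line out to watch RSS climb toward the full payload size,
  // because follow-redirects buffers every write in case of a redirect.
  maxRedirects: 0,
});

console.log(`RSS after: ${rssMB()} MB`);
```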
#### Context