
Requests fully buffered by default #1045

Open
tomwidmer opened this issue Aug 16, 2017 · 9 comments
Labels
state::triage Issue that is awaiting triage

Comments

@tomwidmer

#### Summary

By default, when making a streaming request with Axios (setting data to a stream), the request body is fully buffered. E.g. if you use Axios to send a 10 GB file over the network from a file stream, it will consume at least 10 GB of RAM. This is due to the default behaviour of maxRedirects, which allows redirects; the follow-redirects module then buffers all written data in case there is a redirect.

I think it would be valuable to have a section in the docs on streams and buffering, along with a prominent warning under maxRedirects that setting it to any value other than 0 (or not setting it at all) will cause writes to be fully buffered.

const fs = require('fs');
const axios = require('axios');

axios({
  url: 'http://somewhere',
  method: 'post',
  responseType: 'json',
  data: fs.createReadStream('huge10GBFile.txt'),
  headers: {
    'Content-Type': 'text/plain'
  }
});
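For illustration, here is a minimal sketch of the workaround the rest of this thread converges on (the URL and file name are placeholders): setting maxRedirects: 0 makes Axios use Node's plain http/https transport instead of follow-redirects, so the stream is piped to the socket rather than accumulated in memory.

const fs = require('fs');
const axios = require('axios');

// Sketch: disabling redirects bypasses follow-redirects entirely,
// so the file stream is piped out with backpressure instead of
// being buffered in RAM for a potential resend.
axios({
  url: 'http://somewhere',
  method: 'post',
  responseType: 'json',
  data: fs.createReadStream('huge10GBFile.txt'),
  maxRedirects: 0,
  headers: {
    'Content-Type': 'text/plain'
  }
});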

#### Context

  • axios version: v0.16.2
  • Environment: macOS Sierra
@rubennorte
Member

This is indeed something worth mentioning in the docs. Would you be interested in creating a PR? ;)

@cyrilchapon

This is still true at the time of writing, and it is a major documentation issue, isn't it?

@realies

realies commented Jul 3, 2020

Is this going to be worked on at some point?

@aza547

aza547 commented Mar 29, 2024

This seems like a really strange interaction, and it's just bitten me. I'm using axios to upload large files to presigned S3 URLs.

const stream = fs.createReadStream(file);

const config: AxiosRequestConfig = {
  onUploadProgress: (event) => progressCallback(Math.round((100 * event.loaded) / stats.size)),
  headers: { 'Content-Length': stats.size, 'Content-Type': contentType },
  validateStatus: () => true,

  // Without this, we load the whole file into memory (which can be several GB).
  maxRedirects: 0,
};

const signedUrl = await this.signPutUrl(key, stats.size);
const rsp = await axios.put(signedUrl, stream, config);

Without the maxRedirects bit, the RAM usage of Node.js skyrockets and in some cases brings down the whole process. I struggled to find any reference online explaining what I was doing wrong until I stumbled on this thread.

Am I missing anything here? This seems like really odd behaviour. It would be great to have this documented.

@DigitalBrainJS
Collaborator

@aza547 Yes, you need to disable redirects if you are uploading large data, since stream buffering is enabled in order to be able to resend the data in case of a redirect.
There is a warning about this Axios/follow-redirects quirk in the docs, although it is written in the "progress capturing" section.

⚠️ Warning It is recommended to disable redirects by setting maxRedirects: 0 to upload the stream in the node.js environment, as follow-redirects package will buffer the entire stream in RAM without following the "backpressure" algorithm.

In the long term, we hope to handle redirects ourselves and buffer the upstream only up to a limited time and buffer size, but for now, we have what we have.
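To make that trade-off concrete, here is a hypothetical sketch (not the actual axios or follow-redirects source) of the long-term idea described above: retain written chunks for replay only up to a fixed size cap, and give up on following redirects once the cap is exceeded.

// Hypothetical sketch, not real axios/follow-redirects code.
const MAX_REPLAY_BYTES = 10 * 1024 * 1024; // illustrative 10 MB cap

function makeReplayBuffer() {
  const chunks = [];
  let size = 0;
  let replayable = true;
  return {
    push(chunk) {
      if (!replayable) return; // already over the cap: stream, don't copy
      size += chunk.length;
      if (size > MAX_REPLAY_BYTES) {
        chunks.length = 0;  // drop the copies to free memory
        replayable = false; // a redirect can no longer be followed
      } else {
        chunks.push(chunk); // keep a copy in case the body must be resent
      }
    },
    canReplay: () => replayable,
    replayInto: (req) => chunks.forEach((c) => req.write(c)),
  };
}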

@jasonsaayman
Member

Closing this; it's relatively old, and @DigitalBrainJS has mentioned a future proposal to handle this.

@realies

realies commented Sep 27, 2024

@jasonsaayman, a future proposal does not resolve the issue. Can this stay open until it is fixed without requiring maxRedirects: 0?

@jasonsaayman
Member

@realies For sure, I am trying to see what we can and cannot close, so I will gladly re-open.

@jasonsaayman jasonsaayman reopened this Sep 27, 2024
@jasonsaayman jasonsaayman added the state::triage Issue that is awaiting triage label Sep 28, 2024
@lacherogwu

I discovered that when the stream is piped through gzip compression, you do not need to set maxRedirects: 0 to prevent it from being fully buffered. I thought this might be helpful.

This works as expected:

import { Readable } from 'node:stream';
import axios from 'axios';
import zlib from 'node:zlib';

const readable = Readable.from(genData(1e9));
const gzip = zlib.createGzip();
const compressed = readable.pipe(gzip);

await axios.post('http://localhost:3000', compressed, {
	headers: {
		'Content-Type': 'text/csv',
	},
});

async function* genData(max) {
	let i = 0;
	yield 'username,password,email\n';
	while (i < max) {
		yield `user${i},password${i},user${i}@gmail.com\n`;
		i++;
	}
}

This required maxRedirects: 0 to work as expected:

import { Readable } from 'node:stream';
import axios from 'axios';

const readable = Readable.from(genData(1e9));

await axios.post('http://localhost:3000', readable, {
	headers: {
		'Content-Type': 'text/csv',
	},
	maxRedirects: 0, // required; otherwise the entire stream is buffered in memory
});

async function* genData(max) {
	let i = 0;
	yield 'username,password,email\n';
	while (i < max) {
		yield `user${i},password${i},user${i}@gmail.com\n`;
		i++;
	}
}
