
Possible memory leak in Fetch API #68636

Answered by snyamathi
kamdubiel asked this question in Help

Link to the code that reproduces this issue

https://github.com/kamdubiel/next-fetch

To Reproduce

I created a simple reproduction with Docker Compose. It contains a simple Express API (which just returns random JSON data), k6 for stress testing, and Next.js 14.2.5 with a simple dynamic route that renders an async server component (a sketch of the kind of k6 script involved follows the steps below).

  1. Clone the repository: https://github.com/kamdubiel/next-fetch
  2. Run the containers: docker compose up -d
  3. Attach to Next container stats: docker stats next
  4. Run K6 test: docker compose run k6 run /scripts/ewoks.js
  5. Wait for the k6 test to finish - memory usage will stay high even while the Next.js server is idle. It never goes down, and in a real-world app it causes out-of-memory exceptions every few days.
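
The actual k6 script lives in the repro repo as /scripts/ewoks.js; the following is only a hedged sketch of what such a script typically looks like - the service hostname, route, virtual-user count and duration are assumptions, not the repo's real values.

```js
// Hypothetical k6 script, roughly what /scripts/ewoks.js is assumed to do.
// The URL, route name, VU count and duration below are illustrative only.
import http from "k6/http";
import { check } from "k6";

export const options = {
  vus: 50,          // concurrent virtual users
  duration: "2m",   // sustained load long enough to watch the heap grow
};

export default function () {
  // Hit the dynamic route served by the Next.js container.
  const res = http.get("http://next:3000/ewoks/1");
  check(res, { "status is 200": (r) => r.status === 200 });
}
```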

Current vs. Expected behavior

Current:
When the container starts, its memory usage is around ~45 MB.
After the stress test, memory usage goes up to ~400 MB and stays there indefinitely. Running more tests increases the idle memory usage further.

Expected:
After the stress test, when Next.js is idle, memory usage should drop back down.

Provide environment information

From Docker container:
Operating System:
 Platform: linux
 Arch: x64
 Version: #39-Ubuntu SMP PREEMPT_DYNAMIC Fri Jul 5 21:49:14 UTC 2024
 Available memory (MB): 63434
 Available CPU cores: 16
Binaries:
 Node: 20.16.0
 npm: 10.8.1
 Yarn: 1.22.22
 pnpm: N/A
Relevant Packages:
 next: 14.2.5 // Latest available version is detected (14.2.5).
 eslint-config-next: 14.2.5
 react: 18.3.1
 react-dom: 18.3.1
 typescript: 5.5.4
Next.js Config:
 output: standalone

Which area(s) are affected? (Select all that apply)

Output (export/standalone), Performance

Which stage(s) are affected? (Select all that apply)

next start (local), Other (Deployed)

Additional context

I tested my reproduction code against different Next.js ~14 versions, canary releases and Node.js versions. It looks like every version since async server components were introduced is affected.
I also tested the same code with axios and cross-fetch, and it works correctly, so to me it looks like the issue is with fetch caching.
I have the same issue in a real-world app hosted in AKS, and it causes out-of-memory errors every few days - is there any way to make the garbage collector clean up memory allocations more aggressively when using the native fetch?
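
As a hedged aside on that last question: one way to check whether the retained memory is reclaimable at all is to force a GC pass while the server is idle. This requires starting Node with --expose-gc (for example via NODE_OPTIONS) and is purely a diagnostic, not a fix for the underlying leak.

```js
// Diagnostic sketch only: log heap/RSS after a forced GC while the server idles.
// Requires the process to be started with --expose-gc, otherwise global.gc is undefined.
setInterval(() => {
  if (typeof global.gc === "function") {
    global.gc();
    const { heapUsed, rss } = process.memoryUsage();
    console.log(
      `after gc: heapUsed=${(heapUsed / 1e6).toFixed(1)} MB, rss=${(rss / 1e6).toFixed(1)} MB`
    );
  }
}, 60_000);
```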

I think I checked all the other similar issues, like:
#64212
#54708
But my example is much simpler, so I decided to create a new issue.


Replies: 7 comments 14 replies


We faced this problem on our platform, and we fixed it by downgrading Node.js to 20.15.1.


It looks like someone also reported the leak in 20.16.0 (one minor version after 20.15):
https://github.com/orgs/nodejs/discussions/54248


Hey @kamdubiel, I wonder if it's the console.log buffering the string that's causing the high memory usage. What happens if you remove the log?


@khuezy I tested without the console.log and it had no effect on memory usage.

@hahmadzadeh Thank you for that, indeed 20.15.1 works - I didn't test that one.

Unfortunately, 22.6.0, which will become LTS in October, is also affected.


undici's fetch seems to be a nightmare. Have you notified the Node team about the leaks in that library? There's this: https://undici.nodejs.org/#/?id=garbage-collection - but your repo is consuming the body... so something else in Node is leaking 🤷
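
(For context, the undici guidance linked above boils down to always consuming or cancelling the response body so the connection can be released. A minimal sketch, assuming the counting-service endpoint used elsewhere in this thread:)

```js
// Sketch of the pattern the undici docs recommend: consume or cancel the body.
// The repro already does this via res.json(), so body handling alone
// doesn't explain the memory growth.
async function getCount() {
  const res = await fetch("http://counting-service:9001");
  if (!res.ok) {
    await res.body?.cancel(); // release the connection even on error responses
    throw new Error(`upstream responded with ${res.status}`);
  }
  return (await res.json()).count;
}
```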


@kamdubiel Thank you for submitting an issue and thanks for taking a look everyone!

Since this seems to be a Node.js issue and not a Next.js issue, I will be converting this to a discussion so the conversation can continue here in case anyone else comes across it.


@samcx I think I can reproduce this locally.

I created the same handler in vanilla Node and in Next.js. Memory usage with Next + Node 20.16.0 increases until the process crashes, whereas Next + Node 20.15.1 does not have the same issue.

Additionally, neither of the vanilla Node versions shows the same behavior, so I think this likely has something to do with the modifications Next.js makes on top of the fetch function (for caching?).

[chart: memory usage over time for the Next.js and vanilla Node servers on Node 20.15.1 vs. 20.16.0]

Next

```tsx
import { headers } from "next/headers";

export default async function Page() {
  const res = await fetch("http://counting-service:9001", {
    headers: headers(),
  });
  const { count } = await res.json();
  return <pre>{count}</pre>;
}
```

Node

```js
import http from "node:http";

http
  .createServer(async (req, res) => {
    const response = await fetch("http://counting-service:9001", {
      headers: req.headers,
    });
    const { count } = await response.json();
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end(`<pre>${count}</pre>`);
  })
  .listen(3000);
```

The reproduction requires siege to make a ton of requests against the servers, but feel free to check it out:

repro: https://github.com/snyamathi/68636
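
(Aside: for anyone without siege installed, a minimal Node-only load loop along these lines can generate similar sustained load - the target URL, batch count and concurrency are assumptions to adjust for your setup.)

```js
// Minimal load generator sketch (Node 18+, which ships a global fetch).
// Fires batches of concurrent requests; tune the constants to your machine.
const TARGET = "http://localhost:3000/";

async function hammer(batches = 200, concurrency = 50) {
  for (let i = 0; i < batches; i++) {
    await Promise.all(
      Array.from({ length: concurrency }, async () => {
        const res = await fetch(TARGET);
        await res.text(); // consume the body so the connection can be released
      })
    );
  }
}

hammer().then(() => console.log("load run finished"));
```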


@snyamathi Maybe the source of the leak could be from React's patching of fetch?

From what I can see, React has removed its fetch patching, but those changes aren't in the 14.2.5 build of Next.js. I think 14.3.0-canary has them, so maybe that's worth a test.

I'm not able to run your repro steps, but I can mess with it later today:

```
docker compose up -d
[DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/yokogawa/siege:latest to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/
```

@ppiwo I made a simpler, dockerless reproduction branch. You'll have to select the Node version and do the load testing yourself using ab or similar.

https://github.com/snyamathi/68636/tree/simple

If you replace the patched fetch with undici, OR downgrade from 20.16.0 to 20.15.1, then the memory leak goes away:

```js
import { fetch } from "undici";
```
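
(For illustration, a hedged sketch of what that swap might look like in the page component - the route and service URL are assumptions; the module-level import shadows the patched global fetch for this file only.)

```tsx
// Sketch only: the named import shadows Next.js' patched global fetch in this
// module, so this request goes straight through undici's fetch.
import { fetch } from "undici";

export default async function Page() {
  const res = await fetch("http://counting-service:9001");
  const { count } = await res.json();
  return <pre>{count}</pre>;
}
```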

@snyamathi Thanks, that repro worked for me.

I installed 14.3.0-canary.45, where React's fetch patching was removed, and the leak is still there. I also tried the latest canary (15.0.0-canary.108), and the memory leak is present there too. I guess it can't be coming from React, then.


possible fix nodejs/undici#3445

Answer selected by kamdubiel

For me, downgrading Node from 20.17.0 to 20.15.1 solved the issue.


Same problem here with Next.js and Node v20.16 and v20.17


Did you try Node.js v20.18.0? That is the version where a fix was released: nodejs/node#54274


I updated to Node 20.18.0, and the leak appears to be fixed.


This helped us: #68636 (reply in thread). We're keeping up with the latest Next.js 14.x version AND Node 20.15.1 - all works perfectly so far.

> What version of Next.js do you use?

The latest at the time of writing.

I will try 20.18.0 soon. Thanks for pointing that out.


Does anyone know if Node 23 could help? We have the same issue with memory leaks on Node 22.


> Does anyone know if Node 23 could help? We have the same issue with memory leaks on Node 22.

I've just fixed the memory leak on my Node 22 LTS app by using node-fetch instead. It has the same API, so all I had to do was add a single import line to each file where fetch is called.
I have a feeling that failed fetch requests (like 404s) will leak memory regardless of whether you consume the body or not.
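
(A hedged sketch of that workaround, with a hypothetical helper name and URL; node-fetch v3 is ESM-only and its default export is the fetch function, so the module-level import shadows the built-in global fetch in this file.)

```js
// Sketch: swap the built-in (undici-based) fetch for node-fetch in one module.
// node-fetch must be installed; the helper name and URL are illustrative.
import fetch from "node-fetch";

export async function getCount() {
  const res = await fetch("http://counting-service:9001");
  if (!res.ok) throw new Error(`upstream responded with ${res.status}`);
  const { count } = await res.json();
  return count;
}
```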

Category: Help
Labels:
- bug: Issue was opened via the bug report template.
- Output: Related to the output configuration option.
- Performance: Anything with regards to Next.js performance.
Converted from issue

This discussion was converted from issue #68578 on August 07, 2024 21:30.
