Link to the code that reproduces this issue
https://github.com/kamdubiel/next-fetch
To Reproduce
I created a simple reproduction with Docker Compose: a simple Express API (which just returns random JSON data), K6 for the stress test, and Next.js 14.2.5 with a simple dynamic route rendering an async server component.
- Clone the repository: https://github.com/kamdubiel/next-fetch
- Run the containers: `docker compose up -d`
- Attach to the Next.js container stats: `docker stats next`
- Run the K6 test: `docker compose run k6 run /scripts/ewoks.js`
- Wait for the K6 test to finish: memory usage stays high even while the Next.js server idles. It never goes back down, and in a real-world app it causes out-of-memory exceptions every few days.
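For reference, the K6 step can be approximated without Docker by a plain Node script. This is a hypothetical sketch (the URL, request count, and concurrency are placeholders, and the actual `/scripts/ewoks.js` is not reproduced here):

```javascript
// Hypothetical dockerless stand-in for the K6 load test.
// Fires `total` GET requests at `url` using `concurrency` parallel workers,
// always consuming the response body before moving on.
async function hammer(url, total, concurrency = 10) {
  let started = 0;
  async function worker() {
    while (started < total) {
      started += 1; // check + increment are synchronous, so no over-count
      const res = await fetch(url);
      await res.arrayBuffer(); // drain the body so it is not left dangling
    }
  }
  await Promise.all(Array.from({ length: concurrency }, worker));
  return started;
}
```

Running this in a loop against the dynamic route while watching `docker stats` (or `process.memoryUsage()`) should show the same plateau as the K6 run.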
Current vs. Expected behavior
Current:
When the container starts, its memory usage is around ~45 MB.
After the stress test, memory usage goes up to ~400 MB and stays there indefinitely. Running more tests increases the idle memory usage further.
Expected:
After the stress test, while Next.js is idling, the memory used should drop.
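To watch the plateau from inside the process rather than only via `docker stats`, a tiny logging helper can be dropped into the server entrypoint. This is a hypothetical sketch, not part of the repro repo:

```javascript
// Hypothetical helper: log RSS and heap usage every few seconds so the
// post-test plateau is visible from inside Node itself.
function formatMemory() {
  const { rss, heapUsed } = process.memoryUsage();
  const toMb = (bytes) => (bytes / 1024 / 1024).toFixed(1) + " MB";
  return { rss: toMb(rss), heapUsed: toMb(heapUsed) };
}

// unref() so the timer does not keep an otherwise-idle process alive
setInterval(() => console.log(formatMemory()), 5000).unref();
```

Logging both `rss` and `heapUsed` also helps distinguish a V8 heap leak from native buffer growth (e.g. in undici) that only shows up in RSS.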
Provide environment information
From Docker container:
Operating System:
Platform: linux
Arch: x64
Version: #39-Ubuntu SMP PREEMPT_DYNAMIC Fri Jul 5 21:49:14 UTC 2024
Available memory (MB): 63434
Available CPU cores: 16
Binaries:
Node: 20.16.0
npm: 10.8.1
Yarn: 1.22.22
pnpm: N/A
Relevant Packages:
next: 14.2.5 // Latest available version is detected (14.2.5).
eslint-config-next: 14.2.5
react: 18.3.1
react-dom: 18.3.1
typescript: 5.5.4
Next.js Config:
output: standalone
Which area(s) are affected? (Select all that apply)
Output (export/standalone), Performance
Which stage(s) are affected? (Select all that apply)
next start (local), Other (Deployed)
Additional context
I tested my reproduction code against different Next.js ~14 versions, canary releases, and Node.js versions. It looks like every version since async server components were supported is affected.
I also tested the same code with axios and cross-fetch, and both work correctly, so to me it looks like the issue is with fetch caching.
I have the same issue in a real-world app hosted in AKS, where it causes out-of-memory errors every few days. Is there any way to make the garbage collector clean up memory allocations more aggressively when using native fetch?
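There is no flag that makes the GC reclaim memory a true leak still references, but generic Node.js flags can at least bound the damage and capture evidence. These are standard Node options, sketched here as assumptions about a typical standalone deployment (`server.js` is a placeholder):

```shell
# Not a fix, just diagnostics / damage control for a standalone Next.js server.

# Take a heap snapshot on demand, then inspect it in Chrome DevTools:
NODE_OPTIONS="--heapsnapshot-signal=SIGUSR2" node server.js
# kill -USR2 <pid>   # writes a .heapsnapshot file next to the process

# Cap the old-generation heap so the container fails fast and restarts,
# instead of ballooning until the orchestrator kills it:
NODE_OPTIONS="--max-old-space-size=512" node server.js
```

Comparing two snapshots taken before and after a load run should show whether the retained objects are V8-visible (heap) or native (RSS only).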
I think I checked all the other similar issues, like:
#64212
#54708
But my example is much simpler, so I decided to create a new issue.
Possible fix: nodejs/undici#3445
-
We faced this problem on our platform, and we fixed it by downgrading Node.js to 20.15.1.
-
Looks like someone also reported the leak in 20.16.0 (one minor version after 20.15):
https://github.com/orgs/nodejs/discussions/54248
-
Hey @kamdubiel, I wonder if it's the console.log buffering the string that's causing the high memory usage. What if you removed the log?
-
@khuezy I tested without console.log and it has no effect on memory usage.
@hahmadzadeh Thank you for that; indeed, 20.15.1 works. I hadn't tested that one.
Unfortunately, 22.6.0, which will become LTS in October, is also affected.
-
undici's fetch seems to be a nightmare. Have you notified the Node.js team of the leaks happening in that library? There's this: https://undici.nodejs.org/#/?id=garbage-collection but your repo is consuming the body... so something else in Node is leaking 🤷
-
@kamdubiel Thank you for submitting an issue and thanks for taking a look everyone!
Since this seems to be a Node.js issue and not a Next.js issue, I will be converting this to a discussion so the conversation can continue in case anyone else comes across it.
-
@samcx I think I can reproduce this locally.
I created the same handler in vanilla Node and in Next.js. Memory usage with Next + Node 20.16.0 increases until the process crashes, whereas Next + Node 20.15.1 does not have the same issue.
Additionally, neither of the vanilla Node versions shows this behavior, so this likely has something to do with the modifications Next.js makes on top of the fetch function (for caching?).
Next

```jsx
import { headers } from "next/headers";

export default async function Page() {
  const res = await fetch("http://counting-service:9001", {
    headers: headers(),
  });
  const { count } = await res.json();
  return <pre>{count}</pre>;
}
```
Node

```js
import http from "node:http";

http
  .createServer(async (req, res) => {
    const response = await fetch("http://counting-service:9001", {
      headers: req.headers,
    });
    const { count } = await response.json();
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end(`<pre>${count}</pre>`);
  })
  .listen(3000);
```
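The counting-service that both handlers call is not shown above; a minimal hypothetical stand-in (assuming it just returns an incrementing JSON counter on port 9001) could look like:

```javascript
// Hypothetical stand-in for counting-service: returns an incrementing
// counter as JSON on port 9001, so the two handlers above have something to hit.
import http from "node:http";

let count = 0;
const server = http.createServer((req, res) => {
  count += 1;
  res.writeHead(200, { "Content-Type": "application/json" });
  res.end(JSON.stringify({ count }));
});

server.listen(9001);
```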
The reproduction requires siege to make a ton of requests against the servers, but feel free to check it out
-
@snyamathi Maybe the source of the leak could be React's patching of fetch?
From what I can see, React has removed its fetch patching, but those changes aren't in the 14.2.5 build of Next.js. I think 14.3.0-canary has it, so maybe that's worth a test.
I'm not able to run your repro steps, but I can mess with it later today:
```shell
docker compose up -d
[DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/yokogawa/siege:latest to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/
```
-
@ppiwo I made a simpler, dockerless reproduction branch. You'll have to select the Node version and do the load testing yourself, using ab or similar.
https://github.com/snyamathi/68636/tree/simple
If you replace the patched fetch with undici's fetch, OR downgrade from 20.16.0 to 20.15.1, the memory leak goes away:

```js
import { fetch } from "undici";
```
-
@snyamathi Thanks, that repro worked for me.
I installed 14.3.0-canary.45, where React's fetch patching was removed, and the leak is still there. I also tried the latest canary (15.0.0-canary.108) and the memory leak is present there too. I guess it can't be coming from React.
-
For me, downgrading Node from 20.17.0 to 20.15.1 solved the issue.
-
Same problem here with Next.js and Node v20.16 and v20.17.
-
Did you try Node.js v20.18.0? That is the version where a fix was released: nodejs/node#54274
-
I updated to Node 20.18.0, and the leak appears to be fixed.
-
This helped us: #68636 (reply in thread). We're keeping up with the latest Next.js 14.x version AND Node 20.15.1; all works perfectly so far.
What version of Next.js do you use?
The latest at the time of writing.
I will try 20.18.0 soon. Thanks for pointing that out.
-
Does anyone know if Node 23 could help? We have the same memory leak issue with Node 22.
-
> Does anyone know if Node 23 could help? We have the same memory leak issue with Node 22.

I've just fixed the memory leak on my Node 22 LTS by using node-fetch instead. It has the same API, so all I had to do was add a single import line to each file where fetch is called.
I have a feeling failed fetch requests (like 404s) will leak memory regardless of whether you consume the body or not.
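On that hunch: undici's garbage-collection note (linked earlier in the thread) recommends always consuming or cancelling the response body, including on error statuses. A hedged sketch of such a wrapper (`fetchJsonSafe` is a hypothetical name, and this does not address the 20.16.0 regression itself):

```javascript
// Hypothetical wrapper: always drain or cancel the response body,
// even for non-2xx responses, per undici's garbage-collection guidance.
async function fetchJsonSafe(url) {
  const res = await fetch(url);
  if (!res.ok) {
    // cancel the stream so the connection and its buffers can be reclaimed
    await res.body?.cancel();
    return null;
  }
  return res.json();
}
```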