"du -b --files0-from=-" running out of memory
Barry Kelly
bkelly.ie@gmail.com
Mon Nov 24 08:32:00 GMT 2008
I have a problem with du running out of memory.
I'm feeding it a NUL-separated list of file names on standard input,
with a command line that looks like:
du -b --files0-from=-
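For reference, the whole pipeline looks roughly like this (the temp directory and file names below are just illustrative, so the example is self-contained):

```shell
set -e
tmp=$(mktemp -d)
printf 'hello' > "$tmp/a"       # 5 bytes
printf 'worlds!' > "$tmp/b"     # 7 bytes
# find emits NUL-terminated paths; du reads them one at a time from stdin.
out=$(find "$tmp" -type f -print0 | du -b --files0-from=-)
echo "$out"
rm -rf "$tmp"
```

With -b (apparent size, block size 1), du prints one "size<TAB>path" line per input path.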
The problem is that when du is run in this way, it leaks memory like a
sieve. I feed it about 4.7 million paths but eventually it falls over as
it hits the 32-bit address space limit.
Now, I can understand why a du -c might want to exclude excess hard
links to files, but that at most requires a hash table of device/inode
pairs - 4.7 million such entries at a few dozen bytes each is well under
a few hundred MB, nowhere near the 32-bit limit - and in any case, I'm
not asking for a grand total.
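The device/inode table described above can be sketched in the shell (an approximation for illustration, not du's actual implementation): key each file by its device and inode, drop duplicate keys, and sum the surviving sizes so each hard-linked file is counted once.

```shell
set -e
tmp=$(mktemp -d)
printf '0123456789' > "$tmp/orig"   # 10 bytes
ln "$tmp/orig" "$tmp/hardlink"      # second name, same device and inode
# %D = device number, %i = inode, %s = size in bytes.
# sort -u -k1,1 keeps one line per dev:inode key, i.e. the "hash table".
total=$(find "$tmp" -type f -printf '%D:%i %s\n' \
        | sort -u -k1,1 \
        | awk '{sum += $2} END {print sum}')
echo "$total"
rm -rf "$tmp"
```

Despite two directory entries, only one 10-byte file is counted, which is all the state a grand total needs per hard-linked file.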
Is there any alternative other than running e.g. xargs -0 du -b,
possibly with a high -n argument to xargs to bound the leakage per
invocation?
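The xargs workaround in question would look something like this (the -n value and the temp-directory setup are arbitrary; each du invocation sees at most -n paths, so whatever du accumulates per run is bounded by the batch size):

```shell
set -e
tmp=$(mktemp -d)
printf 'hello' > "$tmp/a"       # 5 bytes
printf 'worlds!' > "$tmp/b"     # 7 bytes
# xargs re-execs du for every batch of up to 10000 names, so per-run
# memory growth resets between batches.
out=$(find "$tmp" -type f -print0 | xargs -0 -n 10000 du -b)
echo "$out"
rm -rf "$tmp"
```

One difference from --files0-from=-: du can only detect repeated hard links within a single batch, so files whose links land in different batches would be counted more than once.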
-- Barry
--
http://barrkel.blogspot.com/
--
Unsubscribe info: http://cygwin.com/ml/#unsubscribe-simple
Problem reports: http://cygwin.com/problems.html
Documentation: http://cygwin.com/docs.html
FAQ: http://cygwin.com/faq/