why does memory consumption keep growing?

Fetchinson . fetchinson at googlemail.com
Thu Oct 5 17:42:32 EDT 2017


On 10/5/17, Chris Angelico <rosuav at gmail.com> wrote:
> On Fri, Oct 6, 2017 at 8:06 AM, Fetchinson . via Python-list
> <python-list at python.org> wrote:
>> Hi folks,
>>
>> I have a rather simple program which cycles through a bunch of files,
>> does some operation on them, and then quits. There are 500 files
>> involved and each operation takes about 5-10 MB of memory. As you'll
>> see, I made every attempt to remove everything at the end of each
>> cycle so that memory consumption doesn't grow as the for loop
>> progresses, but it still does.
>>
>> import os
>>
>> for f in os.listdir( '.' ):
>>     x = [ ]
>>     for ( i, line ) in enumerate( open( f ) ):
>>         import mystuff
>>         x.append( mystuff.expensive_stuff( line ) )
>>         del mystuff
>>     import mystuff
>>     mystuff.some_more_expensive_stuff( x )
>>     del mystuff
>>     del x
>>
>> What can be the reason? I understand that mystuff might be leaky, but
>> if I delete it, doesn't that mean that whatever memory was allocated
>> is freed? Similarly, x is deleted, so that can't possibly make the
>> memory consumption go up.
> You're not actually deleting anything. When you say "del x", all
> you're doing is removing the *name* x. In particular, deleting an
> imported module basically does nothing; it's a complete waste of
> time. Modules are kept in their own special cache, sys.modules.
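
A minimal sketch of that point, runnable as-is (any stdlib module
will do; CPython keeps every imported module cached in sys.modules):

    import sys
    import json                     # stands in for mystuff

    del json                        # only unbinds the name "json" here
    print('json' in sys.modules)    # True: the module object survives

    import json                     # a cheap cache lookup, not a reload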

Meaning that if mystuff has some leaky stuff in it, there is no way
for me to recover that memory?
Daniel
-- 
Psss, psss, put it down! - http://www.cafepress.com/putitdown
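
If mystuff really does leak at the C level, del can't get that memory
back, but a standard workaround is to isolate each file's work in a
short-lived child process so the OS reclaims everything when the
worker exits. A sketch only, untested; mystuff and its functions are
the hypothetical names from the post above:

    import os
    from multiprocessing import Pool

    def process_file(f):
        # Runs in a child process, so whatever it allocates -- or
        # leaks -- goes away when that process is retired.
        import mystuff
        with open(f) as fh:
            x = [mystuff.expensive_stuff(line) for line in fh]
        mystuff.some_more_expensive_stuff(x)

    if __name__ == '__main__':
        # maxtasksperchild=1 hands every file a fresh worker, so even
        # a leaky mystuff cannot accumulate memory across files.
        with Pool(processes=1, maxtasksperchild=1) as pool:
            pool.map(process_file, os.listdir('.'))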

