Message66996
Author: loewis
Recipients: georg.brandl, loewis, rhettinger, schuppenies
Date: 2008-05-17.14:10:44
SpamBayes Score: 0.0026552153
Marked as misclassified: No
Message-id: <482EE751.5050601@v.loewis.de>
In-reply-to: <1211032506.53.0.362158166003.issue2898@psf.upfronthosting.co.za>
Content:
> Proposals like this have been rejected in the past. Memory consumption
> is an evasive concept. Lists over-allocate space
That issue is addressed in this patch.
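(For illustration, a small sketch, assuming the patch exposes per-object sizes roughly along the lines of the sys.getsizeof() API: a list's reported size tracks its over-allocated capacity, not just its length.)

    import sys

    lst = []
    for i in range(10):
        lst.append(i)
        # The reported size covers the list header plus the currently
        # allocated pointer array, so it grows in over-allocation steps
        # rather than by exactly one pointer per append.
        print(len(lst), sys.getsizeof(lst))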
> there are freelists,
but they allocate only up to an upper bound.
> there are immortal objects, the python memory allocator may hang-on to
> space thought to be available
These issues are orthogonal to the memory consumption of a single
object.
> the packing and alignment of structures
> varies across implementations
This is addressed in the current patch.
> the system memory allocator may assign
> much larger chunks than are needed for a single object
While true in general, this is not true in practice - in particular,
when objects get allocated through pymalloc.
> and the memory
> may not be freed back to the system. Because of these issues, it is
> not that meaningful to say the object x consumes y bytes.
This is not true. It is meaningful to say that (and many of the issues you
noted are independent of such a statement, since they describe the
interpreter as a whole, not an individual object).
The patch meets a real need, and is the minimum amount of code that
actually *has* to be implemented in the virtual machine to get
a reasonable analysis of the total memory consumption. Please be
practical here, not purist.
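(As a rough sketch of the kind of whole-heap analysis this enables, assuming
a per-object size query such as sys.getsizeof(); note that gc.get_objects()
only returns container objects tracked by the cycle collector, so the result
is an approximation, not an exact figure.)

    import gc
    import sys

    def approximate_heap_size():
        # Sum the reported size of every object the cycle collector tracks.
        # Untracked objects (small ints, strings, etc.) and memory the
        # allocator holds but has not handed out to objects are excluded.
        return sum(sys.getsizeof(obj) for obj in gc.get_objects())

    print(approximate_heap_size())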