Message2855
| Author | pbmtl |
|---|---|
| Date | 2001-01-10 14:17:16 |

Content:
I agree with you that it's not that important when scripts are executed using PythonWin, for example, because when PythonWin is closed the memory is recovered.
But if you use Python embedded in a server on a remote computer that is never supposed to be stopped, it's a real problem. The memory usage grows rapidly: hundreds of KB per execution in my case.
In fact, I can't call Py_Initialize and Py_Finalize for each script, because if I do, the second script to be executed does not work correctly (the call to PyEval_CallObject returns an error). Instead, I use Py_NewInterpreter and Py_EndInterpreter.
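For reference, a minimal sketch of that kind of setup, assuming a single-threaded embedder that calls Py_Initialize once at server start-up; the helper name run_script_isolated, the script file names, and the error handling are illustrative and not taken from this report:

```c
#include <stdio.h>
#include <Python.h>

/* Run one script in its own sub-interpreter so that the objects it creates
 * are torn down with that interpreter instead of accumulating in the main
 * one. Helper name and error handling are illustrative only. */
static int run_script_isolated(const char *path)
{
    PyThreadState *main_ts = PyThreadState_Get();  /* remember main state */
    PyThreadState *sub;
    FILE *fp;
    int rc = -1;

    sub = Py_NewInterpreter();        /* fresh sub-interpreter + thread state */
    if (sub == NULL)
        return -1;

    fp = fopen(path, "r");
    if (fp != NULL) {
        rc = PyRun_SimpleFile(fp, path);
        fclose(fp);
    }

    Py_EndInterpreter(sub);           /* discards the sub-interpreter's state */
    PyThreadState_Swap(main_ts);      /* make the main interpreter current again */
    return rc;
}

int main(void)
{
    Py_Initialize();                     /* once, at server start-up */
    run_script_isolated("script1.py");   /* illustrative script names */
    run_script_isolated("script2.py");
    Py_Finalize();                       /* once, at shutdown */
    return 0;
}
```

Even with this kind of per-script isolation, the process in my case still grows by hundreds of KB per execution, which is the leak being reported.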
For my part, I think the script virtual machine should be responsible for the memory created by the scripts. Otherwise, there is no guarantee that Py_Finalize will free that memory correctly.
Thanks
History
| Date | User | Action | Args |
|---|---|---|---|
| 2007-08-23 13:52:39 | admin | link | issue228040 messages |
| 2007-08-23 13:52:39 | admin | create | |