Re: Squeezing more coroutines ?
- Subject: Re: Squeezing more coroutines ?
- From: mpb <mpb.spam@...>
- Date: Fri, 1 Jun 2007 23:52:57 -0700
On 5/30/07, Bogdan Harjoc <harjoc@fokus.fraunhofer.de> wrote:
While writing a performance testing tool, I chose Lua to describe a set of tests that run in parallel.

So far things are looking great, as far as both available CPU and memory usage are concerned: after increasing LUAI_MAXCSTACK, I've been testing with 300,000 concurrent and mostly idle threads, created with lua_newthread(). Memory usage for these is about 350 MB, i.e. a little over 1 KB per created thread.

After a glance at the lua_State definition, it looks like there are a couple of members that could be left out, but nothing that would noticeably reduce sizeof(lua_State).

Has anyone managed to create something on the order of 1,000,000 coroutines? If so, any tips on what to leave out from lua_State (or other places) would be appreciated.
Hi Bogdan,
I'm curious - have you considered using Stackless Python?
http://www.stackless.com/
http://en.wikipedia.org/wiki/Stackless_Python
I've never used Stackless, but I've heard that Stackless threads
(which seem to be called Tasklets) are "just a few bytes each".
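Incidentally, for anyone else following the thread, here is a rough, untested sketch of the kind of mass lua_newthread() setup you describe. The thread count and the anchoring scheme are just my guesses, not taken from your tool:

#include <stdio.h>
#include <lua.h>
#include <lauxlib.h>
#include <lualib.h>

#define NTHREADS 300000   /* placeholder count, not from the original tool */

int main(void)
{
    lua_State *L = luaL_newstate();
    luaL_openlibs(L);

    /* Anchor every new thread in a table so the GC keeps it alive,
       popping it off the main stack as we go. */
    lua_newtable(L);
    for (int i = 0; i < NTHREADS; i++) {
        lua_newthread(L);            /* pushes the new thread onto L's stack */
        lua_rawseti(L, -2, i + 1);   /* t[i+1] = thread (pops it again) */
    }
    lua_setfield(L, LUA_REGISTRYINDEX, "threads");

    /* LUA_GCCOUNT reports the memory in use, in KB */
    printf("%d threads, %d KB in use\n", NTHREADS, lua_gc(L, LUA_GCCOUNT, 0));

    lua_close(L);
    return 0;
}

Since each thread is popped into the anchor table as soon as it is created, the main stack stays small, so this particular variant may not even need a larger LUAI_MAXCSTACK.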
Thanks!
-mpb