Long-running process - how to speed it up?

Shaozhong SHI shishaozhong at gmail.com
Sun Feb 20 13:05:33 EST 2022


On 19 Feb 2022 at 19:44, Mats Wichmann <mats at wichmann.us> wrote:
> On 2/19/22 05:09, Shaozhong SHI wrote:
> > Can it be divided into several processes?
> > Regards,
> > David
>
> The answer is: "maybe". Multiprocessing doesn't happen for free; you
> have to figure out how to divide the task up, requiring thought and
> effort. We can't guess to what extent the problem you have is amenable
> to multiprocessing.
>
> Google for "dataframe" and "multiprocessing" and you should get some
> hits (in my somewhat limited experience in this area, people usually
> load the csv data into Pandas before they get started working with it).
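
If I understand the DataFrame suggestion correctly, the split might look
something like the sketch below: load the CSV once, cut the DataFrame into
one chunk per core, and let each worker handle a chunk. The file name,
column name and per-chunk function are just placeholders, not my real data.

import multiprocessing as mp

import numpy as np
import pandas as pd

def process_chunk(chunk):
    # Placeholder for the real per-chunk work; here it just adds
    # a derived column to its chunk and returns the result.
    chunk = chunk.copy()
    chunk["value_squared"] = chunk["value"] ** 2
    return chunk

def main():
    # Load the CSV once in the parent process (file name is made up).
    df = pd.read_csv("data.csv")

    # Split the DataFrame into one piece per CPU core.
    chunks = np.array_split(df, mp.cpu_count())

    # Each worker processes one chunk; the pieces are reassembled at the end.
    with mp.Pool(mp.cpu_count()) as pool:
        results = pool.map(process_chunk, chunks)

    out = pd.concat(results)
    print(out.head())

if __name__ == "__main__":
    main()

Whether that actually helps presumably depends on how much work
process_chunk does per chunk compared with the cost of copying the chunks
to the worker processes.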

For now, I am trying the multiprocessing approach on a toy example:
import multiprocessing as mp

def my_func(x):
    # CPU-bound work done in a worker process.
    return x**x

def main():
    # One worker per CPU core; the "with" block closes the pool for us.
    with mp.Pool(mp.cpu_count()) as pool:
        result = pool.map(my_func, [4, 2, 3])
    print(result)

if __name__ == "__main__":
    main()
I modified the script and set off a test run.
However, I have no idea whether this approach will actually be faster than
the conventional single-process approach.
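
One way to find out would be to time both versions on the same input. A
rough sketch of what I mean, using a made-up CPU-heavy function as a
stand-in for the real work:

import multiprocessing as mp
import time

def work(x):
    # CPU-heavy stand-in for the real per-item work.
    return sum(i * i for i in range(x))

def main():
    data = [200_000] * 64

    # Plain single-process loop.
    t0 = time.perf_counter()
    serial = [work(x) for x in data]
    t1 = time.perf_counter()

    # Same workload spread over a process pool.
    with mp.Pool(mp.cpu_count()) as pool:
        parallel = pool.map(work, data)
    t2 = time.perf_counter()

    assert serial == parallel
    print(f"serial:   {t1 - t0:.2f}s")
    print(f"parallel: {t2 - t1:.2f}s")

if __name__ == "__main__":
    main()

On a multi-core machine the pool version should only win once each item
does enough work to outweigh the cost of starting worker processes and
pickling the data; for tiny inputs like [4, 2, 3] the overhead will almost
certainly dominate.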
Does anyone have any ideas?
Regards,
David

