Message130485

Author: loewis
Recipients: Trundle, brian.curtin, giampaolo.rodola, loewis, neologix, nvetoshkin, pitrou, socketpair, terry.reedy
Date: 2011-03-10.03:14:11
SpamBayes Score: 1.679722e-07
Marked as misclassified: No
Message-id: <4D784200.8010001@v.loewis.de>
In-reply-to: <1299705685.69.0.751195607755.issue11406@psf.upfronthosting.co.za>

Content:
> a cron script which must process just a bunch of them at a time.
> There's no need to gather them all.
Can you please be more explicit? What's the application in which you
have several million files in a directory? What's the task that the
processing script needs to perform?
> http://pastebin.com/NCGmfF49 - here's a kind of test (cached and uncached)
This isn't really convincing - the test looks at all files, so it isn't
clear why xlistdir should do any better than listdir. And indeed, with
a cold cache, xlistdir is slower (IIUC).
> http://pastebin.com/tTKRTiNc - here's a testcase for batch processing of directory contents (first is xlistdir(), second is listdir()), both uncached.
This is not a real-world application - there is no actual processing done.
BTW, can you publish your xlistdir implementation somewhere?
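
For reference, here is a minimal sketch of the batch-at-a-time processing pattern being discussed. Since the actual xlistdir implementation has not been posted, the sketch wraps os.scandir() as a stand-in lazy iterator; process_batch and batch_size are hypothetical placeholders for whatever work the cron job really does:

import os
from itertools import islice

def xlistdir(path):
    # Stand-in for the proposed lazy listdir: yield names one at a
    # time instead of building the whole list in memory.  The real
    # implementation under discussion would presumably call readdir()
    # directly; this sketch simply wraps os.scandir().
    with os.scandir(path) as it:
        for entry in it:
            yield entry.name

def process_batch(names):
    # Hypothetical placeholder for the per-file work of the cron job.
    print("processing %d entries" % len(names))

def process_directory(path, batch_size=1000):
    # Consume the directory a batch at a time; at most batch_size
    # names are held in memory at any point.
    it = xlistdir(path)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            break
        process_batch(batch)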