Message403512
| Author | olliemath |
|---|---|
| Recipients | olliemath |
| Date | 2021-10-08 23:52:32 |
| SpamBayes Score | -1.0 |
| Marked as misclassified | Yes |
| Message-id | <1633737153.71.0.819400116811.issue45417@roundup.psfhosted.org> |
| In-reply-to | |

Content
Creating large enums takes a significant amount of time. Moreover, this appears to be nonlinear in the number of entries in the enum. Locally, importing a single Python file and taking this to the extreme:
1000 entries - 0.058s
10000 entries - 4.327s
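
For context, the figures above came from importing a generated file; the following is a minimal sketch along the same lines, using the functional `Enum()` API so it stays self-contained. The member names are invented for illustration, and the absolute numbers will differ by machine:

```python
import time
from enum import Enum

def time_enum_creation(n):
    """Time creation of a single Enum class with n members (names are made up)."""
    names = {f"MEMBER_{i}": i for i in range(n)}
    start = time.perf_counter()
    Enum(f"Big{n}", names)
    return time.perf_counter() - start

for n in (1_000, 10_000):
    print(f"{n} entries - {time_enum_creation(n):.3f}s")
```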
This is partially addressed by https://bugs.python.org/issue38659, and I can confirm that using `@_simple_enum` does not have this problem. However, that API appears to be intended for internal use only, so the 'happy path' for user-defined enums is still slow.
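
For comparison, here is a minimal sketch of the `@_simple_enum` path from bpo-38659. Note that this is an internal, undocumented helper in CPython's enum module, so its availability and exact signature are not guaranteed to stay stable:

```python
from enum import IntEnum, _simple_enum

# _simple_enum skips most of the EnumMeta machinery and fills in the
# members directly, which is why it avoids the slowdown described above.
@_simple_enum(IntEnum)
class Colour:
    RED = 1
    GREEN = 2
    BLUE = 3
```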
Note that the cost is not simply parsing the file or creating the member instances; it is tied to the cardinality of a single enum. Creating 100 enums with 100 entries each is far faster than creating a single 10,000-entry enum.
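
A rough way to see the cardinality effect (again a sketch with invented member names, not the original measurement):

```python
import time
from enum import Enum

def timed(label, make):
    start = time.perf_counter()
    make()
    print(f"{label}: {time.perf_counter() - start:.3f}s")

# 100 separate enums with 100 members each ...
timed("100 enums x 100 entries",
      lambda: [Enum(f"E{j}", {f"M_{i}": i for i in range(100)})
               for j in range(100)])
# ... versus a single enum with 10,000 members.
timed("1 enum x 10000 entries",
      lambda: Enum("Big", {f"M_{i}": i for i in range(10_000)}))
```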
History

| Date | User | Action | Args |
|---|---|---|---|
| 2021-10-08 23:52:33 | olliemath | set | recipients: + olliemath |
| 2021-10-08 23:52:33 | olliemath | set | messageid: <1633737153.71.0.819400116811.issue45417@roundup.psfhosted.org> |
| 2021-10-08 23:52:33 | olliemath | link | issue45417 messages |
| 2021-10-08 23:52:33 | olliemath | create | |