Message99693
Author: Alexander.Belopolsky
Recipients: Alexander.Belopolsky, l0nwlf, loewis, michael.foord, orsenthil, ronaldoussoren
Date: 2010-02-21 22:29:11
SpamBayes Score: 4.3656134e-11
Marked as misclassified: No
Message-id: <d38f5331002211429k1e1d6d2ar20d26b094f665282@mail.gmail.com>
In-reply-to: <4B81825A.6040503@v.loewis.de>

Content:
On Sun, Feb 21, 2010 at 1:58 PM, Martin v. Löwis <report@bugs.python.org> wrote:
..
> I would propose a different strategy: if _SC_NGROUPS_MAX is defined, use
> that to find out how much memory to allocate, otherwise, fall back to
> the current max array size. Can you find out whether doing so would also
> fix the issue at hand?
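(For reference, the strategy Martin describes would look roughly like the
following minimal C sketch. The standalone main() and the MAX_GROUPS
fallback constant are assumptions for illustration, not the actual
posixmodule.c code.)

    /* Sketch of the proposed strategy: size the buffer from
     * sysconf(_SC_NGROUPS_MAX) when available, otherwise fall back
     * to a fixed maximum array size. */
    #include <sys/types.h>
    #include <unistd.h>
    #include <stdlib.h>
    #include <stdio.h>

    #define MAX_GROUPS 64  /* assumed fallback for this sketch */

    int main(void)
    {
        long n = -1;
    #ifdef _SC_NGROUPS_MAX
        /* Ask the system how many groups a process may belong to. */
        n = sysconf(_SC_NGROUPS_MAX);
    #endif
        if (n < 0)
            n = MAX_GROUPS;  /* fall back to the current max array size */

        gid_t *groups = malloc(n * sizeof(gid_t));
        if (groups == NULL)
            return 1;

        /* On OS X this can fail with EINVAL when n is smaller than
         * the number of groups actually held -- the issue shown in
         * the session below. */
        int count = getgroups((int)n, groups);
        if (count < 0)
            perror("getgroups");
        else
            printf("%d groups\n", count);
        free(groups);
        return count < 0;
    }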
I am afraid the following is evidence that it won't:
Python 2.7a3+ (trunk:78265M, Feb 20 2010, 15:20:36)
[GCC 4.2.1 (Apple Inc. build 5646) (dot 1)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import os
>>> os.sysconf('SC_NGROUPS_MAX')
16
>>> len(os.getgroups()) # with the patch
22