Message 117988
| Author | goddard |
| Recipients | goddard |
| Date | 2010-10-04.23:33:04 |
| SpamBayes Score | 3.782752e-05 |
| Marked as misclassified | No |
| Message-id | <1286235186.65.0.07824350342.issue10025@psf.upfronthosting.co.za> |
| In-reply-to | |

Content
In Python 2.7, random.seed() with a string argument is documented as being equivalent to random.seed() called with the hash of that string. This is not the actual behavior: reading the _random C code reveals that it in fact casts the signed hash value to unsigned long before seeding. The same behavior appears in Python 2.5.2.

Rather than fix this in 2.7.1, it seems preferable to just correct the documentation in 2.7.1 and preserve backward compatibility. Bug #7889 has already addressed this problem in Python 3.2 by eliminating the use of hash() for non-integer random.seed() arguments.

I encountered this problem while trying to produce the same sequences of random numbers on 64-bit architectures as on 32-bit architectures.
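To make the cast concrete, here is a minimal sketch (the helper name is mine, not from the library) that should reproduce what random.seed(string) actually does on a given platform, assuming the _random behavior described above:

import ctypes
import random

def seed_like_27(s):
    # Emulate Python 2.7's actual behavior: seed with hash(s) cast to a
    # C unsigned long. ctypes.c_ulong performs the same signed-to-unsigned
    # cast as the C code, at the platform's unsigned long width (32 or 64 bits).
    random.seed(ctypes.c_ulong(hash(s)).value)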
Here is a demonstration of the bug in Python 2.7, 32-bit:

>>> import random
>>> random.seed('1pov')
>>> random.uniform(0, 1)
0.713827305919223
>>> random.seed(hash('1pov'))
>>> random.uniform(0, 1)
0.40934677883730686
>>> hash('1pov')
-747753952
>>> random.seed(hash('1pov') + 2**32)  # unsigned long cast, applied by hand
>>> random.uniform(0, 1)
0.713827305919223
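For the original goal of reproducing sequences across 32-bit and 64-bit builds, a minimal sketch of a workaround (the helper name is mine, not from this issue): derive the seed from a platform-independent digest of the string rather than from the platform-dependent hash():

import hashlib
import random

def portable_seed(s):
    # The md5 digest of the string's bytes is identical on every platform,
    # unlike hash(), so seeding from it yields the same random sequence
    # on 32-bit and 64-bit builds.
    random.seed(int(hashlib.md5(s).hexdigest(), 16))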
History

| Date | User | Action | Args |
|---|---|---|---|
| 2010-10-04 23:33:06 | goddard | set | recipients: + goddard |
| 2010-10-04 23:33:06 | goddard | set | messageid: <1286235186.65.0.07824350342.issue10025@psf.upfronthosting.co.za> |
| 2010-10-04 23:33:05 | goddard | link | issue10025 messages |
| 2010-10-04 23:33:04 | goddard | create | |