Message28370
| Author | eriban |
|---|---|
| Date | 2006-04-28 14:37:44 |

Content:
The function datetime.datetime.fromtimestamp() can raise a ValueError when the timestamp is close to, but not exactly equal to, an integer value due to floating-point rounding errors. It then fails with the following error: "microsecond must be in 0..999999".
This can be seen by running the attached code (the
values are taken from an actual event log), which
gives the following output:
1146227423.0 -> 2006-04-28 14:30:23
1146227448.7 -> 2006-04-28 14:30:48.702000
1146227459.95 -> 2006-04-28 14:30:59.947000
1146227468.41 -> 2006-04-28 14:31:08.409000
1146227501.4 -> 2006-04-28 14:31:41.399000
1146227523.0 -> Error converting 1146227522.99999976
microsecond must be in 0..999999
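The failure can be traced to the microsecond computation: a naive split of the timestamp into whole seconds and a rounded fractional part, similar in spirit to what the conversion did at the time, rounds up to 1000000 for values just below an integer. A minimal sketch (`naive_fromtimestamp_parts` is a hypothetical helper for illustration, not the actual CPython code):

```python
def naive_fromtimestamp_parts(t):
    # Hypothetical sketch of the failing conversion, not the real
    # implementation: split the timestamp into whole seconds and a
    # rounded microsecond component.
    seconds = int(t)
    microseconds = int(round((t - seconds) * 1e6))
    # For t just below an integer, rounding yields 1000000, which falls
    # outside datetime's valid microsecond range of 0..999999.
    return seconds, microseconds

print(naive_fromtimestamp_parts(1146227522.99999976))
# -> (1146227522, 1000000)
```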
Admittedly, I can work around the bug in this case by summing the durations first and calculating all times from "starttime" directly. Nevertheless, I think this is a bug in datetime: it should work for any floating-point input within the supported range (based on the date range that is supported).
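A defensive workaround on the caller's side is to do the split manually and carry a rounded-up microsecond value into the seconds component before building the datetime. A hedged sketch (`safe_fromtimestamp` is a hypothetical wrapper, written for non-negative timestamps):

```python
import datetime


def safe_fromtimestamp(t):
    # Hypothetical workaround wrapper: split the timestamp ourselves so the
    # microsecond rounding overflow can be handled before datetime sees it.
    # Assumes t >= 0 (int() truncation would need adjusting for negative t).
    seconds = int(t)
    microseconds = int(round((t - seconds) * 1e6))
    if microseconds >= 1000000:
        # Rounding pushed the fraction past the valid 0..999999 range;
        # carry it into the seconds component instead.
        seconds += 1
        microseconds = 0
    return datetime.datetime.fromtimestamp(seconds).replace(
        microsecond=microseconds)


print(safe_fromtimestamp(1146227522.99999976))
```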
Details of my Python environment:
Python 2.4.2 (#1, Feb 6 2006, 13:53:18)
[GCC 3.2.3 20030502 (Red Hat Linux 3.2.3-53)] on linux2
Cheers,
Erwin
History

| Date | User | Action | Args |
|---|---|---|---|
| 2007-08-23 14:39:41 | admin | link | issue1478429 messages |
| 2007-08-23 14:39:41 | admin | create | |