Message91304

Author:       pitrou
Recipients:   eric.smith, ezio.melotti, ggenellina, lemburg, loewis, mark.dickinson, pitrou
Date:         2009-08-05.10:14:16
SpamBayes Score: 2.5932684e-07
Marked as misclassified: No
Message-id:   <1249467258.52.0.434004376706.issue6632@psf.upfronthosting.co.za>
In-reply-to:

Content:
On the specific point of:
> 2.1 some languages/alphabets use other chars (e.g. a comma or other
> symbols) instead of the decimal point.
I think it's not the job of the float() constructor to support it.
Depending on the country, the comma has different meanings when put in a
number (thousands separator or decimal separator). Ditto for the point,
but using a point as decimal separator is the accepted standard for
non-localized computer I/O.
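
To make this concrete, here is a minimal sketch (Python 3 syntax; the
"de_DE.UTF-8" locale name is an assumption and may not be installed on
every system) of where localized parsing already lives today:

    import locale

    # float() only understands '.' as the decimal separator,
    # regardless of the user's locale.
    float("1.5")            # 1.5
    try:
        float("1,5")        # comma is ambiguous: decimal or thousands separator?
    except ValueError as exc:
        print(exc)

    # Localized parsing belongs in the locale module, not in float().
    locale.setlocale(locale.LC_NUMERIC, "de_DE.UTF-8")
    print(locale.atof("1,5"))        # 1.5
    print(locale.atof("1.234,5"))    # 1234.5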
More generally, I think the fact that int(), float() et al. support
non-ASCII decimal digits should be seen as a convenience rather than a
willingness to accommodate the broadest possible set of inputs. Which
means we should add support for new formats only if it's sensible,
safe and unambiguous to do so.
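
As an illustration of that convenience (a minimal sketch in Python 3
syntax; the sample string uses Arabic-Indic digits):

    import unicodedata

    s = "١٢٣"   # ARABIC-INDIC digits, Unicode category Nd
    print([unicodedata.digit(c) for c in s])   # [1, 2, 3]
    print(int(s))      # 123
    print(float(s))    # 123.0 -- non-ASCII decimal digits are accepted

    # A localized decimal separator, however, is rejected:
    # float("١٢٣٫٥") raises ValueError (U+066B ARABIC DECIMAL SEPARATOR)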
I also agree with Marc-André's argument that the Unicode spec should be
a good guide here.