| Author | jaredgrubb |
|---|---|
| Recipients | jaredgrubb |
| Date | 2008-02-25.02:22:29 |
| SpamBayes Score | 0.023912596 |
| Marked as misclassified | No |
| Message-id | <1203906150.42.0.824335055729.issue2180@psf.upfronthosting.co.za> |
| In-reply-to | |

**Content**
CPython accepts a `\` line continuation at EOF, but tokenize does not.
```
>>> s = 'print 1\\\n'
>>> exec s
1
>>> tokenize.tokenize(StringIO(s).readline)
1,0-1,5:    NAME    'print'
1,6-1,7:    NUMBER  '1'
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/tokenize.py", line 153, in tokenize
    tokenize_loop(readline, tokeneater)
  File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/tokenize.py", line 159, in tokenize_loop
    for token_info in generate_tokens(readline):
  File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/tokenize.py", line 283, in generate_tokens
    raise TokenError, ("EOF in multi-line statement", (lnum, 0))
tokenize.TokenError: ('EOF in multi-line statement', (2, 0))
```
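
For reference, here is a minimal sketch of the same experiment on a Python 3 interpreter (an assumption on my part; the report above is against Python 2.5). It uses `tokenize.generate_tokens` with a readline callable and catches `tokenize.TokenError` rather than assuming either outcome, since the tokenizer's handling of a trailing backslash may differ between versions:

```python
import io
import tokenize

# Source text ending in a backslash line continuation right before EOF.
s = 'print(1)\\\n'

# The compiler accepted the Python 2 equivalent on the reporter's 2.5
# interpreter (it printed 1); catch SyntaxError just in case a given
# version rejects it.
try:
    exec(s)
except SyntaxError as exc:
    print('SyntaxError:', exc)

# tokenize may reject the same source with TokenError
# ("EOF in multi-line statement"), which is the behavior reported above.
try:
    for tok in tokenize.generate_tokens(io.StringIO(s).readline):
        print(tok)
except tokenize.TokenError as exc:
    print('TokenError:', exc)
```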
**History**

| Date | User | Action | Args |
|---|---|---|---|
| 2008-02-25 02:22:30 | jaredgrubb | set | spambayes_score: 0.0239126 -> 0.023912596; recipients: + jaredgrubb |
| 2008-02-25 02:22:30 | jaredgrubb | set | spambayes_score: 0.0239126 -> 0.0239126; messageid: <1203906150.42.0.824335055729.issue2180@psf.upfronthosting.co.za> |
| 2008-02-25 02:22:29 | jaredgrubb | link | issue2180 messages |
| 2008-02-25 02:22:29 | jaredgrubb | create | |