
Commit 74ca557

Added tests for tokenize, blocked corresponding checkin from trunk.
1 parent f1bb97c commit 74ca557

1 file changed: 10 additions & 2 deletions

File tree

Lib/test/test_tokenize.py
@@ -4,7 +4,7 @@
 >>> import glob, random, sys
 
 The tests can be really simple.  Given a small fragment of source
-code, print out a table with thokens.  The ENDMARK is omitted for
+code, print out a table with tokens.  The ENDMARK is omitted for
 brevity.
 
 >>> dump_tokens("1 + 1")
@@ -105,7 +105,7 @@
 ... "else: print 'Loaded'\\n")
 True
 
-Balancing contunuation
+Balancing continuation
 
 >>> roundtrip("a = (3,4, \\n"
 ... "5,6)\\n"
@@ -125,6 +125,14 @@
 NUMBER     '0xff'        (1, 0)  (1, 4)
 OP         '<='          (1, 5)  (1, 7)
 NUMBER     '255'         (1, 8)  (1, 11)
+>>> dump_tokens("0b10 <= 255")
+NUMBER     '0b10'        (1, 0)  (1, 4)
+OP         '<='          (1, 5)  (1, 7)
+NUMBER     '255'         (1, 8)  (1, 11)
+>>> dump_tokens("0o123 <= 0O123")
+NUMBER     '0o123'       (1, 0)  (1, 5)
+OP         '<='          (1, 6)  (1, 8)
+NUMBER     '0O123'       (1, 9)  (1, 14)
 >>> dump_tokens("1234567 > ~0x15")
 NUMBER     '1234567'     (1, 0)  (1, 7)
 OP         '>'           (1, 8)  (1, 9)
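The added doctests exercise the tokenizer's handling of the new binary (`0b10`) and octal (`0o123`) literals, checking that each is emitted as a single NUMBER token with the expected start/end coordinates. A minimal standalone sketch of the same check, using the stdlib `tokenize` module directly rather than the test file's `dump_tokens` helper (`show_tokens` is an illustrative name, not part of the test):

```python
import io
import tokenize

def show_tokens(source):
    # Tokenize a one-line expression and print NUMBER/OP tokens in a
    # layout similar to the dump_tokens() doctests above.
    reader = io.StringIO(source).readline
    for tok in tokenize.generate_tokens(reader):
        if tok.type in (tokenize.NUMBER, tokenize.OP):
            print(tokenize.tok_name[tok.type], repr(tok.string),
                  tok.start, tok.end)

show_tokens("0b10 <= 255")
# NUMBER '0b10' (1, 0) (1, 4)
# OP '<=' (1, 5) (1, 7)
# NUMBER '255' (1, 8) (1, 11)
```

The key property the tests pin down is that `0b10` spans columns 0-4 as one NUMBER token, rather than being split into `0` and `b10` by a tokenizer that does not know the prefix.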
