Commit da99d1c

SF bug #1224621: tokenize module does not detect inconsistent dedents
1 parent 8fa7eb5 commit da99d1c

3 files changed: 25 additions & 1 deletion


Lib/test/test_tokenize.py

Lines changed: 19 additions & 1 deletion
@@ -1,4 +1,4 @@
-from test.test_support import verbose, findfile, is_resource_enabled
+from test.test_support import verbose, findfile, is_resource_enabled, TestFailed
 import os, glob, random
 from tokenize import (tokenize, generate_tokens, untokenize,
                       NUMBER, NAME, OP, STRING)
@@ -41,6 +41,24 @@ def test_roundtrip(f):
         test_roundtrip(f)
 
 
+###### Test detection of IndentationError ######################
+
+from cStringIO import StringIO
+
+sampleBadText = """
+def foo():
+        bar
+    baz
+"""
+
+try:
+    for tok in generate_tokens(StringIO(sampleBadText).readline):
+        pass
+except IndentationError:
+    pass
+else:
+    raise TestFailed("Did not detect IndentationError:")
+
 
 ###### Test example in the docs ###############################

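The added test still works on modern Python with two small substitutions; a minimal sketch, assuming Python 3, where `io.StringIO` replaces the removed `cStringIO` module and `tokenize.generate_tokens` still raises `IndentationError` for an inconsistent dedent:

```python
# Sketch of the same check on Python 3: "baz" dedents to column 4,
# but only columns 0 and 8 are open on the indent stack, so the
# tokenizer raises IndentationError instead of yielding a bogus DEDENT.
import io
import tokenize

sample_bad_text = (
    "def foo():\n"
    "        bar\n"
    "    baz\n"
)

def tokenizes_cleanly(source):
    """Return False if tokenizing `source` raises IndentationError."""
    try:
        for _ in tokenize.generate_tokens(io.StringIO(source).readline):
            pass
    except IndentationError:
        return False
    return True

print(tokenizes_cleanly(sample_bad_text))          # False
print(tokenizes_cleanly("def foo():\n    bar\n"))  # True
```

The helper name `tokenizes_cleanly` is ours for illustration; only `tokenize.generate_tokens` and the raised `IndentationError` come from the commit.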
Lib/tokenize.py

Lines changed: 3 additions & 0 deletions
@@ -271,6 +271,9 @@ def generate_tokens(readline):
                     indents.append(column)
                     yield (INDENT, line[:pos], (lnum, 0), (lnum, pos), line)
                 while column < indents[-1]:
+                    if column not in indents:
+                        raise IndentationError(
+                            "unindent does not match any outer indentation level")
                     indents = indents[:-1]
                     yield (DEDENT, '', (lnum, pos), (lnum, pos), line)
Misc/NEWS

Lines changed: 3 additions & 0 deletions
@@ -147,6 +147,9 @@ Extension Modules
 Library
 -------
 
+- The tokenize module now detects and reports indentation errors.
+  Bug #1224621.
+
 - The tokenize module has a new untokenize() function to support a full
   roundtrip from lexed tokens back to Python sourcecode.  In addition,
   the generate_tokens() function now accepts a callable argument that
