tokenize-rt

A wrapper around the stdlib `tokenize` module which roundtrips.
The stdlib `tokenize` module does not properly roundtrip. This wrapper around the stdlib provides two additional tokens, ESCAPED_NL and UNIMPORTANT_WS, and a Token data type. Use src_to_tokens and tokens_to_src to roundtrip.

This library is useful if you're writing a refactoring tool based on the python tokenization.
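As a minimal stdlib-only illustration of the roundtrip problem (exact behavior may vary by Python version): `tokenize.untokenize` can drop whitespace around a backslash line continuation, so the reconstructed source is not byte-for-byte identical to the input.

```python
import io
import tokenize

src = 'x = 1 \\\n    + 2\n'
tokens = list(tokenize.generate_tokens(io.StringIO(src).readline))
result = tokenize.untokenize(tokens)
# the space before the backslash continuation is typically not preserved,
# so result differs from src
print(result == src)
```

tokenize-rt preserves such whitespace as explicit ESCAPED_NL and UNIMPORTANT_WS tokens, so tokens_to_src(src_to_tokens(src)) returns the original source exactly.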
Installation
pip install tokenize-rt
Usage
tokenize_rt.src_to_tokens(text) -> List[Token]
tokenize_rt.tokens_to_src(Sequence[Token]) -> text
tokenize_rt.ESCAPED_NL
tokenize_rt.UNIMPORTANT_WS
tokenize_rt.Offset(line=None, utf8_byte_offset=None)
A token offset, useful as a key when cross referencing the ast and the tokenized source.
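The offset is in UTF-8 bytes because the stdlib `ast` module reports `col_offset` as a byte offset into the UTF-8 encoded source, so the two can be matched directly. A small stdlib-only illustration:

```python
import ast

src = 'π = π + 1\n'
tree = ast.parse(src)
binop = tree.body[0].value
# 'π' is 2 bytes in utf-8, so the right-hand 'π' starts at byte
# offset 5 even though it is character index 4 on the line
print(binop.left.col_offset)  # → 5
```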
tokenize_rt.Token(name, src, line=None, utf8_byte_offset=None)
Construct a token

name: one of the token names listed in token.tok_name, or ESCAPED_NL, or UNIMPORTANT_WS
src: the token's source as text
line: the line number that this token appears on. This will be None for ESCAPED_NL and UNIMPORTANT_WS tokens.
utf8_byte_offset: the utf8 byte offset that this token appears on in the line. This will be None for ESCAPED_NL and UNIMPORTANT_WS tokens.
tokenize_rt.Token.offset
Retrieves an Offset for this token.
tokenize_rt.reversed_enumerate(Sequence[Token]) -> Iterator[Tuple[int, Token]]
yields (index, token) pairs in reverse order. Useful for rewriting source.
Sample usage