lxtokenizer

LX-Tokenizer segments text into lexically relevant tokens.

Installation

In a virtualenv (create one first if you need to):

pip3 install lxtokenizer

Releases

Version  Released    Bullseye / Python 3.9  Bookworm / Python 3.11  Files
0.5.9    2022-07-11
0.5.8    2022-07-11
0.5.7    2022-07-11
0.5.6    2022-07-11
0.5.3    2022-07-11

Page last updated 2025-06-27 21:20:14 UTC