lxtokenizer

LX-Tokenizer segments text into lexically relevant tokens.

Installation

In a virtualenv (create one first if you need to):

pip3 install lxtokenizer
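If you do not already have a virtualenv, a minimal sketch of creating one with the standard-library venv module and installing the package (the environment path `~/lx-env` is an arbitrary choice):

```shell
# Create a virtual environment using the standard venv module
python3 -m venv ~/lx-env

# Activate it (POSIX shells)
. ~/lx-env/bin/activate

# Install lxtokenizer into the environment
pip3 install lxtokenizer
```

On Windows, the activation step is `lx-env\Scripts\activate` instead.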

Releases

Version  Released    Buster         Bullseye       Bookworm        Files
                     (Python 3.7)   (Python 3.9)   (Python 3.11)
0.5.9    2022-07-11
0.5.8    2022-07-11
0.5.7    2022-07-11
0.5.6    2022-07-11
0.5.3    2022-07-11

Page last updated 2023-10-28 03:49:29 UTC