tokenizer-xm

A text tokenizer with options to handle contractions, lemmatize, and stem.

Installation

In a virtualenv (create one first with python3 -m venv if needed):

pip3 install tokenizer-xm
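
Example usage

This page does not document tokenizer-xm's own API, so the snippet below is only a generic sketch of the kind of pipeline the description refers to (contraction handling, tokenization, lemmatization, stemming), written with NLTK rather than tokenizer-xm's interface. The contraction map and the preprocess function are illustrative names, not part of the package.

import re

import nltk
from nltk.stem import PorterStemmer, WordNetLemmatizer
from nltk.tokenize import word_tokenize

# One-time downloads of tokenizer and WordNet data
# ("punkt" vs. "punkt_tab" depends on the NLTK version).
nltk.download("punkt", quiet=True)
nltk.download("punkt_tab", quiet=True)
nltk.download("wordnet", quiet=True)
nltk.download("omw-1.4", quiet=True)

# Minimal contraction map for illustration; a real list is much longer.
CONTRACTIONS = {"don't": "do not", "it's": "it is", "can't": "cannot"}

def preprocess(text, lemmatize=True, stem=False):
    # Expand contractions before tokenizing.
    for short, full in CONTRACTIONS.items():
        text = re.sub(re.escape(short), full, text, flags=re.IGNORECASE)
    tokens = word_tokenize(text.lower())
    if lemmatize:
        lemmatizer = WordNetLemmatizer()
        tokens = [lemmatizer.lemmatize(t) for t in tokens]
    if stem:
        stemmer = PorterStemmer()
        tokens = [stemmer.stem(t) for t in tokens]
    return tokens

print(preprocess("It's raining and the cats don't care."))

Toggling the lemmatize and stem flags mirrors the options named in the package description: lemmatization maps inflected forms to dictionary forms (cats -> cat), while stemming truncates words to a crude root and is usually applied instead of, not after, lemmatization.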

Releases

Version   Released
1.0.2     2021-08-31
1.0.1     2021-08-31
1.0       2021-08-31
0.5       2019-12-09
0.4       2019-11-19
0.2       2019-09-20
0.1       2019-09-20

