PyPI Stats
ts-tokenizer


Author: Taner Sezer
License: MIT
Summary: TS Tokenizer is a hybrid (lexicon-based and rule-based) tokenizer designed for Turkish text.
Latest version: 0.1.22
Required dependencies: tqdm

Downloads last day: 29
Downloads last week: 33
Downloads last month: 68