MeCab Tokenizer
class konoha.word_tokenizers.mecab_tokenizer.MeCabTokenizer(user_dictionary_path: str | None = None, system_dictionary_path: str | None = None, dictionary_format: str | None = None)
tokenize(text: str) → List[Token]
Abstract method for tokenization.