MeCab Tokenizer

class konoha.word_tokenizers.mecab_tokenizer.MeCabTokenizer(user_dictionary_path: Optional[str] = None, system_dictionary_path: Optional[str] = None, dictionary_format: Optional[str] = None)
tokenize(text: str) → List[Token]

Abstract method for tokenization.
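
A minimal usage sketch (not part of the reference above): it assumes MeCab and a compatible dictionary are installed locally, and that Token objects expose a surface attribute as in konoha's data model; the sample sentence is illustrative.

   from konoha.word_tokenizers.mecab_tokenizer import MeCabTokenizer

   # Build a tokenizer with the default system dictionary; pass
   # user_dictionary_path / system_dictionary_path to override it.
   tokenizer = MeCabTokenizer()

   # Tokenize a Japanese sentence and inspect the surface forms.
   tokens = tokenizer.tokenize("自然言語処理を勉強しています")
   print([token.surface for token in tokens])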