MeCab Tokenizer

class konoha.word_tokenizers.mecab_tokenizer.MeCabTokenizer(user_dictionary_path: str | None = None, system_dictionary_path: str | None = None, dictionary_format: str | None = None)
tokenize(text: str) → List[Token]

Tokenize the input text and return a list of Token objects.
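
A minimal usage sketch is given below. It assumes MeCab and a system dictionary are available on the machine (for example, installed together with the library via `pip install 'konoha[mecab]'`); the sample sentence and the use of the Token `surface` attribute are illustrative.

    # Minimal sketch: tokenize a Japanese sentence with MeCabTokenizer.
    # Assumes MeCab and a dictionary are installed (e.g. `pip install 'konoha[mecab]'`).
    from konoha.word_tokenizers.mecab_tokenizer import MeCabTokenizer

    tokenizer = MeCabTokenizer()  # default system dictionary, no user dictionary
    tokens = tokenizer.tokenize("私は猫が好きです。")

    # Each element is a konoha Token; `surface` holds the token string.
    print([token.surface for token in tokens])

Passing `user_dictionary_path`, `system_dictionary_path`, or `dictionary_format` at construction time switches MeCab to the corresponding dictionaries instead of the defaults.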