KoBERT-NER

Dependencies

Dataset

How to use KoBERT with the Hugging Face Transformers library

from transformers import BertModel
from tokenization_kobert import KoBertTokenizer  # tokenization_kobert.py ships with this repository

model = BertModel.from_pretrained('monologg/kobert')
tokenizer = KoBertTokenizer.from_pretrained('monologg/kobert')

Usage

$ python3 main.py --model_type kobert --do_train --do_eval

Prediction

$ python3 predict.py --input_file {INPUT_FILE_PATH} --output_file {OUTPUT_FILE_PATH} --model_dir {SAVED_CKPT_PATH}

Results

Model               Slot F1 (%)
-------------------------------
KoBERT              86.11
DistilKoBERT        84.13
BERT-Multilingual   84.20
CNN-BiLSTM-CRF      74.57
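Slot F1 here presumably means entity-level F1 over exact span matches of BIO tags, the metric style that seqeval computes for sequence labeling. A minimal self-contained sketch of that metric (simplified: an `I-` tag with a mismatched type closes the running entity without opening a new one):

```python
def extract_entities(tags):
    """Return a set of (type, start, end) spans from a BIO tag sequence."""
    entities, start, etype = set(), None, None
    for i, tag in enumerate(tags + ["O"]):  # sentinel "O" flushes the last entity
        if tag.startswith("B-") or tag == "O" or (tag.startswith("I-") and etype != tag[2:]):
            if etype is not None:
                entities.add((etype, start, i))
                etype = None
            if tag.startswith("B-"):
                start, etype = i, tag[2:]
        # an I- tag matching the running entity type simply extends the span
    return entities

def slot_f1(gold_seqs, pred_seqs):
    """Micro-averaged entity-level F1: a predicted span counts only on exact match."""
    tp = fp = fn = 0
    for gold, pred in zip(gold_seqs, pred_seqs):
        g, p = extract_entities(gold), extract_entities(pred)
        tp += len(g & p)
        fp += len(p - g)
        fn += len(g - p)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0
```

For example, predicting one of two gold entities exactly gives precision 1.0, recall 0.5, and F1 ≈ 0.667.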

References