pip install bert-multitask-learning
This is a project that uses BERT for multi-task learning with multiple-GPU support.
The original BERT code supports neither multi-task learning nor multi-GPU training. Moreover, the original purpose of this project is NER, which does not have a working script in the original BERT code.
To sum up, compared to the original BERT repo, this repo has the following features:
There are two types of chaining operations that can be used to chain problems:

- `&`: If two problems have the same inputs, they can be chained using `&`. Problems chained by `&` will be trained at the same time.
- `|`: If two problems don't have the same inputs, they need to be chained using `|`. Problems chained by `|` will be sampled for training at each instance.

For example, given `cws|NER|weibo_ner&weibo_cws`, one problem chunk is sampled at each turn, say `weibo_ner&weibo_cws`; then `weibo_ner` and `weibo_cws` will be trained together for that turn. Therefore, in a particular batch, some tasks might not be sampled, and their loss could be 0 in that batch.
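The chaining semantics above can be sketched roughly as follows. This is an illustrative sketch only; `parse_problem_chain` and `sample_chunk` are hypothetical helper names, not part of the package's actual API:

```python
import random

def parse_problem_chain(chain: str):
    """Split a problem chain like 'cws|NER|weibo_ner&weibo_cws' into chunks.

    '|' separates chunks that are sampled independently per instance;
    '&' groups problems that share inputs and are trained together.
    """
    return [chunk.split("&") for chunk in chain.split("|")]

def sample_chunk(chunks, rng=random):
    """Sample one chunk for a training instance.

    Every problem in the sampled chunk is trained simultaneously for
    this instance; problems in the other chunks contribute zero loss.
    """
    return rng.choice(chunks)

chunks = parse_problem_chain("cws|NER|weibo_ner&weibo_cws")
# chunks == [['cws'], ['NER'], ['weibo_ner', 'weibo_cws']]
picked = sample_chunk(chunks)
```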
Please see the examples in notebooks for more details about training, evaluation, and exporting models.