# Unigram Agent

A trivial baseline model that always predicts the \(n\) most common unigrams from the training data, regardless of input.

## Basic Examples

Training the unigram model on the convai2 task:

```shell
parlai train_model -m unigram -mf unigram.model -t convai2 -eps 1 --num-words 15
```
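The agent's behavior is simple enough to sketch outside ParlAI: accumulate unigram counts over the training text, then always emit the top \(n\) words no matter what the input is. The class below (`UnigramBaseline` and its methods) is an illustrative stand-in, not ParlAI's actual `UnigramAgent` implementation.

```python
from collections import Counter


class UnigramBaseline:
    """Toy unigram baseline: always predicts the n most common training words.

    Illustrative sketch only; not ParlAI's actual UnigramAgent.
    """

    def __init__(self, num_words=10):
        self.num_words = num_words
        self.counts = Counter()

    def observe(self, text):
        # Accumulate unigram counts from whitespace-tokenized training text.
        self.counts.update(text.lower().split())

    def act(self, _input_text=""):
        # The prediction ignores the input entirely.
        top = [word for word, _ in self.counts.most_common(self.num_words)]
        return " ".join(top)


model = UnigramBaseline(num_words=3)
model.observe("the cat sat on the mat")
model.observe("the dog sat")
print(model.act("anything at all"))  # the three most frequent training unigrams
```

Because the output depends only on the accumulated counts, every input yields the same prediction, which is what makes this a useful floor for comparing real dialogue models against.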

## DictionaryAgent Options

### BPEHelper Arguments

| Argument | Description |
| --- | --- |
| `--bpe-vocab` | Path to pre-trained tokenizer vocab |
| `--bpe-merge` | Path to pre-trained tokenizer merge |

## UnigramAgent Options

### optional arguments

| Argument | Description |
| --- | --- |
| `--num-words` | Number of unigrams to output. Default: `10`. |

### BPEHelper Arguments

| Argument | Description |
| --- | --- |
| `--bpe-vocab` | Path to pre-trained tokenizer vocab |
| `--bpe-merge` | Path to pre-trained tokenizer merge |