Model Zoo

This is a list of pretrained ParlAI models. Some are meant to be used as components in larger systems, while others may be used by themselves.

Pretrained Embeddings

Some models support pretrained word embeddings via torchtext; the fasttext_cc option (fastText vectors trained on Common Crawl), used in the example below, is one such choice.

Example invocation:

python -m parlai.scripts.train_model -t convai2 -m seq2seq -emb fasttext_cc
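Conceptually, a flag like -emb fasttext_cc initializes the model's word-embedding table from pretrained vectors, falling back to a default for out-of-vocabulary words. A toy sketch of that initialization (the vectors, vocabulary, and dimension below are made up; real fastText vectors are 300-dimensional):

```python
# Toy sketch of pretrained-embedding initialization: copy the pretrained
# vector when the word is known, otherwise fall back to zeros.
# These vectors and words are illustrative, not real fastText data.
pretrained = {"hello": [0.1, 0.2], "world": [0.3, 0.4]}
vocab = ["hello", "there", "world"]  # the model's own vocabulary
dim = 2

# One row per vocabulary word; "there" is out-of-vocabulary here.
table = [list(pretrained.get(word, [0.0] * dim)) for word in vocab]
```

In practice the copied table becomes the initial weight of the model's embedding layer, which training may then fine-tune.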

DrQA

The DrQA reader model trained on SQuAD.

Example invocation:

python -m parlai.scripts.eval_model --model drqa --task squad -mf models:drqa/squad/model
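The models: prefix in the -mf (model file) arguments above refers to a file under ParlAI's data directory, downloaded on first use rather than an ordinary filesystem path. A minimal sketch of the idea (the directory layout here is illustrative, not ParlAI's exact implementation):

```python
import os

def resolve_model_file(path, datapath="~/ParlAI/data"):
    """Sketch: expand a "models:" prefix to a local path under the
    data directory; leave ordinary filesystem paths untouched.
    The datapath default is an assumption for illustration."""
    prefix = "models:"
    if not path.startswith(prefix):
        return path
    return os.path.join(os.path.expanduser(datapath),
                        "models", path[len(prefix):])

resolve_model_file("models:drqa/squad/model")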

Wikipedia 2016-12-21

TF-IDF retrieval over a Wikipedia dump, used by DrQA on the open SQuAD dataset. This is the dump from the original paper, included for replicating its results.

Example invocation:

python -m parlai.scripts.eval_model --model tfidf_retriever --task wikipedia_2016-12-21 -mf models:wikipedia_2016-12-21/tfidf_retriever/drqa_docs

Wikipedia Full

TF-IDF retrieval over a full Wikipedia dump, used by DrQA on the open SQuAD dataset.

Example invocation:

python -m parlai.scripts.eval_model --model tfidf_retriever --task wikipedia_full -mf models:wikipedia_full/tfidf_retriever/model
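The tfidf_retriever models above rank Wikipedia articles by TF-IDF weighted term overlap with the query. A toy sketch of that scoring (the real retriever builds a sparse term-document matrix over millions of articles; the two documents below are stand-ins):

```python
import math
from collections import Counter

# Illustrative mini-corpus standing in for a Wikipedia dump.
docs = {
    "squad": "the squad reading comprehension dataset",
    "wiki": "a wikipedia dump used for open domain retrieval",
}

def tfidf_rank(query, docs):
    """Return the document whose summed TF-IDF weight over the
    query terms is highest. A sketch of the retrieval idea, not
    ParlAI's actual implementation."""
    n = len(docs)
    tokenized = {name: text.split() for name, text in docs.items()}
    # Document frequency: how many documents contain each term.
    df = Counter(t for toks in tokenized.values() for t in set(toks))
    scores = {}
    for name, toks in tokenized.items():
        tf = Counter(toks)  # term frequency within this document
        scores[name] = sum(tf[t] * math.log(n / df[t])
                           for t in query.split() if t in df)
    return max(scores, key=scores.get)

tfidf_rank("wikipedia retrieval", docs)
```

Terms that appear in every document get an IDF of log(1) = 0, so only distinctive query terms drive the ranking.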

Twitter

A generic conversational seq2seq model trained on the Twitter task.

Example invocation:

python -m parlai.scripts.eval_model --model legacy:seq2seq:0 --task twitter -mf models:twitter/seq2seq/twitter_seq2seq_model