Opt Presets

Opt presets provide shorthand for passing multiple command-line options at once. Presets are bundled with ParlAI and may be used by simply invoking the -o preset_name option within any ParlAI command.

You may also define your own presets by placing them in ~/.parlai/opt_presets/. For example, creating ~/.parlai/opt_presets/myfolder/mypreset.opt allows you to invoke it via -o myfolder/mypreset. Preset files are simple JSON files containing a dictionary of options. For example:

{
    "inference": "beam",
    "beam_size": 10
}
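Concretely, such a preset file can be created with a few lines of Python. The `write_preset` helper below is a hypothetical illustration, not part of ParlAI; the demo writes to a temporary directory, but pointing `root` at ~/.parlai/opt_presets would make the preset visible to ParlAI as -o myfolder/mypreset:

```python
import json
import tempfile
from pathlib import Path

def write_preset(root: Path, name: str, opts: dict) -> Path:
    """Write `opts` as a JSON opt preset file named `name` under `root`."""
    path = root / f"{name}.opt"
    # Nested names like "myfolder/mypreset" need their folder created first.
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(opts, indent=4))
    return path

# Demo in a temporary directory rather than ~/.parlai/opt_presets.
root = Path(tempfile.mkdtemp())
preset = write_preset(root, "myfolder/mypreset", {"inference": "beam", "beam_size": 10})
print(json.loads(preset.read_text()))  # {'inference': 'beam', 'beam_size': 10}
```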

List of presets

The following is a list of all options presets bundled with the latest version of ParlAI.

Each entry below lists the preset name, a short description, and the full set of options the preset expands to.

arch/bart_large

Architecture parameters (number of layers, etc.) for BART-Large. See Lewis et al. (2019).

--activation gelu --attention-dropout 0.0 --dict-file zoo:bart/bart_large/model.dict --dict-tokenizer gpt2 --dropout 0.1 --embedding-size 1024 --embeddings-scale False --ffn-size 4096 --force-fp16-tokens True --fp16 True --init-model zoo:bart/bart_large/model --learn-positional-embeddings True --model bart --n-decoder-layers 12 --n-encoder-layers 12 --n-heads 16 --n-positions 1024 --variant bart

arch/blenderbot_3B

Architecture parameters (number of layers, etc.) for BlenderBot 3B. See Roller et al. (2020).

--activation gelu --attention-dropout 0 --embedding-size 2560 --ffn-size 10240 --label-truncate 128 --model transformer/generator --n-decoder-layers 24 --n-encoder-layers 2 --n-heads 32 --n-positions 128 --relu-dropout 0 --text-truncate 128 --truncate 128 --variant prelayernorm

arch/r2c2_base_3B

Architecture parameters for the R2C2 Base 3B model. See Shuster et al. (2022).

--activation gelu --delimiter \n --dict-tokenizer gpt2 --embedding-size 2048 --embeddings-scale True --ffn-size 8192 --force-fp16-tokens True --history-add-global-end-token end --init-model zoo:seeker/r2c2_base_3B/model --label-truncate 1024 --learn-positional-embeddings True --model bart --n-decoder-layers 22 --n-encoder-layers 22 --n-heads 32 --n-layers 22 --n-positions 1024 --n-segments 0 --output-scaling 1 --share-word-embeddings True --split-lines True --text-truncate 1024 --truncate 1024 --variant prelayernorm

arch/r2c2_base_400M

Architecture parameters for the R2C2 Base 400M model. See Shuster et al. (2022).

--activation gelu --attention-dropout 0.1 --delimiter \n --dict-tokenizer gpt2 --dropout 0.1 --embedding-size 1024 --embeddings-scale True --ffn-size 4096 --force-fp16-tokens True --history-add-global-end-token None --init-model zoo:seeker/r2c2_base_400M/model --label-truncate 1024 --learn-positional-embeddings True --model bart --n-decoder-layers 12 --n-encoder-layers 12 --n-heads 16 --n-layers 12 --n-positions 1024 --n-segments 0 --output-scaling 1 --rank-candidates False --relu-dropout 0 --share-word-embeddings True --split-lines True --text-truncate 1024 --truncate 1024 --variant prelayernorm

gen/blenderbot

Beam search parameters for BlenderBot. See Roller et al. (2020).

--beam-context-block-ngram 3 --beam-block-ngram 3 --beam-size 10 --inference beam --beam-min-length 20 --beam-block-full-context False

gen/meena

Inference parameters for the Sample & Rank procedure of Meena. See Adiwardana et al. (2020).

--beam-size 20 --inference topk --topk 40

gen/opt_bb3

Generation parameters for BB3-175B. See https://parl.ai/projects/bb3

--sdm-inference greedy --sdm-beam-min-length 1 --sdm-beam-max-length 10 --sdm-generation-take-last-newline False --sdm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --sdm-history-size 1 --sdm-module sdm --sdm-max-prompt-len 1912 --sdm-penalize-repetitions False --sdm-penalize-ctxt-repetitions False --sdm-exclude-knowledge-from-ctxt-penalty False --search-decision compute --search-decision-control-token --search-decision-do-search-reply search --search-decision-dont-search-reply do not search --sdm-server opt_server --mdm-inference greedy --mdm-beam-min-length 1 --mdm-beam-max-length 10 --mdm-generation-take-last-newline False --mdm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --mdm-history-size -1 --mdm-module mdm --mdm-max-prompt-len 1912 --mdm-penalize-repetitions False --mdm-penalize-ctxt-repetitions False --mdm-exclude-knowledge-from-ctxt-penalty False --memory-decision compute --memory-decision-control-token --memory-decision-do-access-reply access memory --memory-decision-dont-access-reply do not access memory --memory-decision-use-memories True --mdm-server opt_server --search-query-control-token --search-server default --sgm-generation-take-last-newline False --sgm-inference greedy --sgm-beam-min-length 1 --sgm-beam-max-length 32 --sgm-module sgm --sgm-max-prompt-len 1912 --sgm-exclude-knowledge-from-ctxt-penalty False --sgm-penalize-repetitions False --sgm-penalize-ctxt-repetitions False --sgm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --sgm-server opt_server --memory-generator-control-token --mgm-inference greedy --mgm-beam-min-length 1 --mgm-beam-max-length 32 --mgm-generation-take-last-newline False --mgm-module mgm --mgm-max-prompt-len 1912 --mgm-exclude-knowledge-from-ctxt-penalty False --mgm-penalize-repetitions False --mgm-penalize-ctxt-repetitions False --mgm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --mgm-server opt_server --memory-knowledge-control-token --mkm-inference greedy --mkm-beam-min-length 1 
--mkm-beam-max-length 32 --mkm-generation-take-last-newline False --mkm-module mkm --mkm-max-prompt-len 1412 --mkm-exclude-knowledge-from-ctxt-penalty False --mkm-penalize-ctxt-repetitions False --mkm-penalize-repetitions True --mkm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --mkm-server opt_server --ignore-in-session-memories-mkm False --memory-overlap-threshold 0 --memory-hard-block-for-n-turns 0 --memory-soft-block-decay-factor 0 --contextual-knowledge-control-token --contextual-knowledge-decision compute --ckm-inference greedy --ckm-beam-min-length 1 --ckm-beam-max-length 32 --ckm-module ckm --ckm-max-prompt-len 1812 --ckm-generation-take-last-newline False --ckm-exclude-knowledge-from-ctxt-penalty False --ckm-penalize-ctxt-repetitions False --ckm-penalize-repetitions True --ckm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --ckm-server opt_server --search-knowledge-control-token --skm-inference greedy --skm-beam-min-length 1 --skm-beam-max-length 64 --skm-module skm --skm-max-prompt-len 1412 --skm-generation-take-last-newline False --skm-penalize-ctxt-repetitions False --skm-penalize-repetitions True --skm-exclude-knowledge-from-ctxt-penalty False --skm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --skm-server opt_server --srm-inference factual_nucleus --srm-beam-min-length 20 --srm-beam-max-length 128 --srm-beam-size 1 --srm-generation-take-last-newline False --srm-penalize-ctxt-repetitions True --srm-penalize-repetitions True --srm-exclude-knowledge-from-ctxt-penalty True --srm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --srm-module srm --srm-max-prompt-len 1784 --srm-server opt_server --crm-inference factual_nucleus --crm-beam-min-length 20 --crm-beam-max-length 128 --crm-beam-size 1 --crm-generation-take-last-newline False --crm-module crm --crm-max-prompt-len 1880 --crm-penalize-ctxt-repetitions False --crm-penalize-repetitions True --crm-exclude-knowledge-from-ctxt-penalty True --crm-model 
projects.bb3.agents.opt_api_agent:BB3OPTAgent --crm-server opt_server --mrm-inference factual_nucleus --mrm-beam-min-length 20 --mrm-beam-max-length 128 --mrm-beam-size 1 --mrm-module mrm --mrm-max-prompt-len 1848 --mrm-generation-take-last-newline False --mrm-penalize-ctxt-repetitions False --mrm-penalize-repetitions True --mrm-exclude-knowledge-from-ctxt-penalty True --mrm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --mrm-server opt_server --vrm-inference factual_nucleus --vrm-beam-min-length 20 --vrm-beam-max-length 128 --vrm-beam-size 1 --vrm-module vrm --vrm-max-prompt-len 1912 --vrm-generation-take-last-newline False --vrm-penalize-ctxt-repetitions False --vrm-penalize-repetitions True --vrm-exclude-knowledge-from-ctxt-penalty False --vrm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --vrm-server opt_server --grm-inference factual_nucleus --grm-beam-min-length 20 --grm-beam-max-length 128 --grm-beam-size 1 --grm-module grm --grm-max-prompt-len 1880 --grm-generation-take-last-newline False --grm-penalize-ctxt-repetitions False --grm-penalize-repetitions True --grm-exclude-knowledge-from-ctxt-penalty False --grm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --grm-server opt_server --orm-inference factual_nucleus --orm-beam-min-length 1 --orm-beam-max-length 128 --orm-beam-size 1 --orm-module orm --orm-max-prompt-len 1412 --orm-generation-take-last-newline False --orm-penalize-ctxt-repetitions False --orm-penalize-repetitions False --orm-exclude-knowledge-from-ctxt-penalty False --orm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --orm-server opt_server --datatype valid --inject-query-string None --loglevel info --model projects.bb3.agents.opt_bb3_agent:BlenderBot3Agent --beam-disregard-knowledge-for-srm-context-blocking False --exclude-context-in-skm-context-blocking False --include-knowledge-in-ckm-context-blocking False --knowledge-conditioning combined --num-shots 0 --include-prompt False --knowledge-chunk-size 100 --max-prompt-len 
1912 --all-vanilla-prompt False

gen/opt_pt

Generation parameters for OPT-175B in BB3 setup. See https://parl.ai/projects/bb3

--sdm-inference greedy --sdm-beam-min-length 1 --sdm-beam-max-length 10 --sdm-generation-take-last-newline False --sdm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --sdm-history-size 1 --sdm-module sdm --sdm-max-prompt-len 1912 --sdm-penalize-repetitions False --sdm-penalize-ctxt-repetitions False --sdm-exclude-knowledge-from-ctxt-penalty False --search-decision compute --search-decision-control-token --search-decision-do-search-reply search --search-decision-dont-search-reply do not search --sdm-server opt_server --mdm-inference greedy --mdm-beam-min-length 1 --mdm-beam-max-length 10 --mdm-generation-take-last-newline False --mdm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --mdm-history-size -1 --mdm-module mdm --mdm-max-prompt-len 1912 --mdm-penalize-repetitions False --mdm-penalize-ctxt-repetitions False --mdm-exclude-knowledge-from-ctxt-penalty False --memory-decision compute --memory-decision-control-token --memory-decision-do-access-reply access memory --memory-decision-dont-access-reply do not access memory --memory-decision-use-memories True --mdm-server opt_server --search-query-control-token --search-server default --sgm-generation-take-last-newline False --sgm-inference greedy --sgm-beam-min-length 2 --sgm-beam-max-length 32 --sgm-module sgm --sgm-max-prompt-len 1912 --sgm-exclude-knowledge-from-ctxt-penalty False --sgm-penalize-repetitions False --sgm-penalize-ctxt-repetitions False --sgm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --sgm-server opt_server --memory-generator-control-token --mgm-inference greedy --mgm-beam-min-length 10 --mgm-beam-max-length 32 --mgm-generation-take-last-newline False --mgm-module mgm --mgm-max-prompt-len 1912 --mgm-exclude-knowledge-from-ctxt-penalty False --mgm-penalize-repetitions False --mgm-penalize-ctxt-repetitions False --mgm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --mgm-server opt_server --memory-knowledge-control-token --mkm-inference greedy --mkm-beam-min-length 5 
--mkm-beam-max-length 32 --mkm-generation-take-last-newline False --mkm-module mkm --mkm-max-prompt-len 1412 --mkm-exclude-knowledge-from-ctxt-penalty False --mkm-penalize-ctxt-repetitions False --mkm-penalize-repetitions False --mkm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --mkm-server opt_server --ignore-in-session-memories-mkm False --memory-overlap-threshold 0 --memory-hard-block-for-n-turns 0 --memory-soft-block-decay-factor 0 --contextual-knowledge-control-token --contextual-knowledge-decision compute --ckm-inference greedy --ckm-beam-min-length 1 --ckm-beam-max-length 32 --ckm-module ckm --ckm-max-prompt-len 1812 --ckm-generation-take-last-newline False --ckm-exclude-knowledge-from-ctxt-penalty False --ckm-penalize-ctxt-repetitions False --ckm-penalize-repetitions False --ckm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --ckm-server opt_server --search-knowledge-control-token --skm-inference greedy --skm-beam-min-length 10 --skm-beam-max-length 32 --skm-module skm --skm-max-prompt-len 1412 --skm-generation-take-last-newline False --skm-penalize-ctxt-repetitions False --skm-penalize-repetitions False --skm-exclude-knowledge-from-ctxt-penalty False --skm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --skm-server opt_server --srm-inference nucleus --srm-beam-min-length 1 --srm-beam-max-length 32 --srm-beam-size 1 --srm-generation-take-last-newline False --srm-penalize-ctxt-repetitions False --srm-penalize-repetitions False --srm-exclude-knowledge-from-ctxt-penalty False --srm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --srm-module srm --srm-max-prompt-len 1784 --srm-server opt_server --crm-inference nucleus --crm-beam-min-length 1 --crm-beam-max-length 32 --crm-beam-size 1 --crm-generation-take-last-newline False --crm-module crm --crm-max-prompt-len 1880 --crm-penalize-ctxt-repetitions False --crm-penalize-repetitions False --crm-exclude-knowledge-from-ctxt-penalty False --crm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent 
--crm-server opt_server --mrm-inference nucleus --mrm-beam-min-length 1 --mrm-beam-max-length 32 --mrm-beam-size 1 --mrm-module mrm --mrm-max-prompt-len 1848 --mrm-generation-take-last-newline False --mrm-penalize-ctxt-repetitions False --mrm-penalize-repetitions False --mrm-exclude-knowledge-from-ctxt-penalty False --mrm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --mrm-server opt_server --vrm-inference nucleus --vrm-beam-min-length 1 --vrm-beam-max-length 32 --vrm-beam-size 1 --vrm-module vrm --vrm-max-prompt-len 1912 --vrm-generation-take-last-newline False --vrm-penalize-ctxt-repetitions False --vrm-penalize-repetitions False --vrm-exclude-knowledge-from-ctxt-penalty False --vrm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --vrm-server opt_server --grm-inference nucleus --grm-beam-min-length 1 --grm-beam-max-length 32 --grm-beam-size 1 --grm-module grm --grm-max-prompt-len 1880 --grm-generation-take-last-newline False --grm-penalize-ctxt-repetitions False --grm-penalize-repetitions False --grm-exclude-knowledge-from-ctxt-penalty False --grm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --grm-server opt_server --orm-inference nucleus --orm-beam-min-length 1 --orm-beam-max-length 32 --orm-beam-size 1 --orm-module orm --orm-max-prompt-len 1412 --orm-generation-take-last-newline False --orm-penalize-ctxt-repetitions False --orm-penalize-repetitions False --orm-exclude-knowledge-from-ctxt-penalty False --orm-model projects.bb3.agents.opt_api_agent:BB3OPTAgent --orm-server opt_server --datatype valid --inject-query-string None --loglevel info --model projects.bb3.agents.opt_bb3_agent:BlenderBot3Agent --beam-disregard-knowledge-for-srm-context-blocking False --exclude-context-in-skm-context-blocking False --include-knowledge-in-ckm-context-blocking False --knowledge-conditioning combined --num-shots -1 --include-prompt True --knowledge-chunk-size 100 --max-prompt-len 1912 --all-vanilla-prompt False

gen/r2c2_bb3

Generation parameters for BB3-3B. See https://parl.ai/projects/bb3

--sdm-beam-block-ngram -1 --sdm-beam-min-length 1 --sdm-beam-size 1 --sdm-history-size 1 --sdm-inference greedy --search-decision compute --search-decision-control-token __is-search-required__ --search-decision-do-search-reply __do-search__ --search-decision-dont-search-reply __do-not-search__ --mdm-beam-block-ngram -1 --mdm-beam-min-length 1 --mdm-beam-size 1 --mdm-history-size -1 --mdm-inference greedy --mdm-model projects.bb3.agents.r2c2_bb3_agent:BB3SubSearchAgent --memory-decision compute --memory-decision-control-token __is-memory-required__ --memory-decision-do-access-reply __do-access-memory__ --memory-decision-dont-access-reply __do-not-access-memory__ --memory-decision-use-memories True --search-query-control-token __generate-query__ --search-server relevant_search_server --sgm-beam-block-ngram -1 --sgm-beam-min-length 2 --sgm-beam-size 1 --sgm-inference beam --sgm-model projects.bb3.agents.r2c2_bb3_agent:BB3SubSearchAgent --memory-generator-control-token __generate-memory__ --mgm-beam-block-ngram 3 --mgm-beam-min-length 10 --mgm-beam-size 3 --mgm-inference beam --mgm-history-size 1 --mgm-model projects.bb3.agents.r2c2_bb3_agent:BB3SubSearchAgent --memory-knowledge-control-token __access-memory__ --mkm-beam-block-ngram 3 --mkm-beam-context-block-ngram -1 --mkm-beam-min-length 5 --mkm-beam-size 3 --mkm-inference beam --mkm-model projects.bb3.agents.r2c2_bb3_agent:BB3SubSearchAgent --mkm-rag-retriever-type search_engine --mkm-search-query-generator-model-file '' --mkm-search-server --mkm-memory-retriever True --contextual-knowledge-control-token __extract-entity__ --ckm-beam-block-ngram 3 --ckm-beam-context-block-ngram 3 --ckm-beam-min-length 1 --ckm-beam-size 3 --ckm-inference beam --ckm-model projects.bb3.agents.r2c2_bb3_agent:BB3SubSearchAgent --search-knowledge-control-token __generate-knowledge__ --skm-beam-block-ngram 3 --skm-beam-context-block-ngram 3 --skm-beam-min-length 10 --skm-beam-size 3 --skm-doc-chunks-ranker woi_chunk_retrieved_docs 
--skm-inference beam --skm-model projects.bb3.agents.r2c2_bb3_agent:BB3SubSearchAgent --skm-n-ranked-doc-chunks 1 --skm-rag-retriever-type search_engine --skm-search-query-generator-model-file '' --srm-beam-block-full-context True --srm-beam-block-ngram 3 --srm-beam-context-block-ngram 3 --srm-beam-min-length 20 --srm-beam-size 10 --srm-inference beam --srm-model projects.bb3.agents.r2c2_bb3_agent:BB3SubSearchAgent --crm-beam-block-full-context True --crm-beam-block-ngram 3 --crm-beam-context-block-ngram 3 --crm-beam-min-length 20 --crm-beam-size 10 --crm-inference beam --crm-model projects.bb3.agents.r2c2_bb3_agent:BB3SubSearchAgent --mrm-beam-block-full-context True --mrm-beam-block-ngram 3 --mrm-beam-context-block-ngram 3 --mrm-beam-min-length 20 --mrm-beam-size 10 --mrm-inference beam --mrm-model projects.bb3.agents.r2c2_bb3_agent:BB3SubSearchAgent --grm-beam-block-full-context True --grm-beam-block-ngram 3 --grm-beam-context-block-ngram 3 --grm-beam-min-length 20 --grm-beam-size 10 --grm-inference beam --grm-model projects.bb3.agents.r2c2_bb3_agent:BB3SubSearchAgent --vrm-beam-block-full-context True --vrm-beam-block-ngram 3 --vrm-beam-context-block-ngram 3 --vrm-beam-min-length 20 --vrm-beam-size 10 --vrm-inference beam --vrm-model projects.bb3.agents.r2c2_bb3_agent:BB3SubSearchAgent --orm-beam-block-full-context True --orm-beam-block-ngram 3 --orm-beam-context-block-ngram 3 --orm-beam-min-length 20 --orm-beam-size 10 --orm-inference beam --orm-model projects.bb3.agents.r2c2_bb3_agent:BB3SubSearchAgent --datatype valid --beam-disregard-knowledge-for-srm-context-blocking False --beam-disregard-knowledge-for-mrm-context-blocking False --beam-disregard-knowledge-for-crm-context-blocking False --beam-disregard-knowledge-for-grm-context-blocking False --beam-disregard-knowledge-for-vrm-context-blocking False --beam-disregard-knowledge-for-orm-context-blocking False --exclude-context-in-skm-context-blocking False --exclude-context-in-mkm-context-blocking False 
--exclude-context-in-ckm-context-blocking False --include-knowledge-in-skm-context-blocking True --include-knowledge-in-mkm-context-blocking True --include-knowledge-in-ckm-context-blocking False --inject-query-string None --loglevel debug --model projects.bb3.agents.r2c2_bb3_agent:BlenderBot3Agent --knowledge-conditioning combined --contextual-knowledge-decision compute

gen/seeker_dialogue

Generation parameters for SeeKeR, Dialogue. See Shuster et al. (2022).

--beam-disregard-knowledge-for-context-blocking False --datatype valid --drm-beam-block-full-context True --drm-beam-block-ngram 3 --drm-beam-context-block-ngram 3 --drm-beam-min-length 20 --drm-beam-size 10 --drm-inference beam --drm-message-mutators None --drm-model projects.seeker.agents.seeker:ComboFidSearchQueryAgent --exclude-context-in-krm-context-blocking False --include-knowledge-in-krm-context-blocking True --inject-query-string None --knowledge-response-control-token None --krm-beam-block-ngram 3 --krm-beam-context-block-ngram 3 --krm-beam-min-length 1 --krm-beam-size 3 --krm-doc-chunks-ranker woi_chunk_retrieved_docs --krm-inference beam --krm-message-mutators None --krm-model projects.seeker.agents.seeker:ComboFidSearchQueryAgent --krm-n-ranked-doc-chunks 1 --krm-rag-retriever-type search_engine --krm-search-query-generator-model-file '' --krm-search-server --loglevel debug --min-knowledge-length-when-search 10 --model projects.seeker.agents.seeker:SeekerAgent --model-file zoo:seeker/seeker_dialogue_3B/model --sdm-beam-block-ngram -1 --sdm-beam-min-length 1 --sdm-beam-size 1 --sdm-history-size 1 --sdm-inference greedy --sdm-model projects.seeker.agents.seeker:ComboFidSearchQueryAgent --search-decision always --search-decision-control-token __is-search-required__ --search-decision-do-search-reply __do-search__ --search-decision-dont-search-reply __do-not-search__ --search-query-control-token __generate-query__ --search-server None --sqm-beam-block-ngram -1 --sqm-beam-min-length 2 --sqm-beam-size 1 --sqm-inference beam --sqm-model projects.seeker.agents.seeker:ComboFidSearchQueryAgent

gen/seeker_lm

Generation parameters for SeeKeR, Language Model. See Shuster et al. (2022).

--beam-disregard-knowledge-for-context-blocking True --datatype valid --drm-beam-block-full-context True --drm-beam-block-ngram 3 --drm-beam-context-block-ngram 3 --drm-beam-min-length 20 --drm-beam-size 10 --drm-inference beam --drm-message-mutators None --drm-model projects.seeker.agents.gpt2_seeker:GPT2ComboSearchQueryAgent --exclude-context-in-krm-context-blocking False --include-knowledge-in-krm-context-blocking True --inject-query-string None --knowledge-response-control-token None --krm-beam-block-ngram 3 --krm-beam-context-block-ngram -1 --krm-beam-min-length 1 --krm-beam-size 3 --krm-doc-chunks-ranker woi_chunk_retrieved_docs --krm-inference beam --krm-message-mutators None --krm-model projects.seeker.agents.gpt2_seeker:GPT2ComboSearchQueryAgent --krm-n-ranked-doc-chunks 1 --krm-rag-retriever-type search_engine --krm-search-query-generator-model-file '' --krm-search-server --loglevel debug --min-knowledge-length-when-search 1 --model projects.seeker.agents.gpt2_seeker:GPT2SeekerAgent --model-file zoo:seeker/seeker_lm_xl/model --sdm-beam-block-ngram -1 --sdm-beam-min-length 1 --sdm-beam-size 1 --sdm-history-size 1 --sdm-inference greedy --search-decision always --search-decision-control-token __is-search-required__ --search-decision-do-search-reply __do-search__ --search-decision-dont-search-reply __do-not-search__ --search-query-control-token __generate-query__ --search-server None --sqm-beam-block-ngram -1 --sqm-beam-min-length 2 --sqm-beam-size 1 --sqm-inference beam --sqm-model projects.seeker.agents.gpt2_seeker:GPT2ComboSearchQueryAgent
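For simple presets whose flags each take a single whitespace-free value (such as gen/meena above), the relationship between an expansion and the JSON preset format is mechanical: strip the leading dashes and replace the remaining dashes with underscores. The sketch below illustrates that correspondence; it is not a ParlAI utility, and it deliberately does not handle the multi-word or empty values that appear in some of the larger expansions on this page:

```python
def flags_to_preset(argstring: str) -> dict:
    """Convert a flat '--flag value' string into a preset-style dict.

    Assumes every flag takes exactly one whitespace-free value.
    """
    toks = argstring.split()
    out = {}
    # Pair each flag (even positions) with the value that follows it.
    for flag, value in zip(toks[::2], toks[1::2]):
        out[flag.lstrip("-").replace("-", "_")] = value
    return out

print(flags_to_preset("--beam-size 20 --inference topk --topk 40"))
# {'beam_size': '20', 'inference': 'topk', 'topk': '40'}
```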