General LLM#

class recwizard.modules.llm.configuration_llm.LLMConfig(model_name='meta-llama/Llama-2-7b-chat-hf', max_gen_len: int = 0, answer_type='movie', answer_mask='<movie>', prompt: dict | None = None, **kwargs)[source]#

The configuration of the generator based on a general large language model (LLM).

answer_mask#

The special string used to represent the answer in the generated response (e.g. '<movie>').

Type:

str

answer_type#

The type of the answer (e.g. 'movie').

Type:

str

prompt#

The prompt passed to the underlying LLM.

Type:

dict

model_name#

The name or path of the underlying LLM (e.g. 'meta-llama/Llama-2-7b-chat-hf').

Type:

str

__init__(model_name='meta-llama/Llama-2-7b-chat-hf', max_gen_len: int = 0, answer_type='movie', answer_mask='<movie>', prompt: dict | None = None, **kwargs)[source]#

Initializes the instance of this configuration.

Parameters:

model_name (str, optional) – The name or path of the underlying LLM. Defaults to 'meta-llama/Llama-2-7b-chat-hf'.

max_gen_len (int, optional) – The maximum length to set in the generator.

answer_type (str, optional) – The type of the answer. Defaults to 'movie'.

answer_mask (str, optional) – The special string used to represent the answer in the response. Defaults to '<movie>'.

prompt (dict, optional) – The prompt passed to the underlying LLM.
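
A minimal construction sketch (the import path follows the class path shown above; the role/content prompt schema is an assumption made for illustration, not part of this reference):

from recwizard.modules.llm.configuration_llm import LLMConfig

# Configure a Llama-2 based generator module.
# NOTE: the role/content prompt schema below is an assumption;
# check the module's prompt handling for the exact expected format.
config = LLMConfig(
    model_name="meta-llama/Llama-2-7b-chat-hf",
    max_gen_len=256,
    answer_type="movie",
    answer_mask="<movie>",
    prompt={
        "role": "system",
        "content": "You are a helpful movie recommendation assistant.",
    },
)

print(config.model_name)   # meta-llama/Llama-2-7b-chat-hf
print(config.answer_mask)  # <movie>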

class recwizard.modules.llm.configuration_llm_rec.LLMRecConfig(backup_prompt=None, *args, **kwargs)[source]#

The configuration of the recommender based on a general large language model (LLM).

answer_mask#

The special string used to represent the answer in the response template (e.g. '<movie>').

Type:

str

answer_type#

The type of the answer (e.g. 'movie').

Type:

str

prompt#

The prompt passed to the underlying LLM.

Type:

dict

model_name#

The name or path of the underlying LLM (e.g. 'meta-llama/Llama-2-7b-chat-hf').

Type:

str

__init__(backup_prompt=None, *args, **kwargs)[source]#

Initializes the instance of this configuration.

Parameters:

backup_prompt (dict, optional) – A backup prompt used as a fallback to the main prompt.
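
A minimal sketch along the same lines (the import path follows the class path shown above; the backup_prompt schema and the forwarding of extra keyword arguments to the base LLM configuration are assumptions):

from recwizard.modules.llm.configuration_llm_rec import LLMRecConfig

# Configure the LLM-based recommender.
# NOTE: passing generator options (model_name, answer_type, ...) here assumes
# that extra keyword arguments are forwarded to the base LLM configuration.
rec_config = LLMRecConfig(
    backup_prompt={
        "role": "system",  # assumed prompt schema
        "content": "Recommend a movie the user might enjoy.",
    },
    model_name="meta-llama/Llama-2-7b-chat-hf",
    answer_type="movie",
    answer_mask="<movie>",
)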