llm#
- class besser.agent.nlp.llm.llm.LLM(nlp_engine, name, parameters, global_context=None)[source]#
Bases: ABC
The LLM abstract class.
An LLM (Large Language Model) receives a text input and generates an output. Depending on the LLM, several tasks can be performed, such as question answering, translation, text classification, etc.
This class serves as a template to be implemented for specific LLM providers.
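To illustrate the template pattern, here is a minimal, self-contained sketch of a provider-specific subclass. The `EchoLLM` class and the simplified base class below are hypothetical stand-ins for illustration only; a real implementation would subclass `besser.agent.nlp.llm.llm.LLM` and wrap a provider SDK.

```python
from abc import ABC, abstractmethod

class LLM(ABC):
    """Minimal stand-in for the abstract class documented above."""

    def __init__(self, name: str, parameters: dict):
        self.name = name
        self.parameters = parameters  # default LLM parameters

    @abstractmethod
    def initialize(self) -> None:
        """Set up the provider client; called during agent training."""

    @abstractmethod
    def predict(self, message: str, parameters: dict = None,
                system_message: str = None) -> str:
        """Generate an output for the given input text."""

class EchoLLM(LLM):
    """Hypothetical provider that simply echoes its input."""

    def initialize(self) -> None:
        pass  # a real provider would create its API client here

    def predict(self, message, parameters=None, system_message=None):
        # Fall back to the default LLM parameters when none are given
        params = parameters if parameters is not None else self.parameters
        prefix = f"[{system_message}] " if system_message else ""
        return prefix + message
```

Usage: `EchoLLM('echo', {}).predict('hello')` returns `'hello'`.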
- Parameters:
nlp_engine (NLPEngine) – the NLPEngine that handles the NLP processes of the agent
name (str) – the LLM name
parameters (dict) – the LLM parameters
global_context (str) – the global context to be provided to the LLM for each request
- _nlp_engine#
the NLPEngine that handles the NLP processes of the agent the LLM belongs to
- Type:
NLPEngine
- _user_context#
aggregation of user-specific contexts to be provided to the LLM for each request
- Type:
dict
- _user_contexts#
dictionary containing the different context elements making up the user-specific context to be provided to the LLM for each request
- Type:
dict
- chat(session, parameters=None, system_message=None)[source]#
Make a prediction, i.e., generate an output.
This function can provide the chat history to the LLM for the output generation, simulating a conversation or remembering previous messages.
- Parameters:
session (Session) – the ongoing session, whose chat history can be provided to the LLM
parameters (dict) – the LLM parameters to use in the prediction. If none is provided, the default LLM parameters will be used
system_message (str) – system message to give high priority context to the LLM
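A sketch of how chat history might be assembled before being sent to a provider. The `history` structure and the OpenAI-style message roles are assumptions for illustration, not part of the documented API:

```python
def build_chat_messages(history, new_message, system_message=None):
    """Combine prior (user, agent) turns and an optional system message
    into a role-tagged message list, newest message last."""
    messages = []
    if system_message:
        # System message carries high-priority context for the LLM
        messages.append({"role": "system", "content": system_message})
    for user_msg, agent_msg in history:
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": agent_msg})
    messages.append({"role": "user", "content": new_message})
    return messages
```

Passing the accumulated history on every call is what lets a stateless LLM "remember" previous messages in the conversation.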
- abstract initialize()[source]#
Initialize the LLM. This function is called during the agent training.
- intent_classification(intent_classifier, message, parameters=None)[source]#
Predict the intent of a given message.
Instead of returning only the intent with the highest likelihood, return all predictions. Predictions include not only the intent scores but other information extracted from the message.
- Parameters:
intent_classifier (LLMIntentClassifier) – the intent classifier that is running the intent classification process
message (str) – the message whose intent is to be predicted
parameters (dict) – the LLM parameters. If none is provided, the default LLM parameters will be used
- Returns:
the list of predictions made by the LLM.
- Return type:
list
- abstract predict(message, parameters=None, session=None, system_message=None)[source]#
Make a prediction, i.e., generate an output.
- Parameters:
message (Any) – the LLM input text
session (Session) – the ongoing session, can be None if no context needs to be applied
parameters (dict) – the LLM parameters to use in the prediction. If none is provided, the default LLM parameters will be used
system_message (str) – system message to give high priority context to the LLM
- Returns:
the LLM output
- Return type:
str
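The parameter fallback described above ("If none is provided, the default LLM parameters will be used") can be sketched as follows; `resolve_parameters` is a hypothetical helper, not part of the documented API:

```python
def resolve_parameters(default_parameters: dict, parameters: dict = None) -> dict:
    """Return the parameters to use for a prediction: the caller's dict
    if provided, otherwise the LLM's default parameters."""
    return parameters if parameters is not None else default_parameters
```

A concrete `predict` implementation would call this at the top of the method before building the provider request.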