Chat Models

aana.core.models.chat

Role

Role = Literal['system', 'user', 'assistant']

The role of a participant in a conversation.

  • "system": Used for instructions or context provided to the model.
  • "user": Represents messages from the user.
  • "assistant": Represents LLM responses.

Prompt

Prompt = str

The prompt for the LLM.

Question

Question = str

The question.

ChatMessage

Bases: BaseModel

A chat message.

ATTRIBUTES
    content (str): the text of the message
    role (Role): the role of the message
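
Example (a minimal sketch; the message content and printed output are illustrative). Because role is typed as Role, Pydantic only accepts "system", "user", and "assistant":

from pydantic import ValidationError

from aana.core.models.chat import ChatMessage

message = ChatMessage(content="Hello! How can I help?", role="assistant")
print(message.role)     # assistant
print(message.content)  # Hello! How can I help?

# Role is a Literal, so an unknown role fails validation.
try:
    ChatMessage(content="oops", role="moderator")
except ValidationError as error:
    print(error)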

ChatDialog

Bases: BaseModel

A chat dialog.

ATTRIBUTES
    messages (list[ChatMessage]): the messages in the dialog
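
Example (a minimal sketch with illustrative content). A dialog can be built directly from ChatMessage objects:

from aana.core.models.chat import ChatDialog, ChatMessage

dialog = ChatDialog(
    messages=[
        ChatMessage(role="user", content="What is the capital of France?"),
        ChatMessage(role="assistant", content="Paris."),
    ]
)
print(len(dialog.messages))  # 2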

from_list

from_list(messages)

Create a ChatDialog from a list of dictionaries.

PARAMETERS
    messages (list[dict[str, str]]): the list of messages

RETURNS
    ChatDialog: the chat dialog

Source code in aana/core/models/chat.py
@classmethod
def from_list(cls, messages: list[dict[str, str]]) -> "ChatDialog":
    """Create a ChatDialog from a list of dictionaries.

    Args:
        messages (list[dict[str, str]]): the list of messages

    Returns:
        ChatDialog: the chat dialog
    """
    return ChatDialog(messages=[ChatMessage(**message) for message in messages])
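
Example usage (the messages are illustrative). Each dictionary carries the same keys as ChatMessage, i.e. content and role:

from aana.core.models.chat import ChatDialog

dialog = ChatDialog.from_list(
    [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Tell me a joke."},
    ]
)
print(dialog.messages[1].role)  # user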

from_prompt

from_prompt(prompt)

Create a ChatDialog from a prompt.

PARAMETERS
    prompt (str): the prompt

RETURNS
    ChatDialog: the chat dialog

Source code in aana/core/models/chat.py
@classmethod
def from_prompt(cls, prompt: str) -> "ChatDialog":
    """Create a ChatDialog from a prompt.

    Args:
        prompt (str): the prompt

    Returns:
        ChatDialog: the chat dialog
    """
    return ChatDialog(messages=[ChatMessage(content=prompt, role="user")])
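
Example usage (the prompt is illustrative). The prompt becomes a single user message:

from aana.core.models.chat import ChatDialog

dialog = ChatDialog.from_prompt("Summarize the plot of Hamlet in one sentence.")
print(dialog.messages[0].role)     # user
print(dialog.messages[0].content)  # Summarize the plot of Hamlet in one sentence.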

ChatCompletionRequest

Bases: BaseModel

A chat completion request for the OpenAI-compatible API.

ATTRIBUTES
    model (str): the model name (name of the LLM deployment)
    messages (list[ChatMessage]): a list of messages comprising the conversation so far
    temperature (float): controls the randomness of the sampling
    top_p (float): controls the cumulative probability of the top tokens to consider
    max_tokens (int): the maximum number of tokens to generate
    repetition_penalty (float): penalizes new tokens based on whether they appear in the prompt and the generated text so far
    stream (bool): if set, partial message deltas will be sent
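
Example (a minimal sketch; the deployment name and sampling values are placeholders, and every field is passed explicitly rather than relying on defaults):

from aana.core.models.chat import ChatCompletionRequest, ChatMessage

request = ChatCompletionRequest(
    model="llm_deployment",  # placeholder deployment name
    messages=[ChatMessage(role="user", content="Tell me a joke.")],
    temperature=0.7,
    top_p=0.9,
    max_tokens=256,
    repetition_penalty=1.1,
    stream=False,
)
# The JSON body as an OpenAI-compatible client would send it (assuming Pydantic v2).
print(request.model_dump_json(indent=2))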

ChatCompletionChoice

Bases: BaseModel

A chat completion choice for the OpenAI-compatible API.

ATTRIBUTES
    index (int): the index of the choice in the list of choices
    message (ChatMessage): a chat completion message generated by the model

ChatCompletion

Bases: BaseModel

A chat completion for the OpenAI-compatible API.

ATTRIBUTES
    id (str): a unique identifier for the chat completion
    model (str): the model used for the chat completion
    created (int): the Unix timestamp (in seconds) of when the chat completion was created
    choices (list[ChatCompletionChoice]): a list of chat completion choices
    object (Literal['chat.completion']): the object type, which is always chat.completion
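
Example (a minimal sketch with illustrative field values; in practice the object is returned by the endpoint rather than constructed by hand). The generated text lives in the first choice's message:

from aana.core.models.chat import (
    ChatCompletion,
    ChatCompletionChoice,
    ChatMessage,
)

completion = ChatCompletion(
    id="chatcmpl-123",
    model="llm_deployment",  # placeholder deployment name
    created=1700000000,
    choices=[
        ChatCompletionChoice(
            index=0,
            message=ChatMessage(role="assistant", content="Here is a joke ..."),
        )
    ],
    object="chat.completion",
)
print(completion.choices[0].message.content)  # Here is a joke ...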