Chat

Chat is the primary unit of work in chatsnack.

chatsnack.Chat(*args, **kwargs)

Bases: ChatQueryMixin, ChatSerializationMixin, ChatUtensilMixin

A chat prompt that can be expanded into a chat ⭐

Initialize a chat from a terse authored form.

Common forms include Chat("system message"), Chat("Name", "system message"), Chat(name="SavedPrompt"), and Chat(..., utensils=[...]).
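The constructor forms above can be illustrated with a small stand-in class (hypothetical, not the library's internals; the real Chat layers in mixins and datafile persistence) that shows how the positional arguments are interpreted:

```python
# Illustrative stand-in for the documented constructor forms.
# MiniChat and its fields are hypothetical; only the argument
# shapes mirror the forms listed above.
class MiniChat:
    def __init__(self, *args, name=None, utensils=None):
        self.name = name
        self.utensils = utensils or []
        self.messages = []
        if len(args) == 1:       # Chat("system message")
            self.messages.append({"system": args[0]})
        elif len(args) == 2:     # Chat("Name", "system message")
            self.name, system = args
            self.messages.append({"system": system})

chat = MiniChat("You are a helpful assistant.")
named = MiniChat("Summarizer", "Summarize the user's text.")
```

The `Chat(name="SavedPrompt")` form instead loads a previously saved prompt by name.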

json: str property

Return the expanded chat messages as JSON for API submission.

json_unexpanded: str property

Return the chat messages as JSON before include expansion.

yaml: str property

Returns the chat prompt as a YAML string ⭐

last: str property

Returns the value of the last message in the chat prompt, regardless of role

response: str property

Returns the value of the last assistant message in the chat prompt ⭐
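The difference between `last` and `response` can be sketched over a plain message list (a stand-in for the chat's expanded messages, not the library's implementation):

```python
# Sketch of the documented distinction: `last` is the value of the
# final message of any role; `response` is the final assistant message.
messages = [
    {"role": "system", "content": "You are terse."},
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": "Hello."},
    {"role": "user", "content": "Bye"},
]

def last(msgs):
    return msgs[-1]["content"]

def response(msgs):
    return next(m["content"] for m in reversed(msgs)
                if m["role"] == "assistant")

print(last(messages))      # Bye
print(response(messages))  # Hello.
```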

ChatStreamListener(ai, prompt, **kwargs)

Stream listener for handling streamed responses.

Initialize the stream listener.

start_a() async

Start the stream in async mode.

start()

Start the stream in sync mode.

set_utensils(utensils: Any)

Set the utensils available for this chat.

Parameters:

utensils (Any, required): Can be a list of functions, UtensilFunction objects, UtensilGroup objects, or a dictionary mapping names to functions.
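Passing plain functions implies a conversion to API-style tool definitions. A hedged sketch of that kind of conversion, using only `inspect` (the `to_tool` helper is hypothetical; the real UtensilFunction machinery is richer, e.g. typed parameters from annotations):

```python
import inspect

# Hypothetical sketch: derive an OpenAI-style tool definition from a
# plain function's name, docstring, and parameter names. All params
# are assumed to be strings for simplicity.
def to_tool(fn):
    params = {name: {"type": "string"}
              for name in inspect.signature(fn).parameters}
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": (fn.__doc__ or "").strip(),
            "parameters": {"type": "object", "properties": params,
                           "required": list(params)},
        },
    }

def get_weather(city):
    """Look up current weather for a city."""
    return f"Sunny in {city}"

tool = to_tool(get_weather)
```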

execute_tool_call(tool_call)

Process a tool call and return the result.

set_tools(tools_list)

Set the tools list from API-format dictionaries.

get_tools() -> List[Dict]

Get the tools with complex structures deserialized.

handle_tool_call(tool_call: Dict[str, Any]) -> Dict[str, Any]

Handle a tool call response from the LLM.

Parameters:

tool_call (Dict[str, Any], required): The tool call information from the API.

Returns:

Dict[str, Any]: Result of the tool execution.
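The dispatch involved can be sketched against the common API tool-call shape (assumed here: an `id` plus a `function` dict with `name` and JSON-encoded `arguments`; the registry and return keys are hypothetical, not the library's):

```python
import json

# Minimal sketch of resolving a tool call against registered functions
# and packaging the result. REGISTRY stands in for the chat's utensils.
REGISTRY = {"add": lambda a, b: a + b}

def handle_tool_call(tool_call):
    fn = REGISTRY[tool_call["function"]["name"]]
    args = json.loads(tool_call["function"]["arguments"])  # JSON string
    return {"tool_call_id": tool_call["id"], "result": fn(**args)}

call = {"id": "call_1",
        "function": {"name": "add", "arguments": '{"a": 2, "b": 3}'}}
out = handle_tool_call(call)
```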

save(path: str = None)

Persist the current datafile-backed object to disk.

load(path: str = None)

Load the object from disk, optionally from an explicit path.

generate_markdown(wrap=80) -> str

Returns the chat prompt as a Markdown string ⭐

set_response_filter(prefix: Optional[str] = None, suffix: Optional[str] = None, pattern: Optional[str] = None)

Filters the response by a given prefix/suffix or regex pattern. If suffix is None, it is set to the same as prefix.

filter_by_pattern(text: str) -> Optional[str]

Applies self.pattern if set, returning the first capture group match.
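The first-capture-group behavior can be sketched with a standalone helper (hypothetical; the real method applies the pattern stored by set_response_filter()):

```python
import re

# Sketch of regex filtering: return the first capture group of the
# pattern, or None when there is no match.
def filter_by_pattern(text, pattern):
    m = re.search(pattern, text, re.DOTALL)
    return m.group(1) if m else None

raw = "Thoughts...\n<answer>42</answer>"
print(filter_by_pattern(raw, r"<answer>(.*?)</answer>"))  # 42
```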

system(content: str, chat=False) -> object

Adds or sets the system message in the chat prompt ⭐ Returns this object for chaining when chat is False; when chat is True, submits the chat and returns a new Chat object that includes the message and response.

user(content: str, chat=False) -> object

Adds a user message to the chat ⭐ Returns this object for chaining when chat is False; when chat is True, submits the chat and returns a new Chat object that includes the message and response.

assistant(content: Union[str, List, Dict], chat=False) -> object

Adds an assistant message to the chat ⭐ Returns this object for chaining when chat is False; when chat is True, submits the chat and returns a new Chat object that includes the message and response.

tool(content: Union[str, Dict], chat=False) -> object

Adds a tool-response message to the chat ⭐ Returns this object for chaining when chat is False; when chat is True, submits the chat and returns a new Chat object that includes the message and response.

include(chatprompt_name: str = None, chat=False) -> object

Adds a reference to another ChatPrompt; its messages are inserted at this spot right before formatting ⭐ Returns this object for chaining when chat is False; when chat is True, submits the chat and returns a new Chat object that includes the message and response.

developer(content: str, chat=False) -> object

Alias for system() that accepts a developer role name.

add_message(role: str, content: Union[str, List, Dict], chat: bool = False) -> object

Add a message to the chat as role ('user', 'assistant', 'system', 'developer', 'tool', or 'include') with the given content. Returns this object for chaining when chat is False; when chat is True, submits the chat and returns a new Chat object that includes the message and response.

add_messages_json(json_messages: str, escape: bool = True)

Add messages from a JSON string while properly handling tool calls and responses.

add_or_update_last_assistant_message(content: str)

Adds a final assistant message (or appends to the end of the last assistant message).

get_messages(includes_expanded=True) -> List[Dict[str, str]]

Returns a list of messages with any included named chat files expanded.
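Include expansion can be sketched as a substitution pass over the message list (the lookup table below is a hypothetical stand-in for prompts saved on disk, not the library's storage):

```python
# Sketch of include expansion: an "include" entry names another saved
# prompt, whose messages replace it when the list is expanded.
SAVED = {"Greeting": [{"role": "system", "content": "Be friendly."}]}

def expand(messages):
    out = []
    for m in messages:
        if "include" in m:
            out.extend(SAVED[m["include"]])  # splice the named prompt in
        else:
            out.append(m)
    return out

msgs = [{"include": "Greeting"}, {"role": "user", "content": "Hi"}]
expanded = expand(msgs)
```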

ask(usermsg=None, files=None, images=None, **additional_vars) -> str

Executes the internal chat query as-is and returns only the string response. If usermsg is passed in, it will be added as a user message to the chat before executing the query. ⭐

ask_a(usermsg=None, files=None, images=None, **additional_vars) -> str async

Async form of ask().

listen(usermsg=None, events=False, event_schema='legacy', files=None, images=None, **additional_vars) -> ChatStreamListener

Executes the internal chat query as-is and returns a listener object that can be iterated on for the text. If usermsg is passed in, it will be added as a user message to the chat before executing the query. ⭐
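The iteration pattern the listener supports can be sketched with a stand-in class (hypothetical; the real ChatStreamListener wraps a live API stream and exposes more state):

```python
# Sketch of the listener pattern: an iterable that yields text chunks
# as they arrive and accumulates the full response.
class FakeListener:
    def __init__(self, chunks):
        self._chunks = chunks
        self.response = ""

    def __iter__(self):
        for chunk in self._chunks:
            self.response += chunk
            yield chunk

listener = FakeListener(["Hel", "lo", "!"])
text = "".join(listener)  # consume the stream chunk by chunk
```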

listen_a(usermsg=None, async_listen=True, events=False, event_schema='legacy', files=None, images=None, **additional_vars) -> ChatStreamListener async

Async form of listen().

chat(usermsg=None, files=None, images=None, **additional_vars) -> object

Executes the query as-is and returns a new Chat for continuation. If usermsg is passed in, it will be added as a user message to the chat before executing the query. ⭐

chat_a(usermsg=None, files=None, images=None, **additional_vars) -> object async

Async form of chat().

copy(name: str = None, system=None, expand_includes: bool = False, expand_fillings: bool = False, **additional_vars) -> object

Returns a new ChatPrompt object that is a copy of this one, optionally with a new name ⭐

close_session()

Close the active runtime session if the selected runtime supports it.

close_all_sessions() classmethod

Close every tracked shared Responses WebSocket session.

reset() -> object

Restore the chat to the state captured immediately after initialization.