afnio.cognitive.modules.chat_completion
afnio.cognitive.modules.chat_completion.ChatCompletion
Bases: Module
Generates a chat-based completion using a language model.

This module leverages the `ChatCompletion` operation from `afnio.autodiff.lm_ops`
to perform model inference. The `forward` method accepts a list of messages
representing the conversation history, with optional dynamic inputs for filling
placeholders within the messages. The `forward_model_client` is responsible for
interfacing with the language model (e.g., `gpt-4.1`), while `completion_args`
allows customization of generation parameters such as temperature, maximum
tokens, and seed.
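As an illustration of the placeholder-filling behavior described above, here is a plain-Python sketch using `str.format`; the helper name `fill_placeholders` is hypothetical and afnio's actual mechanism may differ:

```python
# Illustrative sketch of filling {placeholder} slots in message content.
# This is plain Python, not afnio's implementation.
def fill_placeholders(messages, inputs):
    filled = []
    for msg in messages:
        # Substitute every placeholder in each content string.
        content = [text.format(**inputs) for text in msg["content"]]
        filled.append({"role": msg["role"], "content": content})
    return filled

messages = [
    {"role": "system", "content": ["You are a helpful assistant."]},
    {"role": "user", "content": ["Translate 'Hello' to {language}."]},
]
filled = fill_placeholders(messages, {"language": "Italian"})
print(filled[1]["content"][0])  # Translate 'Hello' to Italian.
```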
Examples:
>>> from afnio import Variable, set_backward_model_client
>>> from afnio import cognitive as cog
>>> from afnio.models.openai import OpenAI
>>> fwd_model_client = OpenAI()
>>> fwd_model_args = {"model": "gpt-4o", "temperature": 0.7}
>>> set_backward_model_client("openai/gpt-4o")
>>> class Assistant(cog.Module):
...     def __init__(self):
...         super().__init__()
...         self.chat = cog.ChatCompletion()
...     def forward(self, fwd_model, messages, inputs, **completion_args):
...         return self.chat(fwd_model, messages, inputs, **completion_args)
>>> system = Variable(
...     "You are a helpful assistant.",
...     role="system instruction",
...     requires_grad=True,
... )
>>> user = Variable("Translate 'Hello' to {language}.", role="user query")
>>> language = Variable("Italian", role="language")
>>> messages = [
...     {"role": "system", "content": [system]},
...     {"role": "user", "content": [user]},
... ]
>>> agent = Assistant()
>>> response = agent(
...     fwd_model_client,
...     messages,
...     inputs={"language": language},
...     **fwd_model_args,
... )
>>> response.data
'Ciao'
>>> feedback = Variable("Use only capital letters.", role="feedback")
>>> response.backward(feedback)
>>> system.grad[0].data
'The system instruction should enforce the use of capital letters only.'
Raises:

| Type | Description |
|---|---|
| `TypeError` | If the types of the provided arguments are invalid. |
See Also:
`afnio.autodiff.lm_ops.ChatCompletion` for the underlying operation.
Source code in `afnio/cognitive/modules/chat_completion.py`, lines 11–137.
`forward(forward_model_client, messages, inputs=None, **completion_args)`

Forward pass for the chat completion function.
Warning
Users should not call this method directly. Instead, they should call the
module instance itself, which will internally invoke this forward method.
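The reason for this convention, common to Module-based frameworks, is that calling the instance lets the base class run bookkeeping around `forward`. A minimal, framework-agnostic sketch (the class names here are illustrative, not afnio's real internals):

```python
# Minimal sketch of the Module __call__ -> forward convention.
# Illustrative only; afnio's base Module may do more work here.
class Module:
    def __call__(self, *args, **kwargs):
        # A real framework would run pre/post hooks here
        # (e.g., tracing or gradient bookkeeping) around forward().
        return self.forward(*args, **kwargs)

class Echo(Module):
    def forward(self, text):
        return text.upper()

print(Echo()("ciao"))  # CIAO
```

Calling `Echo()("ciao")` rather than `Echo().forward("ciao")` keeps the hook machinery in the loop, which is why the warning above applies.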
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `forward_model_client` | `ChatCompletionModel \| None` | The LM model client used for generating chat completions. | *required* |
| `messages` | `MultiTurnMessages` | A list of messages that compose the prompt/context for the LM. Each message is a dictionary with a `role` and a `content` list. | *required* |
| `inputs` | `dict[str, str \| Variable] \| None` | A dictionary mapping placeholder names to their corresponding values, which can be strings or `Variable` objects. | `None` |
| `**completion_args` | | Additional keyword arguments to pass to the LM model client's completion call (e.g., temperature, maximum tokens, seed). | `{}` |
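To illustrate how `**completion_args` flows through to the model client, here is a hedged sketch with a stand-in client; `FakeClient` and its `complete` method are hypothetical names, not afnio's API:

```python
# Sketch of **completion_args pass-through to a model client.
# FakeClient is a stand-in for illustration, not afnio's real client.
class FakeClient:
    def complete(self, messages, **completion_args):
        # A real client would call the LM here; we just echo the args
        # to show that keyword arguments arrive unchanged.
        return {"messages": messages, "args": completion_args}

client = FakeClient()
result = client.complete(
    [{"role": "user", "content": "Hi"}],
    temperature=0.7,
    max_tokens=64,
    seed=42,
)
print(result["args"])  # {'temperature': 0.7, 'max_tokens': 64, 'seed': 42}
```

Because the module forwards `completion_args` verbatim, any parameter the underlying client accepts can be supplied at call time without changing the module itself.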
Returns:

| Name | Type | Description |
|---|---|---|
| `response` | `Variable` | A `Variable` containing the generated chat completion. |
Raises:

| Type | Description |
|---|---|
| `TypeError` | If the types of the provided arguments are invalid. |
Source code in `afnio/cognitive/modules/chat_completion.py`, lines 81–137.