afnio.autodiff.basic_ops

afnio.autodiff.basic_ops.Add

Bases: Function

Implements an addition operation for Variable instances within the afnio framework, supporting automatic differentiation.

This class inherits from Function and requires both the forward and backward methods to be defined.

The Add function supports both scalar and list data fields:

  • Scalars: Adds numerical values (int, float) or concatenates strings.
  • Lists: Performs element-wise addition of corresponding elements from the lists. Lists must be of the same length.

It automatically handles type-based operations:

  • For numerical data (int, float), it performs arithmetic addition.
  • For strings, it concatenates the values.
  • Mixed types (e.g., string and number) are converted appropriately before performing the addition.

This operation also tracks Variable dependencies, enabling automatic gradient computation through backpropagation.
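The type rules above can be illustrated with a plain-Python sketch. This is not the afnio implementation (which runs server-side); `add_data` is a hypothetical helper that mirrors only the documented behavior for the `data` field:

```python
def add_data(a, b):
    """Mirror the documented Add semantics for scalar and list data fields."""
    if isinstance(a, list) and isinstance(b, list):
        # Element-wise addition; lists must be of the same length
        if len(a) != len(b):
            raise ValueError("list data fields must have the same length")
        return [add_data(x, y) for x, y in zip(a, b)]
    if isinstance(a, list) != isinstance(b, list):
        raise ValueError("cannot add scalar data to list data")
    if isinstance(a, str) or isinstance(b, str):
        # Mixed types (e.g., string and number) are converted before concatenation
        return str(a) + str(b)
    return a + b  # arithmetic addition for int/float

print(add_data("abc", "def"))          # 'abcdef'
print(add_data([1, 2, 3], [4, 5, 6]))  # [5, 7, 9]
print(add_data("v", 2))                # 'v2'
```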

Examples:

Example with scalar inputs:

>>> x = Variable(data="abc", role="first input", requires_grad=True)
>>> y = Variable(data="def", role="second input", requires_grad=False)
>>> result = Add.apply(x, y)
>>> result.data
'abcdef'
>>> result.role
'first input and second input'
>>> result.requires_grad
True
>>> g = Variable(data="MY_FEEDBACK", role="add gradient")
>>> result.backward(g)
>>> x.grad[0].data
'Here is the combined feedback we got for this specific first input and other variables: MY_FEEDBACK'
>>> x.grad[0].role
'feedback to first input'

Example with batched inputs:

>>> x = Variable(data=[1, 2, 3], role="first input", requires_grad=True)
>>> y = Variable(data=[4, 5, 6], role="second input", requires_grad=False)
>>> result = Add.apply(x, y)
>>> result.data
[5, 7, 9]
>>> result.role
'first input and second input'
>>> result.requires_grad
True
Source code in afnio/autodiff/basic_ops.py
class Add(Function):
    """
    Implements an addition operation for [`Variable`][afnio.Variable] instances within
    the `afnio` framework, supporting automatic differentiation.

    This class inherits from [`Function`][afnio.autodiff.function.Function] and
    requires both the [`forward`][afnio.autodiff.function.Function.forward] and
    [`backward`][afnio.autodiff.function.Function.backward] methods to be defined.

    The `Add` function supports both scalar and list [`data`][afnio.Variable.data]
    fields:

    - **Scalars**: Adds numerical values (`int`, `float`) or concatenates strings.
    - **Lists**: Performs element-wise addition of corresponding elements from the lists.
      Lists must be of the same length.

    It automatically handles type-based operations:

    - For numerical data (`int`, `float`), it performs arithmetic addition.
    - For strings, it concatenates the values.
    - Mixed types (e.g., string and number) are converted appropriately before performing
      the addition.

    This operation also tracks [`Variable`][afnio.Variable] dependencies,
    enabling automatic gradient computation through backpropagation.

    Examples:
        Example with scalar inputs:
        >>> x = Variable(data="abc", role="first input", requires_grad=True)
        >>> y = Variable(data="def", role="second input", requires_grad=False)
        >>> result = Add.apply(x, y)
        >>> result.data
        'abcdef'
        >>> result.role
        'first input and second input'
        >>> result.requires_grad
        True
        >>> g = Variable(data="MY_FEEDBACK", role="add gradient")
        >>> result.backward(g)
        >>> x.grad[0].data
        'Here is the combined feedback we got for this specific first input and other variables: MY_FEEDBACK'
        >>> x.grad[0].role
        'feedback to first input'

        Example with batched inputs:
        >>> x = Variable(data=[1, 2, 3], role="first input", requires_grad=True)
        >>> y = Variable(data=[4, 5, 6], role="second input", requires_grad=False)
        >>> result = Add.apply(x, y)
        >>> result.data
        [5, 7, 9]
        >>> result.role
        'first input and second input'
        >>> result.requires_grad
        True
    """  # noqa: E501

    @staticmethod
    def forward(ctx, x: Variable, y: Variable) -> Variable:
        """
        Forward pass for element-wise addition.

        Warning:
            This method is invoked by
            [`apply()`][afnio.autodiff.function.Function.apply]
            and should not be called directly.

        Args:
            ctx: Context object used to save information for [`backward`][..backward]
                computation.
            x: The first input `Variable`.
            y: The second input `Variable`.

        Returns:
            A new `Variable` instance representing the result of the addition, \
            with appropriately aggregated [`data`][afnio.Variable.data], \
            [`role`][afnio.Variable.role], and \
            [`requires_grad`][afnio.Variable.requires_grad] attributes.

        Raises:
            TypeError: If either input is not an instance
                of [`Variable`][afnio.Variable].
            TypeError: If addition between the input types is not allowed.
            ValueError: If a scalar [`data`][afnio.Variable.data] is added
                to a list [`data`][afnio.Variable.data].
            ValueError: If list [`data`][afnio.Variable.data] fields
                have mismatched lengths.
        """
        raise NotImplementedError(
            "Add.forward is implemented on the server. "
            "Client-side execution is not supported."
        )

    @staticmethod
    def backward(
        ctx, grad_output: Variable
    ) -> Tuple[Optional[Variable], Optional[Variable]]:
        """
        Backward pass for element-wise addition.

        Warning:
            This method is invoked by the autodiff engine
            and should not be called directly.

        Args:
            ctx: Context object containing saved information from the
                [`forward`][..forward] pass.
            grad_output: The gradient of the output `Variable` with respect to
                the output of the `forward()` method.

        Returns:
            grad_x: The gradient of the output w.r.t. the first input `Variable`
                if it requires gradients, otherwise `None`.
            grad_y: The gradient of the output w.r.t. the second input `Variable`
                if it requires gradients, otherwise `None`.

        Raises:
            TypeError: If `grad_output` is not an instance
                of [`Variable`][afnio.Variable].
        """
        raise NotImplementedError(
            "Add.backward is implemented on the server. "
            "Client-side execution is not supported."
        )

forward(ctx, x, y) staticmethod

Forward pass for element-wise addition.

Warning

This method is invoked by apply() and should not be called directly.

Parameters:

  • ctx: Context object used to save information for backward computation. (required)
  • x (Variable): The first input Variable. (required)
  • y (Variable): The second input Variable. (required)

Returns:

  • Variable: A new Variable instance representing the result of the addition, with appropriately aggregated data, role, and requires_grad attributes.

Raises:

  • TypeError: If either input is not an instance of Variable.
  • TypeError: If addition between the input types is not allowed.
  • ValueError: If a scalar data is added to a list data.
  • ValueError: If list data fields have mismatched lengths.

Source code in afnio/autodiff/basic_ops.py
@staticmethod
def forward(ctx, x: Variable, y: Variable) -> Variable:
    """
    Forward pass for element-wise addition.

    Warning:
        This method is invoked by
        [`apply()`][afnio.autodiff.function.Function.apply]
        and should not be called directly.

    Args:
        ctx: Context object used to save information for [`backward`][..backward]
            computation.
        x: The first input `Variable`.
        y: The second input `Variable`.

    Returns:
        A new `Variable` instance representing the result of the addition, \
        with appropriately aggregated [`data`][afnio.Variable.data], \
        [`role`][afnio.Variable.role], and \
        [`requires_grad`][afnio.Variable.requires_grad] attributes.

    Raises:
        TypeError: If either input is not an instance
            of [`Variable`][afnio.Variable].
        TypeError: If addition between the input types is not allowed.
        ValueError: If a scalar [`data`][afnio.Variable.data] is added
            to a list [`data`][afnio.Variable.data].
        ValueError: If list [`data`][afnio.Variable.data] fields
            have mismatched lengths.
    """
    raise NotImplementedError(
        "Add.forward is implemented on the server. "
        "Client-side execution is not supported."
    )

backward(ctx, grad_output) staticmethod

Backward pass for element-wise addition.

Warning

This method is invoked by the autodiff engine and should not be called directly.

Parameters:

  • ctx: Context object containing saved information from the forward pass. (required)
  • grad_output (Variable): The gradient of the output Variable with respect to the output of the forward() method. (required)

Returns:

  • grad_x (Variable | None): The gradient of the output w.r.t. the first input Variable if it requires gradients, otherwise None.
  • grad_y (Variable | None): The gradient of the output w.r.t. the second input Variable if it requires gradients, otherwise None.

Raises:

  • TypeError: If grad_output is not an instance of Variable.

Source code in afnio/autodiff/basic_ops.py
@staticmethod
def backward(
    ctx, grad_output: Variable
) -> Tuple[Optional[Variable], Optional[Variable]]:
    """
    Backward pass for element-wise addition.

    Warning:
        This method is invoked by the autodiff engine
        and should not be called directly.

    Args:
        ctx: Context object containing saved information from the
            [`forward`][..forward] pass.
        grad_output: The gradient of the output `Variable` with respect to
            the output of the `forward()` method.

    Returns:
        grad_x: The gradient of the output w.r.t. the first input `Variable`
            if it requires gradients, otherwise `None`.
        grad_y: The gradient of the output w.r.t. the second input `Variable`
            if it requires gradients, otherwise `None`.

    Raises:
        TypeError: If `grad_output` is not an instance
            of [`Variable`][afnio.Variable].
    """
    raise NotImplementedError(
        "Add.backward is implemented on the server. "
        "Client-side execution is not supported."
    )

afnio.autodiff.basic_ops.Sum

Bases: Function

Implements a summation operation for a list of Variable instances within the afnio framework, supporting automatic differentiation.

This class inherits from Function and requires both the forward and backward methods to be defined.

The Sum function aggregates the data, role, and requires_grad attributes of all input Variable instances into a single Variable. It supports both scalar and list data fields:

  • Scalars: Computes the arithmetic sum for numerical data (int, float) or concatenates all string values, wrapping each in <ITEM></ITEM> tags.
  • Lists: Aggregates the corresponding elements of the lists. For numerical data, it sums the corresponding elements. For string data, it concatenates them, wrapping each element in <ITEM></ITEM> tags.

During backpropagation, the function distributes the gradient to all input Variable instances that require gradients.
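The aggregation rules above can be sketched in plain Python. This is not the afnio implementation (which runs server-side); `sum_data` is a hypothetical helper mirroring only the documented `data` semantics:

```python
def sum_data(items):
    """Mirror the documented Sum semantics for scalar and list data fields."""
    if all(isinstance(i, list) for i in items):
        # List data: aggregate corresponding elements across the lists
        return [sum_data(group) for group in zip(*items)]
    if all(isinstance(i, (int, float)) for i in items):
        return sum(items)  # arithmetic sum for numerical data
    # String data: wrap each value in <ITEM></ITEM> tags and concatenate
    return "".join(f"<ITEM>{i}</ITEM>" for i in items)

print(sum_data(["abc", "def"]))          # '<ITEM>abc</ITEM><ITEM>def</ITEM>'
print(sum_data([[1, 2, 3.5], [4, 5, 6]]))  # [5, 7, 9.5]
```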

Examples:

Example with scalar inputs:

>>> x = Variable(data="abc", role="first input", requires_grad=True)
>>> y = Variable(data="def", role="second input", requires_grad=False)
>>> result = Sum.apply([x, y])
>>> result.data
'<ITEM>abc</ITEM><ITEM>def</ITEM>'
>>> result.role
'first input and second input'
>>> result.requires_grad
True
>>> g = Variable(data="MY_FEEDBACK", role="add gradient")
>>> result.backward(g)
>>> x.grad[0].data
'Here is the combined feedback we got for this specific first input and other variables: MY_FEEDBACK'
>>> x.grad[0].role
'feedback to first input'

Example with batched inputs:

>>> x = Variable(data=[1, 2, 3.5], role="first input", requires_grad=True)
>>> y = Variable(data=[4, 5, 6], role="second input", requires_grad=False)
>>> result = Sum.apply([x, y])
>>> result.data
[5, 7, 9.5]
>>> result.role
'first input and second input'
>>> result.requires_grad
True
Source code in afnio/autodiff/basic_ops.py
class Sum(Function):
    """
    Implements a summation operation for a list of [`Variable`][afnio.Variable]
    instances within the `afnio` framework, supporting automatic differentiation.

    This class inherits from [`Function`][afnio.autodiff.function.Function] and
    requires both the [`forward`][afnio.autodiff.function.Function.forward] and
    [`backward`][afnio.autodiff.function.Function.backward] methods to be defined.

    The `Sum` function aggregates the [`data`][afnio.Variable.data],
    [`role`][afnio.Variable.role], and [`requires_grad`][afnio.Variable.requires_grad]
    attributes of all input [`Variable`][afnio.Variable] instances into a single
    [`Variable`][afnio.Variable]. It supports both scalar and list
    [`data`][afnio.Variable.data] fields:

    - **Scalars**: Computes the arithmetic sum for numerical data (`int`, `float`)
      or concatenates all string values, wrapping each in `<ITEM></ITEM>` tags.
    - **Lists**: Aggregates the corresponding elements of the lists. For numerical
      data, it sums the corresponding elements. For string data, it concatenates them,
      wrapping each element in `<ITEM></ITEM>` tags.

    During backpropagation, the function distributes the gradient to all input
    [`Variable`][afnio.Variable] instances that require gradients.

    Examples:
        Example with scalar inputs:
        >>> x = Variable(data="abc", role="first input", requires_grad=True)
        >>> y = Variable(data="def", role="second input", requires_grad=False)
        >>> result = Sum.apply([x, y])
        >>> result.data
        '<ITEM>abc</ITEM><ITEM>def</ITEM>'
        >>> result.role
        'first input and second input'
        >>> result.requires_grad
        True
        >>> g = Variable(data="MY_FEEDBACK", role="add gradient")
        >>> result.backward(g)
        >>> x.grad[0].data
        'Here is the combined feedback we got for this specific first input and other variables: MY_FEEDBACK'
        >>> x.grad[0].role
        'feedback to first input'

        Example with batched inputs:
        >>> x = Variable(data=[1, 2, 3.5], role="first input", requires_grad=True)
        >>> y = Variable(data=[4, 5, 6], role="second input", requires_grad=False)
        >>> result = Sum.apply([x, y])
        >>> result.data
        [5, 7, 9.5]
        >>> result.role
        'first input and second input'
        >>> result.requires_grad
        True
    """  # noqa: E501

    @staticmethod
    def forward(ctx, x: List[Variable]) -> Variable:
        """
        Forward pass for summation.

        Warning:
            This method is invoked by
            [`apply()`][afnio.autodiff.function.Function.apply]
            and should not be called directly.

        Args:
            ctx: Context object used to save information for [`backward`][..backward]
                computation.
            x: A list of `Variable` instances to be summed.

        Returns:
            A new `Variable` instance representing the result of the summation, \
            with appropriately aggregated [`data`][afnio.Variable.data], \
            [`role`][afnio.Variable.role], and \
            [`requires_grad`][afnio.Variable.requires_grad] attributes.

        Raises:
            TypeError: If any element in `x` is not an instance
                of [`Variable`][afnio.Variable] or a sequence
                of [`Variable`][afnio.Variable] instances, or if addition between
                the [`data`][afnio.Variable.data] types is not allowed.
        """
        raise NotImplementedError(
            "Sum.forward is implemented on the server. "
            "Client-side execution is not supported."
        )

    @staticmethod
    def backward(ctx, grad_output: Variable) -> Tuple[Optional[Variable], ...]:
        """
        Backward pass for summation.

        Warning:
            This method is invoked by the autodiff engine
            and should not be called directly.

        Args:
            ctx: Context object containing saved information from the
                [`forward`][..forward] pass.
            grad_output: The gradient of the output `Variable` with respect to
                the output of the `forward()` method.

        Returns:
            grad_inputs: A tuple of gradients corresponding to each input `Variable`
                in the same order as the inputs to `forward()`. Each gradient is a
                `Variable` if the corresponding input requires gradients,
                otherwise `None`.

        Raises:
            TypeError: If `grad_output` is not an instance
                of [`Variable`][afnio.Variable].
        """
        raise NotImplementedError(
            "Sum.backward is implemented on the server. "
            "Client-side execution is not supported."
        )

forward(ctx, x) staticmethod

Forward pass for summation.

Warning

This method is invoked by apply() and should not be called directly.

Parameters:

  • ctx: Context object used to save information for backward computation. (required)
  • x (list[Variable]): A list of Variable instances to be summed. (required)

Returns:

  • Variable: A new Variable instance representing the result of the summation, with appropriately aggregated data, role, and requires_grad attributes.

Raises:

  • TypeError: If any element in x is not an instance of Variable or a sequence of Variable instances, or if addition between the data types is not allowed.

Source code in afnio/autodiff/basic_ops.py
@staticmethod
def forward(ctx, x: List[Variable]) -> Variable:
    """
    Forward pass for summation.

    Warning:
        This method is invoked by
        [`apply()`][afnio.autodiff.function.Function.apply]
        and should not be called directly.

    Args:
        ctx: Context object used to save information for [`backward`][..backward]
            computation.
        x: A list of `Variable` instances to be summed.

    Returns:
        A new `Variable` instance representing the result of the summation, \
        with appropriately aggregated [`data`][afnio.Variable.data], \
        [`role`][afnio.Variable.role], and \
        [`requires_grad`][afnio.Variable.requires_grad] attributes.

    Raises:
        TypeError: If any element in `x` is not an instance
            of [`Variable`][afnio.Variable] or a sequence
            of [`Variable`][afnio.Variable] instances, or if addition between
            the [`data`][afnio.Variable.data] types is not allowed.
    """
    raise NotImplementedError(
        "Sum.forward is implemented on the server. "
        "Client-side execution is not supported."
    )

backward(ctx, grad_output) staticmethod

Backward pass for summation.

Warning

This method is invoked by the autodiff engine and should not be called directly.

Parameters:

  • ctx: Context object containing saved information from the forward pass. (required)
  • grad_output (Variable): The gradient of the output Variable with respect to the output of the forward() method. (required)

Returns:

  • grad_inputs (tuple[Variable | None, ...]): A tuple of gradients corresponding to each input Variable, in the same order as the inputs to forward(). Each gradient is a Variable if the corresponding input requires gradients, otherwise None.

Raises:

  • TypeError: If grad_output is not an instance of Variable.

Source code in afnio/autodiff/basic_ops.py
@staticmethod
def backward(ctx, grad_output: Variable) -> Tuple[Optional[Variable], ...]:
    """
    Backward pass for summation.

    Warning:
        This method is invoked by the autodiff engine
        and should not be called directly.

    Args:
        ctx: Context object containing saved information from the
            [`forward`][..forward] pass.
        grad_output: The gradient of the output `Variable` with respect to
            the output of the `forward()` method.

    Returns:
        grad_inputs: A tuple of gradients corresponding to each input `Variable`
            in the same order as the inputs to `forward()`. Each gradient is a
            `Variable` if the corresponding input requires gradients,
            otherwise `None`.

    Raises:
        TypeError: If `grad_output` is not an instance
            of [`Variable`][afnio.Variable].
    """
    raise NotImplementedError(
        "Sum.backward is implemented on the server. "
        "Client-side execution is not supported."
    )

afnio.autodiff.basic_ops.Split

Bases: Function

Implements a split operation for Variable instances within the afnio framework, supporting automatic differentiation.

This class inherits from Function and requires both the forward and backward methods to be defined.

The Split function divides the data of the input Variable into multiple parts using a specified delimiter sep. If maxsplit is specified, the split operation is limited to a maximum number of splits. It handles both scalar and list data fields:

  • Scalars: The scalar data (a single string) is split into substrings based on the specified sep and maxsplit parameters.
  • Lists: Each element of the list data (strings) is split individually. If splits of varying lengths occur, shorter splits are automatically padded with empty strings to ensure consistent dimensions.

During backpropagation, feedback is collected and aggregated across all split parts. The combined feedback is propagated back to the original input Variable, allowing for the proper computation of gradients.
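The splitting and padding rules above can be sketched in plain Python. This is not the afnio implementation (which runs server-side); `split_data` is a hypothetical helper mirroring only the documented `data` semantics:

```python
def split_data(data, sep=None, maxsplit=-1):
    """Mirror the documented Split semantics, padding ragged results with ''."""
    if isinstance(data, str):
        # Scalar data: a single string is split directly
        return data.split(sep, maxsplit)
    # List data: split each element, then pad shorter results with empty
    # strings so all splits have consistent dimensions
    parts = [s.split(sep, maxsplit) for s in data]
    width = max(len(p) for p in parts)
    padded = [p + [""] * (width - len(p)) for p in parts]
    # Regroup so the i-th output holds the i-th part of every input element
    return [list(group) for group in zip(*padded)]

print(split_data("afnio is great!", sep=" ", maxsplit=1))
# ['afnio', 'is great!']
print(split_data(["afnio is great!", "Deep learning"], sep=" ", maxsplit=2))
# [['afnio', 'Deep'], ['is', 'learning'], ['great!', '']]
```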

Examples:

Example with scalar inputs:

>>> x = Variable(data="afnio is great!", role="sentence", requires_grad=True)
>>> result = Split.apply(x, sep=" ", maxsplit=1)
>>> [var.data for var in result]
['afnio', 'is great!']
>>> result[0].role
'split part 0 of sentence'
>>> g_1 = Variable(data="MY_FIRST_FEEDBACK", role="gradient")
>>> g_2 = Variable(data="MY_SECOND_FEEDBACK", role="gradient")
>>> result[0].backward(g_1, retain_graph=True)
>>> result[1].backward(g_2)
>>> x.grad[0].data
'Here is the combined feedback we got for this specific sentence and other variables: <ITEM>MY_FIRST_FEEDBACK</ITEM><ITEM></ITEM>'
>>> x.grad[0].role
'feedback to sentence'
>>> x.grad[1].data
'Here is the combined feedback we got for this specific sentence and other variables: <ITEM></ITEM><ITEM>MY_SECOND_FEEDBACK</ITEM>'
>>> x.grad[1].role
'feedback to sentence'

Example with batched inputs:

>>> x = Variable(
...     data=["afnio is great!", "Deep learning"],
...     role="sentences",
...     requires_grad=True
... )
>>> result = Split.apply(x, sep=" ", maxsplit=2)
>>> [var.data for var in result]
[['afnio', 'Deep'], ['is', 'learning'], ['great!', '']]
>>> g = Variable(data="MY_FEEDBACK", role="gradient")
>>> result[1].backward(g)
>>> x.grad[0].data
'Here is the combined feedback we got for this specific sentences and other variables: <ITEM></ITEM><ITEM>MY_FEEDBACK</ITEM><ITEM></ITEM>'
>>> x.grad[0].role
'feedback to sentences'
Source code in afnio/autodiff/basic_ops.py
class Split(Function):
    """
    Implements a split operation for [`Variable`][afnio.Variable] instances within the
    `afnio` framework, supporting automatic differentiation.

    This class inherits from [`Function`][afnio.autodiff.function.Function] and
    requires both the [`forward`][afnio.autodiff.function.Function.forward] and
    [`backward`][afnio.autodiff.function.Function.backward] methods to be defined.

    The `Split` function divides the [`data`][afnio.Variable.data] of the input
    [`Variable`][afnio.Variable] into multiple parts using a specified delimiter `sep`.
    If `maxsplit` is specified, the split operation is limited to a maximum number of
    splits. It handles both scalar and list [`data`][afnio.Variable.data] fields:

    - **Scalars**: The scalar [`data`][afnio.Variable.data] (a single string) is split
        into substrings based on the specified `sep` and `maxsplit` parameters.
    - **Lists**: Each element of the list [`data`][afnio.Variable.data] (strings) is
        split individually. If splits of varying lengths occur, shorter splits are
        automatically padded with empty strings to ensure consistent dimensions.

    During backpropagation, feedback is collected and aggregated across all split parts.
    The combined feedback is propagated back to the original input
    [`Variable`][afnio.Variable], allowing for the proper computation of gradients.

    Examples:
        Example with scalar inputs:
        >>> x = Variable(data="afnio is great!", role="sentence", requires_grad=True)
        >>> result = Split.apply(x, sep=" ", maxsplit=1)
        >>> [var.data for var in result]
        ['afnio', 'is great!']
        >>> result[0].role
        'split part 0 of sentence'
        >>> g_1 = Variable(data="MY_FIRST_FEEDBACK", role="gradient")
        >>> g_2 = Variable(data="MY_SECOND_FEEDBACK", role="gradient")
        >>> result[0].backward(g_1, retain_graph=True)
        >>> result[1].backward(g_2)
        >>> x.grad[0].data
        'Here is the combined feedback we got for this specific sentence and other variables: <ITEM>MY_FIRST_FEEDBACK</ITEM><ITEM></ITEM>'
        >>> x.grad[0].role
        'feedback to sentence'
        >>> x.grad[1].data
        'Here is the combined feedback we got for this specific sentence and other variables: <ITEM></ITEM><ITEM>MY_SECOND_FEEDBACK</ITEM>'
        >>> x.grad[1].role
        'feedback to sentence'

        Example with batched inputs:
        >>> x = Variable(
        ...     data=["afnio is great!", "Deep learning"],
        ...     role="sentences",
        ...     requires_grad=True
        ... )
        >>> result = Split.apply(x, sep=" ", maxsplit=2)
        >>> [var.data for var in result]
        [['afnio', 'Deep'], ['is', 'learning'], ['great!', '']]
        >>> g = Variable(data="MY_FEEDBACK", role="gradient")
        >>> result[1].backward(g)
        >>> x.grad[0].data
        'Here is the combined feedback we got for this specific sentences and other variables: <ITEM></ITEM><ITEM>MY_FEEDBACK</ITEM><ITEM></ITEM>'
        >>> x.grad[0].role
        'feedback to sentences'
    """  # noqa: E501

    @staticmethod
    def forward(
        ctx,
        x: Variable,
        sep: Optional[Union[str, Variable]] = None,
        maxsplit: Optional[Union[int, Variable]] = -1,
    ) -> Tuple[Variable, ...]:
        """
        Forward pass for splitting a `Variable`.

        Warning:
            This method is invoked by
            [`apply()`][afnio.autodiff.function.Function.apply]
            and should not be called directly.

        Args:
            ctx: Context object used to save information for [`backward`][..backward]
                computation.
            x: The input `Variable` to be split.
            sep: The delimiter to use for splitting the string. If `None`, splits on
                whitespace. Can be a string or a `Variable` containing a string.
            maxsplit: The maximum number of splits to perform. If `-1`, there is no
                limit on the number of splits. Can be an integer or a `Variable`
                containing an integer.

        Returns:
            A tuple of `Variable` instances resulting from the split operation, \
            each with appropriately assigned [`data`][afnio.Variable.data], \
            [`role`][afnio.Variable.role], and \
            [`requires_grad`][afnio.Variable.requires_grad] attributes.

        Raises:
            TypeError: If `x` is not an instance of [`Variable`][afnio.Variable] whose
                [`data`][afnio.Variable.data] attribute is a string or a list of strings,
                or if `sep` is not a string or `Variable` containing a string,
                or if `maxsplit` is not an integer or `Variable` containing an integer.
        """
        raise NotImplementedError(
            "Split.forward is implemented on the server. "
            "Client-side execution is not supported."
        )

    # TODO: enable summarization of elements in `grad.data` fields in `backward()`
    #       method using an LM (ideally a small and fast SLM)
    @staticmethod
    def backward(ctx, *grad_outputs: Variable) -> Tuple[Optional[Variable], None, None]:
        """
        Backward pass for splitting a `Variable`.

        Warning:
            This method is invoked by the autodiff engine
            and should not be called directly.

        Args:
            ctx: Context object containing saved information from the
                [`forward`][..forward] pass.
            grad_outputs: A variable number of gradients corresponding to each split
                output `Variable` with respect to the output of the `forward()` method.

        Returns:
            grad_input: The gradient of the output w.r.t. the input `Variable` if it
                requires gradients, otherwise `None`. The feedback from all split parts
                is combined and propagated back to the original input `Variable`.
            None: Placeholder for the `sep` argument, which does not require a gradient.
            None: Placeholder for the `maxsplit` argument, which does not require
                a gradient.

        Raises:
            TypeError: If any element in `grad_outputs` is not an instance
                of [`Variable`][afnio.Variable].
        """
        raise NotImplementedError(
            "Split.backward is implemented on the server. "
            "Client-side execution is not supported."
        )

forward(ctx, x, sep=None, maxsplit=-1) staticmethod

Forward pass for splitting a Variable.

Warning: This method is invoked by `apply()` and should not be called directly.

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| `ctx` |  | Context object used to save information for backward computation. | *required* |
| `x` | `Variable` | The input `Variable` to be split. | *required* |
| `sep` | `str \| Variable \| None` | The delimiter to use for splitting the string. If `None`, splits on whitespace. Can be a string or a `Variable` containing a string. | `None` |
| `maxsplit` | `int \| Variable \| None` | The maximum number of splits to perform. If `-1`, there is no limit on the number of splits. Can be an integer or a `Variable` containing an integer. | `-1` |

Returns:

| Type | Description |
| ---- | ----------- |
| `tuple[Variable]` | A tuple of `Variable` instances resulting from the split operation, each with appropriately assigned `data`, `role`, and `requires_grad` attributes. |

Raises:

| Type | Description |
| ---- | ----------- |
| `TypeError` | If `x` is not an instance of `Variable` whose `data` attribute is a string or a list of strings, or if `sep` is not a string or a `Variable` containing a string, or if `maxsplit` is not an integer or a `Variable` containing an integer. |

Source code in afnio/autodiff/basic_ops.py
@staticmethod
def forward(
    ctx,
    x: Variable,
    sep: Optional[Union[str, Variable]] = None,
    maxsplit: Optional[Union[int, Variable]] = -1,
) -> Tuple[Variable]:
    """
    Forward pass for splitting a `Variable`.

    Warning:
        This method is invoked by
        [`apply()`][afnio.autodiff.function.Function.apply]
        and should not be called directly.

    Args:
        ctx: Context object used to save information for [`backward`][..backward]
            computation.
        x: The input `Variable` to be split.
        sep: The delimiter to use for splitting the string. If `None`, splits on
            whitespace. Can be a string or a `Variable` containing a string.
        maxsplit: The maximum number of splits to perform. If `-1`, there is no
            limit on the number of splits. Can be an integer or a `Variable`
            containing an integer.

    Returns:
        A tuple of `Variable` instances resulting from the split operation, \
        each with appropriately assigned [`data`][afnio.Variable.data], \
        [`role`][afnio.Variable.role], and \
        [`requires_grad`][afnio.Variable.requires_grad] attributes.

    Raises:
        TypeError: If `x` is not an instance of [`Variable`][afnio.Variable]
            whose [`data`][afnio.Variable.data] attribute is a string or a
            list of strings, or if `sep` is not a string or a `Variable`
            containing a string, or if `maxsplit` is not an integer or a
            `Variable` containing an integer.
    """
    raise NotImplementedError(
        "Split.forward is implemented on the server. "
        "Client-side execution is not supported."
    )
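Although `Split.forward` executes on the server, the documented `sep` and `maxsplit` rules mirror Python's built-in `str.split`. The sketch below is a local illustration of those splitting semantics for scalar and batched data fields, not the afnio implementation; `split_data` is a hypothetical helper name.

```python
# Illustrative sketch of Split's splitting semantics (not the server-side
# implementation). Mirrors str.split: sep=None splits on whitespace,
# maxsplit=-1 means no limit on the number of splits.
from typing import List, Optional, Union


def split_data(
    data: Union[str, List[str]],
    sep: Optional[str] = None,
    maxsplit: int = -1,
) -> Union[List[str], List[List[str]]]:
    """Split a scalar string, or each string in a batched list."""
    if isinstance(data, str):
        return data.split(sep, maxsplit)
    if isinstance(data, list) and all(isinstance(s, str) for s in data):
        # Batched input: split each element independently.
        return [s.split(sep, maxsplit) for s in data]
    raise TypeError("data must be a string or a list of strings")


print(split_data("a,b,c", sep=","))             # ['a', 'b', 'c']
print(split_data("one two three", maxsplit=1))  # ['one', 'two three']
print(split_data(["x-y", "p-q"], sep="-"))      # [['x', 'y'], ['p', 'q']]
```

In the real operation each resulting piece becomes its own `Variable`, so a downstream node can attach feedback to an individual part.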

backward(ctx, *grad_outputs) staticmethod

Backward pass for splitting a Variable.

Warning: This method is invoked by the autodiff engine and should not be called directly.

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| `ctx` |  | Context object containing saved information from the forward pass. | *required* |
| `grad_outputs` | `Variable` | A variable number of gradients corresponding to each split output `Variable` with respect to the output of the `forward()` method. | `()` |

Returns:

| Name | Type | Description |
| ---- | ---- | ----------- |
| `grad_input` | `Variable \| None` | The gradient of the output w.r.t. the input `Variable` if it requires gradients, otherwise `None`. The feedback from all split parts is combined and propagated back to the original input `Variable`. |
| `None` | `None` | Placeholder for the `sep` argument, which does not require a gradient. |
| `None` | `None` | Placeholder for the `maxsplit` argument, which does not require a gradient. |

Raises:

| Type | Description |
| ---- | ----------- |
| `TypeError` | If any element in `x` is not an instance of `Variable` or a sequence of `Variable` instances. |

Source code in afnio/autodiff/basic_ops.py
@staticmethod
def backward(ctx, *grad_outputs: Variable) -> Tuple[Optional[Variable], None, None]:
    """
    Backward pass for splitting a `Variable`.

    Warning:
        This method is invoked by the autodiff engine
        and should not be called directly.

    Args:
        ctx: Context object containing saved information from the
            [`forward`][..forward] pass.
        grad_outputs: A variable number of gradients corresponding to each split
            output `Variable` with respect to the output of the `forward()` method.

    Returns:
        grad_input: The gradient of the output w.r.t. the input `Variable` if it
            requires gradients, otherwise `None`. The feedback from all split parts
            is combined and propagated back to the original input `Variable`.
        None: Placeholder for the `sep` argument, which does not require a gradient.
        None: Placeholder for the `maxsplit` argument, which does not require
            a gradient.

    Raises:
        TypeError: If any element in `x` is not an instance
            of [`Variable`][afnio.Variable] or a sequence
            of [`Variable`][afnio.Variable] instances.
    """
    raise NotImplementedError(
        "Split.backward is implemented on the server. "
        "Client-side execution is not supported."
    )
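The docstring states that feedback from all split parts is combined before being propagated back to the original input. A minimal local sketch of that aggregation step is shown below; it is purely illustrative (the `combine_feedback` name and the exact wording of the merged message are assumptions, not afnio's server-side behavior).

```python
# Illustrative sketch: merging per-part textual feedback into a single
# gradient message for the original input. Not the actual afnio
# implementation; combine_feedback is a hypothetical helper.
from typing import List, Optional


def combine_feedback(part_feedback: List[Optional[str]]) -> Optional[str]:
    """Join the feedback of each split part, skipping parts without any."""
    received = [fb for fb in part_feedback if fb]
    if not received:
        # No split part produced feedback, so no gradient flows back.
        return None
    return (
        "Here is the combined feedback we got for the split parts "
        "of this input: " + "; ".join(received)
    )


print(combine_feedback(["first part too long", None, "third part off-topic"]))
```

This mirrors the scalar `Add.backward` example shown earlier, where a single feedback string is wrapped into a gradient message addressed to the input that requires gradients.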