
afnio.autodiff.graph

afnio.autodiff.graph.Node

Base class for nodes in the autograd computation graph.

A Node represents a single operation in the backward graph and is responsible for producing gradients during backpropagation. Each Node may have multiple inputs and outputs and is connected to other nodes via GradientEdge objects.

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `next_functions` | `tuple[GradientEdge]` | The gradient edges to this node's parent nodes in the backward graph. |
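To make the backward-graph structure concrete, here is a self-contained sketch that mirrors the `Node` and `GradientEdge` classes listed below, so it runs without afnio installed. The `AccumulateGrad` and `SumBackward0` subclasses and the toy graph are purely illustrative:

```python
from typing import NamedTuple, Optional, Tuple


class Node:
    """Minimal mirror of afnio.autodiff.graph.Node (illustration only)."""

    def __init__(self, next_functions: Optional[Tuple["GradientEdge", ...]] = None):
        self._next_functions = next_functions if next_functions else ()
        self._name = type(self).__name__

    def name(self) -> str:
        return self._name

    @property
    def next_functions(self) -> Tuple["GradientEdge", ...]:
        return self._next_functions


class GradientEdge(NamedTuple):
    node: Node
    output_nr: int


class AccumulateGrad(Node):
    """Leaf node: accumulates gradients into a Variable."""


class SumBackward0(Node):
    """Backward node for a hypothetical sum of two inputs."""


# Build a tiny backward graph for c = sum(a, b): SumBackward0 points
# back at one AccumulateGrad node per leaf input, via GradientEdge.
leaf_a, leaf_b = AccumulateGrad(), AccumulateGrad()
grad_fn = SumBackward0((GradientEdge(leaf_a, 0), GradientEdge(leaf_b, 0)))

for edge in grad_fn.next_functions:
    print(edge.node.name(), edge.output_nr)
# AccumulateGrad 0
# AccumulateGrad 0
```

Walking `next_functions` recursively from a root node visits every operation that contributed to the output, which is exactly how backpropagation routes gradients.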

Source code in afnio/autodiff/graph.py
class Node:
    """
    Base class for nodes in the autograd computation graph.

    A Node represents a single operation in the backward graph and is
    responsible for producing gradients during backpropagation. Each Node
    may have multiple inputs and outputs and is connected to other nodes
    via [`GradientEdge`][..GradientEdge] objects.

    Attributes:
        next_functions: The gradient edges to this node's parent nodes
            in the backward graph.
    """

    def __init__(self, next_functions: Optional[Tuple["GradientEdge"]] = None):
        self._next_functions = next_functions if next_functions else ()
        self._name = None
        self.node_id = None

    def __repr__(self):
        return f"<afnio.autodiff.function.{self.name()} object at {hex(id(self))}>"

    def __str__(self):
        return f"<{self.name()} object at {hex(id(self))}>"

    def apply(self, *args):
        raise NotImplementedError("Subclasses should implement this method.")

    def name(self) -> str:
        """
        Returns the name of the backward node.

        Returns:
            The name of the backward node, which is typically the name of the forward operation that produced the output associated with this node.

        Examples:
            >>> import afnio
            >>> import afnio.cognitive.functional as F
            >>> a = afnio.Variable("Hello,", requires_grad=True)
            >>> b = afnio.Variable("world!", requires_grad=True)
            >>> c = F.sum([a, b])
            >>> assert isinstance(c.grad_fn, afnio.autodiff.graph.Node)
            >>> print(c.grad_fn.name())
            SumBackward0
        """  # noqa: E501
        return self._name

    @property
    def next_functions(self) -> Tuple["GradientEdge"]:
        """
        Returns the gradient edges to this node's parent nodes in the backward graph.

        Each entry is a tuple of `(Node, output_nr)` identifying where gradients
        should be propagated next during backpropagation.

        Returns:
            A tuple of [`GradientEdge`][..GradientEdge] objects representing the edges to this node's parent nodes in the backward graph.
        """  # noqa: E501
        return self._next_functions

    @next_functions.setter
    def next_functions(self, edges: Tuple["GradientEdge", ...]):
        self._next_functions = edges

next_functions property writable

Returns the gradient edges to this node's parent nodes in the backward graph.

Each entry is a tuple of `(Node, output_nr)` identifying where gradients should be propagated next during backpropagation.

Returns:

| Type | Description |
| --- | --- |
| `tuple[GradientEdge]` | A tuple of `GradientEdge` objects representing the edges to this node's parent nodes in the backward graph. |
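Since `next_functions` is a writable property, graph-construction code can attach edges after a node is created. A minimal sketch using a mirror of the class above (illustration only, not the installed afnio package):

```python
from typing import NamedTuple, Optional, Tuple


class Node:
    """Minimal mirror of afnio.autodiff.graph.Node (illustration only)."""

    def __init__(self, next_functions: Optional[Tuple["GradientEdge", ...]] = None):
        self._next_functions = next_functions if next_functions else ()

    @property
    def next_functions(self) -> Tuple["GradientEdge", ...]:
        return self._next_functions

    @next_functions.setter
    def next_functions(self, edges: Tuple["GradientEdge", ...]):
        self._next_functions = edges


class GradientEdge(NamedTuple):
    node: Node
    output_nr: int


parent = Node()
child = Node()
assert child.next_functions == ()  # defaults to an empty tuple

# Attach the edge after construction via the writable property.
child.next_functions = (GradientEdge(parent, 0),)
assert child.next_functions[0].node is parent
```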

name()

Returns the name of the backward node.

Returns:

| Type | Description |
| --- | --- |
| `str` | The name of the backward node, which is typically the name of the forward operation that produced the output associated with this node. |

Examples:

>>> import afnio
>>> import afnio.cognitive.functional as F
>>> a = afnio.Variable("Hello,", requires_grad=True)
>>> b = afnio.Variable("world!", requires_grad=True)
>>> c = F.sum([a, b])
>>> assert isinstance(c.grad_fn, afnio.autodiff.graph.Node)
>>> print(c.grad_fn.name())
SumBackward0
Source code in afnio/autodiff/graph.py
def name(self) -> str:
    """
    Returns the name of the backward node.

    Returns:
        The name of the backward node, which is typically the name of the forward operation that produced the output associated with this node.

    Examples:
        >>> import afnio
        >>> import afnio.cognitive.functional as F
        >>> a = afnio.Variable("Hello,", requires_grad=True)
        >>> b = afnio.Variable("world!", requires_grad=True)
        >>> c = F.sum([a, b])
        >>> assert isinstance(c.grad_fn, afnio.autodiff.graph.Node)
        >>> print(c.grad_fn.name())
        SumBackward0
    """  # noqa: E501
    return self._name

afnio.autodiff.graph.GradientEdge

Bases: NamedTuple

Object representing a given gradient edge within the autodiff backward graph.

A GradientEdge identifies where a gradient should be propagated during backpropagation. It points to a specific backward node and a specific output index of that node.

Each `Variable` that participates in autodiff is associated with one or more GradientEdge objects, which define how gradients flow through the computation graph.

To get the gradient edge where a given Variable gradient will be computed, you can do `edge = autodiff.graph.get_gradient_edge(variable)`.

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `node` | `Node` | The backward node responsible for producing gradients. This is an instance of a backward Function (e.g. `AccumulateGrad`, `SumBackward0`, `ChatCompletionBackward0`). |
| `output_nr` | `int` | The index of the output of `node` to which this edge corresponds. Required because a backward node may produce multiple gradient outputs. |
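Because GradientEdge is a `NamedTuple`, each edge behaves like an ordinary `(node, output_nr)` pair: fields are readable by name or by position, and the edge unpacks like a tuple. A self-contained sketch with a stand-in `Node` (illustration only):

```python
from typing import NamedTuple


class Node:
    """Stand-in for afnio.autodiff.graph.Node (illustration only)."""


class GradientEdge(NamedTuple):
    node: Node
    output_nr: int


n = Node()
edge = GradientEdge(node=n, output_nr=1)

# NamedTuple fields are accessible by name or by index...
assert edge.node is edge[0]
assert edge.output_nr == edge[1] == 1

# ...and the edge unpacks like an ordinary (node, output_nr) tuple.
target, idx = edge
assert target is n and idx == 1
```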

Source code in afnio/autodiff/graph.py
class GradientEdge(NamedTuple):
    """
    Object representing a given gradient edge within the autodiff backward graph.

    A GradientEdge identifies *where* a gradient should be propagated during
    backpropagation. It points to a specific backward node and a specific
    output index of that node.

    Each [`Variable`][..Variable] that participates in autodiff is associated with
    one or more GradientEdge objects, which define how gradients flow through the
    computation graph.

    To get the gradient edge where a given Variable gradient will be computed,
    you can do `edge = autodiff.graph.get_gradient_edge(variable)`.

    Attributes:
        node: The backward node responsible for producing gradients.
            This is an instance of a backward Function (e.g. AccumulateGrad,
            SumBackward0, ChatCompletionBackward0).
        output_nr: The index of the output of `node` to which this edge corresponds.
            Required because a backward node may produce multiple gradient
            outputs.
    """

    node: Node
    output_nr: int

    def __repr__(self):
        name = (
            f"<{self.node.name()} object at {hex(id(self.node))}>"
            if self.node
            else "None"
        )
        return f"({name}, {self.output_nr})"

    def __str__(self):
        name = (
            f"<{self.node.name()} object at {hex(id(self.node))}>"
            if self.node
            else "None"
        )
        return f"({name}, {self.output_nr})"