afnio.autodiff.grad_mode

afnio.autodiff.grad_mode.is_grad_enabled()

Check whether grad mode is currently enabled.

Returns:

    bool: True if grad mode is currently enabled, False otherwise.

Source code in afnio/autodiff/grad_mode.py
def is_grad_enabled() -> bool:
    """
    Check whether grad mode is currently enabled.

    Returns:
        `True` if grad mode is currently enabled, `False` otherwise.
    """
    return getattr(_grad_enabled, "enabled", True)
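
The `enabled` flag lives on a `threading.local`, which is why `is_grad_enabled` falls back to a default of `True`: attributes set on a `threading.local` in one thread are invisible to every other thread, so a thread that never called `set_grad_enabled` sees gradients as enabled. A minimal stdlib-only sketch of this mechanism (the `_grad_enabled` name is taken from the source above; `afnio` itself is not needed to run it):

```python
import threading

# Mirrors the module's thread-local storage for the grad flag.
_grad_enabled = threading.local()


def is_grad_enabled() -> bool:
    # getattr with a default of True: threads that never set the
    # attribute see gradients as enabled.
    return getattr(_grad_enabled, "enabled", True)


# A thread that has not set the flag gets the default.
assert is_grad_enabled() is True

# Disable gradients in the main thread only.
_grad_enabled.enabled = False
results = {}


def worker():
    # The attribute set in the main thread is invisible here;
    # the worker still sees the default of True.
    results["worker"] = is_grad_enabled()


t = threading.Thread(target=worker)
t.start()
t.join()

print(is_grad_enabled(), results["worker"])  # False in main thread, True in worker
```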

afnio.autodiff.grad_mode.set_grad_enabled(mode)

Set the global state of gradient calculation on or off.

set_grad_enabled will enable or disable gradients based on its argument mode.

Parameters:

    mode (bool, required): If True, enables gradient calculation. If False, disables it.

Examples:

>>> x = afnio.Variable("Hello", requires_grad=True)
>>> _ = afnio.set_grad_enabled(True)
>>> y = x + x
>>> y.requires_grad
True
>>> _ = afnio.set_grad_enabled(False)
>>> y = x + x
>>> y.requires_grad
False
Source code in afnio/autodiff/grad_mode.py
def set_grad_enabled(mode: bool):
    """
    Set the global state of gradient calculation on or off.

    `set_grad_enabled` will enable or disable gradients based on its argument `mode`.

    Args:
        mode: If `True`, enables gradient calculation. If `False`, disables it.

    Examples:
        >>> x = afnio.Variable("Hello", requires_grad=True)
        >>> _ = afnio.set_grad_enabled(True)
        >>> y = x + x
        >>> y.requires_grad
        True
        >>> _ = afnio.set_grad_enabled(False)
        >>> y = x + x
        >>> y.requires_grad
        False
    """
    _grad_enabled.enabled = mode

afnio.autodiff.grad_mode.no_grad()

Context manager that disables gradient calculation. All operations within this block will not track gradients, making them more memory-efficient.

Disabling gradient calculation is useful for inference, when you are sure that you will not call Variable.backward(). It will reduce memory consumption for computations that would otherwise have requires_grad=True.

In this mode, the result of every computation has requires_grad=False, even when the inputs have requires_grad=True. There is one exception: factory functions, i.e. functions that create a new Variable and take a requires_grad keyword argument, are not affected by this mode.

This context manager is thread local; it will not affect computation in other threads.

Also functions as a decorator.

Examples:

>>> x = afnio.Variable("abc", role="variable", requires_grad=True)
>>> with afnio.no_grad():
...     y = x + x
>>> y.requires_grad
False
>>> @afnio.no_grad()
... def doubler(x):
...     return x + x
>>> z = doubler(x)
>>> z.requires_grad
False
>>> @afnio.no_grad()
... def tripler(x):
...     return x + x + x
>>> z = tripler(x)
>>> z.requires_grad
False
>>> # factory function exception
>>> with afnio.no_grad():
...     a = afnio.cognitive.Parameter("xyz")
>>> a.requires_grad
True
Source code in afnio/autodiff/grad_mode.py
@contextmanager
def no_grad():
    """
    Context manager that disables gradient calculation. All operations within this block
    will not track gradients, making them more memory-efficient.

    Disabling gradient calculation is useful for inference, when you are sure
    that you will not call [`Variable.backward()`][afnio.Variable.backward]. It will
    reduce memory consumption for computations that would otherwise have
    `requires_grad=True`.

    In this mode, the result of every computation will have
    `requires_grad=False`, even when the inputs have `requires_grad=True`.
    There is an exception! All factory functions, or functions that create
    a new Variable and take a requires_grad kwarg, will NOT be affected by
    this mode.

    This context manager is thread local; it will not affect computation
    in other threads.

    Also functions as a decorator.

    Examples:
        >>> x = afnio.Variable("abc", role="variable", requires_grad=True)
        >>> with afnio.no_grad():
        ...     y = x + x
        >>> y.requires_grad
        False
        >>> @afnio.no_grad()
        ... def doubler(x):
        ...     return x + x
        >>> z = doubler(x)
        >>> z.requires_grad
        False
        >>> @afnio.no_grad()
        ... def tripler(x):
        ...     return x + x + x
        >>> z = tripler(x)
        >>> z.requires_grad
        False
        >>> # factory function exception
        >>> with afnio.no_grad():
        ...     a = afnio.cognitive.Parameter("xyz")
        >>> a.requires_grad
        True
    """
    previous_state = is_grad_enabled()  # Store the current state
    set_grad_enabled(False)  # Disable gradients
    try:
        yield  # Execute the block
    finally:
        set_grad_enabled(previous_state)  # Restore the original state
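
The try/finally in the source above is what makes no_grad exception-safe: the previous grad state is restored even if the block raises. A self-contained sketch reusing the three functions exactly as shown, so afnio itself is not required to run it:

```python
import threading
from contextlib import contextmanager

# Stand-alone copies of the functions documented on this page.
_grad_enabled = threading.local()


def is_grad_enabled() -> bool:
    return getattr(_grad_enabled, "enabled", True)


def set_grad_enabled(mode: bool):
    _grad_enabled.enabled = mode


@contextmanager
def no_grad():
    previous_state = is_grad_enabled()  # Store the current state
    set_grad_enabled(False)  # Disable gradients
    try:
        yield
    finally:
        set_grad_enabled(previous_state)  # Restore the original state


# The finally clause restores the previous state even when the
# block exits via an exception.
try:
    with no_grad():
        assert is_grad_enabled() is False
        raise RuntimeError("boom")
except RuntimeError:
    pass

print(is_grad_enabled())  # True: the default state was restored
```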