MyGrad

Latest version: v2.2.0


1.8.0

1.7.1

Fixes a bug in :func:`~mygrad.nnet.losses.negative_log_likelihood`, where setting ``constant=True`` had no effect.
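
A minimal sketch of the fixed behavior (an illustrative example, not taken from the release notes; it assumes the loss accepts log-probabilities of shape ``(N, C)`` and integer class labels)::

    import numpy as np
    import mygrad as mg
    from mygrad.nnet.losses import negative_log_likelihood

    # hypothetical batch: 3 samples over 4 classes
    scores = mg.Tensor(np.random.rand(3, 4) + 0.1)
    log_probs = mg.log(scores / scores.sum(axis=1, keepdims=True))
    labels = np.array([0, 3, 1])

    # with this fix, ``constant=True`` is honored: the resulting loss
    # is flagged as a constant and will not back-propagate gradients
    loss = negative_log_likelihood(log_probs, labels, constant=True)
    assert loss.constant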

1.7.0

This release continues the process of integrating functions from `mynn <https://github.com/davidmascharka/MyNN>`_.

New features:

- Adds :func:`~mygrad.nnet.initializers.glorot_normal`
- Adds :func:`~mygrad.nnet.initializers.glorot_uniform`
- Adds :func:`~mygrad.nnet.initializers.he_normal`
- Adds :func:`~mygrad.nnet.initializers.he_uniform`
- Adds :func:`~mygrad.nnet.initializers.normal`
- Adds :func:`~mygrad.nnet.initializers.uniform`
- Adds :func:`~mygrad.nnet.losses.focal_loss`
- Adds :func:`~mygrad.nnet.losses.negative_log_likelihood`

Big thanks to David Mascharka!
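
A brief usage sketch of the new initializers (an illustrative example, not taken from the release notes; it assumes these functions accept an unpacked shape and return a MyGrad tensor)::

    from mygrad.nnet.initializers import glorot_uniform, he_normal

    # hypothetical fully-connected layers: 128 -> 64 -> 10
    w1 = he_normal(128, 64)      # weights drawn from the He-normal distribution
    w2 = glorot_uniform(64, 10)  # weights drawn from the Glorot-uniform distribution
    print(w1.shape, w2.shape)    # (128, 64) (64, 10)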

Improvements:

The interfaces to :func:`~mygrad.reshape` and :func:`~mygrad.Tensor.reshape` were adjusted to exactly match the interfaces of their NumPy counterparts.
That is, :func:`~mygrad.reshape` now requires ``newshape`` to be a sequence, whereas :func:`~mygrad.Tensor.reshape` can accept an unpacked sequence for its
``newshape``.
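
The distinction can be sketched as follows (an illustrative example, not part of the release notes)::

    import mygrad as mg

    x = mg.arange(6)

    mg.reshape(x, (2, 3))  # function form: ``newshape`` must be a sequence
    x.reshape(2, 3)        # method form: the shape may be passed unpacked...
    x.reshape((2, 3))      # ...or as a sequence, mirroring ``numpy.ndarray.reshape``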

:func:`~mygrad.Tensor.shape` is now settable; assigning to it triggers an in-place reshape of the tensor, matching the corresponding behavior in NumPy.
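
For example (an illustrative sketch, not part of the release notes)::

    import mygrad as mg

    x = mg.arange(6)
    x.shape = (2, 3)  # reshapes ``x`` in place, as with a NumPy array
    print(x.shape)    # (2, 3)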

Internal changes:

The logic for writing an in-place operation has been consolidated into a convenient wrapper: :func:`~mygrad.Tensor._in_place_op`.

1.6.0
