Tangent, developed by Alex Wiltschko, Bart van Merriënboer and Dan Moldovan, is a new, free and open-source Python library for automatic differentiation. Existing libraries implement automatic differentiation either by tracing a program's execution at runtime, as PyTorch does, or by staging out a dynamic data-flow graph and then differentiating that graph ahead of time, as TensorFlow does. In contrast, Tangent performs ahead-of-time autodiff on the Python source code itself and produces Python source code as its output.
"In contrast to existing machine learning libraries, Tangent is a source-to-source system, consuming a Python function f and emitting a new Python function that computes the gradient of f. This allows much better user visibility into gradient computations, as well as easy user-level editing and debugging of gradients." says the site.
You can therefore use Tangent to generate gradient code in Python that is easy to interpret, debug and modify.
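To illustrate the idea, here is a hand-written sketch of what source-to-source differentiation means in practice: a plain Python function alongside a derivative function written out as ordinary Python. This is an illustration of the concept, not Tangent's actual generated output, which differs in detail.

```python
def f(x):
    # Original function: f(x) = x^2 + 3x
    y = x * x
    z = y + 3.0 * x
    return z

def df(x):
    # The kind of readable, editable derivative code a
    # source-to-source tool emits (hand-derived here).
    dz_dy = 1.0          # z = y + 3x, so dz/dy = 1
    dy_dx = 2.0 * x      # y = x^2, so dy/dx = 2x
    dz_dx_direct = 3.0   # direct contribution of the 3x term
    return dz_dy * dy_dx + dz_dx_direct

print(df(2.0))  # 2*2 + 3 = 7.0
```

Because the derivative is just another Python function, you can step through it with a debugger or edit it by hand, which is exactly the visibility the project highlights.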
Tangent is also said to come with a number of features for debugging and designing machine learning models, including:
Easily debug your backward pass
Fast gradient surgery
Forward mode automatic differentiation
Efficient Hessian-vector products
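Forward-mode automatic differentiation, one of the features listed above, is commonly explained with dual numbers: each value carries a derivative alongside it, and arithmetic propagates both. The sketch below is a minimal, self-contained illustration of that technique, not Tangent's implementation.

```python
class Dual:
    """A value paired with its derivative (a 'dual number')."""

    def __init__(self, val, dot=0.0):
        self.val = val  # primal value
        self.dot = dot  # derivative carried alongside

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__

def f(x):
    return x * x + 3.0 * x

# Seeding dot=1.0 yields df/dx at x=2 in a single forward pass:
# f(2) = 10 and f'(x) = 2x + 3, so f'(2) = 7.
out = f(Dual(2.0, 1.0))
print(out.val, out.dot)  # 10.0 7.0
```

Combining a forward pass like this over a reverse-mode gradient is the standard route to the efficient Hessian-vector products the list mentions.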
Tangent also supports TensorFlow Eager functions for processing arrays of numbers.
You can also easily inspect and debug models written with Tangent, without any special tools or indirection. Tangent works on a large and growing subset of Python, provides extra autodiff features that other Python machine learning libraries do not currently offer, is comparatively high-performance, and is compatible with TensorFlow and NumPy.
Tangent is useful for researchers and students who not only want to write their models in Python, but also want to read and debug the automatically generated derivative code without sacrificing speed or flexibility.
The current release of the project is experimental, open source and under active development. API changes are expected as Tangent continues to evolve in response to community feedback.
As for performance, although Tangent was not built with speed as its primary goal, it is competitive with major ML libraries.
"Because we are generating derivatives ahead-of-time, there is no interpretive overhead like there is with runtime autodiff libraries. We implemented a few compiler optimizations (dead code elimination, and constant folding), but we are still working on extra optimization passes to further increase performance." as written on the official site.
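Constant folding, one of the optimizations the quote mentions, means evaluating expressions whose operands are all known at compile time. Because Tangent's output is Python source, such a pass can be pictured as an AST rewrite; the snippet below is a tiny illustrative sketch of the technique using Python's standard `ast` module, not Tangent's actual optimizer.

```python
import ast

class FoldConstants(ast.NodeTransformer):
    """Fold binary ops whose operands are both compile-time constants."""

    def visit_BinOp(self, node):
        self.generic_visit(node)  # fold children first (innermost ops)
        if isinstance(node.left, ast.Constant) and isinstance(node.right, ast.Constant):
            if isinstance(node.op, ast.Add):
                return ast.copy_location(
                    ast.Constant(node.left.value + node.right.value), node)
            if isinstance(node.op, ast.Mult):
                return ast.copy_location(
                    ast.Constant(node.left.value * node.right.value), node)
        return node  # leave anything involving variables untouched

src = "y = 2 * 3 + x"
tree = ast.fix_missing_locations(FoldConstants().visit(ast.parse(src)))
print(ast.unparse(tree))  # y = 6 + x
```

The `2 * 3` is collapsed to `6` at compile time, while the part depending on the variable `x` is left for runtime, so the emitted derivative code does less work when it executes.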
More Information: GitHub