Differentiable programming
Differentiable programming is a programming paradigm in which a numeric computer program can be differentiated throughout via automatic differentiation. This allows for gradient-based optimization of parameters in the program, often via gradient descent, and other learning approaches based on higher-order derivative information. Differentiable programming has found use in a wide variety of areas, particularly scientific computing and artificial intelligence. One of the early proposals to adopt such a framework in a systematic fashion to improve upon learning algorithms was made by the Advanced Concepts Team at the European Space Agency in early 2016.
Differentiable programming refers to using automatic differentiation in a way that allows a program to optimize its parameters in order to get better at some task. It requires only three things:
- A parameterized function or model to be optimized,
- A loss that is suitable to measure performance, and
- (Automatic) differentiability of the object to be optimized
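These three ingredients can be sketched in a few lines of plain Python using forward-mode automatic differentiation via dual numbers. This is a minimal illustration, not any library's API; the model, loss, and all names are invented for the example:

```python
# Minimal sketch of the three ingredients (illustrative, pure Python).

class Dual:
    """Dual number a + b*eps (eps**2 == 0); the eps part carries the derivative."""
    def __init__(self, val, grad=0.0):
        self.val, self.grad = val, grad
    def _wrap(self, x):
        return x if isinstance(x, Dual) else Dual(x)
    def __mul__(self, other):
        o = self._wrap(other)
        return Dual(self.val * o.val, self.val * o.grad + self.grad * o.val)
    def __sub__(self, other):
        o = self._wrap(other)
        return Dual(self.val - o.val, self.grad - o.grad)

# 1. A parameterized model: f(x) = w * x
def model(w, x):
    return w * x

# 2. A loss measuring performance against a target (x = 3, target y = 6)
def loss(w):
    err = model(w, 3.0) - 6.0
    return err * err

# 3. Automatic differentiability: seed w with derivative 1, read off d(loss)/dw
w = 1.0
for _ in range(50):          # gradient descent
    g = loss(Dual(w, 1.0)).grad
    w -= 0.05 * g
# w converges toward the optimum w = 2.0
```

Seeding the parameter with derivative 1 makes every arithmetic operation propagate the derivative alongside the value, so the gradient used for descent falls out of the ordinary execution of the program.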
Approaches:
- Static, compiled graph-based approaches such as TensorFlow, Theano, and MXNet. They tend to allow for good compiler optimization and easier scaling to large systems, but their static nature limits interactivity and the types of programs that can be created easily (e.g. those involving loops or recursion), as well as making it harder for users to reason effectively about their programs. A proof-of-concept compiler toolchain called Myia uses a subset of Python as a front end and supports higher-order functions, recursion, and higher-order derivatives.
- Operator overloading, dynamic graph-based approaches such as PyTorch and Autograd. Their dynamic and interactive nature lets most programs be written and reasoned about more easily. However, they lead to interpreter overhead (particularly when composing many small operations), poorer scalability, and reduced benefit from compiler optimization. A package for the Julia programming language — Zygote — works directly on Julia’s intermediate representation, allowing it to still be optimized by Julia’s just-in-time compiler.
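The operator-overloading, dynamic style can be sketched as a tape-based reverse-mode differentiator in pure Python. The graph is recorded as ordinary code executes, which is loosely how such systems work; this is an illustrative toy, not PyTorch's or Autograd's real API:

```python
# Toy tape-based reverse-mode AD via operator overloading (illustrative only).

class Var:
    def __init__(self, value):
        self.value, self.grad = value, 0.0
        self._backward, self._parents = lambda: None, ()

    def __add__(self, other):
        out = Var(self.value + other.value)
        out._parents = (self, other)
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Var(self.value * other.value)
        out._parents = (self, other)
        def _backward():
            self.grad += other.value * out.grad
            other.grad += self.value * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the recorded graph, then apply the chain rule.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

# The graph is recorded while ordinary Python code runs:
x = Var(3.0)
y = x * x + x      # y = x**2 + x
y.backward()
print(x.grad)      # dy/dx = 2*3 + 1 = 7.0
```

Because the tape is rebuilt on every execution, arbitrary Python control flow (loops, recursion, branches on data) works naturally — at the cost of interpreter overhead on each run, as noted above.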
Organizations that use differentiable programming:
DeepMind, OpenAI, Mila, Google Brain, Meta AI
Applications:
- Scientific Machine Learning
- Universal Differential Equations
- Probabilistic Programming
- Reinforcement Learning
- Physics Engines in Robotics
- Differentiable Ray Tracing
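The scientific-computing applications above share one idea: an entire simulation is made differentiable, so its physical parameters can be fit by gradient descent. As a hedged sketch (pure Python, dual-number forward-mode AD; all values and names are invented), here is a decay rate recovered by differentiating through an explicit Euler integrator:

```python
# Differentiable simulation sketch: fit the decay rate k of dy/dt = -k*y
# by gradient descent *through* an ODE integrator (illustrative only).

class Dual:
    """Dual number a + b*eps (eps**2 == 0); .grad carries d/dk."""
    def __init__(self, val, grad=0.0):
        self.val, self.grad = val, grad
    def __add__(self, o):
        return Dual(self.val + o.val, self.grad + o.grad)
    def __sub__(self, o):
        return Dual(self.val - o.val, self.grad - o.grad)
    def __mul__(self, o):
        return Dual(self.val * o.val, self.val * o.grad + self.grad * o.val)
    def __rmul__(self, c):          # float * Dual
        return Dual(c * self.val, c * self.grad)
    def __neg__(self):
        return Dual(-self.val, -self.grad)

def simulate(k, dt=0.1, steps=20):
    """Explicit Euler for dy/dt = -k*y, y(0) = 1, as an ordinary loop."""
    y = Dual(1.0)
    for _ in range(steps):
        y = y + dt * (-k) * y
    return y

target = 0.95 ** 20        # endpoint the true rate k = 0.5 would produce
k = 0.1                    # initial guess
for _ in range(300):       # gradient descent through the whole simulation
    err = simulate(Dual(k, 1.0)) - Dual(target)
    sq = err * err
    k -= 0.5 * sq.grad
# k recovers the true decay rate, ~0.5
```

The derivative flows through every timestep of the loop, which is exactly the kind of program that dynamic differentiable-programming systems handle naturally.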