Most differentiable programming frameworks work by constructing a graph containing the control flow and data structures in the program.[7] Approaches generally fall into two groups:
Static, compiled graph-based approaches such as TensorFlow,[note 1] Theano, and MXNet. They tend to allow for good compiler optimization and easier scaling to large systems, but their static nature limits interactivity and the types of programs that can be created easily (e.g., those involving loops or recursion), and also makes it harder for users to reason effectively about their programs (a toy sketch contrasting the two styles follows this list).[7] A proof-of-concept compiler toolchain called Myia uses a subset of Python as a front end and supports higher-order functions, recursion, and higher-order derivatives.[8][9][10]
Operator-overloading, dynamic graph-based approaches such as PyTorch and the NumPy-based autograd package. Their dynamic and interactive nature lets most programs be written and reasoned about more easily. However, they incur interpreter overhead (particularly when composing many small operations), scale less well, and benefit less from compiler optimization.[9][10] Zygote, a package for the Julia programming language, works directly on Julia's intermediate representation, allowing it to still be optimized by Julia's just-in-time compiler.[7][11][5]
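To make the contrast concrete, the following is a minimal, framework-free sketch in plain Python. All names in it (Var, Mul, Dual, and so on) are hypothetical and belong to no real library. The first part fixes the whole computation graph before any execution, as in the static approach; the second records derivatives while ordinary code runs, as in the operator-overloading approach.

# Toy sketch only: hypothetical names, no framework APIs.

# --- Static style: the graph for f(x, y) = x*x*y is built before execution.
class Var:
    pass

class Mul:
    def __init__(self, a, b):
        self.a, self.b = a, b

def evaluate(node, env):
    """Run a pre-built graph against concrete input values."""
    if isinstance(node, Var):
        return env[node]
    return evaluate(node.a, env) * evaluate(node.b, env)

def derivative(node, wrt, env):
    """Differentiate the fixed graph: product rule (ab)' = a'b + ab'."""
    if isinstance(node, Var):
        return 1.0 if node is wrt else 0.0
    return (derivative(node.a, wrt, env) * evaluate(node.b, env)
            + evaluate(node.a, env) * derivative(node.b, wrt, env))

x, y = Var(), Var()
graph = Mul(Mul(x, x), y)                      # fixed graph for x**2 * y
print(evaluate(graph, {x: 3.0, y: 2.0}))       # 18.0
print(derivative(graph, x, {x: 3.0, y: 2.0}))  # df/dx = 2*x*y = 12.0

# --- Dynamic style: overloaded operators compute derivatives as plain code runs.
class Dual:
    """Forward-mode dual number carrying a value and its derivative."""
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv
    def __mul__(self, other):
        if not isinstance(other, Dual):
            other = Dual(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

def f(x, y):
    return x * x * y    # ordinary Python; loops and branches would also work

out = f(Dual(3.0, 1.0), Dual(2.0))             # seed dx/dx = 1
print(out.value, out.deriv)                    # 18.0 12.0

The dynamic half is forward-mode for brevity; frameworks such as PyTorch instead record a tape for reverse mode, but the key point is the same: the graph is discovered at run time, so ordinary control flow works unchanged.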
A limitation of earlier approaches is that they can only differentiate code written in a suitable manner for the framework, which limits their interoperability with other programs. Newer approaches resolve this issue by constructing the graph from the language's syntax or intermediate representation, allowing arbitrary code to be differentiated, as sketched below.[7][9]
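A minimal sketch of that idea in plain Python (3.9 or later, for ast.unparse), differentiating a tiny expression directly at the syntax level with the standard ast module. It handles only names, constants, + and *; real systems such as Zygote operate on the compiler's intermediate representation with a full set of rules, so this only illustrates that the program text itself, rather than a user-built graph, is the input to differentiation.

import ast

def diff(node, wrt):
    """Rewrite an AST into the AST of its derivative with respect to `wrt`."""
    if isinstance(node, ast.Name):
        return ast.Constant(1.0 if node.id == wrt else 0.0)
    if isinstance(node, ast.Constant):
        return ast.Constant(0.0)
    if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Add):
        # sum rule: (u + v)' = u' + v'
        return ast.BinOp(diff(node.left, wrt), ast.Add(), diff(node.right, wrt))
    if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Mult):
        # product rule: (uv)' = u'v + uv'
        return ast.BinOp(
            ast.BinOp(diff(node.left, wrt), ast.Mult(), node.right),
            ast.Add(),
            ast.BinOp(node.left, ast.Mult(), diff(node.right, wrt)))
    raise NotImplementedError(ast.dump(node))

expr = ast.parse("x * x * y", mode="eval").body
print(ast.unparse(diff(expr, "x")))
# prints: (1.0 * x + x * 1.0) * y + x * x * 0.0   (an unsimplified 2*x*y)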
↑ Innes, Mike; Edelman, Alan; Fischer, Keno; Rackauckas, Chris; Saba, Elliot; Shah, Viral B.; Tebbutt, Will (2019). "∂P: A Differentiable Programming System to Bridge Machine Learning and Scientific Computing".
↑ Degrave, Jonas; Hermans, Michiel; Dambre, Joni; wyffels, Francis (2016-11-05). "A Differentiable Physics Engine for Deep Learning in Robotics". arXiv:1611.01652 [cs.NE].
↑ Li, Li; Hoyer, Stephan; Pederson, Ryan; Sun, Ruoxi; Cubuk, Ekin D.; Riley, Patrick; Burke, Kieron (2021). "Kohn-Sham Equations as Regularizer: Building Prior Knowledge into Machine-Learned Physics". Physical Review Letters 126 (3): 036401. doi:10.1103/PhysRevLett.126.036401. PMID 33543980. Bibcode:2021PhRvL.126c6401L.