Abstract
Algorithmic differentiation (AD) by source transformation is an established method for computing derivatives of computational algorithms. Static dataflow analysis is commonly used by AD tools to determine the set of active variables, that is, variables that are influenced by the program input in a differentiable way and have a differentiable influence on the program output. In this work, a context-sensitive static analysis combined with procedure cloning is used to generate specialised versions of differentiated procedures for each call site. This enables better detection and elimination of unused computations and memory storage, resulting in performance improvements of the generated code in both forward- and reverse-mode AD. The implications of this multi-activity AD approach for the static analysis of an AD tool are shown using dataflow equations. The worst-case cost of multi-activity AD on the differentiation process is analysed, and practical remedies to avoid running into this worst case are presented. The method was implemented in the AD tool Tapenade, and we present its application to a 3D unstructured compressible flow solver, for which we generate an adjoint solver that performs significantly faster when multi-activity AD is used.