Parametric inference with universal function approximators

Staff working papers set out research in progress by our staff, with the aim of encouraging comments and debate.
Published on 08 March 2019

Staff Working Paper No. 784

By Andreas Joseph

Universal function approximators, such as artificial neural networks, can learn a large variety of target functions arbitrarily well given sufficient training data. This flexibility comes at the cost of the ability to perform parametric inference. We address this gap by proposing a generic framework based on the Shapley-Taylor decomposition of a model. A surrogate parametric regression analysis is performed in the space spanned by the Shapley value expansion of the model, which allows standard hypotheses of interest to be tested. At the same time, the proposed approach provides novel insights into statistical learning processes themselves, derived from the consistency and bias properties of the nonparametric estimators. We apply the framework to the estimation of heterogeneous treatment effects in simulated and real-world randomised experiments. We introduce an explicit treatment function based on higher-order Shapley-Taylor indices, which can be used to identify potentially complex treatment channels and to support the generalisation of findings from experimental settings. More generally, the presented approach allows for a standardised use and communication of results from machine learning models.
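To illustrate the core idea, the following is a minimal sketch of a Shapley regression in Python: fit a flexible model, decompose its predictions into Shapley value components, then run a surrogate linear regression of the outcome on those components. The package choices (shap, scikit-learn, statsmodels) and the simulated data are illustrative assumptions, not the paper's reference implementation; the code and data accompanying the paper are linked below.

```python
# Sketch of a surrogate regression in the space spanned by a model's
# Shapley value expansion. Assumptions: shap, scikit-learn and
# statsmodels are installed; data are simulated for illustration.
import numpy as np
import shap
import statsmodels.api as sm
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Simulated data with a linear and a nonlinear signal component.
n, k = 2000, 3
X = rng.normal(size=(n, k))
y = 1.5 * X[:, 0] + np.sin(X[:, 1]) + 0.1 * rng.normal(size=n)

# Step 1: fit a flexible model (here, a random forest stands in for
# a generic universal function approximator).
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Step 2: Shapley value expansion of the model's predictions,
# one additive component per feature.
phi = shap.TreeExplainer(model).shap_values(X)  # shape (n, k)

# Step 3: surrogate regression of the outcome on the Shapley
# components. A feature contributing no signal should have a
# coefficient near zero; a well-captured component, near one.
surrogate = sm.OLS(y, sm.add_constant(phi)).fit()
print(surrogate.summary())
```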

This version was updated in July 2020. The Staff Working Paper was first published on 8 March 2019 under the title ‘Shapley regressions: a framework for statistical inference on machine learning models’.


The code and data to reproduce all analyses presented in this paper are provided on GitHub.