Papers Read on AI

November 4, 2022  

On the Versatile Uses of Partial Distance Correlation in Deep Learning


Comparing the functional behavior of neural network models, whether it is a single network over time or two (or more) networks during or post-training, is an essential step in understanding what they are learning (and what they are not), and for identifying strategies for regularization or efficiency improvements. Despite recent progress, e.g., comparing vision transformers to CNNs, systematic comparison of function, especially across different networks, remains difficult and is often carried out layer by layer. Approaches such as canonical correlation analysis (CCA) are applicable in principle, but have been sparingly used so far. In this paper, we revisit a (less widely known) measure from statistics, called distance correlation (and its partial variant), designed to evaluate correlation between feature spaces of different dimensions.
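To make the key quantity concrete, here is a minimal NumPy sketch of sample distance correlation between two feature matrices of different dimensions (this is the standard Székely–Rizzo estimator, not code from the paper; the function name and test data are illustrative):

```python
import numpy as np

def distance_correlation(X, Y):
    """Sample distance correlation between X (n, p) and Y (n, q).

    The two feature spaces may have different dimensions p and q;
    only the number of samples n must match. Returns a value in
    [0, 1], where 0 indicates independence in the population limit.
    """
    X, Y = np.atleast_2d(X), np.atleast_2d(Y)
    n = X.shape[0]
    assert Y.shape[0] == n, "X and Y need the same number of samples"

    # Pairwise Euclidean distance matrices (n x n each).
    a = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    b = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)

    # Double-center: subtract row and column means, add back grand mean.
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()

    # Squared distance covariance / variances, then normalize.
    dcov2 = (A * B).mean()
    dvar_x = (A * A).mean()
    dvar_y = (B * B).mean()
    denom = np.sqrt(dvar_x * dvar_y)
    return float(np.sqrt(max(dcov2, 0.0) / denom)) if denom > 0 else 0.0
```

In the setting of the paper, `X` and `Y` would be activation matrices from two network layers (rows indexed by the same inputs): a deterministic relationship between them yields a value near 1, while unrelated features score near 0.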

2022: Xingjian Zhen, Zihang Meng, Rudrasis Chakraborty, Vikas Singh

https://arxiv.org/pdf/2207.09684v2.pdf