Papers Read on AI


TabPFN: A Transformer That Solves Small Tabular Classification Problems in a Second

October 31, 2022

We present TabPFN, a trained Transformer that can do supervised classification for small tabular datasets in less than a second, needs no hyperparameter tuning, and is competitive with state-of-the-art classification methods. TabPFN is fully entailed in the weights of our network, which accepts training and test samples as a set-valued input and yields predictions for the entire test set in a single forward pass. TabPFN is a Prior-Data Fitted Network (PFN) and is trained offline once, to approximate Bayesian inference on synthetic datasets drawn from our prior.
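The calling convention the abstract describes can be sketched as follows: the labeled training pairs and the unlabeled test rows are packed together into one set-valued input, and one call returns predictions for the whole test set. The `forward` function here is a hypothetical stand-in (a nearest-class-centroid rule, not the trained Transformer from the paper); it only illustrates the interface shape, not TabPFN's actual model.

```python
import numpy as np

def forward(X_train, y_train, X_test):
    """Stand-in for the trained network: one pass over the whole test set,
    conditioned on the training set. Real TabPFN uses Transformer weights;
    this toy version classifies by nearest class centroid."""
    classes = np.unique(y_train)
    centroids = np.stack([X_train[y_train == c].mean(axis=0) for c in classes])
    # distance of every test row to every class centroid, fully vectorized
    d = np.linalg.norm(X_test[:, None, :] - centroids[None, :, :], axis=-1)
    return classes[d.argmin(axis=1)]

X_train = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [4.8, 5.2]])
y_train = np.array([0, 0, 1, 1])
X_test  = np.array([[0.1, 0.2], [4.9, 5.1]])

preds = forward(X_train, y_train, X_test)  # one call yields all test predictions
print(preds)  # → [0 1]
```

Note that there is no per-dataset training loop: all the "fitting" happens inside the single forward call, which is what lets TabPFN answer in under a second.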

2022: Noah Hollmann, Samuel Müller, Katharina Eggensperger, F. Hutter