Trustworthy Bayesian Perceptrons

  • Authors:

    Markus Walker, Hayk Amirkhanian, Marco F. Huber, Uwe D. Hanebeck

  • Source:

    Proceedings of the 27th International Conference on Information Fusion (FUSION 2024), Venice, Italy

  • Date: July 8-11, 2024
  • Abstract:

    Bayesian Neural Networks (BNNs) offer a sophisticated framework for extending classical neural network point estimates to predictive distributions. Despite their high potential, established BNN training methods such as Variational Inference (VI) and Markov Chain Monte Carlo (MCMC) struggle with scalability and hyperparameter dependence. To address these issues, our research focuses on the fundamental building blocks of BNNs, in particular perceptrons and their predictive capabilities. We introduce a new perspective on the closed-form solution for the backward-pass computation of the Bayesian perceptron and prove that the state-of-the-art solution is equivalent to statistical linearization. To assess the efficacy of Bayesian perceptrons and provide insight into their performance in distinct regions of the input space, we introduce a novel methodology that uses k-d trees to partition the input space and evaluate prediction quality within each region.
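
    As a minimal illustration of the ideas summarized above, the sketch below assumes a probit-activated perceptron with a Gaussian weight posterior, for which the predictive mean has the well-known closed form E[Φ(z)] = Φ(μ_z / √(1 + σ_z²)), together with a simple median-split k-d partition of the input space for region-wise evaluation. The function names, the toy posterior, and the reported metrics are illustrative assumptions, not the formulation from the paper.

    ```python
    import numpy as np
    from scipy.stats import norm

    def perceptron_predict(X, mu_w, Sigma_w):
        """Closed-form predictive mean of a probit perceptron with
        Gaussian weights w ~ N(mu_w, Sigma_w): the pre-activation
        z = w^T x is Gaussian, and E[Phi(z)] = Phi(mu_z / sqrt(1 + var_z))."""
        mu_z = X @ mu_w                                   # (n,)
        var_z = np.einsum("ni,ij,nj->n", X, Sigma_w, X)   # (n,)
        return norm.cdf(mu_z / np.sqrt(1.0 + var_z))

    def kd_partition(X, depth=2, idx=None, axis=0):
        """Recursively split the data at the median along alternating axes.
        Returns a list of index arrays, one per leaf region of the k-d tree."""
        if idx is None:
            idx = np.arange(len(X))
        if depth == 0 or len(idx) <= 1:
            return [idx]
        median = np.median(X[idx, axis])
        left = idx[X[idx, axis] <= median]
        right = idx[X[idx, axis] > median]
        nxt = (axis + 1) % X.shape[1]
        return (kd_partition(X, depth - 1, left, nxt)
                + kd_partition(X, depth - 1, right, nxt))

    # Toy data and an assumed (illustrative) weight posterior.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 2))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
    mu_w, Sigma_w = np.array([1.0, 0.5]), 0.1 * np.eye(2)

    # Region-wise evaluation: accuracy and mean predictive uncertainty per leaf.
    p = perceptron_predict(X, mu_w, Sigma_w)
    for r, leaf in enumerate(kd_partition(X, depth=2)):
        if len(leaf) == 0:
            continue
        acc = np.mean((p[leaf] > 0.5) == y[leaf])
        unc = np.mean(p[leaf] * (1 - p[leaf]))   # Bernoulli predictive variance
        print(f"region {r}: n={len(leaf):4d}  acc={acc:.3f}  mean_unc={unc:.3f}")
    ```

    Reporting a quality metric and an uncertainty measure per leaf, as in this sketch, is one simple way to expose input-space regions where the perceptron's predictions are well calibrated versus regions where they are not; the paper's actual evaluation criteria may differ.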