
Sahu, S. K., Java, A. and Shaikh, A. (2021)

Rethinking Neural Networks with Benford’s Law

Proceedings of the Fourth Workshop on Machine Learning and the Physical Sciences (NeurIPS 2021).

ISSN/ISBN: Not available at this time. DOI: Not available at this time.



Abstract: Benford’s Law (BL), also known as the Significant Digit Law, defines the probability distribution of the first digit of the numerical values in a data sample. The law is observed in many datasets, can be seen as a measure of the naturalness of a given distribution, and finds application in areas such as anomaly and fraud detection. In this work, we address the following question: is the distribution of a neural network’s parameters related to the network’s generalization capability? To that end, we first define a metric, MLH (Model Enthalpy), that measures the closeness of a set of numbers to BL. Second, we use MLH as an alternative to validation accuracy for early stopping and provide experimental evidence that, even when the optimal size of the validation set is known beforehand, the peak test accuracy attained with validation-based early stopping is lower than that attained with MLH-based early stopping, i.e. without using a validation set at all.
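
The paper’s exact definition of MLH is given in the full text; as a rough, hypothetical illustration of the idea, the Python sketch below scores a set of numbers against Benford’s first-digit distribution P(d) = log10(1 + 1/d) for d = 1, ..., 9, using the Pearson correlation between the empirical first-digit histogram and the Benford probabilities as the closeness measure. The function names and the choice of correlation are assumptions made for illustration, not the authors’ implementation.

import numpy as np

def benford_pmf():
    # Benford's Law: P(d) = log10(1 + 1/d) for first digits d = 1..9.
    digits = np.arange(1, 10)
    return np.log10(1.0 + 1.0 / digits)

def first_digit_histogram(values, eps=1e-12):
    # Empirical distribution of the first significant digits of |values|.
    v = np.abs(np.asarray(values, dtype=np.float64))
    v = v[v > eps]  # drop (near-)zero entries, which have no first digit
    # Shift each value into [1, 10) and truncate to get its leading digit.
    first = (v / 10.0 ** np.floor(np.log10(v))).astype(int)
    counts = np.bincount(first, minlength=10)[1:10]
    return counts / counts.sum()

def benford_closeness(values):
    # Pearson correlation between the empirical first-digit histogram and
    # Benford's PMF. NOTE: a hypothetical stand-in for the paper's MLH
    # metric, not the authors' implementation.
    return np.corrcoef(first_digit_histogram(values), benford_pmf())[0, 1]

# Example: score a flattened parameter vector against Benford's Law.
rng = np.random.default_rng(0)
weights = rng.normal(size=100_000)  # stand-in for a network's weights
print(f"closeness to Benford's Law: {benford_closeness(weights):.4f}")

In the early-stopping setting described above, such a score would be tracked over training epochs in place of validation accuracy, stopping once it ceases to improve.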


Bibtex:
@inproceedings{sahu2021rethinking,
  author    = {Surya Kant Sahu and Abhinav Java and Arshad Shaikh},
  title     = {Rethinking Neural Networks with Benford's Law},
  booktitle = {Proceedings of the Fourth Workshop on Machine Learning and the Physical Sciences (NeurIPS 2021)},
  year      = {2021},
  url       = {https://ml4physicalsciences.github.io/2021/files/NeurIPS_ML4PS_2021_99.pdf},
}


Reference Type: Conference Paper

Subject Area(s): Computer Science