Speaker
Description
Information theory provides a practical tool for converting the intuitive notion of information into quantifiable numerical values. One key measure, Shannon entropy, quantifies the information content of a probability distribution. Similarly, mutual information is valuable for detecting dependencies between variables, even when those dependencies are non-linear. Despite its potential, information theory has been largely overlooked in the current landscape of machine learning. This presentation aims to address this gap by presenting information-theoretic measures that could contribute to the development of machine-learning-based approaches for the analysis of experimental data.
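As a brief illustration of the two measures mentioned above (not part of the talk itself), the sketch below uses NumPy to compute the Shannon entropy of a discrete distribution and a simple histogram-based estimate of mutual information; it shows how mutual information picks up a purely non-linear dependence that the linear correlation coefficient misses. The function names and binning choice are hypothetical.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H = -sum p(x) log2 p(x) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # ignore zero-probability outcomes
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=16):
    """Mutual information I(X;Y) estimated from a 2-D histogram of samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()          # joint distribution p(x, y)
    px = pxy.sum(axis=1)               # marginal p(x)
    py = pxy.sum(axis=0)               # marginal p(y)
    # I(X;Y) = H(X) + H(Y) - H(X,Y)
    return shannon_entropy(px) + shannon_entropy(py) - shannon_entropy(pxy.ravel())

# A fair coin carries exactly 1 bit of information.
print(shannon_entropy([0.5, 0.5]))     # -> 1.0

# A quadratic (non-linear) dependence: Pearson correlation is near zero,
# but mutual information is clearly positive.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 10_000)
y = x**2 + 0.05 * rng.normal(size=x.size)
print(np.corrcoef(x, y)[0, 1])         # ~ 0
print(mutual_information(x, y))        # > 0 bits
```

Note that this histogram estimator is only a rough sketch; in practice the estimate depends on the binning, and more careful estimators are typically used for experimental data.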