The 10th Israel Machine Vision Conference (IMVC) took place on March 18, 2019 at Pavilion 10, EXPO Tel Aviv.
Prof. Naftali Tishby spoke at the conference on “The Information Theory of Deep Learning”.

While a comprehensive theory of deep learning has yet to be developed, one of the most interesting existing theoretical frameworks is the information bottleneck theory of deep neural networks. In this framework, we consider the information flow from the input layer through the successive layers, which form a cascade of filters that strip away information irrelevant to the desired label. The theory explains how stochastic gradient descent can achieve optimal internal representations, layer by layer, and what each layer represents. Moreover, it explains the computational benefits of the hidden layers and the specific information encoded by the weights of each layer in the deep network.
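At the heart of this framework is the information bottleneck trade-off: a representation T should compress the input X (small I(X;T)) while preserving information about the label Y (large I(T;Y)). As a minimal, illustrative sketch (not Tishby's actual algorithm), the following Python snippet computes mutual information from discrete joint distributions and evaluates the IB objective I(X;T) − β·I(T;Y); the function names and the example distributions are our own:

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in bits, given a joint distribution p(x, y) as a 2-D array."""
    px = p_xy.sum(axis=1, keepdims=True)   # marginal p(x)
    py = p_xy.sum(axis=0, keepdims=True)   # marginal p(y)
    mask = p_xy > 0                        # avoid log(0) terms
    return float((p_xy[mask] * np.log2(p_xy[mask] / (px @ py)[mask])).sum())

def ib_objective(p_xt, p_ty, beta):
    """Information bottleneck Lagrangian: I(X;T) - beta * I(T;Y).

    Minimizing this over encoders p(t|x) trades off compression of X
    against preservation of information about Y.
    """
    return mutual_information(p_xt) - beta * mutual_information(p_ty)

# A perfectly correlated pair carries 1 bit; an independent pair carries 0.
correlated = np.array([[0.5, 0.0], [0.0, 0.5]])
independent = np.outer([0.5, 0.5], [0.5, 0.5])
print(mutual_information(correlated))   # 1.0
print(mutual_information(independent))  # 0.0
```

In the theory, each hidden layer's representation can be placed on the "information plane" by its pair (I(X;T), I(T;Y)), which is how the cascade-of-filters picture above is made quantitative.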

Naftali Tishby is a professor of Computer Science and Computational Neuroscience at the Hebrew University and one of the founders of machine learning research in Israel. His research lies at the interfaces between physics, biology, and computer science, and he is known for his integration of information theory, control, and learning theory. Among his best-known contributions are the information bottleneck method, information-constrained reinforcement learning, and the information theory of deep neural networks.
