Tutorial And Workshop


Farzan Farnia

Reliable and Concise Interpretation of Deep Neural Networks


Kaveh Vaezi

5G Trends: From Standardization to the Market

Tutorial Title:

Submodular Optimization and Learning: From Discrete to Continuous and Back

Wednesday, 19 May 2021, 16:30-19:30

Tutorial Abstract:

This tutorial will cover recent advances in discrete optimization methods prevalent in large-scale machine learning problems. Traditionally, machine learning has harnessed convex optimization to design fast algorithms with provable guarantees for a broad range of applications. In recent years, however, there has been a surge of interest in applications that involve discrete optimization. For discrete domains, the analog of convexity is considered to be submodularity, and the evolving theory of submodular optimization has been a catalyst for progress in extraordinarily varied application areas, including active learning and experimental design, vision, sparse reconstruction, graph inference, video analysis, clustering, document summarization, object detection, information retrieval, network inference, neural network interpretation, and discrete adversarial attacks.
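To make the notion concrete, the short sketch below greedily maximizes a simple monotone submodular function (set coverage) under a cardinality constraint; the coverage objective and the toy data are illustrative assumptions on our part, not examples taken from the tutorial.

```python
# A minimal sketch: greedy maximization of a monotone submodular set function
# (here, coverage) under a cardinality constraint. Toy data for illustration only.

def coverage(selected, sets):
    """f(S) = number of items covered by the union of the chosen sets."""
    covered = set()
    for i in selected:
        covered |= sets[i]
    return len(covered)

def greedy_max(sets, k):
    """Classical greedy rule: repeatedly add the element with the largest
    marginal gain until k elements are chosen or no element helps."""
    chosen = []
    for _ in range(k):
        base = coverage(chosen, sets)
        best, best_gain = None, 0
        for i in range(len(sets)):
            if i in chosen:
                continue
            gain = coverage(chosen + [i], sets) - base  # marginal gain of adding i
            if gain > best_gain:
                best, best_gain = i, gain
        if best is None:  # no remaining element improves the objective
            break
        chosen.append(best)
    return chosen

if __name__ == "__main__":
    # Pick k = 2 of these sets to cover as many items as possible.
    sets = [{1, 2, 3}, {3, 4}, {4, 5, 6, 7}, {1, 7}]
    picked = greedy_max(sets, k=2)
    print(picked, coverage(picked, sets))  # [2, 0] covering 7 items
```

For monotone submodular objectives, this greedy rule is guaranteed to achieve at least a (1 - 1/e) fraction of the optimal value (Nemhauser, Wolsey, and Fisher, 1978), which is part of why such simple procedures remain viable in the large-scale settings discussed below.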

As applications and techniques of submodular optimization mature, a fundamental gap between theory and application emerges. In the past decade, paradigms such as large-scale learning, distributed systems, and sequential decision making have enabled a quantum leap in the performance of learning methodologies. Incorporating these paradigms into discrete problems has led to fundamentally new frameworks for submodular optimization. The goal of this tutorial is to cover rigorous and scalable foundations for discrete optimization in complex, dynamic environments, addressing the challenges of scalability and uncertainty, and facilitating distributed and sequential learning in broader discrete settings.

Biography:

Hamed Hassani is currently an assistant professor in the Department of Electrical and Systems Engineering as well as the Department of Computer and Information Sciences at the University of Pennsylvania. Prior to that, he was a research fellow at the Simons Institute for the Theory of Computing (UC Berkeley), affiliated with the program on Foundations of Machine Learning, and a post-doctoral researcher in the Institute of Machine Learning at ETH Zurich. He received his Ph.D. in Computer and Communication Sciences from EPFL, Lausanne. He is the recipient of the 2014 IEEE Information Theory Society Thomas M. Cover Dissertation Award, the 2015 IEEE International Symposium on Information Theory Student Paper Award, a 2017 Simons-Berkeley Fellowship, the 2018 NSF-CRII Research Initiative Award, the 2020 Air Force Office of Scientific Research (AFOSR) Young Investigator Award, the 2020 National Science Foundation (NSF) CAREER Award, and the 2020 Intel Rising Star Award.

Tutorial Title:

Concentration of Measures with Application in Information Theory and Machine Learning

Thursday, 20 May 2021, 9:30-12:30

Tutorial Abstract:

It is well known that the average of independent and identically distributed random variables concentrates tightly around its mean. It has since been discovered that such concentration holds for a broad class of functions of many independent random variables, namely those that depend only slightly on the value of each individual variable. This class of functions arises in many applications, from convex geometry (presumably the origin of the concentration of measure phenomenon) to statistics, learning, functional analysis, information theory, and beyond.
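A standard way to make this precise (an illustrative choice here, not necessarily the tutorial's running example) is McDiarmid's bounded-differences inequality: if changing any single coordinate changes the function by at most a small amount, the function concentrates around its mean.

```latex
% McDiarmid's bounded-differences inequality (standard statement).
% Assume X_1, ..., X_n are independent and, for every coordinate i,
\[
  \bigl| f(x_1,\dots,x_i,\dots,x_n) - f(x_1,\dots,x_i',\dots,x_n) \bigr| \le c_i .
\]
% Then f(X_1, ..., X_n) concentrates around its expectation:
\[
  \Pr\!\left( f(X_1,\dots,X_n) - \mathbb{E}\, f(X_1,\dots,X_n) \ge t \right)
  \le \exp\!\left( - \frac{2 t^2}{\sum_{i=1}^{n} c_i^2} \right).
\]
% Taking f to be the sample mean of X_i in [0,1] (so c_i = 1/n) recovers
% Hoeffding's bound exp(-2 n t^2), the i.i.d. average case mentioned above.
```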

In this tutorial, we will discuss several interesting inequalities that illustrate the concentration of measure phenomenon. We will further explore some applications in information theory and learning. Specific topics to be briefly covered include the entropy method and transportation cost inequalities.
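As a minimal example of the kind of transportation cost inequality involved (again an illustrative instance, not necessarily the exact formulation used in the tutorial), Pinsker's inequality bounds total variation distance, which is the optimal transportation cost under the 0-1 (Hamming) cost, by relative entropy.

```latex
% Pinsker's inequality: total variation is controlled by relative entropy (in nats).
% Total variation equals the optimal transport cost for the 0-1 cost, which is why
% bounds of this form are called transportation cost inequalities.
\[
  \| P - Q \|_{\mathrm{TV}} \;\le\; \sqrt{ \tfrac{1}{2}\, D(P \,\|\, Q) },
  \qquad
  D(P \,\|\, Q) = \int \log \frac{dP}{dQ} \, dP .
\]
```

Tensorizing inequalities of this type over product measures (Marton's argument) is one well-known route from transportation cost bounds to concentration statements like the one in the previous abstract.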

Biography:

Mohammad Hossein Yassaee is currently an assistant professor at Sharif University of Technology, Tehran, Iran. From 2017 to 2020, he was a researcher in the School of Mathematics at the Institute for Research in Fundamental Sciences (IPM). Before joining IPM, he was a postdoctoral research associate in the Department of Electrical Engineering at Princeton University in 2016. He received his Ph.D. from Sharif University of Technology in December 2014. His research interests include network information theory, non-asymptotic information theory, and applications of information theory in learning theory.

He is a recipient of the 2013 Jack Keil Wolf ISIT Student Paper Award. He was also a finalist for the 2012 ISIT Student Paper Award.