Browsing Transdisciplinary Research Institute for Advancing Data Science (TRIAD) Lectures by Issue Date
Sharp Oracle Inequalities for Non-Convex Loss
(Georgia Institute of Technology, 2018-08-31) There will be three lectures, which in principle will be independent units. Their common theme is exploiting sparsity in high-dimensional statistics. Sparsity means that the statistical model is allowed to have quite a few ...
Compatibility and the Lasso
(Georgia Institute of Technology, 2018-09-04) There will be three lectures, which in principle will be independent units. Their common theme is exploiting sparsity in high-dimensional statistics. Sparsity means that the statistical model is allowed to have quite a few ...
The Debiased Lasso
(Georgia Institute of Technology, 2018-09-06) There will be three lectures, which in principle will be independent units. Their common theme is exploiting sparsity in high-dimensional statistics. Sparsity means that the statistical model is allowed to have quite a few ...
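These three lectures share the Lasso as their common tool. As a point of reference only (not taken from the lectures), here is a minimal NumPy sketch of the Lasso computed by cyclic coordinate descent with soft-thresholding; all names and constants are ad hoc:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=100):
    """Minimize (1/2n)||y - Xb||^2 + lam*||b||_1 by cyclic coordinate descent."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n            # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ b + X[:, j] * b[j]     # partial residual without coordinate j
            b[j] = soft_threshold(X[:, j] @ r_j / n, lam) / col_sq[j]
    return b

rng = np.random.default_rng(0)
n, p, s = 100, 200, 5                            # high-dimensional: p > n, sparsity s
X = rng.standard_normal((n, p))
beta = np.zeros(p); beta[:s] = 1.0
y = X @ beta + 0.1 * rng.standard_normal(n)
print("support found:", np.flatnonzero(np.abs(lasso_cd(X, y, lam=0.1)) > 1e-6))
```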
Combinatorial Testing Problems
(Georgia Institute of Technology, 2018-10-15) In these lectures we discuss some statistical problems with an interesting combinatorial structure behind them. We start by reviewing the "hidden clique" problem, a simple prototypical example with a surprisingly rich structure. ...
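A toy version of the hidden clique testing problem (illustrative only, not from the lectures): plant a clique in G(n, 1/2) and apply the simplest test statistic, the total edge count, which detects cliques of size k well above sqrt(n).

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_graph(n, clique=None):
    """Adjacency matrix of G(n, 1/2), optionally with a planted clique."""
    A = np.triu(rng.integers(0, 2, size=(n, n)), 1)
    A = A + A.T
    if clique is not None:
        A[np.ix_(clique, clique)] = 1            # force all clique edges
        np.fill_diagonal(A, 0)
    return A

n, k = 400, 60
N = n * (n - 1) / 2                              # number of potential edges
threshold = N / 2 + 3 * np.sqrt(N / 4)           # ~3-sigma test under the null
for label, planted in [("H0", None), ("H1", np.arange(k))]:
    edges = sample_graph(n, planted).sum() / 2
    print(label, "edges:", int(edges), "reject:", edges > threshold)
```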
Mean Estimation: Median of Means Tournaments
(Georgia Institute of Technology, 2018-10-25) In these lectures we discuss some statistical problems with an interesting combinatorial structure behind them. We start by reviewing the "hidden clique" problem, a simple prototypical example with a surprisingly rich structure. ...
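The median-of-means estimator in the title admits a very short implementation. A minimal sketch (not from the lectures; the block count is ad hoc): split the sample into blocks, average within blocks, and take the median of the block means, which gives sub-Gaussian-style deviations even for heavy-tailed data.

```python
import numpy as np

def median_of_means(x, n_blocks=10, seed=0):
    """Median of the means of randomly formed blocks."""
    x = np.random.default_rng(seed).permutation(x)
    return np.median([b.mean() for b in np.array_split(x, n_blocks)])

rng = np.random.default_rng(2)
x = rng.standard_t(df=2.5, size=10_000)          # heavy-tailed sample, true mean 0
print("empirical mean :", x.mean())
print("median of means:", median_of_means(x))
```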
Lecture 1: Mathematics for Deep Neural Networks
(2019-03-06) There are many different types of neural networks that differ in complexity and in the data types they can process. This lecture provides an overview and surveys the algorithms used to fit deep networks to data. We discuss ...
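As context for "the algorithms used to fit deep networks to data" (a sketch under our own assumptions, not material from the lecture): the workhorse is gradient descent on a squared loss, written out below for a one-hidden-layer ReLU network in plain NumPy.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(256, 1))
y = np.sin(3 * X[:, 0])                          # target function

m, lr = 50, 0.05                                 # width and step size (ad hoc)
W, b = rng.standard_normal((1, m)), np.zeros(m)  # hidden-layer parameters
v = rng.standard_normal(m) / np.sqrt(m)          # output weights

for step in range(2000):
    H = np.maximum(X @ W + b, 0.0)               # forward pass: ReLU features
    err = H @ v - y
    g_v = H.T @ err / len(X)                     # backward pass: gradients of
    g_H = np.outer(err, v) * (H > 0)             # (half) the mean squared error
    W -= lr * (X.T @ g_H / len(X))
    b -= lr * g_H.mean(axis=0)
    v -= lr * g_v

print("final MSE:", np.mean(err ** 2))
```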
Lecture 2: Mathematics for Deep Neural Networks: Theory for shallow networks
(2019-03-08) We start with the universal approximation theorem and discuss several proof strategies that provide some insights into functions that can be easily approximated by shallow networks. Based on this, a survey of approximation ...
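A constructive one-dimensional instance of universal approximation (our illustration, not the lecture's): every piecewise-linear interpolant of a continuous f can be written exactly as a shallow ReLU network g(x) = f(t_0) + sum_k c_k ReLU(x - t_k), so refining the grid of knots drives the approximation error to zero.

```python
import numpy as np

f = lambda x: np.sin(3 * x) + 0.5 * x
knots = np.linspace(-1, 1, 21)                   # knots t_0 < ... < t_K
vals = f(knots)
slopes = np.diff(vals) / np.diff(knots)          # slope on each segment
c = np.diff(slopes, prepend=0.0)                 # kink sizes = ReLU coefficients

def shallow_net(x):
    """One hidden ReLU layer whose weights encode the interpolant."""
    return vals[0] + np.maximum(x[:, None] - knots[:-1], 0.0) @ c

x = np.linspace(-1, 1, 500)
print("max |f - g| on [-1, 1]:", np.abs(f(x) - shallow_net(x)).max())
```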
Lecture 3: Mathematics for Deep Neural Networks: Advantages of Additional Layers
(2019-03-13) Why are deep networks better than shallow networks? We provide a survey of the existing ideas in the literature. In particular, we discuss localization of deep networks, functions that can be easily approximated by deep ...
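One classical "depth helps" example fits in a few lines (an illustration in the spirit of the literature the lecture surveys, not its content): the tent map is two ReLU units, h(x) = 2 ReLU(x) - 4 ReLU(x - 1/2) on [0, 1], and composing it k times yields a sawtooth with 2^k linear pieces, while a shallow network needs exponentially many units to produce that many pieces.

```python
import numpy as np

def tent(x):
    """Tent map written as a two-unit ReLU layer."""
    return 2 * np.maximum(x, 0) - 4 * np.maximum(x - 0.5, 0)

x = np.linspace(0, 1, 1_000_001)
out, k = x, 10
for _ in range(k):                               # depth-k composition
    out = tent(out)

slope_sign = np.sign(np.diff(out))               # count linear pieces via
pieces = 1 + np.count_nonzero(np.diff(slope_sign))  # sign changes of the slope
print("linear pieces after", k, "compositions:", pieces)   # 2^k = 1024
```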
Lecture 4: Mathematics for Deep Neural Networks: Statistical theory for deep ReLU networks
(2019-03-15) We outline the theory underlying the recent bounds on the estimation risk of deep ReLU networks. In the lecture, we discuss specific properties of the ReLU activation function that relate to skip connections and efficient ...
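One ReLU-specific identity behind cheap skip connections (a sketch of the idea, not the lecture's derivation): since x = ReLU(x) - ReLU(-x), two hidden units per coordinate let a ReLU layer copy its input forward unchanged, so a deeper network can always emulate a shallower one.

```python
import numpy as np

def relu_identity_layer(x):
    """A genuine ReLU layer (weights +/-1) that reproduces its input exactly."""
    return np.maximum(x, 0) - np.maximum(-x, 0)

x = np.random.default_rng(4).standard_normal(5)
print(np.allclose(relu_identity_layer(x), x))    # True: the layer acts as a skip
```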
Lecture 5: Mathematics for Deep Neural Networks: Energy landscape and open problems
(2019-03-18) To derive a theory for gradient descent methods, it is important to have some understanding of the energy landscape. In this lecture, an overview of existing results is given. The second part of the lecture is devoted to ...
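To see why the energy landscape matters (a toy of our own, not from the lecture): on the nonconvex loss f(x) = (x^2 - 1)^2 + x/4, plain gradient descent converges to different stationary points depending on its initialization.

```python
import numpy as np

f_grad = lambda x: 4 * x * (x ** 2 - 1) + 0.25   # derivative of (x^2-1)^2 + x/4

for x0 in (-2.0, 0.2, 2.0):                      # three initializations
    x = x0
    for _ in range(500):
        x -= 0.01 * f_grad(x)                    # plain gradient descent
    print(f"start {x0:+.1f} -> stationary point {x:+.4f}")
```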
Visual Data Analytics: A Short Tutorial
(2019-08-08)
Lecture 1: The power of nonconvex optimization in solving random quadratic systems of equations
(2019-08-28) We consider the fundamental problem of solving random quadratic systems of equations in n variables, which spans many applications ranging from the century-old phase retrieval problem to various latent-variable models in ...
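A real-valued toy version of the two-stage recipe (in the spirit of Wirtinger-type flows; constants and names here are ours, not the lecture's): spectral initialization followed by gradient descent on the least-squares loss f(z) = (1/4m) sum_i ((a_i'z)^2 - y_i)^2.

```python
import numpy as np

rng = np.random.default_rng(5)
n, m = 50, 600
A = rng.standard_normal((m, n))
x = rng.standard_normal(n); x /= np.linalg.norm(x)
y = (A @ x) ** 2                                 # quadratic measurements: no sign info

Y = (A * y[:, None]).T @ A / m                   # spectral initialization:
w, V = np.linalg.eigh(Y)                         # top eigenvector of sum y_i a_i a_i'/m
z = V[:, -1] * np.sqrt(y.mean())                 # scale: E[y_i] = ||x||^2

for _ in range(300):                             # gradient descent stage
    r = (A @ z) ** 2 - y
    z -= 0.1 * (A * (r * (A @ z))[:, None]).sum(axis=0) / m

err = min(np.linalg.norm(z - x), np.linalg.norm(z + x))   # global sign ambiguity
print("relative error:", err / np.linalg.norm(x))
```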
Lecture 2: Random initialization and implicit regularization in nonconvex statistical estimation
(2019-08-29) Recent years have seen a flurry of activity in designing provably efficient nonconvex procedures for solving statistical estimation and learning problems. Due to the highly nonconvex nature of the empirical loss, ...
Lecture 3: Projected Power Method: An Efficient Algorithm for Joint Discrete Assignment
(2019-09-03) Various applications involve assigning discrete label values to a collection of objects based on some pairwise noisy data. Due to the discrete (and hence nonconvex) structure of the problem, computing the optimal assignment ...
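The simplest instance of joint discrete assignment is Z2 synchronization, and the projected power method there is one line per iteration (a toy sketch with ad hoc constants, not the talk's algorithmic details): observe a noisy rank-one matrix Y = xx' + sigma*W with labels x in {+1, -1}^n, then alternate a power step with entrywise projection onto the labels.

```python
import numpy as np

rng = np.random.default_rng(6)
n, sigma = 300, 1.5
x = rng.choice([-1.0, 1.0], size=n)              # ground-truth labels
W = rng.standard_normal((n, n)); W = (W + W.T) / np.sqrt(2)
Y = np.outer(x, x) + sigma * W                   # noisy pairwise observations

z = np.sign(rng.standard_normal(n))              # random discrete initialization
for _ in range(20):
    z = np.sign(Y @ z)                           # power step + projection
    z[z == 0] = 1.0

acc = max(np.mean(z == x), np.mean(z == -x))     # global sign is unidentifiable
print("fraction of labels recovered:", acc)
```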
Lecture 4: Spectral Methods Meets Asymmetry: Two Recent Stories
(2019-09-04) This talk is concerned with the interplay between asymmetry and spectral methods. Imagine that we have access to an asymmetrically perturbed low-rank data matrix. We attempt estimation of the low-rank matrix via ...
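One way asymmetry can help, in a toy form (our illustration; the talk's precise results are its own): for M = lam*xx' + H with H an i.i.d. noise matrix that is not symmetrized, the top eigenvalue of M estimates lam with essentially no bias, while the top singular value is inflated by the noise.

```python
import numpy as np

rng = np.random.default_rng(7)
n, lam = 500, 1.0
sigma = 0.7 / np.sqrt(n)                         # ad hoc noise level, kept below
x = rng.standard_normal(n)                       # the eigen-detection threshold
x /= np.linalg.norm(x)
M = lam * np.outer(x, x) + sigma * rng.standard_normal((n, n))   # asymmetric noise

eig_top = np.max(np.real(np.linalg.eigvals(M)))  # nearly unbiased for lam
svd_top = np.linalg.svd(M, compute_uv=False)[0]  # biased upward by the noise
print(f"lam = {lam}, top eigenvalue = {eig_top:.3f}, top singular value = {svd_top:.3f}")
```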
Lecture 5: Inference and Uncertainty Quantification for Noisy Matrix Completion
(2019-09-05) Noisy matrix completion aims at estimating a low-rank matrix given only partial and corrupted entries. Despite substantial progress in designing efficient estimation algorithms, it remains largely unclear how to assess the ...
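For orientation, a sketch of the standard estimation step only (the talk's subject, uncertainty quantification on top of such estimates, is beyond this toy): observe each entry of a low-rank L with probability p plus noise, rescale by 1/p, and keep the best rank-r approximation.

```python
import numpy as np

rng = np.random.default_rng(8)
n, r, p, sigma = 400, 3, 0.2, 0.01
L = rng.standard_normal((n, r)) @ rng.standard_normal((r, n)) / np.sqrt(n)
mask = rng.random((n, n)) < p                    # each entry revealed independently
Y = np.where(mask, L + sigma * rng.standard_normal((n, n)), 0.0) / p

U, s, Vt = np.linalg.svd(Y)                      # spectral estimator:
L_hat = U[:, :r] @ np.diag(s[:r]) @ Vt[:r]       # rank-r truncation of Y
print("relative error:", np.linalg.norm(L_hat - L) / np.linalg.norm(L))
```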