Transdisciplinary Research Institute for Advancing Data Science (TRIAD)
The Transdisciplinary Research Institute for Advancing Data Science (TRIAD) integrates research and education in mathematical, statistical, and algorithmic foundations for data science.
All materials in SMARTech are protected under U.S. Copyright Law and all rights are reserved. Such materials may be used, quoted or reproduced for educational purposes only with prior permission, provided proper attribution is given. Any redistribution, reproduction or use of the materials, in whole or in part, is prohibited without prior permission of the author.
Recent Submissions

Lecture 5: Mathematics for Deep Neural Networks: Energy landscape and open problems
(2019-03-18) To derive a theory for gradient descent methods, it is important to have some understanding of the energy landscape. In this lecture, an overview of existing results is given. The second part of the lecture is devoted to ... 
Lecture 4: Mathematics for Deep Neural Networks: Statistical theory for deep ReLU networks
(2019-03-15) We outline the theory underlying the recent bounds on the estimation risk of deep ReLU networks. In the lecture, we discuss specific properties of the ReLU activation function that relate to skip connections and efficient ... 
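The abstract mentions skip connections; one ReLU-specific fact often used in that context is that two ReLU units can reproduce the identity map exactly, relu(x) - relu(-x) = x, so a ReLU layer can pass a signal through unchanged. A minimal sketch in plain NumPy (illustrative, not the lecture's own material):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])

# The identity map realized with two ReLU units: relu(x) - relu(-x) = x.
identity_via_relu = relu(x) - relu(-x)
print(np.allclose(identity_via_relu, x))  # True
```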
Lecture 3: Mathematics for Deep Neural Networks: Advantages of Additional Layers
(2019-03-13) Why are deep networks better than shallow networks? We provide a survey of the existing ideas in the literature. In particular, we discuss localization of deep networks, functions that can be easily approximated by deep ... 
Lecture 2: Mathematics for Deep Neural Networks: Theory for shallow networks
(2019-03-08) We start with the universal approximation theorem and discuss several proof strategies that provide some insights into functions that can be easily approximated by shallow networks. Based on this, a survey on approximation ... 
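One classical proof strategy behind the universal approximation theorem builds a piecewise-constant approximation out of threshold units. A minimal numerical sketch (assuming NumPy, with the Heaviside step standing in for a steep sigmoid; illustrative, not from the lecture):

```python
import numpy as np

def step(x):
    # Heaviside unit; a steep sigmoid gives the same construction.
    return (x >= 0).astype(float)

f = np.sin
knots = np.linspace(0.0, np.pi, 100)    # t_0 < t_1 < ... < t_99
x = np.linspace(0.0, np.pi, 1000)

# "Shallow network": f_hat(x) = f(t_0) + sum_k (f(t_k) - f(t_{k-1})) * step(x - t_k),
# i.e. one hidden layer of threshold units reproducing f piecewise constantly.
f_hat = np.full_like(x, f(knots[0]))
for k in range(1, len(knots)):
    f_hat += (f(knots[k]) - f(knots[k - 1])) * step(x - knots[k])

print(np.max(np.abs(f_hat - f(x))))     # small; shrinks as knots are added
```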
Lecture 1: Mathematics for Deep Neural Networks
(2019-03-06) There are many different types of neural networks that differ in complexity and the data types that can be processed. This lecture provides an overview and surveys the algorithms used to fit deep networks to data. We discuss ... 
Mean Estimation: Median of Means Tournaments
(Georgia Institute of Technology, 2018-10-25) In these lectures we discuss some statistical problems with an interesting combinatorial structure behind them. We start by reviewing the "hidden clique" problem, a simple prototypical example with a surprisingly rich structure. ... 
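The median-of-means estimator named in the title is simple to state: split the sample into blocks, average each block, and take the median of the block means, which makes the estimate robust to heavy tails and gross outliers. A minimal sketch in plain NumPy (illustrative only; the function name and block scheme are this sketch's, not the lecture's):

```python
import numpy as np

def median_of_means(x, k, seed=0):
    """Split the sample into k random blocks, average each block,
    and return the median of the k block means."""
    rng = np.random.default_rng(seed)
    blocks = np.array_split(rng.permutation(x), k)
    return np.median([b.mean() for b in blocks])

rng = np.random.default_rng(1)
data = rng.standard_normal(1000)
data[:10] = 1e6                      # corrupt 1% of the sample
print(median_of_means(data, k=50))   # stays near the true mean 0
print(data.mean())                   # dragged far away by the outliers
```

With 50 blocks, the 10 corrupted points spoil at most 10 block means, and the median of the 50 means ignores them.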
Combinatorial Testing Problems
(Georgia Institute of Technology, 2018-10-15) In these lectures we discuss some statistical problems with an interesting combinatorial structure behind them. We start by reviewing the "hidden clique" problem, a simple prototypical example with a surprisingly rich structure. ... 
The Debiased Lasso
(Georgia Institute of Technology, 2018-09-06) There will be three lectures, which in principle will be independent units. Their common theme is exploiting sparsity in high-dimensional statistics. Sparsity means that the statistical model is allowed to have quite a few ... 
Compatibility and the Lasso
(Georgia Institute of Technology, 2018-09-04) There will be three lectures, which in principle will be independent units. Their common theme is exploiting sparsity in high-dimensional statistics. Sparsity means that the statistical model is allowed to have quite a few ... 
Sharp Oracle Inequalities for Non-Convex Loss
(Georgia Institute of Technology, 2018-08-31) There will be three lectures, which in principle will be independent units. Their common theme is exploiting sparsity in high-dimensional statistics. Sparsity means that the statistical model is allowed to have quite a few ... 
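These three sparsity lectures center on the lasso. As a hedged side illustration (not from the lectures): under an orthonormal design the lasso penalty acts coordinate-wise as soft-thresholding of the least-squares estimate, which is exactly the mechanism that sets small coefficients to zero and produces sparse models:

```python
import numpy as np

def soft_threshold(z, lam):
    """Coordinate-wise lasso solution for an orthonormal design:
    argmin_b 0.5*(z - b)**2 + lam*|b| = sign(z) * max(|z| - lam, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

z = np.array([3.0, -0.4, 0.1, -2.5])
# Coefficients below the threshold lam are set exactly to zero;
# the larger ones are shrunk toward zero by lam.
print(soft_threshold(z, lam=1.0))
```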