Recent Submissions

  • Lecture 3: Mathematics for Deep Neural Networks: Advantages of Additional Layers 

    Schmidt-Hieber, Johannes (2019-03-13)
    Why are deep networks better than shallow networks? We provide a survey of the existing ideas in the literature. In particular, we discuss localization of deep networks, functions that can be easily approximated by deep ...
  • Lecture 2: Mathematics for Deep Neural Networks: Theory for shallow networks 

    Schmidt-Hieber, Johannes (2019-03-08)
    We start with the universal approximation theorem and discuss several proof strategies that provide some insights into functions that can be easily approximated by shallow networks. Based on this, a survey on approximation ...
  • Lecture 1: Mathematics for Deep Neural Networks 

    Schmidt-Hieber, Johannes (2019-03-06)
    There are many different types of neural networks that differ in complexity and the data types that can be processed. This lecture provides an overview and surveys the algorithms used to fit deep networks to data. We discuss ...
  • Mean Estimation: Median of Means Tournaments 

    Lugosi, Gabor (Georgia Institute of Technology, 2018-10-25)
In these lectures we discuss some statistical problems with an interesting combinatorial structure behind them. We start by reviewing the "hidden clique" problem, a simple prototypical example with a surprisingly rich structure. ...
  • Combinatorial Testing Problems 

    Lugosi, Gabor (Georgia Institute of Technology, 2018-10-15)
In these lectures we discuss some statistical problems with an interesting combinatorial structure behind them. We start by reviewing the "hidden clique" problem, a simple prototypical example with a surprisingly rich structure. ...
  • The Debiased Lasso 

    van de Geer, Sara (Georgia Institute of Technology, 2018-09-06)
    There will be three lectures, which in principle will be independent units. Their common theme is exploiting sparsity in high-dimensional statistics. Sparsity means that the statistical model is allowed to have quite a few ...
  • Compatibility and the Lasso 

    van de Geer, Sara (Georgia Institute of Technology, 2018-09-04)
    There will be three lectures, which in principle will be independent units. Their common theme is exploiting sparsity in high-dimensional statistics. Sparsity means that the statistical model is allowed to have quite a few ...
  • Sharp Oracle Inequalities for Non-Convex Loss 

    van de Geer, Sara (Georgia Institute of Technology, 2018-08-31)
    There will be three lectures, which in principle will be independent units. Their common theme is exploiting sparsity in high-dimensional statistics. Sparsity means that the statistical model is allowed to have quite a few ...