Show simple item record

dc.contributor.author: Schmidt-Hieber, Johannes
dc.date.accessioned: 2019-03-19T19:44:48Z
dc.date.available: 2019-03-19T19:44:48Z
dc.date.issued: 2019-03-13
dc.identifier.uri: http://hdl.handle.net/1853/60948
dc.description: Presented on March 13, 2019 at 10:30 a.m. in the Groseclose Building, room 402.
dc.description: Johannes Schmidt-Hieber is the Chair of Statistics in the Department of Applied Mathematics at the University of Twente. His research topics include statistical theory for deep neural networks, nonparametric Bayes, confidence statements for qualitative constraints, asymptotic equivalence, and spot volatility estimation.
dc.description: Runtime: 61:24 minutes
dc.description.abstract: Why are deep networks better than shallow networks? We provide a survey of the existing ideas in the literature. In particular, we discuss localization of deep networks, functions that can be easily approximated by deep networks, and finally the Kolmogorov-Arnold representation theorem.
dc.format.extent: 61:24 minutes
dc.language.iso: en_US
dc.relation.ispartofseries: TRIAD Distinguished Lecture Series
dc.subject: Deep neural networks
dc.subject: Kolmogorov-Arnold
dc.subject: Localization
dc.title: Lecture 3: Mathematics for Deep Neural Networks: Advantages of Additional Layers
dc.type: Lecture
dc.type: Video
dc.contributor.corporatename: Georgia Institute of Technology. Transdisciplinary Research Institute for Advancing Data Science
dc.contributor.corporatename: University of Twente. Dept. of Applied Mathematics


Files in this item

This item appears in the following Collection(s)
