
    Learning over functions, distributions and dynamics via stochastic optimization

    DAI-DISSERTATION-2018.pdf (4.669Mb)
    Date
    2018-07-27
    Author
    Dai, Bo
    Abstract
Machine learning has recently witnessed revolutionary success across a wide spectrum of domains. The learning objective, model representation, and learning algorithm are key components of a machine learning method. To construct successful methods that naturally fit different problems, with different targets and inputs, one should consider these three components together in a principled way. This dissertation develops a unified learning framework for that purpose. At the heart of the framework is optimization with integral operators in infinite-dimensional spaces. This integral-operator view provides an abstract tool for considering the three components together across many machine learning tasks, and it leads to efficient algorithms with flexible representations that achieve better approximation ability, scalability, and statistical properties. We investigate several motivating machine learning problems, i.e., kernel methods, Bayesian inference, invariance learning, and policy evaluation and policy optimization in reinforcement learning, as special cases of the proposed framework under different instantiations of the integral operator. These instantiations yield learning problems whose inputs are functions, distributions, and dynamics. The corresponding algorithms handle the particular integral operators via efficient and provable stochastic approximation that exploits the structural properties of the operators. The proposed framework and the derived algorithms are deeply rooted in functional analysis, stochastic optimization, nonparametric methods, and Monte Carlo approximation, and they contribute to several subfields of the machine learning community, including kernel methods, Bayesian inference, and reinforcement learning. We believe the proposed framework is a valuable tool for developing machine learning methods in a principled way and can potentially be applied to many other scenarios.
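To make the abstract's central idea concrete, the sketch below illustrates one instantiation: approximating a kernel's integral operator with random Fourier features and fitting a regression model by stochastic gradient steps over sampled data. This is a simplified, illustrative example, not the dissertation's exact algorithm (the dissertation's doubly stochastic variant also resamples the random features at each iteration); all names and parameter values here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression data (illustrative target: a sine function)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

D = 300        # number of random features
sigma = 1.0    # RBF kernel bandwidth (assumed)
W = rng.standard_normal((D, 1)) / sigma    # random Fourier directions
b = rng.uniform(0, 2 * np.pi, size=D)      # random phases

def features(x):
    # Random Fourier features: a Monte Carlo approximation of the
    # RBF kernel's integral operator representation
    return np.sqrt(2.0 / D) * np.cos(x @ W.T + b)

theta = np.zeros(D)
for t in range(3000):
    i = rng.integers(len(X))              # sample one data point per step
    phi = features(X[i:i + 1])[0]         # its feature map
    grad = (phi @ theta - y[i]) * phi     # stochastic squared-loss gradient
    theta -= 0.5 / (1 + 0.01 * t) * grad  # decaying step size

mse = float(np.mean((features(X) @ theta - y) ** 2))
```

After training, the mean squared error drops well below the variance of the targets, showing that the finite random-feature expansion, combined with stochastic gradient steps, recovers a good approximation of the underlying function.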
    URI
    http://hdl.handle.net/1853/60316
    Collections
    • College of Computing Theses and Dissertations [1191]
    • Georgia Tech Theses and Dissertations [23877]
    • School of Computational Science and Engineering Theses and Dissertations [100]
