
    A reduction framework for approximate extended formulations and a faster algorithm for convex optimization

    View/Open
    ZINK-DISSERTATION-2017.PDF (2.400Mb)
    Date
    2017-04-06
    Author
    Zink, Daniel
    Abstract
Linear programming (LP) and semidefinite programming (SDP) are among the most important tools in Operations Research and Computer Science. In this work we study the limitations of LPs and SDPs by providing lower bounds on the size of (approximate) linear and semidefinite programming formulations of combinatorial optimization problems. The hardness of (approximate) linear optimization implied by these lower bounds motivates the lazification technique for conditional gradient type algorithms. This technique allows us to replace (approximate) linear optimization with a much weaker subroutine, achieving significant performance improvements in practice. The main contributions can be summarized as follows:

    (i) Reduction framework for LPs and SDPs: We present a new view on extended formulations that does not require an initial encoding of a combinatorial problem as a linear or semidefinite program. This new view allows us to define a purely combinatorial reduction framework that transfers lower bounds on the size of exact and approximate LP and SDP formulations between different problems. Using our framework we show new LP and SDP lower bounds for a large variety of problems, including Vertex Cover, various (binary and non-binary) constraint satisfaction problems, as well as multiple optimization versions of Graph-Isomorphism.

    (ii) Lazification technique for Conditional Gradient algorithms: In Convex Programming, conditional gradient type algorithms (also known as Frank-Wolfe type methods) are important in theory as well as in practice due to their simplicity and fast convergence. We show how to eliminate the linear optimization step performed in every iteration of Frank-Wolfe type methods and instead use a weak separation oracle. This oracle is significantly faster in practice and enables caching, which yields additional improvements in speed and in the sparsity of the obtained solutions.
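    To make the lazification idea in (ii) concrete, here is a minimal Python sketch written under simplifying assumptions: the weak separation oracle is emulated with a vertex cache backed by one exact linear optimization call, the step size is the standard 2/(t+2) rule, and x0 is assumed to be a vertex of the feasible region. The names (weak_separation, lazy_frank_wolfe, phi0, K) are illustrative, not the dissertation's exact algorithm.

    ```python
    import numpy as np

    def weak_separation(grad, x, cache, lmo, phi, K=1.0):
        """Weak separation oracle (illustrative): either return a vertex v
        with <grad, x - v> >= phi / K, checking cached vertices first, or
        certify that no vertex improves by phi (so phi should be halved)."""
        # Caching: reuse previously discovered vertices when they are good enough.
        for v in cache:
            if grad @ (x - v) >= phi / K:
                return v, False
        # Fall back to a single true linear optimization call.
        v = lmo(grad)
        if grad @ (x - v) >= phi / K:
            cache.append(v)
            return v, False
        return None, True  # no vertex improves by phi

    def lazy_frank_wolfe(f_grad, lmo, x0, phi0, steps=1000, K=1.0):
        """Lazified conditional gradient sketch: the per-iteration linear
        optimization step is replaced by the weak separation oracle."""
        x, cache, phi = x0.copy(), [x0.copy()], phi0  # assumes x0 is a vertex
        for t in range(steps):
            grad = f_grad(x)
            v, certified = weak_separation(grad, x, cache, lmo, phi, K)
            if certified:
                phi /= 2.0  # tighten the target accuracy and retry
            else:
                gamma = 2.0 / (t + 2.0)  # standard Frank-Wolfe step size
                x = (1 - gamma) * x + gamma * v
        return x

    # Illustrative use: minimize ||x - b||^2 over the probability simplex,
    # whose exact LMO returns a standard basis vector.
    b = np.array([0.2, 0.5, 0.3])
    f_grad = lambda x: 2 * (x - b)
    lmo = lambda c: np.eye(3)[np.argmin(c)]
    x_star = lazy_frank_wolfe(f_grad, lmo, x0=np.eye(3)[0], phi0=1.0)
    ```

    Because most iterations are served from the cache, the expensive linear optimization oracle is called only rarely, and iterates remain sparse convex combinations of few vertices.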
    URI
    http://hdl.handle.net/1853/58274
    Collections
    • Georgia Tech Theses and Dissertations [23877]
    • School of Industrial and Systems Engineering Theses and Dissertations [1457]
