College of Computing (CoC)
http://hdl.handle.net/1853/5977
The College of Computing houses one of the largest computer science programs in the country.
http://hdl.handle.net/1853/55973
From Minimum Cut to Submodular Minimization: Leveraging the Decomposable Structure
Ene, Alina
Submodular function minimization is a fundamental optimization problem that arises in several applications in machine learning and computer vision. The problem is known to be solvable in polynomial time, but general purpose algorithms have high running times and are unsuitable for large-scale problems. Recent work aims to obtain very practical algorithms for minimizing functions that are sums of "simple" functions. This class of functions provides an important bridge between functions with a rich combinatorial structure – notably, the cut function of a graph – and general submodular functions, and it brings along powerful combinatorial structure reminiscent of graphs as well as a fair bit of modeling power that greatly expands its applications. In this talk, we describe recent progress on designing very efficient algorithms for minimizing decomposable functions and understanding their combinatorial structure.
This talk is based on joint work with Huy Nguyen (Northeastern University) and Laszlo Vegh (London School of Economics).
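The cut function mentioned above is the canonical example of the combinatorial structure in play: it is submodular. A minimal sketch (the graph and the check are illustrative, not from the talk) verifying the defining inequality f(A) + f(B) >= f(A ∪ B) + f(A ∩ B) for the cut function of a small graph:

```python
from itertools import chain, combinations

# Cut function of an undirected graph: f(S) = number of edges with
# exactly one endpoint in S. Cut functions are submodular.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # small example graph
vertices = range(4)

def cut(S):
    S = set(S)
    return sum(1 for u, v in edges if (u in S) != (v in S))

def subsets(xs):
    return chain.from_iterable(combinations(xs, r) for r in range(len(xs) + 1))

# Check submodularity: f(A) + f(B) >= f(A | B) + f(A & B) for all A, B.
for A in subsets(vertices):
    for B in subsets(vertices):
        A_, B_ = set(A), set(B)
        assert cut(A_) + cut(B_) >= cut(A_ | B_) + cut(A_ & B_)
print("cut function is submodular on this graph")
```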
Presented on October 17, 2016 at 11:00 a.m. in the Klaus Advanced Computing Building, Room 1116E.; Alina Ene is an Assistant Professor in the Computer Science department at Boston University. Her research interests include the design and analysis of algorithms, the mathematical aspects of combinatorial optimization topics such as submodularity and graphs, and their applications to machine learning.; Runtime: 58:14 minutes
http://hdl.handle.net/1853/55928
Prices, Auctions, and Combinatorial Prophet Inequalities
Lucier, Brendan
The most common way to sell resources, from apples to business licenses to concert tickets, is to post prices. A choice of prices can be viewed as an algorithm for an online stochastic optimization problem, which makes decisions using value thresholds. This connection provides an opportunity to use the famous prophet inequality -- which describes the power of threshold rules -- to study pricing problems, and vice-versa. In this talk I'll present a general framework for deriving new prophet inequalities using economic insights from pricing, with algorithmic applications. Along the way, I'll describe an unexpected connection between posted prices and equilibria of non-truthful auctions.
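The threshold-rule view of prices can be simulated directly. A minimal sketch (the uniform value distributions are made up for illustration, and this is the classic single-item prophet inequality, not the talk's combinatorial framework): posting the price T = E[max]/2 and selling to the first arriving value at or above it recovers, in expectation, at least half the prophet's value E[max].

```python
import random

random.seed(0)

# n independent item values; the "prophet" sees them all and takes the
# max, while the posted-price rule accepts the first value >= a fixed
# threshold T.
n, trials = 5, 200_000
dists = [(0, i + 1) for i in range(n)]  # value i ~ Uniform(0, i+1), illustrative

def draw():
    return [random.uniform(a, b) for a, b in dists]

# Estimate E[max] empirically, then post the price T = E[max] / 2.
emax = sum(max(draw()) for _ in range(trials)) / trials
T = emax / 2

def threshold_payoff(values, T):
    for v in values:       # values arrive one by one, in order
        if v >= T:
            return v       # sell at the first value meeting the price
    return 0.0             # price never met: no sale

alg = sum(threshold_payoff(draw(), T) for _ in range(trials)) / trials
print(f"prophet: {emax:.3f}  threshold rule: {alg:.3f}  ratio: {alg / emax:.3f}")
```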
Presented on October 3, 2016 at 11:00 a.m. in the Klaus Advanced Computing Building, Room 1116E.; Brendan Lucier is a Researcher at Microsoft Research, New England. Prior to joining Microsoft, he received his Ph.D. in Computer Science from the University of Toronto. His research interests lie in the intersection of theoretical Computer Science and Economics, and include algorithmic market design, algorithmic pricing, and social processes on networks. He is especially interested in the tradeoffs between simplicity, robustness, and optimality in markets for complex goods and services.; Runtime: 51:51 minutes
http://hdl.handle.net/1853/55915
A Fast and Simple Unbiased Estimator for Network (Un)reliability
Karger, David
The following procedure yields an unbiased estimator for the disconnection probability of an n-vertex graph with minimum cut c if every edge fails independently with probability p: (i) contract every edge independently with probability 1-n^{-2/c}, then (ii) recursively compute the disconnection probability of the resulting tiny graph if each edge fails with probability n^{2/c}p. We give a short, simple, self-contained proof that this estimator can be computed in linear time and has relative variance O(n^2). Combining these two facts with a relatively standard sparsification argument yields an O(n^3 log n)-time algorithm for estimating the (un)reliability of a network. We also show how the technique can be used to create unbiased samples of disconnected networks.
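On a tiny graph the quantity being estimated can be checked against brute force. A minimal sketch (naive Monte Carlo with union-find connectivity — the baseline the abstract's contraction estimator improves on, not the estimator itself; the 4-cycle example is chosen for illustration):

```python
import random

random.seed(1)

# Disconnection probability of a graph whose edges fail independently
# with probability p, estimated by naive Monte Carlo sampling.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]  # 4-cycle: min cut c = 2
n, p, trials = 4, 0.1, 100_000

def disconnected(surviving):
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for u, v in surviving:
        parent[find(u)] = find(v)
    return len({find(v) for v in range(n)}) > 1

hits = sum(
    disconnected([e for e in edges if random.random() > p])
    for _ in range(trials)
)
estimate = hits / trials

# For the 4-cycle the exact value is easy: removing any two or more
# edges disconnects it, so P = 1 - (1-p)^4 - 4p(1-p)^3.
exact = 1 - (1 - p) ** 4 - 4 * p * (1 - p) ** 3
print(f"Monte Carlo: {estimate:.4f}  exact: {exact:.4f}")
```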
Presented on September 26, 2016 at 11:00 a.m. in the Klaus Computing Building, Room 1116E; David Karger is a member of the Computer Science and Artificial Intelligence Laboratory in the EECS department at MIT. His primary interest is in developing tools that help individuals manage information better. This involves studying people and current tools to understand where the problems are, creating and evaluating tools that address those problems, and deploying those tools to learn how people use them and iterate the whole process. He draws on whatever fields can help: information retrieval, machine learning, databases, and algorithms, but most often human computer interaction.; Runtime: 70:45 minutes
http://hdl.handle.net/1853/55876
Strongly Rayleigh distributions and their Applications in Algorithm Design
Gharan, Shayan Oveis
A multivariate polynomial p(z1,...,zn) is stable if p(z1,...,zn) ≠ 0 whenever Im(zi) > 0 for all i. Strongly Rayleigh distributions are probability distributions on 0-1 random variables whose generating polynomial is stable. They can be seen as a natural generalization of product distributions. Borcea, Branden and Liggett used the geometry of stable polynomials to prove numerous properties of strongly Rayleigh distributions, including negative association, and closure under conditioning and truncation.
In this talk I will go over basic properties of these distributions, and then I will describe several algorithmic applications.
Based on joint works with Nima Anari, Alireza Rezaei, Mohit Singh, Amin Saberi.
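Negative association is easy to see in the smallest interesting case. A minimal sketch (the triangle example is chosen for illustration): the uniform spanning-tree distribution of K3 is a standard strongly Rayleigh measure, and its edge indicators are pairwise negatively correlated.

```python
from itertools import combinations
from fractions import Fraction

# Uniform spanning trees of the triangle K3: each tree is a pair of
# edges, so the edge-indicator distribution is uniform over the three
# 2-subsets of {0, 1, 2}. Spanning-tree measures are strongly Rayleigh,
# so the indicators should be negatively correlated.
edges = [0, 1, 2]
trees = list(combinations(edges, 2))  # the three spanning trees of K3
prob = Fraction(1, len(trees))

def E(f):
    # Expectation of f over the uniform spanning-tree distribution.
    return sum(prob * f(set(t)) for t in trees)

for i, j in combinations(edges, 2):
    cov = (E(lambda t: (i in t) * (j in t))
           - E(lambda t: i in t) * E(lambda t: j in t))
    print(f"Cov(X{i}, X{j}) = {cov}")  # negative for each pair
```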
Presented on September 12, 2016 at 11:00 a.m. in the Klaus Advanced Computing Building, Room 1116E.; Shayan Oveis Gharan is an assistant professor in the computer science and engineering department at the University of Washington. His research interests include algorithm design, graph theory, and applied probability.; Runtime: 58:51 minutes