• Coordinate Sampling for Sublinear Optimization and Nearest Neighbor Search

(Georgia Institute of Technology, 2011-04-22)
I will describe randomized approximation algorithms for some classical problems of machine learning, where the algorithms have provable bounds that hold with high probability. Some of our algorithms are sublinear, that is, ...
• Extending Hadoop to Support Binary-Input Applications

(Georgia Institute of Technology, 2012-10-19)
Many data-intensive applications naturally take multiple inputs, which is not well supported by some popular MapReduce implementations, such as Hadoop. In this talk, we present an extension of Hadoop to better support such ...
• Optimization for Machine Learning: SMO-MKL and Smoothing Strategies

(Georgia Institute of Technology, 2011-04-15)
Our objective is to train p-norm Multiple Kernel Learning (MKL) and, more generally, linear MKL regularised by the Bregman divergence, using the Sequential Minimal Optimization (SMO) algorithm. The SMO algorithm is simple, ...
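For context, one common way the p-norm MKL primal is written in the literature (an illustrative formulation; the notation below is an assumption, not taken from this talk):

\[
\min_{\mathbf{w},\, b,\, d \ge 0} \quad \frac{1}{2}\sum_{k=1}^{m} \frac{\|\mathbf{w}_k\|^2}{d_k} \;+\; C\sum_{i=1}^{n} \ell\Big(y_i,\; b + \sum_{k=1}^{m} \mathbf{w}_k^{\top}\phi_k(x_i)\Big) \;+\; \frac{\lambda}{2}\,\|d\|_p^2
\]

Here each of the $m$ kernels has a feature map $\phi_k$ and a nonnegative weight $d_k$; the $\tfrac{\lambda}{2}\|d\|_p^2$ term is the $p$-norm regulariser, which can be viewed as a special case of the Bregman-divergence regularisation the abstract mentions.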
• Stochastic Gradient Descent with Only One Projection

(Georgia Institute of Technology, 2012-09-28)
Although many variants of stochastic gradient descent have been proposed for large-scale convex optimization, most of them require projecting the solution at each iteration to ensure that the obtained solution stays ...
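For contrast, here is a minimal sketch of the baseline the abstract refers to: standard projected SGD, which projects back onto the feasible set after every gradient step. The toy objective, step-size schedule, and function names are illustrative choices, not from the talk.

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Euclidean projection onto the ball ||x|| <= radius."""
    n = np.linalg.norm(x)
    return x if n <= radius else x * (radius / n)

def projected_sgd(grad, x0, steps=500, lr=0.1):
    """Projected (sub)gradient descent with a 1/sqrt(t) step size,
    projecting at *every* iteration (the cost the talk aims to avoid)."""
    x = project_ball(np.asarray(x0, dtype=float))
    for t in range(1, steps + 1):
        x = project_ball(x - (lr / np.sqrt(t)) * grad(x))
    return x

# Toy problem: minimize f(x) = ||x - c||^2 over the unit ball,
# with c outside the ball; the constrained minimizer is c / ||c||.
c = np.array([2.0, 0.0])
x_star = projected_sgd(lambda x: 2 * (x - c), np.zeros(2))
```

The work advertised in the abstract replaces these per-iteration projections (which can dominate the runtime when the feasible set is complex) with a single projection at the end.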
• Virus Quasispecies Assembly using Network Flows

(Georgia Institute of Technology, 2009-09-25)
Understanding how the genomes of viruses mutate and evolve within infected individuals is critically important in epidemiology. In this talk I focus on optimization problems in sequence assembly for viruses based on 454 ...