dc.contributor.advisor: Song, Le
dc.contributor.author: Patnaik, Kaushik
dc.date.accessioned: 2016-01-07T17:23:27Z
dc.date.available: 2016-01-07T17:23:27Z
dc.date.created: 2015-12
dc.date.issued: 2015-08-20
dc.date.submitted: December 2015
dc.identifier.uri: http://hdl.handle.net/1853/54353
dc.description.abstract: Regression with L1 regularization, the Lasso, is a popular algorithm for recovering the sparsity pattern (also known as model selection) of linear models from observations contaminated by noise. We examine a scenario in which a fraction of the zero covariates are highly correlated with non-zero covariates, making sparsity recovery difficult. We propose two methods that adaptively increment the regularization parameter to prune the Lasso solution set. We prove that the algorithms achieve consistent model selection with high probability while using fewer samples than traditional Lasso. The algorithms extend to a broad class of L1-regularized M-estimators for linear statistical models. (An illustrative code sketch of the adaptive-penalty idea follows this record.)
dc.format.mimetype: application/pdf
dc.language.iso: en_US
dc.publisher: Georgia Institute of Technology
dc.subject: Lasso
dc.subject: L1 regression
dc.subject: Adaptive methods
dc.subject: Active learning
dc.title: Adaptive learning in lasso models
dc.type: Thesis
dc.description.degree: M.S.
dc.contributor.department: Computer Science
thesis.degree.level: Masters
dc.contributor.committeeMember: Dilkina, Bistra
dc.contributor.committeeMember: Chau, Duen Horng (Polo)
dc.contributor.committeeMember: Davenport, Mark
dc.date.updated: 2016-01-07T17:23:27Z
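
The abstract describes, at a high level, pruning spurious covariates by refitting the Lasso with a progressively larger L1 penalty. Since this record does not spell out the two proposed methods, the Python sketch below is only an illustration of that general idea: the synthetic data, the geometric penalty schedule, and the stopping rule are all assumptions made here, not the thesis's actual procedure.

# Illustrative sketch only: adaptively increasing the Lasso penalty to
# prune a zero covariate that is highly correlated with a true covariate.
# The schedule and stopping rule are assumptions, not the thesis's method.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, k = 100, 50, 5                  # samples, covariates, true support size
X = rng.standard_normal((n, p))
# Make one zero covariate (column k) nearly collinear with a true one.
X[:, k] = X[:, 0] + 0.1 * rng.standard_normal(n)
beta = np.zeros(p)
beta[:k] = 2.0                        # true non-zero coefficients
y = X @ beta + 0.5 * rng.standard_normal(n)

alpha = 0.01                          # initial regularization strength
for _ in range(20):
    fit = Lasso(alpha=alpha).fit(X, y)
    support = np.flatnonzero(fit.coef_)
    if len(support) <= k:             # illustrative stopping rule (uses k)
        break
    alpha *= 1.5                      # increment the penalty to prune
print("estimated support:", support)

The 1.5x multiplier is an arbitrary choice for the sketch; the thesis's methods presumably choose the increments so that consistent model selection is provable, which this toy loop does not attempt.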

