Huge-Scale Sparse Optimization: Theory, Algorithms and Applications
Coordinators: François Glineur (CORE and INMA, UCL) and Ion Necoara (Politehnica University of Bucharest)
Date:
Sponsor: Romanian Academy of Sciences
The age of Big Data has begun. Practitioners now face data of unprecedented size, often with specific structure, in particular sparsity. For example, in many applications from machine learning, compressed sensing, social networks and computational biology we can formulate sparse (quadratic) optimization problems with millions or billions of variables. Classical first- or second-order optimization algorithms are not designed to scale to instances of this size. As a consequence, new mathematical programming tools and methods are required to solve these big data problems efficiently. The goal of this project is to develop new tools and optimization algorithms with low per-iteration cost and good scalability for solving sparse huge-scale optimization problems. The project brings together researchers with expertise in optimization capable of dealing with big and sparse data settings.
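To illustrate the kind of low per-iteration-cost method the project targets, here is a minimal sketch (our own example, not taken from the project) of randomized coordinate descent applied to a sparse quadratic problem min_x (1/2) x'Ax - b'x. The function name and setup are illustrative; the key point is that each iteration touches only one column of the sparse matrix, so its cost is proportional to the number of nonzeros in that column rather than to the problem dimension.

```python
import numpy as np
import scipy.sparse as sp

def coordinate_descent(A, b, n_iters=20000, seed=0):
    """Randomized coordinate descent for min_x 0.5*x'Ax - b'x,
    with A a sparse symmetric positive definite matrix in CSC format.
    Each iteration performs exact minimization along one coordinate
    and updates the gradient using only the nonzeros of one column."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    x = np.zeros(n)
    g = -b.astype(float).copy()   # gradient A x - b at x = 0
    diag = A.diagonal()
    for _ in range(n_iters):
        i = rng.integers(n)
        step = -g[i] / diag[i]    # exact line search along e_i
        x[i] += step
        # update gradient using only column i of A (CSC slicing)
        start, end = A.indptr[i], A.indptr[i + 1]
        g[A.indices[start:end]] += step * A.data[start:end]
    return x

# Small usage example: a sparse, diagonally dominant tridiagonal system.
A = sp.diags([-1.0, 3.0, -1.0], [-1, 0, 1], shape=(10, 10), format="csc")
b = np.ones(10)
x = coordinate_descent(A, b)
```

For huge-scale sparse instances, the per-iteration work is O(nnz of one column), which is what makes this class of methods attractive in the regimes described above.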