Kimon Fountoulakis

I am a postdoctoral fellow and Principal Investigator (PI) at the University of California, Berkeley (UCB), in the Department of Statistics and at ICSI, where I work with Michael Mahoney. Before that, I completed a PhD in optimization at the University of Edinburgh (UoE) under the supervision of Prof. Jacek Gondzio.

My most recent work focuses on large-scale optimization and its application to local graph clustering. I have also worked on parallelizing optimization and local graph clustering algorithms. Past work includes higher-order optimization methods for machine learning and signal processing problems.

I am currently on the job market.

D. Zhang, K. Fountoulakis, J. Cao, M. Mahoney and A. Pozdnoukhov. Social Discrete Choice Models. arXiv:1703.07520, 2017.

(arXiv)

E. Faerman, F. Borutta, K. Fountoulakis and M. Mahoney. Locality And Structure Aware Graph Node Embedding. arXiv:1710.06520, 2017.

(arXiv)

A. Devarakonda, K. Fountoulakis, J. Demmel and M. Mahoney. Avoiding communication in primal and dual block coordinate descent methods. arXiv:1612.04003, 2017.

(arXiv)

A. Devarakonda, K. Fountoulakis, J. Demmel and M. Mahoney. Avoiding Synchronization in First-Order Methods for Sparse Convex Optimization. arXiv:1712.06047, 2017. (Accepted at IPDPS 2018)

(arXiv)

D. Wang, K. Fountoulakis, M. Henzinger, M. Mahoney and S. Rao. Capacity Releasing Diffusion for Speed and Locality. ICML 2017.

(arXiv, Conference)

J. Shun, F. Roosta-Khorasani, K. Fountoulakis and M. Mahoney. Parallel Local Graph Clustering. Proceedings of the VLDB Endowment, 9(12):1041-1052, 2016.

(arXiv, Conference)

K. Fountoulakis and R. Tappenden. A Flexible Coordinate Descent Method. arXiv:1507.03713, 2017. (Accepted at COAP)

(arXiv, MATLAB software)

K. Fountoulakis, F. Roosta-Khorasani, J. Shun, X. Cheng and M. Mahoney. Variational Perspective on Local Graph Clustering. arXiv:1602.01886, 2017. (Accepted in Mathematical Programming B).

(arXiv, Journal, Software at PyPI, Python software at GitHub)

K. Fountoulakis, D. Gleich and M. Mahoney. An optimization approach to locally-biased graph algorithms. Proceedings of the IEEE, 105(2):256-272, 2017.

(arXiv, Journal, Software at PyPI, Python software at GitHub)

K. Fountoulakis, A. Kundu, E.-M. Kontopoulou and P. Drineas. A Randomized Rounding Algorithm for Sparse PCA. ACM Transactions on Knowledge Discovery from Data (TKDD), 11(3). No. 38, 2017.

(arXiv, Journal)

K. Fountoulakis and J. Gondzio. A Second-Order Method for Strongly-Convex L1-Regularization Problems. Mathematical Programming, 156(1-2):189-219, 2016.

(Tech. report, arXiv, Journal, MATLAB software)

K. Fountoulakis and J. Gondzio. Performance of First- and Second-Order Methods for L1-Regularized Least Squares Problems. Computational Optimization and Applications, 65(3):605-635, 2016.

(Tech. report, arXiv, Journal, MATLAB software)

I. Dassios, K. Fountoulakis and J. Gondzio. A Preconditioner for a Primal-dual Newton Conjugate Gradients Method for Compressed Sensing Problems. SIAM J. Sci. Comput., 37(6):A2783-A2812, 2015.

(Tech. report, arXiv, Journal, MATLAB software)

K. Fountoulakis, J. Gondzio, and P. Zhlobich. Matrix-free interior point method for compressed sensing problems. Mathematical Programming Computation, 6(1): 1-31, 2014.

(Tech. report, arXiv, Journal, MATLAB software)

Higher-Order Methods for Large-Scale Optimization. PhD thesis, School of Mathematics, University of Edinburgh.

(PDF)

Local Graph Clustering provides methods to find local clusters in a given graph without touching the whole graph.
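The locality idea can be illustrated with a minimal sketch (this is not the package's actual API): an approximate personalized PageRank vector computed by the classic "push" procedure, in the spirit of Andersen, Chung, and Lang, spends work only on nodes near the seed, so its cost is independent of the total graph size.

```python
from collections import deque

def approximate_ppr(graph, seed, alpha=0.15, eps=1e-4):
    """Approximate personalized PageRank via the push procedure.
    `graph` is a dict: node -> list of neighbours. Only nodes near
    the seed are ever touched, which is what makes the method local."""
    p = {u: 0.0 for u in graph}   # PageRank estimate
    r = {u: 0.0 for u in graph}   # residual (unprocessed) mass
    r[seed] = 1.0
    queue = deque([seed])
    while queue:
        u = queue.popleft()
        deg = len(graph[u])
        if r[u] < eps * deg:
            continue  # stale queue entry, nothing left to push
        p[u] += alpha * r[u]                  # retain alpha of the residual
        share = (1 - alpha) * r[u] / (2 * deg)
        r[u] = (1 - alpha) * r[u] / 2         # keep half, spread half
        if r[u] >= eps * deg:
            queue.append(u)
        for v in graph[u]:
            if r[v] < eps * len(graph[v]) <= r[v] + share:
                queue.append(v)               # v just crossed the threshold
            r[v] += share
    return p
```

In practice a local cluster is then extracted from the support of `p` by a sweep cut over nodes sorted by degree-normalized PageRank; that step is omitted here for brevity.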

Instance generators for l1-regularized over- and underdetermined least squares. The generators are implemented in MATLAB; they are memoryless and computationally inexpensive, so large-scale instances can be created.
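A toy illustration of the "memoryless" idea, not the actual MATLAB generators: the measurement operator is a subsampled DCT, so the matrix is never stored and both products cost O(n log n). The function name and interface are hypothetical.

```python
import numpy as np
from scipy.fft import dct, idct

def make_l1ls_instance(m, n, k, seed=0):
    """Hypothetical sketch of a memoryless l1-least-squares instance:
    an m-by-n subsampled orthonormal DCT operator, a k-sparse ground
    truth, and the corresponding right-hand side b = A @ x_true."""
    rng = np.random.default_rng(seed)
    rows = rng.choice(n, size=m, replace=False)   # which DCT rows to keep
    x_true = np.zeros(n)
    x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)

    def matvec(x):                 # A @ x, without forming A
        return dct(x, norm="ortho")[rows]

    def rmatvec(y):                # A.T @ y
        z = np.zeros(n)
        z[rows] = y
        return idct(z, norm="ortho")

    b = matvec(x_true)
    return matvec, rmatvec, b, x_true
```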

A MATLAB implementation for the solution of unconstrained l1-regularized problems. Examples include machine learning problems, such as l1-regularized least squares and logistic regression, and compressed sensing problems, such as l1-synthesis, l1-analysis, and isotropic total variation. The solver is memoryless and requires only matrix-vector product operations, hence it is appropriate for large-scale instances.
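The matrix-vector-only interface such a solver exposes can be sketched with plain proximal gradient (ISTA) for the l1-regularized least-squares case. This is only an illustration of the interface; the released software uses a second-order (Newton-type) method, not ISTA, and the parameter `L` (a Lipschitz constant of the gradient) is assumed to be supplied by the caller.

```python
import numpy as np

def ista(matvec, rmatvec, b, lam, L, iters=500):
    """Proximal-gradient sketch for min 0.5*||Ax - b||^2 + lam*||x||_1.
    A is accessed only through matvec (A @ x) and rmatvec (A.T @ y),
    so no matrix is ever stored."""
    x = np.zeros(rmatvec(b).shape)
    for _ in range(iters):
        grad = rmatvec(matvec(x) - b)          # gradient of the smooth part
        z = x - grad / L                        # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return x
```

For example, with `matvec = rmatvec = lambda v: v` (the identity operator) the iteration converges to the soft-thresholding of `b`.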

An interior point method implemented in MATLAB for the solution of real-valued compressed sensing problems. The term "matrix-free" means that only matrix-vector product operations are used and the method is memoryless. The solver implements efficient preconditioning techniques for the fast solution of the linear systems at every iteration.
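A generic sketch of the inner step, under stated assumptions: a model Newton-type system of the form (A^T A + D) x = rhs, with D a hypothetical diagonal of barrier/scaling terms, solved matrix-free by preconditioned conjugate gradients with a simple Jacobi preconditioner. The paper's actual preconditioner is specialised to compressed sensing and is not reproduced here.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

n = 200
rng = np.random.default_rng(0)
A = rng.standard_normal((50, n)) / np.sqrt(50)   # stand-in sensing matrix
d = rng.uniform(1.0, 10.0, n)                    # hypothetical barrier terms

def hess_mv(x):
    # Action of (A^T A + D) using only matrix-vector products.
    return A.T @ (A @ x) + d * x

H = LinearOperator((n, n), matvec=hess_mv)
# Jacobi preconditioner from the cheaply available diagonal: columns of A
# are roughly unit-norm here, so diag(A^T A + D) is approximately 1 + d.
M = LinearOperator((n, n), matvec=lambda x: x / (1.0 + d))
rhs = rng.standard_normal(n)
x, info = cg(H, rhs, M=M)   # info == 0 signals convergence
```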

This software only reproduces the experiments in the paper "A Flexible Coordinate Descent Method".

Address: 1947 Center Street, Ste. 600, Berkeley, CA 94704

Email:

Bay Area Scientific Computing Day (Parallel Methods for Convex Optimization)

(Download pdf)

UCB RISELab Summer Retreat 2017 (Communication Avoiding Optimization Algorithms)

(Download pdf)

Workshop on Distributed and Parallel Data Analysis 2016 (Parallel Local Graph Clustering)

(Download pdf)

Workshop on Algorithms for Modern Massive Data Sets 2016 (Local graph clustering algorithms: an optimization perspective)

(Download pdf, Video)

International Symposium on Mathematical Programming (ISMP2015) (Performance of First- and Second-Order Methods for Large-Scale Optimization)

(Download pdf)

Oxford Numerical Analysis Group Internal Seminar (On Newton-type methods for support vector machines and signal reconstruction problems)

(Download pdf)

SIAM Conference on Computational Science and Engineering (On primal-dual Newton conjugate gradients for compressed sensing)

(Download pdf)

ERGO Seminar (On robust block coordinate descent and primal-dual Newton conjugate gradients)

(Download pdf)

IMA Conference on the Mathematical Challenges of Big Data 2014 (On robust block coordinate descent)

(Download pdf)

Edinburgh SIAM Student Chapter Conference 2014 (On robust block coordinate descent)

(Download pdf)

4th IMA Conference on Numerical Linear Algebra and Optimisation (On primal-dual Newton conjugate gradients for l1-regularized problems)

(Download pdf)

SIAM Conference on Optimization 2014 (On primal-dual Newton conjugate gradients for sparse signal reconstruction)

(Download pdf)

12th EUROPT Workshop on Advances in Continuous Optimization 2014 (On primal-dual Newton conjugate gradients for l1-regularized problems)

(Download pdf)

20th Conference of the International Federation of Operational Research Societies (IFORS) 2014
(On primal-dual Newton conjugate gradients for sparse signal reconstruction)

(Download pdf)

Computational linear algebra and optimization for the Digital Economy workshop 2013
(On second-order methods for various l1-regularized problems)

(Download pdf)

International Conference on Continuous Optimization (ICCOPT) 2013
(On second-order methods with continuation for various l1-regularized problems)

(Download pdf)

11th EUROPT Workshop on Advances in Continuous Optimization
(On second-order methods with continuation for l1-regularized problems in Machine Learning)

(Download pdf)

European Conference on Operational Research EURO/INFORMS 2013
(On second-order methods with continuation for l1-regularized problems in Machine Learning)

(Download pdf)

International Conference on Preconditioning Techniques for Scientific and Industrial Applications 2013
(On preconditioners for second-order methods applied to l1-regularized problems)

(Download pdf)

3rd IMA Conference on Numerical Linear Algebra and Optimization (NLAO)
(On interior point methods for sparse signal reconstruction)

(Download pdf)

Operational Research 54 (OR54)
(On interior point methods for sparse signal reconstruction)

(Download pdf)

International Symposium on Mathematical Programming (ISMP2012)
(On interior point methods for sparse signal reconstruction)

(Download pdf)

3rd Conference on Optimization Methods and Software (OMS)
(On interior point methods for sparse signal reconstruction)

(Download pdf)