Dr. Claudia Nieuwenhuis

International Computer Science Institute
UC Berkeley, California, USA

cnieuwe (at) eecs (dot) berkeley (dot) edu

About me

I am a postdoctoral researcher at the University of California, Berkeley, in beautiful California, working with the group of Professor Trevor Darrell. Before I came to Berkeley I worked with Yuri Boykov and Olga Veksler on discrete optimization methods at the University of Western Ontario in London, Canada. Before that, I was with the group of Professor Daniel Cremers at the Technical University of Munich, working on continuous convex optimization.

I received my PhD from the University of Heidelberg in 2009, working on motion estimation, optical flow, and automatic accuracy evaluation. I received my master's degree in computer science from the Technical University of Ilmenau in 2006 and a master's degree in mathematics from the University of Hagen in 2008. Between 2003 and 2006 I worked as an intern at Siemens Corporate Research in Princeton, NJ, USA, and at the Fraunhofer Center for Advanced Media Technology in Singapore.

Research Interests

In my research, I am mainly concerned with energy minimization methods for large optimization problems with millions of unknowns. Such problems can be formulated either on discrete domains, i.e. the pixel grid, or as continuous optimization problems. Discrete formulations are typically cast as Markov random fields, which can be optimized, e.g., by message passing, move-making algorithms, or graph cuts. Continuous problems can be formulated as variational approaches and are optimized by solving large systems of partial differential equations. I am especially interested in convex optimization problems, which allow global optima to be computed independently of the initialization. To also handle non-convex problems, the theory of convex relaxation is important for obtaining approximate solutions that are close to global optimality.
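As a concrete illustration of the discrete case, consider minimizing a binary labeling energy of the form E(x) = sum_i phi_i(x_i) + sum_{(i,j)} psi_ij(x_i, x_j) exactly with a single graph cut. The sketch below is a minimal example, not a method from my own work; it assumes the PyMaxflow library, and the unary and smoothness weights are purely illustrative:

```python
# Minimal sketch: binary MRF denoising solved exactly by a graph cut.
# Assumes the PyMaxflow library (pip install PyMaxflow); weights are illustrative.
import numpy as np
import maxflow

def denoise_binary(img, smoothness=1.0):
    """Denoise a {0,1} image: unary terms tie each pixel to its noisy
    observation, pairwise Potts terms on the 4-connected grid favour
    smooth labelings. The energy is submodular, so the min-cut is a
    global optimum."""
    g = maxflow.Graph[float]()
    nodes = g.add_grid_nodes(img.shape)     # one graph node per pixel
    g.add_grid_edges(nodes, smoothness)     # pairwise smoothness costs
    g.add_grid_tedges(nodes, img, 1 - img)  # unary data costs as terminal edges
    g.maxflow()                             # compute the exact min-cut
    # Pixels cut to the sink side receive label 0, the rest label 1.
    return np.int_(np.logical_not(g.get_grid_segments(nodes)))

# Usage: flip 10% of the pixels of a simple pattern and recover it.
rng = np.random.default_rng(0)
clean = np.zeros((64, 64), dtype=int)
clean[16:48, 16:48] = 1
noisy = np.abs(clean - (rng.random(clean.shape) < 0.1).astype(int))
restored = denoise_binary(noisy)
```

For multi-label problems, move-making algorithms or convex relaxations take over, generally trading exactness guarantees for tractability.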

Apart from optimization, statistical models and learning-based approaches are indispensable for coping with large amounts of data in image analysis. Viewing images as realizations of random processes, such models try to explain the observed data in a way that maximizes the probability of the image. To build such statistical models I often use Bayesian formalisms, methods for density estimation, support vector machines, compressed sensing, sparsity approaches, and subspace methods.
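The bridge between such probabilistic models and the energy minimization methods above is the maximum a posteriori (MAP) principle: taking the negative logarithm of the Bayesian posterior turns probability maximization into energy minimization. Schematically, in generic textbook form:

```latex
\hat{x} \;=\; \arg\max_x \; p(x \mid I)
        \;=\; \arg\max_x \; p(I \mid x)\, p(x)
        \;=\; \arg\min_x \; \underbrace{-\log p(I \mid x)}_{\text{data term}}
              \; \underbrace{-\,\log p(x)}_{\text{prior / regularizer}}
```

The likelihood p(I | x) yields the data term and the prior p(x) the regularizer, so density estimation and the priors named above directly shape the energies minimized in the previous section.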