University of North Carolina at Chapel Hill, USA
Title: Some recent developments of large margin classification techniques
Abstract: Classification is a supervised learning problem with many practical applications across disciplines. Among the numerous existing classification methods, margin-based techniques have been popular in both the machine learning and statistics communities. The well-known Support Vector Machine (SVM) is a typical example. It has weak distributional assumptions and great flexibility in dealing with high-dimensional data. In this talk, I will present some recent developments of large margin classifiers, with a focus on SVMs. Issues including statistical properties of the SVM, robust SVMs, and multicategory SVMs will be covered, and computational aspects will be discussed. Some of these techniques involve difference-of-convex (DC) minimization; the use of the difference-of-convex algorithm (DCA) will be highlighted as well.
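The robust SVM mentioned in the abstract is often formulated with the ramp loss, which is a difference of two hinge functions and hence amenable to DCA. Below is a minimal toy sketch of that idea (my own illustration, not code from the talk), assuming a linear classifier, a ramp loss truncated at margin 0, and a plain subgradient solver for the convex subproblem:

```python
import numpy as np

def dca_ramp_svm(X, y, lam=0.1, s=0.0, outer=15, inner=300, lr=0.02):
    """Linear SVM with the ramp loss, trained by DCA.

    The ramp loss min(1, max(0, 1 - m)), with margin m = y * <w, x>,
    equals max(0, 1 - m) - max(0, s - m) for s = 0, a difference of two
    convex hinge functions, so the regularized training objective is DC.
    Each outer step linearizes the concave part at the current w; the
    convex hinge subproblem is then solved by subgradient descent.
    """
    w = np.zeros(X.shape[1])
    for _ in range(outer):
        m = y * (X @ w)
        # Minus a subgradient of the concave part: samples whose margin
        # fell below s are "given up on" and stop pulling the boundary.
        v = (y[m < s, None] * X[m < s]).sum(axis=0)
        for _ in range(inner):
            m = y * (X @ w)
            grad = lam * w + v - (y[m < 1, None] * X[m < 1]).sum(axis=0)
            w -= lr * grad
    return w
```

On data containing a mislabeled outlier, the ramp loss caps the outlier's contribution, so the learned boundary is driven by the clean points; the plain hinge loss, by contrast, lets a single far-away outlier shift the boundary arbitrarily.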
Brief bio: Yufeng Liu is a full professor in the Department of Statistics and Operations Research and the Department of Biostatistics at the University of North Carolina at Chapel Hill (UNC), USA. He also holds joint appointments with the Carolina Center for Genome Sciences and the Lineberger Comprehensive Cancer Center at UNC. He obtained his Ph.D. in statistics from The Ohio State University in June 2004 and has been on the faculty at UNC since then. Professor Liu received the Faculty Early Career Development (CAREER) Award from the US National Science Foundation in 2008 and the Ruth and Phillip Hettleman Prize for Artistic and Scholarly Achievement from UNC in 2010. He was elected a Fellow of the American Statistical Association (ASA) in 2013 and an elected member of the International Statistical Institute (ISI) in 2014. He has served on the editorial boards of several journals, including the Journal of the American Statistical Association, the Journal of the Royal Statistical Society, and Statistica Sinica. He has co-authored more than 60 papers in international journals. His main research interests include statistical machine learning, optimization for large-scale problems, nonparametric statistics, big data analysis, cancer genomics, and neuroimaging analysis.
Karlsruhe Institute of Technology, Germany
Title: Online optimization with lookahead and the relation to new IT techniques in logistics
Abstract: This talk covers the topic of online optimization with lookahead. In this problem setting, optimization decisions have to be made throughout the time horizon under incomplete information with respect to future input data. However, the decision maker can see a certain time window lying ahead, which provides some form of informational lookahead. Many real-world problems feature this setting, in contrast to the classical discipline of offline optimization, where all data is given in advance.
At first, we consider online optimization with lookahead from a theoretical point of view. We give a precise definition for lookahead, examine its effects on solution quality, and derive a framework for modeling and evaluating the quality of online algorithms operating under different degrees of lookahead. Furthermore, we establish a link between discrete event systems and online optimization, which leads us to discrete event simulation as one of the most promising tools for the analysis of algorithm performance in dynamic applications.
In the second part of the talk, we take a detailed look at how the theoretical concepts introduced in the first part can be applied in practical applications and real-world logistics projects of different complexity. To this end, we first conduct basic numerical experiments from a sampling-based and an exact-methods-based point of view. The results show that the performance of online algorithms depends heavily on the amount of lookahead provided, but also on the problem class itself. In several simulation studies of real-world problems, we learn that reliable information derived from innovative technologies endowed with information-detecting devices (such as bar-code scanners, RFID, GPS, or general sensors) can contribute significantly to enhanced system performance. However, we also recognize limitations of our approach to performance analysis in online optimization with lookahead, and we give an outlook on promising future research avenues in this field. Finally, we present several projects and case studies from different industry sectors where applying online optimization methods has proven significantly beneficial to the decision makers and has led to sustainable improvements in day-to-day operations.
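As a toy illustration of how lookahead improves online performance (an assumed single-machine job-sequencing example, not one of the talk's case studies): a scheduler that sees the next k unscheduled jobs can run the shortest visible job first, and on the instance below the total completion time improves monotonically with k:

```python
def total_completion_time(jobs, lookahead):
    """Single-machine sequencing with lookahead: at each step the
    scheduler sees only the next `lookahead` unscheduled jobs and runs
    the shortest visible one (SPT within the lookahead window).
    Returns the sum of job completion times."""
    pending = list(jobs)
    t, total = 0, 0
    while pending:
        window = pending[:lookahead]
        i = min(range(len(window)), key=lambda k: window[k])
        t += pending.pop(i)      # run the chosen job to completion
        total += t
    return total

jobs = [5, 1, 4, 2, 3]
for k in (1, 2, 5):
    # lookahead 1 = pure online (FIFO); lookahead 5 = full offline SPT
    print(k, total_completion_time(jobs, k))   # prints 1 48, 2 38, 5 35
```

Already a lookahead of two jobs closes most of the gap between the purely online schedule (48) and the offline optimum (35), mirroring the talk's observation that performance depends strongly on the degree of lookahead.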
Brief bio: Stefan Nickel is a full professor at the Karlsruhe Institute of Technology - KIT (Germany) and one of the directors of the Institute of Operations Research.
He obtained his PhD in mathematics at the Technical University of Kaiserslautern (Germany) in 1995. From 1995 to 2003 he was an assistant and then associate professor in mathematics at the Technical University of Kaiserslautern. After holding a full professorship at Saarland University (Chair of Operations Research and Logistics) from 2003 to 2009, he joined the Karlsruhe Institute of Technology as the Chair in Discrete Optimization and Logistics in April 2009. Since 2014 he has been the dean of the Department of Economics and Management at KIT. Stefan Nickel is also a member of the scientific advisory board as well as the management board of the Fraunhofer Institute for Industrial Mathematics (ITWM) in Kaiserslautern. Since 2011 he has additionally served as one of the directors of the Karlsruhe Service Research Institute (KSRI) and of the Research Center for Computer Science (FZI). Since 2006 he has been editor-in-chief of Computers & Operations Research and a member of the editorial board of Health Care Management Science. He has coordinated the Health Care working group within the German OR society (GOR) and was the president of GOR from 2013 to 2014.
Stefan Nickel has authored or co-authored 5 books as well as more than 90 scientific articles in his research areas Locational Analysis, Supply Chain Management, Health Care Logistics, and Online Optimization. He has been awarded the EURO prize for the best EJOR review paper (2012) and the Elsevier prize for the EJOR top cited article 2007-2011. In addition, he has conducted several industry projects with well-known companies such as BASF, Lufthansa, Miele, and SAP.
University of Southern California, USA
Title: Computing B-Stationary Points of Nonsmooth DC Programs
Abstract: Motivated by a class of applied problems arising from physical-layer-based security in digital communication systems, in particular by a secrecy sum-rate maximization problem, this work studies a nonsmooth, difference-of-convex (dc) minimization problem. The contributions are: (i) to clarify several kinds of stationary solutions and their relations; (ii) to develop and establish the convergence of a novel algorithm for computing a d-stationary solution of a problem with a convex feasible set, arguably the sharpest kind among the various stationary solutions; (iii) to extend the algorithm in several directions, including a randomized choice of the subproblems that could help the practical convergence of the algorithm, a distributed penalty approach for problems whose objective functions are sums of dc functions, and problems with a specially structured (nonconvex) dc constraint. For the latter class of problems, a pointwise Slater constraint qualification is introduced that facilitates the verification and computation of a B(ouligand)-stationary point.
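To give a flavor of why d-stationarity is the sharper target (a simplified sketch under assumptions, not the paper's exact algorithm): when the concave part is the max of finitely many smooth convex pieces, one can linearize every nearly active piece, solve one convex proximal subproblem per piece, and keep the best candidate. On f(x) = x^2 - 2|x|, ordinary DCA started at x = 0 with the subgradient 0 of 2|x| would stall at 0, which is not d-stationary; the piecewise scheme escapes to a minimizer:

```python
def dca_max_pieces(x0, hs, grad_hs, solve_sub, f, eps=1e-6, iters=200):
    """DC minimization of f = g - max_i h_i aiming at d-stationary points.

    Instead of linearizing a single subgradient of the max (ordinary DCA),
    linearize EVERY eps-active piece h_i, solve the convex subproblem for
    each, and move to the best of the resulting candidates."""
    x = x0
    for _ in range(iters):
        vals = [h(x) for h in hs]
        hmax = max(vals)
        active = [i for i, v in enumerate(vals) if v >= hmax - eps]
        # One convex (proximal) subproblem per active piece.
        candidates = [solve_sub(grad_hs[i](x), x) for i in active]
        x_new = min(candidates, key=f)
        if abs(f(x_new) - f(x)) < 1e-12:
            return x_new
        x = x_new
    return x

# Example: f(x) = x^2 - 2|x| = x^2 - max(2x, -2x).  The point x = 0 is
# stationary in a weaker sense but not d-stationary; the minimizers are +-1.
hs = [lambda x: 2 * x, lambda x: -2 * x]
grad_hs = [lambda x: 2.0, lambda x: -2.0]
# Closed-form proximal subproblem: argmin_x x^2 - a*x + (x - xk)^2 / 2.
solve = lambda a, xk: (a + xk) / 3.0
f = lambda x: x * x - 2 * abs(x)
x_star = dca_max_pieces(0.0, hs, grad_hs, solve, f)
```

Both pieces are active at 0, so both linearizations are tried and the iterates move to a genuine minimizer instead of remaining at the weaker stationary point.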
Brief bio: Jong-Shi Pang joined the University of Southern California as the Epstein Family Professor of Industrial and Systems Engineering in August 2013. Prior to this position, he was the Caterpillar Professor and Head of the Department of Industrial and Enterprise Systems Engineering for six years between 2007 and 2013. He held the position of the Margaret A. Darrin Distinguished Professor in Applied Mathematics in the Department of Mathematical Sciences and was a Professor of Decision Sciences and Engineering Systems at Rensselaer Polytechnic Institute from 2003 to 2007. He was a Professor in the Department of Mathematical Sciences at the Johns Hopkins University from 1987 to 2003, an Associate Professor and then Professor in the School of Management from 1982 to 1987 at the University of Texas at Dallas, and an Assistant and then an Associate Professor in the Graduate School of Industrial Administration at Carnegie-Mellon University from 1977 to 1982. From 1999 to 2001 (full-time) and in 2002 (part-time), he was a Program Director in the Division of Mathematical Sciences at the National Science Foundation.
Professor Pang was a winner of the 2003 George B. Dantzig Prize awarded jointly by the Mathematical Programming Society and the Society for Industrial and Applied Mathematics for his work on finite-dimensional variational inequalities, and a co-winner of the 1994 Frederick W. Lanchester Prize awarded by the Institute for Operations Research and the Management Sciences (INFORMS). Several of his publications have received best paper awards in different engineering fields: signal processing, energy and natural resources, computational management science, and robotics and automation. He was an ISI Highly Cited Researcher in the Mathematics Category for 1980-1999; he has published 3 widely cited monographs and more than 100 scholarly papers in top peer-reviewed journals. Dr. Pang is a member of the inaugural 2009 class of Fellows of the Society for Industrial and Applied Mathematics. Professor Pang's general research interest is in the mathematical modeling and analysis of a wide range of complex engineering and economics systems, with a focus on operations research, (single and multi-agent) optimization, equilibrium programming, and constrained dynamical systems.
CNRS Senior Researcher (DR2)
CNRS and Ceremade, University of Paris Dauphine, France
Title: Low Complexity Regularization of Inverse Problems
Abstract: In this talk, we investigate in a unified way the structural properties of a large class of convex regularizers for linear inverse problems. These penalty functionals are crucial to force the regularized solution to conform to some notion of simplicity/low complexity. Classical priors of this kind include sparsity, piecewise regularity, and low rank. These are natural assumptions for many applications, ranging from medical imaging to machine learning. Our main set of contributions gives a theoretical assessment of the recovery performance of these regularizations, including robustness to noise in the measurements. This is joint work with Samuel Vaiter and Jalal Fadili.
Brief bio: Gabriel Peyré graduated from the École Normale Supérieure de Cachan, France, in 2003 and received his Ph.D. in applied mathematics from the École Polytechnique, Paris, France, in 2005. Since 2006, he has been a researcher at the Centre National de la Recherche Scientifique (CNRS), working in Ceremade, University Paris-Dauphine. He is the head of the research group SIGMA-Vision, which is funded by the European Research Council (ERC). SIGMA-Vision's activity is focused on sparse and adaptive representations with applications in computer vision, computer graphics, and neurosciences. Since 2005, Gabriel Peyré has co-authored 40 papers in international journals, 50 conference proceedings in top vision and image processing conferences, and two books. He is the creator of the "Numerical Tours of Signal Processing" (www.numerical-tours.com), a popular online repository of Matlab/Python/Scilab resources for teaching modern signal and image processing.
University of California at Irvine, USA
Title: A Tale of Two DCAs in Compressed Sensing: DL12 and TL1
Abstract: In the past decade, compressed sensing (CS) has generated enormous research activity in mathematics, statistics, and information and data science. A fundamental problem is to reconstruct a sparse signal from a number of linear measurements far smaller than the physical dimension of the signal. This is mathematically guaranteed if the sensing matrix has sufficiently incoherent columns or satisfies the restricted isometry property (RIP). Convex (L1) and non-convex (Lp, 0 < p < 1) relaxations work well in the RIP regime. In this lecture, two Lipschitz-continuous, sparsity-promoting non-convex relaxations are introduced: the difference of L1 and L2 norms (DL12) and the transformed L1 penalty (TL1). Efficient minimization algorithms are constructed and analyzed based on the difference-of-convex-functions (DC) methodology originated by Pham-Dinh Tao in 1985 and extensively developed by Le-Thi Hoai An and Pham-Dinh Tao since 1994. The resulting DC algorithms (DCA) can be viewed as convergent and stable iterations on top of L1 minimization, hence improving on L1 consistently. In sparse recovery test problems, DCA-DL12 outperforms state-of-the-art CS methods when the sensing matrices become highly coherent (ill-conditioned), as encountered in super-resolution imaging. On the other hand, DCA-TL1 appears to be the most robust method, being a consistent top performer among representative CS methods for a broad range of sensing matrices of varying degrees of incoherence (hence the least sensitive to RIP). Applications to image processing will also be illustrated.
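A minimal sketch of the DCA-DL12 idea under simplifying assumptions (unconstrained penalized formulation, inner subproblems solved by plain ISTA; parameter choices are illustrative, not from the talk). The concave part -||x||_2 is linearized at each outer iteration, leaving a convex L1 subproblem:

```python
import numpy as np

def dca_dl12(A, b, lam=0.01, outer=6, inner=1000):
    """DCA for the DL12 model  min_x ||x||_1 - ||x||_2 + ||Ax - b||^2 / (2*lam).

    The concave part -||x||_2 is linearized at the current iterate (its
    subgradient at x != 0 is x / ||x||_2, and 0 is a valid subgradient at
    x = 0); the remaining convex L1-penalized subproblem is solved by
    proximal gradient (ISTA)."""
    L = np.linalg.norm(A, 2) ** 2 / lam      # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(outer):
        nx = np.linalg.norm(x)
        u = x / nx if nx > 0 else np.zeros(A.shape[1])
        for _ in range(inner):
            z = x - (A.T @ (A @ x - b) / lam - u) / L
            x = np.sign(z) * np.maximum(np.abs(z) - 1.0 / L, 0.0)
    return x
```

Each outer step is an L1 minimization with an extra linear term, which matches the abstract's description of DCA as convergent, stable iterations "on top of" L1 minimization; the subtracted L2 term also reduces the shrinkage bias of plain L1.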
Brief bio: Jack Xin has been Professor of Mathematics in the Department of Mathematics, the Center for Hearing Research, the Institute for Mathematical Behavioral Sciences, and the Center for Mathematical and Computational Biology at UC Irvine since 2005. He received his Ph.D. in applied mathematics from the Courant Institute, New York University, in 1990. He was a postdoctoral fellow at Berkeley and Princeton in 1991 and 1992. He was an assistant and then associate professor of mathematics at the University of Arizona from 1991 to 1999, and a professor of mathematics at the University of Texas at Austin from 1999 to 2005. His research interests include applied analysis, computational methods and their applications in multi-scale problems, sparse optimization, and signal processing. He has authored over a hundred journal papers and two Springer books, and became an ISI highly cited researcher in mathematics in 2002. He is a fellow of the John S. Guggenheim Foundation and an inaugural fellow of the American Mathematical Society (AMS) in 2012. He is Editor-in-Chief of the Society for Industrial and Applied Mathematics (SIAM) interdisciplinary journal Multiscale Modeling & Simulation (MMS).