CPAIOR 2017, the Fourteenth International Conference on Integration of Artificial Intelligence and Operations Research Techniques in Constraint Programming, will be held in Padova, Italy, June 5 - June 8, 2017. The Master Class on Computational Techniques for Combinatorial Optimization will take place on June 5, and the Main Conference on June 6 - June 8, 2017.
The aim of the conference is to bring together interested researchers from Constraint Programming (CP), Artificial Intelligence (AI), and Operations Research (OR) to present new techniques or applications in combinatorial optimization and to provide an opportunity for researchers in one area to learn about techniques in the others. A main objective of this conference series is also to give these researchers the opportunity to show how the integration of techniques from different fields can lead to interesting results on large and complex problems. Therefore, papers that actively combine, integrate, or contrast approaches from more than one of the areas are especially solicited. High-quality papers from a single area are also welcome, provided that they are of interest to the other communities involved. Application papers showcasing CP/AI/OR techniques on novel and challenging applications, as well as experience reports on such applications, are strongly encouraged.
The program committee invites submissions that include but are not limited to the following topics:
The proceedings of the conference are published in the Springer Lecture Notes in Computer Science series as LNCS 10335.
Submission Schedule (for both long and short papers):
All deadlines are at 23:59 UTC-12 ("anywhere on Earth"): for example, if it is still November 16 anywhere in the world, then you can still submit an abstract.
Registration Deadlines:
Conference Schedule:
Paper submissions are of two types:
Long papers should present original unpublished work, be at most 15 LNCS pages plus references in length, and be prepared in the format used for the Springer Lecture Notes in Computer Science series. These papers will undergo rigorous review, and the proceedings will be published in the Springer Lecture Notes in Computer Science series.
Short papers are also encouraged; they are limited to 8 LNCS pages plus references and should be prepared in the same format as long papers. Although containing less material, short papers should describe original unpublished work and will be reviewed against the same quality criteria as long papers. Authors are also encouraged to submit short papers on work in progress: ideas that are interesting but whose practical or theoretical relevance has not yet been fully established. Short papers will be presented at the conference and published in the conference proceedings.
Additionally, outstanding submissions to the technical program will be offered the opportunity to be published exclusively, through a fast-track process, in the Constraints journal. Journal fast-track papers will still be presented at the conference like regular papers.
For any queries on the submission process, please contact the program chairs at domenico.salvagnin@unipd.it and michele.lombardi2@unibo.it.
Abstracts and papers must be submitted on EasyChair by following this link:
Program chairs:
Conference chair:
Program Committee:
Registration can be done online here.
The registration website also provides information about:
You can download a PDF version of the program here.
The master class will be on Computational Techniques for Combinatorial Optimization and will consist of four talks:
Many optimization problems appearing in industry applications can be modeled as mixed integer programs (MIP). Typically, such a model is passed to a general-purpose MIP solver, which needs to find a solution to the problem in a black-box fashion without any additional user interaction.
This lecture provides an overview of the main ingredients of state-of-the-art MIP solvers. We will show how MIPs are tackled in practice and evaluate the performance impact of the individual components of a MIP solver. In particular, we will cover presolving, linear programming, cutting planes, branching, and primal heuristics.[PDF]
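As a purely illustrative sketch of the black-box workflow described above (not part of the lecture material), the Python snippet below models a made-up toy knapsack instance with the open-source PuLP library and hands it to a general-purpose MIP solver; the default CBC backend applies the components listed above (presolving, LP relaxations, cutting planes, branching, primal heuristics) internally.

```python
from pulp import LpProblem, LpVariable, LpMaximize, lpSum, value

# Made-up data for a toy 0/1 knapsack instance.
weights = [4, 3, 5, 2]
profits = [10, 7, 12, 4]
capacity = 9

prob = LpProblem("toy_knapsack", LpMaximize)
x = [LpVariable(f"x{i}", cat="Binary") for i in range(len(weights))]
prob += lpSum(p * xi for p, xi in zip(profits, x))               # objective
prob += lpSum(w * xi for w, xi in zip(weights, x)) <= capacity   # knapsack constraint

# The default CBC backend runs presolve, LP relaxations, cutting planes,
# branching and primal heuristics behind the scenes.
prob.solve()
print([value(xi) for xi in x], value(prob.objective))
```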
The tutorial-style lecture will first delve into MiniCP, a minimalistic Java micro-kernel solver designed expressly for teaching Constraint Programming. It will gradually introduce key concepts such as domains, variables, constraints, fixpoints and the fundamentals of search and state management. Architectural design decisions as well as implementation issues will be front and center.[PDF]
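To make the notions of domains, constraints and fixpoint concrete, here is a minimal Python sketch (deliberately not MiniCP's Java API; all names are invented for illustration) in which a not-equal constraint prunes finite domains and the propagators are re-run until nothing changes.

```python
# Variables hold finite domains, constraints prune them, and the solver
# re-runs the propagators until a fixpoint is reached (no more changes).
class Var:
    def __init__(self, values):
        self.dom = set(values)

class NotEqual:
    """Once one side is fixed, remove its value from the other side's domain."""
    def __init__(self, a, b):
        self.a, self.b = a, b

    def propagate(self):
        changed = False
        for u, v in ((self.a, self.b), (self.b, self.a)):
            if len(u.dom) == 1 and next(iter(u.dom)) in v.dom:
                v.dom.discard(next(iter(u.dom)))
                changed = True
        return changed

def fixpoint(constraints):
    while any([c.propagate() for c in constraints]):  # re-run until no change
        pass

x, y, z = Var({1}), Var({1, 2}), Var({1, 2, 3})
fixpoint([NotEqual(x, y), NotEqual(y, z)])
print(x.dom, y.dom, z.dom)   # {1} {2} {1, 3}
```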
The tutorial will revisit the mantra CP = Model + Search to explore design decisions pertaining to modeling and propagation implementation. This includes, for instance, rich variables, views, constraint classes, event models, propagation, consistency handling and priorities. It will also explore the issues surrounding technology agnostic model reformulation and manipulations such as linearizations, relaxations and parallel support.[PDF]
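As a small hedged illustration of the "views" mentioned above (again not MiniCP code; the class names are hypothetical), the following Python wrapper exposes y = x + offset without duplicating x's domain, so pruning through the view prunes the underlying variable.

```python
class IntVar:
    def __init__(self, lo, hi):
        self.dom = set(range(lo, hi + 1))

    def remove(self, v):
        self.dom.discard(v)

class OffsetView:
    """Behaves like y = x + offset, delegating every operation to x."""
    def __init__(self, x, offset):
        self.x, self.offset = x, offset

    def min(self):
        return min(self.x.dom) + self.offset

    def max(self):
        return max(self.x.dom) + self.offset

    def remove(self, v):
        self.x.remove(v - self.offset)

x = IntVar(0, 5)
y = OffsetView(x, 10)     # y ranges over 10..15 without a second domain
y.remove(12)              # pruning the view prunes the underlying variable
print(sorted(x.dom), y.min(), y.max())   # [0, 1, 3, 4, 5] 10 15
```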
The tutorial will explore the spectrum of issues arising from the implementation of search techniques. For instance, it will review goal-directed search procedures and detail continuation-based search as well as search combinators. The tutorial will also explore the issues that surface with tree-based parallelization, semantic decomposition and embarrassingly parallel search. Advanced hybrid techniques (including LNS) will complete the study of hybrid solvers.
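The self-contained Python sketch below (not taken from the tutorial; simplified on purpose) shows a plain depth-first search with a pluggable branching heuristic on a toy all-different problem. Real solvers restore state by trailing or copy-on-write state management rather than the deepcopy used here.

```python
from copy import deepcopy

def propagate(domains):
    """Remove fixed values from the other variables' domains until a fixpoint."""
    changed = True
    while changed:
        changed = False
        for v, dom in domains.items():
            if len(dom) == 1:
                val = next(iter(dom))
                for w, other in domains.items():
                    if w != v and val in other:
                        other.discard(val)
                        changed = True
    return all(domains.values())           # False iff some domain became empty

def first_fail(domains):
    """Branching heuristic: pick the unfixed variable with the smallest domain."""
    unfixed = [v for v, d in domains.items() if len(d) > 1]
    return min(unfixed, key=lambda v: len(domains[v]))

def dfs(domains, branch=first_fail):
    if not propagate(domains):
        return None                         # dead end: backtrack
    if all(len(d) == 1 for d in domains.values()):
        return {v: next(iter(d)) for v, d in domains.items()}
    var = branch(domains)
    for val in sorted(domains[var]):
        child = deepcopy(domains)           # a real solver would trail changes instead
        child[var] = {val}                  # decision: var = val
        solution = dfs(child, branch)
        if solution:
            return solution
    return None

print(dfs({"x": {1, 2}, "y": {1, 2}, "z": {1, 2, 3}}))  # {'x': 1, 'y': 2, 'z': 3}
```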
Parallel to the master class, a DSO (Data Science meets Optimization) workshop will also take place, jointly with CEC 2017. You can find all the relevant information here.
In this talk, I try to explain my point of view as a Mathematical Optimizer -- especially concerned with discrete (integer) decisions -- on Big Data. I advocate a tight integration of Machine Learning and Mathematical Optimization (among others) to deal with the challenges of decision-making in Data Science. For such an integration I concentrate on three questions: 1) what can optimization do for machine learning? 2) what can machine learning do for optimization? 3) which new applications can be solved by the combination of machine learning and optimization? Finally, I will discuss in detail two areas in which machine learning techniques have been successfully applied to mixed-integer programming.[PDF]
Symmetry is the essential element of lifted inference, which has recently demonstrated that very efficient inference is possible in highly connected but symmetric probabilistic models, a.k.a. relational probabilistic models. This raises the question of whether the same holds for optimization problems in general. In this talk I shall demonstrate that, for a large class of mathematical programs, this is actually the case. More precisely, I shall introduce the concept of fractional symmetries of linear and convex quadratic programs (QPs), which lie at the heart of many machine learning approaches, and exploit it to lift, i.e., to compress, them. These lifted QPs can then be tackled with the usual optimization toolbox (off-the-shelf solvers, cutting-plane algorithms, stochastic gradients, etc.): if the original QP exhibits symmetry, then the lifted one will generally be more compact, and hence its optimization is likely to be more efficient.[PDF]
This talk is based on joint works with Martin Mladenov, Martin Grohe, Leonard Kleinhans, Pavel Tokmakov, Babak Ahmadi, Amir Globerson, and many others.
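As a toy numeric illustration of the lifting idea in the abstract above (the example is mine, not from the talk): the convex QP min x1^2 + x2^2 subject to x1 + x2 >= 2 is invariant under swapping x1 and x2; averaging a feasible point with its swapped copy stays feasible and, by convexity, does not increase the objective, so an optimum exists on the subspace x1 = x2 and the program can be compressed to a single variable. The SciPy sketch below solves both formulations and obtains the same optimal value.

```python
from scipy.optimize import minimize

# Original symmetric QP:  min x1^2 + x2^2   s.t.  x1 + x2 >= 2
original = minimize(lambda x: float(x @ x), x0=[5.0, -1.0],
                    constraints=[{"type": "ineq", "fun": lambda x: x[0] + x[1] - 2}])

# Lifted (compressed) QP over one orbit variable y:  min 2*y^2   s.t.  2*y >= 2
lifted = minimize(lambda y: 2.0 * y[0] ** 2, x0=[5.0],
                  constraints=[{"type": "ineq", "fun": lambda y: 2.0 * y[0] - 2}])

print(original.fun, lifted.fun)   # both ~2.0: the lifted program preserves the optimum
```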
The proceedings of the conference are available online for a limited period here.
Talks in order of presentation:
We would like to thank our sponsors for their generous support of CPAIOR 2017.
The conference will take place in the Aula Magna of the Department of Information Engineering, University of Padova (Via Gradenigo 6b, 35131 Padova).
Additional travel information can be found on the registration website here.