ALGO Talks

ALGO Talk 06 - Multiobjective optimization methods for neural network training

Kathrin Klamroth, University of Wuppertal, 13 October 2021

The training of neural networks is a topical example of a large-scale optimization problem for which the specification of “the one” optimization goal remains a big challenge. Indeed, classical error functions like, for example, the mean squared error or the cross entropy often lead to overfitting and/or vulnerability to adversarial attacks and data errors. This can partially be explained by the fact that error functions typically ignore relevant optimization criteria such as the robustness of the neural network or its complexity. Network complexity in particular is a pivotal criterion, e.g., in applications with limited hardware resources like autonomous driving. Moreover, reducing the network complexity has a regularizing effect and has proven to reduce overfitting in practice. In this talk, we consider the simultaneous optimization of an error function and a regularizing cost function in a truly biobjective model. We consider several algorithmic strategies, including a multiobjective stochastic gradient descent algorithm and a bisection enhanced dichotomic search (BEDS) approach that aims at identifying potential knees of the Pareto front. The methods are combined with a pruning strategy that is completely integrated in the training process and requires only marginal extra computational cost. This provides a new perspective on automated machine learning, helps to reduce the time-consuming determination of hyperparameters, and thus paves the way for an adaptive decision support tool for preferable network architectures.
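
As an illustration of the flavor of a biobjective training step, the following sketch combines the stochastic gradient of an error objective with the subgradient of an L1 complexity cost into a common descent direction, using the two-objective closed form of the minimum-norm convex combination (MGDA-style). The toy linear model, the fixed step size, and all variable names are assumptions made for the sketch; this is not the talk's algorithm, which additionally integrates pruning and the BEDS search.

```python
import numpy as np

# Toy biobjective SGD sketch: simultaneously decrease an error objective f1
# (mean squared error of a linear model) and a complexity objective f2 = ||w||_1
# by stepping along a common descent direction of both (sub)gradients.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(200, 10)), rng.normal(size=200)
w, step = 0.5 * rng.normal(size=10), 0.01

def common_descent_direction(g1, g2):
    """Minimum-norm point on the segment between the two gradients."""
    diff = g1 - g2
    denom = diff @ diff
    alpha = 0.5 if denom == 0 else float(np.clip((g2 - g1) @ g2 / denom, 0.0, 1.0))
    return alpha * g1 + (1 - alpha) * g2

for _ in range(500):
    batch = rng.choice(200, size=32, replace=False)
    resid = X[batch] @ w - y[batch]
    g_error = 2 * X[batch].T @ resid / len(batch)   # stochastic gradient of f1
    g_complexity = np.sign(w)                       # subgradient of f2
    w -= step * common_descent_direction(g_error, g_complexity)

print("error:", np.mean((X @ w - y) ** 2), "complexity:", np.abs(w).sum())
```

Different runs of such a scheme converge to different Pareto-critical trade-offs between error and complexity; the BEDS approach in the talk instead aims directly at knee regions of the Pareto front.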

ALGO Talk 05 - Exploiting an integer linear programming formulation for the hypervolume subset selection problem

Andreia Guerreiro, INESC-ID, 2 July 2021

Set-quality indicators, which map a set of points to a scalar value, are a convenient way of assessing the quality of a set of solutions in multiobjective optimization. The hypervolume indicator is one of the most commonly used quality indicators. Due to its theoretical properties, it is frequently used within the (environmental) selection step in Evolutionary Multiobjective Optimization Algorithms (EMOAs). In such a case, selection may be viewed as a Hypervolume Subset Selection Problem (HSSP), which consists of selecting a subset of $k$ points out of a set of $n$ points that maximizes the hypervolume indicator. However, already with 3 objectives, HSSP-based selection in EMOAs is usually either restricted to particular cases such as $k=n-1$ or solved with greedy algorithms. Although a few exact algorithms exist to compute the HSSP for $k<n-1$, faster algorithms are required to make its integration into EMOAs practical. In this talk, a new integer linear programming formulation for the HSSP is presented, as well as an algorithm that exploits this formulation to considerably speed up the computation of the HSSP. This algorithm makes HSSP-based selection in EMOAs more practical, at least for 3 objectives.
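
For background on formulation-based approaches, one generic way to cast the HSSP as an integer linear program is a box-decomposition model (given here only for intuition; it is not necessarily the new formulation presented in the talk): decompose the region dominated by all $n$ points $p^1, \dots, p^n$ (with respect to a fixed reference point) into boxes $B_1, \dots, B_m$ with hypervolumes $v_1, \dots, v_m$, let the binary variable $x_i$ indicate that $p^i$ is selected, and let $z_j$ indicate that box $B_j$ is covered by a selected point:

$$
\max \; \sum_{j=1}^{m} v_j z_j \quad \text{s.t.} \quad z_j \leq \sum_{i \,:\, p^i \text{ dominates } B_j} x_i \;\; (j = 1, \dots, m), \qquad \sum_{i=1}^{n} x_i \leq k, \qquad x_i \in \{0,1\}, \; 0 \leq z_j \leq 1.
$$

An optimal solution selects $k$ points whose jointly dominated boxes have maximum total volume, i.e., maximum hypervolume of the selected subset.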

ALGO Talk 04 - Vertex sparsification for edge connectivity

Daniel Vaz, Technical University of Munich, 7 May 2021

Graph compression or sparsification is a fundamental question in information theory and computer science. Given a graph with a special set of vertices called terminals, we ask whether it is possible to find a smaller graph, called a sparsifier, which preserves some property of the graph with respect to the terminals. A major open problem in this area is to find small sparsifiers which preserve cuts between sets of terminals, that is, sparsifiers in which the minimum cut separating any two sets of terminals is the same as in the original graph. In this talk, we look at sparsifiers for the closely related setting of c-connectivity. For a fixed value of c, a c-connectivity sparsifier preserves the number of edge-disjoint paths between any two sets of terminals, unless the number of such paths is at least c in both the original graph and the sparsifier. We show that c-connectivity sparsifiers with $O(kc^4)$ edges exist, where $k$ is the number of terminals, and describe two algorithms for this problem: the first algorithm finds a sparsifier with $O(kc^4)$ edges in time $O(m(c \log n)^{O(c)})$, while the second finds a slightly worse sparsifier with $kO(c)^{2c}$ edges in time $O(m c^{O(c)} \log^{O(1)} n)$. Our results lead to new data structures for dynamic c-connectivity problems, as well as more efficient algorithms for network design problems.
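
To make the preserved quantity concrete, the following brute-force sketch (exponential in the number of terminals, and therefore only for tiny instances) checks whether a candidate graph H agrees with G on the c-capped connectivity between every pair of disjoint terminal sets. It uses networkx max-flow; the function names and the reserved node labels "_s" and "_t" are assumptions of the sketch, and it illustrates only the property being preserved, not the sparsification algorithms of the talk.

```python
import itertools
import networkx as nx

def capped_connectivity(G, A, B, c):
    """Number of edge-disjoint paths between vertex sets A and B, capped at c:
    every undirected edge becomes two unit-capacity arcs, A is merged into a
    super-source and B into a super-sink via uncapacitated (infinite) arcs."""
    D = nx.DiGraph()
    for u, v in G.edges():
        D.add_edge(u, v, capacity=1)
        D.add_edge(v, u, capacity=1)
    D.add_edges_from(("_s", a) for a in A)   # labels "_s"/"_t" assumed unused in G
    D.add_edges_from((b, "_t") for b in B)
    return min(c, int(nx.maximum_flow_value(D, "_s", "_t")))

def preserves_capped_connectivity(G, H, terminals, c):
    """Check that H gives the same c-capped connectivity as G for every pair
    of disjoint, nonempty sets of terminals."""
    for r in range(1, len(terminals)):
        for A in itertools.combinations(terminals, r):
            rest = [t for t in terminals if t not in A]
            for s in range(1, len(rest) + 1):
                for B in itertools.combinations(rest, s):
                    if capped_connectivity(G, A, B, c) != capped_connectivity(H, A, B, c):
                        return False
    return True
```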

ALGO Talk 03 - One-exact ε-Pareto sets

Arne Herzel, University of Kaiserslautern, 23 April 2021

Abstract:

For multiobjective optimization problems, an ε-Pareto set is a set of feasible solutions that contains, for each (efficient) solution, a solution that dominates it up to a factor of 1+ε. It is well known that the polynomial-time solvability of a certain single-objective problem determines the class of multiobjective optimization problems that admit a polynomial-time computable ε-Pareto set. Similarly, in this talk, we characterize the class of problems having a polynomial-time computable ε-Pareto set that optimizes one objective function exactly by the efficient solvability of an appropriate single-objective problem. This class includes important problems such as the multiobjective shortest path and spanning tree problems, and the approximation guarantee we provide is, in general, best possible. Furthermore, for biobjective problems from this class, we provide an adaptive algorithm that computes a one-exact ε-Pareto set of cardinality at most twice the cardinality of a smallest such set.
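
As a small illustration of the object being computed, the sketch below constructs a one-exact ε-Pareto set for an explicitly given finite set of biobjective images (minimization in both objectives). It is a simple greedy sweep that yields a valid, but not necessarily smallest, such set; the setting of the talk is different in that solutions are accessed only through an auxiliary single-objective problem rather than through an explicit list.

```python
def one_exact_eps_pareto(points, eps):
    """Return a subset S of the given biobjective points (minimization) such
    that every point y has some z in S with z[0] <= y[0] (exact in the first
    objective) and z[1] <= (1 + eps) * y[1]."""
    # Filter to nondominated points, sorted by increasing first objective.
    nondom, best2 = [], float("inf")
    for p in sorted(points):
        if p[1] < best2:
            nondom.append(p)
            best2 = p[1]
    # Keep a point only when the last kept point no longer (1+eps)-covers it
    # in the second objective; coverage in the first objective is automatic.
    kept = []
    for p in nondom:
        if not kept or kept[-1][1] > (1 + eps) * p[1]:
            kept.append(p)
    return kept

print(one_exact_eps_pareto([(1, 10), (2, 9.5), (3, 5), (4, 4.9), (6, 1)], eps=0.1))
# -> [(1, 10), (3, 5), (6, 1)]
```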

ALGO Talk 02 - How to Catch Marathon Cheaters: New Approximation Algorithms for Tracking Paths

Pedro Matias, University of California (Irvine), 12 March 2021

Abstract:

Given an undirected graph G and vertices s and t in G, the tracking paths problem is that of finding the smallest subset of vertices in G whose intersection with any s-t path results in a unique sequence. This problem is known to be NP-complete and has applications to animal migration tracking and detecting marathon course-cutting, but its approximability is largely unknown. In this talk, we address this latter issue, giving novel algorithms having approximation ratios of 1+ε, O(lg OPT), and O(lg n) for H-minor-free, general, and weighted graphs, respectively. We also give a linear kernel for H-minor-free graphs.
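
For concreteness, a brute-force checker for the defining property can be written in a few lines with networkx (it enumerates all simple s-t paths and is therefore only usable on small graphs; the function name and the toy instance are assumptions of the sketch, unrelated to the approximation algorithms of the talk):

```python
import networkx as nx

def is_tracking_set(G, s, t, trackers):
    """Check that every simple s-t path in G is identified by the ordered
    sequence of tracker vertices it passes through."""
    signatures = [tuple(v for v in path if v in trackers)
                  for path in nx.all_simple_paths(G, s, t)]
    return len(signatures) == len(set(signatures))

# Toy instance: two parallel s-t paths forming a 4-cycle.
G = nx.Graph([("s", "a"), ("a", "t"), ("s", "b"), ("b", "t")])
print(is_tracking_set(G, "s", "t", set()))   # False: both paths give the empty sequence
print(is_tracking_set(G, "s", "t", {"a"}))   # True: ("a",) vs ()
```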

ALGO Talk 01 - A Weight Set Decomposition Method for the Weighted Tchebycheff Scalarization

Stephan Helfrich, University of Kaiserslautern, 27 October 2020

Abstract:

In general, solving the weighted Tchebycheff scalarization problem is an adequate way to calculate supported as well as unsupported nondominated images of discrete multiobjective optimization problems. For a nondominated image y, the associated weight set component is defined as the set of all weights under which y is optimal for the corresponding weighted Tchebycheff scalarization problem. Hence, the weight set can be decomposed into subsets whose structures allow the design of efficient solution methods and the study of the “robustness of images under preferences”. Weight set decomposition under the weighted sum scalarization has been studied extensively, although there unsupported images cannot be calculated. We present the weight set decomposition for the weighted Tchebycheff scalarization, provide fundamental geometric properties of the weight set components, and propose an adaptation of the algorithm developed in [1, 2] to calculate the full decomposition including all nondominated images. Further, we introduce a Tchebycheff variant of the partial dichotomic search algorithm and show, restricted to tri-objective discrete optimization problems, how it can be utilized to iteratively enlarge weight set components.
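
For readers not familiar with this scalarization: for objective functions $f_1, \dots, f_p$ to be minimized over a feasible set $X$, a reference point $r$ with $r_i < f_i(x)$ for all feasible $x$, and a weight vector $w$ from the normalized weight set, the weighted Tchebycheff scalarization problem reads (notation assumed here, not taken from the talk)

$$
\min_{x \in X} \; \max_{i = 1, \dots, p} \; w_i \bigl( f_i(x) - r_i \bigr), \qquad w \in W = \Bigl\{ w \in \mathbb{R}^p_{\geq 0} : \sum_{i=1}^{p} w_i = 1 \Bigr\}.
$$

The weight set component of a nondominated image $y$ then collects all $w \in W$ for which $y$ is the image of an optimal solution of this problem.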

[1] K. Dächert, K. Klamroth, R. Lacour, and D. Vanderpooten. Efficient computation of the search region in multi-objective optimization. European Journal of Operational Research, 260(3):841–855, 2017.

[2] K. Klamroth, R. Lacour, and D. Vanderpooten. On the representation of the search region in multi-objective optimization. European Journal of Operational Research, 245(3):767–778, 2015.