Work

General caching

Figure: Caching structure, a scheme of the key structure used in the proofs.

This is joint work with Jiří Sgall and the topic of my bachelor thesis, On the Hardness of General Caching.

General caching in the offline setting was shown to be strongly NP-hard only recently, and only under the assumption that pages as large as half of the cache size are allowed. In practical applications, pages are usually assumed to be very small in comparison with the cache size. We prove that the problem is strongly NP-hard already when page sizes are restricted to one, two, or three. This result appeared as the paper General Caching Is Hard: Even with Small Pages in Algorithmica.
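
To make the setting concrete, the sketch below evaluates the cost of a candidate offline solution in one common formalization of general caching: pages have arbitrary sizes and fetch costs, the cache has a fixed capacity, a requested page must be in the cache, and the solution prescribes which pages to evict before each request. The formalization, the function solution_cost, and the tiny instance are my own illustrative assumptions, not taken from the paper.

```python
# A toy cost checker for offline general caching (an illustrative sketch, assuming
# one common formalization: requested pages must be in the cache, every fetch of
# page p costs cost[p], and the cached pages never exceed the capacity C).

def solution_cost(C, size, cost, requests, evictions):
    """
    C:          cache capacity
    size, cost: dicts mapping a page to its size and its fetch cost
    requests:   sequence of requested pages
    evictions:  evictions[i] = set of pages evicted just before serving requests[i]
    Returns the total fetch cost, or None if the capacity is ever exceeded.
    """
    cache, total = set(), 0
    for i, p in enumerate(requests):
        cache -= evictions[i]
        if p not in cache:
            total += cost[p]     # the requested page has to be fetched
            cache.add(p)
        if sum(size[q] for q in cache) > C:
            return None          # cache capacity exceeded
    return total

# Tiny example with page sizes 1, 2, 3 (the sizes for which hardness already holds).
size = {"a": 1, "b": 2, "c": 3}
cost = {"a": 1, "b": 1, "c": 1}
requests = ["a", "b", "c", "a"]
evictions = [set(), set(), {"a", "b"}, {"c"}]
print(solution_cost(4, size, cost, requests, evictions))  # every request faults: 4
```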

The thesis also contains an alternative proof of the characterization of work functions by layers in the case of uniform caching (paging), and this proof is further generalized to the setting where the cache size varies over time. The last chapter of the thesis develops two simple algorithms based on these results.

Multi-Level Aggregation Problem (MLAP)

Figure: MLAP example, a service (subtree containing the root) serving 6 of the 10 pending requests.

In the Multi-Level Aggregation Problem (MLAP), requests arrive at the nodes of an edge-weighted rooted tree T and have to be served eventually. A service is defined as a subtree X of T that contains the root of T. Such a service serves all requests pending at the nodes of X, and its cost equals the total weight of X. In addition, each request incurs a waiting cost between its arrival time and its service time. The objective is to minimize the total waiting cost of all requests plus the total cost of all service subtrees.
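
To make the objective concrete, here is a small sketch that computes the cost of a given offline schedule. It assumes, purely for illustration, a linear waiting cost (service time minus arrival time) and trusts the caller to pass services whose node sets really are subtrees containing the root; the names parent, weight, and schedule_cost are mine.

```python
# A minimal cost evaluator for an offline MLAP schedule (a sketch, not from the paper).
# Assumptions (mine, for illustration): linear waiting cost, and each service given
# as a set of nodes that already forms a subtree containing the root (not checked).

def service_cost(parent, weight, subtree):
    """Total edge weight of a service subtree (edge from each non-root node to its parent)."""
    return sum(weight[v] for v in subtree if parent[v] is not None)

def schedule_cost(parent, weight, requests, services):
    """
    parent:   dict node -> parent node (None for the root)
    weight:   dict node -> weight of the edge to its parent (unused for the root)
    requests: list of (node, arrival_time)
    services: list of (time, subtree) with subtree a set of nodes containing the root
    """
    total = sum(service_cost(parent, weight, sub) for _, sub in services)
    for node, arrival in requests:
        # A request is served by the first service after its arrival that covers its node.
        served_at = min((t for t, sub in services if t >= arrival and node in sub),
                        default=None)
        if served_at is None:
            return float("inf")       # an unserved request makes the schedule infeasible
        total += served_at - arrival  # linear waiting cost
    return total

# Tiny example: root r with children u (edge weight 2) and v (edge weight 3).
parent = {"r": None, "u": "r", "v": "r"}
weight = {"r": 0, "u": 2, "v": 3}
requests = [("u", 0), ("v", 1)]
services = [(1, {"r", "u"}), (4, {"r", "v"})]
print(schedule_cost(parent, weight, requests, services))  # 2 + 3 + 1 + 3 = 9
```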

In our paper Online Algorithms for Multi-Level Aggregation (Marcin Bienkowski, Martin Böhm, Jaroslaw Byrka, Marek Chrobak, Christoph Dürr, Lukáš Folwarczný, Łukasz Jeż, Jiří Sgall, Nguyen Kim Thang, Pavel Veselý), we present the first algorithm for the online variant of the problem that is competitive for an arbitrary (but fixed) depth of the tree T.

The competitive ratio of the algorithm is exponential in the depth of T. Algorithms with polynomial competitive ratio were developed in subsequent works.

IV-matching

Figure: An instance of IV-matching, a layered graph with an IV-matching depicted in red.

IV-matching is a specific generalization of perfect bipartite matching to layered graphs; the complexity of finding an IV-matching was posed as an open problem at the conference ICALP 2014. Dušan Knop and I solved the problem during the summer program REU 2014. A simple proof that the problem is strongly NP-hard, already in the simplest case of four layers, is given in our note.
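
As background for the special case named above, here is a standard augmenting-path routine (Kuhn's algorithm) for perfect bipartite matching; the IV-matching generalization itself and our hardness proof are only in the note, so this snippet merely fixes the terminology. The function name and input format are my own.

```python
# Perfect bipartite matching via augmenting paths (Kuhn's algorithm).
# Illustrative background only; not related to the hardness proof in the note.

def perfect_bipartite_matching(left, right, edges):
    """Return a perfect matching as a dict left -> right, or None if none exists."""
    if len(left) != len(right):
        return None
    adj = {u: [] for u in left}
    for u, v in edges:
        adj[u].append(v)
    match = {}  # right vertex -> matched left vertex

    def try_augment(u, visited):
        # Try to find an augmenting path starting at the left vertex u.
        for v in adj[u]:
            if v not in visited:
                visited.add(v)
                if v not in match or try_augment(match[v], visited):
                    match[v] = u
                    return True
        return False

    for u in left:
        if not try_augment(u, set()):
            return None  # u cannot be matched, so no perfect matching exists
    return {u: v for v, u in match.items()}

print(perfect_bipartite_matching(
    left={"a", "b"}, right={"x", "y"},
    edges=[("a", "x"), ("a", "y"), ("b", "x")]))  # {'a': 'y', 'b': 'x'}
```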