I. Introduction
A common technique for computing reduced lattice bases is the LLL algorithm, proposed by Lenstra, Lenstra and Lovász in 1982 [1]. Recent research on modifications of the LLL algorithm has focused on reducing its complexity and improving its numerical stability for bases of high dimensionality, e.g., by means of parallel lattice basis reduction [2], segmented lattice reduction, or random sampling methods; an overview can be found in [3]. In practice, however, implementing these algorithms poses several problems, since the signal flow is governed by a column exchange condition that depends on the input matrix. Consequently, run-time and complexity are not known in advance.
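The input dependence of the signal flow can be seen in a textbook-style sketch of the LLL algorithm (plain-float Python, illustrative only, with the classical parameter delta = 3/4; this is not the implementation discussed in the cited works): the branch on the Lovász condition decides at run-time between advancing the working index and exchanging two basis columns, so the number of iterations varies with the input basis.

```python
def dot(u, v):
    """Inner product of two vectors given as lists of floats."""
    return sum(x * y for x, y in zip(u, v))

def gram_schmidt(B):
    """Return the Gram-Schmidt orthogonalization B* of the basis B and the
    coefficients mu[i][j] = <b_i, b*_j> / <b*_j, b*_j>."""
    n = len(B)
    Bs = [list(b) for b in B]
    mu = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i):
            mu[i][j] = dot(B[i], Bs[j]) / dot(Bs[j], Bs[j])
            Bs[i] = [x - mu[i][j] * y for x, y in zip(Bs[i], Bs[j])]
    return Bs, mu

def lll(B, delta=0.75):
    """LLL-reduce the basis B (rows are basis vectors).

    Illustrative sketch: the Gram-Schmidt data is recomputed from scratch
    in every iteration for clarity, which is wasteful but correct."""
    B = [list(map(float, b)) for b in B]
    n = len(B)
    k = 1
    while k < n:
        Bs, mu = gram_schmidt(B)
        # Size reduction of b_k against the preceding vectors.
        for j in range(k - 1, -1, -1):
            q = round(mu[k][j])
            if q != 0:
                B[k] = [x - q * y for x, y in zip(B[k], B[j])]
        Bs, mu = gram_schmidt(B)
        # Lovász condition: this data-dependent branch is the column
        # exchange condition that makes the run-time input-dependent.
        if dot(Bs[k], Bs[k]) >= (delta - mu[k][k - 1] ** 2) * dot(Bs[k - 1], Bs[k - 1]):
            k += 1
        else:
            B[k], B[k - 1] = B[k - 1], B[k]
            k = max(k - 1, 1)
    return B
```

Because the swap branch can decrement the index, the loop count, and hence the run-time, cannot be bounded tightly without knowledge of the input matrix, which is precisely the implementation difficulty noted above.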