I. Introduction
Suppose that we have the following box-constrained linear model: \begin{align*} {\boldsymbol {y}}=&\boldsymbol {A} {\hat { {\boldsymbol {x}}}} + \boldsymbol {v}, \quad \boldsymbol {v} \sim \mathcal {N}(\boldsymbol {0},\sigma ^{2} \boldsymbol {I}), \tag{1}\\ {\hat { {\boldsymbol {x}}}}\in \mathcal {B}\equiv&\{ {\boldsymbol {x}}\in \mathbb {Z}^{n}: \boldsymbol {\ell }\leq {\boldsymbol {x}}\leq \boldsymbol {u},\; \boldsymbol {\ell }, \boldsymbol {u}\in \mathbb {Z}^{n}, \boldsymbol {\ell }< \boldsymbol {u}\}, \tag{2}\end{align*}
where $\boldsymbol {y}\in \mathbb {R}^{m}$ is an observation vector, $\boldsymbol {A}\in \mathbb {R}^{m\times n}$ is a deterministic full column rank model matrix, ${\hat { {\boldsymbol {x}}}}\in \mathcal {B}$ is an integer parameter vector, and $\boldsymbol {v}\in \mathbb {R}^{m}$ is a noise vector following the Gaussian distribution $\mathcal {N}(\boldsymbol {0},\sigma ^{2}\boldsymbol {I})$ with given $\sigma ^{2}$.
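To make the model concrete, the following is a minimal sketch of how an instance of (1)–(2) could be generated numerically. The dimensions $m$, $n$, the bounds $\boldsymbol{\ell}$, $\boldsymbol{u}$, and the noise level $\sigma$ are illustrative values chosen here, not taken from the paper, and the random Gaussian $\boldsymbol{A}$ stands in for an arbitrary full column rank matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions and box bounds (assumptions, not from the paper)
m, n = 8, 4
ell = np.full(n, -3)      # lower bound vector l, with l < u componentwise
u = np.full(n, 3)         # upper bound vector u
sigma = 0.1               # noise standard deviation

# A random Gaussian matrix is full column rank with probability one,
# standing in for the deterministic model matrix A in (1)
A = rng.standard_normal((m, n))

# Integer parameter vector x_hat drawn from the box B = {x in Z^n : l <= x <= u}
x_hat = rng.integers(ell, u + 1)

# Observation vector y = A x_hat + v with v ~ N(0, sigma^2 I), as in (1)
v = sigma * rng.standard_normal(m)
y = A @ x_hat + v
```

A recovery method for this model would take $\boldsymbol{y}$, $\boldsymbol{A}$, $\boldsymbol{\ell}$, $\boldsymbol{u}$ as input and estimate the unknown integer vector ${\hat{\boldsymbol{x}}}$.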