I. Introduction
We consider the transmission of the information generated by a stationary ergodic binary source $S$ with memory over an AWGN channel. The source is assumed to follow either a Markov Chain (MC) or a Hidden Markov Model (HMM). We denote by $\mathcal{H}(S)$ its entropy rate, i.e.,
$$\mathcal{H}(S) \triangleq \lim_{n\rightarrow\infty}\frac{1}{n}H(U_{1},\ldots,U_{n}) < H(U), \eqno{(1)}$$
where $H(U)$ is the entropy per single letter. The standard approach to the reliable transmission of the information generated by $S$ over a noisy channel has been to separate the encoding process into two parts: first, a source encoder capable of compressing $S$ up to its theoretical limit, which is given by its entropy rate [9], and second, a capacity-achieving channel code. Consequently, the lower limit on the average energy per real dimension $E_{c}^{\ast}$ for reliable transmission of the information generated by $S$ is given by
$$\mathcal{H}(S)R_{c} = \frac{1}{2}\log_{2}\left(1+\frac{2E_{c}^{\ast}}{N_{0}}\right) \;\Leftrightarrow\; \frac{E_{c}^{\ast}}{N_{0}} = \frac{2^{2R_{c}\mathcal{H}(S)}-1}{2}, \eqno{(2)}$$
where $N_{0}$ is the one-sided noise power spectral density of the additive Gaussian noise, and $R_{c}$ is the code rate (source symbols per channel use).
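As a numerical illustration of the limit in (2), the sketch below computes the entropy rate of a two-state binary Markov source and the corresponding minimal $E_{c}^{\ast}/N_{0}$. The transition probabilities `p01`, `p10` and the function names are illustrative choices, not taken from the paper; the formulas are the standard stationary-distribution and binary-entropy expressions for a two-state chain together with Eq. (2).

```python
import math

def binary_entropy(p):
    # h(p) = -p log2 p - (1-p) log2 (1-p), with h(0) = h(1) = 0
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def markov_entropy_rate(p01, p10):
    # Entropy rate H(S) of a two-state binary Markov chain with
    # transition probabilities P(1|0) = p01 and P(0|1) = p10.
    # Stationary distribution: pi0 = p10/(p01+p10), pi1 = p01/(p01+p10).
    pi0 = p10 / (p01 + p10)
    pi1 = p01 / (p01 + p10)
    return pi0 * binary_entropy(p01) + pi1 * binary_entropy(p10)

def shannon_limit_snr(entropy_rate, Rc):
    # Eq. (2): Ec*/N0 = (2^(2 Rc H(S)) - 1) / 2
    return (2.0 ** (2.0 * Rc * entropy_rate) - 1.0) / 2.0

# Example: a symmetric chain with p01 = p10 = 0.1, code rate Rc = 1.
H = markov_entropy_rate(0.1, 0.1)   # H(S) = h(0.1) < 1, since the source has memory
snr = shannon_limit_snr(H, 1.0)     # minimal Ec*/N0, lower than for a memoryless fair source
```

For a memoryless uniform source ($\mathcal{H}(S)=1$) the limit reduces to the usual $(2^{2R_c}-1)/2$; the memory of the source lowers $\mathcal{H}(S)$ and hence the required $E_{c}^{\ast}/N_{0}$.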