I. Introduction
In Release 8, the LTE system [1], [2] was standardized by 3GPP as the successor of the Universal Mobile Telecommunication System (UMTS). Its evolution continues with LTE-Advanced Release 12 [3], which increases data throughput and spectral efficiency and represents an evolution of the previous technologies UMTS, HSPA, and HSPA+. The LTE radio interface, whose performance has been demonstrated previously [4], has been optimized for mobile networks. LTE offers several further benefits, in particular high throughput and reduced latency [5]. The LTE downlink transmission scheme is based on Orthogonal Frequency Division Multiple Access (OFDMA), in which the wideband frequency-selective channel is converted into many narrowband fading sub-channels [6], as illustrated by the sketch at the end of this section. This scheme allows the system to operate in channel bandwidths from 1.4 MHz to 20 MHz. This bandwidth flexibility, combined with the fine subcarrier granularity, allows LTE to achieve significantly higher data rates than its predecessors (on the order of Gbit/s on the downlink). It also enables the radio parameters to be adapted to the channel through intelligent use of bandwidth and of resources such as adaptive modulation and coding and MIMO technology [7].
Realistic performance evaluation of the LTE standard requires well-adapted simulators. Several simulators have been developed [8], [9]; some of them [10], [11] address the LTE network layer. Our goal in this work is to facilitate the understanding and evaluation of various aspects of the physical layer. The simulator can also assist engineers in manipulating the mathematical building blocks of LTE in order to go from specification to implementation. MATLAB simplifies the specification, allowing easy configuration and the addition of specific algorithms (where the standard permits), and can generate code in languages such as C or VHDL that can be implemented on the eNodeB or the UE, upgrading the equipment to this standard.
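To make the OFDM modulation underlying the OFDMA downlink concrete, a minimal MATLAB sketch of the generation of a single OFDM symbol is given below. The FFT size, number of occupied subcarriers, and cyclic-prefix length are illustrative assumptions and do not reproduce a complete LTE downlink configuration.

% Illustrative generation of one OFDM symbol (not the full LTE downlink chain).
% Assumed parameters, chosen for illustration only: 128-point FFT with
% 72 occupied subcarriers (6 resource blocks) and a short cyclic prefix.
nFFT  = 128;          % FFT size
nUsed = 72;           % occupied subcarriers
nCP   = 9;            % cyclic-prefix length in samples (illustrative)

% Random QPSK symbols on the used subcarriers
bits = randi([0 1], 2*nUsed, 1);
qpsk = ((1 - 2*bits(1:2:end)) + 1j*(1 - 2*bits(2:2:end))) / sqrt(2);

% Map the symbols around DC: negative frequencies at the end of the FFT input
X = zeros(nFFT, 1);
X(2:nUsed/2+1)       = qpsk(nUsed/2+1:end);   % positive-frequency subcarriers
X(end-nUsed/2+1:end) = qpsk(1:nUsed/2);       % negative-frequency subcarriers

% OFDM modulation: the IFFT carries the frequency-domain symbols to the time
% domain, so each subcarrier sees its own narrowband sub-channel
x = ifft(X, nFFT);

% Prepend the cyclic prefix to absorb the channel delay spread
txSymbol = [x(end-nCP+1:end); x];

At the receiver, removing the cyclic prefix and applying an FFT recovers the symbols on each subcarrier, which is what converts the wideband frequency-selective channel into parallel narrowband fading sub-channels.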