I. Introduction
In pursuit of the larger bandwidth required to realize one of the main promises of 5G, i.e., enhanced mobile broadband (eMBB), millimeter wave (mmWave) communication is a key technology due to the abundance of unused spectrum available at mmWave frequencies [1]. However, the high path loss and poor scattering associated with mmWave communication lead to intense shadowing and severe blockage, especially in dense urban environments; these are among the major obstacles to increasing the data rate in such high frequency bands. To tackle these issues, effective beamforming (BF) techniques are required that avoid power leakage toward undesired directions by employing directional transmission patterns, i.e., narrow beams [2].

Furthermore, several experimental studies demonstrate that the mmWave channel usually consists of only a few components (a.k.a. spatial clusters) [3]. It is therefore essential to align the narrow transmission beams with the directions of these channel components. The problem of aligning the beam directions with the angles of departure (AoDs) associated with the channel clusters is termed the beam alignment (BA) problem; in the literature it is also referred to as beam training or beam search. Devising effective beam alignment schemes is essential, since even a slight deviation of the transmitted beam AoDs from the mmWave channel clusters may result in a severe drop in the beamforming gain [4], [5].
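To give a concrete sense of this sensitivity, the following minimal sketch (not drawn from the cited works) evaluates the beamforming gain of a half-wavelength-spaced uniform linear array when the transmit beam is steered slightly away from the true AoD. The array size, the true AoD, and the pointing errors are illustrative assumptions only.

```python
# Illustrative sketch: beamforming-gain loss of a hypothetical half-wavelength
# uniform linear array (ULA) as a function of beam pointing error.
# All numerical values below are assumptions for illustration.
import numpy as np

def ula_steering(num_antennas: int, angle_rad: float) -> np.ndarray:
    """Unit-norm steering vector of a half-wavelength-spaced ULA for a given angle."""
    n = np.arange(num_antennas)
    return np.exp(1j * np.pi * n * np.sin(angle_rad)) / np.sqrt(num_antennas)

def beamforming_gain(num_antennas: int, aod_deg: float, beam_deg: float) -> float:
    """Gain relative to isotropic when the beam points at beam_deg
    but the dominant channel cluster departs at aod_deg."""
    h = ula_steering(num_antennas, np.deg2rad(aod_deg))   # channel direction
    f = ula_steering(num_antennas, np.deg2rad(beam_deg))  # transmit beam
    return num_antennas * np.abs(np.vdot(f, h)) ** 2      # peaks at N when aligned

if __name__ == "__main__":
    N = 64            # assumed number of transmit antennas
    true_aod = 30.0   # assumed AoD of the dominant cluster (degrees)
    for offset in [0.0, 0.5, 1.0, 2.0, 5.0]:
        g = beamforming_gain(N, true_aod, true_aod + offset)
        print(f"pointing error {offset:4.1f} deg -> gain {10*np.log10(g):6.2f} dB")
```

Under these assumptions, a 64-element array loses several dB of gain for a pointing error of only about one degree and essentially the entire array gain for a few degrees of error, which is the behavior that motivates accurate beam alignment.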