1 Introduction and Overview
When developing for mobile platforms, constraints such as computational power and battery life often impose significant bottlenecks. Mobile Virtual Reality (VR) and Augmented Reality (AR) headsets are a particularly challenging area: visual rendering and 6DOF tracking must be performed rapidly and consistently to provide a comfortable user experience, often placing hard constraints on the fidelity and complexity of content developed for such devices. Gaze tracking is poised to become a core technology in VR and AR devices, as it not only enables interaction via gaze, leading to a more immersive experience, but can also be leveraged to render significantly more efficiently via techniques such as foveated rendering [1].