The clutch: two-handed mobile multi-touch 3D object translation and manipulation


Abstract:

Handheld devices such as smartphones now provide users with multi-touch input screens. Displaying interactive, touch-enabled 3D environments on such devices has become popular in applications such as games and virtual reality. Technologies such as Web3D and WebGL have made the creation and display of 3D environments on mobile devices easier than ever. However, object manipulation techniques are not as well developed: moving an object within the 3D environment, or performing similar object-specific manipulations, is neither intuitive nor easy. Current manipulation techniques such as Gizmo, while successful on systems with a mouse and keyboard, were not designed for multi-touch handheld devices and do not work well on them. In this paper, we present a novel technique for 6DOF object manipulation on multi-touch screens. Our performance evaluations show that, compared to existing techniques such as Gizmo, our technique improves task completion time by 63% while increasing task precision by 52%.
Date of Conference: 11 October 2015
Date Added to IEEE Xplore: 21 December 2015
Conference Location: Ottawa, ON, Canada

I. Introduction

Today, mobile 3D multimedia is widely used in a variety of applications such as mobile gaming, medical simulation, multimedia-enhanced learning and working, virtual museums, tourist guides, and other interactive simulations. Many studies have reported that simplicity and usability are important factors in the adoption and acceptance of mobile 3D applications. However, designing simple and effective techniques for 3D interaction on touch-based mobile devices is challenging, and while many research works have studied this issue, some interaction aspects need further research.

In this paper, we introduce a novel method for 3D object translation and manipulation that is especially useful when two-handed mobile touch is the only input modality, as is common today with smartphones and tablets. As an example, consider a player who wants to move a gun inside a mobile 3D game. The player needs a dedicated interaction technique to move the gun within the 3D space. While this is trivial on a gaming console or a device with a mouse and keyboard, it is very challenging on a touch-screen device. We will show in the next section that most existing methods proposed for 3D object translation are not suitable for two-finger multi-touch input on handheld devices. The main reason is that the user's input must take place on a small 2D plane (the display), yet moving and rotating virtual 3D objects requires input with six degrees of freedom (6DOF).
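To make the mismatch between 2D input and 6DOF output concrete, the sketch below shows one generic way a two-finger gesture can be mapped onto a subset of the six degrees of freedom. This is an illustrative mapping, not the technique proposed in this paper; the function name, the logarithmic pinch-to-depth mapping, and the `depth_gain` parameter are all assumptions for the example.

```python
import math

def touch_to_3d_delta(a0, b0, a1, b1, depth_gain=1.0):
    """Map the motion of two touch points (a0, b0) -> (a1, b1), given as
    (x, y) screen coordinates, to a translation (dx, dy, dz) and a roll
    angle about the view axis:
      - centroid motion  -> translation parallel to the screen
      - pinch (distance) -> translation along the view axis (depth)
      - two-finger twist -> rotation about the view axis
    """
    # Translation: how far the midpoint between the two fingers moved.
    dx = (a1[0] + b1[0]) / 2 - (a0[0] + b0[0]) / 2
    dy = (a1[1] + b1[1]) / 2 - (a0[1] + b0[1]) / 2

    # Depth: log of the pinch ratio, so pinching out and pinching in
    # by the same factor produce symmetric depth changes.
    d0 = math.dist(a0, b0)
    d1 = math.dist(a1, b1)
    dz = depth_gain * math.log(d1 / d0)

    # Roll: change in the angle of the line joining the two fingers.
    ang0 = math.atan2(b0[1] - a0[1], b0[0] - a0[0])
    ang1 = math.atan2(b1[1] - a1[1], b1[0] - a1[0])
    droll = ang1 - ang0

    return dx, dy, dz, droll
```

Note that even this mapping covers only four of the six degrees of freedom (x, y, z translation plus roll); pitch and yaw require an additional gesture or mode, which is precisely the design problem the techniques discussed in this paper address.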


