EVEN-VE: Eyes Visibility Based Egocentric Navigation for Virtual Environments

Journal Title: International Journal of Interactive Multimedia and Artificial Intelligence
First author: M. Raees
Other Authors: S. Ullah
Keywords:
Language: English
Get full text: http://www.ijimai.org/journal/sites/default/files/files/2018/08/ijimai_5_3_15_pdf_15422.pdf
https://www.ijimai.org/journal/node/2559
Resource type: Journal Article
Source: International Journal of Interactive Multimedia and Artificial Intelligence; Vol 5, No 3 (Year 2018).
Publisher: Universidad Internacional de La Rioja
Usage rights: Attribution (BY)
Categories: Physical/Engineering Sciences --> Computer Science, Artificial Intelligence
Abstract: Navigation is one of the 3D interactions frequently needed when interacting with a synthetic world. The latest advancements in image processing have made gesture-based interaction with a virtual world possible. However, the speed with which a 3D virtual world responds to a user's gesture is far greater than the speed of posing the gesture itself. To bring faster and more natural postures into the realm of the Virtual Environment (VE), this paper presents a novel eyes-based interaction technique for navigation and panning. Dynamic wavering and positioning of the eyes are interpreted by the system as interaction instructions. Opening the eyes after keeping them closed for a distinct time threshold activates forward or backward navigation. Supporting 2-degree-of-freedom head gestures (rolling and pitching), panning is performed over the xy-plane. The proposed technique was implemented in a case-study project, EWI (Eyes Wavering based Interaction). In EWI, real-time detection and tracking of the eyes are performed at the back end by the OpenCV libraries, and the trajectory of both eyes is mapped interactively in OpenGL. The technique was evaluated in two separate sessions by a total of 28 users to assess the accuracy, speed, and suitability of the system in Virtual Reality (VR). Using an ordinary camera, an average accuracy of 91% was achieved. Assessment with a high-quality camera showed that the accuracy of the system could be raised further, along with an increase in navigation speed. Results of the unbiased statistical evaluations demonstrate the applicability of the system in the emerging domains of virtual and augmented reality.
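The time-threshold rule described in the abstract (eyes reopening after a sufficiently long closure triggers navigation) can be sketched as a small state machine. This is an illustrative reconstruction, not the authors' implementation: the `BlinkNavigator` class, the 0.5 s threshold, and the per-frame `eyes_open` flag (which in EWI would come from OpenCV eye detection) are all assumptions.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class BlinkNavigator:
    """Hypothetical sketch of the paper's blink-threshold rule:
    a navigation command fires when the eyes reopen after being
    closed for at least `threshold_s` seconds."""
    threshold_s: float = 0.5           # assumed closure threshold, not from the paper
    _closed_since: Optional[float] = None

    def update(self, eyes_open: bool, t: float) -> Optional[str]:
        """Feed one frame's detection result at timestamp t (seconds).

        Returns "navigate" when a qualifying closure ends, else None.
        In a full system, `eyes_open` would be the per-frame output of
        an eye detector (e.g. an OpenCV Haar cascade)."""
        if not eyes_open:
            if self._closed_since is None:
                self._closed_since = t   # closure begins
            return None
        # Eyes are open: check whether a long-enough closure just ended.
        if self._closed_since is not None:
            held = t - self._closed_since
            self._closed_since = None
            if held >= self.threshold_s:
                return "navigate"
        return None


# Simulated frame stream: open, closed for 0.7 s, then open again.
nav = BlinkNavigator()
frames = [(0.0, True), (0.1, False), (0.7, False), (0.8, True), (0.9, True)]
events = [nav.update(is_open, t) for t, is_open in frames]
# events → [None, None, None, "navigate", None]
```

A short closure (below the threshold) produces no event, which keeps ordinary blinks from triggering navigation; forward versus backward selection would need an additional cue not modeled here.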