M2 (Master's) Internship: Developing Avatar Control with Eye-Tracking in VR (LS2N)
Supervisors:
Rebecca FRIBOURG, Associate Professor at LS2N laboratory, Ecole Centrale de Nantes
- Rebecca.Fribourg@ec-nantes.fr
Jean-Marie NORMAND, Professor at LS2N laboratory, Ecole Centrale de Nantes
- Jean-Marie.Normand@ec-nantes.fr
Geoffrey GORISSE, Associate Professor at LS2N laboratory, Ecole Centrale de Nantes
- geoffrey.gorisse@ensam.eu
Structure:
- City: Nantes (France)
- Category of hosting institution: Research Laboratory
- Host institution: Ecole Centrale de Nantes
Keywords: Virtual Reality, Avatar Embodiment, Eye Tracking
Duration and start date: 6 months
Internship context:
Virtual Reality (VR) immerses users in virtual environments (VEs) while allowing them to interact with their content (e.g., virtual objects or other users). In many applications, users are represented in the VE by a virtual body, also called an avatar, which is animated according to their own movements in real time. This can be achieved with motion capture suits, or with only a few specific sensors (e.g., on the head, hands and feet) whose data feed inverse kinematics algorithms that estimate body movements [1]. However, this assumes that users are able to move, which is not always the case. Indeed, VR applications are becoming increasingly inclusive, which implies being accessible to people with paraplegia or tetraplegia who cannot move their own body. Furthermore, VR applications can also provide medical solutions for limb rehabilitation of stroke patients with limited limb control [2]. While some of these applications only provide visual feedback without any control from the participants (e.g., an animation of a virtual body walking, seen from a first-person point of view [3]), others try to give participants some source of control despite their lack of mobility. Indeed, even a very low amount of control over a virtual body can greatly improve the sense of embodiment towards it [4], with positive effects on rehabilitation and on the overall VR experience. For instance, several works have explored EEG-based brain-computer interfaces to provide control over the virtual body [5]. Yet such systems raise many technical challenges, require user training and are time-consuming to set up. Overall, very few studies explore avatar control schemes that do not involve the physical body.
The aim of this internship is therefore to explore new ways of controlling an avatar in VR that involve little or no movement of the user's physical body. In particular, we aim to explore the use of eye tracking, potentially combined with other inputs (e.g., voice or a single button), to control an avatar in VR.
The goal of the internship will be to develop an immersive environment in Unity 3D and implement different eye-tracking-based control methods for an avatar. The internship can include a user study on healthy participants to compare these control methods and assess how they impact different dimensions of the user experience: the sense of embodiment towards the avatar (i.e., the feeling that the avatar is really the user's body), and in particular the sense of agency (the feeling of being in control of the avatar's movements); ease of use; performance on a specific task; etc. If the results are conclusive, the intern will also have the opportunity to write a research paper with the help of the supervisors.
Adapted equipment will be provided (VR-compatible computer, VR HMD, trackers, etc.).
The internship is open for a duration of 6 months, with a potential continuation as a PhD depending on funding opportunities.
Expected skills:
- Object-oriented programming (C#, or C++)
- Unity 3D
- Autonomy
- Interest in user studies and data analysis (a plus, not required)
References:
[1] D. Roth, J. -L. Lugrin, J. Büser, G. Bente, A. Fuhrmann and M. E. Latoschik, "A simplified inverse kinematic approach for embodied VR applications," 2016 IEEE Virtual Reality (VR), 2016, pp. 275-276, doi: 10.1109/VR.2016.7504760.
[2] J. Bui, J. Luauté and A. Farnè, "Enhancing Upper Limb Rehabilitation of Stroke Patients With Virtual Reality: A Mini Review," Frontiers in Virtual Reality, vol. 2, 595771, 2021, doi: 10.3389/frvir.2021.595771.
[3] J. Saint-Aubert, M. Cogne, I. Bonan, Y. Launey and A. Lecuyer, "Influence of user posture and virtual exercise on impression of locomotion during VR observation," IEEE Transactions on Visualization and Computer Graphics, 2022, doi: 10.1109/TVCG.2022.3161130.
[4] K. Kilteni, R. Groten and M. Slater, "The Sense of Embodiment in Virtual Reality," Presence: Teleoperators and Virtual Environments, vol. 21, no. 4, pp. 373-387, 2012, doi: 10.1162/PRES_a_00124.
[5] T. P. Luu, S. Nakagome, Y. He et al., "Real-time EEG-based brain-computer interface to a virtual avatar enhances cortical involvement in human treadmill walking," Scientific Reports, vol. 7, 8895, 2017, doi: 10.1038/s41598-017-09187-0.