INFORMATION ABOUT PROJECT,
SUPPORTED BY RUSSIAN SCIENCE FOUNDATION

The information is prepared on the basis of data from the RSF information-analytical system; the substantive content is presented as provided by the authors. All rights belong to the authors; use or reprinting of the materials is permitted only with the authors' prior consent.

 

COMMON PART


Project Number 20-78-10135

Project title Eye-tracking in immersive virtual reality: personalization of learning trajectories in chemistry education

Project Lead Kurushkin Mikhail

Affiliation ITMO University

Implementation period 07.2020 - 06.2022

Research area 08 - HUMANITIES AND SOCIAL SCIENCES, 08-604 - Theory and methodology of innovative processes in education, experimentation and innovation in education

Keywords Eye-tracking, cognitive abilities, virtual reality, chemical education


 

PROJECT CONTENT


Annotation
Since 2018, the Russian Federation has been implementing the national program "Digital Economy of the Russian Federation", aimed at the widespread adoption of end-to-end technologies, including neurotechnologies, artificial intelligence and VR/AR technologies. As the number of digital services that students interact with in the educational environment grows, so does the cognitive load, which hinders effective mastering of the material. Comprehensive assessment of the student's functional state, stress reduction and analysis of the student's skill level are therefore key factors that must be fully taken into account when building an immersive virtual learning environment.

The quantitative assessment system adopted in the modern Russian school does not always fulfill its main functions: informational, controlling and regulatory. An immersive educational environment, in turn, has high potential for implementing the informational-control and regulatory functions while minimally distracting the student with assessment procedures, providing an integrated approach to evaluating learning outcomes and forming the educational route with regard to the dynamics of individual achievements and the students' biological signals. Analysis of students' biological signals makes it possible to obtain the necessary information, create a comfortable environment for the student and build a learning path adapted to their current functional state. The eye-movement signal is unique in that it simultaneously carries three types of information about the student: information about their personality, their skills and their functional state. Exploiting it will require new models, methods and algorithms for identifying the functional state and recording acquired skills in an immersive virtual environment on the basis of eye-movement analysis.

The proposed project is thus aimed at solving the scientific problem of developing and scientifically substantiating a mechanism for managing the learning process in a virtual immersive educational environment that takes into account students' individual characteristics and achievements, in order to satisfy the individual's internal need for information adequate to their capabilities. The scientific novelty of the research lies in the fact that, within the project:
— new approaches to assessment will be created, in which the assessment of the achievement of the planned results serves as feedback and as a regulatory (control) element of the system;
— new models, methods and algorithms for identifying the student's functional state and skills in a virtual immersive educational environment will be developed, which will require extending existing results in eye-movement analysis to the case of three-dimensional space;
— for the first time, VR and eye-tracking technologies will be used simultaneously, with a synergistic effect, to build virtual immersive educational environments and scientifically validate their effectiveness.
The relevance of solving this problem is as follows: the expected increase in the effectiveness of digital educational technologies and in students' motivation, the socio-technical nature of the technology being developed, which is aimed at individualizing the educational process, and the possibility of distance learning will contribute to the growing competitiveness of educational institutions both within the country and abroad.

Expected results
The implementation of the project involves obtaining a set of scientific results:
— a methodology for personalized monitoring of the process of teaching chemistry in immersive virtual reality based on eye-tracking technology;
— an innovative methodology for assessing the results of teaching chemistry in a virtual educational environment that meets the requirements of individualization, health preservation, absence of stress and unobtrusiveness with respect to the learning process;
— a technology for forming a personalized trajectory of chemistry education in a virtual educational environment that takes into account the learner's skills and cognitive abilities.

The scientific significance of the expected results is as follows. The project will create new approaches to assessment in virtual reality, in which the assessment of the achievement of the planned results serves as feedback and as a regulatory (control) element of the system. These approaches will, in turn, contribute to the understanding of assessment procedures for learning in virtual reality currently available in the pedagogical sciences. In the course of the project, new models, methods and algorithms for identifying the student's functional state and skills in a virtual immersive educational environment will be developed, extending the applicability of existing results in eye-movement analysis to three-dimensional space.

The correspondence of the planned results to the world level is ensured by a number of factors, including:
— the high multidisciplinary scientific qualification of the project lead, who holds a PhD in chemical sciences and the title of International Engineering Teacher (Ing.Paed.IGIP) and is an internationally recognized specialist in chemical education with regular publications in the leading journal of the field, the Journal of Chemical Education;
— the interdisciplinary scientific background of the team leader and the performers at the intersection of the chemical, pedagogical, computer and cognitive sciences, confirmed by publications;
— the correspondence of the proposed methods and approaches to world-class research in cognitive and information educational technologies, in particular in digital and virtual learning.

Possible practical use of the expected results in the social sphere: the results are promising for creating high-technology, health-preserving training courses in immersive virtual reality. Practical use of the technology for personalized monitoring of the learning process will streamline the ways, forms and speed of presenting educational information in accordance with the learner's knowledge at each stage of the lesson, and the algorithm for creating an individualized trajectory of chemistry education in a virtual educational environment will significantly improve the quality of education by matching the training load to the student's intellectual abilities while monitoring physiological functions.

Possible practical use of the expected results in the economy: education and corporate training is a priority industry for the application of VR/AR technologies and sub-technologies, important for social development and economic growth (roadmap for the development of the "cross-cutting" digital technology "Virtual and Augmented Reality Technologies", Moscow, 2019).
The introduction of VR/AR, together with neurotechnologies and artificial intelligence, into the educational segment will provide affordable tools for users and complement training programs with interactive visual VR/AR content amounting to up to 30% of all educational materials (with priority given to subject areas that cannot be reproduced in traditional formats). This can lead to the following effects: improved effectiveness of online learning; provision of continuing professional education; availability of quality education in the regions. With the development of a marketplace for educational projects, Russian companies could capture 15% of the global market for VR education.


 

REPORTS


Annotation of the results obtained in 2021
The main goal of the project was to develop personalized educational trajectories in chemistry education. In this study the following data were measured: general anxiety (questionnaire), chemistry anxiety (questionnaire), spatial ability (questionnaire); heart rate and stress level; frequency of gaze movement; emotional state. The work consisted of two parts: a hardware-software part and a pedagogical experiment.

HARDWARE AND SOFTWARE PART
1) The emotion-tracking and pupil-size-tracking functions were set up.
2) Code was written to convert the "raw" facial-expression data received from the facial tracker into a format more convenient and understandable for the user.
3) An algorithm for integrating heart-rate data with VR was developed.
4) The gaze-processing class was rewritten.
5) The processing of gaze direction was changed.
6) A new menu was introduced for both the participant and the operator.
7) A neural network model for classifying 8 emotions was created and trained using ML.NET Model Builder:
   a. facial expressions of 41 people were collected;
   b. a neural network model was created and trained on the collected data;
   c. the trained model was embedded in the application;
   d. logging of predicted emotions was added.
8) Facial expressions were collected using SRanipal:
   a. logging of the data received from the eye tracker (EyeTracker) was implemented;
   b. an SRanipal SDK bug was fixed due to which changes in pupil diameter were not registered;
   c. logging of the data received from the face tracker (FaceTracker) was implemented.

The research project was implemented in the NarupaXR program. Because NarupaXR runs simulations that have to be prepared in another program (Narupa Builder), a companion application to NarupaXR was created that allows molecules to be created, designed and edited in virtual reality.

PEDAGOGICAL EXPERIMENT
The emotion-recognition function was set up. The emotional state of the study participants was analyzed with the help of the FaceTracker (version 2.0) face tracker for virtual reality systems. Previously this tracker was intended for entertainment purposes only; the code developed within this project adapts emotion recognition to learning goals. FaceTracker recognizes 38 facial expressions, of which 26 correspond to basic emotions, but a person does not experience all of these emotions during learning. The emotions most common during learning were identified: neutral, surprise, fear, happiness, contempt, boredom, interest and shame. It was also found that positive emotions such as pleasure, hope and pride were positively associated with student effort, self-regulation and more complex learning strategies, while anger, shame, anxiety and boredom were associated with lower academic achievement and greater external regulation. For this purpose, a neural network model for classifying the 8 emotions was created and trained using ML.NET Model Builder (see the sketch below). In addition, the results of the general anxiety test, the chemistry anxiety test and the mental rotation test were analyzed to identify correlations and, in the future, to use these data to personalize educational trajectories in chemistry education. 83 students of secondary schools in St. Petersburg took part in the study within the framework of the project: 42 people in the first year and 41 in the second. To identify a conceptual behavioral model of the student in a virtual immersive educational environment, different conditions were created for the two experimental groups.
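As an illustration of how the trained classifier described above could be embedded in the Unity application, the following minimal C# sketch loads a model exported by ML.NET Model Builder and predicts an emotion for one frame of facial-expression weights. The class names, property names, feature count and model path are hypothetical assumptions introduced for illustration; only the ML.NET API calls (MLContext, Model.Load, CreatePredictionEngine) are the library's own.

```csharp
using Microsoft.ML;
using Microsoft.ML.Data;

// Hypothetical input schema: one frame of facial-expression weights from the face tracker
// (FaceTracker reports 38 expressions, so a 38-element feature vector is assumed here).
public class FacialExpressionFrame
{
    [VectorType(38)]
    public float[] Features { get; set; }
}

// Hypothetical output schema: one of the 8 emotions used in the project.
public class EmotionPrediction
{
    [ColumnName("PredictedLabel")]
    public string Emotion { get; set; }

    public float[] Score { get; set; }
}

public static class EmotionClassifier
{
    private static readonly MLContext MlContext = new MLContext();
    private static PredictionEngine<FacialExpressionFrame, EmotionPrediction> _engine;

    // Load the model file exported by ML.NET Model Builder (path is illustrative).
    public static void Load(string modelPath = "EmotionModel.zip")
    {
        ITransformer model = MlContext.Model.Load(modelPath, out _);
        _engine = MlContext.Model.CreatePredictionEngine<FacialExpressionFrame, EmotionPrediction>(model);
    }

    // Classify one frame of expression weights and return the predicted emotion label.
    public static string Predict(float[] expressionWeights)
    {
        var prediction = _engine.Predict(new FacialExpressionFrame { Features = expressionWeights });
        return prediction.Emotion;   // the caller can log this value, as done in the project
    }
}
```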
The first group was tested for knowledge of chemical molecules, based on the Corey-Pauling-Koltun color scheme, and for residual knowledge of chemistry. After conducting the study in VR for the first group, several important dynamic trends were identified:
• change in stress level before and after the study (on average: falls, rises, or remains unchanged);
• change in heart rate before and after the study (on average: falls, rises, or remains unchanged);
• change in anxiety before and after the experiment (test) (falls, rises, or remains unchanged);
• measurement of spatial abilities (high, medium, low);
• measurement of the prevailing emotion while completing the task.
At the next stage, a relationship was established between the leading emotion and the number of correct answers, as well as the frequency of eye movement: the higher the pulse, the more correct answers the research participant gave (happiness, fear), and the lower the pulse, the fewer correct answers. The next stage of the work was a study in which the duration and complexity of presentation of the pedagogical tasks of the lesson scenario were corrected according to the results obtained with the developed technology. Here we studied the dynamics of the stress level before and after completing tasks in VR. Stress levels were observed to drop with time spent completing tasks in VR, which suggests that conducting classes in a virtual environment is a favorable circumstance. Further, it was examined whether spatial abilities serve as a predictor of success. The spatial abilities involved in solving spatial problems in the "classical" form and in virtual reality are very close, which means that conventional spatial tests can be used to predict the success of solving spatial problems in VR. The VR research shows a high predictive power of spatial tests for the success of chemistry tasks in VR, which means that spatial ability tests can be used to diagnose future chemistry performance, and spatial ability training can be used to develop chemistry performance. With the help of an electroencephalograph, the following conclusions were obtained:
• the distribution of EEG rhythm power across the brain during the solution of USE (Unified State Exam) tasks depends on ability in the chemical sciences;
• certain brain areas differ in their rhythms during the solution of chemistry problems between students in the control group and the group with ability in chemistry; these features are reflected in the left frontal region (leads FC5, F3);
• there are significant differences between the two groups of subjects in the dynamics of the power of different EEG frequency ranges during the solution of chemistry problems; these differences are expressed in an increase in orderliness at the end of problem solving in students with pronounced chemical abilities, as well as in a decrease in the high-frequency components of the rhythms.
Based on all the data obtained, software implementing all the developed models, methods and algorithms was created, and the developed virtual immersive educational environment was tested. The software developed within this project relies on the functional state, cognitive fatigue, stress level, heart rate and frequency of eye movement.
A specially designed module collects data from the fitness tracker, eye tracker, face tracker and electroencephalograph; a neural network analyzes the data, sorts it into numerical groups and issues adaptive tasks. All tasks start from the basic difficulty level (10 molecules, 2nd difficulty group). There are 4 difficulty groups in total, and each group contains 12 tasks. All data from the fitness tracker, eye tracker, face tracker and electroencephalograph are divided into 3-5 groups based on the obtained numerical values. A sketch of this adaptive task selection logic is given below.
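The following C# sketch illustrates one possible form of the adaptive task selection described above, with 4 difficulty groups of 12 tasks each and a starting point at the basic level (group 2). The field names, thresholds and adjustment rule are illustrative assumptions; the project's actual neural-network-based grouping is not reproduced here.

```csharp
using System;

// A minimal sketch of adaptive task selection, assuming binned sensor data (1..5 groups)
// and the prevailing emotion as inputs. Thresholds and rules are illustrative only.
public enum Emotion { Neutral, Surprise, Fear, Happiness, Contempt, Boredom, Interest, Shame }

public class LearnerState
{
    public int StressGroup;          // e.g. 1..5, binned from fitness-tracker data
    public int GazeFrequencyGroup;   // e.g. 1..5, binned gaze-movement frequency
    public Emotion DominantEmotion;  // predicted by the emotion classifier
    public bool LastAnswerCorrect;
}

public class AdaptiveTaskSelector
{
    private const int DifficultyGroups = 4;   // 4 difficulty groups in total
    private const int TasksPerGroup = 12;     // each group has 12 tasks
    private int _currentGroup = 2;            // all sessions start from the basic level (group 2)

    // Pick the next task: adjust the difficulty group, then choose a task index within it.
    public (int group, int task) NextTask(LearnerState state, Random rng)
    {
        // Illustrative rule: raise difficulty after a correct answer in a calm state,
        // lower it when stress is high or boredom/shame dominates.
        if (state.LastAnswerCorrect && state.StressGroup <= 2)
            _currentGroup = Math.Min(DifficultyGroups, _currentGroup + 1);
        else if (state.StressGroup >= 4 || state.DominantEmotion == Emotion.Boredom
                 || state.DominantEmotion == Emotion.Shame)
            _currentGroup = Math.Max(1, _currentGroup - 1);

        return (_currentGroup, rng.Next(TasksPerGroup));
    }
}
```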

 

Publications

1. E.A. Korsakova, O.A. Sokolovskaya, D.A. Minakova, Yu.Yu. Gavronskaya, N.I. Maksimenko, M.V. Kurushkin. Chemist Bot as a Helpful Personal Online Training Tool for the Final Chemistry Examination. Journal of Chemical Education, 99, 2, 1110–1117 (year - 2021). https://doi.org/10.1021/acs.jchemed.1c00789

2. J.S. Khukalenko, P.S. Bazhina, D.I. Zemtsov. Immersive technologies in school education: based on the results of the All-Russian Testing Program. Perspektivy Nauki i Obrazovania, - (year - 2022)

3. M.V. Likhanov, E.S. Maslennikova, G. Costantini, A.V. Budakova, E.A. Esipenko, V.I. Ismatullina, Y.V. Kovas. This is the way: network perspective on targets for spatial ability development programmes. British Journal of Educational Psychology, - (year - 2022)

4. P.S. Pereshivkina, N.A. Karandasheva, M.D. Mikhaylenko, M.V. Kurushkin. Immersive Molecular Dynamics in Virtual Reality: Increasing Efficiency of Educational Process with Companion Converter for NarupaXR. Journal of Imaging, 7(6), 97 (year - 2021). https://doi.org/10.3390/jimaging7060097

5. M.D. Mikhaylenko, N.I. Maksimenko, M.V. Kurushkin. Eye-Tracking in Immersive Virtual Reality for Education: A Review of the Current Progress and Applications. Frontiers in Education, Article number 697032 (year - 2022). https://doi.org/10.3389/feduc.2022.697032

6. A.I. Markovnikova, M.D. Mikhaylenko, M.V. Kurushkin. Teaching Based Personalized Monitoring Chemistry In Immersive Virtual Reality: Eye-tracking. Proceedings of 2022 8th International Conference of the Immersive Learning Research Network, iLRN 2022, - (year - 2022)

7. I.I. Kuzminov, M.D. Mikhaylenko, M.V. Likhanov, M.V. Kurushkin. Identifying Eyes Patterns Using Eye Tracking for Improving Virtual Chemistry Education. Proceedings of 2022 8th International Conference of the Immersive Learning Research Network, iLRN 2022, - (year - 2022)


Annotation of the results obtained in 2020
To identify and mark information-significant objects of virtual reality in the context of the dynamics of the scenario tasks, the research groups were provided with the necessary workstations (including HTC VIVE Pro Full Kit virtual reality systems), which included the computer systems and equipment located at the experimental sites. To record the duration of gaze fixation points coinciding with virtual reality objects, and to track the sequence of gaze transitions between virtual reality objects, software implemented as a project in the Unity development environment was used. Software development was divided into two parts:
1) the server part, a module intended for reading files with task descriptions (in PDB or another format), configuring the display and the molecular dynamics parameters, managing the launch of the tasks themselves and logging information about the launched tasks;
2) the client part, a module in which the eye-tracking algorithm and the logging of its data were implemented.
Since the narupa-protocol provides several options for processing molecules, the option based on the MDAnalysis toolkit (https://www.mdanalysis.org/) was chosen for the server side. This option makes it possible to read a large number of molecular file formats, has a user-friendly interface and, most importantly, does not discard information about atomic bonds. For convenience and to speed up the selection and launch of task files, a desktop application was developed. The client side was implemented in the C# programming language on top of the Narupa iMD toolkit (https://gitlab.com/intangiblerealities/narupa-applications/narupa-imd). This toolkit is a client that displays data received from the Narupa server in a VR device; it is implemented using the Unity framework (version 2019.3.7). The SRanipal SDK was installed directly in the project to work with eye tracking, since the original version of the project contained no eye-tracking implementation. The main function used is the one that returns a ray in the direction of the person's gaze. With this function it became possible to characterize the interaction of a person's gaze with objects of the virtual environment by observing intersections of the gaze ray with those objects (a sketch of such gaze-ray logging is given after this paragraph). In order to record the intersections of the gaze in the VR headset with atoms and to obtain information about the trajectory of a person's gaze (that is, to log it), a module combining the Narupa software and the SRanipal SDK for interacting with molecules was developed. To carry out the study on the registration of eye movements in VR, tasks of various levels of complexity were developed; the essence of the tasks was to identify molecules in the program. The created task prototypes were written as molecule scripts in PDB format so that the simulation could be reproduced on the Narupa iMD server. In the created tasks the following molecular parameters were varied: color, molecular dynamics and size. Each chemical element in the created tasks had its own unique color in accordance with the color code. To carry out the tasks loaded into the software and hardware, volunteers who had previously taken classical tests were invited.
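A minimal Unity/C# sketch of such gaze-ray logging is shown below. It follows the combined gaze-ray call used in the SRanipal Unity SDK samples (the exact signature may differ between SDK versions), and the log path and CSV format are illustrative assumptions rather than the project's actual implementation.

```csharp
using System.IO;
using UnityEngine;
using ViveSR.anipal.Eye;   // SRanipal Unity SDK (eye tracking)

// Logs which scene object (atom/molecule collider) the gaze ray intersects on each frame.
public class GazeLogger : MonoBehaviour
{
    [SerializeField] private Camera vrCamera;                   // the HMD camera
    [SerializeField] private string logPath = "gaze_log.csv";   // illustrative path

    private void Update()
    {
        // Ask SRanipal for the combined gaze ray in local (camera) space.
        if (!SRanipal_Eye.GetGazeRay(GazeIndex.COMBINE, out Vector3 origin, out Vector3 direction))
            return;

        // Convert to world space and test what the gaze ray hits.
        Vector3 worldOrigin = vrCamera.transform.TransformPoint(origin);
        Vector3 worldDirection = vrCamera.transform.TransformDirection(direction);

        if (Physics.Raycast(worldOrigin, worldDirection, out RaycastHit hit, 20f))
        {
            // Append timestamp and the name of the intersected object.
            // (Per-frame file writes are fine for a sketch; a buffered writer would be used in practice.)
            File.AppendAllText(logPath, $"{Time.time:F3},{hit.collider.name}\n");
        }
    }
}
```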
As the primary results, numerical values were obtained for nine indicators: the speed of gaze movement between objects, time, the number of colors, the number of atoms, the number of molecules, the volunteers' school marks, the number of correct answers, test results and the frequency of gaze movement. Next, a module for parsing, aggregating and analyzing the received data was implemented. The implemented aggregating functions made it possible to determine parameters such as the average number of gaze movements when performing tasks, the average number of gaze movements per second (both from atom to atom and from molecule to molecule), the duration of the gaze on an object and the duration of the task, the number of different atoms, molecules and colors in the assignment, and the number of views of specific chemical elements (a sketch of such aggregation is given below). Using the obtained software module, graphs were built and analyzed. Graphical confirmation was obtained that the volunteers' scores in the two tests carried out, the classical one and the one using virtual reality, correspond to and resemble each other. The experiments in virtual reality combined, for the first time, existing software and technologies in order to create an adaptive test using eye-tracking technology in VR. Based on the data obtained in the field of cognitive research, it was concluded that for a quick and effective assessment of knowledge in VR it is important to take into account the following parameters: the frequency of gaze movement; the number of colors used to encode chemical elements in the task; the complexity and number of molecules. The results obtained confirmed the prospects and feasibility of adaptive testing, a testing technology in which each next question is selected automatically, based on the answers to previous questions, at a predetermined level of difficulty. The main difference between adaptive testing and classical tests is that the list of questions asked of the test taker is determined dynamically (in real time) rather than statically: the choice of the next question is driven by the personal characteristics of each individual learner, not by general rules "for everyone". For further development of the technique it is necessary to combine high-level programming languages with eye tracking and database management. One of the future tasks will be an automated adaptive testing process in which the program orients itself to the frequency of gaze movement. The source code of the project is available in the ITMO University repository at: https://digital-code.itmo.ru/emotion-ai/htc-vive-pro-eye.
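To make the aggregation step concrete, the following C# sketch computes a few of the parameters listed above (task duration, gaze movements per second, and per-object dwell time) from a CSV log in the hypothetical "timestamp,objectName" format used in the gaze-logging sketch earlier. It is an illustrative reconstruction under those assumptions, not the project's actual analysis module.

```csharp
using System;
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.Linq;

// Illustrative aggregation over a gaze log in the assumed "timestamp,objectName" format.
public static class GazeLogAggregator
{
    public static void Summarize(string logPath)
    {
        // Parse each log line into a (time, target) pair.
        var samples = File.ReadLines(logPath)
            .Select(line => line.Split(','))
            .Select(p => (time: float.Parse(p[0], CultureInfo.InvariantCulture), target: p[1]))
            .ToList();

        if (samples.Count < 2) return;

        float taskDuration = samples[samples.Count - 1].time - samples[0].time;

        // Count a "gaze movement" whenever the fixated object changes between consecutive samples,
        // and accumulate dwell time per object using the inter-sample interval.
        int transitions = 0;
        var dwellByObject = new Dictionary<string, float>();
        for (int i = 1; i < samples.Count; i++)
        {
            if (samples[i].target != samples[i - 1].target)
                transitions++;

            float dt = samples[i].time - samples[i - 1].time;
            dwellByObject.TryGetValue(samples[i - 1].target, out float dwell);
            dwellByObject[samples[i - 1].target] = dwell + dt;
        }

        Console.WriteLine($"Task duration: {taskDuration:F1} s");
        Console.WriteLine($"Gaze movements: {transitions} ({transitions / taskDuration:F2} per second)");
        foreach (var pair in dwellByObject.OrderByDescending(p => p.Value))
            Console.WriteLine($"Dwell on {pair.Key}: {pair.Value:F2} s");
    }
}
```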

 

Publications

1. Maksimenko Nadezhda I., Okolzina Anzhelika I., Vlasova Anna, Tracey Chantal T., Kurushkin Mikhail V. Introducing the Atomic Structure to First-Year Undergraduate Chemistry Students with an Immersive Virtual Reality Experience. Journal of Chemical Education, - (year - 2021). https://doi.org/10.1021/acs.jchemed.0c01441