A team of researchers at the University of California San Diego has developed an artificial intelligence system that tracks and analyzes the gaze of teachers and students in the classroom, with the aim of improving virtual learning in the future.
Researcher Shlomo Dubnov, an expert in computer-based music education at the university's entertainment and learning research center, began developing the new system to address the shortcomings of teaching music over the Zoom application during the Covid-19 pandemic.
Dubnov said that in music classrooms, nonverbal means of communication such as physical gestures and facial expressions are extremely important for following students' performance, coordinating playing, and communicating ideas. These nonverbal aspects of communication are severely degraded in virtual classrooms, because the teacher and students are not present in the same physical space.
The new system tracks the teacher's attention during the lesson, determines which person the teacher is looking at, and lets a student know when he or she is the focus of the teacher's attention at a given moment of the explanation. The study team built a prototype of the system and tested it in a virtual music classroom over Zoom at the aforementioned university.
For his part, researcher Ross Greer said the new system uses a camera to capture the teacher's eye movements and determine their direction. The team devised an algorithm to accurately estimate where the teacher is looking, which makes it possible to identify which student the teacher is looking at, or to whom the explanation is directed.
Greer added that when the system detects a change in the teacher's gaze direction, it identifies the new student being looked at and displays an on-screen message naming the person the teacher is looking at.
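The behavior Greer describes can be illustrated with a minimal sketch: once a gaze point on the screen has been estimated, it is mapped to a participant tile in a video-call gallery grid, and a message is emitted whenever the gazed-at student changes. The tile layout, student names, and coordinates below are illustrative assumptions, not details of the UC San Diego system.

```python
# Hypothetical sketch: map an estimated on-screen gaze point to a participant
# tile in a video-call gallery, and announce when the focused student changes.
# Tile layout and names are invented for illustration.
from dataclasses import dataclass


@dataclass
class Tile:
    name: str
    x: float  # left edge, in normalized screen coordinates [0, 1]
    y: float  # top edge
    w: float  # width
    h: float  # height

    def contains(self, gx: float, gy: float) -> bool:
        """True if the gaze point (gx, gy) falls inside this tile."""
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h


def student_in_focus(tiles, gaze_x, gaze_y):
    """Return the name of the student whose tile contains the gaze point, or None."""
    for tile in tiles:
        if tile.contains(gaze_x, gaze_y):
            return tile.name
    return None


def announce_changes(tiles, gaze_points):
    """Emit one on-screen message each time the gazed-at student changes."""
    messages, current = [], None
    for gx, gy in gaze_points:
        name = student_in_focus(tiles, gx, gy)
        if name is not None and name != current:
            current = name
            messages.append(f"Teacher is looking at {name}")
    return messages


if __name__ == "__main__":
    # A 2x2 gallery of four fictional students.
    grid = [
        Tile("Ana", 0.0, 0.0, 0.5, 0.5), Tile("Ben", 0.5, 0.0, 0.5, 0.5),
        Tile("Chloe", 0.0, 0.5, 0.5, 0.5), Tile("Dev", 0.5, 0.5, 0.5, 0.5),
    ]
    # Simulated gaze samples (normalized screen coordinates over time).
    samples = [(0.2, 0.2), (0.25, 0.3), (0.7, 0.8), (0.7, 0.75), (0.6, 0.1)]
    print(announce_changes(grid, samples))
    # → ['Teacher is looking at Ana', 'Teacher is looking at Dev', 'Teacher is looking at Ben']
```

In a real deployment the gaze point would come from a camera-based gaze estimator rather than from simulated samples; the grid-lookup and change-detection logic would stay essentially the same.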