
Watching from the shadows, no "headband" needed: the AI camera can see whether you are distracted in class


Editor's note: This article is from the WeChat public account "Heart of the Machine" (ID: almosthuman2014), written by Synced with contributions from Zhang Qian, Egg Sauce, and Jamin. It is republished by 36Kr with permission.

Once the AI camera enters the classroom, it becomes much harder to merely pretend to listen carefully. Recently, researchers from the Hong Kong University of Science and Technology and Harbin Engineering University developed a system that uses AI cameras to record and analyze how students' emotions change during class.

Many people are already familiar with the kind of monitoring work AI could do in the classroom: "A professor looks at his computer after the lecture. With the help of a piece of software, he can see the emotional changes of the students over the whole lesson. Thirty minutes in, most students lost interest and started to drift, probably around the time he wandered off topic. So the professor makes a note to remind himself not to go off topic in the future."

Most real classrooms are not like this yet, but as the technology develops, such scenes will become more and more common.

Recently, a paper on classroom monitoring technology was published in IEEE Transactions on Visualization and Computer Graphics. In it, researchers from the Hong Kong University of Science and Technology, Harbin Engineering University, and other institutions propose a system called EmotionCues. The system focuses on recording students' facial expressions and, on that basis, analyzing students' emotional changes and level of concentration in class.

One of the authors, Huamin Qu, a professor of computer science at the Hong Kong University of Science and Technology, said the system "provides a faster and more convenient way for teachers to measure student engagement in the classroom." The stated intention of the study is benign: use the system to monitor students' emotional feedback in class, determine when students start to feel bored and when they are more focused, and thereby remind teachers how to improve course content and teaching quality.

The research team tested the proposed system in two classrooms.
One classroom held students from the Hong Kong University of Science and Technology, representing college students; the other was a kindergarten in Japan, representing younger students. The tests found that this visual analysis system performs better at detecting "obvious" emotions, such as the joy that comes with strong interest in the material. The system's ability to interpret "anger" or "sadness", however, is still lacking: students may simply be concentrating on the lesson and frowning in deep thought, yet be easily read by the system as "angry".

System workflow

Figure 2 shows the workflow of the whole system, which consists of two stages: data processing and visual exploration.

The data-processing stage takes the raw video and uses computer vision algorithms to extract emotion information, through steps including face detection, face recognition, emotion recognition, and feature extraction.

In the face detection step, the researchers use MTCNN (Multi-Task Cascaded Convolutional Networks, a deep convolutional network that predicts face bounding boxes and landmark positions) to detect the faces in each sampled frame.

In the face recognition step, the usual approach to comparing faces is to vectorize the images. The researchers use FaceNet, a deep learning model well suited to face recognition that directly learns a mapping from face images to a compact Euclidean space.

In the emotion recognition step, the researchers chose a classification model for reasons of intuitiveness and interpretability. They fine-tuned a CNN model (ResNet-50) on the FER2013 dataset, which has been widely used for facial expression recognition. (A minimal code sketch of this three-step pipeline appears at the end of this section.)

Considering that emotion recognition may not be entirely accurate, the researchers singled out several influencing factors (such as face size, occlusion, image resolution, and lighting) and visually encoded them in the system to help users judge students' emotional states. These factors can play a key role in the system's sentiment analysis. For example, a person far from the camera occupies a smaller area in the video and is more likely to be misidentified; and if a person's face is frequently blocked by others, the risk of misjudgment is higher. The researchers integrated these factors into the system's analysis process and provided richer interactive functions to improve it.

Interactive visual system

The second stage is an interactive visual system designed around five main requirements (see the paper for details). The system supports classroom video analysis at two granularities: the overall emotional evolution of all students and the individual emotional evolution of a single student.

The researchers implemented it as a web-based system with a Vue.js front end and a Flask back end (a toy sketch of such a back end also follows this section), as shown in Figure 3. The system includes three views: a summary view (Figure 3a-b), a character view (Figure 3c), and a video view (Figure 3d).

It is important to give the teacher an overview of students' emotional changes, so the researchers designed the summary view to show both a static and a dynamic picture of how student emotion evolves. Figure 3(a) shows a student's emotion profile, which presents the student's emotion distribution (the static summary); Figure 3(b) shows the student's emotion-change curves (the dynamic summary). A rough offline approximation of these two summaries is also sketched after this section.

The character view expresses the emotional state of a selected person through a portrait-style glyph; the differences between emotional portraits let users identify and compare different people. As shown in Figure 5, the design uses a customized pie chart.

Figure 5: Visualization of emotional changes.

With this customized pie chart design, users can easily inspect detailed emotion information and the influencing factors that interest them. A screen-snapshot function makes it easier to compare emotion information between different people; users who want details can click a snapshot of interest to view it. An example snapshot appears to the left of the character view (Figure 3c).

In the video view (Figure 3d), the system provides the original video for users to browse. Users can play the video at different speeds, and when the video is paused, the corresponding faces in the frame are highlighted. Users can also select interesting segments for further exploration, based on what they observe in the emotion flow.
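To make the data-processing stage concrete, below is a minimal sketch of what such a three-step pipeline could look like, written against the open-source facenet-pytorch and torchvision packages. This is not the authors' code: the package choices, the checkpoint file name fer2013_resnet50.pt, and the helper analyze_frame are all assumptions for illustration.

```python
# A minimal sketch (not the paper's implementation) of the described pipeline:
# MTCNN face detection, FaceNet embeddings for identity, and a ResNet-50
# emotion classifier fine-tuned on FER2013.
import torch
import torch.nn.functional as F
from PIL import Image
from facenet_pytorch import MTCNN, InceptionResnetV1
from torchvision import models

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

# Step 1: face detection. keep_all=True returns every face in the frame.
mtcnn = MTCNN(keep_all=True, device="cpu")

# Step 2: identity embeddings. InceptionResnetV1 maps each aligned face crop
# into a compact Euclidean space where distance reflects facial similarity.
embedder = InceptionResnetV1(pretrained="vggface2").eval()

# Step 3: emotion classification. A standard ResNet-50 whose final layer is
# replaced with a 7-way head for the FER2013 emotion classes.
classifier = models.resnet50()
classifier.fc = torch.nn.Linear(classifier.fc.in_features, len(EMOTIONS))
classifier.load_state_dict(torch.load("fer2013_resnet50.pt"))  # hypothetical checkpoint
classifier.eval()

def analyze_frame(frame: Image.Image):
    """Return (box, identity_embedding, emotion_probs) for each detected face."""
    boxes, _ = mtcnn.detect(frame)   # boxes give face size, a quality cue
    faces = mtcnn(frame)             # aligned 160x160 face crops, or None
    if faces is None:
        return []
    with torch.no_grad():
        embeddings = embedder(faces)               # one 512-d vector per face
        resized = F.interpolate(faces, size=224)   # ResNet-50 input size
        probs = torch.softmax(classifier(resized), dim=1)
    return list(zip(boxes, embeddings, probs))
```

The bounding boxes returned in the first step also expose one of the paper's "influencing factors" for free: a small box means a face far from the camera, whose predicted emotion deserves less trust.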
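The paper states only that the second stage uses a Flask back end behind a Vue.js front end. As a toy illustration of that general shape, assuming invented routes, field names, and a precomputed emotion_records.json file (none of which come from the paper), such a back end might look like this:

```python
# A toy Flask back end (routes and fields are invented, not from the paper)
# serving precomputed per-frame emotion data to a browser front end.
import json
from flask import Flask, jsonify

app = Flask(__name__)

# Assume the data-processing stage already wrote records shaped like:
# {"frame": int, "faces": [{"student": str, "emotion": str,
#                           "confidence": float, "face_size": int}, ...]}
with open("emotion_records.json") as f:
    RECORDS = json.load(f)

@app.route("/api/summary/<student>")
def summary(student):
    """Static summary: one student's overall emotion distribution."""
    counts = {}
    for record in RECORDS:
        for face in record["faces"]:
            if face["student"] == student:
                counts[face["emotion"]] = counts.get(face["emotion"], 0) + 1
    return jsonify(counts)

@app.route("/api/timeline/<student>")
def timeline(student):
    """Dynamic summary: (frame, emotion) pairs for the evolution curve."""
    points = [
        (record["frame"], face["emotion"])
        for record in RECORDS
        for face in record["faces"]
        if face["student"] == student
    ]
    return jsonify(points)

if __name__ == "__main__":
    app.run(debug=True)
```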
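Finally, although the paper's summary view is rendered in the browser, its two halves, the static emotion distribution and the dynamic evolution curve, can be approximated offline. A small matplotlib sketch with invented data:

```python
# An offline matplotlib approximation (not the paper's Vue-based views) of the
# summary view: a static emotion-distribution pie and a dynamic evolution
# curve for one student. All data here is illustrative.
import matplotlib.pyplot as plt

EMOTIONS = ["happy", "neutral", "sad", "angry"]
# Invented per-minute dominant emotion for one student in a 40-minute class.
timeline = ["happy"] * 10 + ["neutral"] * 15 + ["sad"] * 10 + ["neutral"] * 5

fig, (ax_pie, ax_curve) = plt.subplots(1, 2, figsize=(10, 4))

# Static summary: overall emotion distribution, analogous to Figure 3(a).
counts = [timeline.count(e) for e in EMOTIONS]
ax_pie.pie(counts, labels=EMOTIONS, autopct="%1.0f%%")
ax_pie.set_title("Emotion distribution")

# Dynamic summary: emotion evolution over time, analogous to Figure 3(b).
ax_curve.step(range(len(timeline)),
              [EMOTIONS.index(e) for e in timeline], where="mid")
ax_curve.set_yticks(range(len(EMOTIONS)))
ax_curve.set_yticklabels(EMOTIONS)
ax_curve.set_xlabel("Minute of class")
ax_curve.set_title("Emotion evolution")

plt.tight_layout()
plt.show()
```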
"Improving" teaching, or "monitoring" teaching?

The stated intention of this research is to help lecturers collect student feedback and improve teaching quality. But can it really deliver what it promises?

Compared with analyzing emotions from video recordings, there are even more extreme "smart headbands" in Chinese classrooms. In the classroom of a primary school in Jinhua, Zhejiang, every seated student wore a black headband resembling a "golden hoop" that lit up red when the wearer was focused and blue when the wearer was distracted. The attention score was sent to the teacher's computer every 10 minutes and synced to the parents' WeChat group, so parents outside the school could keep track of their child's state in class at any time.

But this headband, and this kind of classroom monitoring technology in general, faces many questions. First are the ethical issues: it exposes students' personal emotions in the classroom, letting teachers know exactly who is paying attention and who is not, which touches directly on students' privacy. Moreover, in a 40-minute lesson a student's attention cannot possibly stay fully focused the whole time, so continuously monitoring attention and correcting every lapse is not meaningful. On the other hand, such a monitoring system may itself distract teachers and students, because the people inside it feel there are eyes "staring at them all the time"; for those wearing a golden hoop, the feeling is even more pronounced. This sense of being monitored in real time can, to some extent, affect classroom participants' freedom of expression.

Reference links:
https://spectrum.ieee.org/the-human-os/biomedical/devices/ai-tracks-emotions-in-the-classroom
https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8948010
