Impact of COVID-19 on Education: A Practical AI Architecture to Determine Online Learning Efficacy
COVID-19 has had a broad impact across multiple industries, and Education is one of the most heavily affected sectors. Schools switched to online classes for the rest of the 2019–2020 academic year. Based on recent McKinsey research (shown below), it is still too early to predict whether in-person classes will resume in the Fall of 2020. The research focused mainly on higher-education institutions, but it is relevant to the K-12 system as well.
Google has offered all public schools access to "Google Classroom" and "Google Meet" as platforms for online course creation and distribution. An exponential surge in the use of online messaging platforms such as Microsoft Teams, Zoom, Slack, and WhatsApp has become the new norm for work, school, and social interaction.
It is not unrealistic to expect that we will need to get used to a new way of learning through increasingly remote online platforms. Over the last few years, the Massive Open Online Course (MOOC) movement has enabled a balance between online, self-paced learning and face-to-face interactive learning in a classic classroom setting. One of the challenges of a remote, online-based learning mode is understanding how well a student is engaged during an online session delivered by the teacher.
In a physical classroom, the teacher is able to interact with and monitor the kids and provide personalized attention to those who may require additional help. It is hard to get this level of feedback and interaction when the teacher is working with 25 kids in a virtual classroom setting. This is where technology comes to the rescue. Most recent laptops used by students are equipped with a camera, and students interact through a mouse and keyboard. Real-time facial recognition using Convolutional Neural Networks (CNNs), coupled with tracking of mouse movement and keyboard activity, can provide cues for a deep learning / machine learning model to determine student engagement.
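To make the idea concrete, the three cues above (facial expression, mouse activity, keyboard activity) could be combined into a numeric feature vector per observation window before being fed to a model. A minimal sketch follows; the class name, field names, and the "attentive expression" mapping are illustrative assumptions, not a validated rubric:

```python
from dataclasses import dataclass

@dataclass
class ActivitySample:
    """One observation window for a single student (field names are assumptions)."""
    face_emotion: str   # label from a facial-expression model
    mouse_moves: int    # mouse events during the window
    key_presses: int    # keystrokes during the window

def engagement_features(sample: ActivitySample) -> list:
    """Turn raw cues into a numeric feature vector for a downstream model."""
    # Treat "neutral" and "happiness" as attentive-leaning expressions;
    # this binary mapping is purely illustrative.
    attentive = 1.0 if sample.face_emotion in ("neutral", "happiness") else 0.0
    return [attentive, float(sample.mouse_moves), float(sample.key_presses)]

print(engagement_features(ActivitySample("neutral", 12, 40)))  # [1.0, 12.0, 40.0]
```

In practice these vectors would be emitted at a fixed cadence (say, every few seconds) so that the downstream time-series model sees a regular stream per student.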
A sample architecture for the solution using Azure is shown below. The solution works in a classroom setting with multiple sensor inputs (light, heat, HVAC) and classroom information such as location, room number, and the number of students checked in with their smart IDs. It can also be used for a remote classroom, where the inputs are the camera, mouse tracking, and keyboard input.
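The classroom and sensor inputs described above would typically be packaged as a single telemetry message before ingestion. A minimal sketch of such a message in Python follows; every field name here is an assumption for illustration, since the real schema would be defined by the ingestion pipeline behind the IoT Hub:

```python
import json
from datetime import datetime, timezone

def build_telemetry(student_id: str, room: str, temperature_c: float,
                    light_lux: float, checked_in: int) -> str:
    """Package classroom sensor readings as a JSON telemetry message.

    Field names are illustrative assumptions, not a documented schema.
    """
    payload = {
        "studentId": student_id,
        "roomNumber": room,
        "temperatureC": temperature_c,
        "lightLux": light_lux,
        "studentsCheckedIn": checked_in,
        "timestampUtc": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(payload)

print(build_telemetry("S-1024", "204B", 22.5, 310.0, 23))
```

A device client (for example, the Azure IoT device SDK) would then send each JSON string as a message to the hub.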
1. The inputs are pre-processed and sent to an Azure IoT Hub, which supports bidirectional communication.
2. For a remote student logging in with their unique student ID, an association is made between the ID and the facial image linked to the student's name. Using Microsoft Azure Cognitive Services, the Face API can be used to see whether the student is distracted. Without any custom code, the Face API reports one of eight emotions: anger, contempt, disgust, fear, happiness, neutral, sadness, and surprise.
3. The real-time facial features, along with keyboard and mouse inputs, form time-series data that can be analyzed through Azure Stream Analytics (ASA). For further processing, the data can be stored in warm storage using Cosmos DB or in cold storage using Blob Storage.
4. Azure Machine Learning Studio provides a drag-and-drop experience with numerous pre-built machine learning models. Long Short-Term Memory (LSTM) neural networks are commonly used for time-series data ingested in real time.
5. The teacher can see real-time reports of which kids are distracted or engaged through powerful Power BI dashboards that update in real time.
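To illustrate step 2, the eight per-emotion confidences returned by the Face API could be collapsed into the coarse engagement labels the teacher ultimately sees. The thresholds and label mapping below are illustrative assumptions, not a validated model:

```python
def engagement_label(emotion_scores: dict) -> str:
    """Map Face API-style emotion confidences to a coarse engagement label.

    `emotion_scores` mimics the eight emotions the Face API reports
    (anger, contempt, disgust, fear, happiness, neutral, sadness,
    surprise), each a confidence in [0, 1]. The mapping from dominant
    emotion to engagement level is an assumption for illustration.
    """
    dominant = max(emotion_scores, key=emotion_scores.get)
    if dominant in ("happiness", "surprise"):
        return "high"
    if dominant == "neutral":
        return "normal"
    return "low"  # sadness, anger, etc. treated as likely distraction

scores = {"anger": 0.01, "contempt": 0.0, "disgust": 0.0, "fear": 0.0,
          "happiness": 0.05, "neutral": 0.90, "sadness": 0.03, "surprise": 0.01}
print(engagement_label(scores))  # prints "normal"
```

A real pipeline would apply this per frame and smooth the labels over a window, since a single frame of "surprise" or "sadness" says little about sustained engagement.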
It is helpful to have a baseline image dataset that can serve as the corpus for supervised machine learning on frames of student faces captured through the camera. You can use the DAiSEE (Dataset for Affective States in E-Environments) dataset for this purpose. You can learn more about this dataset at https://iith.ac.in/~daisee-dataset/
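DAiSEE annotates short video clips with ordinal affective-state levels. Assuming integer engagement levels from 0 (very low) to 3 (very high) per clip, a quick sanity check on the labels for a session might look like the following sketch (the function and its output format are assumptions, not part of the dataset's tooling):

```python
from collections import Counter

def session_summary(clip_engagement_levels: list) -> dict:
    """Summarize per-clip ordinal engagement labels over one session.

    Assumes integer levels 0 (very low) through 3 (very high); returns
    the label distribution and the mean level as a coarse session score.
    """
    counts = Counter(clip_engagement_levels)
    mean = sum(clip_engagement_levels) / len(clip_engagement_levels)
    return {"distribution": dict(counts), "mean_level": round(mean, 2)}

print(session_summary([2, 3, 3, 1, 2, 2]))
```

Checks like this help catch label imbalance before training, which is common in engagement datasets where most clips are labeled as moderately engaged.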
As an example, here is a picture showing students with different levels of engagement based on the Face API interpretation.
- The first row (a-d) shows students with normal engagement.
- The second row (e-h) shows students with high engagement.
- The third row (i-l) shows students with low engagement.
In this article, we have provided a high-level overview of one possible solution that can be built using services in the Microsoft Azure platform and pre-built machine learning datasets. As we adjust to the new norm of remote online learning, there are opportunities to create disruptive solutions that educate students in safe, effective, engaging, and scalable ways.