Title: YOLO-Based Real Time Face Detection and Expression Recognition
Authors: ZHOU, JUN YU(周俊宇)
Department: Department of Computer and Information Science
Faculty: Faculty of Science and Technology
Issue Date: May-2023
Citation: Zhou, J. Y. (2023). YOLO-Based Real Time Face Detection and Expression Recognition (Outstanding Academic Papers by Students (OAPS)). Retrieved from University of Macau, Outstanding Academic Papers by Students Repository.
Abstract: In this project, we propose a YOLO-based real-time system for detecting human faces and recognizing facial expressions. The system detects and classifies faces in real time with high accuracy and is applicable to many fields, such as video surveillance, human-computer interaction, and psychology research. The face detection module uses the YOLO algorithm to efficiently locate faces in the input video stream, while the facial expression recognition module classifies each detected face into one of several predefined expression categories (such as happy, sad, and angry) using a new method based on action units (AUs). Our system combines YOLO-based face detection with a newly designed action-unit recognizer in a single pipeline grounded in computer vision and pattern recognition; most importantly, we propose an innovative solution that uses the Facial Action Coding System (FACS) to recognize expressions. We evaluate the performance of the system on a publicly accessible dataset and demonstrate its efficacy in a variety of scenarios. The system has numerous applications, including security and surveillance, human-computer interaction, and psychological research. In summary, the proposed system detects faces and recognizes facial expressions in real time.
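The abstract describes mapping detected action units to expression labels via the Facial Action Coding System. A minimal sketch of that final classification step is shown below; the AU combinations used here are common EMFACS-style mappings (e.g. happiness as AU6 + AU12), not necessarily the exact ones designed in the paper, and the scoring rule is a hypothetical illustration.

```python
# Hedged sketch: classify an expression from a set of detected Action Units
# (AUs), in the spirit of the FACS-based method the abstract describes.
# The AU sets below are widely cited EMFACS mappings, assumed for
# illustration; the paper's own designed AUs may differ.

EXPRESSION_AUS = {
    "happy":     {6, 12},                 # cheek raiser + lip corner puller
    "sad":       {1, 4, 15},              # inner brow raiser + brow lowerer + lip corner depressor
    "angry":     {4, 5, 7, 23},           # brow lowerer + upper lid raiser + lid tightener + lip tightener
    "surprised": {1, 2, 5, 26},           # brow raisers + upper lid raiser + jaw drop
    "disgusted": {9, 15},                 # nose wrinkler + lip corner depressor
}

def classify_expression(active_aus):
    """Return the expression whose required AUs best match the detected set.

    Scores each candidate by the fraction of its required AUs that are
    active, with a penalty for detected AUs it does not explain.
    """
    def score(required):
        hit = len(required & active_aus) / len(required)
        extra = len(active_aus - required) / (len(active_aus) or 1)
        return hit - 0.5 * extra

    return max(EXPRESSION_AUS, key=lambda expr: score(EXPRESSION_AUS[expr]))
```

For example, a face showing AU6 and AU12 would be scored highest by the "happy" template. In a full pipeline, each face box produced by the YOLO detector would first pass through an AU detector before reaching this step.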
Instructor: Prof. Liming Zhang
Programme: Bachelor of Science in Computer Science
Appears in Collections: FST OAPS 2023

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.