Please use this identifier to cite or link to this item: http://oaps.umac.mo/handle/10692.1/160
Full metadata record
DC Field: Value
dc.contributor.author: CHEANG, KIN HENG (鄭健興)
dc.date.accessioned: 2017-10-06T02:12:46Z
dc.date.available: 2017-10-06T02:12:46Z
dc.date.issued: 2017
dc.identifier.citation: CHEANG, K. H. (2017). MOOC-Gaze: Online Solution for Tracking Learners’ Gazing Dynamics of MOOC Videos (Outstanding Academic Papers by Students (OAPS)). Retrieved from University of Macau, Outstanding Academic Papers by Students Repository.
dc.identifier.uri: http://oaps.umac.mo/handle/10692.1/160
dc.description.abstract: Video-based lecturing has become the major teaching-and-learning activity of Massive Open Online Courses (MOOCs). While students can learn at any time at their own pace, course providers cannot have face-to-face communication with students to receive instantaneous feedback. Conventional channels such as web forums allow learners to submit feedback after class, but this demands additional effort to reflect on the lecturing process. New techniques are therefore needed to evaluate learners’ in-class behaviour. Gaze tracking is a technique that keeps track of a user’s gaze position. As standard computers come with web cameras, it is practical to perform non-intrusive gaze tracking of MOOC learners directly from the camera image. In this project, we created MOOC-Gaze, a software application that predicts users’ gaze positions on a MOOC video by analysing their facial and eye dynamics via web cameras. The prediction model is essentially a ridge-regression-trained mapping function that takes the user’s facial image as input and outputs the predicted gaze position as screen coordinates. The model is built on the fly and is user-specific, requiring a short calibration procedure before the user starts watching the video. The predicted gaze positions can be visualized as a heat map over a video replay, letting educators observe learners’ individual and aggregated gaze patterns. This post-watch analysis can help in course evaluation, e.g. to find out whether gaze positions match the expectations of the course video’s designer. MOOC-Gaze was designed to work with any kind of video and to operate in a standard web browser. To improve prediction accuracy, a literature survey of gaze-tracking techniques was conducted and several improvements were proposed and tested. Our web framework, together with the improved gaze-prediction method, delivers a new online gaze-tracking solution that can also be applied to use cases beyond MOOCs.
dc.language.iso: en_US
dc.title: MOOC-Gaze: Online Solution for Tracking Learners’ Gazing Dynamics of MOOC Videos
dc.type: OAPS
dc.contributor.department: Department of Computer and Information Science
dc.description.instructor: Mr. GOMES DA COSTA Junior, Miguel
dc.contributor.faculty: Faculty of Science and Technology
dc.description.programme: Bachelor of Science in Computer Science
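
The abstract above describes the core of MOOC-Gaze as a user-specific ridge-regression mapping, fitted during a short calibration phase, from webcam-derived facial/eye features to on-screen gaze coordinates. The full paper is not reproduced on this page, so the snippet below is only a minimal illustrative sketch of that idea in Python/NumPy; the feature dimensionality, regularisation strength, and calibration data are assumptions for illustration, not values taken from the paper.

import numpy as np

def fit_ridge(features, targets, lam=1e-3):
    """Fit a ridge-regression mapping from feature vectors to 2-D screen
    coordinates using the closed-form solution W = (X^T X + lam*I)^-1 X^T Y.

    features: (n_samples, n_features) array collected during calibration
    targets:  (n_samples, 2) array of known on-screen gaze points (x, y)
    lam:      regularisation strength (assumed value, not from the paper)
    """
    X = np.hstack([features, np.ones((features.shape[0], 1))])  # append bias term
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ targets)

def predict_gaze(W, feature_vec):
    """Predict screen coordinates (x, y) for a single feature vector."""
    x = np.append(feature_vec, 1.0)  # same bias term as in training
    return x @ W

if __name__ == "__main__":
    # Hypothetical calibration: the user looks at known points on screen,
    # and each recorded feature vector is paired with that point's coordinates.
    rng = np.random.default_rng(0)
    calib_features = rng.normal(size=(25, 64))       # 25 samples, 64-D features (assumed)
    calib_points = rng.uniform(0, 1, size=(25, 2))   # normalised screen coordinates
    W = fit_ridge(calib_features, calib_points)
    print(predict_gaze(W, calib_features[0]))

Because the mapping is fitted per user at calibration time, the model can be rebuilt on the fly in the browser session, which matches the "built on-the-fly and user-specific" description in the abstract.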
Appears in Collections: FST OAPS 2017

Files in This Item:
File: OAPS_2017_FST_010.pdf
Size: 7.43 MB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.