Please use this identifier to cite or link to this item: http://oaps.umac.mo/handle/10692.1/311
Full metadata record
DC Field: Value (Language)
dc.contributor.author: CHAN, HAN TOU (陳漢韜)
dc.contributor.author: CHAO, CHI HANG (周志恆)
dc.date.accessioned: 2023-06-20T03:33:56Z
dc.date.available: 2023-06-20T03:33:56Z
dc.date.issued: 2023-05
dc.identifier.citation: Chan, H. T., & Chao, C. H. (2023). Development of 3D Industrial Models and 3D Scenes for Robotics Grasping Application (Outstanding Academic Papers by Students (OAPS)). Retrieved from University of Macau, Outstanding Academic Papers by Students Repository. (en_US)
dc.identifier.uri: http://oaps.umac.mo/handle/10692.1/311
dc.description.abstract: The advancement of data-driven robot vision technology relies heavily on the availability of a vast amount of 3D data, including 3D instance models and diverse scenes. The objective of this project is to create a comprehensive 3D database, comprising primarily 3D models of common parts and tools as well as various application scenarios, to facilitate data-driven safety monitoring robotics tasks. The project focuses on fundamental vision tasks, such as 3D point cloud segmentation and object detection, as well as robotics tasks, such as grasping. Despite the availability of existing datasets, the scarcity of large-scale, fully annotated 3D datasets of parts and tools makes this project valuable and significant for academic research. The primary objective is therefore to develop a comprehensive database of 3D models and scenes to support 3D safety monitoring robots. To achieve this, we use the computer-aided design software SolidWorks to create 3D models of commonly used tools and parts. These models are processed in MeshLab to perform surface sampling and convert them into 3D point cloud data. Various 3D scenes are then created by randomly placing these 3D instance models, with each instance model confined to its own spatial range. Python programs built on the Open3D library are used to segment the scenes for the robotic grasping task (see the sketch below the metadata record). Finally, robotic experiments are conducted to evaluate the effectiveness of the converted point cloud data in robotic grasping; several issues identified during the experiments indicate room for improvement. (en_US)
dc.language.iso: en (en_US)
dc.title: Development of 3D Industrial Models and 3D Scenes for Robotics Grasping Application (en_US)
dc.type: OAPS (en_US)
dc.contributor.department: Department of Electromechanical Engineering (en_US)
dc.description.instructor: Prof. Zhixin YANG (en_US)
dc.contributor.faculty: Faculty of Science and Technology (en_US)
dc.description.programme: Bachelor of Science in Electromechanical Engineering (en_US)
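
As a rough illustration of the pipeline the abstract describes (surface-sampling a CAD mesh into a point cloud, then segmenting a composed scene for grasping), the sketch below uses the Open3D Python API. The file paths, sampling density, and plane/clustering parameters are illustrative assumptions, and the plane-removal plus DBSCAN clustering shown here is one common approach rather than necessarily the method used in the report.

    # Minimal sketch, assuming meshes exported from SolidWorks (e.g. as STL)
    # and scenes already composed and saved as point clouds.
    import numpy as np
    import open3d as o3d

    # Step 1: convert an instance mesh into a point cloud by surface sampling
    # (the project performs this step in MeshLab; Open3D offers an equivalent).
    mesh = o3d.io.read_triangle_mesh("models/wrench.stl")      # hypothetical path
    mesh.compute_vertex_normals()
    instance_pcd = mesh.sample_points_poisson_disk(number_of_points=20000)
    o3d.io.write_point_cloud("pointclouds/wrench.pcd", instance_pcd)

    # Step 2: segment a composed scene so each cluster approximates one instance
    # that a grasping pipeline could target.
    scene = o3d.io.read_point_cloud("scenes/scene_001.pcd")    # hypothetical path
    # Remove the dominant supporting plane (e.g. a table top) with RANSAC.
    _, plane_inliers = scene.segment_plane(distance_threshold=0.005,
                                           ransac_n=3,
                                           num_iterations=1000)
    objects = scene.select_by_index(plane_inliers, invert=True)
    # Cluster the remaining points with DBSCAN; label -1 marks noise.
    labels = np.array(objects.cluster_dbscan(eps=0.02, min_points=50))
    print(f"{labels.max() + 1} candidate object clusters found")
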
Appears in Collections: FST OAPS 2023



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.