Please use this identifier to cite or link to this item: http://dspace.uniten.edu.my/jspui/handle/123456789/11608
DC Field | Value | Language
dc.contributor.author | Iqbal, J. | en_US
dc.contributor.author | Sidhu, M.S. | en_US
dc.contributor.author | Ariff, M.B.M. | en_US
dc.date.accessioned | 2019-01-16T08:43:34Z | -
dc.date.available | 2019-01-16T08:43:34Z | -
dc.date.issued | 2018 | -
dc.description.abstract | Pose matching and skeletal mapping methods are an integral part of Augmented Reality (AR) based learning technology. This paper presents a pose-matching mechanism based on the extraction of skeletal data from a dance trainer's physical movements, captured as colour-defined images by a Kinect sensor; each pose is modelled as a sequence of key movements and continuous data frames. To extract an exactly matched pose, the frame sequence is divided into pose feature frames and skeletal data frames by the proposed pose matching dance training movement recognition algorithm (PMDTMR). The proposed algorithm is compared with other published methods in terms of frame-level accuracy and the learning time of a dance session. The experimental results show that the proposed algorithm outperforms state-of-the-art techniques in identifying and recognizing matched poses between the dance trainer and the expert in a pre-recorded video via the Kinect sensor. © 2018 Authors. | en_US
dc.language.iso | en | en_US
dc.title | AR oriented pose matching mechanism from motion capture data | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.14419/ijet.v7i4.35.22749 | -
item.fulltext | With Fulltext | -
item.grantfulltext | open | -
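The abstract above compares a trainer's skeletal pose against an expert's pose from Kinect frames. The PMDTMR algorithm itself is not specified in this record, so the following is only an illustrative sketch of the generic skeletal-comparison step such a mechanism relies on; the joint layout, the normalization scheme, and the match threshold are all assumptions, not the paper's method.

```python
import numpy as np

def normalize_pose(joints):
    """Translate a skeleton so the hip joint is the origin and scale by
    torso length, making comparison position- and size-invariant.
    Assumed joint order: hip, neck, then the remaining joints."""
    joints = np.asarray(joints, dtype=float)
    hip, neck = joints[0], joints[1]
    centered = joints - hip
    torso = np.linalg.norm(neck - hip)
    return centered / torso if torso > 0 else centered

def pose_match_score(trainer, expert):
    """Mean per-joint Euclidean distance between two normalized poses;
    smaller means a closer match."""
    a, b = normalize_pose(trainer), normalize_pose(expert)
    return float(np.mean(np.linalg.norm(a - b, axis=1)))

def is_matched(trainer, expert, threshold=0.2):
    """Threshold value is an assumption for illustration only."""
    return pose_match_score(trainer, expert) <= threshold

# Two toy 4-joint skeletons (hip, neck, left hand, right hand) in 3-D.
expert = [(0, 0, 0), (0, 1, 0), (-0.5, 0.8, 0), (0.5, 0.8, 0)]
trainer = [(2, 0, 0), (2, 1, 0), (1.5, 0.8, 0), (2.5, 0.8, 0)]  # same pose, shifted
print(is_matched(trainer, expert))  # the shift is removed by normalization
```

In a real pipeline the joint positions would come from the Kinect SDK's skeletal tracking stream rather than hard-coded tuples, and the score would be evaluated frame by frame against the expert's pre-recorded sequence.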
Appears in Collections: UNITEN Scholarly Publication
Files in This Item:
File | Size | Format
AR oriented pose matching mechanism from motion capture data.pdf | 464.98 kB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.