Please use this identifier to cite or link to this item: https://repository.cihe.edu.hk/jspui/handle/cihe/848
DC Field | Value | Language
dc.contributor.author | Poon, Geoffrey | en_US
dc.contributor.author | Pang, Raymond Wai Man | -
dc.contributor.other | Li, K.-M. | -
dc.date.accessioned | 2021-07-11T13:54:20Z | -
dc.date.available | 2021-07-11T13:54:20Z | -
dc.date.issued | 2017 | -
dc.identifier.uri | https://repository.cihe.edu.hk/jspui/handle/cihe/848 | -
dc.description.abstract | Recent advances in deep learning have given industry many opportunities to develop smarter products, such as smart toys, smart cars and smart homes. Unfortunately, a common practical issue with these deep learning methods is their high computing-power requirement, even for the prediction stage. As deep learning models grow more complex, their memory requirements often exceed the limits of many low-end computing devices, such as mobile IoT devices, and the problem is worse when a multi-model approach is employed. In this paper, we demonstrate the development of multimodal emotion analysis on a smart toy, where the user's speech, facial expression and actions are used to understand the user's emotion. By trimming down DNN complexity or replacing DNNs with other learning approaches, we are able to squeeze four classifiers into 800 MB of memory. Finally, the results of these classifiers are ensembled with a fusion approach using a fully connected neural network to obtain a more accurate and stable result. Our multimodal approach achieves an improvement of about 20% compared with any unimodal emotion analysis. | en_US
dc.language.iso | en | en_US
dc.publisher | IEEE | en_US
dc.title | A memory-friendly multi-modal emotion analysis for smart toy | en_US
dc.type | conference proceedings | en_US
dc.relation.publication | Proceedings of 2017 IEEE International Symposium on Multimedia (ISM 2017) | en_US
dc.identifier.doi | 10.1109/ISM.2017.86 | -
dc.contributor.affiliation | School of Computing and Information Sciences | en_US
dc.relation.isbn | 9781538629376 | en_US
dc.description.startpage | 432 | en_US
dc.description.endpage | 437 | en_US
dc.cihe.affiliated | Yes | -
item.languageiso639-1 | en | -
item.fulltext | No Fulltext | -
item.openairetype | conference proceedings | -
item.grantfulltext | none | -
item.openairecristype | http://purl.org/coar/resource_type/c_5794 | -
item.cerifentitytype | Publications | -
crisitem.author.dept | Yam Pak Charitable Foundation School of Computing and Information Sciences | -
crisitem.author.dept | Yam Pak Charitable Foundation School of Computing and Information Sciences | -
Appears in Collections: CIS Publication
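The abstract describes ensembling the outputs of four memory-constrained unimodal classifiers with a fully connected fusion network. Below is a minimal, hypothetical sketch of such a late-fusion layer; PyTorch, the class names (FusionNet), the emotion-class count, the hidden size, and the dummy inputs are all my own assumptions for illustration and are not taken from the paper.

```python
# Hypothetical sketch of late fusion over unimodal class scores.
# Each of the four classifiers (speech, facial expression, action, ...)
# is assumed to emit a probability vector over the same emotion classes;
# a small fully connected network fuses them into one prediction.
import torch
import torch.nn as nn

NUM_EMOTIONS = 4      # assumed number of emotion classes (not from the paper)
NUM_MODALITIES = 4    # the abstract mentions four classifiers

class FusionNet(nn.Module):
    """Fully connected fusion over concatenated unimodal class scores."""
    def __init__(self, num_classes=NUM_EMOTIONS, num_modalities=NUM_MODALITIES, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_classes * num_modalities, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_classes),
        )

    def forward(self, unimodal_scores):
        # unimodal_scores: list of (batch, num_classes) tensors, one per modality
        x = torch.cat(unimodal_scores, dim=1)
        return self.net(x)

if __name__ == "__main__":
    # Dummy unimodal outputs for a batch of 2 samples
    fusion = FusionNet()
    scores = [torch.softmax(torch.randn(2, NUM_EMOTIONS), dim=1) for _ in range(NUM_MODALITIES)]
    print(fusion(scores).shape)  # torch.Size([2, 4])
```

Fusing class scores rather than raw features keeps the fusion network tiny, which is consistent with the memory budget the abstract emphasizes; the actual architecture used in the paper may differ.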