Please use this identifier to cite or link to this item: https://repository.cihe.edu.hk/jspui/handle/cihe/427

Title: Bootstrapping social emotion classification with semantically rich hybrid neural networks
Author(s): Xie, Haoran; Wang, Philips Fu Lee; Lau, R. Y. K.
Issue Date: 2017
Publisher: IEEE
Journal: IEEE Transactions on Affective Computing
Volume: 8
Issue: 4
Start page: 428
End page: 442
Abstract:
Social emotion classification aims to predict the aggregate emotional responses embedded in online comments contributed by many users. The task is inherently challenging because extracting relevant semantics from free text is a classical research problem; moreover, online comments are typically characterized by a sparse feature space, which makes the corresponding emotion classification task especially difficult. Although deep neural networks have proved effective for speech recognition and image analysis tasks because of their ability to transform sparse low-level features into dense high-level features, their effectiveness for emotion classification requires further investigation. The main contribution of this paper is a novel semantically rich hybrid neural network (HNN) that leverages unsupervised teaching models to incorporate semantic domain knowledge into the neural network, bootstrapping its inference power and interpretability. To the best of our knowledge, this is the first work to successfully incorporate semantics into neural networks to enhance both social emotion classification and network interpretability. Empirical studies on three real-world social media datasets confirm that the proposed hybrid neural networks outperform other state-of-the-art emotion classification methods.
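The general idea of a "hybrid" network of this kind can be sketched as follows. This is not the authors' implementation; it is a minimal illustration, assuming that the semantic domain knowledge arrives as an unsupervised feature vector (e.g., topic proportions from a topic model) that is concatenated with dense features learned from sparse lexical input before a softmax emotion layer. All dimensions and names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical dimensions (illustrative only): 50 sparse lexical features,
# 10 unsupervised semantic features, 6 emotion classes.
n_lexical, n_semantic, n_emotions = 50, 10, 6

W_dense = rng.normal(scale=0.1, size=(16, n_lexical))            # sparse -> dense
W_out = rng.normal(scale=0.1, size=(n_emotions, 16 + n_semantic))  # hybrid -> classes

def classify(x_lexical, x_semantic):
    """Return a probability distribution over emotion classes for one comment."""
    h = np.tanh(W_dense @ x_lexical)          # dense high-level features
    hybrid = np.concatenate([h, x_semantic])  # inject semantic knowledge
    return softmax(W_out @ hybrid)

probs = classify(rng.random(n_lexical), rng.random(n_semantic))
```

The key design point the sketch shows is that the semantic features bypass the learned transformation and feed the classifier directly, so the network's output remains partly interpretable in terms of the unsupervised semantic dimensions.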
URI: https://repository.cihe.edu.hk/jspui/handle/cihe/427
DOI: 10.1109/TAFFC.2017.2716930
CIHE Affiliated Publication: Yes
Appears in Collections: CIS Publication