TY - JOUR
T1 - Attention-based long short-term memory network using sentiment lexicon embedding for aspect-level sentiment analysis in Korean
AU - Song, Minchae
AU - Park, Hyunjung
AU - Shin, Kyung shik
N1 - Publisher Copyright:
© 2018 Elsevier Ltd
PY - 2019/5
Y1 - 2019/5
N2 - Although deep learning breakthroughs in NLP are based on learning distributed word representations by neural language models, these methods suffer from a classic drawback of unsupervised learning techniques. Furthermore, the performance of general-purpose word embeddings has been shown to be heavily task-dependent. To tackle this issue, recent studies have proposed learning sentiment-enhanced word vectors for sentiment analysis. However, the common limitation of these approaches is that they require external sentiment lexicon sources, and the construction and maintenance of these resources involve a set of complex, time-consuming, and error-prone tasks. In this regard, this paper proposes a sentiment lexicon embedding method that represents the semantic relationships of sentiment words better than existing word embedding techniques, without a manually annotated sentiment corpus. The major distinguishing features of the proposed framework are that it jointly encodes morphemes and their POS tags and trains only important lexical morphemes in the embedding space. To verify the effectiveness of the proposed method, we conducted experiments comparing it with two baseline models. As a result, the revised embedding approach mitigated the problems of conventional context-based word embedding methods and, in turn, improved the performance of sentiment classification.
AB - Although deep learning breakthroughs in NLP are based on learning distributed word representations by neural language models, these methods suffer from a classic drawback of unsupervised learning techniques. Furthermore, the performance of general-purpose word embeddings has been shown to be heavily task-dependent. To tackle this issue, recent studies have proposed learning sentiment-enhanced word vectors for sentiment analysis. However, the common limitation of these approaches is that they require external sentiment lexicon sources, and the construction and maintenance of these resources involve a set of complex, time-consuming, and error-prone tasks. In this regard, this paper proposes a sentiment lexicon embedding method that represents the semantic relationships of sentiment words better than existing word embedding techniques, without a manually annotated sentiment corpus. The major distinguishing features of the proposed framework are that it jointly encodes morphemes and their POS tags and trains only important lexical morphemes in the embedding space. To verify the effectiveness of the proposed method, we conducted experiments comparing it with two baseline models. As a result, the revised embedding approach mitigated the problems of conventional context-based word embedding methods and, in turn, improved the performance of sentiment classification.
KW - Attention mechanism
KW - Embedding learning
KW - LSTM
KW - Sentiment analysis
UR - http://www.scopus.com/inward/record.url?scp=85059673324&partnerID=8YFLogxK
U2 - 10.1016/j.ipm.2018.12.005
DO - 10.1016/j.ipm.2018.12.005
M3 - Article
AN - SCOPUS:85059673324
SN - 0306-4573
VL - 56
SP - 637
EP - 653
JO - Information Processing and Management
JF - Information Processing and Management
IS - 3
ER -