Please use this identifier to cite or link to this item: https://repository.seku.ac.ke/handle/123456789/7531
Title: Maximal activation weighted memory for aspect based sentiment analysis
Authors: Mokhosi, Refuoe
Shikali, Casper S.
Qin, Zhiguang
Liu, Qiao
Keywords: Aspect based sentiment analysis
Memory decay
Memory activation
Attention networks
Word similarity
Issue Date: Nov-2022
Publisher: Elsevier
Citation: Computer Speech & Language, Volume 76, Article 101402, November 2022
Abstract: The vast diffusion of social networks has made an unprecedented amount of user-generated data available, increasing the importance of Aspect-Based Sentiment Analysis (ABSA) when extracting sentiment polarity. Although recent research efforts favor the use of self-attention networks to solve the ABSA task, they still face difficulty in extracting long-distance relations between non-adjacent words, especially when a sentence has more than one aspect. We propose the BERT-MAM model, which approaches the ABSA task as a memory activation process regulated by memory decay and word similarity, implying that the importance of a word decays over time until it is reactivated by a similarity boost. We base our experiments on the less commonly used Bidirectional Encoder Representations from Transformers (BERT), achieving competitive results on the Laptop and Restaurant datasets.
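Note: The abstract describes word importance decaying over time (distance) until reactivated by a similarity boost toward the aspect. The following Python sketch only illustrates that general idea; the function name, the exponential decay rule, and the decay parameter are illustrative assumptions and do not reproduce the published BERT-MAM formulation.

import numpy as np

def maximal_activation_weights(token_vecs, aspect_vec, decay=0.9):
    # Hypothetical sketch: decay-weighted importance reactivated by similarity.
    scores = []
    for dist, v in enumerate(token_vecs):
        # cosine similarity of the token to the aspect embedding (the "similarity boost")
        sim = np.dot(v, aspect_vec) / (np.linalg.norm(v) * np.linalg.norm(aspect_vec) + 1e-8)
        # base importance decays with distance, then is boosted by similarity
        scores.append(decay ** dist + sim)
    scores = np.array(scores)
    return np.exp(scores) / np.exp(scores).sum()  # softmax-normalised attention-like weights

Applied to BERT token embeddings and the embedding of an aspect term, such a weighting down-weights distant tokens unless they remain semantically close to the aspect, which is the intuition the abstract conveys.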
Description: https://doi.org/10.1016/j.csl.2022.101402
URI: https://www.sciencedirect.com/science/article/abs/pii/S0885230822000390
http://repository.seku.ac.ke/xmlui/handle/123456789/7531
ISSN: 0885-2308
Appears in Collections:School of Science and Computing (JA)

Files in This Item:
File: Mokhosi_Maximal activation weighted memory....pdf
Description: Abstract
Size: 3.51 kB
Format: Adobe PDF

