dc.contributor.author | Mokhosi, Refuoe |
dc.contributor.author | Shikali, Casper S. |
dc.contributor.author | Qin, Zhiguang |
dc.contributor.author | Liu, Qiao |
dc.date.accessioned | 2024-03-25T13:08:26Z |
dc.date.available | 2024-03-25T13:08:26Z |
dc.date.issued | 2022-11 |
dc.identifier.citation | Computer Speech & Language, Volume 76, November 2022, 101402 | en_US
dc.identifier.issn | 0885-2308 |
dc.identifier.uri | https://www.sciencedirect.com/science/article/abs/pii/S0885230822000390 |
dc.identifier.uri | http://repository.seku.ac.ke/xmlui/handle/123456789/7531 |
dc.description | https://doi.org/10.1016/j.csl.2022.101402 | en_US
dc.description.abstract | The vast diffusion of social networks has made an unprecedented amount of user-generated data available, increasing the importance of Aspect Based Sentiment Analysis (ABSA) for extracting sentiment polarity. Although recent research efforts favor self-attention networks for the ABSA task, they still face difficulty in extracting long-distance relations between non-adjacent words, especially when a sentence contains more than one aspect. We propose the BERT-MAM model, which approaches the ABSA task as a memory activation process regulated by memory decay and word similarity, implying that the importance of a word decays over time until it is reactivated by a similarity boost. We base our experiments on the less commonly used Bidirectional Encoder Representations from Transformers (BERT) and achieve competitive results on the Laptop and Restaurant datasets. | en_US
dc.language.iso | en | en_US
dc.publisher | Elsevier | en_US
dc.subject | Aspect based sentiment analysis | en_US
dc.subject | Memory decay | en_US
dc.subject | Memory activation | en_US
dc.subject | Attention networks | en_US
dc.subject | Word similarity | en_US
dc.title | Maximal activation weighted memory for aspect based sentiment analysis | en_US
dc.type | Article | en_US