Abstract:
The widespread diffusion of social networks has made an unprecedented amount of user-generated data available, increasing the importance of Aspect-Based Sentiment Analysis (ABSA) for extracting sentiment polarity. Although recent research efforts favor self-attention networks for the ABSA task, such networks still struggle to capture long-distance relations between non-adjacent words, especially when a sentence contains more than one aspect. We propose the BERT-MAM model, which approaches the ABSA task as a memory activation process regulated by memory decay and word similarity: the importance of a word decays over time until it is reactivated by a similarity boost. We base our experiments on the less commonly used Bidirectional Encoder Representations from Transformers (BERT) model and achieve competitive results on the Laptop and Restaurant datasets.