Figure: Sentiment Polarity Classification
Uploaded by Viviane Moreira

Source publication
Preprint
Full-text available
BERT (Bidirectional Encoder Representations from Transformers) and ALBERT (A Lite BERT) are methods for pre-training language models which can later be fine-tuned for a variety of Natural Language Understanding tasks. These methods have been applied to a number of such tasks (mostly in English), achieving results that outperform the state-of-the-ar...
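
The abstract describes fine-tuning a pre-trained encoder such as BERT for downstream tasks like sentiment polarity classification. Below is a minimal sketch of that pattern, assuming the Hugging Face transformers library; the checkpoint name, toy examples, label set, and single training step are illustrative assumptions, not the setup used in the preprint.

```python
# Minimal sketch: fine-tune a pre-trained BERT encoder for binary sentiment
# polarity classification (assumed library: Hugging Face transformers).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "bert-base-multilingual-cased"  # hypothetical checkpoint choice
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

texts = ["great movie :)", "terrible service :("]  # toy examples
labels = torch.tensor([1, 0])                       # 1 = positive, 0 = negative

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
outputs = model(**batch, labels=labels)  # returns cross-entropy loss and logits
outputs.loss.backward()
optimizer.step()
```

In practice the same loop would run over mini-batches of the task's training set for a few epochs before evaluation.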

Contexts in source publication

Context 1
... some examples expose the sentiment through emoticons, so correctly interpreting these symbols [...] Table 5, we did a 5-fold cross-validation over all data. The fact that this corpus has several emoticons and out-of-vocabulary expressions makes it hard for the models that were [...]. There is a public ranking of this classification task on Kaggle 15. ...
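
The context above mentions a 5-fold cross-validation over all data. Below is a minimal sketch of such a protocol, assuming scikit-learn; the toy messages, TF-IDF features, and logistic-regression classifier are stand-ins for illustration, not the fine-tuned transformer models evaluated in the source publication.

```python
# Minimal sketch: stratified 5-fold cross-validation for sentiment polarity
# (assumed library: scikit-learn). The classifier is a placeholder model.
from sklearn.model_selection import StratifiedKFold
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

# Toy corpus: short messages with emoticons, 1 = positive, 0 = negative.
texts = ["love it :)", "great phone", "so good", "happy with it :)", "excellent",
         "hate it :(", "terrible", "so bad :(", "waste of money", "awful"]
labels = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]

skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
scores = []
for train_idx, test_idx in skf.split(texts, labels):
    vec = TfidfVectorizer(token_pattern=r"\S+")  # keep emoticons as tokens
    X_train = vec.fit_transform([texts[i] for i in train_idx])
    X_test = vec.transform([texts[i] for i in test_idx])
    y_train = [labels[i] for i in train_idx]
    y_test = [labels[i] for i in test_idx]
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    scores.append(f1_score(y_test, clf.predict(X_test)))

print(f"mean F1 over 5 folds: {sum(scores) / len(scores):.3f}")
```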

Similar publications

Article
Full-text available
Aspect-based sentiment analysis (ABSA) includes two sub-tasks, namely aspect extraction and aspect-level sentiment classification. Most existing works address these sub-tasks independently by applying a supervised learning approach using labeled data. However, obtaining such labeled sentences is difficult and extremely expensive. Hence, it is impor...
Preprint
Full-text available
Stimuli are easier to process when the preceding context (e.g., a sentence, in the case of a word) makes them predictable. However, it remains unclear whether context-based facilitation arises due to predictive preactivation of a limited set of relatively probable upcoming stimuli (with facilitation then linearly related to probability) or, instead...
Conference Paper
Full-text available
Despite the impressive growth of the abilities of multilingual language models, such as XLM-R and mT5, it has been shown that they still face difficulties when tackling typologically-distant languages, particularly in the low-resource setting. One obstacle for effective cross-lingual transfer is variability in word-order patterns. It can be potenti...
Conference Paper
Full-text available
This paper presents our solution for SemEval2022 Task 10: Structured Sentiment Analysis. The solution consisted of two modules: the first for sequence tagging and the second for relation classification. In both modules we used transformer-based language models. In addition to utilizing language models specific to each of the five competition langua...
Preprint
Full-text available
We benchmark different strategies of adding new languages (German and Korean) into BigScience's pretrained multilingual language model with 1.3 billion parameters that currently supports 13 languages. We investigate the factors that affect the language adaptability of the model and the trade-offs between computational costs and expected perform...