FIGURE 5
Source publication
In an era dominated by digital communication, the vast amounts of data generated from social media and financial markets present unique opportunities and challenges for forecasting stock market prices. This paper proposes an innovative approach that harnesses the power of social media sentiment analysis combined with stock market data to predict st...
Context in source publication
Context 1
... more significant errors in this case stress the importance of robust model validation and the possible impact of volatile market conditions on prediction accuracy. To ensure our model neither overfits nor underperforms on the training and validation datasets, we presented its performance in Figure 5. This figure clearly depicts the RMSE loss curves for both datasets throughout the training phase. ...
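The diagnostic described here (comparing training and validation RMSE across epochs) can be reproduced with a few lines of plotting code. A minimal sketch, assuming a Keras-style training history whose metric key names are as shown below (the names are illustrative, not taken from the paper):

```python
import matplotlib.pyplot as plt

def plot_rmse_curves(history):
    """Plot training vs. validation RMSE per epoch to spot over- or underfitting."""
    # `history` is assumed to be a dict such as Keras' model.fit(...).history;
    # the metric key names are hypothetical.
    train_rmse = history["root_mean_squared_error"]
    val_rmse = history["val_root_mean_squared_error"]
    epochs = range(1, len(train_rmse) + 1)

    plt.plot(epochs, train_rmse, label="training RMSE")
    plt.plot(epochs, val_rmse, label="validation RMSE")
    plt.xlabel("epoch")
    plt.ylabel("RMSE")
    plt.legend()
    plt.title("Training vs. validation RMSE loss curves")
    plt.show()
```

Diverging curves (validation RMSE rising while training RMSE keeps falling) would indicate overfitting; two high, flat curves would indicate underfitting.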
Similar publications
Deep learning, a foundational technology in artificial intelligence, facilitates the identification of complex associations between stock prices and various influential factors through comprehensive data analysis. Stock price data exhibits unique time-series characteristics; models emphasizing long-term data may miss short-term fluctuations, while...
Stock price fluctuations reflect market expectations for the economic situation and company profits. Accurately predicting stock prices has become a hot topic in academia. With the rapid development of artificial intelligence, many researchers are starting to use machine learning algorithms to predict stock prices. In this paper, a new time series...
The importance of the steel industry and its continuous development for steel companies in developing countries may lead to potential investment opportunities for investors in the stock markets. With the direct impact of financial indicators, this study was conducted with the main purpose to investigate the relationship between financial factors an...
Accurately forecasting stock prices helps investors decide when and where to invest. However, the dynamic, non-linear, complex and chaotic nature of the stock market makes price forecasting difficult. Market movements are influenced by many macroeconomic factors such as political events, corporate policies, economic conditions, commodity prices and...
The dynamic nature of stock markets, characterized by intricate patterns and sudden fluctuations, poses significant challenges to accurate price prediction. Traditional analytical methods are often unable to capture this complexity. This requires the use of advanced techniques capable of modelling non-linear dependencies. This study aims to build a...
Citations
... DL models, specifically neural networks, have the capacity to process large amounts of unstructured data, such as text and images, used for complex financial tasks. Applications of DL in finance include sentiment analysis of news articles for stock price prediction and the development of risk assessment models [10][11][12]. In recent years, the finance sector has adopted DL techniques because of the increasing availability of big data and computational power. ...
... DL can process large volumes of data and provides good results in fraud detection. A systematic literature review highlighted the effectiveness of different DL models, such as CNN and LSTM networks, in detecting fraud across domains like credit card transactions and insurance claims [10]. These models have improved accuracy [3,23]. ...
Machine learning (ML) and deep learning (DL) have impacted financial analytics, with advanced solutions for classification and regression tasks. This systematic review provides an analysis of state-of-the-art ML/DL applications in finance, with a focus on methods as well as challenges. A total of 41 papers were analysed to identify trends, methodologies, and research gaps in this domain. The study begins with an overview of ML/DL in finance, the main classification and regression problems, and challenges in financial data modelling. It then explores ML/DL techniques for classification tasks such as credit scoring, fraud detection, and algorithmic trading, evaluating traditional and modern approaches, including transformer-based models for sentiment analysis. Regression-oriented applications are analysed, with a focus on stock price prediction, volatility forecasting, and portfolio optimization, with insights into hybrid modelling strategies. Comparative analysis assesses ML/DL models based on performance metrics, interpretability, and trade-offs between accuracy, computational complexity, and generalizability. The paper also identifies challenges, including data quality, ethical concerns, models, and the integration of ML/DL with traditional financial frameworks. Trends such as explainable AI, federated learning, and quantum computing are discussed as future directions. Findings show the increasing role of ML/DL in financial decision-making.
... Investor sentiment plays a key role in stock price variations, particularly in an era where social media and digital news platforms drive market behaviour. Peivandizadeh et al. [10] focused on regional implementations and incorporated social sentiment with models like transductive LSTM. The application of generative models and transfer learning also presents a growing trend. ...
... The application of generative models and transfer learning also presents a growing trend. Peivandizadeh et al. [10] employed transductive LSTM combined with social media sentiment to enhance generalization capabilities in price forecasting. These studies offered contextual depth but called for broader validation across indices and real-time markets. ...
... Furthermore, Lou (2023) [12] and Zhu and Yen (2024) ... Most models rely on historical data rather than real-time sentiment updates, limiting practical trading applications [1], [7], [10], [12], [13], [26]. ...
The conventional financial value estimating techniques rely primarily on historical stock data, technical indicators, and fundamental parameters, and frequently ignore psychological and sentiment-driven factors. Most research efforts in ML-based stock prediction focus on decision fusion techniques, hybrid ensemble learning, deep neural networks, and sentiment-aware financial forecasting models. Recently designed time-series prediction models such as RNN, CNN, and LSTM achieve a certain level of accuracy. However, market trends are influenced by social media mood, price volatility, investors' mentality, and global cues from fiscal markets. Hybrid approaches that combine information from social media platforms such as Twitter and Reddit, financial news, and technical indicators have demonstrated superior forecasting accuracy compared with conventional models that rely only on historical datasets. The decision fusion paradigm has gained traction in stock forecasting, allowing researchers to combine multiple prediction models to enhance forecasting precision. The effectiveness of ensemble learning models, including XGBoost, GBM, and AdaBoost, in stock price forecasting has been widely studied. DL models such as Transformer-based NLP models have further advanced sentiment analysis applications in finance. The ability of BERT to contextualize textual data and extract nuanced financial sentiment has led to significant improvements in sentiment-aware forecasting models. Additionally, Aspect-Based Sentiment Analysis has been employed to disaggregate financial news sentiment, enabling a more granular understanding of investor perceptions and economic events. The review identifies key limitations in current stock forecasting models, including overfitting issues, interpretability concerns, and data scarcity for rare financial events. Additionally, multimodal financial forecasting frameworks integrating textual, numerical, and visual data sources can provide more comprehensive market insights. The adoption of reinforcement learning-based trading strategies, coupled with real-time streaming sentiment analysis, holds significant potential for improving algorithmic trading performance. This paper bridges the gap between machine intelligence research and real-world stock market applications, contributing to the expanding field of AI-driven financial analytics.
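As an illustration of the transformer-based financial sentiment scoring discussed above, the following sketch scores headlines with the Hugging Face pipeline API. The model checkpoint and headlines are assumptions for illustration, not details drawn from the reviewed works:

```python
from transformers import pipeline

# "ProsusAI/finbert" is one publicly available finance-tuned checkpoint;
# the review does not prescribe a specific model.
sentiment = pipeline("sentiment-analysis", model="ProsusAI/finbert")

headlines = [
    "Company X beats earnings expectations and raises full-year guidance.",
    "Regulators open an investigation into Company Y's accounting practices.",
]

for headline, result in zip(headlines, sentiment(headlines)):
    # Each result is a dict like {"label": "positive", "score": 0.97}.
    print(f"{result['label']:>8}  {result['score']:.2f}  {headline}")
```

Scores produced this way can then be aggregated per day or per ticker and joined with price data as additional model features.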
... This imbalance can reduce the effectiveness of Machine Learning models, leading to biased predictions and suboptimal classification results [6]. In sentiment analysis research, imbalanced datasets often cause models to favor dominant sentiment classes while underrepresenting minority classes, which can lead to misleading conclusions [7]. Previous studies have shown that addressing data imbalance through balancing techniques can significantly improve model performance in sentiment classification [8], [9]. ...
Sentiment analysis is crucial for understanding public opinion, especially in political contexts like the 2024 South Sumatra gubernatorial election. Social media platforms such as Twitter and YouTube provide key sources of public sentiment, which can be analyzed using machine learning to classify opinions as positive, neutral, or negative. However, challenges such as data imbalance and selecting the right model to improve classification accuracy remain significant. This study compares five machine learning algorithms (SVM, Naïve Bayes, KNN, Decision Tree, and Random Forest) and examines the impact of data balancing on their performance. Data was collected via Twitter crawling (140 entries) and YouTube scraping (384 entries), and text features were extracted using CountVectorizer. The models were then evaluated on imbalanced and balanced datasets using accuracy, precision, recall, and F1-score. The Decision Tree and Random Forest models achieved the highest accuracies of 79.22% and 75.32% on imbalanced data, respectively. However, they also exhibited overfitting, as indicated by their near-perfect training performance. Naïve Bayes, on the other hand, demonstrated the lowest accuracy at 54.55% despite achieving high precision, suggesting frequent misclassification, particularly for the minority class. SVM and KNN also struggled with imbalanced data, recording accuracies of 58.44% and 63.64%, respectively. Significant improvements were observed after applying data balancing techniques. The accuracy of SVM increased to 71.43%, and KNN improved to 66.23%, indicating that these models are more stable and effective when class distributions are even. These findings highlight the substantial impact of data balancing on model performance, particularly for methods sensitive to class distribution. While tree-based models achieved high accuracy on imbalanced data, their tendency to overfit underscores the importance of balancing techniques to enhance model generalization.
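A minimal sketch of the kind of balancing step evaluated in this study, assuming simple random oversampling of the minority class before fitting a CountVectorizer + linear SVM pipeline (the toy texts, labels, and parameters are hypothetical):

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import LinearSVC
from sklearn.utils import resample

# Toy corpus standing in for the crawled tweets/comments (illustrative only).
texts = np.array(["great candidate", "good program", "love this plan",
                  "bad policy", "terrible speech", "awful debate",
                  "nice vision", "strong leadership"], dtype=object)
labels = np.array([1, 1, 1, 0, 0, 0, 1, 1])  # 1 = positive, 0 = negative

# Oversample the minority class so both classes have equal counts.
majority = texts[labels == 1]
minority = texts[labels == 0]
minority_up = resample(minority, replace=True,
                       n_samples=len(majority), random_state=42)
X_bal = np.concatenate([majority, minority_up])
y_bal = np.array([1] * len(majority) + [0] * len(minority_up))

# CountVectorizer features + linear SVM, as in the compared pipelines.
vec = CountVectorizer()
clf = LinearSVC()
clf.fit(vec.fit_transform(X_bal), y_bal)
print(clf.predict(vec.transform(["bad leadership", "great plan"])))
```

Comparing accuracy, precision, recall, and F1 before and after such a balancing step reproduces the kind of comparison reported in the study.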
... A further critical component of IT traffic is the increasing speed of broadband. Compared to IP video and audio, the network requirements are substantially higher [3]. Distributing available bandwidth in a heavily populated network is no easy task. ...
... Ablation investigations confirm that the TLSTM and Off-policy PPO components have a positive effect on the model's overall performance. The proposed method contributes to the field of financial analytics by providing a more sophisticated understanding of market dynamics, while also providing investors and policymakers with useful information to help them navigate the complexities of the stock market with greater assurance and precision [35]. ...
Extreme learning approaches are used to perform sentiment analysis on restaurant evaluations. Sluggish training and overfitting are two problems that traditional supervised learning techniques frequently face. In order to prevent overfitting and speed up training, extreme learning makes use of single hidden layer neural networks with randomly assigned weights. The goal is to create a quick and accurate model for identifying sentiment in meal reviews and to assess the differences between supervised learning techniques and models based on extreme learning. In order to map the scores to attitudes, the study uses a dataset of food reviews, where scores larger than three are interpreted as favourable and scores below that as negative. Training and testing sets are created from the dataset following preparation, which includes handling missing values and choosing pertinent columns. For modeling purposes, text data is transformed into Term Frequency-Inverse Document Frequency (TF-IDF) features. The network is given randomly initialized weights and biases in both the single-layer and multi-layer perceptron implementations. During model training, the loss function is computed, weights and biases are adjusted by backpropagation, and predictions are computed using forward propagation. By using a threshold of 0.5, the model's accuracy is assessed. Accuracy scores are used as the metric for reporting training and testing accuracy. The model further validates the effectiveness for sentiment analysis in the context of food reviews by showing quicker training times and less sensitivity to overfitting. The work presents extreme learning approaches as a competitive substitute for supervised learning, which advances sentiment analysis tools. Comparing the extreme-learning-based model to traditional supervised learning methods, experimental results show that it achieves competitive accuracy in sentiment analysis, with faster training times and less sensitivity to overfitting as further benefits. As a strong substitute for supervised learning techniques, this study highlights the effectiveness of extreme learning approaches for sentiment analysis in meal evaluations. Extreme learning improves the efficacy and precision of sentiment analysis models, especially in areas such as restaurant evaluations, furthering the practical uses of sentiment analysis tools.
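Although the abstract mentions both backpropagation and extreme learning, the classic extreme-learning recipe fixes the randomly assigned hidden-layer weights and solves the output weights in closed form. A minimal sketch on TF-IDF features, with illustrative reviews and labels rather than the study's dataset:

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

# Illustrative review snippets and labels (1 = positive, 0 = negative).
reviews = ["the food was wonderful", "service was excellent",
           "cold meal and rude staff", "terrible taste, never again"]
labels = np.array([1, 1, 0, 0])

# TF-IDF features, as described in the abstract.
X = TfidfVectorizer().fit_transform(reviews).toarray()

rng = np.random.default_rng(0)
n_hidden = 50

# Randomly assigned, fixed input weights and biases (the ELM idea).
W = rng.normal(size=(X.shape[1], n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                      # hidden-layer activations

# Output weights solved in closed form (least squares), no backpropagation.
beta, *_ = np.linalg.lstsq(H, labels, rcond=None)

# Predict with a 0.5 threshold, as in the described evaluation.
preds = (H @ beta > 0.5).astype(int)
print("training accuracy:", (preds == labels).mean())
```

Because only the output weights are fitted, training reduces to a single linear solve, which is where the speed advantage over backpropagation-trained networks comes from.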
... The developed DRL approach improves the precision of the decision-making process. Peivandizadeh et al. [20] proposed a transductive long short-term memory (TLSTM) based stock market prediction model. A social media sentiment analysis technique is also employed in the model to analyze market scenarios. ...
Stock market stability relies on shares, investors' and stakeholders' participation, and global commodity exchanges. In general, multiple factors influence stock market stability to ensure profitable returns and commodity transactions. This article presents a contradictory-factor-based stability prediction model using the sigmoid deep learning paradigm. Sigmoid learning identifies the possible stabilization of different influencing factors toward a profitable stock exchange. In this model, each influencing factor is mapped to the profit outcomes considering the live shares and their exchange value. The stability is predicted using sigmoid and non-sigmoid layers repeatedly until the maximum is reached. This stability is matched with previous outcomes to predict the consecutive hours of stock market changes. Based on the actual and predicted changes, the sigmoid function is altered to accommodate the new range. The non-sigmoid layer remains unchanged across these changes to improve prediction precision. Based on these outcomes, the deep learning model's sigmoid layer is trained to identify abrupt changes in the stock market.
... Sudden stock market volatility creates risk for investments and investors' credibility [1,2]. Foremost, analysts and investors face difficulty predicting the future status of company stocks [3,5]. Therefore, the prediction problems of the stock market affect investors' decision-making. ...
... Therefore, the prediction problems of the stock market affect investors' decision-making, and knowing the best time to buy or sell stocks is the main goal of prediction [5,6]. Stock news indicators provide an opportunity to analyze the market through news sentiment, expectations, or behaviors. ...
... Stock news indicators provide an opportunity to analyze the market through news sentiment, expectations, or behaviors. This creates the need for an accurate prediction model of stock behavior to reduce trading risk [5]. The sentiment of stock news regarding companies provides primary variables that impact stock prices [6]. ...
Stock market prediction is of vital interest to financial analysts, investors, and other market participants. Predicting the future status of companies' stocks is difficult, and modelling stock market behavior depends on polarity prediction classification. Therefore, it is essential to use sentiment analysis to study attention indicators for stock market behavior in the news. Sentiment analysis (SA) can be used to extract public sentiments from stock news microblog platforms. Previous studies used machine learning (ML) algorithms to classify Arabic stock news into positive, negative, or neutral types. Recently, deep learning (DL) algorithms have provided good accuracy for Arabic SA. Motivated by such results, this study applies ML and DL techniques to classify sentiments of Arabic stock news. 30,098 articles were collected and preprocessed from the Saudi stock-market platform, Tadawul. For the sake of comparison, two ML and two DL techniques were applied for SA: Naive Bayes (NB), logistic regression, fastText, and long short-term memory (LSTM). These algorithms were used to classify the sentiments of the collected data and help investors and stock analysts with decision-making. The results show that the DL techniques outperformed the ML algorithms. The LSTM model achieved 84% accuracy, the same as the reduced-features logistic regression model, but with the fewest features over the same timeframe. Therefore, the LSTM model simultaneously has the best accuracy and the fewest features. On the other hand, the NB models achieved the worst performance. The Arabic SA models assist decision-making by predicting upcoming stock trends for investors and analysts based on stock news sentiment. These sentiment models would limit risk by supporting analysts' decision-making. Thus, this study is a valuable resource for the stock market sector with respect to Arabic linguistic features.
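A minimal sketch of an LSTM sentiment classifier of the kind compared here, written with Keras; the toy corpus, vocabulary handling, and hyperparameters are illustrative assumptions rather than the study's configuration (the same pipeline applies to tokenized Arabic text):

```python
import numpy as np
import tensorflow as tf

# Tiny illustrative stand-in for the Tadawul news corpus.
texts = np.array(["profits rise strongly", "shares fall after weak results",
                  "record revenue announced", "losses widen this quarter"])
labels = np.array([1, 0, 1, 0])  # 1 = positive, 0 = negative

# Integer-encode the words and pad to a fixed length.
vectorizer = tf.keras.layers.TextVectorization(output_sequence_length=8)
vectorizer.adapt(texts)
X = vectorizer(texts)

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=vectorizer.vocabulary_size(),
                              output_dim=16),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(X, labels, epochs=5, verbose=0)
print(model.predict(vectorizer(np.array(["revenue rises"]))))
```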
... The training data influenced the recommended features of T-LSTM, which were based on the length of the test data point Xn, referred to as T. The definitive aim of the training phase was to improve efficacy closer to the test point and enhance the efficiency of the model. Considering a hidden state K(n), the state space of the T-LSTM model is presented in Equation 1 as follows (Peivandizadeh et al., 2024): ...
Tomatoes are considered one of the most valuable vegetables around the world due to their usage and minimal harvesting period. However, effective harvesting still remains a major issue because tomatoes are easily susceptible to weather conditions and other types of attacks. Thus, numerous research studies have been introduced based on deep learning models for the efficient classification of tomato leaf disease. However, the usage of a single architecture does not provide the best results due to the limited computational ability and classification complexity. Thus, this research used Transductive Long Short-Term Memory (T-LSTM) with an attention mechanism. The attention mechanism introduced in T-LSTM has the ability to focus on various parts of the image sequence. Transductive learning exploits the specific characteristics of the training instances to make accurate predictions. This can involve leveraging the relationships and patterns observed within the dataset. The T-LSTM is based on the transductive learning approach and the scaled dot product attention evaluates the weights of each step based on the hidden state and image patches which helps in effective classification. The data was gathered from the PlantVillage dataset and the pre-processing was conducted based on image resizing, color enhancement, and data augmentation. These outputs were then processed in the segmentation stage where the U-Net architecture was applied. After segmentation, VGG-16 architecture was used for feature extraction and the classification was done through the proposed T-LSTM with an attention mechanism. The experimental outcome shows that the proposed classifier achieved an accuracy of 99.98% which is comparably better than existing convolutional neural network models with transfer learning and IBSA-NET.
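The scaled dot-product attention referred to above is a standard operation: attention weights come from softmax(QK^T / sqrt(d)) and are used to average the values. A minimal NumPy sketch over a hypothetical sequence of hidden states (the query/key/value wiring is an assumption, not necessarily the paper's exact design):

```python
import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """Standard scaled dot-product attention: softmax(QK^T / sqrt(d)) V."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)          # (n_q, n_k) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ values, weights

# Hypothetical shapes: 10 steps (e.g. patch-sequence hidden states) of size 64.
rng = np.random.default_rng(0)
hidden_states = rng.normal(size=(10, 64))

# Using the final hidden state as the query and all states as keys/values
# is one common way to weight each step.
context, attn = scaled_dot_product_attention(hidden_states[-1:],
                                             hidden_states, hidden_states)
print(context.shape, attn.shape)   # (1, 64) (1, 10)
```

The attention weights indicate which steps (image patches in this application) contribute most to the final classification.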
... Following the advent of the original LSTM, various enhancements and unique configurations, such as TLSTM, have been designed to increase its usefulness in diverse study applications [38]. The state space of the TLSTM is described in Equation 29. ...
This study tackles the complex challenge of accurately predicting stock market volatility through indicators from the housing market. We propose a sophisticated Early Warning System (EWS) designed to forecast stock market instability by leveraging the predictive power of housing market bubbles. Current EWS methods often face significant hurdles, including model generalization, feature selection, and hyperparameter optimization challenges. To directly address these issues, our innovative approach utilizes a spatial attention-based Transductive Long Short-Term Memory (TLSTM) model combined with a Reinforcement Learning (RL) strategy, which is further enhanced by a novel scope loss function for refined feature selection and an Artificial Bee Colony (ABC) algorithm for hyperparameter optimization. The TLSTM model surpasses traditional LSTM models by effectively capturing subtle temporal shifts and prioritizing data points proximate to the test sample, thereby enhancing model generalization. The RL component actively refines feature selection through continuous data interaction, ensuring the model captures the most significant features and effectively mitigates the risk of overfitting. The introduction of the scope loss function strategically manages the trade-off between exploiting known data and exploring new patterns, thereby maintaining a healthy balance between accuracy and generalizability. Additionally, the customized ABC algorithm specifically optimizes hyperparameters to increase the adaptability and performance of the model under varying market conditions. We validated our EWS using data from the Korean market, achieving an impressive accuracy of 90.427%. This validation demonstrates the robust capability of the system to forecast market dynamics. Our study significantly contributes to financial analytics by providing deeper insights into the interactions between housing and stock markets, particularly during periods of market bubbles. This research not only enhances predictive accuracy but also aids in understanding complex market behaviors, thereby offering valuable tools for financial risk management and decision-making.
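One way to realize the transductive idea of prioritizing data points proximate to the test sample is to weight each training example by a kernel of its distance to the test point and pass those weights into the loss. A minimal sketch under that assumption (the Gaussian kernel, bandwidth, and time indexing are illustrative, not the authors' formulation):

```python
import numpy as np

def transductive_weights(train_times, test_time, bandwidth=30.0):
    """Gaussian kernel weights that emphasize training points near the
    test point in time; `bandwidth` (in days) is an illustrative choice."""
    d = np.abs(np.asarray(train_times, dtype=float) - float(test_time))
    return np.exp(-(d ** 2) / (2.0 * bandwidth ** 2))

# Example: daily indices 0..499 for training, test point at index 520.
w = transductive_weights(np.arange(500), 520)
print(w[:3], w[-3:])   # distant points ~0, recent points close to 1

# These weights could then be supplied as per-sample weights during training,
# e.g. via Keras' model.fit(X, y, sample_weight=w).
```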
... Neutrosophic logic is a nascent field of analysis in which every proposition is considered to have a proportion of indeterminacy in subset I, falsity in subset F, and truth in subset T [1]. A neutrosophic set (NS) is effectively implemented for processing unknown data, offers advantages in handling indeterminate data, and supports classification and data analysis applications [2]. NS delivers an accurate and efficient methodology for describing imbalanced data depending upon the data features. ...
Neutrosophic cognitive maps are an extension of fuzzy cognitive maps that incorporate indeterminacy in causal relations. Fuzzy cognitive maps do not accommodate indeterminate relationships, making them less adequate for real-time applications. A logic in which every proposition is projected to have a truth percentage in subset T and a falsity percentage in subset F is named neutrosophic logic. This logic is also considered a general form of intuitionistic fuzzy logic. Stock price prediction is a main topic in economics and finance, which in recent years has pushed investigators to develop improved predictive methods. Predicting the price and trend of the stock market is an indispensable aspect of finance and investment. Many scientists have presented ideas for predicting market prices to profit from trading, using methods such as statistical and technical analysis. This manuscript proposes a Neutrosophic Cognitive Map-Based Short-Term Financial Stock Market Price Trend Prediction (NCM-SFSMPTP) model. The main goal of the NCM-SFSMPTP technique is to provide an accurate approach for stock market price trend prediction. First, the min-max normalization methodology is utilized in the data normalization phase to standardize and scale data for consistency, comparability, and efficient processing. For the classification process, the neutrosophic cognitive map (NCM) technique is employed. Finally, improved arithmetic optimization algorithm (IAOA)-based hyper-parameter selection is implemented to enhance the classification outcomes of the NCM system. The performance of the NCM-SFSMPTP methodology is validated on the Apple Stock Price Trend and Indicators dataset and the outcomes are evaluated with respect to several measures. The experimental validation of the NCM-SFSMPTP method showed a superior accuracy of 94.79% over existing models in the stock market price trend prediction process.
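The min-max normalization mentioned in the data preparation phase is the standard rescaling x' = (x - min) / (max - min). A minimal sketch, with illustrative prices rather than the paper's dataset:

```python
import numpy as np

def min_max_normalize(x, feature_range=(0.0, 1.0)):
    """Rescale each column to the given range: x' = (x - min) / (max - min)."""
    x = np.asarray(x, dtype=float)
    lo, hi = x.min(axis=0), x.max(axis=0)
    # Guard against constant columns to avoid division by zero.
    scaled = (x - lo) / np.where(hi - lo == 0, 1.0, hi - lo)
    a, b = feature_range
    return a + scaled * (b - a)

# Illustrative prices (not from the Apple dataset used in the paper).
prices = np.array([[101.2], [103.8], [99.5], [107.1]])
print(min_max_normalize(prices).ravel())
```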