Article

Dynamic graph in a symbolic data framework: An account of the causal relation using COVID-19 reports and some reflections on the financial world


Abstract

This article aims to evaluate a complex relational structure represented by a graph, considering a high-dimensional dataset within the Symbolic Data Analysis domain. We consider dynamic COVID-19 pandemic data for the first semester of 2020, namely the daily infection rates of 214 countries, together with notable trends from the financial market, and hence an empirical causality. The work is innovative in that we develop a dynamic graphical model for interval data based on a center-range representation, which shrinks the parametric space of high-dimensional time series and uncovers causal relations. Symbolic Data Analysis provides tools to reduce data dimension by fusing multivariate time series into data classes, so that complex information can be handled through symbolic interval-valued (multivalued) variables. In addition, the Multiregression Dynamic Model (MDM) approach estimates a Directed Acyclic Graph (DAG) that distinguishes structural changes and irregular patterns by modeling the joint learning of the multivariate time series, that is, allowing heterogeneous pattern collections and simultaneously estimating relationships across series, now treated as symbolic interval data. Time-varying parameter estimates allowed us to translate, dynamically over the first months of 2020, the internal and external influence of these structures on the interconnectedness of global regions and the worldwide spread of the coronavirus. Descriptions of the internal variation of the regions are then obtained for the later months of the semester, reflecting the lockdowns: virus transmission first occurred in a generalized way worldwide and was then reduced, becoming concentrated within regions rather than between them. Finally, we investigated the association between the disclosure (news) of COVID-19, its empirical impacts, and the performance of the main indices of the global financial market, and a noticeable association between these phenomena was found.
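The full text is not available here, but the two ingredients named in the abstract, the center-range coding of interval data and the closed-form sequential updating behind the MDM, can be sketched. The snippet below is not the authors' implementation: it is a minimal illustration in Python, assuming simulated interval series for a hypothetical parent region X and child region Y, each interval reduced to its center and half-range, and a random-walk dynamic regression filtered with standard dynamic-linear-model recursions.

```python
import numpy as np

def dlm_filter(y, F, V=1.0, W=0.01):
    # Kalman filter for a dynamic regression: y_t = F_t' theta_t + v_t,
    # theta_t = theta_{t-1} + w_t (random-walk, time-varying coefficients).
    n, p = F.shape
    m, C = np.zeros(p), np.eye(p)        # current posterior moments of theta_t
    path = np.zeros((n, p))
    for t in range(n):
        a, R = m, C + W * np.eye(p)      # prior at time t
        f = F[t] @ a                     # one-step-ahead forecast
        Q = F[t] @ R @ F[t] + V          # forecast variance
        A = R @ F[t] / Q                 # adaptive gain
        m = a + A * (y[t] - f)           # posterior mean
        C = R - np.outer(A, A) * Q       # posterior covariance
        path[t] = m
    return path

# Hypothetical interval-valued series: [lower, upper] bounds for two regions.
rng = np.random.default_rng(0)
x_bounds = np.sort(rng.random((180, 2)), axis=1)   # parent region X
y_bounds = np.sort(rng.random((180, 2)), axis=1)   # child region Y

# Center-range representation: each interval becomes (center, half-range).
x_center = x_bounds.mean(axis=1)
y_center = y_bounds.mean(axis=1)
y_halfrange = (y_bounds[:, 1] - y_bounds[:, 0]) / 2

# Time-varying regression of the child's center on the parent's center;
# the same recursion can be rerun on the half-range series.
F = np.column_stack([np.ones_like(x_center), x_center])
center_path = dlm_filter(y_center, F)
halfrange_path = dlm_filter(y_halfrange, np.ones((len(y_halfrange), 1)))
```

In an MDM each node of the DAG receives such a dynamic regression on its parents, so the filtered coefficient paths play the role of the time-varying (internal and external) influences discussed in the abstract.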


... This is the result of fast person-to-person transmission (Fidan & Yuksel, 2021; Pitchaimani & Devi, 2021) of the Covid-19 delta variant, as well as a shortage of suitable strategies for preventing the spread of the virus from groups of confirmed Covid-19 cases to healthy individuals. In addition, the shortage of proper data-modelling and short/long-term Covid-19 outbreak forecasting solutions has also made it challenging for governments to manage the pandemic effectively and to plan for social resource optimization (Nascimento et al., 2021). An accurate pandemic forecasting mechanism also helps governments impose suitable policies that simultaneously deal with the expansion of Covid-19 and ensure social and economic stability (Miao, Last, & Litvak, 2022). Given the severe influence of this pandemic on multiple social aspects, it is necessary to build data analysis systems that capture its spreading temporal patterns. ...
... Similarly, Chatterjee et al. (Chatterjee, Gerdes, & Martinez, 2020) recently studied the use of multiple LSTM-based architectures to efficiently preserve the dynamic temporal information in reported Covid-19 spreading data and produce accurate predictions. Also related to Covid-19 time-series evaluation and learning, Nascimento et al. (Nascimento et al., 2021) recently proposed a novel dynamic graph-based analysis technique built on the multiregression dynamic model (MDM) approach. It helps uncover relationships between routinely reported Covid-19 time series and financial market trends. ...
Article
To prevent outbreaks of Covid-19 infection, many organizations and governments have extensively studied and applied different kinds of quarantine and isolation policies and medical treatments, and have organized massive, fast vaccination strategies for citizens over 18. Several valuable lessons have been learned in different countries during this Covid-19 battle. These studies have shown the usefulness of prompt action in testing and isolating confirmed infectious cases from the community, as well as of social resource planning and optimization through data-driven anticipation. Recently, many studies have demonstrated the effectiveness of short- and long-term forecasting of the number of new Covid-19 cases in the form of time-series data. These predictions have directly supported the optimization of available healthcare resources and the imposition of suitable policies for slowing down the Covid-19 spread, especially in highly populated cities, regions and nations. Advances in deep neural architectures, such as the recurrent neural network (RNN), have produced significant improvements in analyzing and learning time-series datasets for better prediction. However, most recent RNN-based techniques are unable to handle chaotic, non-smooth sequential datasets. The consecutive disturbances and lagged observations in chaotic time-series data such as routine Covid-19 confirmed-case counts lead to low performance in the temporal feature learning process of recent RNN-based models. To meet this challenge, in this paper we propose a novel dual attention-based sequential auto-encoding architecture, called DAttAE. Our proposed model learns and predicts new Covid-19 cases from chaotic and non-smooth time-series data. Specifically, the integration of a dual self-attention mechanism into a Bi-LSTM based auto-encoder lets the model focus directly on specific time ranges of the sequence in order to achieve better prediction. We evaluated the performance of the proposed DAttAE model against multiple traditional and state-of-the-art deep learning techniques for time-series prediction on different real-world datasets. The experimental results demonstrate the effectiveness of the proposed attention-based deep neural approach compared with state-of-the-art RNN-based architectures for time-series Covid-19 outbreak prediction.
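The dual-attention Bi-LSTM auto-encoder (DAttAE) is specific to the paper above; the fragment below only sketches, with illustrative dimensions and names, the two building blocks it combines, namely a bidirectional LSTM encoder and a dot-product self-attention layer over its hidden states, written in PyTorch.

```python
import torch
import torch.nn as nn

class BiLSTMSelfAttention(nn.Module):
    # Generic sketch: Bi-LSTM encoder + dot-product self-attention + linear head.
    def __init__(self, n_features=1, hidden=64):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True,
                               bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)

    def forward(self, x):                        # x: (batch, time, n_features)
        h, _ = self.encoder(x)                   # h: (batch, time, 2*hidden)
        scores = torch.bmm(h, h.transpose(1, 2)) / h.size(-1) ** 0.5
        weights = torch.softmax(scores, dim=-1)  # attention over time steps
        context = torch.bmm(weights, h)          # re-weighted hidden states
        return self.head(context[:, -1])         # next-step case count

model = BiLSTMSelfAttention()
window = torch.randn(8, 14, 1)                   # 8 sequences of 14 daily counts
prediction = model(window)
```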
... Scealy and Wood (2022) developed a novel score matching method to improve parameter estimation. When facing uncertainty, symbolic data, including interval-valued data, have been widely studied (Boczek et al., 2022; Kang et al., 2022; Liu, Wang et al., 2022; Nascimento et al., 2021; Silva et al., 2019). Among these types of data, compositional data is a type of relative data reflecting the proportion of different units in an integrated system. ...
... Furthermore, the S&P 1500 companies in the United States recorded the worst performance in their history. However, South Africa, Ivory Coast and Uganda did not experience a significant impact from the COVID-19 pandemic (Nascimento et al., 2021). ...
Article
This study aims to examine how the COVID-19 pandemic can affect the development of the capital market. This is done by reviewing 15 articles through the literature review method. The study concludes that investor confidence can remain stable or even increase despite the COVID-19 pandemic because of good government policies. Although such policies can help the capital market, the market will remain uncertain without medical solutions such as vaccines or cheaply available drugs. Good management from the top is therefore necessary in dealing with the various shocks in the stock market, which can indicate the quality of corporate governance.
Article
Full-text available
This study aims to scrutinize the influence of the COVID-19 pandemic on unemployment in five selected European economies. To this end, it uses a Fourier causality test for the period December 2019 to December 2020. In the Z-test results, Germany, Spain, and the UK show a significant positive change in unemployment due to COVID-19. The findings show that COVID-19 cases cause unemployment in Germany, Italy, and the UK. Moreover, in terms of deaths, COVID-19 also causes unemployment in Italy and the UK. Overall, the study's outcomes highlight that the pandemic robustly increases the unemployment rate in most of the European economies, one of the rare negative effects of the virus on the European labor market. These novel COVID-19 findings provide a reliable guide for future policy implications in the labor market; an active labor market policy will be urgently needed.
Article
Full-text available
Background: The COVID-19 coronavirus pandemic has affected virtually every region of the globe. At the time of conducting this study, the number of daily cases in the United States was higher than in any other country, and the trend was increasing in most of its states. Google Trends provides measures of public interest in various topics during different periods. Analyzing these trends using data mining methods may provide useful insights and observations regarding the COVID-19 outbreak. Objective: The objective of this study is to assess the predictive ability of different search terms not directly related to COVID-19 with regard to the increase of daily cases in the US, in particular searches for dine-in restaurants and bars. Data were obtained from the Google Trends API and the COVID Tracking Project. Methods: To test the causation of one time series on another, we used Granger's causality test. We considered the causation of two search query trends, related to dine-in restaurants and bars, on daily positive cases in the ten states/territories of the United States with the highest and lowest daily new positive cases. In addition, we used the Pearson correlation to measure the linear relation between trends. Results: Our results showed that, for states/territories with higher numbers of daily cases, the historical trends in search queries related to bars and restaurants, which mainly occurred after re-opening, significantly affect the daily new cases on average. California, for example, had the most searches for restaurants on June 7th, 2020, which affected the number of new cases within two weeks after the peak, with a P-value of .004 for Granger's causality test. Conclusions: Although a limited number of search queries were considered, Google search trends for restaurants and bars showed a significant effect on daily new cases in states/territories with higher numbers of daily new cases in the United States. We showed that such influential search trends could be used as additional information for predicting new cases in each region. This prediction can help healthcare leaders manage and control the impact of COVID-19 outbreaks on society and be prepared for the outcomes.
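The Granger-causality step described above can be reproduced on any pair of daily series. The sketch below uses statsmodels on made-up data, with the column names ("new_cases", "restaurant_searches") standing in for a daily-cases series and a search-trend series; the 14-day maximum lag is an illustrative choice, not the study's setting.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Hypothetical daily data: new cases and search interest for restaurants.
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "new_cases": rng.poisson(100, 120).astype(float),
    "restaurant_searches": rng.normal(50, 10, 120),
})

# Tests whether the second column Granger-causes the first, up to 14-day lags.
results = grangercausalitytests(df[["new_cases", "restaurant_searches"]],
                                maxlag=14, verbose=False)
p_value_lag7 = results[7][0]["ssr_ftest"][1]   # p-value of the F test at lag 7
print(p_value_lag7)
```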
Article
Full-text available
The novel coronavirus (COVID-19) pandemic that emerged from Wuhan city in December 2019 overwhelmed health systems and paralyzed economies around the world. It became the most important public health challenge facing mankind since the 1918 Spanish flu pandemic. Various theoretical and empirical approaches have been designed and used to gain insight into the transmission dynamics and control of the pandemic. This study presents a primer for formulating, analysing and simulating mathematical models for understanding the dynamics of COVID-19. Specifically, we introduce simple compartmental, Kermack-McKendrick-type epidemic models with homogeneously- and heterogeneously-mixed populations, and an endemic model for assessing the potential population-level impact of a hypothetical COVID-19 vaccine. We illustrate how some basic non-pharmaceutical interventions against COVID-19 can be incorporated into the epidemic model. A brief overview of other kinds of models that have been used to study the dynamics of COVID-19, such as agent-based, network and statistical models, is also presented. Possible extensions of the basic model, as well as open challenges associated with the formulation and theoretical analysis of models for COVID-19 dynamics, are suggested.
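As a concrete instance of the compartmental models surveyed above, here is a minimal, homogeneously mixed Kermack-McKendrick SIR model integrated with SciPy; the parameter values and population size are illustrative only.

```python
import numpy as np
from scipy.integrate import solve_ivp

def sir(t, y, beta, gamma):
    # Standard SIR: S' = -beta*S*I/N, I' = beta*S*I/N - gamma*I, R' = gamma*I.
    S, I, R = y
    N = S + I + R
    new_infections = beta * S * I / N
    return [-new_infections, new_infections - gamma * I, gamma * I]

beta, gamma = 0.35, 0.10            # illustrative transmission and recovery rates
y0 = [9990.0, 10.0, 0.0]            # S, I, R in a population of 10,000
sol = solve_ivp(sir, (0, 180), y0, args=(beta, gamma),
                t_eval=np.linspace(0, 180, 181))

print("basic reproduction number R0 =", beta / gamma)
print("peak number of infectious people =", sol.y[1].max())
```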
Article
Full-text available
We analyse a time-delay model formulated with Caputo fractional derivatives, solved using a predictor-corrector method. We provide numerical simulations to show the nature of the disease in the different classes. We derive the existence of unique global solutions to the given time-delay fractional differential equations (DFDEs) under a mild Lipschitz condition, using properties of a weighted norm, Mittag-Leffler functions and the Banach fixed-point theorem. For the graphical simulations, we use real numerical data based on a case study of Wuhan, China, to show the behaviour of the projected model over time. We present various plots for different values of the time delay and the fractional order. We observe that the proposed scheme is effective and easy to implement for the system of DFDEs.
Article
Full-text available
In the present paper, we formulate a new mathematical model for the dynamics of COVID-19 with quarantine and isolation. Initially, we provide a brief discussion of the model formulation and give relevant mathematical results. Then, we consider the fractal-fractional derivative in the Atangana-Baleanu sense and generalize the model. The generalized model is used to obtain its stability results. We show that the model is locally asymptotically stable if R_0 < 1. Further, we consider the real cases reported in China from January 11 to April 9, 2020. The reported cases have been used to obtain the real parameters and the basic reproduction number for the given period, R_0 ≈ 6.6361. The data of reported cases versus the model for classical and fractal-fractional order are presented. We show that the fractal-fractional order model provides the best fit to the reported cases. The fractional mathematical model is solved by a novel numerical technique based on a Newton approach, which is useful and reliable. A brief discussion of the graphical results obtained with the novel numerical procedures is given. Some key parameters that are significant for eliminating the disease from society are explored.
Article
Full-text available
The outbreak of coronavirus named COVID-19 has disrupted the Chinese economy and is spreading globally. The evolution of the disease and its economic impact is highly uncertain, which makes it difficult for policymakers to formulate an appropriate macroeconomic policy response. In order to better understand possible economic outcomes, this paper explores seven different scenarios of how COVID-19 might evolve in the coming year using a modelling technique developed by Lee and McKibbin (2003) and extended by McKibbin and Sidorenko (2006). It examines the impacts of different scenarios on macroeconomic outcomes and financial markets in a global hybrid DSGE/CGE general equilibrium model. The scenarios in this paper demonstrate that even a contained outbreak could significantly impact the global economy in the short run. These scenarios demonstrate the scale of costs that might be avoided by greater investment in public health systems in all economies, but particularly in less developed economies where health care systems are less developed and population density is high.
Article
Full-text available
In this Commentary, we would like to comment on the article titled "A rapid advice guideline for the diagnosis and treatment of 2019 novel coronavirus (2019-nCoV) infected pneumonia (standard version)", a featured article in Military Medical Research. In the guideline, apart from "confirmed cases", the categories "suspected cases", "close contact" and "suspicious exposure" were defined from a clinical perspective based on epidemiological risk, clinical symptoms and auxiliary examination. Combined with our experience, we additionally introduce a simple scoring proposal based not only on CT imaging, as strongly recommended by the guideline, but also on the routine blood test, especially for the primary screening of such patients in the out-patient department.
Article
Full-text available
The Dynamic Conditional Correlation GARCH (DCC-GARCH) mutation model is considered, with its parameters estimated by a Markov chain Monte Carlo approach, and the time-dependent variation is demonstrated visually. Fifteen indices from the main financial markets of developed and developing countries on different continents were analyzed. The performances of the indices are similar, showing a joint evolution. Most index returns, especially SPX and NDX, evolve over time with a high positive correlation.
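The cited study estimates the DCC-GARCH model by MCMC; the fragment below only sketches the DCC(1,1) correlation recursion itself, assuming the univariate GARCH fits have already produced standardized residuals and using illustrative values for the parameters a and b.

```python
import numpy as np

def dcc_correlations(eps, a=0.05, b=0.90):
    # eps: (T, k) standardized residuals from univariate GARCH fits.
    # Q_t = (1 - a - b) * S + a * eps_{t-1} eps_{t-1}' + b * Q_{t-1},
    # R_t = diag(Q_t)^{-1/2} Q_t diag(Q_t)^{-1/2}.
    T, k = eps.shape
    S = np.corrcoef(eps, rowvar=False)       # unconditional correlation target
    Q = S.copy()
    R = np.zeros((T, k, k))
    for t in range(T):
        if t > 0:
            outer = np.outer(eps[t - 1], eps[t - 1])
            Q = (1 - a - b) * S + a * outer + b * Q
        d = 1.0 / np.sqrt(np.diag(Q))
        R[t] = Q * np.outer(d, d)
    return R

# Illustrative use with simulated residuals for three index return series.
rng = np.random.default_rng(2)
eps = rng.standard_normal((500, 3))
R_t = dcc_correlations(eps)
print(R_t[-1])                               # latest conditional correlation matrix
```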
Article
Full-text available
São Paulo is the largest city in South America, with high criminality rates. The number and type of crimes vary considerably around the city, assuming different patterns depending on urban and social characteristics. In this scenario, tools to explore particular locations of the city are very important for domain experts to understand how urban features such as mobility, passerby behavior and urban infrastructure can influence the quantity and type of crimes. In the present work, we present CrimAnalyzer, a visualization-assisted analytic tool that allows users to analyze crime behavior in specific regions of a city, providing new methodologies to identify local crime hotspots and their corresponding patterns over time. CrimAnalyzer was developed from the demands of experts in criminology and addresses three major challenges: i) flexibility to explore local regions and understand their crime patterns; ii) identification not only of hotspots that are prevalent in terms of the number of crimes, but also of hotspots where crimes are frequent yet not numerous; and iii) understanding of the dynamics of crime patterns over time. The effectiveness and usefulness of the proposed system are demonstrated by qualitative and quantitative comparisons as well as case studies involving real data and run by domain experts.
Conference Paper
Full-text available
The emergence of big data has called for new methodologies to analyze large networks. In these contexts there are many cases in which it is important to take into account not only single nodes but groups of nodes that can have the same or similar functions in a given network. For large networks it is important to represent them in a meaningful way, and interval data seem an adequate representation for this purpose. The specific contribution of this work is to show how it is possible to rank the different structural characteristics of the robust communities represented in the network. The ranking applied to the structural characteristics also allows an understanding of the relevant core of the network.
Article
Full-text available
Connectivity studies of the brain are usually based on functional Magnetic Resonance Imaging (fMRI) experiments involving many subjects. These studies need to take into account not only the interaction between areas of a single brain but also the differences amongst those subjects. In this paper we develop a methodology called the group-structure (GS) approach that models possible heterogeneity between subjects and searches for distinct homogeneous sub-groups according to some measure that reflects the connectivity maps. We suggest a GS method that uses a novel distance based on a model selection measure, the Bayes factor. We then develop a new class of Multiregression Dynamic Models to estimate individual networks whilst acknowledging a GS type dependence structure across subjects. We compare the efficacy of this methodology to three other methods, virtual-typical-subject (VTS), individual-structure (IS) and common-structure (CS), used to infer a group network using both synthetic and real fMRI data. We find that the GS approach provides results that are both more consistent with the data and more flexible in their interpretative power than its competitors. In addition, we present two methods, the Individual Estimation of Multiple Networks (IEMN) and the Marginal Estimation of Multiple Networks (MEMN), generated from the GS approach and used to estimate all types of networks informed by an experiment —individual, homogeneous subgroups and group networks. These methods are then compared both from a theoretical perspective and in practice using real fMRI data.
Article
Full-text available
Human behavior and cognition result from a complex pattern of interactions between brain regions. The flexible reconfiguration of these patterns enables behavioral adaptation, such as the acquisition of a new motor skill. Yet, the degree to which these reconfigurations depend on the brain's baseline sensorimotor integration is far from understood. Here, we asked whether spontaneous fluctuations in sensorimotor networks at baseline were predictive of individual differences in future learning. We analyzed functional MRI data from 19 participants prior to six weeks of training on a new motor skill. We found that visual-motor connectivity was inversely related to learning rate: sensorimotor autonomy at baseline corresponded to faster learning in the future. Using three additional scans, we found that visual-motor connectivity at baseline is a relatively stable individual trait. These results suggest that individual differences in motor skill learning can be predicted from sensorimotor autonomy at baseline prior to task execution.
Article
Full-text available
The Multiregression Dynamic Model (MDM) is a multivariate graphical model for a multidimensional time series that allows the estimation of time-varying effective connectivity. An MDM is a state space model where connection weights reflect the contemporaneous interactions between brain regions. Because the marginal likelihood has a closed form, model selection across a large number of potential connectivity networks is easy to perform. With application of the Integer Programming Algorithm, we can quickly find optimal models that satisfy acyclic graph constraints and, due to a factorisation of the marginal likelihood, the search over all possible directed (acyclic or cyclic) graphical structures is even faster. These methods are illustrated using recent resting-state and steady-state task fMRI data.
Article
Full-text available
In large networks there is a specific need to consider patterns related to structured groups of nodes, which can also be defined as communities. In this sense we propose an approach to cluster the different communities using interval data. This approach is relevant in the context of the analysis of large networks and, in particular, for discovering the different functionalities of the communities inside a network. The approach is illustrated in this paper on different examples of networks generated from synthetic data. The application of the approach specifically concerns a large network, the co-authorship network in Astrophysics.
Article
Full-text available
bnlearn is an R package which includes several algorithms for learning the structure of Bayesian networks with either discrete or continuous variables. Both constraint-based and score-based algorithms are implemented, and can use the functionality provided by the snow package to improve their performance via parallel computing. Several network scores and conditional independence algorithms are available for both the learning algorithms and independent use. Advanced plotting options are provided by the Rgraphviz package.
Article
Full-text available
A Multiregression Dynamic Model (MDM) is a class of multivariate time series that represents various dynamic causal processes in a graphical way. One of the advantages of this class is that, in contrast to many other Dynamic Bayesian Networks, the hypothesised relationships accommodate conditional conjugate inference. We demonstrate for the first time how straightforward it is to search over all possible connectivity networks with dynamically changing intensity of transmission to find the Maximum a Posteriori Probability (MAP) model within this class. This search method is made feasible by using a novel application of an Integer Programming algorithm. The efficacy of applying this particular class of dynamic models to this domain is shown and more specifically the computational efficiency of a corresponding search of 11-node Directed Acyclic Graph (DAG) model space. We proceed to show how diagnostic methods, analogous to those defined for static Bayesian Networks, can be used to suggest embellishment of the model class to extend the process of model selection. All methods are illustrated using simulated and real resting-state functional Magnetic Resonance Imaging (fMRI) data.
Chapter
Full-text available
Starting from the main idea of Symbolic Data Analysis to extend statistics and data mining methods from first-order to second-order objects, we focus on network data as defined in the framework of Social Network Analysis in order to define a graph structure and the underlying network in the context of complex data objects. A Network Symbolic Object is defined according to the statistical characterization of the network topological properties. We deal with the choice of suitable network measures transformed in symbolic variables. Their study through multidimensional data analysis, allows the synthetic representation of a network as a point onto a metric space. The proposed approach is discussed on the basis of a simulation study considering three classical network growth processes.
Article
Full-text available
We describe a Bayesian approach for learning Bayesian networks from a combination of prior knowledge and statistical data. First and foremost, we develop a methodology for assessing informative priors needed for learning. Our approach is derived from a set of assumptions made previously as well as the assumption of likelihood equivalence, which says that data should not help to discriminate network structures that represent the same assertions of conditional independence. We show that likelihood equivalence, when combined with previously made assumptions, implies that the user's priors for network parameters can be encoded in a single Bayesian network for the next case to be seen, a prior network, and a single measure of confidence for that network. Second, using these priors, we show how to compute the relative posterior probabilities of network structures given data. Third, we describe search methods for identifying network structures with high posterior probabilities. We describe polynomial algorithms for finding the highest-scoring network structures in the special case where every node has at most k = 1 parent. For the general case (k > 1), which is NP-hard, we review heuristic search algorithms including local search, iterative local search, and simulated annealing. Finally, we describe a methodology for evaluating Bayesian-network learning algorithms, and apply this approach to a comparison of various approaches.
Chapter
Full-text available
State space models have gained tremendous popularity in recent years in as disparate fields as engineering, economics, genetics and ecology. After a detailed introduction to general state space models, this book focuses on dynamic linear models, emphasizing their Bayesian analysis. Whenever possible it is shown how to compute estimates and forecasts in closed form; for more complex models, simulation techniques are used. A final chapter covers modern sequential Monte Carlo algorithms. The book illustrates all the fundamental steps needed to use dynamic linear models in practice, using R. Many detailed examples based on real data sets are provided to show how to set up a specific model, estimate its parameters, and use it for forecasting. All the code used in the book is available online. No prior knowledge of Bayesian statistics or time series analysis is required, although familiarity with basic statistics and R is assumed. Giovanni Petris is Associate Professor at the University of Arkansas. He has published many articles on time series analysis, Bayesian methods, and Monte Carlo techniques, and has served on National Science Foundation review panels. He regularly teaches courses on time series analysis at various universities in the US and in Italy. An active participant on the R mailing lists, he has developed and maintains a couple of contributed packages. Sonia Petrone is Associate Professor of Statistics at Bocconi University,Milano. She has published research papers in top journals in the areas of Bayesian inference, Bayesian nonparametrics, and latent variables models. She is interested in Bayesian nonparametric methods for dynamic systems and state space models and is an active member of the International Society of Bayesian Analysis. Patrizia Campagnoli received her PhD in Mathematical Statistics from the University of Pavia in 2002. She was Assistant Professor at the University of Milano-Bicocca and currently works for a financial software company.
Chapter
Full-text available
Probabilistic graphical models, such as Bayesian networks, allow representing conditional independence information of random variables. These relations are graphically represented by the presence and absence of arcs and edges between vertices. Probabilistic graphical models are nonunique representations of the independence information of a joint probability distribution. However, the concept of Markov equivalence of probabilistic graphical models is able to offer unique representations, called essential graphs. In this survey paper the theory underlying these concepts is reviewed.
Article
Full-text available
Learning Bayesian networks is known to be an NP-hard problem, which is why the application of heuristic search has proven advantageous in many domains. This learning approach is computationally efficient and, even though it does not guarantee an optimal result, many previous studies have shown that it obtains very good solutions. Hill climbing algorithms are particularly popular because of their good trade-off between computational demands and the quality of the models learned. In spite of this efficiency, these algorithms can be improved upon when dealing with high-dimensional datasets, and this is the goal of this paper. Thus, we present an approach to improve hill climbing algorithms based on dynamically restricting the candidate solutions to be evaluated during the search process. This proposal, dynamic restriction, is new because other studies on restricted search available in the literature are based on two stages rather than only one, as presented here. In addition to the aforementioned advantages of hill climbing algorithms, we show that under certain conditions the model they return is a minimal I-map of the joint probability distribution underlying the training data, which is a nice theoretical property with practical implications. We provide theoretical results that guarantee that, under these same conditions, the proposed algorithms also output a minimal I-map. Furthermore, we experimentally test the proposed algorithms over a set of different domains, some of them quite large (up to 800 variables), in order to study their behavior in practice.
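The hill-climbing strategy discussed above can be illustrated with a toy score-based search. The sketch below is not the paper's dynamically restricted algorithm: it is a plain add/delete hill climber over Gaussian networks scored by BIC, with all names and the scoring choice being illustrative.

```python
import numpy as np
from itertools import permutations

def node_bic(data, child, parents):
    # Gaussian BIC for one node given its parents (linear regression).
    y = data[:, child]
    n = len(y)
    X = np.column_stack([np.ones(n)] + [data[:, p] for p in parents])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = max(resid @ resid / n, 1e-12)
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    k = len(parents) + 2                       # coefficients + intercept + variance
    return loglik - 0.5 * k * np.log(n)

def is_acyclic(adj):
    # Kahn's algorithm; adj[i, j] == 1 means an arc i -> j.
    adj = adj.copy()
    indeg = adj.sum(axis=0)
    stack = [i for i in range(len(adj)) if indeg[i] == 0]
    seen = 0
    while stack:
        u = stack.pop()
        seen += 1
        for v in np.flatnonzero(adj[u]):
            adj[u, v] = 0
            indeg[v] -= 1
            if indeg[v] == 0:
                stack.append(v)
    return seen == len(adj)

def hill_climb(data, max_iter=100):
    d = data.shape[1]
    adj = np.zeros((d, d), dtype=int)
    def total(a):
        return sum(node_bic(data, j, list(np.flatnonzero(a[:, j]))) for j in range(d))
    best = total(adj)
    for _ in range(max_iter):
        improved = False
        for i, j in permutations(range(d), 2):
            cand = adj.copy()
            cand[i, j] = 1 - cand[i, j]        # add or delete the arc i -> j
            if not is_acyclic(cand):
                continue
            score = total(cand)
            if score > best + 1e-9:
                adj, best, improved = cand, score, True
        if not improved:
            break
    return adj, best

# Illustrative use on simulated data where variable 0 drives variable 1.
rng = np.random.default_rng(5)
x = rng.standard_normal(500)
z = 0.8 * x + 0.5 * rng.standard_normal(500)
data = np.column_stack([x, z, rng.standard_normal(500)])
dag, score = hill_climb(data)
print(dag)                                     # adjacency matrix of the learned DAG
```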
Conference Paper
Full-text available
We present LiveRAC, a visualization system that supports the analysis of large collections of system management time-series data consisting of hundreds of parameters across thousands of network devices. LiveRAC provides high information density using a reorderable matrix of charts, with semantic zooming adapting each chart's visual representation to the available space. LiveRAC allows side-by-side visual comparison of arbitrary groupings of devices and parameters at multiple levels of detail. A staged design and development process culminated in the deployment of LiveRAC in a production environment. We conducted an informal longitudinal evaluation of LiveRAC to better understand which proposed visualization techniques were most useful in the target environment.
Chapter
Marketing and technology academics, practitioners and policymakers need to think about how COVID-19 has changed their thinking and economic outlook. The aim of this chapter is to focus on the developments COVID-19 has brought to the marketing and technology space, with a view to future research possibilities. This means understanding how crisis management and organisational resilience have adapted as a result of COVID-19. To do this, a number of future research areas are discussed in terms of context, research approaches and integrative opportunities. This allows more focus on the new and innovative ways COVID-19 has changed society. In addition, practitioner perspectives are stated in terms of what marketers need to do in order to survive and thrive in a crisis situation. Policy implications are also stated, highlighting the need for an ecosystem and stakeholder approach to COVID-19, marketing and technology research and practice.
Article
In this paper, we analyse how the Covid-19 pandemic changed the dynamics of the euro to dollar exchange rate. To do so, we make use of spectral non-causality tests to uncover the determinants of the euro to dollar exchange rate, using data that cover the pre-Covid-19 and the Covid-19 era and considering the exchange rate movements of other currencies, the S&P500 stock market index, and the prices of oil and gold, as well as their realized volatilities. Based on our findings, the Covid-19 pandemic has indeed significantly changed the determinants of the euro to dollar exchange rate. Also, to investigate potential shifts in the regimes of the euro to dollar exchange rate, we formulate a Markov-switching model with two regimes, based on the determinants found in the previous step. We find that the duration of the high volatility state in the Covid-19 era has doubled, from almost 3 to approximately 6 days, compared to the pre-Covid-19 era, whereas the high volatility state in the Covid-19 era is characterized by a statistically significantly higher range of volatility compared to the pre-Covid-19 era.
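The two-regime Markov-switching step described above can be sketched with statsmodels. The snippet fits a switching-mean, switching-variance model to a made-up return series standing in for the euro to dollar changes; the data and settings are illustrative, not those of the paper.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical daily returns: a calm first half and a volatile second half.
rng = np.random.default_rng(3)
returns = np.concatenate([rng.normal(0, 0.3, 200), rng.normal(0, 0.9, 200)])

# Two-regime Markov-switching model with regime-specific mean and variance.
model = sm.tsa.MarkovRegression(returns, k_regimes=2, trend="c",
                                switching_variance=True)
fit = model.fit()

print(fit.expected_durations)   # average number of days spent in each regime
print(fit.summary())
```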
Article
We add a simple dynamic process for adaptive "social distancing" measures to a standard SIR model of the COVID pandemic. With a limited attention span and in the absence of a consistent long-term strategy against the pandemic, this process leads to a sweeping of an instability, i.e. fluctuations in the effective reproduction number around its bifurcation value of R_eff = 1. While mitigating the pandemic in the short run, this process remains intrinsically fragile and does not constitute a sustainable strategy that societies could follow for an extended period of time.
Article
No previous infectious disease outbreak, including the Spanish Flu, has affected the stock market as forcefully as the COVID-19 pandemic. In fact, previous pandemics left only mild traces on the U.S. stock market. We use text-based methods to develop these points with respect to large daily stock market moves back to 1900 and with respect to overall stock market volatility back to 1985. We also evaluate potential explanations for the unprecedented stock market reaction to the COVID-19 pandemic. The evidence we amass suggests that government restrictions on commercial activity and voluntary social distancing, operating with powerful effects in a service-oriented economy, are the main reasons the U.S. stock market reacted so much more forcefully to COVID-19 than to previous pandemics in 1918–1919, 1957–1958, and 1968.
Article
Market reactions to the 2019 novel coronavirus disease (COVID-19) provide new insights into how real shocks and financial policies drive firm value. Initially, internationally oriented firms, especially those more exposed to trade with China, underperformed. As the virus spread to Europe and the United States, corporate debt and cash holdings emerged as important value drivers, relevant even after the Fed intervened in the bond market. The content and tone of conference calls mirror this development over time. Overall, the results illustrate how anticipated real effects from the health crisis, a rare disaster, were amplified through financial channels. (JEL G01, G12, G14, G32, F14)
Article
In this paper, we analyze the connectedness between the recent spread of COVID-19, the oil price volatility shock, the stock market, geopolitical risk and economic policy uncertainty in the US within a time-frequency framework. The coherence wavelet method and wavelet-based Granger causality tests applied to recent US daily data unveil the unprecedented impact of COVID-19 and oil price shocks on geopolitical risk levels, economic policy uncertainty and stock market volatility over the low frequency bands. The effect of COVID-19 on geopolitical risk is substantially higher than on US economic uncertainty. The COVID-19 risk is perceived differently over the short and the long run and may at first be viewed as an economic crisis. Our study offers several urgent and prominent implications and recommendations for policymakers and asset managers.
Article
As there is no vaccine or proper medicine for treatment, the recent pandemic caused by COVID-19 has drawn attention to quarantine strategies and other governmental measures, such as lockdowns, media coverage of social isolation, and improvement of public hygiene, to control the disease. Mathematical models can help determine when these intervention measures are the best strategies for disease control, as well as how they might affect the disease dynamics. Motivated by this, in this article we formulate a mathematical model introducing a quarantine class and governmental intervention measures to mitigate disease transmission. We study the dynamical behavior of the model thoroughly in terms of the basic reproduction number. Further, we perform a sensitivity analysis of the basic reproduction number and find that reducing the contact between exposed and susceptible humans is the most critical factor in achieving disease control. To lessen the number of infected individuals as well as to minimize the cost of implementing government control measures, we formulate an optimal control problem, and the optimal control is determined. Finally, we forecast a short-term trend of COVID-19 for the three highly affected states Maharashtra, Delhi, and Tamil Nadu in India, which suggests that the first two states need further monitoring of control measures to reduce the contact between exposed and susceptible humans.
Article
This work aimed to appraise a high-dimensional multivariate time series dataset, presented as intervals, using a Symbolic Data Analysis (SDA) approach. SDA reduces data dimensionality while retaining the complexity of the information through set-valued (interval or multi-valued) variables. Additionally, Dynamic Linear Models (DLM) are distinguished by modeling univariate or multivariate time series in the presence of non-stationarity, structural changes and irregular patterns. We considered neurophysiological (EEG) data associated with experimental manipulation of verticality perception in humans, using transcranial electrical stimulation. The innovation of the present work is centered on the use of a dynamic linear model with the SDA methodology, and on SDA applications for analyzing EEG data.
Chapter
With regard to large networks there is a specific need to consider particular patterns relatable to structured groups of nodes which could be also defined as communities. In this work we will propose an approach to cluster the different communities using interval data. This approach is relevant in the context of the analysis of large networks and, in particular, in order to discover the different functionalities of the communities inside a network. The approach is shown in this paper by considering different examples of networks by means of synthetic data. The application is specifically related to a large network, that of the co-authorship network in Astrophysics.
Article
Multiregression dynamic models are defined to preserve certain conditional independence structures over time across a multivariate time series. They are non‐Gaussian and yet they can often be updated in closed form. The first two moments of their one‐step‐ahead forecast distribution can be easily calculated. Furthermore, they can be built to contain all the features of the univariate dynamic linear model and promise more efficient identification of causal structures in a time series than has been possible in the past.
Article
Simultaneous graphical dynamic linear models (SGDLMs) define an ability to scale on-line Bayesian analysis and multivariate volatility forecasting to higher-dimensional time series. Advances in the methodology of SGDLMs involve a novel, adaptive method of simultaneous predictor selection in forward filtering for on-line learning and forecasting. This Bayesian methodology for dynamic variable selection and Bayesian computation for scalability are highlighted in a case study evidencing the potential for improved short-term forecasting of large-scale volatility matrices. In financial forecasting and portfolio optimization with a 400-dimensional series of daily stock prices, analysis demonstrates SGDLM forecasts of volatilities and co-volatilities that contribute to quantitative investment strategies to improve portfolio returns. Performance metrics linked to the sequential Bayesian filtering analysis define a leading indicator of increased financial market stresses, comparable to but leading standard financial risk measures. Parallel computation using GPU implementations substantially advance the ability to fit and use these models.
Article
Recent studies have suggested that human brain functional networks are topologically organized into functionally specialized but inter-connected modules to facilitate efficient information processing and highly flexible cognitive function. However, these studies have mainly focused on group-level network modularity analyses using “static” functional connectivity approaches. How these extraordinary modular brain structures vary across individuals and spontaneously reconfigure over time remain largely unknown. Here, we employed multiband resting-state functional MRI data (N = 105) from the Human Connectome Project and a graph-based modularity analysis to systematically investigate individual variability and dynamic properties in modular brain networks. We showed that the modular structures of brain networks dramatically vary across individuals, with higher modular variability primarily in the association cortex (e.g., fronto-parietal and attention systems) and lower variability in the primary systems. Moreover, brain regions spontaneously changed their module affiliations on a temporal scale of seconds, which cannot be simply attributable to head motion and sampling error. Interestingly, the spatial pattern of intra-subject dynamic modular variability largely overlapped with that of inter-subject modular variability, both of which were highly reproducible across repeated scanning sessions. Finally, the regions with remarkable individual/temporal modular variability were closely associated with network connectors and the number of cognitive components, suggesting a potential contribution to information integration and flexible cognitive function. Collectively, our findings highlight individual modular variability and the notable dynamic characteristics in large-scale brain networks, which enhance our understanding of the neural substrates underlying individual differences in a variety of cognition and behaviors.
Article
Data Science, considered as a science by itself, is in general terms the extraction of knowledge from data. Symbolic Data Analysis (SDA) gives a new way of thinking in Data Science by extending the standard input to a set of classes of individual entities. Hence, classes of a given population are considered to be units of a higher-level population to be studied. Such classes often represent the real units of interest. In order to take the variability between the members of each class into account, classes are described by intervals, distributions, sets of categories or numbers, sometimes weighted, and the like. In that way, we obtain new kinds of data, called 'symbolic' as they cannot be reduced to numbers without losing much information. The first step in SDA is to build the symbolic data table, where the rows are classes and the variables can take symbolic values. The second step is to study and extract new knowledge from these new kinds of data by at least an extension of Computer Statistics and Data Mining to symbolic data. SDA is a new paradigm which opens up a vast domain of research and applications by giving results complementary to classical methods applied to standard data. SDA also gives answers to big data and complex data challenges, as big data can be reduced and summarized by classes, and as complex data with multiple unstructured data tables and unpaired variables can be transformed into a structured data table with paired symbolic-valued variables. WIREs Comput Stat 2016, 8:172–205. doi: 10.1002/wics.1384
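The first SDA step described above, building a symbolic data table whose rows are classes, can be sketched with pandas: individual-level records are aggregated into interval-valued (min/max) descriptions per class, which can then be recoded as center and half-range. All variable and class names below are illustrative.

```python
import numpy as np
import pandas as pd

# Hypothetical individual-level data: daily infection rate per country,
# with countries grouped into higher-level regions (the classes).
rng = np.random.default_rng(4)
micro = pd.DataFrame({
    "region": rng.choice(["Europe", "Americas", "Asia"], size=300),
    "infection_rate": rng.gamma(2.0, 1.5, size=300),
    "mobility_index": rng.normal(100, 15, size=300),
})

# Symbolic data table: one row per class, each variable an interval [min, max].
symbolic = micro.groupby("region").agg(["min", "max"])
print(symbolic)

# Center-range representation of the interval-valued infection rate.
lo = symbolic[("infection_rate", "min")]
hi = symbolic[("infection_rate", "max")]
center, half_range = (lo + hi) / 2, (hi - lo) / 2
```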
Article
The recently introduced class of simultaneous graphical dynamic linear models (SGDLMs) defines an ability to scale on-line Bayesian analysis and forecasting to higher-dimensional time series. This paper advances the methodology of SGDLMs, developing and embedding a novel, adaptive method of simultaneous predictor selection in forward filtering for on-line learning and forecasting. The advances include developments in Bayesian computation for scalability, and a case study in exploring the resulting potential for improved short-term forecasting of large-scale volatility matrices. A case study concerns financial forecasting and portfolio optimization with a 400-dimensional series of daily stock prices. Analysis shows that the SGDLM forecasts volatilities and co-volatilities well, making it ideally suited to contributing to quantitative investment strategies to improve portfolio returns. We also identify performance metrics linked to the sequential Bayesian filtering analysis that turn out to define a leading indicator of increased financial market stresses, comparable to but leading the standard St. Louis Fed Financial Stress Index (STLFSI) measure. Parallel computation using GPU implementations substantially advance the ability to fit and use these models.
Article
The recent interest in the dynamics of networks and the advent, across a range of applications, of measuring modalities that operate on different temporal scales have put the spotlight on some significant gaps in the theory of multivariate time series. Fundamental to the description of network dynamics is the direction of interaction between nodes, accompanied by a measure of the strength of such interactions. Granger causality and its associated frequency domain strength measures (GEMs) (due to Geweke) provide a framework for the formulation and analysis of these issues. In pursuing this setup, three significant unresolved issues emerge. First, computing GEMs involves computing submodels of vector time series models, for which reliable methods do not exist. Second, the impact of filtering on GEMs has never been definitively established. Third, the impact of downsampling on GEMs has never been established. In this work, using state-space methods, we resolve all these issues and illustrate the results with some simulations. Our analysis is motivated by some problems in (fMRI) brain imaging, to which we apply it, but it is of general applicability.
Article
Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology. New to the Second Edition New chapter on Bayesian network classifiers New section on object-oriented Bayesian networks New section that addresses foundational problems with causal discovery and Markov blanket discovery New section that covers methods of evaluating causal discovery programs Discussions of many common modeling errors New applications and case studies More coverage on the uses of causal interventions to understand and reason with causal Bayesian networks Illustrated with real case studies, the second edition of this bestseller continues to cover the groundwork of Bayesian networks. It presents the elements of Bayesian network technology, automated causal discovery, and learning probabilities from data and shows how to employ these technologies to develop probabilistic expert systems. Web ResourceThe books website at www.csse.monash.edu.au/bai/book/book.html offers a variety of supplemental materials, including example Bayesian networks and data sets. Instructors can email the authors for sample solutions to many of the problems in the text.
Article
Traffic flow data are routinely collected for many networks worldwide. These invariably large data sets can be used as part of a traffic management system, for which good traffic flow forecasting models are crucial. The linear multiregression dynamic model (LMDM) has been shown to be promising for forecasting flows, accommodating multivariate flow time series, while being a computationally simple model to use. While statistical flow forecasting models usually base their forecasts on flow data alone, data for other traffic variables are also routinely collected. This paper shows how cubic splines can be used to incorporate extra variables into the LMDM in order to enhance flow forecasts. Cubic splines are also introduced into the LMDM to parsimoniously accommodate the daily cycle exhibited by traffic flows. The proposed methodology allows the LMDM to provide more accurate forecasts when forecasting flows in a real high-dimensional traffic data set. The resulting extended LMDM can deal with some important traffic modelling issues not usually considered in flow forecasting models. Additionally, the model can be implemented in a real-time environment, a crucial requirement for traffic management systems designed to support decisions and actions to alleviate congestion and keep traffic flowing.
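The cubic-spline device used to carry the daily cycle and extra variables into the LMDM can be sketched with a simple truncated-power basis; the knot placement and sampling grid below are illustrative, not those of the paper.

```python
import numpy as np

def cubic_spline_basis(t, knots):
    # Truncated-power cubic basis: 1, t, t^2, t^3, and (t - k)_+^3 for each knot k.
    cols = [np.ones_like(t), t, t ** 2, t ** 3]
    cols += [np.clip(t - k, 0.0, None) ** 3 for k in knots]
    return np.column_stack(cols)

# Hypothetical daily cycle: time of day sampled every 15 minutes, knots every 4 hours.
time_of_day = np.arange(0, 24, 0.25)
basis = cubic_spline_basis(time_of_day, knots=np.arange(4, 24, 4))

# These basis columns would enter the regression vector of each node in the LMDM,
# so the daily flow profile is captured with a small number of smooth coefficients.
print(basis.shape)   # (96, 9): 96 time points, 4 polynomial + 5 knot columns
```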
Article
Considerable work has been done in past years to answer the question of why money affects output. Some of it has come from clarifying old and fuzzy ideas: coordination problems, and the respective roles of expectations and of nominal and real rigidities. A lot of it has come from running into dead ends, such as the failure to explain the joint price and output responses to money in "as if" competitive models, or the failure of individual nominal rigidities to generate aggregate price inertia under simple Ss rules. Research on real rigidities is the most urgent. A general feature of goods markets is that fluctuations in demand lead mostly to movements in output rather than in markups, and a general feature of labor markets is that fluctuations in the demand for labor lead mostly to movements in employment rather than in real wages. Given these features, a very small amount of nominal rigidity will lead to long-lasting effects of nominal money on output.
Article
This paper considers a mean shift with an unknown shift point in a linear process and estimates the unknown shift point (change point) by the method of least squares. Pre-shift and post-shift means are estimated concurrently with the change point. The consistency and the rate of convergence for the estimated change point are established. The asymptotic distribution for the change point estimator is obtained when the magnitude of shift is small. It is shown that serial correlation affects the variance of the change point estimator via the sum of the coefficients of the linear process. When the underlying process is autoregressive moving average, a mean shift causes overestimation of its order. A simple procedure is suggested to mitigate the bias in order estimation.