Content uploaded by Robert Bartels

Author content

All content in this area was uploaded by Robert Bartels on May 15, 2015

Content may be subject to copyright.

A preview of the PDF is not available

Journal of the American Statistical Association

Although rank tests for randomness were proposed in the literature as early as 1943, no such test has gained wide acceptance comparable to, say, Spearman's rho test. This may be due to the lack of small-sample theory and of tables of critical values to enable such a test to be carried out on small samples. In this article, we consider the rank version of the von Neumann ratio statistic and we obtain the critical values of this statistic under the randomization hypothesis. In a Monte Carlo experiment we then show that the resulting nonparametric test for randomness has far greater power than the test based on the number of runs up and down. Moreover, under normality, its power vis-a-vis the normal theory von Neumann ratio test is also very good. It is therefore suggested that with the tables presented in this article, the rank von Neumann ratio test for randomness provides an easy and powerful alternative to nonparametric tests now in common use.
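The statistic described in the abstract can be sketched in a few lines. The following is a minimal illustration, assuming the usual definition of the rank von Neumann ratio (ranks substituted for the observations in von Neumann's ratio) and using a large-sample normal approximation in place of the article's exact small-sample critical-value tables:

```python
import math

def rank_von_neumann(x):
    """Rank von Neumann ratio test for randomness.

    Replaces the observations in von Neumann's ratio with their ranks:
        RVN = sum_i (R_i - R_{i+1})^2 / sum_i (R_i - R_bar)^2.
    Under randomness RVN is close to 2; small values suggest positive
    serial correlation, large values negative serial correlation.
    Returns (RVN, two-sided p-value) from the rough large-sample
    approximation RVN ~ N(2, 4/n); for small n the exact tabulated
    critical values should be used instead.
    """
    n = len(x)
    order = sorted(range(n), key=lambda i: x[i])
    ranks = [0.0] * n
    for rank, i in enumerate(order, start=1):   # ranks 1..n (ties broken arbitrarily)
        ranks[i] = float(rank)
    mean_rank = (n + 1) / 2.0
    num = sum((ranks[i] - ranks[i + 1]) ** 2 for i in range(n - 1))
    den = sum((r - mean_rank) ** 2 for r in ranks)
    rvn = num / den
    z = (rvn - 2.0) / math.sqrt(4.0 / n)
    p = math.erfc(abs(z) / math.sqrt(2))        # two-sided normal p-value
    return rvn, p
```

For a strongly trending series the successive rank differences are all small relative to the rank variance, so RVN falls far below 2 and the test rejects; an alternating series pushes RVN above 2.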


... This approach makes minimal assumptions regarding the underlying N-gram frequency distributions. Second, Bartels' test for randomness [39] was applied. This test is a ranked version of von Neumann's ratio test for randomness [40]. ...

... First, the Ljung-Box test is used to investigate return autocorrelation (Ljung & Box, 1978). Second, the independence of the stock returns is tested using the runs test (Wald & Wolfowitz, 1940) and the Bartels test (Bartels, 1982). Finally, the variance ratio test designed by Lo and MacKinlay (1988) is used to determine if the standard deviation of returns scales with T. Kim's (2009) wild-bootstrapped automatic variance ratio (AVR) test is employed to perform the variance-ratio test. ...

The Borsa Istanbul has experienced a significant increase in investor participation in the past few years, and a growing number of companies are opting to raise capital through IPOs (Initial Public Offerings). In the context of this transformation, the goal of this research is to investigate the connection between the market efficiency and liquidity of 397 stocks traded on Borsa Istanbul, using daily data over the period from 1 January 2022 to 18 August 2023, including the new stocks listed in recent years. The stocks are ranked according to their degree of informational efficiency using a sample entropy (SampEn) approach. The analysis shows that the stocks exhibit differing levels of informational complexity and illiquidity, and many stocks display evidence of autocorrelation and non-independence. Further, entropy and liquidity are found to have a significant cross-sectional relationship, suggesting that liquidity has an important impact on both inefficiency and predictability.
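A sample entropy measure of the kind used above can be sketched as follows. This is a generic SampEn implementation under the standard Richman-Moorman definition, with illustrative defaults m = 2 and tolerance r = 0.2 times the standard deviation; the study's actual parameter choices are not stated here:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series.

    Counts pairs of length-m templates whose Chebyshev distance is
    within r * std, then pairs of length-(m+1) templates, and returns
    -log(A / B). Lower values indicate a more regular (more
    predictable, less informationally efficient) series.
    """
    x = np.asarray(x, float)
    n = len(x)
    tol = r * x.std()

    def count(mm):
        # use the same n - m starting points for both template lengths
        templates = np.array([x[i:i + mm] for i in range(n - m)])
        c = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            c += int(np.sum(d <= tol))
        return c

    b = count(m)       # matches of length m
    a = count(m + 1)   # matches of length m + 1
    return -np.log(a / b)
```

A perfectly periodic series scores near zero, while white noise scores high, which is the property exploited when ranking stocks by informational efficiency.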

... Estimating this model using the method of maximum likelihood requires the data to be independent, as the likelihood function is defined as the product of probability density functions. We tested for independence using several methods: [27]'s test, [28]'s test, the difference sign test, the rank test, [29]'s test, the turning point test, and [30]'s test. The corresponding p-values for these tests were 0.144, 0.159, 0.173, 0.050, 0.112, 0.064, and 0.055, respectively. ...

The Akosombo Dam is the largest dam in Ghana and is linked to the world's largest man-made lake by surface area. The top of flood control pool of the dam has been breached a number of times, so it is of interest to know the corresponding probability. The paper fits the generalized extreme value distribution to the extreme water levels – with all three of its parameters (including the shape parameter) accounting for various linear trends, seasonality and cyclic trends with respect to time, the first time such a model has been fitted. The fitted model contains in total 50 parameters. It provided an adequate fit, as evaluated by probability plots, quantile plots, and the Kolmogorov-Smirnov test. It is used to provide return level estimates as well as probabilities of the top of flood control pool of the dam being breached.
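The return-level and breach-probability computations in the abstract can be illustrated for a stationary GEV (the paper's 50-parameter model additionally lets all three parameters vary with linear trends, seasonality and cycles, which this sketch omits). The parameter values in the test below are hypothetical, not the Akosombo estimates:

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-year return level of a GEV(mu, sigma, xi): the level exceeded
    with probability 1/T in any one year, from the GEV quantile
    function. As xi -> 0 this reduces to the Gumbel formula."""
    p = 1.0 - 1.0 / T
    if abs(xi) < 1e-9:
        return mu - sigma * math.log(-math.log(p))
    return mu + (sigma / xi) * ((-math.log(p)) ** (-xi) - 1.0)

def gev_exceedance_prob(mu, sigma, xi, level):
    """P(annual maximum > level) under the same GEV."""
    if abs(xi) < 1e-9:
        return 1.0 - math.exp(-math.exp(-(level - mu) / sigma))
    t = 1.0 + xi * (level - mu) / sigma
    if t <= 0:
        # beyond the distribution's support endpoint
        return 0.0 if xi < 0 else 1.0
    return 1.0 - math.exp(-t ** (-1.0 / xi))
```

By construction, plugging a T-year return level back into the exceedance probability returns exactly 1/T, which is the consistency check used in the test.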

... The distance between the replicate plots (≥ 15 m) was tested for spatial independence using the first data set of soil gross and net N 2 O fluxes. Based on von Neumann's ratio test for randomness (Bartels 1982), these fluxes were not auto-correlated. Thus, our replicate plots were considered biological replicates in our statistical analysis. ...

In addition to removing excess mineral nitrogen (N) via root uptake, trees in agroforestry systems may mitigate negative effects of high N fertilization of adjacent crops by enhancing complete denitrification of excess mineral N. Presently, little is known about the potential for NO3⁻ reduction through denitrification (conversion to greenhouse gas N2O and subsequently to non-reactive N2) in contrasting agroforestry systems: riparian tree buffer versus tree row of an upland alley-cropping system. Our study aimed to (1) quantify gross N2O emissions (both N2O + N2 emissions) and gross N2O uptake (N2O reduction to N2), and (2) determine their controlling factors. We employed the ¹⁵N2O pool dilution technique to quantify gross N2O fluxes from 0 to 5 cm (topsoil) and 40 to 60 cm (subsoil) depths with seasonal field measurements in 2019. The riparian tree buffer exhibited higher topsoil gross N2O emissions and uptake than the alley-cropping tree row (P < 0.03). Gross N2O emissions were regulated by N and carbon (C) availabilities and aeration status rather than denitrification gene abundance. Gross N2O uptake was directly linked to available C and nirK gene abundance. In the subsoil, gross N2O emission and uptake were low in both agroforestry systems, resulting from low mineral N contents possibly due to N uptake by deep tree roots. Nonetheless, the larger available C and soil moisture in the subsoil of the riparian tree buffer than in the alley-cropping tree row (P < 0.05) suggest its large potential for N2O uptake whenever NO3⁻ is transported to the subsoil.

... First, we evaluate the autocorrelation of returns using the Ljung and Box (1978) test, which assesses whether there are significant correlations between observations at different time points. We also conduct the Runs test (also called the Wald-Wolfowitz runs test) and the Bartels test (Bartels 1982) to examine the randomness and the absence of seasonality patterns in cryptocurrency returns. Moreover, the BDS test (Broock et al. 1996) is employed to detect non-linear dependencies in our data. ...

The rise of cryptocurrencies as alternative financial investments, with potential safe-haven and hedging properties, highlights the need to examine their market efficiency. This study is the first to investigate the combined impact of liquidity and volatility features of cryptocurrencies on their price delays. Using a wide spectrum of cryptocurrencies, we investigate whether the COVID-19 outbreak has affected market efficiency by studying price delays to market information. We find that as liquidity increases and volatility decreases, cryptocurrencies demonstrate stronger market efficiency. Additionally, we show that price delay differences during the COVID-19 outbreak increase with higher levels of illiquidity, particularly for highly volatile quintiles. We suggest that perceived risks and high transaction costs in illiquid and highly volatile cryptocurrencies reduce active traders’ willingness to engage in arbitrage trading, leading to increased market inefficiencies. Our findings are relevant to investors, aiding in improving their decision-making processes and enhancing their investment efficiency. Our paper also presents significant implications for policymakers, emphasizing the need for reforms aimed at enhancing the speed at which information is incorporated into cryptocurrency returns. These reforms would help mitigate market distortions and increase the sustainability of cryptocurrency markets.

Within the adaptive market hypothesis (AMH) framework, this study explores the dynamic impact of cryptocurrency heists on Bitcoin's market efficiency. By analysing Bitcoin's one‐minute price data, we calculate permutation entropy to assess market disorder and employ the complexity‐entropy causality plane to quantify structural changes in the market. The analysis focuses on the market efficiency changes the day before, the day of, and the day after a heist, revealing that heists significantly disrupt market efficiency. Specifically, on the day of and following a heist, we observe a marked decrease in permutation entropy alongside a significant increase in complexity, indicating a notable decline in market efficiency. Further analysis shows that when a heist targets a specific token, this token draws investor attention, causing a less severe drop in Bitcoin's market efficiency, while the affected token's market efficiency drops more dramatically. These findings suggest that different token markets react differently to heists, and investors should consider adjusting their strategies to respond to these changes. For policymakers, the results highlight the critical need to enhance market stability and security through informed policy measures to mitigate the impact of such disruptive events.
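The permutation entropy used in the study above can be sketched as a generic Bandt-Pompe implementation. The embedding order is an illustrative default here; the study's exact choice of order and window is not given in the abstract:

```python
import math

import numpy as np

def permutation_entropy(x, order=3, normalize=True):
    """Permutation entropy (Bandt-Pompe) of a 1-D series.

    Each length-`order` window is mapped to its ordinal pattern (the
    permutation that sorts it); the Shannon entropy of the pattern
    distribution measures market disorder. Normalized values near 1
    indicate a random (efficient) series, values near 0 a highly
    predictable one.
    """
    x = np.asarray(x, float)
    counts = {}
    for i in range(len(x) - order + 1):
        pattern = tuple(np.argsort(x[i:i + order]))
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), float)
    p /= p.sum()
    h = -np.sum(p * np.log(p))
    if normalize:
        h /= math.log(math.factorial(order))
    return h
```

A decrease in this quantity over one-minute returns around a heist day is exactly the kind of efficiency drop the study reports.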

This thesis attempts to develop and verify novel distribution-free multivariate statistical process control (MSPC) chart approaches that can handle the nonnormal and nonlinear correlation structure of the MQC during monitoring. The aim is first to simultaneously monitor and efficiently detect 'location' and 'dispersion' parameter shifts of individual observations in a manufacturing process.

Since the launch of Bitcoin, the cryptocurrency market has expanded continuously, gaining more and more influence over the global economy with each passing year. The events of 2020, however, marked a new phase for the cryptocurrency ecosystem, which has experienced a significant increase in size and complexity. The third halving cycle, which led to an increase in cryptocurrency prices, the beginning of the pandemic, and the subsequent inflation and economic uncertainty made Bitcoin an attractive asset for both retail and institutional investors. Although the liquidity of cryptocurrency assets has increased, their volatile nature persists, causing mixed views on their status. While crypto-enthusiasts perceive them as a worthwhile investment with novel economic properties, more skeptical participants consider them only a speculative asset with a transitory presence. The absence of consensus on this topic has attracted the interest of the academic community, which aims to analyze whether cryptocurrencies display economic properties. A keystone characteristic for considering a cryptocurrency an economic asset is the absence of price manipulation. In this respect, numerous papers have investigated the efficiency of the cryptocurrency market. Even though the results are mixed, a large body of studies indicates that the efficiency of the crypto-asset market varies, increasing from period to period. However, most papers test informational efficiency only on the spot market. The objective of this study is therefore to analyze whether the cryptocurrency futures market is efficient. To this end, Bitcoin futures prices from 2018 to 2022 are used, and a battery of tests is applied to them to investigate several statistical properties bearing on the efficiency hypothesis. Furthermore, under the efficient market hypothesis, spot and futures prices are supposed to move together; otherwise, the hypothesis is rejected. The property is thus evaluated from a double perspective: by applying statistical tests and by evaluating the relation between the spot and the futures price.

The least squares estimators βj(N), j = 1, ⋯, p, from N data points, of the autoregressive constants for a stationary autoregressive model are considered when the disturbances have a distribution attracted to a stable law of index α. Some comments are made on alternative definitions of the βj(N).

A new algorithm is presented for simulating stable random variables on a digital computer for arbitrary characteristic exponent α (0 < α ≤ 2) and skewness parameter β (−1 ≤ β ≤ 1). The algorithm involves a nonlinear transformation of two independent uniform random variables into one stable random variable. This stable random variable is a continuous function of each of the uniform random variables, and of α and a modified skewness parameter β′ throughout their respective permissible ranges.
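The transformation can be sketched for the symmetric case. This is the β = 0 special case of the Chambers-Mallows-Stuck map (the paper's full algorithm also handles general β via the modified skewness parameter β′); the exponential variate used here is itself a transform of a uniform, matching the "two independent uniforms" description:

```python
import math
import random

def stable_symmetric(alpha, rng=random):
    """One draw from a standard symmetric (beta = 0) stable law via a
    nonlinear transform of a uniform U on (-pi/2, pi/2) and an
    exponential W (a log-transform of a second uniform).
    alpha = 2 yields a normal variate (variance 2), alpha = 1 a Cauchy.
    """
    u = rng.uniform(-math.pi / 2, math.pi / 2)
    w = rng.expovariate(1.0)
    if alpha == 1.0:
        return math.tan(u)                     # Cauchy special case
    s = math.sin(alpha * u) / math.cos(u) ** (1.0 / alpha)
    return s * (math.cos(u - alpha * u) / w) ** ((1.0 - alpha) / alpha)
```

Setting alpha = 2 collapses the formula to 2·sin(U)·√W, a normal variate, which provides a quick sanity check on the implementation.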

This paper considers the estimation of the first-order autoregressive scheme when the underlying distribution is non-normal stable. The results of a simulation experiment of the least-squares estimator and the uncorrected and corrected serial correlation coefficients are presented. It is found that, in general, normal theory results are inapplicable. Nevertheless, the corrected coefficient provides a reliable and very efficient estimator of ρ; however, the least-squares estimator and the uncorrected coefficient are severely biased with skewed populations. Furthermore, all of the estimators seem to be asymptotically non-normal.

In an earlier paper [5] the authors compared the power of the BLUS test with the probability of a correct decision of the Durbin-Watson bounds test. A method to compute the distribution of the Von Neumann ratio under the null hypothesis and under the alternative hypothesis was given. In the present paper the latter method is used to tabulate the BLUS-test statistic and to compute the exact significance points of the Durbin-Watson test for several examples. Powers of both tests are computed and compared. It appears that, for the cases considered, the power of the exact Durbin-Watson test exceeds that of the BLUS procedure, while the latter is greater than the probability of a correct decision in the Durbin-Watson bounds test.

Existing evidence indicates that the time-series behaviour of corporate annual earnings is well approximated by a random-walk, or some similar process. There is, however, little Australian evidence on this issue.
This note presents the results of an investigation into the time-series behaviour of the annual earnings of a sample of relatively large Australian corporations. The conclusion, that successive changes in such earnings are well approximated by a random-walk, is consistent with the existing evidence.

A new probability distribution is presented that offers considerable potential for providing stochastic inputs to Monte Carlo simulation studies. The distribution includes the exponential power family as a special case. An efficient computational strategy is proposed for random variate generation. An example for testing the hypothesis of unit variance illustrates the advantages of the proposed distribution.
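Variate generation for the exponential power special case mentioned above can be sketched with the standard gamma-transform method. This is a generic textbook approach, not the paper's proposed strategy, and it uses the standard unit-scale density proportional to exp(−|x|^p):

```python
import math
import random

def exponential_power(p, rng=random):
    """One draw from the standard exponential power (generalized
    normal) law with density proportional to exp(-|x|^p).

    Uses the gamma-transform method: if G ~ Gamma(1/p, 1), then
    S * G**(1/p) with an independent random sign S has this law.
    p = 2 gives a normal variate with variance 1/2; p = 1 a Laplace.
    """
    g = rng.gammavariate(1.0 / p, 1.0)
    return math.copysign(g ** (1.0 / p), rng.random() - 0.5)
```

Such a generator is the kind of stochastic input to Monte Carlo studies that the abstract describes, here restricted to the exponential power special case.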