Article

Econometric Analysis

... Social Sciences (SPSS, version 23) was used to analyse the collected data and to examine the research variables. All statistical and econometric analysis was based on Greene (2018) and Hair et al. (2018). ...
... The research only considered independent variables that showed a significant correlation (p < 0.05) with the dependent variable CSRD. Table 2 presents the correlation tests (Greene, 2018). According to Table 2, the variables that present significant correlations are company size (COMPS), employees, and turnover. ...
... The confirmatory analysis is based on the methods proposed by Greene (2018) and Hair et al. (2018). From Table 4, it is evident that CSRD has a positive correlation with all other explanatory variables. ...
Article
Full-text available
Companies in the water industry present their digital Corporate Social Responsibility (CSR) agenda, as well as their social and environmental commitments to stakeholders, through their websites. The purpose of this research is to assess digital CSR in Portuguese companies in the water industry. Furthermore, the research examines factors that impact the status of online disclosure. The authors empirically analyse the CSR information published on the websites of Portuguese companies operating in the bottled water industry. The data were collected based on the Global Reporting Initiative (GRI 2021a) standards, detailing the level of disclosure in this industry and highlighting areas of underreporting. The results point to good practices and weak points in companies' digital CSR reporting, and identify company size, number of employees and turnover as factors that influence the level of disclosure.
... Statistical methods used are standard (Greene, 2012) and generally as used in Leggett and Ball (2015). Categories of methods used are: normalisation; differentiation (approximated by differencing); integration (approximated by the cumulative sum); and time-series analysis. ...
... This serial nature of the measurements must be addressed by careful examination of the lag structure of the model. This type of ordinary least squares regression is termed 'time series analysis' (Greene, 2012). ...
... A further issue in time series analysis concerns what is termed the 'order of integration' of each of the series used. Greene (2012) states: 'The series yt is said to be integrated of order one, denoted I(1), because taking a first difference produces a stationary process. A non-stationary series is integrated of order d, denoted I(d), if it becomes stationary after being first-differenced d times. ...
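The order-of-integration idea quoted above can be illustrated numerically; a minimal sketch with synthetic data, assuming a pure random walk as the I(1) series:

```python
import numpy as np

rng = np.random.default_rng(0)

# A pure random walk is integrated of order one, I(1):
# y_t = y_{t-1} + e_t, so the first difference y_t - y_{t-1} = e_t
# recovers the stationary white-noise innovations.
e = rng.normal(size=1000)   # stationary innovations
y = np.cumsum(e)            # I(1) series (integration ~ cumulative sum)
dy = np.diff(y)             # first difference (differentiation ~ differencing)
# dy is exactly the stationary innovation series again, so y is I(1):
# differencing once produces a stationary process.
```

Differencing and cumulative summation are exact inverses here (up to the initial value), which is why the excerpt approximates differentiation by differencing and integration by the cumulative sum.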
Article
Full-text available
Here we provide statistically significant observational evidence that a feedback control system moderating atmospheric temperature is presently operating coherently at global scale. Further, this control system is of a sophisticated type, involving the corrective feedback not only of a linear error term but also its derivative and its integral. This makes it of the same type as the most widely used control system developed by humans, the proportional-integral-derivative (PID) control system.
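The PID structure the abstract refers to can be sketched in a few lines; this toy loop (the plant dynamics and gains are illustrative assumptions, not taken from the paper) shows the proportional, integral, and derivative corrections acting together:

```python
# Minimal discrete PID loop: the controller feeds back the error (P),
# its integral (I), and its derivative (D). Plant model and gains are
# hypothetical, chosen only to make the loop converge.
dt = 0.1
kp, ki, kd = 1.0, 0.1, 0.5          # hypothetical PID gains
setpoint, temp = 1.0, 0.0           # target and initial "temperature"
integral, prev_err = 0.0, setpoint - temp

for _ in range(500):
    err = setpoint - temp
    integral += err * dt
    deriv = (err - prev_err) / dt
    u = kp * err + ki * integral + kd * deriv   # PID control signal
    temp += dt * u                              # toy integrator plant
    prev_err = err
```

After a few hundred iterations the temperature settles at the setpoint; the derivative term damps overshoot while the integral term removes residual steady-state error.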
... A suitable model for F(x, β) is assumed, usually a normal or logistic distribution. The question of which distribution to use has been widely analyzed, with the conclusion that the two distributions tend to give similar probabilities, except in the tails [21,22]. After analyzing both distributions in the present study, we chose the normal distribution, because a small improvement in the Akaike Information Criterion (AIC) and the McFadden pseudo-R² was detected in the model outcomes, giving rise to the probit model, as follows: ...
... For this reason, Poisson regression was applied for the analysis of the lane departures. The Poisson regression specifies that each observation y_i is drawn from a Poisson distribution with parameter λ_i, related to a vector of explanatory variables X_i [22]. The Poisson probability of the outcome Y = y_i can be expressed by: ...
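The Poisson specification described above can be sketched as follows; the coefficient and covariate values are hypothetical:

```python
import math

# Poisson regression sketch: each observation y_i is drawn from a Poisson
# distribution whose rate lambda_i depends on covariates x_i through a
# log link, lambda_i = exp(x_i' beta).

def poisson_pmf(y, lam):
    """P(Y = y) = exp(-lambda) * lambda**y / y! for a Poisson outcome."""
    return math.exp(-lam) * lam ** y / math.factorial(y)

def rate(x, beta):
    """Log link: lambda_i = exp(x_i' beta)."""
    return math.exp(sum(xj * bj for xj, bj in zip(x, beta)))

beta = [0.2, 0.5]               # hypothetical coefficients
lam = rate([1.0, 0.6], beta)    # rate for one observation (incl. intercept)
p2 = poisson_pmf(2, lam)        # probability of observing y_i = 2
```

The log link guarantees a positive rate for any coefficient values, which is why it is the standard choice for count outcomes such as lane departures.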
Article
Full-text available
Drowsiness and fatigue are major safety issues that cannot be measured directly. Their measurements are sustained on indirect parameters such as the effects on driving performance, changes in physiological states, and subjective measures. We divided this study into two distinct lines. First, we wanted to find if any driver’s physiological characteristic, habit, or recent event could interfere with the results. Second, we aimed to analyze the effects of subjective sleepiness on driving behavior. On driving simulator experiments, the driver information and driving performance were collected, and responses to the Karolinska Sleepiness Scale (KSS) were compared with these parameters. The results showed that drowsiness increases when the driver has suffered a recent stress situation, has taken medication, or has slept fewer hours. An increasing driving time is also a strong factor in drowsiness development. On the other hand, robustness, smoking habits, being older, and being a man were revealed to be factors that make the participant less prone to getting drowsy. From another point of view, the speed and lane departures increased with the sleepiness feeling. Subjective drowsiness has a great correlation to drivers’ personal aspects and the driving behavior. In addition, the KSS shows a great potential to be used as a predictor of drowsiness.
... First, we ran a time-series Generalized Estimating Equation (GEE) as an alternate method. This method is appropriate to account for the hierarchical structure of the panel data, as well as the multilevel nature of the observations (Greene, 2003; Robinson, 2008). Further, this method assumes and identifies the group structure for the random effects between levels (Bell & Jones, 2015; Hoffman, Griffin, & Gavin, 2000). ...
... Second, we ran a time-series linear regression (OLS). This method is appropriate to account for the time-series nature of the observations, as well as the continuous nature of the variables (Greene, 2003). ...
Article
This study introduces the novel concept of CSR reputation signaling and analyzes its impact on cross-country investments. We combine Diffuse Reciprocity from international relations literature and Institutional Theory to argue that having a home country with high CSR reputation signaling is an advantage that firms can exploit, as it makes these firms more attractive to potential host markets as policymakers contemplate inward foreign direct investment strategies. Moreover, we propose that there are increased cross-country investments between countries with dissimilar levels of CSR reputation signaling, and that firms can benefit from this normative institutional gap; additionally, this relationship is positively moderated by the economic distance between the countries. Analyses of 25,672 country-pair observations across 5,474 country pairs from 153 countries from 2004 to 2011 provide robust support. Overall, our analyses suggest that high country-level CSR reputation signaling imprints on firms as they invest abroad, helping to promote CSR on a global basis, thus supporting policymakers as they pursue sustainable development and the 2030 Agenda.
... In this paper we demonstrate, via simulation studies and experiments with real data in which different types of regression models are considered, the computational advantages of the score test when employed for univariate filtering. The score test, also known as Rao's test (Rao, 1948) or the Lagrange Multiplier test (Greene, 2003), is robust in the sense that it does not depend on the functional relationship between the response and the predictor variable(s), and it depends on the null distribution of the response y only through the MLE of the distribution under H_0 (Chen, 1983). It is asymptotically equivalent to the log-likelihood ratio test (Greene, 2003), and for logistic and Poisson regression its formula is similar to the Pearson correlation coefficient (Hosmer Jr et al., 2013). ...
... Both the score test and the Pearson correlation coefficient are applicable to numerous regression models, such as logistic, Poisson, negative binomial, Beta, Gamma, and Weibull. Further, we illustrate the performance of Welch's t-test when the response variable is binary. ...
Preprint
The vast availability of large-scale, massive and big data has increased the computational cost of data analysis. One such case is the computational cost of univariate filtering, which typically involves fitting many univariate regression models and is essential for numerous variable selection algorithms that reduce the number of predictor variables. The paper demonstrates how to dramatically reduce that computational cost by employing the score test or the simple Pearson correlation (or the t-test for binary responses). Extensive Monte Carlo simulation studies demonstrate their advantages and disadvantages compared to the likelihood ratio test, and examples with real data illustrate the performance of the score test and the log-likelihood ratio test under realistic scenarios. Depending on the regression model used, the score test is 30 to 60,000 times faster than the log-likelihood ratio test and produces nearly the same results. Hence, this paper strongly recommends substituting the log-likelihood ratio test with the score test when coping with large-scale, massive or big data, or even with data whose sample size is in the order of a few tens of thousands or higher.
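The claimed closeness of the score test to the Pearson correlation can be made concrete for logistic regression: against the intercept-only null, the score statistic reduces exactly to n·r², so no iterative model fitting is needed. A minimal sketch with toy data:

```python
import numpy as np

# Score (Lagrange Multiplier) test for a univariate logistic regression
# under H0: beta = 0, computed from first principles, versus the Pearson
# correlation shortcut n * r^2. Toy data, chosen only for illustration.
x = np.array([0.3, 1.2, 2.1, 2.9, 4.0, 5.2, 6.1, 7.3])
y = np.array([0, 0, 0, 1, 0, 1, 1, 1])
n = len(y)

xc, yc = x - x.mean(), y - y.mean()
score_u = (xc * yc).sum()                           # score function at H0
info = y.mean() * (1 - y.mean()) * (xc ** 2).sum()  # Fisher information at H0
score_stat = score_u ** 2 / info

# Pearson-correlation shortcut: for binary y, sum((y - ybar)^2) equals
# n * ybar * (1 - ybar), so the two expressions coincide exactly.
r = np.corrcoef(x, y)[0, 1]
shortcut = n * r ** 2
```

Because the shortcut needs only a correlation, filtering thousands of candidate predictors costs one pass over the data instead of thousands of maximum-likelihood fits.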
... This supports our first hypothesis that cost stickiness increases with the level of a differentiation strategy. We further regress Models (2) and (3) using fixed effects regression in Columns IV and V (a fixed effects regression model is a statistical model in which the model parameters are fixed, non-random quantities [57]; random effects and mixed regression models, in which all or some of the parameters are treated as random variables, contrast with the fixed effects model [57]). ...
... The results are similar to the GLS estimates, but the coefficients are slightly smaller. ...
Article
Full-text available
This paper investigates the relationship between business strategy and cost stickiness under different ownership. Using data from listed firms in China from 2002 to 2015, we find, first, that firms with different strategies exhibit different cost behavior: the cost stickiness of a differentiation strategy is higher than that of a low-cost strategy. Second, management expectations affect cost stickiness. Optimistic expectations increase cost stickiness, while pessimistic expectations reduce it. Third, management expectations can adjust the relationship between business strategy and cost stickiness in terms of government-created advantages (GCAs). If management expectations tend to be optimistic, cost stickiness is higher with a differentiation strategy than with a low-cost strategy. If management expectations tend to be pessimistic, then cost stickiness is higher with a low-cost strategy than with a differentiation strategy. Finally, state-owned equity affects the extent of the effect of a differentiation strategy on cost stickiness. State-owned firms, which receive more GCAs than non-state-owned firms, have stronger cost stickiness than non-state-owned firms, even when both categories of firms pursue a differentiation strategy.
... In general, the coefficients of a probit regression cannot be interpreted directly from the initial output, hence the need to interpret the marginal effects of the regressors (Greene, 2000). That is, how much the (conditional) probability of the outcome variable changes when the value of a variable changes, holding all other variables constant at some values. ...
... This is different from the linear regression case, where the coefficients can be interpreted directly (Gujarati, 2004). This is because, in linear regression, the regression coefficients are themselves the marginal effects, whereas in probit regression an additional computational step is required to obtain the marginal effects (Greene, 2000). This notion of marginal effects is shown in Table 8. ...
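The extra computational step for probit marginal effects mentioned above can be sketched as follows (coefficients and covariate values are hypothetical):

```python
import numpy as np
from scipy.stats import norm

# In a probit model P(y=1|x) = Phi(x'beta), the coefficient beta_k is not
# directly interpretable; the marginal effect of x_k is the derivative
# phi(x'beta) * beta_k, evaluated here at a chosen covariate vector.
beta = np.array([-0.5, 0.8, 0.3])       # hypothetical probit coefficients
x = np.array([1.0, 1.2, 0.4])           # intercept plus two regressors

xb = x @ beta
marginal_effects = norm.pdf(xb) * beta  # dP/dx_k = phi(x'beta) * beta_k

# Sanity check: a numerical derivative in the second regressor should
# match the analytic marginal effect.
h = 1e-6
x_hi = x.copy(); x_hi[1] += h
numeric = (norm.cdf(x_hi @ beta) - norm.cdf(xb)) / h
```

Unlike OLS, the effect depends on where it is evaluated, which is why software reports marginal effects at the means or averaged over observations.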
Article
Full-text available
The study investigated the factors influencing land reform beneficiaries' willingness to pay for extension services. Furthermore, the study determined the extension services for which farmers were willing to pay, and the cost. The study was conducted in seven districts in the Eastern Cape and KwaZulu-Natal provinces. Research activities included a formal survey conducted on a sample of 111 farmers using simple random sampling. Data were collected using a structured questionnaire through interviews and using a semi-structured interview guide for focus group discussions. The study employed Chi-square and T-test analyses to determine the relationship between the socio-economic characteristics of the farmers and their willingness to pay for extension services. The main findings were that 64% of land reform beneficiaries were in favour of privatisation of extension services. Furthermore, 98% of these farmers said they were willing to pay for extension services and indicated the price and type of services preferred. From the results of the probit regression analysis, it was seen that farmers who were likely to pay are those who are younger, with larger land sizes, and who have access to extension services. The study concluded that farmers were in favour of privatisation and were willing to pay for extension services, as they felt this would improve their farm returns.
... The multinomial logit model is widely used to examine the determinants of livelihood strategies (Jan et al., 2012; Nasir, 2005; Amelie and Zimmermann, 2004). The probit model has also been used in such studies, but it is less popular because of computational issues (Greene, 2003). ...
... The maximum likelihood method is used to estimate the multinomial logit model, and the above probabilities enter the likelihood function (Greene, 2003). ...
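The choice probabilities that enter the multinomial logit likelihood can be sketched like this (coefficient vectors are illustrative, with the base category normalised to zero):

```python
import numpy as np

# Multinomial logit sketch: with base category 0 (beta_0 = 0), the choice
# probabilities are P(y=j|x) = exp(x'beta_j) / sum_k exp(x'beta_k).
# These probabilities are what enter the likelihood under ML estimation.

def mnl_probs(x, betas):
    """Choice probabilities for one observation; betas[0] is the zero
    vector for the (normalised) base category."""
    scores = np.array([x @ b for b in betas])
    scores -= scores.max()          # subtract max for numerical stability
    exps = np.exp(scores)
    return exps / exps.sum()

betas = [np.zeros(3),                 # base livelihood strategy
         np.array([0.4, -0.2, 0.7]),  # strategy 1 (hypothetical coefficients)
         np.array([-0.1, 0.5, 0.3])]  # strategy 2 (hypothetical coefficients)
x = np.array([1.0, 0.8, 1.5])         # intercept plus two asset indices
p = mnl_probs(x, betas)
```

Each observation contributes log p[j] for its chosen alternative j to the log-likelihood, which is then maximised over the non-base coefficient vectors.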
Article
Full-text available
This study was intended to identify the endowment of livelihood assets and to assess their role in the adoption of livelihood strategies in district Bhimber, AJ&K. A basic survey was carried out to collect primary data from 310 households selected through convenience sampling. The endowment of livelihood assets was calculated by constructing asset indices, with asset weights obtained using Principal Component Analysis (PCA). The asset indices revealed that the district performed best on the physical capital (index value 0.63) and human capital (index value 0.59) dimensions. It was found that foreign remittance is the most popular livelihood strategy (the main source of income for 29% of households) in the district. Apart from this, livelihood strategies are fairly diverse across the tehsils (sub-divisions) of the district. Results of the multinomial logit model demonstrate that human capital has the strongest positive role in enabling households to enter more rewarding livelihood strategies. The study recommends taking appropriate steps to reduce pressure on natural resources, making further investments in human capital to enable people to enter more rewarding livelihood strategies, and educating households to use remittance money for job-creating activities that enhance employment opportunities and augment household income.
... The Breusch-Pagan Lagrange multiplier test showed that effectively unit effects are present in our data therefore either the fixed or random effects model should be used (Greene, 2003). Hausman tests in two of three regressions do not reject the null hypothesis that the unit effects are orthogonal with the regressors, implying that the random effects model would be appropriate. ...
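The Hausman comparison described above can be sketched numerically; all estimates and covariance values below are made-up illustrative numbers, not results from the paper:

```python
import numpy as np
from scipy.stats import chi2

# Hausman test sketch: compare the fixed-effects estimate (consistent
# under both hypotheses) with the random-effects estimate (efficient
# under H0). Under H0 the statistic is chi-squared with k degrees of
# freedom, where k is the number of coefficients compared.
b_fe = np.array([1.2])        # hypothetical fixed-effects estimate
b_re = np.array([1.0])        # hypothetical random-effects estimate
v_fe = np.array([[0.04]])     # hypothetical FE covariance
v_re = np.array([[0.02]])     # hypothetical RE covariance

d = b_fe - b_re
h_stat = float(d @ np.linalg.inv(v_fe - v_re) @ d)
p_value = chi2.sf(h_stat, df=len(d))   # small p-value -> prefer fixed effects
```

Failing to reject the null, as in two of the three regressions above, means the unit effects look orthogonal to the regressors, so the more efficient random-effects model is appropriate.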
Article
This paper applies a multiobjective goal programming (GP) model to define the profile of the most profitable insurers by focusing on 14 firm-decision variables and considering different scenarios resulting from the exogenous change in interest rate and GDP per capita growth variables. We consider a detailed database of Spanish non-life insurers over the period 2003–2012, taking into account two dimensions of insurers' results: underwriting results and investment results. A prior econometric analysis is used to find out relevant relations among the variables. Next, a GP model is formulated on the basis of the relationships obtained. The model is tested in a robust environment, allowing changes in the coefficients of the objective functions, and for several scenarios regarding crisis/noncrisis situations and changes in interest rates. We find that having the stock organizational form, being an unaffiliated single company and maintaining low levels of investment risk, leverage, and regulatory solvency are recommended for result optimization. Growth and reinsurance utilization are not advisable for optimizing the results, whereas size should be positively emphasized even more in instability periods and when interest rates increase. The results also show that the optimal level of the diversification/specialization strategy depends on economic conditions. More specialization is advisable as negative changes in interest rates increase. However, we find that the optimal values of the diversification variable are higher for the crisis scenarios compared to the corresponding noncrisis scenarios, suggesting that diversification creates value in crisis. Further sensitivity analyses show the soundness of the conclusions obtained.
... If the null hypothesis of the F-test is rejected, then OLS would be inappropriate and unobserved heterogeneity would occur, thereby suggesting significant team-level effects. Third, we performed the Hausman test to detect endogenous regressors and determine whether the test rejects the random effect assumption that no correlation exists between nuisance parameters and regressors (Greene, 2002). RE is preferred under the null hypothesis of the Hausman test, which maintains that team-level effects are uncorrelated with predictors and that the variation across teams is random. ...
Article
Full-text available
This study tests the uncertainty of outcome hypothesis for single games and playoff appearances in the Korean Professional Baseball League from 2007 to 2015. Our panel data analysis shows that the difference in winning percentages between two teams and playoff uncertainty based on games behind are important factors for increasing game attendance. This study supports the potential importance of analyzing daily game attendance of the literature on diverse sport leagues. It also presents implications for policymakers and league owners-which typically leverage teams as promotional instruments-on improving the self-sustainability of sport teams.
... Various criticisms of the empirical approaches include the following. First, the use of time-averaged data, resulting in loss of information and bias (Greene, 2000). Second, the reliance on GDP growth rates, i.e. first differences, resulting in misleading inferences regarding long-run relationships (Ericsson et al., 2001). ...
Preprint
The causal relationship between FDI inflows and growth is of great policy interest, yet the state of concrete knowledge on the issue is rather poor. Our contribution is to investigate the causal relationship between the ratio of FDI to GDP (FDIG) and economic growth (GDPG) using a battery of cutting-edge methods and an extensive data set. We employ the heterogeneous-panel tests of the Granger non-causality hypothesis based on the works of Hurlin (2004a), Fisher (1932, 1948) and Hanck (2013). Our panel data set is compiled from 136 developed and developing countries over the 1970-2006 period. According to the Hurlin and Fisher tests, FDIG unambiguously Granger-causes GDPG for at least one country. However, the results from these tests are ambiguous regarding whether GDPG Granger-causes FDIG for at least one country. Using a test based upon Hanck (2013), both with and without one structural break in the vector autoregression, we are able to determine whether and for which countries there is Granger-causality. This test suggests that at most there are six countries (Estonia, Guyana, Poland, Switzerland, Tajikistan and Yemen) where FDIG Granger-causes GDPG and at most four countries (Dominican Republic, Gabon, Madagascar and Poland) where GDPG Granger-causes FDIG.
... Following Greene (1997), we can write Eq. (6) as a two-sided censored Tobit model: ...
Article
Full-text available
The paper estimates the impacts of climate change, agroecological and socio-economic characteristics on agricultural productivity and efficiency changes in Bangladesh agriculture using a rich panel dataset of 17 regions covering a period of 61-years (1948–2008). Results revealed that land has the most dominant role in increasing agricultural production followed by labour and irrigation. The contribution of non-cereal crops (i.e., potatoes, pulses, oilseeds, jute and cash crops) to total production are also significant, ranging from 2 to 8% per annum. An increase in annual-rainfall and long-term-temperature (LTT) significantly enhance production. Production is significantly higher in floodplain agroecologies. However, production efficiency fluctuated sharply and declined overtime. The mean efficiency score of 0.74 implies substantial room to improve production by resource reallocation. Average farm size, crop specialization and investment in R&D significantly improve efficiency whereas increases in annual temperature-variability and LTT significantly reduce efficiency. Efficiency is significantly lower in low-lying floodplain and coastal-plain agroecologies. Policy implications include investments in diversifying cropping portfolio into other cereals (i.e., wheat and maize), research to develop crop varieties suited to changing climatic conditions and specific agroecological regions, and land/tenurial reforms to consolidate farm size to enhance productivity and efficiency of Bangladesh agriculture.
... We simultaneously estimate the probability of re-arrest P(A_ict) based on the endogenous variable program participation (24/7_i*) and the other covariates. We assume that ε_ict and u_ict are distributed bivariate normal, such that E[ε_ict] = E[u_ict] = 0, var[ε_ict] = var[u_ict] = 1 (Greene, 2011). This county-level instrument allows us to isolate and evaluate individual-level variation across and within 24/7 assignment. ...
... There are numerous examples in the literature of how people's stress levels and mental health can be affected by images of or contact with nature. At the same time, individuals or groups can feel a deep sense of loss and dissatisfaction when their local environment is degraded by unregulated industrial pollution and other uninvited and unwelcome developments. (Although we are aware that many methodological advances have been proposed in some recent contributions (Greene, 2017; Wheat et al., 2019), we use the standard specification proposed by Battese and Coelli (1995), whose use is quite consolidated in the literature.) ...
Article
The paper aims to shed light on the geography of well-being in Italian regions and explain the distance of a given region from an efficiency frontier. We build a composite regional well-being index over the period 2010–2015. Then using the index as output, we estimate a well-being generating function to rank Italian regions in terms of efficiency in attaining well-being. The rankings confirm the divide between Northern and Southern regions as regards overall well-being and efficiency. Our findings indicate that regions more dependent on external financing achieve lower efficiency scores. Current failures should not be used to reinforce selfish localism; they should rather stimulate the search for more effective policies to reduce disparities.
... According to the results in Table 8, the null hypothesis is rejected; therefore the model also presents cross-sectional correlation problems. Finally, the modified Wald test for heteroskedasticity (Greene, 2000; Baum, 2001) was applied; the null hypothesis is rejected, indicating that the model also presents heteroskedasticity problems. ...
Article
Full-text available
The cost of capital is understood as the return that investors expect on their investment in a firm (Besley, Brigham, & Gomez, 2001); this indicator can be affected by a change in accounting standards. The objective of this research is therefore to identify the effect on the cost of capital of the main Colombian companies listed in the COLCAP stock index following the adoption of the International Financial Reporting Standards (IFRS), by analysing different companies between 2009 and 2017 using Feasible Generalized Least Squares (FGLS) and Panel-Corrected Standard Error (PCSE) models, with information calculated under both the local and the international accounting standards. The evidence shows that implementing IFRS does not have a significant effect on the cost of capital, unlike variables such as leverage, operating margin, ROE, and company growth. Likewise, it is concluded that, at a general level, the initial and subsequent accounting standards are indifferent for the calculation of the cost of capital; at a particular level, however, the implementation did affect some specific variables.
... Before discussing various methods of combining forecasts, let us give an empirical justification by constructing a first combination of two forecasts. The selected forecasts relate to discrete time series data, for which forecasts are made monthly, one period ahead [9]. The forecasts published in [10] confirm that the forecasting methods developed in these studies were so successful that one must look for processes for which alternative methods yield better forecasts. ...
... The modified Wald test examines groupwise heteroskedasticity in the fixed-effects model, which assumes homoskedasticity; this assumption is often violated due to the specific error variances of cross-sectional units. The application of the modified Wald test (44) stems from the results of the Jarque-Bera test statistic, which confirms that the data series violates the assumption of normality (see Table S1). Significantly, the modified Wald test statistic accommodates panel settings with an unequal distribution of observations across cross-sectional units, a typical case in this study. ...
Article
Full-text available
Significant progress has been made towards mitigating climate change and its impacts across countries. However, the transboundary effect of CO2 emissions means that excluding the actions and inactions of certain countries and territories that escalate emissions is alarming. On this note, we examined the heterogeneous contribution of immediate and underlying drivers of emissions across 206 countries and territories for the period spanning 1960–2018. We deployed a dynamic panel estimation technique that accounts for cross-sectional dependence, heterogeneous parameters across countries, and dynamic correlated effects—a constraint for socio-economic, consumption- and pollution-based models. A global accounting of economic policy and debt, population structure, density and urbanization, and environmental-related aggregate indicators in a carbon emission function is presented. The empirical results demonstrate that the overarching effect of the instantaneous increase in economic development, population dynamics and energy utilization stimulate global emissions at national, urban and household levels across countries and territories. Industrialization and trade were found to escalate global pollution levels due to the impact of carbonized and energy-intensive economic structure in many developing and developed economies. Urbanization, urban income growth, and urban energy consumption are intertwined, hence, the institution of urban-related policy interventions is likely to negate the trio-impact on environmental sustainability. The triple effect (exploitation of natural resources, production and consumption) of economic development spurs environmental pollution, thus, calls for structural change from a carbonized to a decarbonized economy. 
The complex interaction highlights diversification of the energy mix by the inclusion of clean and renewable energy sources, fossil fuel-switching, and modern technologies like carbon capture and storage to improve energy efficiency and decline emission intensities.
... We created a dummy variable that takes a value of 1 for households with real per capita expenditure of more than 1.25 USD and 0 otherwise, and we estimate the coefficients in Table 3 using a linear probability model. Probit and logit fixed effects models yield biased estimates resulting from the incidental parameter problem (Greene, 2003; 2004). We can obtain consistent slope estimates using conditional fixed effects in the logit model, yielding similar results (qualitatively and statistically) to the corresponding linear probability model (results available from the authors upon request). ...
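The within transformation behind a linear probability model with unit fixed effects can be sketched as follows (toy data; the slope from demeaned OLS equals the dummy-variable fixed-effects slope, sidestepping the incidental parameter problem that biases probit/logit fixed effects):

```python
import numpy as np

# Within (demeaning) estimator for a linear probability model with
# household fixed effects. Households and values below are toy data.
ids = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])           # household ids
x = np.array([1.0, 2.0, 3.0, 2.0, 4.0, 6.0, 1.0, 1.5, 2.0])
y = np.array([0, 0, 1, 0, 1, 1, 0, 0, 1])             # 1 = above 1.25 USD

def demean_within(v, ids):
    """Subtract each household's mean from its own observations."""
    out = v.astype(float).copy()
    for g in np.unique(ids):
        out[ids == g] -= v[ids == g].mean()
    return out

xd, yd = demean_within(x, ids), demean_within(y, ids)
slope = (xd * yd).sum() / (xd ** 2).sum()   # within estimator of beta
```

Demeaning wipes out the household-specific intercepts, so only the slope is estimated; no per-household parameter enters the optimisation, which is exactly why the linear specification avoids the incidental parameter bias.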
Article
Full-text available
We estimate the effect of mobile money adoption on consumption smoothing, poverty, and human capital investments in Tanzania. We exploit the rapid expansion of the mobile money agent network between 2010 and 2012 and use this together with idiosyncratic shocks from variation in rainfall over time and across space in a difference-in-difference framework. We find that adopter households are able to smooth consumption during periods of shocks and maintain their investments in human capital. Results on time use of children and labor force participation complement the findings on the important role of mobile money for the intergenerational transmission of poverty.
... Therefore, two-stage least squares (2SLS) was applied to fulfil the exogeneity assumption of the Classical Linear Regression Model (CLRM), since quantity produced was taken as an explanatory variable. Following Greene (2000), the econometric specification of the multiple linear regression model in matrix notation is: ...
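The two-stage procedure can be sketched with toy data; in the just-identified case (as many instruments as regressors) the two-stage estimate coincides with the direct IV formula:

```python
import numpy as np

# Two-stage least squares sketch: stage one regresses the (possibly
# endogenous) regressors X on instruments Z; stage two regresses y on
# the fitted values. All data below are simulated for illustration.
rng = np.random.default_rng(42)
n = 200
z = rng.normal(size=(n, 2))                                  # instruments
x = z @ np.array([[0.9, 0.1], [0.2, 1.1]]) + rng.normal(size=(n, 2))
y = x @ np.array([1.5, -0.7]) + rng.normal(size=n)

# Stage 1: project X onto the instrument space.
x_hat = z @ np.linalg.lstsq(z, x, rcond=None)[0]
# Stage 2: OLS of y on the fitted regressors.
beta_2sls = np.linalg.lstsq(x_hat, y, rcond=None)[0]

# Direct IV formula (Z'X)^{-1} Z'y, identical when just-identified.
beta_iv = np.linalg.solve(z.T @ x, z.T @ y)
```

The agreement of the two computations is an algebraic identity, which makes it a convenient check when implementing 2SLS by hand.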
Article
Full-text available
Ethiopia has broad genetic diversity among its coffee varieties. Despite the district's high coffee production potential, the market and marketing system of the area is dominated by a conventional marketing system, and producers are forced to sell directly through conventional transaction routes that do not provide a premium price for their coffee, resulting in low market margins. Both primary and secondary data were used for this study. Descriptive statistics (percentage, frequency, mean and standard deviation) and an econometric model, two-stage least squares (2SLS), were used to analyze the data. The 2SLS regression results show that four variables (education level of the household head, membership in a coffee cooperative, transport ownership and quantity of coffee produced) positively and significantly affected the market supply of coffee, while distance to the nearest market affected it negatively and significantly. Therefore, the policy implications drawn from the findings aim at strengthening farmers' coffee cooperatives and enhancing their financial capacity with functional collection centers, improving the accessibility of transport services and developing infrastructure, improving farmers' knowledge through adult education and experience sharing with other coffee-producing farmers, and improving productivity by strengthening supportive institutions (extension service providers).
... By the criterion of interpretability of models, physical models are preferred in comparison with statistical econometric models. The difference compared to econometrics is that, instead of searching for statistical relationships between time series based on selected regressions and estimating their likelihood (Greene 1997), it is suggested to derive relationships based on a meaningful analysis: physical, climatic, and economic. Since global indicators are of different nature, the general problem is divided into a number of specific tasks for establishing links between the indicators, and this forces us to use a mixed set of methods. ...
Article
Full-text available
The work is aimed at developing a conceptual model of the relationship among global indicators such as world population, GDP, primary energy consumption, anthropogenic carbon dioxide emissions, and mean surface temperature anomaly. The world economy is viewed from three perspectives as (1) a manufacturing system that consumes energy and returns a product; (2) a climate-active system that shifts the planetary thermal equilibrium due to greenhouse gas emissions; and (3) a resource-distributed system in which the generalized resource is distributed among consumers of different scale and can be equivalently expressed in both monetary and energy units. It was established that dependencies between the indicators are power-law: temperature anomaly increases proportionally to cumulative energy consumption, GDP grows in proportion to the product of current and cumulative energy consumption raised to a power of less than unity, and energy consumption, in turn, is a power-law function of population with the exponent expressed through the Gini coefficient, a measure of the inequality in income distribution on a global scale. Parameters of these dependencies were determined using a special procedure of fitting to empirical data. It was found that energy consumption, temperature anomaly, and GDP grow over the industrial period in proportion to population raised to powers close to 1.5, 1.8, and 2, respectively.
... I use the probit model to estimate equation (1). Using the ordinary least squares (OLS) technique to estimate a model with a binary dependent variable can produce inefficient estimators (Greene, 2008; Wooldridge, 2009). First, the error term depends on the values of the independent variables. ...
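One well-known drawback of applying OLS to a binary dependent variable, beyond the heteroscedastic error term mentioned in the snippet, is that fitted "probabilities" can fall outside [0, 1]. The toy simulation below (data entirely invented) makes that concrete and motivates a probit or logit link instead.

```python
# Illustration: fitting OLS to a binary outcome yields fitted values that
# escape the unit interval when the regressor has enough spread.
import numpy as np

rng = np.random.default_rng(2)
n = 1000
x = rng.normal(scale=2.0, size=n)
y = (x + rng.normal(size=n) > 0).astype(float)   # binary dependent variable

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
fitted = X @ beta
out_of_range = np.mean((fitted < 0) | (fitted > 1))
print(f"{100 * out_of_range:.1f}% of fitted values lie outside [0, 1]")
```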
Article
Using international data from the Life in Transition Survey, I analyse the role of social trust on pro‐environmental behaviours aimed at helping to fight climate change. Social trust might increase pro‐environmental behaviour by reducing the free‐rider problem, restraining opportunistic behaviour, and enhancing cooperation. The results suggest that social trust increases the probability of individuals taking personal actions aimed at helping to fight climate change; the results are robust to using different sets of control variables, and to controlling for country and region fixed effects. The results also indicate that social trust is positively and significantly associated with environmental actions that are time‐consuming, but there is no significant relationship with environmental actions that impose monetary costs on individuals.
... Different metrics, including accuracy and F1 scores, are used to evaluate the performance of our method. We also compared the performance of this inverse feature learning against several deep representation learning approaches described in [25], such as Linear ELM [26], Deep Belief Networks [27], and Stacked Auto-Encoders [11] for pre-training the weights of the deep network alongside a softmax classifier [28], as well as DrELM [25] and DrELM_r [25]. The results are reported using the Statistics and Machine Learning Toolbox of MATLAB. ...
Preprint
Full-text available
This paper proposes inverse feature learning as a novel supervised feature learning technique that learns a set of high-level features for classification based on an error representation approach. The key contribution of this method is to learn the representation of error as high-level features, while current representation learning methods interpret error through loss functions computed from the differences between the true labels and the predicted ones. One advantage of such a learning method is that the learned features for each class are independent of the learned features for other classes; therefore, the method can learn incrementally, meaning it can learn new classes without retraining. Error representation learning can also help with generalization and reduce the chance of over-fitting by adding a set of impactful features to the original data set that capture the relationships between each instance and different classes through an error generation and analysis process. This method can be particularly effective on data sets where the instances of each class have diverse feature representations, or where the classes are imbalanced. The experimental results show that the proposed method yields significantly better performance than state-of-the-art classification techniques on several popular data sets. We hope this paper can open a new path to utilizing the proposed perspective of error representation learning in different feature learning domains.
... Table 2 in the appendix reports the results of the panel groupwise heteroscedasticity test. This test comprises three panel-data heteroscedasticity identification analyses, namely the Lagrange multiplier, likelihood ratio and Wald tests (Judge et al. 1982; Greene 1993). The results of these three tests indicate a heteroscedasticity problem in the empirical model considered in this paper. ...
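Of the three tests named in the snippet, the Lagrange multiplier variant is the simplest to write down: regress the squared OLS residuals on the regressors and compute n times the R-squared, which is asymptotically chi-squared under homoscedasticity. The sketch below is a generic Breusch-Pagan-style LM test on simulated data (not the paper's panel), with variance deliberately rising in x so the test should reject.

```python
# Breusch-Pagan-style LM test: LM = n * R^2 from regressing squared OLS
# residuals on the regressors. Simulated heteroscedastic data.
import numpy as np

rng = np.random.default_rng(3)
n = 2000
x = rng.uniform(1, 5, size=n)
y = 2 + 3 * x + rng.normal(scale=x, size=n)   # error s.d. grows with x

X = np.column_stack([np.ones(n), x])
resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

e2 = resid ** 2
g = np.linalg.lstsq(X, e2, rcond=None)[0]
r2 = 1 - np.sum((e2 - X @ g) ** 2) / np.sum((e2 - e2.mean()) ** 2)
lm = n * r2                                   # ~ chi2(1) under the null
print(lm, lm > 3.84)                          # 3.84 = 5% critical value, 1 df
```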
Article
Full-text available
Inflows of foreign currencies into developing economies, in particular, have been associated with the Dutch disease phenomenon, whereby a surge in such inflows is believed to stimulate appreciation of the real exchange rate. As a result, there could be de-industrialization impacts on the recipient economies following growth in the non-tradable sector at the expense of a contracting tradable sector. This paper empirically investigates the dynamics of real exchange rate responses to official development assistance, foreign direct investment and international remittances flowing into the four emerging South Asian economies of Bangladesh, India, Pakistan, and Sri Lanka. The results from the extensive econometric analyses show that a 1% rise in the total volume of official development assistance and remittances received appreciates the real exchange rate by 0.18% and 0.23% respectively. In contrast, a 1% rise in FDI inflows was found to trigger a 0.19% depreciation of the real exchange rate. Furthermore, the Dumitrescu and Hurlin (2012) test results reveal unidirectional long-run causalities running from official development assistance and FDI inflows to the real exchange rate, while confirming a bidirectional causal association between inward international remittances and the real exchange rate.
... Y represents the probability that an individual i adopts mechanisation (1 if the farmer adopts and 0 otherwise); X_i are the explanatory variables, including the characteristics of the farmer and farm attributes; β_0 is the intercept; β_1, ..., β_k are the coefficients of the respective variables in the logit function; and u is the error term (Greene, 2003). ...
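A logit model of this form can be estimated from scratch by Newton-Raphson on the log-likelihood. The sketch below is a generic illustration on simulated adoption-style data; the coefficients, sample size and regressors are all invented, not the study's.

```python
# From-scratch logit fit by Newton-Raphson (equivalently, IRLS).
# Simulated binary adopt/not-adopt data with planted coefficients.
import numpy as np

rng = np.random.default_rng(4)
n = 1500
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
true_beta = np.array([-0.5, 1.0, -0.8])
p_true = 1 / (1 + np.exp(-X @ true_beta))
y = (rng.uniform(size=n) < p_true).astype(float)

beta = np.zeros(3)
for _ in range(25):                          # Newton-Raphson iterations
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)                          # logistic variance weights
    grad = X.T @ (y - p)
    hess = X.T @ (X * W[:, None])
    beta = beta + np.linalg.solve(hess, grad)

print(np.round(beta, 2))                     # close to the planted values
```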
Article
Full-text available
Farm mechanisation plays a major role in the agricultural sector, as it facilitates energy-intensive operations. In developing countries, policies and technical constraints have greatly affected the development of a coherent agricultural mechanisation system that is accessible to farmers, especially the poorest. This study aimed to identify the socioeconomic factors that have driven the use of different sources of farm power in family farming in the Ruzizi Plain in the Democratic Republic of Congo. A random sample of 190 smallholder farmers and 30 technicians was surveyed in 2014 and 2015 in six areas of Kamanyola, Luvungi, Luberizi, Sange, Kiringye and Kiliba. Results showed that mechanisation in the Ruzizi Plain involved a range of sources of farm power, including draft animals, tractors and rototillers. Factors such as gender, attitude of the head of the household, farm productivity and profitability and non-farm incomes played a crucial role in the choice of whether or not to mechanise. Maize profitability was higher under mechanisation (US$ 535.46 ha-1) compared to non-user farms (US$ 7.73 ha-1). For cassava, however, there were no significant differences in profitability between mechanised and non-mechanised farms. Other benefits of mechanisation included better working conditions, reduction in the duration of farming operations, and the expansion of cultivated land parcels.
... The results of both Hausman tests (p > 0.05) indicate that the random effects coefficients were not significantly different from the fixed effects coefficients. Consistent with common practice, we, therefore, prefer the efficiency and consistency of the random effects model (Greene, 2003). ...
Article
Full-text available
Purpose Process innovation is a key determinant of performance. While extant literature paints a clear picture of the drivers of process innovation, the effect of process innovation on performance has received little attention. This paper aims to examine how the divergence of process innovation impacts performance. Divergence concerns the extent to which the observed level of process innovation diverges from the expected level of process innovation. Positive divergence occurs when the observed level of process innovation is higher than expected, while for negative divergence the opposite occurs. In turn, the authors consider how divergence acts as a driver of performance. Design/methodology/approach The authors use survey and archival data from 5,594 firms across 15 countries. The authors analyze the data using an advanced two-step random-effects estimator that accounts for the multi-level data used. Findings The authors find negative divergence to reduce performance under high competitive intensity, whereas positive divergence is detrimental under high environmental uncertainty. Research limitations/implications The authors present new and unique insights into the relationship between divergence and performance. The authors argue that each firm has an "ideal" level of process innovation, based on its resources and business environment, relative to which performance diminishes. Specifically, the authors argue that divergence from the firm's expected level of process innovation is associated with reduced performance during high environmental uncertainty or high competitive intensity. Furthermore, the authors argue that there can be "too much" process innovation. This nuances the majority of prior empirical studies in this area, which suggest that more innovation is always better for firms.
The more nuanced approach reveals that the process innovation-performance debate should not focus on more or less innovation per se, but on how innovation is constructed and supported. Practical implications Some argue the existence of an academia-practitioner gap, with both living in different worlds (Reibstein et al., 2009). The findings suggest that theory is not only useful to practitioners but also has a crucial and central role regarding decisions relating to efficiency and effectiveness of scarce resources in the field of process innovation. More specifically, the authors demonstrate that prior research on process innovation is useful in that, relative to a theory-predicted level, divergence diminishes performance in the global sample of companies across a wide range of industries. In addition, the authors suggest that firms should not strive for more innovation per se. The findings suggest that positive divergence, or too much innovation, is detrimental to performance under environmental uncertainty, while negative divergence, or too little innovation, is harmful to performance under competitive intensity. Moreover, the divergence approach is also useful for comparing performance to that of other firms, typically referred to as benchmarking. Originality/value This paper is useful and important for managers and theory development as it provides insight into situations where a firm may have "too little" or "too much" process innovation. Thus, divergence advances understanding as, in contrast with previous studies, the authors do not suggest that more innovation is always better.
... In all our models, the Lagrange multiplier (LM) tests conducted suggest the presence of heteroscedasticity. Therefore, the feasible generalized least squares (FGLS) method, rather than OLS, was used to estimate all the models, as suggested in Harvey (1976) and Greene (2000). FGLS estimation is consistent and more efficient than OLS in the presence of heteroscedasticity and serial correlation. ...
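For the groupwise-heteroscedasticity case, FGLS amounts to estimating each group's error variance from first-stage OLS residuals and then reweighting observations by the inverse estimated standard deviation. The sketch below uses simulated groups (not the airline-route panel) to show the two steps.

```python
# FGLS sketch for groupwise heteroscedasticity: step 1 is OLS, step 2
# reweights each observation by 1/sigma_hat of its group. Simulated data.
import numpy as np

rng = np.random.default_rng(5)
n_groups, per = 20, 100
g = np.repeat(np.arange(n_groups), per)
sigma_g = rng.uniform(0.5, 3.0, size=n_groups)   # group-specific error s.d.
x = rng.normal(size=n_groups * per)
y = 1.0 + 2.0 * x + rng.normal(size=x.size) * sigma_g[g]

X = np.column_stack([np.ones(x.size), x])
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]  # step 1: consistent OLS
resid = y - X @ beta_ols

var_hat = np.bincount(g, weights=resid ** 2) / np.bincount(g)
w = 1 / np.sqrt(var_hat[g])                      # step 2: GLS weights
beta_fgls = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)[0]
print(beta_fgls)
```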
Article
China has become the second largest air transport market in the world since 2005. Its total length of high-speed rail (HSR) tracks in operation has been greater than that of all other countries combined since 2012. HSR poses a significant challenge to the Chinese airline industry, especially on major airline routes. The impacts of HSR on two market-competition measures, namely, the Herfindahl–Hirschman Index (HHI) and the Lerner index, are examined in this study. In general, the entry of HSR had the effect of reducing market power measured by both the unweighted and weighted Lerner indexes. However, the Lerner index and HHI of the routes with parallel HSR services remained consistently higher than those of the routes without parallel HSR services.
... Because of the overlapping components in these interaction terms, the highest VIF in this model is 38, well above the cut-off point of 5. This multicollinearity inflates standard errors in the estimation even though it does not affect the coefficient estimates (Greene, 2003), leading to non-significant results. Consequently, we refrain from interpreting the results of this model specification. ...
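The variance inflation factor behind that cut-off can be computed directly from auxiliary regressions: VIF_j = 1/(1 - R_j^2), where R_j^2 comes from regressing column j on the remaining regressors. The toy data below are invented; the third regressor nearly duplicates the first, mimicking the overlap between an interaction term and its components.

```python
# VIF via auxiliary regressions: VIF_j = 1 / (1 - R_j^2).
import numpy as np

def vif(X):
    out = []
    for j in range(X.shape[1]):
        xj = X[:, j]
        others = np.column_stack([np.ones(len(xj)), np.delete(X, j, axis=1)])
        coef = np.linalg.lstsq(others, xj, rcond=None)[0]
        resid = xj - others @ coef
        r2 = 1 - (resid @ resid) / ((xj - xj.mean()) @ (xj - xj.mean()))
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(6)
a = rng.normal(size=500)
b = rng.normal(size=500)
# Third column nearly duplicates the first: severe multicollinearity.
X = np.column_stack([a, b, a + 0.05 * rng.normal(size=500)])
print(np.round(vif(X), 1))       # columns 1 and 3 far above the cut-off of 5
```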
Article
Full-text available
Research in signaling theory has recently begun to explore the consequences of incongruity across signals from a single source. However, attention has been directed towards the incongruity across signals along a single dimension, even though audiences evaluate firms based on signals along different dimensions. Here, we extend this theory to investigate the incongruity across signals along different dimensions. Specifically, we theorize that the salience of positive capability signals when organizational misconduct is revealed (a negative integrity signal) causes interdimensional incongruence. We argue that audiences face greater incongruity and react more negatively to misconduct by firms whose positive capability signals were more salient. Using irregular financial restatements as the negative integrity signals and alliance announcements as the positive capability signals, we find that investors react more negatively to restatements by firms whose alliance announcements were more salient. That is, firms that announced more frequently and firms that created more positive expectations from those announcements were penalized more. We also found that firm size and diversification weaken these negative effects. We contribute to research on signaling theory, organizational misconduct, and alliances.
... where x_i is the vector of observed variables as outlined in Equation (1). For ease of interpretation, marginal effects were calculated [9,10]. [Tables: number and species of pets respondents currently have, and have had in the past five years (dog, cat, fish, horse, bird, reptile, rabbit, small mammal, other), as percentages of pet owners; small mammals include hamster, ferret, guinea pig, rat, mouse, chinchilla, gerbil.] ...
Article
Full-text available
Pet ownership, veterinary use, and beliefs regarding veterinary care were elicited through a nationally representative survey of 997 U.S. residents. Fifty-one percent of respondents have or had a dog in the past five years and 37% have or had a cat in the past five years. Over ninety percent of cat and dog owners had visited a veterinarian at some point, but only about 40% visited a veterinarian annually. With the rise of options in veterinary medicine, including low-cost options for vaccines and spay/neuter, further study and analysis of pet owners' use of veterinary care is warranted. Fifty-four percent of dog owners and 40% of cat owners who went to a low-cost spay/neuter clinic also went to a veterinarian/clinic/practice. This finding suggests that pet owners who use low-cost options do so in a manner that supplements rather than replaces traditional veterinary care. Logit models were employed to evaluate the relationship between dog and cat owner demographics and visiting a veterinarian. The probability of visiting a veterinarian increased with age and income for dog owners.
... Assuming a normal distribution of errors and following Greene [7], the probability of a farmer's participation in the HBCMP is given by: ...
... The analysis employed the Tobit model in a panel-data setting. Panel Tobit estimation can address heteroscedasticity and autocorrelation via maximum likelihood [53] (see Table 4 for the heteroscedasticity and autocorrelation tests). ...
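The Tobit likelihood that such an estimation maximizes combines two kinds of terms: uncensored observations contribute a normal density, censored ones the probability mass at or below the limit. The sketch below just evaluates that log-likelihood on simulated left-censored data (all numbers invented); a full maximum-likelihood fit would maximize it over the parameters.

```python
# Censored-regression (Tobit) log-likelihood, evaluated directly.
import math
import random

def tobit_loglik(beta0, beta1, sigma, x, y, limit=0.0):
    ll = 0.0
    for xi, yi in zip(x, y):
        mu = beta0 + beta1 * xi
        if yi > limit:                    # uncensored: normal pdf term
            z = (yi - mu) / sigma
            ll += -0.5 * z * z - math.log(sigma * math.sqrt(2 * math.pi))
        else:                             # censored: normal cdf term
            z = (limit - mu) / sigma
            ll += math.log(0.5 * (1 + math.erf(z / math.sqrt(2))) + 1e-300)
    return ll

# Simulated sample from y* = 1 + 2x + e, observed as max(y*, 0).
random.seed(7)
x = [random.gauss(0, 1) for _ in range(500)]
y = [max(1 + 2 * xi + random.gauss(0, 1), 0.0) for xi in x]
# The likelihood is higher at the true parameters than at a wrong guess.
print(tobit_loglik(1, 2, 1, x, y), tobit_loglik(0, 0, 1, x, y))
```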
Article
Full-text available
Flood damage continues to be an issue in coastal cities. Impervious areas that contribute to flood damage are increasing due to the continuous development of ports in cities. However, previous research has not explored development in port hinterlands and in the coastal flood risk areas of coastal cities. Therefore, this study analyzed the impact of coastal city development on flood damage in Korea. A panel Tobit analysis was conducted on 58 coastal cities between 2002 and 2018. The results revealed that a 1% increase in impervious surfaces and one coastal development permit would increase damage costs by 1.29% and 2%, respectively. The analysis revealed that the increasing development of coastal cities had a significant impact on flood damage. The findings suggest that land-use plans highlight a conflict between port development and safety. This article provides insight that can be used by policy makers to manage risk areas near ports.
... where n is the number of observations of firm-specific daily returns during the fiscal year t. The denominator is a normalization factor (Greene, 1993). This study adopts the convention that an increase in NCSKEW corresponds to a stock being more "crash prone", that is, having a more left-skewed distribution, hence the minus sign on the right-hand side of equation (4). ...
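The NCSKEW measure the snippet refers to is the negative third moment of demeaned firm-specific daily returns, scaled by the sample variance raised to the 3/2 power. The sketch below follows the standard formula from this literature; the return series are simulated, not actual firm data.

```python
# Negative coefficient of skewness (NCSKEW) over a fiscal year of
# firm-specific daily returns r (standard crash-risk formula):
#   NCSKEW = -[ n (n-1)^{3/2} sum r^3 ] / [ (n-1)(n-2) (sum r^2)^{3/2} ]
import numpy as np

def ncskew(returns):
    r = np.asarray(returns, dtype=float)
    r = r - r.mean()                     # demeaned firm-specific returns
    n = len(r)
    num = -n * (n - 1) ** 1.5 * np.sum(r ** 3)
    den = (n - 1) * (n - 2) * np.sum(r ** 2) ** 1.5
    return num / den

rng = np.random.default_rng(8)
base = rng.normal(size=250)                            # ~1 year of daily returns
crash_prone = np.where(rng.uniform(size=250) < 0.03, -0.10, 0.01 * base)
print(ncskew(crash_prone) > ncskew(base))  # left-skewed => higher NCSKEW
```

A more left-skewed (crash-prone) return series yields a larger NCSKEW, which is why the minus sign appears on the right-hand side of the formula.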
Article
Full-text available
This study examines the impact of corporate diversification strategy on stock price crash risk. Using a large sample of Chinese A-share listed companies for the period 2003-2017, we find that stock price crash risk significantly increases when a firm's operating strategy changes from specialized to diversified operations, or when the degree of diversification deepens. We also find that our results are stronger for non-state-owned listed firms, but not significant for state-owned firms. Furthermore, we find that the significant positive association between diversification and crash risk is more pronounced for firms with low external audit quality and low analyst coverage. Our study suggests that the diversification of operating strategy matters in determining stock price crash risk.
... Our empirical strategy to identify the link between urban structural characteristics and future water deficit relies on estimated dependent variable (EDV) models. An EDV model is the last stage of a multi-level, two-stage model (Greene, 2003). In our case, the first stage is the estimation of water deficit as a function of hydro-climatic variables, and the second stage is the estimation of the association of those water deficits with centrality and maturity. ...
Preprint
Full-text available
Cities face the risk of water deficits. This risk involves substantial costs and damages that impair water access, biodiversity, public health, education and business. Consequently, comparative research is growing to understand urban water deficit risks and to derive policy lessons that can limit the vulnerability of large population centres. So far, this body of the literature has mostly focused on short term analysis (<10 years) and emphasized particular policy instruments to cope with shocks while neglecting the role of socio-economic contexts. We intend to fill this gap by questioning how current urban structural characteristics affect future urban water deficits. We combine indicators of cities’ centrality and maturity in 2010 with the likelihood and magnitude of cities’ water deficits between 2050 and 2070. The dataset covers 235 of the 595 cities over 750 000 inhabitants in 2010. We show that urban centrality and maturity are negatively associated with future urban deficit, as these two characteristics enable cities to attract political, technical, and economic resources to fuel their development. Further, we depict the non-linearity of these relationships. Whereas management responses and strategies may impact short-term water deficits in cities, we argue for the role of urban structural factors in shaping future water deficits.
... These estimated effects are based on a weighted least squares regression, as is common in meta-analytical studies. Such weighting procedures account for heterogeneity in methodological approaches and sample sizes across primary studies, and lead to more efficient estimation than ordinary least squares (Greene 2003). For our main results, we use weights based on the inverse of the number of authors per study, as described in Section 4.2. ...
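Weighted least squares has a simple closed form, beta = (X'WX)^{-1} X'Wy, with W a diagonal matrix of observation weights (here, inverse author counts in the study's setup). The sketch below uses arbitrary illustrative weights on simulated data and checks that unit weights reproduce plain OLS.

```python
# Closed-form weighted least squares; unit weights recover ordinary OLS.
import numpy as np

def wls(X, y, w):
    Xw = X * w[:, None]                  # each row scaled by its weight
    return np.linalg.solve(X.T @ Xw, Xw.T @ y)

rng = np.random.default_rng(9)
n = 400
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = 0.5 + 1.5 * x + rng.normal(size=n)
w = rng.uniform(0.2, 1.0, size=n)        # illustrative per-study weights

beta_wls = wls(X, y, w)
beta_ols = wls(X, y, np.ones(n))         # unit weights = plain OLS
print(beta_wls, beta_ols)
```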
Article
Full-text available
The size of fiscal multipliers is intensively debated as large (small) multipliers provide arguments to expand (cut) public spending. We use data on multiplier estimates from over a hundred scholarly studies, and ask whether the national imprint and various incentives that the authors face can help explain the large observed variance in these estimates. We complement this meta-analytical data with information on economists’ personal characteristics collected from their biographies and through a self-conducted survey. Our evidence is consistent with the hypothesis that national background and policy orientation of researchers matter for the size of multiplier estimates. We only find weak support for the hypothesis that the interests of donors financing the research are relevant. Significant biases largely disappear for teams of international co-authors.
... reports that SUR is an effective method for estimating models depicting mediating and/or moderating conditions using cross-sectional data. This technique is also known to alleviate endogeneity concerns (Autry and Golicic, 2010), since possible correlations between error terms are accounted for and the focal variables can be modeled as both independent and dependent within the system (Greene, 2003, p. 340). There are several other methods, such as structural equation modeling (SEM), used to estimate similar models. ...
Article
Full-text available
Purpose This paper investigates the inter-relationships among supply integration, demand integration and internal integration in the context of food banking. Design/methodology/approach This study utilizes survey data from managers at 71 different food banks in the US combined with secondary data gathered from Feeding America's website to provide model controls and an objective measure of food bank performance. The performance metric is the amount of food distributed per food insecure individual in the food bank's service area. Theoretically developed hypotheses were tested using seemingly unrelated regression techniques and a Monte Carlo simulation-based mediation analysis. Findings While previous research on integration relationships in for-profit supply chains has shown that managing internal integration forms the foundation for integrating with suppliers and customers, the findings indicate that, for not-for-profit food banks, external integration should precede internal integration and that demand integration has a stronger influence on performance than supply integration. Research limitations/implications The heavy reliance of food banks on external partners necessitates an internal integration structure that supplements and builds upon these external relationships. The basic programs thus developed have a direct impact on the amount of food distributed per food insecure individual. Originality/value This paper contributes to the humanitarian supply chain management literature by analyzing supply chain integration and its performance implications in a slow onset disaster setting.
Article
In this paper, to evaluate the efficiency of the automobile industry in Iran, a stochastic frontier production function is estimated using panel data from four large automakers. The production function is of the Cobb-Douglas type with two inputs, labor and capital, in log-linear form. To separate the effects outside the firms' control from their inefficiency, a random-effects model is estimated under two assumptions: time-invariant and time-varying inefficiency. The results show that the efficiency of the automobile industry lies between 76.6 and 79.3 percent, and that the industry exhibits increasing returns to scale.
Article
Full-text available
The study aims to examine the effect of governance mechanisms on the profitability of Islamic banks in the Gulf Cooperation Council (GCC) countries. It relies on a sample of 24 Islamic banks in five GCC countries over the period 2005-2016, analyzed with a dynamic panel data model estimated by the system Generalized Method of Moments (System GMM). Two measures of profitability were used: return on assets and return on equity. The results show a strong relationship between governance variables and the profitability of Islamic banks in the GCC countries: the number of Sharia supervisory board members affects profitability positively, while both board size and the number of independent board members affect it negatively. The results also show that bank size, the capital adequacy ratio, liquidity, and the interest rate are positive determinants of Islamic bank profitability.
Article
This study empirically examines familiarity bias in a residential real estate context. The results confirm the status quo deviation aversion hypothesis and to a lesser extent the increasing status quo deviation aversion hypothesis. Familiarity bias is strongest at the country level and least pronounced in the Asian population, while North Americans and females tend to experience it the most.
Article
Full-text available
The objective of this study is to analyze the evolution and determinants of market power in Peru’s regulated microfinance sector during the period of January 2003 to June 2016. We estimate both a conventional Lerner index (LICON) and an efficiency-adjusted Lerner index (LIADJ) using information from a wide panel of microfinance institutions (MFIs), thus finding that the LIADJ is significantly greater than the LICON. This result confirms that not considering MFIs’ inefficiency leads to an underestimation of their market power. Both indices decreased until 2014, which indicates that regulated MFIs’ market power decreased significantly for more than a decade. Beginning in 2015, market power significantly grew; the largest entities as well as those with the highest efficiency have greater market power. This last result evidences the fulfillment of the efficient structure (ES) hypothesis. In addition, a less elastic demand for microcredit, a lower default risk, as well as the processes of mergers, takeovers, and changes in the business structure of some MFIs, increase market power. Finally, the MFIs that operate in localized areas exhibit greater market power.
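The conventional Lerner index underlying both measures in this study is the proportional markup of price over marginal cost, LI = (P - MC)/P, with 0 indicating perfect competition. A minimal sketch with purely illustrative numbers (not taken from the Peruvian MFI panel):

```python
# Conventional Lerner index of market power: LI = (P - MC) / P.
def lerner(price, marginal_cost):
    return (price - marginal_cost) / price

# Hypothetical MFI charging a 30% lending rate against a 24% marginal cost:
print(round(lerner(0.30, 0.24), 2))   # 0.2 -> some market power
```

The efficiency-adjusted variant discussed in the abstract replaces the observed cost with an estimate purged of inefficiency, which is why ignoring inefficiency understates market power.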
Conference Paper
Full-text available
Abstract: The relevance of the capital structure for value maximization is a subject that still causes controversy.
Due to legal restrictions and controls on liabilities, there is still debate about banks' capability to create value from their capital structure. The way these firms raise funds in the capital structure can significantly affect their results. Banks tend to favor a larger share of debt, mainly higher levels of customer deposits, since deposits are considered one of the least costly forms of funding, creating ideal conditions for leveraging results. Accordingly, this empirical-analytical, descriptive, quantitative study aimed to identify evidence supporting the relevance of the capital structure for the largest publicly held banks in Brazil and to verify whether the main banking strategies related to capital structure, operational efficiency, solvency, and liquidity are relevant to value maximization. Longitudinal data covering the first quarter of 2008 through the fourth quarter of 2018 were collected from the Economatica® database, and economic and financial indicators representing the study variables were computed from the consolidated financial statements. The analyses were performed using panel data regression models. The results indicated evidence of the relevance of the capital structure and of the influence of banking strategies related to operational efficiency on banks' profitability. Keywords: Capital Structure Relevance; Bank Profitability Determinants; Banking Performance.
Book
Full-text available
This book offers broad evidence on how new information and communication technologies (ICT) impact social development and contribute to social welfare. Its aim is to show how new technological solutions may contribute to society's welfare by encouraging new 'socially responsible' initiatives and practices as the broad adoption of new technologies becomes an integral component of organizations, and of the overall economy. Society and Technology: Opportunities and Challenges is designed to provide deep insight into theoretical and empirical evidence on ICT as socially responsible technologies. More specifically, it puts special focus on examining the following: • how channels of ICT impact social progress, environmental sustainability and instability • the role of ICT in creating social networks, with positive and negative consequences of networking • how ICT encourages education, skills development, institutional development, etc. • the ethical aspects of technological progress, and • technology management for corporate social responsibility. The book is written primarily for scholars and academic professionals from a wide variety of disciplines that are addressing issues of economic development and growth, social development, and the role of technological progress in broadly defined socioeconomic progress. It is also an invaluable source of knowledge for graduate and postgraduate students, particularly within economic and social development, information and technology, worldwide studies, social policy or comparative economics.
Article
Full-text available
When the objective is to administer the better of two treatments to an individual, it is necessary to know his or her individual treatment effects (ITEs) and the correlation between the potential responses (PRs) Y1 and Y0 under treatments 1 and 0. Data generated in a parallel-group RCT do not allow the ITE to be determined, because only two samples from the marginal distributions of these PRs are observed, not the corresponding joint distribution. This is due to the "fundamental problem of causal inference." Here, we present a counterfactual approach for estimating the joint distribution of two normally distributed responses to two treatments. This joint distribution of the PRs Y1 and Y0 can be estimated by assuming a bivariate normal distribution for the PRs and by using a normally distributed baseline biomarker functionally related to the sum Y1 + Y0. Such a functional relationship is plausible since a biomarker and the sum Y1 + Y0 encode the same information in an RCT, namely the variation between subjects. The estimation of the joint trivariate distribution is subject to some constraints. These constraints can be framed in the context of linear regressions with regard to the proportions of variance in the responses explained and with regard to the residual variation. This provides new insights into the presence of treatment–biomarker interactions. We applied our approach to example data on exercise and heart rate and extended the approach to survival data.
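The role of the unidentified PR correlation can be illustrated by simulation. This sketch is not the paper's biomarker-based estimator: it simply fixes an assumed correlation rho (which a parallel-group RCT alone cannot identify) together with made-up marginal parameters, and shows the implied ITE distribution Y1 - Y0.

```python
import numpy as np

rng = np.random.default_rng(0)
mu1, sd1 = 10.0, 2.0   # marginal mean/SD under treatment 1 (assumed)
mu0, sd0 = 8.0, 2.0    # marginal mean/SD under treatment 0 (assumed)
rho = 0.7              # assumed PR correlation; not identified from the marginals

# Joint bivariate normal of the potential responses (Y1, Y0):
cov = np.array([[sd1**2, rho * sd1 * sd0],
                [rho * sd1 * sd0, sd0**2]])
y = rng.multivariate_normal([mu1, mu0], cov, size=100_000)
ite = y[:, 0] - y[:, 1]   # individual treatment effects Y1 - Y0

# Analytically, Var(ITE) = sd1^2 + sd0^2 - 2*rho*sd1*sd0 = 2.4 here,
# so the ITE SD shrinks as rho grows even though the marginals are fixed.
print(ite.mean(), ite.std())
```

The mean ITE equals the difference in marginal means regardless of rho; only the spread of the ITEs, and hence the fraction of individuals who benefit, depends on the joint distribution, which is exactly what the biomarker assumption is needed to pin down.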
Article
Full-text available
Mobile internet is considered one of the most important developments in information and communication technology due to its considerable effect on both the economy and our daily lives. Furthermore, mobile internet is an essential tool for overcoming the rural–urban digital divide. With respect to agriculture, mobile internet can play a central role in information gathering as well as the implementation of precision and smart farming technologies. Yet, no study has identified the determinants of mobile internet adoption in agriculture. Using a bivariate probit model with sample selection and a representative data set of 815 German farmers, this study showed that, among other characteristics, the age of the farmer, farm size and location, as well as familiarity with internet risks, are associated with mobile internet adoption in agriculture. These results may be of interest to policy makers who deal with internet infrastructure, and to providers of farm equipment that relies on a mobile internet connection.
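The structure of a bivariate probit with sample selection can be sketched via its data-generating process. Everything below is illustrative, not the paper's specification: covariates, coefficients, and the interpretation of the two equations are assumptions chosen only to show why the error correlation matters.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
z = rng.normal(size=n)   # selection covariate (illustrative)
x = rng.normal(size=n)   # outcome covariate (illustrative)

# Correlated latent errors: corr(u, v) = rho links the two probit equations.
rho = 0.5
u = rng.normal(size=n)
v = rho * u + np.sqrt(1 - rho**2) * rng.normal(size=n)

# Selection equation (e.g., general internet use): s = 1 if latent index > 0.
s = (0.5 + 1.0 * z + u) > 0
# Outcome equation (e.g., mobile internet adoption), observed only when s = 1.
y = (0.2 + 0.8 * x + v) > 0

# With rho > 0, the adoption rate in the selected subsample overstates the
# population rate -- the bias a naive single-equation probit would inherit.
print(y.mean(), y[s].mean())
```

Jointly estimating both equations by maximum likelihood (with rho as a parameter) corrects for this selection on unobservables; the simulation only demonstrates the bias that motivates doing so.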
Article
Full-text available
Legislative bargaining theory suggests that fiscal transfers among member states of a federation are determined to a substantial degree by political bargaining powers. Malapportionment of the states' population in the legislature is claimed to lead to disproportionally higher benefits to overrepresented states. The present paper analyses empirically the distribution of fiscal transfers in Germany's intergovernmental transfer system over the period 1970-2002. It can be shown that overrepresented states in the upper house receive disproportionate shares of transfers, while malapportionment in the lower house does not seem to matter. We also find empirical evidence that overrepresentation became more important over time.
Article
Prior research suggests that diversified firms are often unable to match resources to the market needs and opportunities of their divisions due to factors such as influence activities. In this research we propose that when such internal inefficiencies arise, diversified firms may form alliances to access resources externally to support their divisions in their industries and operations. Using a sample of US firms from 1997 to 2006, we find that, on average, diversified firms form more alliances within industries that they currently operate in when compared to single-business firms. The alliancing activity in related industries increases when businesses with diverse growth opportunities exist within the same firm, and it decreases with the intensity of internal control and coordination mechanisms. Our study suggests a link between internal resource allocation processes and external alliancing activity, while highlighting that alliances may play an important role in how diversified firms manage the inefficiencies that arise within their boundaries.
Article
This study examines the effect of status homophily between a team organizer and a project team on project performance. Due to the Matthew effect, high statuses of the team organizer and project team may lead to better project performance, respectively. However, similar-status association on a project can make it difficult to form an informal hierarchy and therefore cause internal conflict. Thus, status homophily may weaken project performance. We test this idea in the context of the Chinese film industry. The results show that both producer status and artistic team status enhance project performance. However, the joint effect (i.e., status homophily) negatively affects project performance.
Article
The aim of this paper is to study the effect of income inequality on the probability of democratization in a panel of 51 transition countries over the period 1960-2008. Using a conditional fixed-effects logit estimation, we find robust results suggesting that income inequality (measured by the Gini index of household income inequality) has an inverted U-shaped relation with the probability of transition from autocracy to democracy. We show that there is a turning point at a level of household income inequality equal to a Gini index of 40. When income inequality is below 40, the probability of transition is positively related to inequality; when inequality is higher, a further increase in inequality decreases the probability of democratization. This is consistent with Acemoglu and Robinson's theory, which shows that transitions are likeliest at moderate levels of inequality while autocracy is likelier at the lowest and highest levels of inequality.
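The inverted-U claim follows mechanically from a quadratic logit specification. This is a sketch with made-up coefficients chosen only to reproduce a peak at a Gini of 40; the paper's estimated coefficients are not reported in the abstract.

```python
# With logit P(transition) = a_i + b1*gini + b2*gini^2 (b2 < 0),
# the turning point of the inverted U is where the marginal effect
# b1 + 2*b2*gini changes sign, i.e. at gini = -b1 / (2*b2).
b1, b2 = 0.8, -0.01   # hypothetical coefficients
turning_point = -b1 / (2 * b2)
print(turning_point)  # 40.0

# Below the turning point more inequality raises the transition
# probability; above it, the effect reverses:
print(b1 + 2 * b2 * 30)   # positive marginal effect at Gini 30
print(b1 + 2 * b2 * 50)   # negative marginal effect at Gini 50
```

Reading a turning point off the ratio of the linear and quadratic coefficients is the standard way such inverted-U results are reported, which is presumably how the Gini-of-40 figure in the abstract was obtained.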
Article
Full-text available
This study was carried out to estimate factors influencing the multidimensional poverty status of rural households in Ogun State, Nigeria. A multistage sampling technique was used to select 240 rural households. Data were obtained through a structured interview schedule and analyzed with descriptive techniques, a multidimensional poverty index, and logistic regression models. The results revealed that 69% of the rural households are multidimensionally poor. It was found that, on average, the rural poor households were deprived in 41% of the weighted indicators. Another finding is that rural households were deprived in 28% of the total deprivations they could experience. It was also revealed that deprivation in infrastructure contributed most to the total deprivation experienced, followed by deprivation in living standard, social capital, health and education. The study further found that household size (p < 0.05), gender (p < 0.01), off-farm income (p < 0.1), availability of community health extension workers (p < 0.05) and availability of public market (p < 0.1) significantly influence the poverty status of rural households. The study concluded that an increase in household size increases the likelihood of being multidimensionally poor, while an increase in off-farm income, access to a public market and health extension services reduce the likelihood of being poor. The study recommended that rural farmers diversify their livelihood sources into off-farm activities during their lean periods, as this will be instrumental in reducing their poverty status. Also, infrastructural facilities such as good healthcare services and public markets should be put in place, as this will go a long way in reducing the poverty status of the rural farmers.
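The headcount and intensity figures quoted above (share of households poor, average share of weighted indicators in which the poor are deprived) come from an Alkire-Foster-style multidimensional poverty index. A minimal sketch, with hypothetical deprivation scores and an assumed poverty cutoff k:

```python
import numpy as np

# Each entry is a household's deprivation score: the weighted share of
# indicators in which it is deprived (hypothetical data, 5 households).
scores = np.array([0.50, 0.45, 0.10, 0.35, 0.60])
k = 0.33                       # assumed poverty cutoff

poor = scores >= k
H = poor.mean()                # headcount ratio: share of households poor
A = scores[poor].mean()        # intensity: avg deprivation score among the poor
MPI = H * A                    # adjusted headcount (Alkire-Foster M0)
print(H, A, MPI)
```

In the study's terms, H corresponds to the 69% headcount and A to the 41% average deprivation among the poor; the indicator weights and the cutoff k used in the actual survey are not given in the abstract.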