Bank Survival Around the World: A Meta-Analytic Review
Kočenda, Evžen; Iwasaki, I.
2021 - English
Bank survival is essential to economic growth and development because banks mediate the financing of the economy. A bank’s overall condition is often assessed by a supervisory rating system called CAMELS, an acronym for the components Capital adequacy, Asset quality, Management quality, Earnings, Liquidity, and Sensitivity to market risk. Estimates of the impact of CAMELS components on bank survival vary widely. We perform a meta-synthesis and meta-regression analysis (MRA) using 2,120 estimates collected from 50 studies. In the MRA, we account for uncertainty in moderator selection by employing Bayesian model averaging. The results of the synthesis indicate an economically negligible impact of CAMELS variables on bank survival; in addition, the effect of bank-specific, (macro)economic, and market factors is virtually absent. The results of the heterogeneity analysis and the publication bias analysis are consistent in that neither finds an economically significant impact of the CAMELS variables. Moreover, best practice estimates show a small economic impact of CAMELS components and no impact of other factors. The study concludes that caution should be exercised when using CAMELS ratings to predict bank survival or failure.
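To illustrate the kind of publication-bias test used in such meta-analyses, the sketch below implements a simple FAT-PET-style regression (reported estimates regressed on their standard errors) on hypothetical data. This is a minimal sketch of the general method, not the authors' exact MRA specification; all numbers are invented.

```python
import random

# Hypothetical meta-analysis data: 100 reported estimates whose true
# underlying effect is zero, but where selection pulls published
# estimates upward in proportion to their standard error.
random.seed(1)
true_effect = 0.0
data = []
for _ in range(100):
    se = random.uniform(0.05, 0.5)
    est = random.gauss(true_effect, se) + 0.8 * se  # simulated selection bias
    data.append((est, se))

# FAT-PET idea: OLS of estimate on standard error. The slope proxies
# publication selection (FAT); the intercept is the bias-corrected
# effect (PET).
n = len(data)
mx = sum(se for _, se in data) / n
my = sum(est for est, _ in data) / n
slope = sum((se - mx) * (est - my) for est, se in data) / \
        sum((se - mx) ** 2 for _, se in data)
intercept = my - slope * mx
```

With selection simulated in, the slope comes out positive while the intercept stays near the true (zero) effect, which is exactly the pattern a FAT-PET regression is designed to separate.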
Keywords:
bank survival; bank failure; CAMELS; meta-analysis; publication selection bias
Fulltext is available at external website.
Does the Spillover Index Respond Significantly to Systemic Shocks? A Bootstrap-Based Probabilistic Analysis
Greenwood-Nimmo, M.; Kočenda, Evžen; Nguyen, V. H.
2021 - English
The spillover index developed by Diebold and Yilmaz (Economic Journal, 2009, vol. 119, pp. 158-171) is widely used to measure connectedness in economic and financial networks. Abrupt increases in the spillover index are typically thought to result from systemic events, but evidence of the statistical significance of this relationship is largely absent from the literature. We develop a new bootstrap-based technique to evaluate the probability that the value of the spillover index changes over an arbitrary time period following an exogenously defined event. We apply our framework to the original dataset studied by Diebold and Yilmaz and obtain qualified support for the notion that the spillover index increases in a timely and statistically significant manner in the wake of systemic shocks.
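The spillover index itself is a summary of a forecast-error variance decomposition (FEVD): the share of total forecast-error variance attributable to cross-variable shocks. The sketch below computes it from a hypothetical 3-variable FEVD matrix; the matrix values are invented for illustration, and the bootstrap machinery of the paper is not reproduced here.

```python
# Hypothetical FEVD matrix: entry fevd[i][j] is the share of variable
# i's forecast-error variance due to shocks in variable j; rows sum to 1.
fevd = [
    [0.80, 0.15, 0.05],
    [0.10, 0.75, 0.15],
    [0.05, 0.20, 0.75],
]

n = len(fevd)
# Spillovers are the off-diagonal shares (variance received from others).
off_diag = sum(fevd[i][j] for i in range(n) for j in range(n) if i != j)
total = sum(sum(row) for row in fevd)
spillover_index = 100 * off_diag / total  # expressed in percent
```

An abrupt jump in this percentage across rolling estimation windows is the kind of movement the paper's bootstrap procedure tests for statistical significance.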
Keywords:
Spillover index; systemic events; bootstrap-after-bootstrap procedure
Fulltext is available at external website.
Distributed Sequential Zero-Inflated Poisson Regression
Žemlička, R.; Dedecius, Kamil
2021 - English
The zero-inflated Poisson regression model is a generalized linear model (GLM) for non-negative count variables with an excessive number of zeros. This letter proposes its low-cost distributed sequential inference from streaming data in networks with information diffusion. The model is viewed as a probabilistic mixture of a Poisson and a zero-located Dirac component, whose probabilities are estimated using a quasi-Bayesian procedure. The regression coefficients are inferred by means of a weighted Bayesian update. The network nodes share their posterior distributions using the diffusion protocol.
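The mixture-with-responsibilities idea can be sketched in a few lines. The code below is a minimal single-node illustration, not the authors' exact algorithm: it uses a Gamma conjugate prior on the Poisson rate and a Beta prior on the zero-inflation probability, updated sequentially with responsibility-weighted (quasi-Bayesian) steps; all hyperparameter values are assumptions.

```python
import math

def poisson_pmf(y, lam):
    return math.exp(-lam) * lam ** y / math.factorial(y)

def update(state, y):
    """One sequential quasi-Bayesian update on observation y."""
    a, b, c, d = state            # Gamma(a, b) on rate, Beta(c, d) on zero prob.
    lam = a / b                   # posterior-mean plug-in for the Poisson rate
    pi = c / (c + d)              # posterior-mean zero-inflation probability
    # Responsibility of the zero-located Dirac component (nonzero only at y=0).
    if y == 0:
        r = pi / (pi + (1 - pi) * poisson_pmf(0, lam))
    else:
        r = 0.0
    # Responsibility-weighted conjugate updates of both components.
    return (a + (1 - r) * y, b + (1 - r), c + r, d + (1 - r))

state = (1.0, 1.0, 1.0, 1.0)      # flat-ish assumed priors
for y in [0, 3, 0, 0, 2, 4, 0, 1]:
    state = update(state, y)
```

In the distributed setting of the letter, nodes would additionally combine such posteriors with their neighbours' via the diffusion protocol; that step is omitted here.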
Keywords:
Poisson regression; zero inflation; GLM
Fulltext is available at external website.
Media Treatment of Monetary Policy Surprises and Their Impact on Firms’ and Consumers’ Expectations
Pinter, J.; Kočenda, Evžen
2021 - English
We empirically investigate whether monetary policy announcements affect firms’ and consumers’ expectations by taking into account media treatments of monetary policy announcements. To identify exogenous changes in monetary policy stances, we use the standard financial monetary policy surprise measures in the euro area. We then analyze how a general newspaper and a financial newspaper (Le Monde and The Financial Times) report on announcements. We find that 87% of monetary policy surprises are either not associated with the general newspaper reporting a change in the monetary policy stance to its readers or have a sign that is inconsistent with the media report of the announcement. When we use the raw monetary policy surprise variable as an independent variable in the link between monetary policy announcements and firms’/consumers’ expectations, we mostly do not find, in line with several previous studies, any statistically significant association. When we take only monetary policy surprises that are consistent with the general newspaper report, in almost all cases we find that surprises about the immediate monetary policy stance do affect expectations. Surprises related to future policy inclination and information shocks usually do not appear to matter. The results appear to be in line with rational inattention theories and highlight the need for caution in the use of monetary policy surprise measures for macroeconomic investigations.
Keywords:
firm expectations; consumer expectations; monetary policy surprises; European Central Bank; information effect
Fulltext is available at external website.
Transferring Improved Local Kernel Design in Multi-Source Bayesian Transfer Learning, with an application in Air Pollution Monitoring in India
Nugent, Sh.; Quinn, Anthony
2021 - English
Existing frameworks for multi-task learning [1], [2] often rely on completely modelled relationships between tasks, which may not be available. Recent work [3], [4] has explored fully probabilistic methods for transfer learning between two Gaussian Process (GP) tasks. There, the target algorithm accepts source knowledge in the form of a probabilistic prior from a source algorithm, without requiring the target to model its interaction with the source. These strategies have offered robust improvements over current state-of-the-art algorithms, such as the Intrinsic Coregionalization Model. The Bayesian Transfer Learning algorithm proposed in [4] was found to provide robust, positive transfer. This algorithm was then extended to accommodate knowledge transfer from multiple source modellers [5]. Improved predictive performance was observed as the number of sources increased. This report reviews the multi-source transfer findings in [5] and applies them to a real-world problem of pollution modelling in India, using public-domain data.
Keywords:
fully probabilistic methods; Bayesian Transfer Learning algorithm; Gaussian Process; Intrinsic Coregionalization Model; pollution modelling
Fulltext is available at external website.
Unsupervised Verification of Fake News by Public Opinion
Grim, Jiří
2021 - English
In this paper we discuss a simple way to evaluate the messages in social networks automatically, without any special content analysis or external intervention. We presume that a large number of social network participants are capable of a relatively reliable evaluation of materials presented in the network. Considering a simple binary evaluation scheme (like/dislike), we propose a transparent algorithm that aims to increase the voting power of reliable network members by means of weights. The algorithm supports the votes that correlate with the more reliable weighted majority and, in turn, the modified weights improve the quality of the weighted majority voting. In this sense the weighting is controlled only by the general coincidence of voting members, while the specific content of messages is unimportant. The iterative optimization procedure is unsupervised and does not require any external intervention, with only one exception, as discussed in Sec. 5.2.

In simulation experiments the algorithm identifies the reliable members by means of weights nearly exactly. Using the reinforced weights we can compute, for a new message, the weighted sum of votes as a quantitative measure of its positive or negative nature. In this way any fake news can be recognized as negative and indicated as controversial. The accuracy of the resulting weighted decision making was substantially higher than that of simple majority voting and proved considerably robust with respect to possible external manipulations.

The main motivation of the proposed algorithm is its application in a large social network. The content of evaluated messages is unimportant; only the related decision making of participants is registered and compared with the weighted vote, with the aim of identifying the most reliable voters. A large number of participants and communicated messages should enable the design of a reliable and robust weighted voting scheme. Ideally, the resulting weighted vote should provide a generally acceptable emotional feedback for network participants and could be used to indicate positive or controversial news in a suitably chosen quantitative way. The optimization algorithm has to be simple, transparent and intuitive to make the weighted vote widely acceptable as a general evaluation tool.
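The iterative reweighting idea can be simulated in a few lines. The sketch below is an illustration of the general mechanism under assumed settings (30 voters, 200 messages, a 90%-reliable subgroup), not the paper's exact update rule: each voter's weight is set to their agreement rate with the current weighted majority, and the loop is repeated until the weights stabilise.

```python
import random

random.seed(0)
n_voters, n_msgs = 30, 200
truth = [random.choice([-1, 1]) for _ in range(n_msgs)]

def vote(v, t):
    # Assumed population: first 20 voters are 90% reliable, rest random.
    if v < 20:
        return t if random.random() < 0.9 else -t
    return random.choice([-1, 1])

votes = [[vote(v, t) for t in truth] for v in range(n_voters)]
weights = [1.0] * n_voters

for _ in range(10):  # iterate the unsupervised reweighting
    agree = [0.0] * n_voters
    for m in range(n_msgs):
        s = sum(weights[v] * votes[v][m] for v in range(n_voters))
        majority = 1 if s >= 0 else -1
        for v in range(n_voters):
            agree[v] += votes[v][m] == majority
    # New weight = fraction of agreement with the weighted majority.
    weights = [agree[v] / n_msgs for v in range(n_voters)]

reliable_w = sum(weights[:20]) / 20
unreliable_w = sum(weights[20:]) / 10
```

In this simulation the reliable subgroup ends up with clearly higher weights than the random voters, mirroring the paper's finding that the weights nearly exactly identify reliable members.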
Keywords:
weighted voting; unsupervised optimization
Fulltext is available at external website.
Research Report: Influence of Vehicle Assistant System on Track Keeping
Nedoma, P.; Herda, Z.; Plíhal, Jiří
2021 - English
The presented results describe different methods for evaluating car stability in the lateral direction. Due to the significant differences between the tests, a uniform methodology for distinguishing drives with and without ESC could not be determined; instead, two different methods were proposed, one for driving on a circle and one for the VDA test. As the evaluation criterion of vehicle stability with respect to the base measured quantities, a model with weight functions was used.
Keywords:
Electronic stability control; Vehicle Assistant System; Vehicle stability
Fulltext is available at external website.
Ockham's Razor from a Fully Probabilistic Design Perspective
Hoffmann, A.; Quinn, Anthony
2021 - English
This research report investigates an approach to the design of an Ockham prior penalising parametric complexity in the Hierarchical Fully Probabilistic Design (HFPD) [1] setting. We identify a term which penalises the introduction of an additional parameter in the Wold decomposition. We also derive the objective Ockham Parameter Prior (OPI) in this context, based on earlier work [2], and we show that the two are, in fact, closely related. This confers validity on the HFPD Ockham term.
Keywords:
Ockham’s Razor; Hierarchical Fully Probabilistic Design; Parametric Inference; Fully Probabilistic Design
Fulltext is available at external website.
Institutions, Financial Development, and Small Business Survival: Evidence from European Emerging Economies
Iwasaki, I.; Kočenda, Evžen; Shida, Y.
2020 - English
In this paper, we traced the survival status of 94,401 small businesses in 17 European emerging markets from 2007 to 2017 and empirically examined the determinants of their survival, focusing on institutional quality and financial development. We found that institutional quality and the level of financial development exhibit statistically significant and economically meaningful impacts on the survival probability of the SMEs under study. The evidence holds even when we control for a set of firm-level characteristics such as ownership structure, financial performance, firm size, and age. The findings are also uniform across industries and country groups and robust to different assumptions about the hazard distribution.
Keywords:
small business; survival analysis; European emerging markets
Fulltext is available at external website.
Financial Crime and Punishment: A Meta-Analysis
de Batz, L.; Kočenda, Evžen
2020 - English
We examine how the publication of intentional financial crimes committed by listed firms is interpreted by financial markets, using a systematic and quantitative review of existing empirical studies. Specifically, we conduct a meta-regression analysis and investigate the extent and nature of the impact that the publication of financial misconducts exerts on stock returns. We survey 111 studies, published between 1978 and 2020, with a total of 439 estimates from event studies. Our key finding is that the average abnormal returns calculated from this empirical literature are affected by a negative publication selection bias. Still, after controlling for this bias, our meta-analysis indicates that publications of financial crimes are followed by statistically significant negative abnormal returns, which suggests the existence of an informational effect. Finally, the MRA results demonstrate that crimes committed in common law countries, alleged crimes, and accounting crimes carry particularly weighty information for market participants. The results call for more transparency on the side of enforcers throughout enforcement procedures, to foster timely and proportionate market reactions and support efficient markets.
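The estimates this meta-analysis pools come from event studies, whose basic building block can be shown in a few lines. The sketch below computes a market-model abnormal return for a single announcement day; all numbers are hypothetical and the market-model form is the standard textbook one, not a claim about any particular surveyed study's specification.

```python
# Market model estimated over a pre-event window (hypothetical values):
# expected firm return is R_i = alpha + beta * R_m.
alpha, beta = 0.0002, 1.1

# Hypothetical announcement-day returns.
market_return_event = -0.004   # market return on the announcement day
firm_return_event = -0.031     # firm return on the announcement day

expected = alpha + beta * market_return_event
abnormal_return = firm_return_event - expected  # negative => adverse reaction
```

Event studies then cumulate such abnormal returns over an event window and average across firms; those averages are the estimates entering the meta-regression.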
Keywords:
Meta-Analysis; Event study; Financial Misconduct; Fraud; Financial Markets; Returns; Listed Companies; Information and Market Efficiency
Fulltext is available at external website.