The long-lasting socio-economic impact of the global financial crisis has called into question the adequacy of traditional tools in explaining periods of financial distress, as well as the adequacy of the existing policy response. In particular, the effect of complex interconnections among financial institutions on financial stability has been widely recognized. A recent debate has focused on the effects of unconventional policies aimed at achieving both price and financial stability. In particular, Quantitative Easing (QE, i.e., the large-scale asset purchase programme conducted by a central bank upon the creation of new money) has recently been implemented by the European Central Bank (ECB). In this context, two questions deserve more attention in the literature. First, to what extent QE, by injecting liquidity, may alter the level of bank-firm lending and stimulate the real economy. Second, to what extent QE may also alter the pattern of intra-financial exposures among financial actors (including banks, investment funds, insurance corporations, and pension funds), and what the implications are for financial stability. Here, we address these two questions by developing a methodology to map the macro-network of financial exposures among institutional sectors across financial instruments (e.g., equity, bonds, and loans), and we illustrate our approach on recently available data (i.e., data on loans and on the private and public securities purchased within the QE). We then test the effect of the implementation of the ECB's QE on the time evolution of the financial linkages in the macro-network of the euro area, as well as its effect on macroeconomic variables such as output and prices.
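As a rough illustration of the kind of macro-network described above, the Python sketch below assembles one weighted exposure matrix per financial instrument and aggregates them into sector-level exposures. The sector labels, instrument list, and exposure figures are hypothetical placeholders, not the paper's data or method.

    # Illustrative sketch only: a multilayer "macro-network" of cross-sector
    # exposures, one weighted adjacency matrix per financial instrument.
    # Sector labels and figures are hypothetical.
    import numpy as np

    sectors = ["Banks", "InvFunds", "Insurance", "Pensions", "Firms", "Gov"]
    instruments = ["loans", "bonds", "equity"]
    idx = {s: i for i, s in enumerate(sectors)}

    # One layer per instrument; entry [i, j] is the exposure of sector i to sector j.
    macro_net = {inst: np.zeros((len(sectors), len(sectors))) for inst in instruments}

    # Example (hypothetical) who-to-whom exposures, in EUR bn.
    macro_net["loans"][idx["Banks"], idx["Firms"]] = 450.0
    macro_net["bonds"][idx["InvFunds"], idx["Gov"]] = 300.0
    macro_net["bonds"][idx["Banks"], idx["Gov"]] = 220.0
    macro_net["equity"][idx["Pensions"], idx["InvFunds"]] = 90.0

    # Aggregate across instruments and compute each sector's total exposure (out-strength).
    total = sum(macro_net.values())
    out_strength = dict(zip(sectors, total.sum(axis=1)))
    print(out_strength)

Tracking such layer-by-layer matrices over time is one simple way to observe how a liquidity injection reshapes individual links without losing the instrument breakdown.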
The choice of appropriate measures of deprivation, and of the identification and aggregation of poverty, has been a challenge for many years. The works of Sen, Atkinson and others have been the cornerstone of most of the literature on poverty measurement. Recent contributions have focused on what is now known as multidimensional poverty measurement. Current aggregation and identification measures for multidimensional poverty make the implicit assumption that dimensions are independent of each other, thus ignoring the natural dependence between them. In this article, a variant of the usual method of deprivation measurement is presented. It allows for the aforementioned connections by drawing on geometric and network notions. This new methodology relies on previous identification and aggregation methods, but with small modifications to prevent arbitrary manipulations. It is also proved that this measure still complies with the axiomatic framework of its predecessor. Moreover, the general form of the latter can be considered a particular case of this new measure, although this identification is not unique.
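For context, the sketch below illustrates the standard dual-cutoff counting approach (identification followed by aggregation), which is the kind of predecessor method the new measure builds on; the achievement matrix X, the weights w, and the cutoffs z and k are purely illustrative and do not reproduce the article's measure.

    # Minimal sketch of a dual-cutoff counting measure (Alkire-Foster style).
    # All data, weights and cutoffs are illustrative.
    import numpy as np

    X = np.array([              # achievement matrix: rows = individuals, cols = dimensions
        [0.3, 1.0, 0.2],
        [0.9, 0.8, 0.7],
        [0.1, 0.2, 0.4],
    ])
    z = np.array([0.5, 0.5, 0.5])    # deprivation cutoff per dimension
    w = np.array([1/3, 1/3, 1/3])    # dimension weights (sum to 1)
    k = 0.34                         # poverty cutoff on the weighted deprivation score

    g0 = (X < z).astype(float)       # deprivation matrix
    scores = g0 @ w                  # weighted deprivation count per individual
    poor = scores >= k               # identification step (second cutoff)
    M0 = (scores * poor).mean()      # aggregation: adjusted headcount ratio
    print(poor, M0)

The independence assumption criticized above enters through the purely additive score g0 @ w, which treats each dimension's contribution as separable from the others.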
While researchers increasingly use deep neural networks (DNN) to analyze individual choices, overfitting and interpretability issues remain obstacles in theory and practice. Using statistical learning theory, this study presents a framework to examine the tradeoff between estimation and approximation errors, and between prediction and interpretation losses. It operationalizes DNN interpretability in choice analysis by formulating the metric of interpretation loss as the difference between the true and the estimated choice probability functions. This study also uses statistical learning theory to upper bound the estimation error of both prediction and interpretation losses in DNN, shedding light on why DNN does not suffer from the overfitting issue. Three scenarios are then simulated to compare DNN with the binary logit model (BNL). We find that DNN outperforms BNL in terms of both prediction and interpretation for most scenarios, and that a larger sample size unleashes the predictive power of DNN but not of BNL. DNN is also used to analyze the choice of trip purposes and travel modes based on the National Household Travel Survey 2017 (NHTS2017) dataset. These experiments indicate that DNN can be used for choice analysis beyond the current practice of demand forecasting because it has an inherent utility interpretation, the flexibility to accommodate various information formats, and the power to learn the utility specification automatically. DNN is both more predictive and more interpretable than BNL unless the modelers have complete knowledge about the choice task and the sample size is small. Overall, statistical learning theory can be a foundation for future studies in the non-asymptotic data regime or using high-dimensional statistical models in choice analysis, and the experiments show the feasibility and effectiveness of DNN for its wide applications to policy and behavioral analysis.
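One possible way to operationalize the interpretation loss described above, on simulated binary-choice data, is sketched below; the data-generating process, the use of a mean absolute gap, and the model settings (scikit-learn LogisticRegression as the logit benchmark, a small MLPClassifier as the DNN) are assumptions for illustration, not the study's actual setup.

    # Hedged sketch: interpretation loss as the gap between the true and the
    # estimated choice probability functions, on a simulated binary choice.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 2))                                   # two attributes (e.g., cost, time)
    true_p = 1 / (1 + np.exp(-(1.5 * X[:, 0] - X[:, 1] ** 2)))       # nonlinear "true" choice probability
    y = rng.binomial(1, true_p)

    bnl = LogisticRegression().fit(X, y)                             # binary logit benchmark
    dnn = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=2000,
                        random_state=0).fit(X, y)                    # small DNN

    def interpretation_loss(model):
        # mean absolute gap between true and estimated choice probabilities
        return np.mean(np.abs(model.predict_proba(X)[:, 1] - true_p))

    print("BNL interpretation loss:", interpretation_loss(bnl))
    print("DNN interpretation loss:", interpretation_loss(dnn))

In this toy setting the logit's linear-in-parameters utility cannot capture the quadratic term, so its interpretation loss stays large regardless of sample size, which is the intuition behind the approximation-error side of the tradeoff.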
Rules of origin (ROO) are a pivotal element of the Greater Arab Free Trade Area (GAFTA). ROO are established essentially to ensure that only eligible products receive preferential tariff treatment. Taking into consideration the profound implications of ROO for enhancing trade flows and facilitating the success of regional integration, this article sheds light on the way ROO in GAFTA are designed and implemented. Moreover, the article examines the extent to which ROO still represent an obstacle to the full implementation of GAFTA. In addition, the article provides ways to overcome the most important shortcomings of the ROO text in the agreement, ultimately offering possible solutions to those issues.
We provide quantitative predictions of first-order supply and demand shocks for the U.S. economy associated with the COVID-19 pandemic at the level of individual occupations and industries. To analyze the supply shock, we classify industries as essential or non-essential and construct a Remote Labor Index, which measures the ability of different occupations to work from home. Demand shocks are based on a study of the likely effect of a severe influenza epidemic developed by the US Congressional Budget Office. Compared to the pre-COVID period, these shocks would threaten around 22% of the US economy's GDP, jeopardise 24% of jobs and reduce total wage income by 17%. At the industry level, sectors such as transport are likely to have output constrained by demand shocks, while sectors relating to manufacturing, mining and services are more likely to be constrained by supply shocks. Entertainment, restaurants and tourism face large supply and demand shocks. At the occupation level, we show that high-wage occupations are relatively immune from adverse supply- and demand-side shocks, while low-wage occupations are much more vulnerable. We should emphasize that our results capture only first-order shocks -- we expect them to be substantially amplified by feedback effects in the production network.
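A stylized sketch of the kind of accounting involved is given below: an employment-weighted Remote Labor Index for one industry and the resulting first-order supply shock if that industry is classified as non-essential. The occupation shares, remote-work scores, and the essential/non-essential flag are hypothetical and do not reproduce the paper's data.

    # Illustrative sketch: Remote Labor Index and a first-order supply shock
    # for a single (hypothetical) non-essential industry.
    occupations = {
        # occupation: (employment share in industry, ability to work remotely 0..1)
        "Software developers": (0.30, 1.0),
        "Food servers":        (0.50, 0.0),
        "Managers":            (0.20, 0.8),
    }

    def remote_labor_index(occs):
        # employment-weighted share of work that can be done from home
        return sum(share * remote for share, remote in occs.values())

    rli = remote_labor_index(occupations)
    essential = False                      # industry classification (illustrative)
    # If the industry is non-essential, only the remote-able share of work continues.
    supply_shock = 0.0 if essential else 1.0 - rli
    print(f"RLI = {rli:.2f}, first-order supply shock = {supply_shock:.2f}")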
The transition to a low-carbon economy is one of the ambitions of the European Union for 2030. Biobased industries play an essential role in this transition. However, there has been an ongoing discussion about the actual benefit of using biomass to produce biobased products, specifically the use of agricultural materials (e.g., corn and sugarcane). This paper presents the environmental impact assessment of 30% and 100% biobased PET (polyethylene terephthalate) production using EU biomass supply chains (e.g., sugar beet, wheat, and Miscanthus). An integrated assessment combining the life cycle assessment methodology with global sensitivity assessment is presented as an early-stage support tool to propose and select supply chains that improve the environmental performance of biobased PET production. The results indicate that Miscanthus is the best option for the production of biobased PET: it promotes EU local supply chains, reduces greenhouse gas (GHG) emissions (from both the process and land-use change), and generates lower impacts in midpoint categories related to resource depletion, ecosystem quality, and human health. This tool can help improve the environmental performance of processes that could boost the shift to a low-carbon economy.
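For illustration, the sketch below shows a simple variance-based (first-order Sobol) sensitivity calculation of the sort used in global sensitivity assessment; the toy ghg_model function, the three uncertain parameters, and their ranges are assumptions, not the paper's LCA model.

    # Minimal numpy sketch of a variance-based (first-order Sobol) sensitivity
    # analysis over a toy emissions model with three uncertain LCA parameters.
    import numpy as np

    rng = np.random.default_rng(1)
    N, k = 20000, 3                      # Monte Carlo samples, number of parameters

    def ghg_model(x):
        # hypothetical emissions model: crop yield factor, transport distance, energy use
        yield_f, transport, energy = x[:, 0], x[:, 1], x[:, 2]
        return 2.0 / yield_f + 0.05 * transport + 0.8 * energy

    A = rng.uniform([0.5, 10, 0.5], [1.5, 200, 2.0], size=(N, k))
    B = rng.uniform([0.5, 10, 0.5], [1.5, 200, 2.0], size=(N, k))
    fA, fB = ghg_model(A), ghg_model(B)
    var = np.var(np.concatenate([fA, fB]))

    for i, name in enumerate(["crop yield", "transport", "energy"]):
        ABi = A.copy()
        ABi[:, i] = B[:, i]
        S_i = np.mean(fB * (ghg_model(ABi) - fA)) / var   # Saltelli (2010) first-order estimator
        print(f"first-order Sobol index, {name}: {S_i:.2f}")

Ranking parameters by such indices at an early design stage indicates which supply-chain choices dominate the uncertainty in the environmental score and therefore deserve the most attention.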