This article proposes a fundamental methodological shift in the modelling of policy interventions for sustainability transitions in order to account for complexity (e.g. self-reinforcing mechanisms arising from multi-agent interactions) and agent heterogeneity (e.g. differences in consumer and investment behaviour). We first characterise the uncertainty faced by climate policy-makers and its implications for investment decision-makers. We then identify five shortcomings in the equilibrium and optimisation-based approaches most frequently used to inform sustainability policy: (i) their normative, optimisation-based nature, (ii) their unrealistic reliance on the full rationality of agents, (iii) their inability to account for mutual influences among agents and capture related self-reinforcing (positive feedback) processes, (iv) their inability to represent multiple solutions and path-dependency, and (v) their inability to properly account for agent heterogeneity. The aim of this article is to introduce an alternative modelling approach based on complexity dynamics and agent heterogeneity, and explore its use in four key areas of sustainability policy, namely (1) technology adoption and diffusion, (2) macroeconomic impacts of low-carbon policies, (3) interactions between the socio-economic system and the natural environment, and (4) the anticipation of policy outcomes. The practical relevance of the proposed methodology is discussed by reference to four applications: the diffusion of transport technology, the impact of low-carbon investment on income and employment, the management of cascading uncertainties, and the cross-sectoral impact of biofuels policies. The article calls for a fundamental methodological shift aligning the modelling of the socio-economic system with that of the climatic system, for a combined and realistic understanding of the impact of sustainability policies.
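To make the contrast with equilibrium approaches concrete, the following minimal sketch (a generic illustration, not the article's model; all parameters such as the subsidy level and imitation strength are hypothetical) shows how heterogeneous adoption thresholds and social influence generate a self-reinforcing diffusion path that settles at a policy-dependent adoption share.

```python
# Minimal agent-based sketch (not the article's model): heterogeneous agents
# adopt a low-carbon technology under social influence; positive feedback
# raises adoption until it locks in at a level that depends on the subsidy.
import numpy as np

rng = np.random.default_rng(0)
n_agents, n_steps = 1000, 50
threshold = rng.uniform(0.0, 1.0, n_agents)   # heterogeneous adoption thresholds
adopted = rng.random(n_agents) < 0.02         # a few early adopters
subsidy = 0.10                                # hypothetical policy lever

shares = []
for t in range(n_steps):
    peer_share = adopted.mean()               # positive feedback via imitation
    propensity = 0.6 * peer_share + subsidy
    adopted |= (propensity > threshold)
    shares.append(adopted.mean())

print([round(s, 2) for s in shares[::10]])    # adoption share over time
```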
The structure of the International Trade Network (ITN), whose nodes and links represent world countries and their trade relations respectively, affects key economic processes worldwide, including globalization, economic integration, industrial production, and the propagation of shocks and instabilities. Characterizing the ITN via a simple yet accurate model is an open problem. The traditional Gravity Model (GM) successfully reproduces the volume of trade between connected countries, using macroeconomic properties such as GDP, geographic distance, and possibly other factors. However, it predicts a network with complete or homogeneous topology, thus failing to reproduce the highly heterogeneous structure of the ITN. On the other hand, recent maximum-entropy network models successfully reproduce the complex topology of the ITN, but provide no information about trade volumes. Here we integrate these two currently incompatible approaches via the introduction of an Enhanced Gravity Model (EGM) of trade. The EGM is the simplest model combining the GM with the network approach within a maximum-entropy framework. Via a unified and principled mechanism that is transparent enough to be generalized to any economic network, the EGM provides a new econometric framework wherein trade probabilities and trade volumes can be separately controlled by any combination of dyadic and country-specific macroeconomic variables. The model successfully reproduces both the global topology and the local link weights of the ITN, parsimoniously reconciling the conflicting approaches. It also indicates that the probability that any two countries trade a certain volume should follow a geometric or exponential distribution with an additional point mass at zero volume.
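The two-part structure described in the abstract can be sketched as follows (a toy simulation under assumed parameter values, not the authors' estimator): a Bernoulli step controls whether a trade link exists, and a conditional volume is drawn from an exponential distribution whose mean follows the classical gravity prediction, yielding a point mass at zero volume.

```python
# Illustrative sketch of the EGM's two-part structure (hypothetical parameters):
# a link-probability step driven by dyadic covariates, plus an exponential
# conditional trade volume, with a point mass at zero when no link forms.
import numpy as np

rng = np.random.default_rng(1)

def simulate_trade(gdp_i, gdp_j, dist_ij, a=1.0, b=-1.0, intercept=-45.0, c=1e-9):
    """All parameters are hypothetical placeholders, not estimated values."""
    score = a * np.log(gdp_i * gdp_j) + b * np.log(dist_ij) + intercept
    p_link = 1.0 / (1.0 + np.exp(-score))       # probability the trade link exists
    if rng.random() > p_link:
        return 0.0                              # point mass at zero trade volume
    mean_volume = c * gdp_i * gdp_j / dist_ij   # classical gravity expectation
    return rng.exponential(mean_volume)         # exponential conditional volume

print(simulate_trade(gdp_i=2.0e12, gdp_j=5.0e11, dist_ij=7000.0))
```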
It is often reported in the forecast combination literature that a simple average of candidate forecasts is more robust than sophisticated combining methods. This phenomenon is usually referred to as the forecast combination puzzle. Motivated by this puzzle, we explore its possible explanations, including estimation error, invalid weighting formulas, and model screening. We show that the existing understanding of the puzzle should be complemented by the distinction between different forecast combination scenarios, known as combining for adaptation and combining for improvement. Applying combining methods without consideration of the underlying scenario can itself cause the puzzle. Based on these new understandings, both simulations and real-data evaluations are conducted to illustrate the causes of the puzzle. We further propose a multi-level AFTER strategy that can integrate the strengths of different combining methods and adapt intelligently to the underlying scenario. In particular, by treating the simple average as a candidate forecast, the proposed strategy is shown to avoid the heavy cost of estimation error and, to a large extent, solve the forecast combination puzzle.
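The estimation-error explanation of the puzzle can be seen in a toy setting (this is not the paper's AFTER algorithm; the data, sample sizes, and weighting method are illustrative assumptions): when candidate forecasts are similarly accurate, weights estimated from a short history typically do worse out of sample than the simple average.

```python
# Toy illustration of the combination puzzle: three equally good forecasters,
# weights estimated by least squares on a short training window, compared
# against the simple average on a hold-out sample.
import numpy as np

rng = np.random.default_rng(2)
T = 200
truth = rng.normal(size=T)
forecasts = truth[:, None] + rng.normal(scale=1.0, size=(T, 3))  # 3 similar forecasters

train, test = slice(0, 15), slice(15, T)
w, *_ = np.linalg.lstsq(forecasts[train], truth[train], rcond=None)  # estimated weights

mse_avg = np.mean((forecasts[test].mean(axis=1) - truth[test]) ** 2)
mse_est = np.mean((forecasts[test] @ w - truth[test]) ** 2)
print(f"simple average MSE: {mse_avg:.3f}, estimated-weight MSE: {mse_est:.3f}")
```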
119 - Luca Amendola 2014
The number of Italian firms as a function of the number of workers is well approximated by an inverse power law up to 15 workers but shows a clear downward deflection beyond this point, both when using old pre-1999 data and when using recent (2014) data. This phenomenon could be associated with employment protection legislation which applies to companies with more than 15 workers (the Statuto dei Lavoratori). The deflection disappears for agriculture firms, for which the protection legislation applies already above 5 workers. In this note it is estimated that a correction of this deflection could bring an increase from 3.9% to 5.8% in new jobs in firms with a workforce between 5 and 25 workers.
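The deflection can be illustrated schematically as follows (synthetic counts, not the Italian firm data; the exponents and threshold effect are made up for illustration): fitting a log-log slope below and above the 15-worker threshold and comparing the two exponents reveals the steeper decay above the threshold.

```python
# Schematic check with synthetic firm-size counts (not the real data): compare
# power-law exponents fitted below and above the 15-worker threshold.
import numpy as np

rng = np.random.default_rng(3)
workers = np.arange(2, 60)
counts = 1e6 * workers ** -2.0                                   # pure power law...
counts[workers > 15] *= (workers[workers > 15] / 15.0) ** -0.8   # ...with a deflection
counts = rng.poisson(counts)

def log_slope(x, y):
    """OLS slope of log(count) on log(size), i.e. the power-law exponent."""
    return np.polyfit(np.log(x), np.log(y), 1)[0]

below = workers <= 15
print("exponent below 15:", round(log_slope(workers[below], counts[below]), 2))
print("exponent above 15:", round(log_slope(workers[~below], counts[~below]), 2))
```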
We show how every stock-flow consistent model of the macroeconomy can be represented as a directed acyclic graph. The advantages of representing the model in this way include graphical clarity, causal inference, and model specification. We provide many examples implemented with a new software package.
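A minimal sketch of the idea (using networkx, not the authors' software package; the toy model and variable names are illustrative) is to encode which variables determine which in a small stock-flow consistent model as directed edges, verify acyclicity, and read off a solution order from a topological sort.

```python
# Minimal sketch: represent the causal ordering of a toy stock-flow consistent
# model as a DAG and use a topological sort as the order in which to solve the
# equations. Simultaneity is assumed to be resolved via lags, keeping it acyclic.
import networkx as nx

g = nx.DiGraph()
# edges point from determinants to the variable they determine
g.add_edges_from([
    ("government_spending", "income"),
    ("income", "taxes"),
    ("income", "consumption"),
    ("taxes", "disposable_income"),
    ("disposable_income", "consumption"),
    ("consumption", "household_wealth"),
])

assert nx.is_directed_acyclic_graph(g)
print(list(nx.topological_sort(g)))   # a valid order for solving the equations
```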
92 - Inga Ivanova, Oivind Strand, 2014
The knowledge base of an economy measured in terms of Triple Helix (TH) relations can be analyzed in terms of mutual information among geographical, sectoral, and size distributions of firms as dimensions of the probabilistic entropy. The resulting synergy values of a TH system provide static snapshots. In this study, we add the time dimension and analyze the synergy dynamics using the Norwegian innovation system as an example. The synergy among the three dimensions can be mapped as a set of partial time series and spectrally analyzed. The results suggest that the synergy at the level of both the country and its 19 counties shows non-chaotic oscillatory behavior and resonates in a set of natural frequencies. That is, synergy surges and drops are non-random and can be analyzed and predicted. There is a proportional dependence between the amplitudes of the oscillations and the synergy values, and an inversely proportional dependence between the relative contributions of the oscillation frequencies and the synergy values. This analysis of the data informs us that one can expect frequency-related synergy-volatility growth in relation to the synergy value, and a shift of the synergy volatility towards long-term fluctuations as synergy grows.
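The spectral step can be sketched as follows (with a synthetic synergy series, not the Norwegian data; the trend, periods, and noise level are assumptions): detrend the series and inspect the power spectrum for dominant oscillation frequencies.

```python
# Illustrative spectral analysis of a hypothetical synergy time series:
# detrend, take the FFT, and report the dominant oscillation period.
import numpy as np

rng = np.random.default_rng(4)
years = np.arange(1995, 2015)
synergy = (-0.05 * (years - 1995)
           + 0.3 * np.sin(2 * np.pi * (years - 1995) / 4.0)
           + 0.1 * np.sin(2 * np.pi * (years - 1995) / 8.0)
           + 0.05 * rng.normal(size=years.size))

detrended = synergy - np.polyval(np.polyfit(years, synergy, 1), years)
spectrum = np.abs(np.fft.rfft(detrended)) ** 2
freqs = np.fft.rfftfreq(years.size, d=1.0)        # cycles per year

dominant = freqs[np.argmax(spectrum[1:]) + 1]     # skip the zero-frequency bin
print(f"dominant period: {1.0 / dominant:.1f} years")
```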
158 - Hao Meng 2014
We present the symmetric thermal optimal path (TOPS) method to determine the time-dependent lead-lag relationship between two stochastic time series. This novel version of the previously introduced TOP method alleviates some inconsistencies by imposing that the lead-lag relationship should be invariant with respect to a time reversal of the time series after a change of sign. This means that, if '$X$ comes before $Y$', this transforms into '$Y$ comes before $X$' under a time reversal. We show that the previously proposed bootstrap test lacks power and too often fails to reject the null hypothesis that there is no lead-lag correlation when one is present. We introduce instead two novel tests. The first, the free-energy p-value $\rho$ criterion, quantifies the probability that a given lead-lag structure could be obtained from random time series with similar characteristics except for the lead-lag information. The second, a self-consistent test, embodies the idea that, for the lead-lag path to be significant, synchronizing the two time series using the time-varying lead-lag path should lead to a statistically significant correlation. We perform intensive synthetic tests to demonstrate their performance and limitations. Finally, we apply the TOPS method with the two new tests to the time-dependent lead-lag structures of house prices and monetary policy of the United Kingdom (UK) and United States (US) from 1991 to 2011. The TOPS approach stresses the importance of accounting for changes of regime, so that similar pieces of information or policies may have drastically different impacts and developments, conditional on the economic, financial and geopolitical conditions. This study reinforces the view that the hypothesis of statistical stationarity is highly questionable.
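As a simplified stand-in for the idea (this is not the TOPS algorithm, which estimates a time-varying lead-lag path; the synthetic series and the constant true lag are assumptions), one can recover a fixed lead-lag between two series by maximising the lagged correlation of their increments.

```python
# Simplified constant-lag illustration (not TOPS): estimate the lag at which
# one synthetic series follows another by maximising lagged correlation of
# the differenced series.
import numpy as np

rng = np.random.default_rng(5)
n, true_lag = 500, 3
x = rng.normal(size=n).cumsum()                       # e.g. a policy proxy
y = np.roll(x, true_lag) + 0.5 * rng.normal(size=n)   # y follows x with a 3-step lag

def best_lag(x, y, max_lag=10):
    """Lag of y behind x that maximises correlation of the differenced series."""
    dx, dy = np.diff(x), np.diff(y)
    corrs = {lag: np.corrcoef(dx[:len(dx) - lag], dy[lag:])[0, 1]
             for lag in range(1, max_lag + 1)}
    return max(corrs, key=corrs.get)

print("estimated lag:", best_lag(x, y))               # expect roughly 3
```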
178 - Dietrich Stauffer 2014
Capital usually leads to income, and income is more accurately and easily measured. Thus we summarize income distributions in the USA, Germany, and other countries.