Coauthor
  • GAFFARD Jean-Luc (3)
  • NESTA Lionel (2)
  • GUILLOU Sarah (2)
  • SALIES Evens (2)
Document Type
  • Working paper (9)
  • Article (7)
  • Conference contribution (2)
  • Book (1)
Publication date: 2017-07. Collection: University of Kent School of Economics Discussion Papers: 17/12
VAN DER HOOG Sander
Despite recent advances in bringing agent-based models (ABMs) to the data, the estimation or calibration of model parameters remains a challenge, especially for large-scale agent-based macroeconomic models. Most methods, such as the method of simulated moments (MSM), require in-the-loop simulation of new data, which may not be feasible for such computationally heavy simulation models. The purpose of this paper is to provide a proof-of-concept of a generic empirical validation methodology for such large-scale simulation models. We introduce an alternative ‘large-scale’ empirical validation approach and apply it to the Eurace@Unibi macroeconomic simulation model (Dawid et al., 2016). This model was selected because it displays strong emergent behaviour and is able to generate a wide variety of nonlinear economic dynamics, including endogenous business and financial cycles. In addition, it is a computationally heavy simulation model, so it fits our targeted use case. The validation protocol consists of three stages. At the first stage we use Nearly-Orthogonal Latin Hypercube (NOLH) sampling to generate a set of 513 parameter combinations with good space-filling properties. At the second stage we use the recently developed Markov Information Criterion (MIC) to score the simulated data against empirical data. Finally, at the third stage we use stochastic kriging to construct a surrogate model of the MIC response surface, resulting in an interpolation of the response surface as a function of the parameters. The parameter combinations providing the best fit to the data are then identified as the local minima of the interpolated MIC response surface. The Model Confidence Set (MCS) procedure of Hansen et al. (2011) is used to restrict the set of model calibrations to those that cannot be rejected as having equal predictive ability at a given confidence level. The surrogate model is validated by re-running the second stage of the analysis on the optima identified in this way and cross-checking that the realised MIC scores equal the MIC scores predicted by the surrogate model. The results obtained so far look promising as a first proof-of-concept for the empirical validation methodology: we are able to validate the model using empirical data series for 30 OECD countries and the euro area. The internal validation of the surrogate model also suggests that the combination of NOLH sampling, MIC measurement and stochastic kriging yields reliable predictions of the MIC scores for samples not included in the original NOLH sample set. In our opinion, this is a strong indication that the proposed method could provide a viable statistical machine learning technique for the empirical validation of (large-scale) ABMs.
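
As a rough illustration of the three-stage protocol described above, the sketch below wires a toy simulator to off-the-shelf stand-ins: scipy's LatinHypercube sampler in place of the NOLH design, a placeholder scoring function in place of the MIC, and scikit-learn's Gaussian process regression as a simple form of stochastic kriging. It does not reproduce the Eurace@Unibi model or the actual MIC implementation; it only shows how the stages fit together.

    import numpy as np
    from scipy.stats import qmc
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern, WhiteKernel

    def score_against_data(theta):
        """Stage 2 stand-in: run the simulator at theta and score its output
        against empirical data (here just a noisy quadratic toy function)."""
        return float(np.sum((theta - 0.3) ** 2) + 0.01 * np.random.randn())

    # Stage 1: space-filling design over a 2-parameter unit cube
    # (the paper uses a 513-point NOLH design instead).
    sampler = qmc.LatinHypercube(d=2, seed=0)
    design = sampler.random(n=128)

    # Stage 2: score every design point once.
    scores = np.array([score_against_data(theta) for theta in design])

    # Stage 3: kriging surrogate of the score surface; the WhiteKernel term
    # absorbs simulation noise, mimicking the role of stochastic kriging.
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5) + WhiteKernel(),
                                  normalize_y=True).fit(design, scores)

    # Candidate optima: evaluate the surrogate densely and keep the best
    # predictions; these would then be re-simulated to validate the surrogate.
    candidates = sampler.random(n=4096)
    predicted = gp.predict(candidates)
    print(candidates[np.argsort(predicted)[:5]])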

The recent increase in the breadth of computational methodologies has been matched with a corresponding increase in the difficulty of comparing the relative explanatory power of models from different methodological lineages. In order to help address this problem, a Markovian information criterion (MIC) is developed that is analogous to the Akaike information criterion (AIC) in its theoretical derivation and yet can be applied to any model able to generate simulated or predicted data, regardless of its methodology. Both the AIC and the proposed MIC rely on the Kullback–Leibler (KL) distance between model predictions and real data as a measure of prediction accuracy. Instead of using the maximum likelihood approach of the AIC, the proposed MIC relies on the literal interpretation of the KL distance as the inefficiency of compressing real data using modelled probabilities, and therefore uses the output of a universal compression algorithm to obtain an estimate of the KL distance. Several Monte Carlo tests are carried out in order to (a) confirm the performance of the algorithm and (b) evaluate the ability of the MIC to identify the true data-generating process from a set of alternative models.
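
To make the compression interpretation concrete, the toy sketch below estimates the average code length of a 'real' binary series under transition probabilities learned from each candidate model's simulated output; the model closer to the true data-generating process yields the shorter code, i.e. the smaller estimated KL inefficiency. The actual MIC relies on a universal compression algorithm rather than the simple Laplace-smoothed Markov plug-in used here, and the series are invented for illustration.

    import numpy as np

    def markov_probs(sim, n_states, alpha=0.5):
        """Modelled transition probabilities estimated from simulated data
        (Laplace-smoothed so no transition has zero probability)."""
        counts = np.full((n_states, n_states), alpha)
        for a, b in zip(sim[:-1], sim[1:]):
            counts[a, b] += 1
        return counts / counts.sum(axis=1, keepdims=True)

    def code_length(real, probs):
        """Bits needed to encode the real series using the modelled probabilities."""
        return -sum(np.log2(probs[a, b]) for a, b in zip(real[:-1], real[1:]))

    rng = np.random.default_rng(0)
    real = (rng.random(5_000) < 0.7).astype(int)        # 'empirical' binary series
    sim_good = (rng.random(50_000) < 0.7).astype(int)   # candidate model 1
    sim_bad = (rng.random(50_000) < 0.4).astype(int)    # candidate model 2

    for name, sim in [("model 1", sim_good), ("model 2", sim_bad)]:
        bits = code_length(real, markov_probs(sim, 2)) / len(real)
        print(name, round(bits, 3), "bits per observation")
    # The model closer to the data-generating process compresses the real
    # series more efficiently, i.e. has the smaller estimated KL inefficiency.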

In Journal of Economic Dynamics and Control. Publication date: 2016-12
Observatoire français des conjonctures économiques
The present paper tests a new model comparison methodology by comparing multiple calibrations of three agent-based models of financial markets on the daily returns of 24 stock market indices and exchange rate series. The models chosen for this empirical application are the herding model of Gilli and Winker (2003), its asymmetric version by Alfarano et al. (2005) and the more recent model by Franke and Westerhoff (2011), which all trace a common lineage to the herding model introduced by Kirman (1993). In addition, standard ARCH processes are included for each financial series to provide a benchmark for the explanatory power of the models. The methodology provides a consistent and statistically significant ranking of the three models. More importantly, it also reveals that the best-performing model, that of Franke and Westerhoff, is generally not distinguishable from an ARCH-type process, suggesting that their explanatory power on the data is similar.
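
For reference, the sketch below simulates a plain ARCH(1) process of the kind used as the benchmark class and checks that it produces volatility clustering (autocorrelated squared returns alongside nearly uncorrelated returns); the parameter values are illustrative and are not those estimated in the paper.

    import numpy as np

    def simulate_arch1(n, omega=0.1, alpha=0.6, seed=0):
        """ARCH(1): conditional variance responds to the previous squared return."""
        rng = np.random.default_rng(seed)
        r = np.zeros(n)
        for t in range(1, n):
            sigma2 = omega + alpha * r[t - 1] ** 2
            r[t] = np.sqrt(sigma2) * rng.standard_normal()
        return r

    def acf1(x):
        """Simple lag-1 autocorrelation estimator."""
        x = x - x.mean()
        return float(x[:-1] @ x[1:] / (x @ x))

    r = simulate_arch1(10_000)
    # Volatility clustering: squared returns are autocorrelated even though
    # the returns themselves are close to uncorrelated.
    print("acf(r) =", round(acf1(r), 3), " acf(r^2) =", round(acf1(r ** 2), 3))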

In Revue de l'OFCE - Analyse et prévisions. Publication date: 2012-10
Maximum entropy predictions are made for the Kirman ant model as well as for the Abrams-Strogatz model of language competition, also known as the voter model. In both cases the maximum entropy methodology provides good predictions of the limiting distribution of states, as was already the case for the Schelling model of segregation. As an additional contribution, the analysis of the models reveals the key role played by relative entropy and by the model in controlling the time horizon of the prediction.
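
For readers unfamiliar with the first of these models, the sketch below simulates a simple version of the Kirman ant recruitment dynamics and tabulates the empirical distribution of the share of agents in one state, which is the limiting distribution the maximum-entropy method aims to predict without simulation; the parameter values and the exact updating rule are illustrative choices rather than the paper's specification.

    import numpy as np

    def simulate_ants(n=100, eps=0.01, delta=0.3, steps=200_000, seed=0):
        """One agent is drawn per step: it switches state spontaneously with
        probability eps, otherwise it meets a random agent and copies its
        state with probability delta if the two states differ."""
        rng = np.random.default_rng(seed)
        k = n // 2                              # number of agents in state 1
        shares = np.empty(steps)
        for t in range(steps):
            agent_in_1 = rng.random() < k / n   # state of the drawn agent
            other_in_1 = rng.random() < k / n   # state of the agent it meets
            if rng.random() < eps:              # spontaneous switch
                k += -1 if agent_in_1 else 1
            elif other_in_1 != agent_in_1 and rng.random() < delta:
                k += -1 if agent_in_1 else 1    # recruitment by the other agent
            shares[t] = k / n
        return shares

    hist, _ = np.histogram(simulate_ants(), bins=10, range=(0, 1), density=True)
    print(np.round(hist, 2))   # empirical limiting distribution of the share in state 1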

Publication date: 2011-03. Collection: Document de l'OFCE: 2011-04
An information-theoretic thought experiment is developed to provide a methodology for predicting endowment distributions in the absence of information on agent preferences. The allocation problem is first presented as a stylised knapsack problem. Although this knapsack allocation is intractable, the social planner can nevertheless make precise predictions concerning the endowment distribution by using its information-theoretic structure. By construction, these predictions do not rest on the rationality of agents. It is also shown, however, that the knapsack problem is equivalent to a congestion game under weak assumptions, which means that the planner can still evaluate the optimality of the unobserved allocation.
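
The sketch below illustrates the general maximum-entropy logic such predictions rely on, not the paper's specific knapsack construction: knowing only an aggregate (mean endowment) constraint, the least-biased prediction is the Gibbs (exponential-family) distribution, with the multiplier pinned down numerically by the constraint. The endowment levels and the average are hypothetical.

    import numpy as np
    from scipy.optimize import brentq

    levels = np.arange(0, 51)    # hypothetical discrete endowment levels
    w_bar = 12.0                 # hypothetical observed mean endowment

    def mean_given_beta(beta):
        """Mean endowment implied by the Gibbs distribution exp(-beta * w)."""
        p = np.exp(-beta * levels)
        p /= p.sum()
        return p @ levels

    # Pin down the Lagrange multiplier so the predicted mean matches w_bar.
    beta = brentq(lambda b: mean_given_beta(b) - w_bar, 1e-6, 10.0)
    p = np.exp(-beta * levels)
    p /= p.sum()
    print("multiplier:", round(beta, 4), " P(w=0..4):", np.round(p[:5], 3))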

Publication date: 2011-03. Collection: Document de travail de l'OFCE: 2011-05
The maximum entropy methodology is applied to the Schelling model of urban segregation in order to obtain a reliable prediction of the stable configuration of the system without resorting to numerical simulations. We show that this approach also provides an implicit equation describing the distribution of agents over a city which allows for directly assessing the effect of model parameters on the solution. Finally, we discuss the information theoretic motivation for applying this methodology to the Schelling model, and show that it effectively rests on the presence of a potential function, suggesting a broader applicability of the methodology.
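
For context, the sketch below runs a minimal Schelling-type relocation dynamic on a grid; the long-run configuration it converges to is the kind of object the maximum-entropy approach predicts without resorting to such simulations. Grid size, tolerance threshold and the random-relocation rule are illustrative choices, not the paper's exact specification.

    import numpy as np

    rng = np.random.default_rng(0)
    size, tolerance = 30, 0.5
    grid = rng.choice([0, 1, 2], size=(size, size), p=[0.1, 0.45, 0.45])  # 0 = empty cell

    def share_alike(grid, i, j):
        """Share of occupied Moore neighbours with the same type as the agent at (i, j)."""
        nbrs = grid[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
        occupied = np.count_nonzero(nbrs) - 1
        same = np.count_nonzero(nbrs == grid[i, j]) - 1
        return same / occupied if occupied else 1.0

    for _ in range(100):                                   # relocation rounds
        movers = [(i, j) for i, j in zip(*np.nonzero(grid))
                  if share_alike(grid, i, j) < tolerance]
        if not movers:
            break
        for i, j in movers:                                # move to a random empty cell
            empties = np.argwhere(grid == 0)
            ei, ej = empties[rng.integers(len(empties))]
            grid[ei, ej], grid[i, j] = grid[i, j], 0

    happy = np.mean([share_alike(grid, i, j) for i, j in zip(*np.nonzero(grid))])
    print("average share of like neighbours:", round(happy, 2))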

French manufacturing accounts for fewer than one job in seven, yet it concentrates 75% of exports and 80% of national private research effort. Far from being an isolated sector, it remains a driver of French economic growth. The characteristics and performance of this sector are presented here using the international industrial classification and numerous indicators of French industrial specialisation. Four interdependent changes affect contemporary manufacturing: the acceleration of technical progress and the shift towards services, the fragmentation of production processes, the internationalisation of production and markets, and financialisation. While all Western economies are undergoing these transformations, France lags behind its main partners in the four developments identified. One of the explanations proposed, based on an analysis of the microeconomic foundations of industrial dynamics, is the existence of significant barriers to firm growth. This diagnosis leads to proposals for industrial policy.

Publication date: 2009-12. Collection: Documents de travail de l'OFCE: 2009-34
Following Becker (1962), an information-theoretical thought experiment is developed to investigate whether the equilibrium properties of an exchange economy depend on the rational behaviour of agents. Transactions are logged through a communication channel into an external observer's dataset, represented by Google. At some point this data connection fails and Google no longer receives the updates encoding the transactions. It is shown that Google can nevertheless make sharp predictions concerning the state of the economy. In particular, a stable long run distribution of endowments is expected, as well as a set of price-like variables. By construction this prediction does not rest on the rationality of agents, because the information-theoretical setting forces Google to treat the missed updates as random variables.

Publication date: 2009-06. Collection: Documents de travail: 2009-14
PIERSON John
An inconsistency, which follows from the absence of a non-negativity constraint on the consumption of agricultural goods, is found in the demand side of the NEG models developed in Pflüger (2004). This seriously weakens the results of the original paper and those of ensuing contributions in Pflüger and Südekum (2008a,b). A solution to this problem is developed which imposes severe restrictions on the relative size of two of the core model parameters, the implications of which are examined.
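
A sketch of the mechanism, using the quasi-linear specification of Pflüger (2004) with simplified notation (the precise parameter restriction derived in the paper is not reproduced here): the upper-tier utility and budget constraint

    U = \alpha \ln C_M + C_A , \qquad P_M C_M + C_A = y ,

imply a constant expenditure $P_M C_M = \alpha$ on the manufacturing composite and hence a residual demand

    C_A = y - \alpha

for the agricultural good, which is negative whenever income $y$ falls short of $\alpha$ unless the non-negativity constraint $C_A \ge 0$ is imposed explicitly.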

Faced with slowing vehicle sales, the 'Pacte automobile' mobilises 9 billion euros to help French car manufacturers. The stakes are threefold: supporting demand and employment in the short term, securing the future of an industry that is strategic in terms of the technologies it mobilises, and contributing to the development of clean vehicles. By financing research and easing access to credit, the Pacte gives French manufacturers room for manoeuvre in their cash positions. This positive effect may prove only transitory. The fall in demand has a structural dimension that casts doubt on the Pacte's ability to revive it. The costs associated with lasting overcapacity may be prohibitive. Nevertheless, resources must be deployed to foster the development of new technologies and the necessary restructuring. It is against this objective that the effectiveness of the Pacte will be judged. In this context, the Pacte has the merit of opening a debate on the definition of a European industrial policy.
