Tuesday, June 30, 2015

NBER IFM Data Session and Site

The NBER's International Finance and Macroeconomics (IFM) Program is sponsoring a 2015 Summer Institute "Data Session" and a corresponding web site ("Catalog of Data Sources") where the various datasets are archived.

Great idea. Hats off to the organizers, Galina Hale and Michael Klein.

This summer's IFM data session is not the first, but the initiative is still young. Of course there may be incompletely-resolved issues regarding how best to organize the data sessions and site, what to leave in and what to leave out, etc. But the key first step is to take a first step, and Galina and Michael have taken a large leap. Hopefully others will follow suit, both inside and outside the NBER.

This summer six new datasets will be presented / discussed / archived:

Alberto F. Cavallo, Massachusetts Institute of Technology and NBER
Datasets from the Billion Prices Project at MIT

Adolfo Barajas, International Monetary Fund
Financial Development Index Database

Stijn Claessens, Federal Reserve Board
Neeltje Van Horen, De Nederlandsche Bank
Database on Bank Ownership

Andrés Fernández, Inter-American Development Bank
Capital Control Measures: A new data set

Sebnem Kalemli-Ozcan, University of Maryland and NBER
ORBIS and firm-level data

Stefan Bender, Institute for Employment Research (IAB)
The Bundesbank's Research Data and Service Center

Thursday, June 25, 2015

Measuring and Monitoring Connectedness

I'll be at the IMF soon for a couple of days of lecturing on the Diebold-Yilmaz book, Financial and Macroeconomic Connectedness. It was published earlier this year, and preparing for the IMF jogged my memory: I brilliantly forgot to announce it in a No Hesitations post. Anyway, it's available at the usual online shops (where you can also read the table of contents and first chapter), or directly from Oxford University Press. There's also a web site. Special thanks to Eric Ghysels, who put us in touch with our fine editor, Scott Parris.

http://www.amazon.com/Financial-Macroeconomic-Connectedness-Measurement-Monitoring-ebook/dp/B00SAUJNFU/ref=sr_1_7?s=books&ie=UTF8&qid=1423485224&sr=1-7&keywords=diebold

Sunday, June 21, 2015

Online Volatility Data and Labs

I am reminded that I had planned to post on data/analysis sites that focus on financial asset return volatility measurement and modeling.

To my mind, the key trio is implied vol, GARCH vol, and realized vol. For implied vol it's the VIX at CBOE. For GARCH vol it's Rob Engle's V-Lab at NYU. For realized vol it's Neil Shephard's Realized Library at Oxford.


Yes, conspicuously missing is stochastic volatility. It's an academic simulator's paradise, but largely missing from serious/practical industry application. It's no accident; the benefit/cost ratio is just too low to excite many real financial-market modelers. One could argue that ten years from now things will look different. Perhaps, but I'm not at all sure. 

Saturday, June 20, 2015

Aarhus June 24-26 SoFiE

The Aarhus June 24-26, 2015 annual meeting of the Society for Financial Econometrics (SoFiE) is looking tremendous, thanks to the local organizers at CREATES, Torben Andersen and his fine program committee, SoFiE staff, and the numerous energetic presenters and participants. Unfortunately for me, I must miss this year's meeting, but I hope to be at both the 2016 meeting in Asia and the 2017 (tenth anniversary!) meeting in the Americas.

Sunday, June 14, 2015

A Conjecture Regarding Extracted Dynamic Factors (and Hence GDPplus)

Here's a conjecture that I'd love to see explored. It's well-posed, simple, and really interesting.

Conjecture: GDPplus (obtained by Kalman smoothing) may be very well approximated by taking a simple convex combination of exponentially smoothed GDPe (expenditure side GDP) and exponentially smoothed GDPi (income side GDP). 

That is,

\( GDPplus = \lambda \cdot SMOOTH_{\alpha_e} (GDPe)  + (1 - \lambda) \cdot   SMOOTH_{\alpha_i} (GDPi) , \)

where \(\lambda\) is a combining weight, \(SMOOTH(GDPx)\) denotes an exponential smooth of \(GDPx \), and the \(\alpha_x\)'s are smoothing parameters.
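The conjectured approximation is easy to state in code. Here is a minimal Python sketch (the recursion and function names are my own illustrative choices, using the convention from below that larger \(\alpha\) means more smoothing):

```python
import numpy as np

def exp_smooth(x, alpha):
    """One-sided exponential smoother: s_t = alpha * s_{t-1} + (1 - alpha) * x_t.
    Larger alpha means heavier smoothing; alpha = 0 returns x unchanged."""
    s = np.empty(len(x))
    s[0] = x[0]
    for t in range(1, len(x)):
        s[t] = alpha * s[t - 1] + (1 - alpha) * x[t]
    return s

def gdp_plus_approx(gdpe, gdpi, lam, alpha_e, alpha_i):
    """Convex combination of exponentially smoothed GDPe and GDPi."""
    return lam * exp_smooth(gdpe, alpha_e) + (1 - lam) * exp_smooth(gdpi, alpha_i)
```

The conjecture is then that, for suitable \(\lambda\), \(\alpha_e\), \(\alpha_i\), the output of `gdp_plus_approx` tracks GDPplus very closely.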

Or even more simply, forget about GDPplus, whose underlying probability model is a bit complicated, and just examine a simpler canonical case, as follows.

Conjecture: In a stationary bivariate single-factor dynamic factor model with AR(1) factor and all shocks Gaussian and orthogonal to all other shocks, the MSE-optimal factor extraction (obtained by Kalman smoothing) may be very well approximated by taking a simple convex combination of exponentially smoothed observed variables.

There are of course many variations and extensions:  N variables, richer dynamics, richer error correlation structures, different smoothers, etc.

Theoretically:

What, precisely, is the relationship between the optimal extraction and the approximation? The answer must be contained in the structure of the Kalman gain derived in ADNSS2.   

Empirically:

-- Check it out in simulated environments for various choices of \(  \lambda\), \(  \alpha_e\) and \(  \alpha_i\).

-- Again in simulated environments, minimize the average squared divergence between the exact and approximate extractions w.r.t. \( \lambda\), \(  \alpha_e\) and \(  \alpha_i\).  How close is it to zero?

-- Now do a serious application: GDPplus vs. a weighted combination of smoothed GDPe and GDPi.  Again minimize w.r.t. \(  \lambda\), \(  \alpha_e\) and \(  \alpha_i\). How close is it to zero? How much closer is it to zero than the divergence between GDPplus and GDPavg (the simple average of GDPe and GDPi now published by BEA -- see this No Hesitations post)?

-- Based on ADNSS1 and ADNSS2, my guess is that the optimal \(\lambda\) will be around .4, and that the optimal \(\alpha_e\) will be much bigger than the optimal \(  \alpha_i\) (where bigger \(  \alpha\) corresponds to more smoothing).
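For the simulated-environment check of the simpler conjecture, here is a self-contained Python sketch (parameter values, unit loadings, and starting values are illustrative choices of mine, not a calibration): it simulates the bivariate single-factor model, computes the exact Kalman-smoothed extraction, and then minimizes the average squared divergence w.r.t. \(\lambda\), \(\alpha_e\) and \(\alpha_i\):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# --- Simulate the canonical bivariate single-factor DFM ---
T, phi = 500, 0.9                     # AR(1) factor: f_t = phi * f_{t-1} + eta_t
q, r1, r2 = 1.0, 1.0, 0.5             # shock variances (eta, e1, e2)
f = np.zeros(T)
for t in range(1, T):
    f[t] = phi * f[t - 1] + np.sqrt(q) * rng.standard_normal()
y1 = f + np.sqrt(r1) * rng.standard_normal(T)   # unit loadings for simplicity
y2 = f + np.sqrt(r2) * rng.standard_normal(T)

# --- Exact extraction: Kalman filter + Rauch-Tung-Striebel smoother ---
def kalman_smooth(y1, y2, phi, q, r1, r2):
    T = len(y1)
    Z, R = np.ones(2), np.diag([r1, r2])
    a, p = 0.0, q / (1 - phi**2)                # stationary initialization
    a_filt, p_filt = np.zeros(T), np.zeros(T)
    for t in range(T):
        F = p * np.outer(Z, Z) + R              # innovation covariance
        K = p * np.linalg.solve(F, Z)           # Kalman gain
        a_filt[t] = a + K @ (np.array([y1[t], y2[t]]) - Z * a)
        p_filt[t] = p * (1.0 - K @ Z)
        a, p = phi * a_filt[t], phi**2 * p_filt[t] + q
    a_sm = a_filt.copy()                        # backward (RTS) pass
    for t in range(T - 2, -1, -1):
        J = p_filt[t] * phi / (phi**2 * p_filt[t] + q)
        a_sm[t] = a_filt[t] + J * (a_sm[t + 1] - phi * a_filt[t])
    return a_sm

f_sm = kalman_smooth(y1, y2, phi, q, r1, r2)

# --- Approximation: convex combination of exponential smooths ---
def exp_smooth(x, alpha):
    s = np.empty(len(x))
    s[0] = x[0]
    for t in range(1, len(x)):
        s[t] = alpha * s[t - 1] + (1 - alpha) * x[t]
    return s

def divergence(theta):
    lam, ae, ai = theta
    approx = lam * exp_smooth(y1, ae) + (1 - lam) * exp_smooth(y2, ai)
    return np.mean((f_sm - approx) ** 2)

res = minimize(divergence, x0=[0.5, 0.5, 0.5], bounds=[(0, 1)] * 3)
print("optimal (lambda, alpha_1, alpha_2):", res.x)
print("minimized mean squared divergence:", res.fun)
```

The GDPplus exercise is the same optimization with GDPe and GDPi in place of the simulated observables and published GDPplus in place of the smoothed factor.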


[Note: In the two or three weeks since the draft of this post was written, we have explored things a bit, and it's looking good. The optimized parameters are \(  \lambda=.14\), \(  \alpha_e =.94 \) and \(  \alpha_i = .18\), and they deliver a predictive \( R^2\) for GDPplus of .94.]



Sunday, June 7, 2015

Econometric Seasonality Research is Back

Seasonality research is back! Well, at least a bit.

I recall heady earlier days, with classic work like Granger's typical spectral shape, multiplicative seasonal Box-Jenkins and the airline model, Nerlove's unobserved-components models and Harvey's basic structural model, Barsky and Miron's work on seasonal cycles and business cycles, Engle, Granger and Hylleberg's work on seasonal integration and cointegration, and on and on. If you're interested and want systematic treatments, take a look at classic books like Nerlove, Grether and Carvalho (1979), Hylleberg (1986), Harvey (1991), Miron (1996), and Ghysels and Osborn (2001) (with an exceptionally insightful foreword by Tom Sargent). And don't forget the legendary 1976 Conference on Seasonal Analysis of Economic Time Series (Zellner, ed., 1978).


Seasonality is clearly a large and important part of time-series econometrics; in the meticulous Nerlove et al. book index, for example, the seasonality entries alone occupy more than a page. [Historical note: Interestingly, the Nerlove et al. index was actually produced by Quang Vuong! Quang was Marc Nerlove's Ph.D. student just before me, and he moved with Marc from Northwestern to Penn in the early 1980's to finish his Northwestern Ph.D., just as I was starting my Penn Ph.D. with Marc. We overlapped at Penn for a little while. It was a great honor to join Quang, Isabel Perrigne and Ingmar Prucha in hosting Marc's 80th Birthday Conference in May 2014.]


Yet econometric seasonality research has receded over the last fifteen years or so. No worries, pendulums swing, and the pendulum is swinging back. 


So then, what's going on now?


The classic issues in seasonal adjustment (e.g., overadjustment, underadjustment) are alive and well, and as relevant as ever. On underadjustment see Gilbert et al.'s 2015 Federal Reserve Board piece on the "residual seasonality" problem.


There are also important issues of too-quickly-adapting adjustments (effectively a type of overadjustment). See Wright's very nice 2013 Brookings Papers piece on "unseasonal seasonals".


There are interesting and largely-unexplored issues of seasonality not only in conditional-mean dynamics, but also in conditional-variance dynamics, as in Campbell and Diebold's 2005 work on weather forecasting for weather derivatives.


Much remains to be explored regarding simultaneous adjustment of sets of series, which may for example share common seasonal components, as in McElroy (2015). (Of course this is not unrelated to the literature on seasonal cointegration.)

Related but distinct is the issue of "top down" vs. "bottom up" approaches to seasonal adjustment (e.g., is seasonally-adjusted GDP better-obtained by adjusting GDP directly or by adjusting its components separately and adding them?). See Rudebusch, Wilson and Mahedy's 2015 FRB San Francisco Economic Letter.


We're recognizing that in addition to "standard" seasonal adjustment, we may want to control for unusual weather conditions and specific weather events, as in 2015 work by Boldin and Wright.


In the U.S. there's the problem of likely-spurious Q1 GDP collapses, which is related to many of the above-mentioned issues. See the following 2015 pieces, many already mentioned above (and see Wolfers1 and Wolfers2 for nice overviews/interpretations):
-- Federal Reserve Board piece on the "residual seasonality" problem 
-- Stark's FRB Phila Research Rap
-- Rudebusch, Wilson and Mahedy's FRB San Francisco Economic Letter

-- A recent No Hesitations post.

Thursday, June 4, 2015

2015 French Open Djokovic-Nadal Tennis Graphic, and Some Explanation

We now have the ability to produce our tennis graphic in near-real time. Here's Djokovic-Nadal from the French Open quarterfinals, 4 June 2015. The graph below plays quickly (click on it to enlarge and replay), but as usual on my web page we also have medium and slow versions that preserve more drama. I'll stop posting the graphs to No Hesitations unless we come up with something methodologically new; instead, they'll just be on my web page. Moving forward, I hope to post graphics at least for Men's and Women's grand slam finals.

[Animated tennis graphic: Djokovic-Nadal, 2015 French Open quarterfinal]

One thing. People are sometimes confused as to what we're doing. We are not modeling the conditional probability that Mr. X wins the match, updated dynamically. That's very interesting, and there is good work in that direction by Klaassen and Magnus, among others. But that's not what we want to do, and it's not what our graphic shows. Instead, we simply want to provide an informative visual summary of a match, doing for tennis precisely what box scores do for baseball, only better.