Den Haan JME(2000)
This is a replication of Wouter J. den Haan (2000), "The comovement between output and prices," Journal of Monetary Economics, vol. 46, no. 1, 3-30, which analyzes the comovement of series using multi-step forecast errors from a VAR.
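For anyone new to the paper: the statistic being computed is den Haan's correlation of K-step-ahead forecast errors. For each horizon K, form the within-sample forecast errors implied by the estimated VAR, u_x(t,K) = x(t+K) - E_t[x(t+K)] and u_y(t,K) = y(t+K) - E_t[y(t+K)], and report COR(K) = corr(u_x(t,K), u_y(t,K)). Plotting COR(K) against K shows how the comovement of the two series changes with the forecast horizon; that is what the ForecastCorrs procedure further down this thread does, with one CMOM per horizon.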
Re: Den Haan JME(2000)
Dear Tom,
I'd like to replicate the den Haan (2000) figures and tables, but with the latest code (2011) I get just one final graph, named figure2a. How can I get the other figures from the den Haan paper? What changes do I need to make in the code?
Fondly,
Marinko
Re: Den Haan JME(2000)
Hi Tom,
I would like to check the interpretation of the correlations calculated by this routine. I think the correlations only refer to contemporaneous correlations at different horizons. But they don't/can't reflect lead-lag correlations between the variables. If I want to calculate lead correlations (i.e. correlation of x today with y tomorrow), can I do so by computing the correlation between the forecast error of x at time t and that of y at time t+1? Any hint will be of great help.
Re: Den Haan JME(2000)
You would just need to change the # supplementary line in this to reflect different values of J (which is the number of forecast steps). Note that you lose a possible "J" value when you mismatch the horizons.
do j=1,ncov
cmom(corr,center) startl endl
# gdperrors(j) perrors(j)
compute corrs(j)=%cmom(1,2)
end do j
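If it helps, here is my reading of that suggestion (a sketch only, not tested code): to get the correlation between the GDP error at one date and the price error one period later, pair the j-step GDP error with the (j+1)-step price error, and stop the loop one step early because there is no perrors(ncov+1).
do j=1,ncov-1
* j-step GDP error at time t vs (j+1)-step price error at time t:
* same forecast origin, price realization one period after the GDP one.
cmom(corr,center) startl endl
# gdperrors(j) perrors(j+1)
compute corrs(j)=%cmom(1,2)
end do j
* corrs(ncov) is left unset because the horizons are mismatched by one.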
Re: Den Haan JME(2000)
Hi Tom, I am now trying to loop this over many models. I took the procedure outside of the VAR model specification and tried various things (local/global, etc.), but I keep getting these messages. Could you please help?
## CP17. PROCEDURE/FUNCTION Must be Initial Statement in a Compiled Section
>>>>procedure <<<<
## OP3. This Instruction Does Not Have An Option
>>>>procedure <<<<
Re: Den Haan JME(2000)
I can guess that you have an unclosed DO loop before the procedure, but I can't tell without seeing the entire program.
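For what it's worth, a minimal made-up example of the pattern Tom is describing (not the poster's code) would look like this: the END DO never arrives, so the compiler is still inside the loop's compiled section when it reaches PROCEDURE, and CP17 is reported.
do i=1,10
display i
*
* ("end do i" is missing here, so the loop's compiled section is still open,
*  and the PROCEDURE below is not the initial statement of a compiled section)
*
procedure Example
end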
Re: Den Haan JME(2000)
Thanks Tom. The code is attached. For now, I am simply asking the code to estimate the same model twice. I will eventually loop this over different models.
Code:
*
* Replication of Wouter J. den Haan(2000), "The comovement between
* output and prices," Journal of Monetary Economics, vol 46, no 1, 3-30.
*
* Quarterly data, multivariate VAR.
*
* Program to estimate a VAR, and calculate K-step ahead within sample
* forecasts, and their correlations. This imposes a unit-root and
* estimates the VAR in first-differences, but calculates the errors for
* the levels. It includes bootstrapped standard errors.
*
* Revision Schedule:
* Based upon version written by Wouter den Haan and posted on his web site.
* March 2010. Revised by Tom Doan, Estima
* a. Switched to use MODEL data type
* b. Changed loop control to do just a single FORECAST for
* each evaluation.
* c. Wrote procedure to calculate the correlation statistics
* for a given model
*
******************************
open data C:\Correlations\USA.xlsx
calendar(q) 1950
data(format=xlsx,org=columns) 1950:01 2015:01 loggdp bcon
*
set lgdp2 = loggdp
set bcon2 = bcon
set trend = t
set trend2 = t*t
*
*set lgdp = lgdp2 - lgdp2{1}
*set bcon = bcon2 - bcon2{1}
*set lgdp = lgdp2
set bcon = bcon2
filter(type=hp, tuning=1600) lgdp2 / temp
set lgdp = lgdp2-temp
*seed 112597
compute sss = 1998:1
compute eee = 2015:1
compute ncov = 60
compute nboot = 500 ;* 2500
compute nlags = 4
******************************************************************
*
* This is specific to a model estimated in first differences
* where the variables of interest are the first two.
*
procedure ForecastCorrs start end
option model model
option integer ncov 50 ;* Number of forecast steps
option vector *corrs
*
local integer startl endl
local integer time j
local vect[series] forecasts
local vect[series] gdperrors
local vect[series] perrors
local series actualy
local series actualp
*
* Default to evaluating over the full regression range unless
* the user overrides.
*
if %defined(start)
compute startl=start
else
compute startl=%regstart()
*
if %defined(end)
compute endl=end
else
compute endl=%regend()
*
dim gdperrors(ncov) perrors(ncov) corrs(ncov)
do j=1,ncov
set gdperrors(j) startl endl = %na
set perrors(j) startl endl = %na
end do j
*
set actualy startl endl = %modeldepvars(model)(1){0}
set actualp startl endl = %modeldepvars(model)(2){0}
do time=startl,endl
forecast(model=model,from=time,to=time+ncov-1,results=forecasts)
*
* Replace forecasts of (changes) in y and p with their forecast
* errors. Then accumulate up to get the errors in log y and log p
* themselves.
*
set forecasts(1) time time+ncov-1 = actualy-forecasts(1)
set forecasts(2) time time+ncov-1 = actualp-forecasts(2)
*acc forecasts(1) time time+ncov-1
*acc forecasts(2) time time+ncov-1
*
* Record forecast errors only for the ones with actual data
*
do j=1,ncov
if time+j-1>%regend()
break
compute gdperrors(j)(time) = forecasts(1)(time+j-1)
compute perrors(j)(time) = forecasts(2)(time+j-1)
end do j
end do time
*
* Compute correlations for each forecast step
*
do j=1,ncov
cmom(corr,center) startl endl
# gdperrors(j) perrors(j)
compute corrs(j)=%cmom(1,2)
end do j
end
*************************************************************************************
* Estimate base model on actual data. (In this version lgdp is HP-filtered
* and bcon is left in levels, rather than first-differenced.)
*
do ser = 1,2
system(model=base)
variables lgdp bcon
lags 1 to nlags
det constant trend
end(system)
estimate(noftests,noprint,resids=resids) sss eee
*
* Evaluate the forecast correlations at the base model
*
@ForecastCorrs(model=base,corrs=basecorr)
*@ForecastCorrs(model=base,corrs=basecorr) 1960:3 *
*
@VARBootSetup(model=base) bootvar
dec series[vect] bootcorr
gset bootcorr 1 nboot = %zeros(ncov,1)
*
* Redefine the bootstrap system to get the deterministic variables right
*
system(model=bootvar)
variables %%VARResample
lags 1 to nlags
det constant trend
end(system)
*
infobox(action=define,lower=1,upper=nboot,progress) "Bootstrapping"
do boot = 1,nboot
infobox(current=boot)
*
* Draw the new data
*
@VARBootDraw(model=base,resids=resids) sss eee
* @VARBootDraw(model=base,resids=resids) %regstart() %regend()
*
* Estimate the VAR on the bootstrapped data
*
estimate(model=bootvar,noprint) sss eee
@ForecastCorrs(model=bootvar,corrs=bootcorr(boot),ncov=ncov)
end do boot
infobox(action=remove)
*
* For each forecast horizon, compute the percentage with the same sign as
* the base correlations.
*
dec vect samesign(ncov)
do i=1,ncov
sstats(mean) 1 nboot (bootcorr(t)(i)>0.xor.basecorr(i)<0)>>samesign(i)
end do i
*
set c05 1 ncov = %if(samesign(t)>.95,basecorr(t),%na)
set c10 1 ncov = %if(samesign(t)>.90.and.samesign(t)<=.95,basecorr(t),%na)
set c00 1 ncov = %if(samesign(t)<=.90,basecorr(t),%na)
*
graph(nodates,style=bargraph,$
footer="Figure 3A Quarterly data with unit root imposed") 3
# c05 1 ncov
# c10 1 ncov
# c00 1 ncov
end do ser
Re: Den Haan JME(2000)
You need to put a source for the varbootsetup.src file outside the loop. It pulls in multiple procedures, so it can't be included implicitly.
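Concretely (my wording, and the path is just a placeholder for wherever the file sits on your machine), the fix is a single line near the top of the program, before the DO SER loop:

source varbootsetup.src   ;* compile the bootstrap procedures once, up front

With the procedures compiled once outside the loop, the @VARBootSetup and related calls inside the loop no longer try to pull the file in implicitly from within a compiled section.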
Re: Den Haan JME(2000)
Thanks much, Tom. It works now.
Re: Den Haan JME(2000)
Am I correct that the recent paper
Datta, Deepa D., Benjamin K. Johannsen, Hannah Kwon, and Robert J. Vigfusson. 2021. "Oil, Equities, and the Zero Lower Bound." American Economic Journal: Macroeconomics, 13 (2): 214-53
is essentially doing the same thing as the den Haan (2000) paper referenced in this thread? In their online appendix (https://www.aeaweb.org/articles?id=10.1257/mac.20180488) they decompose shocks to the correlation using variance decompositions.
"Datta, Deepa D., Benjamin K. Johannsen, Hannah Kwon, and Robert J. Vigfusson. 2021. "Oil, Equities, and the Zero Lower Bound." American Economic Journal: Macroeconomics, 13 (2): 214-53"
in their online appendix ([https://www.aeaweb.org/articles?id=10.1257/mac.20180488]) where they decompose shocks to the correlation using Variance decompositions are essentially doing the same thing as the Haan (2000) paper referenced in this thread?