ADF test

Post by JohnV »

Hi,
I have a question regarding the uradf.src procedure that does Augmented Dickey-Fuller tests. I opened up the procedure and noticed that, when finding the optimal lag length using BIC, it runs the regressions on the level of the time series, not the difference, i.e. y(t) = c + a*y(t-1) + b(1)*dy(t-1) + ... + b(maxlag)*dy(t-maxlag).
Is it more efficient to choose the optimal lags by regressing on levels instead of differences?

Part of the procedure is below:

DO lagnum = 1,maxlag
   LINREG(cmom,noprint) series
   # series{1} delseries{1 to lagnum}
   COMPUTE aic(lagnum) = log(%rss/%nobs) + 2.*%nreg/%nobs
   COMPUTE bic(lagnum) = log(%rss/%nobs) + (1.*%nreg/%nobs)*log(%nobs)
END DO
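
I take it that further down the procedure picks the lag length with the smallest criterion value. My own sketch of that step (not copied from uradf.src) would be something like:

COMPUTE bestlag = 1
DO lagnum = 2,maxlag
   IF bic(lagnum) < bic(bestlag) {
      COMPUTE bestlag = lagnum
   }
END DO
DISPLAY "Lag chosen by BIC =" bestlag

so my question is only about the regressions used to fill in aic and bic.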
Re: ADF test

Post by TomDoan »

The two are identical. The only difference is the coefficient on the lagged y, which is 1.0 higher when you run the regression with levels on the left.
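
To see why, write the levels regression as

y(t) = c + a*y(t-1) + b(1)*dy(t-1) + ... + b(p)*dy(t-p) + e(t)

and subtract y(t-1) from both sides:

dy(t) = c + (a-1)*y(t-1) + b(1)*dy(t-1) + ... + b(p)*dy(t-p) + e(t)

The right-hand-side variables and the residuals are identical, so %RSS, %NOBS and %NREG (and therefore the AIC and BIC) are identical; only the reported coefficient on y{1} shifts by 1.0.

You can check that directly. This is just a sketch, assuming a series called y and, for illustration, four augmenting lags:

SET dy = y - y{1}
*
* levels on the left
LINREG(noprint) y
# constant y{1} dy{1 to 4}
COMPUTE rsslevels = %rss
*
* differences on the left, same right-hand-side variables
LINREG(noprint) dy
# constant y{1} dy{1 to 4}
COMPUTE rssdiffs = %rss
*
DISPLAY "RSS, levels =" rsslevels "RSS, differences =" rssdiffs

The two values will match, so any criterion built from %RSS, %NOBS and %NREG selects the same lag length either way.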