
Re: Model of the US Economy: CointegratedVARModelHandbook

Posted: Thu Oct 17, 2024 5:13 am
by ac_1
TomDoan wrote: Tue Oct 15, 2024 1:52 pm ... Cointegration analysis is not designed to answer the question: what is the best guess of what GDP growth will be over the next year?
Why?


tsay3p438.rpf: Tsay, Analysis of Financial Time Series, 3rd edition, Example 8.6.5 from pp. 438-442: forecasts multiple steps ahead from a cointegrated VAR model. I see no problem with the example.

Here are some advantages of forecasting from a VECM, along with some questions.

(a) LR insight: a VECM captures the long-run (LR) equilibrium relationships between variables through cointegration.

(b) SR dynamics: while focusing on LR relationships, a VECM also accounts for short-run (SR) fluctuations via the first-differenced terms. This dual approach may allow for more accurate forecasts, as it considers both immediate SR reactions and gradual LR adjustments.

(c) Error correction: the inclusion of error-correction terms (ECTs) helps show how swiftly and efficiently the series return to their LR equilibrium after a shock, which is potentially crucial for decision-makers.

alpha = speed of adjustment: how quickly the system corrects any deviation from the LR equilibrium, i.e. the proportion of the equilibrium error corrected in each period, guiding the system back to stability. In the lag=1 case, alpha can be obtained by regressing delta(Y(t)) on beta' * Y(t-1).

beta = LR equilibrium relationships: the betas (the generalized eigenvectors) are the coefficients forming the cointegrating vectors (ECTs = beta' * Y(t-1)) that define the LR relationships among the variables. They describe the equilibrium itself, rather than how quickly deviations from that equilibrium are corrected (which is the alphas' role).

alpha*beta' = the PI matrix. PI can also be calculated from the levels VAR, i.e. PI = (PI1 + PI2 + ... + PIk) - I, and it is from PI that the generalized eigenvalues and eigenvectors are calculated (the algebra is sketched just after this list). But why a generalized eigen-analysis and not a standard one?

(d) Enhanced forecast accuracy: by integrating both SR (first-difference) and LR (levels) information, a VECM may yield more accurate forecasts than models that focus on only one aspect, e.g. a VAR in levels or a VAR in first differences. VECMs are a broad-based approach that can improve planning and decision-making.

(e) Multivariate capability: VECMs handle multiple time series simultaneously, making them well suited to systems where the variables are interdependent. This multivariate analysis (as opposed to univariate analysis) is crucial for capturing the complexity and interconnections in economic, financial, and related data.
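
For reference, and to make the alphas, betas and PI under (c) concrete, this is the algebra I have in mind (a sketch in standard Johansen notation, not RATS output):

\Delta Y_t = \alpha\beta' Y_{t-1} + \Gamma_1 \Delta Y_{t-1} + \cdots + \Gamma_{k-1} \Delta Y_{t-k+1} + \varepsilon_t, \qquad \Pi = \alpha\beta'

and, from the VAR in levels Y_t = \Pi_1 Y_{t-1} + \cdots + \Pi_k Y_{t-k} + \varepsilon_t,

\Pi = \Pi_1 + \Pi_2 + \cdots + \Pi_k - I.

As I understand it, the eigenvalues \lambda_1 > \cdots > \lambda_n and the eigenvectors (the betas) solve the generalized eigenvalue problem

\left| \lambda S_{11} - S_{10} S_{00}^{-1} S_{01} \right| = 0,

where S_{00}, S_{01}, S_{10}, S_{11} are the product-moment matrices of the residuals from regressing \Delta Y_t and Y_{t-1} on the lagged differences (and deterministic terms); the loadings are then \alpha = S_{01}\beta for suitably normalized \beta, and the rank tests are based on

\text{trace}(r) = -T \sum_{i=r+1}^{n} \ln(1-\lambda_i), \qquad \lambda_{\max}(r) = -T \ln(1-\lambda_{r+1}).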


A disadvantage of forecasting from a VAR or VECM is what a layman would call "very wide fans", especially for multi-step-ahead forecasts with quarterly data (not so much for static one-step forecasts). Are there ways to mitigate this?

TomDoan wrote: Tue Oct 15, 2024 1:52 pm Needless to say, rolling sample estimation of a model which is not designed to forecast in the first place is a bad idea. And besides that, pre-test model selection (depending upon test result, choose between models A and B) is generally a bad idea as it makes the forecasts a highly discontinuous function of the data (small change to the data can flip the test result) which increases the chances of serious forecast errors. That's not as big an issue with a univariate model since e.g. different ARMA models can produce almost identical forecasts. With a multivariate model, that's a bigger deal; rank 2 vs rank 3 may have very significant differences in the relationship among the series.
TomDoan wrote: Tue Oct 15, 2024 1:52 pm @JOHMLE has an ECT option which defines a VECT[EQUATION] which can be used on the ECT instruction when you have (potentially) more than one cointegration vector. See the SHORTANDLONGVECM.RPF example.
Based on those advantages, I would still like to be able to do a rolling analysis in RATS.

In SHORTANDLONGVECM.RPF, to use ECT=ecteqns the assumed rank of the cointegration space needs to be set. I can't know the rank from the lambda(trace) statistics without looking at the output table, as described previously, so I need to set something like %%COINTRANK dynamically within the loop -- and there are also the lambda(max) statistics.

Here is the static loop; how do I incorporate a varying RANK=%%COINTRANK and ECT=ecteqns? They will affect the dimensions of the VECTORS, DUALVECTORS, and LOADINGS matrices. (A rough sketch of how I might choose the rank follows the code.)

Code: Select all

dec vect[series] VECMFOR_S(5)
dec vect[series] VECMERRORS_S(5)

do regend = begin, end
*
   @johmle(lags=5,det=rc,eigenvalues=evalues_S,vectors=evectors_S,loadings=loadings_S) regstart regend; * 5 variables no cv=cv
   # series1 series2 series3 series4 series5

   disp "evalues_S:" evalues_S; * eigenvalues
   disp "evectors_S:" evectors_S; * beta's
   disp "loadings_S:" loadings_S; * alpha's
   comp PI_S = loadings_S*tr(evectors_S); disp "PI_S:" PI_S; * PI matrix

   equation(coeffs=%xcol(evectors_S,1)) ect1_S *
   # series1 series2 series3 series4 series5 constant
   equation(coeffs=%xcol(evectors_S,2)) ect2_S *
   # series1 series2 series3 series4 series5 constant
   equation(coeffs=%xcol(evectors_S,3)) ect3_S *
   # series1 series2 series3 series4 series5 constant
   equation(coeffs=%xcol(evectors_S,4)) ect4_S *
   # series1 series2 series3 series4 series5 constant
   equation(coeffs=%xcol(evectors_S,5)) ect5_S *
   # series1 series2 series3 series4 series5 constant


   system(model=ectmodel)
   variables series1 series2 series3 series4 series5; * specify the variables in levels
   lags 1 2 3 4 5; * 5 lagged levels are equivalent to 4 lagged changes
   ect ect1_S ect2_S; * include the error correction terms (rank fixed at 2 here)
   end(system)
   estimate(residuals=resids_S,noftests) regstart regend
   *
   *@regcrits; * CANNOT COMPARE VAR AND VECM IC
   *
   display "Residual covariance matrix = " ##.#### %sigma
   display "Residual correlation matrix = " ##.#### %cvtocorr(%sigma)
   display "Log likelihood = " %logl
   display "Log determinant = " %logdet
   *
   *
   FORECAST(MODEL=ectmodel,RESULTS=VECMFOR_S,STDERRS=VECMERRORS_S,static,steps=1) regend+1
*
end do regend
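
Below is a rough sketch of how I might choose the rank inside the loop from the eigenvalues returned by @johmle, using trace(r) = -T * sum(i=r+1..n) log(1 - lambda(i)). The critical values cv95 and the effective sample size nobs are placeholders: cv95 would need to be filled from the appropriate row of the trace95 table in JohMLE.src, and I still don't know the cleanest way to feed the chosen rank into ECT=ecteqns.

Code: Select all

* Sketch: choose the cointegrating rank from the @johmle eigenvalues.
* cv95 is a placeholder; fill cv95(1)..cv95(n) with the 95% trace critical
* values for ranks 0..n-1 (the DET=RC row of trace95 in JohMLE.src).
compute n = %rows(evalues_S)
dec vect cv95(n)
ewise cv95(i) = 0.0
* Rough effective sample size for this window (5 lags here).
compute nobs = regend-regstart+1-5
compute rank = n
do r = 0,n-1
   compute tracestat = 0.0
   do i = r+1,n
      compute tracestat = tracestat-nobs*log(1.0-evalues_S(i))
   end do i
   disp "Rank" r "Trace" tracestat
   * stop at the first rank whose trace statistic fails to reject
   if tracestat<cv95(r+1) {
      compute rank = r
      break
   }
end do r
disp "Chosen rank:" rank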

Re: Model of the US Economy: CointegratedVARModelHandbook

Posted: Thu Oct 17, 2024 7:59 am
by TomDoan
I explained that. You clipped out the explanation. Different methodologies have different goals.

The Tsay example has a pair of US interest rate series, where (a) cointegration is theoretically suggested and (b) the rank is obvious from the statistics; not a set of general macro series where neither of the above is true. And how would you look at that and tell that there is "no problem"? It gives flat forecasts with very wide error bands. Univariate models would give you the same thing.

In your laundry list of the advantages of a VECM, you are missing one very important point: those can only be helpful if they can be well-estimated from the data (a general rule in forecasting: even "correct" models with too many free parameters forecast poorly). You have already demonstrated that, in your data set, they aren't, since the most basic statistic, the rank, "changes" partway through the data set.

Re: Model of the US Economy: CointegratedVARModelHandbook

Posted: Sat Oct 19, 2024 5:43 am
by ac_1
Thanks.

TomDoan wrote: Thu Oct 17, 2024 7:59 am The Tsay example has a pair of US interest rate series, where (a) cointegration is theoretically suggested and (b) the rank is obvious from the statistics; not a set of general macro series where neither of the above is true. And how would you look at that and tell that there is "no problem"? It gives flat forecasts with very wide error bands. Univariate models would give you the same thing.
The series in tsay3p438.rpf and montevecm.rpf are interest rates. In theory, should the rank be one less than the total number of series tested in JohMLE.src, or at least 1?

Also, why a generalized eigen-analysis and not standard?
TomDoan wrote: Thu Oct 17, 2024 7:59 am In your laundry list of the advantages of a VECM, you are missing one very important point: those can only be helpful if they can be well-estimated from the data (which is a general rule in forecasting, even "correct" models with too many free parameters forecast poorly). You have already demonstrated that, in your data set, they aren't since the most basic statistic, the rank, "changes" part way through the data set.
For macro series, then, it's probably best to use a VAR-in-levels (not a VAR-in-differences, UG-209, and not a VECM) and to forecast with Bayesian techniques (UG-253).
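
Something like this is what I have in mind for the levels BVAR, just as a sketch: a Minnesota-type prior via SPECIFY, with the tightness/weight settings below as placeholders rather than recommendations, and the same placeholder series names as in my loop earlier.

Code: Select all

system(model=bvarmodel)
variables series1 series2 series3 series4 series5
lags 1 to 5
det constant
* Minnesota-type prior; overall tightness and the relative weight on
* other variables are placeholder values, not recommendations.
specify(type=symmetric,tightness=0.1) 0.5
end(system)
estimate
* iend is a placeholder for the end of the estimation sample
forecast(model=bvarmodel,results=bvarfor,steps=8) iend+1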

Re: Model of the US Economy: CointegratedVARModelHandbook

Posted: Tue Oct 22, 2024 3:35 am
by ac_1
In JohMLE.src, the 95% critical values for the trace test are in the FIXED RECT trace95(5,12), whose 5 rows correspond to DET=NONE/[CONSTANT]/TREND/RC/RTREND respectively.

Are there 95% critical values for the maximal-evalue statistics?

Please can you provide reference(s).


Further, I requested this a while back in RATS: I'd like to produce a 3D-plot of forecast distributions similar to:
(i) 'Chart 9': https://www.bankofengland.co.uk/quarter ... -fan-chart
(ii) And densities in R's rgl package; rgl Demos vignette: https://cran.r-project.org/web/packages ... demos.html

As an example, based on the fans generated in https://estima.com/webhelp/topics/gibbsvarbuildrpf.html and the new OS instruction https://estima.com/webhelp/topics/osinstruction.html (I am familiar with R): how do I use the 'cloud of simulations' in gibbsvarbuild.rpf to create a 3D plot of the forecast distributions in R?
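
For what it's worth, the piece I'm missing on the RATS side is just getting the simulated paths out to a file that R can read; something like the following is what I had in mind (a sketch only: fandraws is a placeholder for however the 'cloud of simulations' ends up being stored, e.g. one series per draw for a given variable).

Code: Select all

* fandraws is a placeholder VECT[SERIES] holding the simulated paths
open copy fandraws.csv
copy(format=cdf) / fandraws
close copy
* then read fandraws.csv into R and build the density surface there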

Re: Model of the US Economy: CointegratedVARModelHandbook

Posted: Thu Oct 24, 2024 4:07 pm
by TomDoan
ac_1 wrote: Tue Oct 22, 2024 3:35 am In JohMLE.src the critical values for trace 95% are in fixed rect trace95(5,12), corresponding to the 5 rows: DET=NONE/[CONSTANT]/TREND/RC/RTREND, respectively.

Are there 95% critical values for the maximal-evalue statistics?

Please can you provide reference(s).
Hamilton has those for a limited set of the models (you can probably track back through his references), but the trace tests are the ones you should be using.

Re: Model of the US Economy: CointegratedVARModelHandbook

Posted: Thu Oct 24, 2024 4:12 pm
by TomDoan
ac_1 wrote: Tue Oct 22, 2024 3:35 am Further, I requested this a while back in RATS: I'd like to produce a 3D-plot of forecast distributions similar to:
(i) 'Chart 9': https://www.bankofengland.co.uk/quarter ... -fan-chart
(ii) And densities in R's rgl package; rgl Demos vignette: https://cran.r-project.org/web/packages ... demos.html

As an example, based on the fans generated in https://estima.com/webhelp/topics/gibbsvarbuildrpf.html and the new OS instruction https://estima.com/webhelp/topics/osinstruction.html. (I am familiar with R), how do I use the 'cloud of simulations’ in gibbsvarbuild.rpf to create a 3D plot of the forecast distributions in R?
1. We have no plans to add 3D graphics. 3D graphics are incredibly complicated (particularly labeling) and, in econometrics, applications tend to be fairly gimmicky. Even with an implementation of 3D graphics, the input of options to control it can easily be 8-10 lines long.

2. Have you actually read the BOE description? They don't base those on simulations, but on a committee judgment.

Re: Model of the US Economy: CointegratedVARModelHandbook

Posted: Sun Oct 27, 2024 3:44 am
by ac_1
ac_1 wrote: Sat Oct 19, 2024 5:43 am
TomDoan wrote: Thu Oct 17, 2024 7:59 am The Tsay example has a pair of US interest rate series, where (a) cointegration is theoretically suggested and (b) the rank is obvious from the statistics; not a set of general macro series where neither of the above is true. And how would you look at that and tell that there is "no problem"? It gives flat forecasts with very wide error bands. Univariate models would give you the same thing.
The series in tsay3p438.rpf and montevecm.rpf are interest rates. In theory should the rank be 1 less than the total number of series tested in JohMLE.src, or at least 1?

montevecm.rpf: in theory, using 3 interest rates there should be (3-1)=2 ECTs; empirically, only 1 ECT is significant. How can this be explained?

Code: Select all

Likelihood Based Analysis of Cointegration
Variables:  FTBS3 FTB12 FCM7
Estimated from 1975:07 to 2001:06
Data Points 312 Lags 6 with Constant restricted to Cointegrating Vector

Unrestricted eigenvalues, -T log(1-lambda) and Trace Test
  Roots     Rank    EigVal   Lambda-max  Trace  Trace-95%
        3        0    0.0818    26.6264 41.2013   34.8000
        2        1    0.0333    10.5567 14.5750   19.9900
        1        2    0.0128     4.0183  4.0183    9.1300

Cointegrating Vector for Largest Eigenvalue
FTBS3     FTB12    FCM7      Constant
-3.154123 3.132882 -0.321838   0.619010
These are 3 month, 12 month, and 7 year yields. While there might be some theoretical reason to think that N interest rates (from a single market) might be linked by a single stochastic trend, that's a pretty wide range of maturities, so even if the assumption were true, 25 years of data might not be enough to test it accurately.

Re: Model of the US Economy: CointegratedVARModelHandbook

Posted: Sun Oct 27, 2024 4:04 am
by ac_1
TomDoan wrote: Thu Oct 24, 2024 4:12 pm
ac_1 wrote: Tue Oct 22, 2024 3:35 am Further, I requested this a while back in RATS: I'd like to produce a 3D-plot of forecast distributions similar to:
(i) 'Chart 9': https://www.bankofengland.co.uk/quarter ... -fan-chart
(ii) And densities in R's rgl package; rgl Demos vignette: https://cran.r-project.org/web/packages ... demos.html

As an example, based on the fans generated in https://estima.com/webhelp/topics/gibbsvarbuildrpf.html and the new OS instruction https://estima.com/webhelp/topics/osinstruction.html. (I am familiar with R), how do I use the 'cloud of simulations’ in gibbsvarbuild.rpf to create a 3D plot of the forecast distributions in R?
1. We have no plans to add 3D graphics. 3D graphics are incredibly complicated (particularly labeling) and, in econometrics, applications tend to be fairly gimmicky. Even with an implementation of 3D graphics, the input of options to control it can easily be 8-10 lines long.

2. Have you actually read the BOE description? They don't base those on simulations, but on a committee judgment.

1) Appreciated, but I think it looks useful, especially nowadays.

In RATS, to plot densities at EACH time step I would do something like:

Code: Select all

density(smoothing=1.5) D_1_fcasts(1) 1 ndraws D_1_xf1 D_1_f1
density(smoothing=1.5) D_1_fcasts(2) 1 ndraws D_1_xf2 D_1_f2
etc

scatter 12
# D_1_xf1 D_1_f1 1 ndraws 1
# D_1_xf2 D_1_f2 1 ndraws 2
etc

3D plot via Excel & in R
In graph3daxes.xls I would have, in columns 1 and 2 respectively:
- Q1 to Q12 (say)
- the macro series' min-to-max values

In graph3ddata.xls I would have the densities in each column, at EACH time Q1 to Q12, e.g.
D_1_f1
D_1_f2

Correct? Then...?


2) As described in the BOE Appendix, is it possible to implement the analytical PDF 'which takes into account non-zero skewness' and then generate forecasts with fans in RATS, or is that better done via bootstrapping or more recent methods?


3) Also, can I do in-sample fits & residual analysis using the Bayesian parameters as I would for OLS, i.e.
(a) Fits & Residual Plots
(b) CROSS(qstats,org=columns,from=-8,to=8) resids(1) resids(2) etc
(c) the various multivariate residual tests
(d) IRF and EVD using the Bayesian parameters?

e.g. in gibbsvarbuild.rpf comparing

estimate(ols,sigma)
disp %MODELGETCOEFFS(varmodel)

and then

infobox(action=remove)
disp %MODELGETCOEFFS(varmodel)

the latter being the Bayesian parameters. If so, how do I use the Bayesian parameters instead of the OLS ones?
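
If it is possible, what I had in mind is roughly the following (a sketch only, assuming bmean holds the posterior mean of the coefficients, or a single draw, in the same layout that %MODELGETCOEFFS returns; istart, iend and series1 are placeholders):

Code: Select all

* push the Bayesian coefficients into the model ...
compute %modelsetcoeffs(varmodel,bmean)
* ... then static (one-step) forecasts over the sample give in-sample fits
forecast(model=varmodel,results=bayesfit,static,from=istart,to=iend)
* residuals for the first equation, for the usual residual analysis
set bayesresid1 istart iend = series1-bayesfit(1)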

Re: Model of the US Economy: CointegratedVARModelHandbook

Posted: Sun Oct 27, 2024 10:34 am
by TomDoan
ac_1 wrote: Sun Oct 27, 2024 4:04 am
TomDoan wrote: Thu Oct 24, 2024 4:12 pm
ac_1 wrote: Tue Oct 22, 2024 3:35 am Further, I requested this a while back in RATS: I'd like to produce a 3D-plot of forecast distributions similar to:
(i) 'Chart 9': https://www.bankofengland.co.uk/quarter ... -fan-chart
(ii) And densities in R's rgl package; rgl Demos vignette: https://cran.r-project.org/web/packages ... demos.html

As an example, based on the fans generated in https://estima.com/webhelp/topics/gibbsvarbuildrpf.html and the new OS instruction https://estima.com/webhelp/topics/osinstruction.html. (I am familiar with R), how do I use the 'cloud of simulations’ in gibbsvarbuild.rpf to create a 3D plot of the forecast distributions in R?
1. We have no plans to add 3D graphics. 3D graphics are incredibly complicated (particularly labeling) and, in econometrics, applications tend to be fairly gimmicky. Even with an implementation of 3D graphics, the input of options to control it can easily be 8-10 lines long.

2. Have you actually read the BOE description? They don't base those on simulations, but on a committee judgment.

1) Appreciated, but I think it looks useful, especially nowadays.
What does "especially nowadays" mean?

The point of a graph is to convey information pictorially. In what way is the 3D graph at all more informative than the equivalent fan chart? Fan charts show probabilities over ranges. That's easy to understand---this covers 10%, this 40%,... Densities (aside from the problem of view point/perspective in 3D graphs) aren't as helpful since basically no one can visually integrate. There have been endless examples of graphs that "look cool" but provide misleading or unhelpful information.

Re: Model of the US Economy: CointegratedVARModelHandbook

Posted: Sun Oct 27, 2024 12:16 pm
by ac_1
ac_1 wrote: Sun Oct 27, 2024 3:44 am
ac_1 wrote: Sat Oct 19, 2024 5:43 am
TomDoan wrote: Thu Oct 17, 2024 7:59 am The Tsay example has a pair of US interest rate series, where (a) cointegration is theoretically suggested and (b) the rank is obvious from the statistics; not a set of general macro series where neither of the above is true. And how would you look at that and tell that there is "no problem"? It gives flat forecasts with very wide error bands. Univariate models would give you the same thing.
The series in tsay3p438.rpf and montevecm.rpf are interest rates. In theory should the rank be 1 less than the total number of series tested in JohMLE.src, or at least 1?

montevecm.rpf: in theory using 3 interest rates there should be (3-1)=2 ECT's, empirically there is only 1 is significant ECT. How to explain?

Code: Select all

Likelihood Based Analysis of Cointegration
Variables:  FTBS3 FTB12 FCM7
Estimated from 1975:07 to 2001:06
Data Points 312 Lags 6 with Constant restricted to Cointegrating Vector

Unrestricted eigenvalues, -T log(1-lambda) and Trace Test
  Roots     Rank    EigVal   Lambda-max  Trace  Trace-95%
        3        0    0.0818    26.6264 41.2013   34.8000
        2        1    0.0333    10.5567 14.5750   19.9900
        1        2    0.0128     4.0183  4.0183    9.1300

Cointegrating Vector for Largest Eigenvalue
FTBS3     FTB12    FCM7      Constant
-3.154123 3.132882 -0.321838   0.619010
These are 3 month, 12 month and 7 year yields. While there might be some theoretical reasons to think that N interest rates (from a single market) might be linked with a single stochastic trend that's a pretty wide range of maturities, so even in the assumption were true, 25 years of data might not be enough to accurately test it.

So would it be best to estimate an interest-rate VECM for maturities at the short end only, and for the long end only? I am referring to: Anthony Hall, Heather Anderson and Clive Granger, "A Cointegration Analysis of Treasury Bill Yields", The Review of Economics and Statistics, 1992, vol. 74, issue 1, pp. 116-126.

In other words, should the yields chosen be consecutive maturities along the yield curve?

Re: Model of the US Economy: CointegratedVARModelHandbook

Posted: Sun Oct 27, 2024 12:18 pm
by ac_1
TomDoan wrote: Sun Oct 27, 2024 10:34 am
ac_1 wrote: Sun Oct 27, 2024 4:04 am
TomDoan wrote: Thu Oct 24, 2024 4:12 pm

1. We have no plans to add 3D graphics. 3D graphics are incredibly complicated (particularly labeling) and, in econometrics, applications tend to be fairly gimmicky. Even with an implementation of 3D graphics, the input of options to control it can easily be 8-10 lines long.

2. Have you actually read the BOE description? They don't base those on simulations, but on a committee judgment.

1) Appreciated, but I think it looks useful, especially nowadays.
What does "especially nowadays" mean?

The point of a graph is to convey information pictorially. In what way is the 3D graph at all more informative than the equivalent fan chart? Fan charts show probabilities over ranges. That's easy to understand---this covers 10%, this 40%,... Densities (aside from the problem of view point/perspective in 3D graphs) aren't as helpful since basically no one can visually integrate. There have been endless examples of graphs that "look cool" but provide misleading or unhelpful information.

"especially nowadays" means: it's much easier and less complicated to generate 3D plots in mathematical/statistical software than back in 1998 (if 3D was available), and maybe expected in a publication such as BOE in 2024, if correct.

However, as the authors say:
p.34 The fan chart itself is best understood by looking at Charts 5 (Cross-sectional probability distribution of RPIX inflation with 10% confidence bands) and 6 (fan-chart RPIX inflation projection in August 1997), and
p.35 the 'fan' does not cover 100% of the probability.

And my a priori thought was that cross-sections of the fan are useful.

I have plotted a fan and also the following distributions of the forecasts at EACH time step, separately and overlaid, but mine include three vertical lines for the mean forecast, the actual value (OOS only, not TOOS), and the zero line:
- theoretical distribution: Gaussian (no transformation), log-normal (log transformation), or non-central chi-squared (square-root transformation); see the note at the end of this post
- bootstrap forecasts: a density curve

Is that wrong?
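
To be concrete about what I mean by 'theoretical distribution' (a worked note, not RATS output): if the model is for x = \log y and the h-step forecast of x is normal with mean m and variance s^2, then the implied distribution of y is log-normal with median \exp(m) and mean \exp(m + s^2/2); similarly, if \sqrt{y} is modelled as normal then y/s^2 follows a non-central chi-squared distribution with 1 degree of freedom and non-centrality m^2/s^2; with no transformation the forecast distribution is just the Gaussian itself.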

Re: Model of the US Economy: CointegratedVARModelHandbook

Posted: Mon Oct 28, 2024 3:36 pm
by TomDoan
ac_1 wrote: Sun Oct 27, 2024 12:18 pm "especially nowadays" means: it's much easier and less complicated to generate 3D plots in mathematical/statistical software than back in 1998 (if 3D was available), and maybe expected in a publication such as BOE in 2024, if correct.
That's not true at all. Gauss by the late 1980's could have easily done the graph in the BOE. (It's just a one-off 3D function on a very coarse grid.) Modern desktops can do on-the-fly rotations of objects with arbitrary light sources which would have been the province of supercomputers 30 years ago, but a single projection onto 2D of a 3D surface was readily doable with off-the-shelf computers and software.
ac_1 wrote: Sun Oct 27, 2024 12:18 pm However, as the authors say:
p.34 The fan chart itself is best understood by looking at Charts 5 (Cross-sectional probability distribution of RPIX inflation with 10% confidence bands) and 6 (fan-chart RPIX inflation projection in August 1997), and
I'm pretty sure you're completely misunderstanding what they are saying. That's explaining how they take the density/distribution and create the change points on the fan chart. They're not saying you need to go back to the densities (since there is one per time period) to help you read the fan chart.
ac_1 wrote: Sun Oct 27, 2024 12:18 pm p.35 the 'fan' does not cover 100% of the probability.
NOTHING covers 100% of the probability. (It's an unbounded distribution).

Re: Model of the US Economy: CointegratedVARModelHandbook

Posted: Thu Oct 31, 2024 8:48 am
by TomDoan
ac_1 wrote: Sun Oct 27, 2024 12:16 pm So would it be best to do an interest-rate VECM for maturities at the Short-End only, and the Long-End only? I refer to: Anthony Hall, Heather Anderson and Clive Granger, A Cointegration Analysis of Treasury Bill Yields, The Review of Economics and Statistics, 1992, vol. 74, issue 1, 116-26.

In otherwords, should they be consecutive maturities for the yields chosen, along the yield-curve?
The relationship between bond yields and the term structure is non-linear. What the paper shows is that a linear VECM seems to be adequate for short maturities.

Re: Model of the US Economy: CointegratedVARModelHandbook

Posted: Fri Nov 01, 2024 2:19 am
by ac_1
TomDoan wrote: Thu Oct 31, 2024 8:48 am
ac_1 wrote: Sun Oct 27, 2024 12:16 pm So would it be best to do an interest-rate VECM for maturities at the Short-End only, and the Long-End only? I refer to: Anthony Hall, Heather Anderson and Clive Granger, A Cointegration Analysis of Treasury Bill Yields, The Review of Economics and Statistics, 1992, vol. 74, issue 1, 116-26.

In otherwords, should they be consecutive maturities for the yields chosen, along the yield-curve?
The relationship between bond yields and the term structure is non-linear. What the paper shows is that a linear VECM seems to be adequate for short maturities.
Yes, the paper is very good, and I'd like to try modelling:
- short maturities only
- long maturities only
- and, in the middle, consecutive maturities along the yield curve
but I need a better data source, e.g. Refinitiv Datastream or Bloomberg. If there are comparable free/cheaper alternatives, please let me know.

Thus, I have modelled maturities at a monthly frequency in a small linear VECM, using 'benchmark', most actively traded maturities across the curve (https://en.wikipedia.org/wiki/Yield_curve). Depending on lag length, my results show fewer than (n-1) cointegrating vectors, hence rejecting the hypothesis on p.120 that "yields are cointegrated with (n-1) cointegrating vectors corresponding to ANY set of n yields, and that the cointegrating vectors are the spread vectors", unlike the paper, which does find (n-1). For example, I have found only 1 or 2 significant ECTs ("correcting" the rates back to LR equilibrium), with the remaining combinations behaving like common stochastic trends (i.e. economic factor(s) such as monetary policy or inflation influencing all the rates simultaneously), although with acceptable, if not stellar, OOS forecasts.

Also, the lambda(trace) statistic uses the eigenvalues, which are squared canonical correlations, to estimate the rank. In addition, do I have to test all the ECTs for (non-)stationarity?
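
If it helps make that question concrete, this is how I was thinking of checking an individual ECT (or spread) directly; a sketch, where y1, y2, y3 stand in for my yield series and b0..b3 for one estimated cointegrating vector:

Code: Select all

* b0..b3 are placeholders for one estimated cointegrating vector
compute b0=0.0,b1=1.0,b2=-1.0,b3=0.0
set ect1 = b1*y1+b2*y2+b3*y3+b0
@dfunit(lags=12) ect1
* and, for the spread-vector hypothesis, a spread directly
set spread21 = y2-y1
@dfunit(lags=12) spread21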

I'd be very interested in
- a non-linear VECM
- Bayesian VECM
- other appropriate forecasting methods, e.g. Diebold, F.X. and Li, C. (2006), "Forecasting the Term Structure of Government Bond Yields", Journal of Econometrics, 130, pp. 337-364.
I would be grateful for references and RATS examples. My ultimate aim is always generating forecasts!

Re: Model of the US Economy: CointegratedVARModelHandbook

Posted: Wed Nov 06, 2024 4:10 am
by ac_1
Sorry, I think I have made an error calculating the prediction intervals for VAR and VECM.

I have them as they would be for the univariate case:

Code: Select all

forecast(model=varmodel,results=f_D,stderrs=s_D,from=iend+1,steps=nsteps)

set l95_D(1) iend+1 iend+nsteps = f_D(1)+%invnormal(.025)*s_D(1)
set u95_D(1) iend+1 iend+nsteps = f_D(1)+%invnormal(.975)*s_D(1)
set l80_D(1) iend+1 iend+nsteps = f_D(1)+%invnormal(.1)*s_D(1)
set u80_D(1) iend+1 iend+nsteps = f_D(1)+%invnormal(.9)*s_D(1)
What is the RATS code for the prediction intervals in the multivariate case, applicable to both dynamic multi-step and recursive static one-step-ahead forecasts, assuming no transformations of the series? Or are those correct? I'm thinking of the multivariate normal distribution.
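
For reference, this is what I mean by the multivariate normal (a sketch of my understanding, not a statement of what RATS does internally): if the errors are Gaussian with covariance \Sigma and the coefficients are treated as known, the h-step forecast error is

e_{T+h} = \sum_{s=0}^{h-1} \Psi_s \varepsilon_{T+h-s}, \qquad \operatorname{Cov}(e_{T+h}) = \Sigma_h = \sum_{s=0}^{h-1} \Psi_s \Sigma \Psi_s',

where the \Psi_s are the moving-average coefficients of the model. The marginal distribution for series i is then normal with standard deviation \sqrt{(\Sigma_h)_{ii}}, so per-series intervals of the form forecast + %invnormal(p)*stderr would still apply, with STDERRS presumably supplying those square roots.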