creating error bands
I ran a simple recursive VAR with the following model and then tried to generate impulse responses. When I try to generate error bands, the bands come out looking very weird. How can I correct this, please?
system(model=varmodel)
variables y d p comm_ne nltl_1 lp sp
lags 1 to 2
end(system)
estimate(noprint)
@VARIRF(model=varmodel,steps=15,page=byshocks,noprint)
@montevar(draws=10000, model=varmodel)
The program file is attached with the data.
- Attachments
- bubble_basic3.xls (56 KiB)
- basic_var.RPF (755 Bytes)
Re: creating error bands
You somehow deleted the CALENDAR instruction for the data: there should be a CALENDAR(Q) 1984:4 before the DATA instruction below. So you're only getting 26 data points, which gives you a (very) unstable model.
OPEN DATA "E:\phd\research\RATS\bubble_basic3.xls"
DATA(FORMAT=XLS,ORG=COLUMNS) 1984:04 2009:02 y d p lp sp hp m2 comm_e comm_ne int_rate lp_nd lp_nb nltl $
npa nltl_1 npa_1
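As I read the fix, a minimal sketch of the restored header (path and series list copied from above) would be:
* Restore the CALENDAR so RATS knows the data are quarterly starting in 1984:4
CALENDAR(Q) 1984:4
OPEN DATA "E:\phd\research\RATS\bubble_basic3.xls"
DATA(FORMAT=XLS,ORG=COLUMNS) 1984:04 2009:02 y d p lp sp hp m2 comm_e comm_ne int_rate lp_nd lp_nb nltl $
npa nltl_1 npa_1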
Step 1. Always check your data.
Re: creating error bands
Thank you, but the attached program file does contain the CALENDAR instruction, and it still gives very weird impulses for a very standard identification in the literature.
- Attachments
- basic_var.RPF (838 Bytes)
Re: creating error bands
Chop a year off the data set and you'll get perfectly reasonable results. The model goes unstable trying to deal with the 2008 data for LP, SP, and probably INT_RATE. (The point estimates are stable, but the MC draws generally aren't.)
You also left the DET CONSTANT out of the VAR definition.
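Putting the two fixes together, a sketch of the corrected setup (2008:2 is just the sample end minus one year, per the suggestion above; pick whatever end date works for you):
system(model=varmodel)
variables y d p comm_ne nltl_1 lp sp
lags 1 to 2
det constant
end(system)
* Re-estimate, dropping the final year of data
estimate(noprint) * 2008:2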
Re: creating error bands
Thank you, now I am getting more consistent results and good IRFs. One last question: how did you realise that the problem lies in the data for LP, SP, and INT_RATE and not the other variables?
Re: creating error bands
Your original model had NLTL_1 in place of the interest rate. Now, I don't know what that series is, and it looks a lot like it may have seasonal effects, but with that in the model (and the data corrected), you get reasonably well-behaved IRFs. (Probably because of the seasonality, it can't really interact with the other variables.) Thus I was puzzled by the "very weird" remarks until I ran your second spec. That obviously drew attention to the interest rate series to start, and then I checked the others. So it appears that the problem with the explosive roots is due to an interaction among the last three variables; most likely you need the interest rate to transmit the explosiveness to the rest of the variables.
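For checking this sort of instability directly, one sketch (assuming the %MODELCOMPANION function and the CVALUES option of EIGEN, which recent RATS versions provide) is to look at the moduli of the companion-matrix roots after ESTIMATE:
* Companion-matrix roots: a stable VAR has all moduli below 1.0,
* while MC draws pushing a root past 1.0 produce explosive IRFs.
compute comp = %modelcompanion(varmodel)
eigen(cvalues=roots) comp
do i=1,%rows(roots)
   disp "root" i "modulus =" %cabs(roots(i))
end do i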
Re: creating error bands
I seasonally adjusted the data (using ESMOOTH) and then ran the same VAR regressions as before, up to 2007. When we look at the impulse responses, the response sometimes goes outside the bounds (Graphs 1 and 2). Is that natural, or is there an issue? Further, puzzles appear in the standard specification (y d p comm_e int_rate sp) which were not seen when we used non-de-seasonalised data (e.g. the response of prices to interest rate shocks and to provisions shocks).
Data is in sheet 3 of seas.xls.
y - log GDP
d - log dividends
p - log prices
comm_e - log commodity prices (energy)
comm_ne - log commodity prices (non-energy)
int_rate - interest rate
lp - provisions
sp - share price
spec: y d p comm_e int_rate sp
So this is a standard specification in the literature.
- Attachments
- seas.xls (data, 148.5 KiB)
- int_rate_resp_comne.RGF (graph, 42.02 KiB)
- int_rate_resp_come.RGF (graph, 42.17 KiB)
Re: creating error bands
You'll have to post the program that actually generates the responses that you find problematic.
Re: creating error bands
Attached is the program.
- Attachments
- basic_var.RPF (815 Bytes)
Re: creating error bands
First off, you have nowhere near enough data to even think about 12 lags. (And even with a lot more data, 12 lags with quarterly data is generally too big for a VAR.) And one lag probably isn't enough; you get better results with 2. This isn't US data, is it? If it is, you have a problem with your stock price data: there wasn't a crash at the beginning of 1984. If it isn't US data, then expecting results similar to those with US data is unrealistic.
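For picking the lag length more systematically, a sketch using the @VARLagSelect procedure (the maximum of 4 lags and the 2007:4 end date are illustrative; check the procedure's options in your RATS version):
* Compare AIC across lag lengths for the six-variable spec
@varlagselect(lags=4,crit=aic) * 2007:4
# y d p comm_e int_rate sp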
Re: creating error bands
First of all, it is US data. The stock price data is from Robert Shiller's website; he publishes dividend and stock price data together and it is widely used. My question is: when I use non-de-seasonalised data, the results are fine, but when I deseasonalise, we have problems, as you can see.
Also, 12 lags is given as an option for VAR lag selection with quarterly data, and this is generally the procedure in most papers (the Bank of England also suggests this), and it generally recommends 1 to 2 lags (AIC). And hardly any macroeconomic data set has more than 90-100 data points, so this is a general issue widely discussed in the literature. You can see my data set begins in 1984, and we have to exclude data beyond 2007 because it gives weird results (as you suggested).
Re: creating error bands
To clarify things, attached is a file with both the original data and the de-seasonalised data. There is no stock market crash in the original data (column 2), but the de-seasonalised data shows a crash (column 1). De-seasonalisation has been carried out using ESMOOTH in RATS. Is it reliable?
esmooth(seasonal=multiplicative,estimate,$
smoothed=sp_s,fitted=fit_sp) sp 1984:01 2015:04
- Attachments
- sp_s2.XLSX (12.45 KiB)
Re: creating error bands
First of all, seasonally adjusting stock prices (or interest rates or exchange rates) is a bad idea. If a series like that had any significant seasonality, people could make a fortune trading on it. Second, despite the name, ESMOOTH is not a smoothing procedure; it's a filtering procedure, so it is very imprecise at the start of a sample. Which data do you have that isn't seasonally adjusted at the source?
Re: creating error bands
Many of my variables are not seasonally adjusted, namely: d (dividends), comm_e (commodity prices, energy), comm_ne (commodity prices, non-energy), lp (provisions), sp (stock prices), hp (house prices), nltl (net loans to total loans), and npa.
Re: creating error bands
And how many of those would be expected to have any seasonality? Maybe the two commodity price series? Don't seasonally adjust series which don't have seasonality. Do-it-yourself seasonal adjustment is a last resort.
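As a rough pre-check before adjusting anything, a sketch (the choice of comm_e and the growth-rate transformation are just illustrative) is to regress a series' growth rate on seasonal dummies and test their joint significance:
* comm_e is already in logs, so the first difference is a growth rate
set dcomm = comm_e - comm_e{1}
* SEASONAL creates a quarterly dummy; leads 0 to -2 give three dummies
seasonal seasons
linreg(noprint) dcomm
# constant seasons{0 to -2}
* Joint test: if this isn't significant, don't seasonally adjust
exclude(title="Seasonal dummies in comm_e growth")
# seasons{0 to -2}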