
STAR models

Posted: Fri Apr 08, 2016 4:35 pm
by Hayet-
Hi,
I'm trying to use Terasvirta's procedure for the smooth transition (STAR) model, but the parameters of the nonlinear part are not being estimated (PHI2 = 0.000 in every case).
The code I'm using is:

Code:

CALENDAR(M) 1995:1
OPEN DATA "C:\Users\LAPTOP\Desktop\RATS\lnTCER.RAT"
DATA(FORMAT=RATS) 1995:01 2012:11 LNTCER_EGYPTE
stat LNTCER_EGYPTE
*
*
GRAPH(STYLE=LINE,HEADER="Real effective exchange rate series - Egypt") 1
# LNTCER_EGYPTE
set y = LNTCER_EGYPTE
set x = y-y{1}
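* x is the first difference of the log real effective exchange rate
* Dickey-Fuller unit root test on the level series y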
   @DFUNIT y
* Optimal order for the AR(p)
* The YuleLags procedure does a quick, efficient examination of a range
* of AR models for stationary data.
@yulelags(max=20) x
*
*
* The STAR tests for delays d = 1 through 9
*
do d=1,9
   @StarTest(p=3,d=d) x
end do d
*
nlpar(exactlinesearch)
stats x
*
compute scalef=1.0/sqrt(%variance)
*compute scalef=1.8
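* Transition parameters: slope gamma and threshold c for the logistic transition in y{7}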
nonlin(parmset=starparms) gamma c
frml flstar = %logistic(1.8*gamma*(y{7}-c),1.0)
compute c=%mean,gamma=2.0
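* Linear-part and transition-part equations: constant plus four lags of x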
equation standard x
# constant  x{1 to 4}
equation transit x
# constant x{1 to 4}
*
* Convert the linear equations into FRML's with phi1 and phi2 as the
* coefficient vectors.
*
frml(equation=standard,vector=phi1) phi1f
frml(equation=transit ,vector=phi2) phi2f
frml star x = f=flstar,phi1f+f*phi2f
*
nonlin(parmset=regparms) phi1 phi2
nonlin(parmset=starparms) gamma c
nlls(parmset=regparms,frml=star) x
*
* Based upon the initial results, the standard equation is trimmed to
* just y{1} and transit to y{2 3 4 10 11} (The article shows lag 9
* rather than 10, but this specification fits quite a bit better). This
* is now estimated with all the parameters.
*
equation standard x
# constant y{1} x{1 to 4}
equation transit x
# constant y{1} x{1 to 4}
frml(equation=standard,vector=phi1) phi1f
frml(equation=transit ,vector=phi2) phi2f
*
*
* The new estimation, now including the STAR parameters
compute c=%mean,gamma=2.0
nlls(parmset=regparms+starparms,frml=star) x
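
For reference, the specification these FRMLs set up appears to be the two-regime logistic STAR model (delay d = 7, constant plus four lags of x in each part):

x_t = \phi_1' w_t + G(y_{t-7};\gamma,c)\,\phi_2' w_t + \varepsilon_t, \qquad G(s;\gamma,c) = \frac{1}{1+\exp\left(-1.8\,\gamma\,(s-c)\right)},

with w_t = (1, x_{t-1}, \ldots, x_{t-4})', where 1.8 is the hard-coded scale factor used in flstar.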

Re: STAR models

Posted: Fri Apr 08, 2016 5:01 pm
by TomDoan
Why is y{1} added to the model half-way through the analysis?

You're using a hard-coded value of 1.8 as the scale factor in the flstar formula rather than scalef. 1.8 is for Terasvirta's data and may be completely wrong for yours.
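
That is, keeping everything else in the original code the same, the transition formula would use the scalef already computed from the STATISTICS output rather than the fixed 1.8, something like:

Code:

frml flstar = %logistic(scalef*gamma*(y{7}-c),1.0)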

Re: STAR models

Posted: Sat Apr 09, 2016 3:58 am
by Hayet-
Thanks, Tom.

But the first time I did get some results. And according to the model, we divide by the standard deviation if gamma is high.

Re: STAR models

Posted: Sat Apr 09, 2016 4:05 am
by Hayet-
I re-executed the program dividing by the standard deviation, but I get the same results:

Statistics on Series X
Monthly Data From 1995:02 To 2012:11
Observations              214
Sample Mean            0.000814      Variance                 0.000433
Standard Error         0.020807      SE of Sample Mean        0.001422
t-Statistic (Mean=0)   0.572199      Signif Level (Mean=0)    0.567790
Skewness              -2.565172      Signif Level (Sk=0)      0.000000
Kurtosis (excess)     18.956202      Signif Level (Ku=0)      0.000000
Jarque-Bera         3438.783838      Signif Level (JB=0)      0.000000

## NL6. NONLIN Parameter PHI1(1) Has Not Been Initialized. Trying 0
## NL6. NONLIN Parameter PHI1(2) Has Not Been Initialized. Trying 0
## NL6. NONLIN Parameter PHI1(3) Has Not Been Initialized. Trying 0
## NL6. NONLIN Parameter PHI1(4) Has Not Been Initialized. Trying 0
## NL6. NONLIN Parameter PHI1(5) Has Not Been Initialized. Trying 0
## NL6. NONLIN Parameter PHI2(1) Has Not Been Initialized. Trying 0
## NL6. NONLIN Parameter PHI2(2) Has Not Been Initialized. Trying 0
## NL6. NONLIN Parameter PHI2(3) Has Not Been Initialized. Trying 0
## NL6. NONLIN Parameter PHI2(4) Has Not Been Initialized. Trying 0
## NL6. NONLIN Parameter PHI2(5) Has Not Been Initialized. Trying 0

Nonlinear Least Squares - Estimation by Gauss-Newton
Convergence in 1 Iterations. Final criterion was 0.0000000 <= 0.0000100
Dependent Variable X
Monthly Data From 1995:02 To 2012:11
Usable Observations 208
Degrees of Freedom 198
Skipped/Missing (from 214) 6
Centered R^2 0.1401544
R-Bar^2 0.1010705
Uncentered R^2 0.1423529
Mean of Dependent Variable 0.0010599872
Std Error of Dependent Variable 0.0209862836
Standard Error of Estimate 0.0198974930
Sum of Squared Residuals 0.0783902250
Regression F(9,198) 3.5860
Significance Level of F 0.0003724
Log Likelihood 524.7546
Durbin-Watson Statistic 1.9670

Variable Coeff Std Error T-Stat Signif
************************************************************************************
1. PHI1(1) 0.000637915 0.001383206 0.46119 0.64517121
2. PHI1(2) 0.323371941 0.071420425 4.52772 0.00001028
3. PHI1(3) -0.099135436 0.073264202 -1.35312 0.17755943
4. PHI1(4) 0.224112910 0.073393135 3.05359 0.00257201
5. PHI1(5) -0.002970890 0.071518769 -0.04154 0.96690726
6. PHI2(1) 0.000000000 0.000000000 0.00000 0.00000000
7. PHI2(2) 0.000000000 0.000000000 0.00000 0.00000000
8. PHI2(3) 0.000000000 0.000000000 0.00000 0.00000000
9. PHI2(4) 0.000000000 0.000000000 0.00000 0.00000000
10. PHI2(5) 0.000000000 0.000000000 0.00000 0.00000000

Re: STAR models

Posted: Sat Apr 09, 2016 8:36 am
by TomDoan
You're using lagged Y as the threshold variable, but the statistics are on X. You need the statistics on Y to get the guess values and scale factor for the threshold. Since the mean of X has little similarity to the mean of Y, the threshold at the guess values has all the data points in one regime.
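
Concretely, a sketch of the corrected setup along those lines (taking the statistics on y, the threshold variable, before computing the guess values, and using scalef in place of the hard-coded 1.8) would be:

Code:

* Statistics on the threshold variable y, not on x
stats y
compute scalef=1.0/sqrt(%variance)
* Guess values for the threshold location also come from y
compute c=%mean,gamma=2.0
frml flstar = %logistic(scalef*gamma*(y{7}-c),1.0)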

Re: STAR models

Posted: Sat Apr 09, 2016 8:50 am
by Hayet-
Thank you, great.
I have another question: for the Terasvirta procedure, is it correct to divide by the standard error inside the transition function?

flstar = %logistic(scalef*gamma*(x{7}-c),1.0) with scalef = 1/standard error ?

Re: STAR models

Posted: Sat Apr 09, 2016 9:47 am
by TomDoan
That's not required. As described in the User's Guide, that's convenient because it takes the scale-dependence off the GAMMA parameter.
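
In other words, with scalef equal to one over the sample standard deviation of the threshold variable, the transition function becomes

G(y_{t-d};\gamma,c) = \frac{1}{1+\exp\left(-\gamma\,(y_{t-d}-c)/\hat\sigma_y\right)},

so gamma is measured relative to \hat\sigma_y rather than in the raw units of the data. It's just a reparameterization; the fitted model is the same, only the scale of the reported gamma changes.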

Re: STAR models

Posted: Sat Apr 09, 2016 10:38 am
by Hayet-
I would like to use the ST-GARCH model; do you know this model?

Re: STAR models

Posted: Sat Apr 09, 2016 1:39 pm
by TomDoan
There's a brief discussion at https://estima.com/forum/viewtopic.php?f=38&t=1237 which has a reference to an example from Tsay that does an ST-ARCH model. I'm aware of it, but I have never seen an example of its use which seems to show that it's worth attention.