function [z_hat_1,beta_hats,et,MSE] = locally_constant_trigonometric_model(Z,s,m,omega,beta_0)
% LOCALLY_CONSTANT_TRIGONOMETRIC_MODEL - Compute the recursive updates of the model coefficients \hat{\beta}_n given the time series Z
%
% Input:
%   Z      : the time series
%   s      : the number of seasons (the seasonal period, e.g. s = 12 for monthly data)
%   m      : the number of trigonometric functions to consider
%   omega  : the relaxation (discount) coefficient
%   beta_0 : an initial guess at the local linear seasonal coefficients
%            (estimated from the entire data set if not provided)
%
% Outputs:
%   z_hat_1   : the one-step-ahead predictions
%   beta_hats : the estimated betas at each timestep
%   et        : the one-step-ahead residuals
%   MSE       : the mean square error
%
% Our assumed model is a linear trend with "m" seasonal trigonometric functions,
% whose coefficients are held locally constant (updated by discounted least squares):
%
%   z_{n+j} = \beta_0 + \beta_1 j + \sum_{i=1}^{m} \left( \beta_{1i} \sin( f_i j ) + \beta_{2i} \cos( f_i j ) \right) + \varepsilon_{n+j} ,
%
% with f_i = \frac{ 2 \pi i }{ s } .
%
% See the book by Abraham and Ledolter, page 164.
%
% Written by:
% --
% John L. Weatherwax 2009-06-01
%
% email: wax@alum.mit.edu
%
% Please send comments and especially bug reports to the above email address.
%
%-----

n = length(Z);

% 1) Get an initial guess for \hat{\beta}, i.e. \hat{\beta}_0, by estimating it from all available data:
%
X = gen_global_linear_seasonal_trigonometric_X(s,m,n);
if( nargin < 5 )
  [beta_0,sC,s_hat,t_stats,SSE,SSR,SSTO,R2,MSR,MSE,F_ratio,MI] = wwx_regression( X, Z, 0 );
end

% 2) Get the F and L matrices (for a linear trend k=1):
%
F = gen_seasonal_trigonometric_F(s,m,omega);
k = 1;
L = gen_seasonal_trigonometric_L(k,s,m);

% The fitting vector evaluated at j=0: the constant term and the cosine
% terms equal one (since cos(0)=1); the trend and sine terms are zero.
f_0 = zeros(2+2*m,1);
f_0(1) = 1;
f_0(4:2:(2+2*m)) = 1;
fInvOnF_0 = F \ f_0; % F^{-1} f(0), the gain vector in the recursive update
f_1 = X(1,:);        % the fitting vector used for the one-step-ahead prediction

% 3) Iterate over the data in the time series Z (the first point is done by hand).
%
% beta_hats(:,t) is \hat{\beta}_t for t=1,2,\cdots,n.
% The matlab array element z_hat_1(t) is \hat{z}_{t-1}(1), the one-step-ahead
% prediction of z_t made after receiving measurement t-1. That is:
%
%   z_hat_1(1) = the prediction of z_1 made before seeing any data
%   z_hat_1(2) = the prediction of z_2 made after seeing z_1
%   z_hat_1(3) = the prediction of z_3 made after seeing z_1 and z_2
%
% Thus plotting z_hat_1 against Z shows how well the one-step-ahead predictions
% track the data.
%
% The update applied at each step is the general exponential smoothing recursion
%
%   \hat{\beta}_t = L' \hat{\beta}_{t-1} + F^{-1} f(0) \left( z_t - \hat{z}_{t-1}(1) \right) .
%
beta_hats = zeros(2+2*m,n);
z_hat_1 = zeros(n,1);
z_hat_1(1) = f_1 * beta_0(:);
beta_hats(:,1) = (L.') * beta_0(:) + fInvOnF_0 * ( Z(1) - z_hat_1(1) );
for ii=2:n
  z_hat_1(ii) = f_1 * beta_hats(:,ii-1);
  innovation = fInvOnF_0 * ( Z(ii) - z_hat_1(ii) );
  beta_hats(:,ii) = (L.') * beta_hats(:,ii-1) + innovation;
end

et = Z(:) - z_hat_1; % the one-step-ahead errors (Z(:) guards against a row-vector input)
MSE = mean(et.^2);   % the mean square error
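
% ----- Usage sketch -----
% A minimal example of driving the routine above. This is a sketch, not part
% of the original file, and it assumes the helper functions called inside
% (gen_global_linear_seasonal_trigonometric_X, gen_seasonal_trigonometric_F,
% gen_seasonal_trigonometric_L, and wwx_regression) are available on the
% MATLAB path. We simulate a hypothetical monthly series with a linear trend
% plus one annual harmonic and then inspect the one-step-ahead fit:
%
%   s = 12; m = 1; omega = 0.9;   % monthly data, one harmonic, discount factor 0.9
%   t = (1:120).';                % ten years of synthetic observations
%   Z = 10 + 0.05*t + 2*sin(2*pi*t/s) + 0.5*randn(120,1);
%   [z_hat_1,beta_hats,et,MSE] = locally_constant_trigonometric_model(Z,s,m,omega);
%   plot(t,Z,'b.-',t,z_hat_1,'r.-'); legend('data','one-step-ahead prediction');
%   fprintf('one-step-ahead MSE = %f\n',MSE);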