The last two weeks have been a rollercoaster ride.
We have written extensive modules for KalmanFilter (Statsample::TimeSeries::Arima::KalmanFilter) and LogLikelihood (Statsample::TimeSeries::Arima::KF::LogLikelihood).
I am way too grateful to Claudio for his uber-awesome support and guidance. While implementing the log-likelihood, I understood why he asked me to go through GSL minimization in the first place. Thanks! :)
KalmanFilter enables us to fit an ARIMA(p, d, q) model to a series, where:
- p = order of the autoregressive part.
- d = degree of differencing (the integrated part).
- q = order of the moving average part.
The filter finds the autoregressive and moving average coefficients for the given series and orders.
In the previous phase, we were working on simulations of the ARIMA model and manually provided the phi and theta coefficients to the simulator.
KalmanFilter removes that dependency: by minimizing the log-likelihood of the series with a simplex algorithm, it estimates the ARIMA coefficients for a given series.
Use Case:
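Here is a minimal sketch of the kind of call we mean. The exact names (the to_ts conversion, Statsample::TimeSeries.arima and the ar/ma readers) are illustrative and may differ slightly from the released API:

```ruby
require 'statsample-timeseries'

# A toy series -- in practice this would be your observed data.
series = 100.times.map { rand(100) }.to_ts

# Fit ARIMA(p = 1, d = 0, q = 1) via the Kalman filter.
# Internally the filter minimizes the log-likelihood of the series
# with the simplex algorithm and returns the estimated coefficients.
model = Statsample::TimeSeries.arima(series, 1, 0, 1)

model.ar   # => estimated autoregressive (phi) coefficients
model.ma   # => estimated moving average (theta) coefficients
```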
The source code of KalmanFilter can be found here.
Now, as the name suggests, LogLikelihood generates the log-likelihood and a few other attributes of the series.
With the LogLikelihood class, we generate many internal matrices (we even added a few utility functions to make the computation easier). Given the coefficients, order and series, this class calculates sigma, the log-likelihood and the AIC (Akaike Information Criterion) of the series.
LogLikelihood is an important class: it is the objective that KalmanFilter repeatedly minimizes, which in turn yields the estimated ARIMA parameters.
Use Case:
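Again, as a rough sketch only; the constructor arguments and reader names below are my shorthand and should be checked against the actual class:

```ruby
require 'statsample-timeseries'

series = 100.times.map { rand(100) }.to_ts

# Evaluate the log-likelihood for a candidate set of coefficients:
# here one AR parameter and one MA parameter for an ARMA(1, 1) fit.
ll = Statsample::TimeSeries::Arima::KF::LogLikelihood.new([0.3, 0.2], series, 1, 1)

ll.log_likelihood   # => log-likelihood of the series for these coefficients
ll.sigma            # => estimated sigma
ll.aic              # => Akaike Information Criterion
```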
The source code of LogLikelihood can be found here.
The tests for both pass all the cases.
In between, we have written detailed documentation for pretty much everything in statsample-timeseries.
With that, we have also bumped the gem version. Go ahead: gem install statsample-timeseries
Cheers,
Ankur Goel