Time Series Assignment 2

The document provides an overview of regression analysis, including its purpose, types (simple linear, multiple linear, polynomial, and exponential regression), and methods like least squares regression. It explains key concepts such as total variation (SST), explained variation (SSR), and unexplained variation (SSE), as well as steps for hypothesis testing and prediction in regression models. Additionally, it discusses variable selection methods and parameter estimation in time series regression models.

Assignment No. 2: Regression Analysis and Forecasting

1. What is regression? Write the types of regression.

a) Regression
i) Regression analysis is a supervised learning analysis, where supervised learning means analyzing or predicting data based on previously available (past) data.
ii) Purpose of regression analysis: the end result of a regression analysis study is often to generate a model that can be used to forecast or predict future values of the response variable, given specified values of the predictor variable.
b) Types of Regression
i) Simple Linear Regression:
- In simple linear regression there is only one dependent and one independent variable.
- Simple linear regression analysis is mostly used in weather forecasting, financial analysis, and market analysis.
- The regression model allows for a linear relationship between the forecast variable y and a single predictor variable x.

ii) Multiple Linear Regression:
- Multiple linear regression analysis gives the relationship between two or more independent variables and a dependent variable.
- Multiple linear regression can be represented as a hyperplane in multidimensional space.
- It is also a linear type of regression analysis.
- When there are two or more predictor variables, the model is called multiple regression.

iii) Polynomial Regression:
- Polynomial regression analysis is a non-linear regression analysis.
- Polynomial regression analysis is the extension of simple linear regression analysis, obtained by adding extra independent variables formed by raising the predictor to higher powers:
  y = a0 + a1x + a2x^2 + ... + an x^n
iv) Exponential Regression:
- Exponential regression is a non-linear type of regression.
- Exponential regression can be expressed as y = a * e^(bx).
- Exponential regression can be used in fields such as finance, biology, and physics.
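The model types above can be sketched in R (the language used later in this assignment). This is only an illustrative sketch with made-up data; the data frame df and its columns x and y are assumptions, not part of the assignment.

# Illustrative sketch: fitting the regression types above (made-up data)
df <- data.frame(x = 1:20)
df$y <- 3 + 2 * df$x + rnorm(20)

simple_fit <- lm(y ~ x, data = df)                       # simple linear regression
poly_fit   <- lm(y ~ poly(x, 3, raw = TRUE), data = df)  # polynomial regression (degree 3)
exp_fit    <- lm(log(y) ~ x, data = df)                  # exponential regression via log(y) = log(a) + b*x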

2. What is the least squares regression method?

i) Least squares is a method used to apply linear regression.
ii) It helps us predict results based on an existing set of data, as well as clear anomalies in our data.
iii) Anomalies are values that are too good, or bad, to be true, or that represent rare cases.
iv) We have a collection of observations, but we do not know the values of the coefficients β0, β1, ..., βk. These need to be estimated from the data.
v) The least squares principle provides a way of choosing the coefficients effectively by minimizing the sum of squared errors. That is, we choose the values of β0, β1, ..., βk that minimize
   Σ e_t^2   (t = 1, ..., T)
vi) This is called least squares estimation because it gives the least value for the sum of squared errors.
vii) Finding the best estimates of the coefficients is often called "fitting" the model to the data, or sometimes "learning" or "training" the model.
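A minimal sketch of the least squares idea in R, using made-up numbers (the vectors x and y below are assumptions): lm() returns the coefficients that minimize the sum of squared errors.

# Least squares estimation sketch (made-up data)
x <- c(1, 2, 3, 4, 5)
y <- c(2.1, 3.9, 6.2, 8.1, 9.8)

fit <- lm(y ~ x)          # lm() chooses β0, β1 to minimize the sum of squared errors
coef(fit)                 # estimated intercept (β0) and slope (β1)
sum(residuals(fit)^2)     # the minimized sum of squared errors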

3. Explain SST, SSR, SSE, and R².

1) SST (Total Sum of Squares)
- SST represents the total variation in the dependent variable. It is calculated as the sum of the squared differences between each data point and the mean of the dependent variable.
- It is used to calculate the Sum of Squares Total (SST).
2) SSR (Regression Sum of Squares)
- It is used to calculate the Sum of Squares due to Regression (SSR).
- SSR represents the variation in the dependent variable that your model explains. It is calculated as the sum of squared differences between the predicted values from your model and the mean of the dependent variable.
3) SSE (Residual Sum of Squares)
- SSE represents the unexplained variation, or error, in your model.
- It is calculated as the sum of squared differences between the actual data points and the predicted values from your model.
4) R² (R-squared)
- R-squared is calculated as the ratio of SSR to SST. In other words, it tells you what proportion of the total variation in the dependent variable is explained by your model.
- The formula for R-squared is: R² = 1 − (SSE / SST)
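An illustrative R sketch (made-up data, not part of the assignment) showing how SST, SSR, SSE, and R² relate for a fitted linear model:

# SST / SSR / SSE / R-squared sketch (made-up data)
x <- c(1, 2, 3, 4, 5, 6)
y <- c(2.0, 4.1, 5.9, 8.2, 9.8, 12.1)
fit <- lm(y ~ x)

sst <- sum((y - mean(y))^2)             # total variation
ssr <- sum((fitted(fit) - mean(y))^2)   # variation explained by the model
sse <- sum((y - fitted(fit))^2)         # unexplained variation

r_squared <- 1 - sse / sst              # same as ssr / sst for a linear model with intercept
all.equal(r_squared, summary(fit)$r.squared)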

4. Explain the steps to conduct a hypothesis test on a regression coefficient.

a) Steps to conduct a hypothesis test on a regression coefficient:

1. Write down the null hypothesis that there is no relationship between the dependent variable y and the independent variable xi:
   H0: βi = 0
2. Write down the alternative hypothesis that there is a relationship between the dependent variable y and the independent variable xi:
   Ha: βi ≠ 0
3. Collect the sample information for the test and identify the significance level α.
4. The p-value is the sum of the area in the tails of the t-distribution. The t-score and degrees of freedom are:
   t = (bi − βi) / s_bi,    df = n − k

5. Compare the p-value to the significance level α and state the outcome of the test:
- If p-value ≤ α, reject H0 in favour of Ha.
  a) The results of the sample data are significant. There is sufficient evidence to conclude that the null hypothesis H0 is an incorrect belief and that the alternative hypothesis Ha is most likely correct.
- If p-value > α, do not reject H0.
  b) The results of the sample data are not significant. There is not sufficient evidence to conclude that the alternative hypothesis Ha may be correct.
6. Write down a concluding sentence specific to the context of the question.
- The required t-score and p-value for the test can be found on the regression summary table, which we learned how to generate in Excel in a previous section.
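The same t-scores and p-values also appear in a regression summary in R. A small illustrative sketch using the built-in mtcars data (the choice of predictors is an assumption for the example):

# Coefficient hypothesis tests from a regression summary (illustrative)
fit <- lm(mpg ~ wt + hp, data = mtcars)
summary(fit)$coefficients   # columns: Estimate, Std. Error, t value, Pr(>|t|)
# t value = Estimate / Std. Error, and Pr(>|t|) is the two-tailed p-value;
# compare it with the chosen significance level (e.g. 0.05) to decide whether to reject H0: βi = 0.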

5. Explain the steps for prediction.

1. Collect data relevant to the problem.
- Gather as much data as possible that is related to the problem you're trying to solve. More data can help capture the variability in the problem domain, improving the model's predictive ability.
2. Clean, augment, and preprocess the data.
- Before building a model, the data needs to be cleaned, augmented, and preprocessed to make it suitable for analysis.
- Properly cleaned and prepared data helps the model learn patterns more effectively and reduces the risk of bias or overfitting.
3. Conduct an exploratory analysis of the data.
- Perform exploratory data analysis (EDA) to understand the underlying patterns, relationships, and distributions within the data.
- EDA helps identify key features, detect outliers, and understand correlations that will inform model building.
4. Construct a model based on the data.
- Use insights gained from the exploratory analysis to guide the construction of a predictive model.
- The model is the core of the prediction process, transforming input data into predictions based on learned patterns.
5. Use the model to answer the initial question and validate the results.
- Apply the model to new data or test data to make predictions, and then validate the accuracy.
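A minimal R sketch of the final step, assuming a simple train/test split of the built-in mtcars data (the split and predictors are assumptions for illustration):

# Predicting on new data and validating accuracy (illustrative)
train <- mtcars[1:24, ]
test  <- mtcars[25:32, ]

fit  <- lm(mpg ~ wt + hp, data = train)    # model built on the training data
pred <- predict(fit, newdata = test)       # apply the model to unseen data

rmse <- sqrt(mean((test$mpg - pred)^2))    # validate accuracy with root mean squared error
rmse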
6. Outline the types of residuals.

1) Residual
- A residual is a measure of how far away a point is vertically from the regression line.
- Simply, it is the error between a predicted value and the observed actual value:
  Residual: e_i = y_i − ŷ_i
2) Types of residuals
a) Standardized Residuals
- These are the residuals divided by an estimate of their standard deviation; for a linear regression model this is often the standard error of the residuals.
  r_i = e_i / s
  where e_i is the residual for observation i and s is the estimated standard deviation.
b) Studentized Residuals
- These are similar to standardized residuals but are more accurate for small sample sizes.
  t_i = e_i / (s * sqrt(1 − h_ii))
  where h_ii is the leverage of observation i.
c) Pearson Residuals
- Commonly used in the context of generalized linear models (GLMs), these residuals are the difference between the observed and fitted values, scaled by the square root of the estimated variance of the fitted value.
d) Deviance Residuals
- Used in generalized linear models (GLMs), these are the signed square roots of each observation's contribution to the model's deviance:
  d_i = sign(y_i − ŷ_i) * sqrt(2 * [y_i * log(y_i / ŷ_i) − (y_i − ŷ_i)])   (Poisson case)
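An illustrative sketch of how these residual types are obtained in R, using built-in functions (the Poisson example data is made up for illustration):

# Residual types in R (illustrative)
fit_lm <- lm(mpg ~ wt, data = mtcars)
residuals(fit_lm)    # raw residuals e_i = y_i - fitted value
rstandard(fit_lm)    # standardized residuals
rstudent(fit_lm)     # studentized residuals

# Pearson and deviance residuals for a GLM (made-up Poisson data)
df <- data.frame(x = 1:30, y = rpois(30, lambda = exp(0.1 * (1:30))))
fit_glm <- glm(y ~ x, data = df, family = poisson)
residuals(fit_glm, type = "pearson")
residuals(fit_glm, type = "deviance")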

7. Explain variable selection methods in regression.

a) Variable Selection Methods
- Variable selection is a crucial aspect of building regression models, as it helps in identifying the most relevant predictors and improves generalizability.

b) Types of variable selection (an illustrative R sketch follows this list)
1) Stepwise Selection
- Stepwise selection methods iteratively add or remove variables based on specific criteria.
- There are three main types:
  a. Forward selection
  b. Backward elimination
  c. Stepwise selection (both directions)
2) Lasso Regression
- Lasso regression is a regularization technique that includes a penalty proportional to the absolute value of the coefficients.
3) Ridge Regression
- Ridge regression is another regularization technique that includes a penalty proportional to the square of the coefficients.
4) Elastic Net
- Elastic Net combines the penalties of both lasso and ridge regression.
5) Principal Component Analysis (PCA)
- PCA is a dimensionality reduction technique that transforms the predictors into a set of uncorrelated components.
6) Recursive Feature Elimination (RFE)
- RFE is an iterative process that fits a model and removes the least significant features one by one.
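An illustrative R sketch of the selection methods above; the glmnet package is an assumption (it is not named in the assignment), and the built-in mtcars data is used as toy data.

# Variable selection sketch (illustrative; glmnet assumed installed)
full_model <- lm(mpg ~ ., data = mtcars)
step_model <- step(full_model, direction = "both")   # stepwise selection by AIC

library(glmnet)
x <- as.matrix(mtcars[, -1])                 # predictor matrix
y <- mtcars$mpg                              # response
lasso_fit <- cv.glmnet(x, y, alpha = 1)      # lasso: penalty on absolute coefficients
ridge_fit <- cv.glmnet(x, y, alpha = 0)      # ridge: penalty on squared coefficients
enet_fit  <- cv.glmnet(x, y, alpha = 0.5)    # elastic net: mix of both penalties
coef(lasso_fit, s = "lambda.min")            # some coefficients shrunk exactly to zero

pca <- prcomp(x, scale. = TRUE)              # uncorrelated principal components of the predictors
summary(pca)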

8. Summarize estimating the parameters in time series regression models.

a) Definition
i) Estimating parameters in time series regression models involves determining the coefficients that quantify the relationship between the dependent variable and one or more independent variables, while also accounting for temporal dependencies.
b) Purpose
i) The primary purpose of estimating parameters in time series regression models is:
- Understanding relationships
- Forecasting
- Control and optimization
c) Steps in Estimating Parameters
i) Model Selection
- Choose an appropriate model based on the data characteristics:
  ARIMA (AutoRegressive Integrated Moving Average)
  SARIMA (Seasonal ARIMA)
ii) Model Specification
- Define the structure of the model, including the order of the autoregression (p), differencing (d), and moving average (q) components for ARIMA models.
iii) Parameter Estimation
- Use historical data to estimate the parameters. This typically involves:
  Maximum Likelihood Estimation (MLE)
  Least Squares
iv) Model Fitting
- Fit the model to the data to find the parameter values that best explain the observed series. This involves optimizing the chosen estimation criterion.
v) Model Validation
- Validate the model using diagnostic checks:
  Residual analysis
  Goodness-of-fit
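An illustrative sketch of these steps in R, using the built-in AirPassengers series; auto.arima(), checkresiduals(), and accuracy() come from the forecast package used later in this assignment (assumed installed).

# Estimating time series model parameters (illustrative)
library(forecast)
y <- AirPassengers                                     # built-in monthly series

fit_manual <- arima(y, order = c(1, 1, 1),
                    seasonal = list(order = c(0, 1, 1), period = 12),
                    method = "ML")                     # parameters estimated by maximum likelihood
fit_auto <- auto.arima(y)                              # automatic model selection + estimation

checkresiduals(fit_auto)                               # validation: residual diagnostics
accuracy(fit_auto)                                     # goodness-of-fit measures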
9. Summarize the maximum likelihood approach in regression analysis.

a) Definition
i) The Maximum Likelihood (ML) approach is a method of estimating the parameters of a statistical model.
b) Purpose
i) The maximum likelihood approach is used for several reasons:
a) Asymptotic properties
- ML estimators are consistent, asymptotically normal, and efficient.
b) Flexibility / Adaptability
- It can be adapted to various types of data and models, including those with non-normal distributions.
c) Inference
- It facilitates hypothesis testing and confidence-interval construction through the likelihood ratio test, Wald test, and score test.
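A minimal sketch of the idea in R, with made-up data (an assumption for illustration): maximizing the normal log-likelihood numerically with optim() recovers essentially the same coefficients that lm() gives for a simple linear model.

# Maximum likelihood estimation sketch for simple linear regression (made-up data)
set.seed(1)
x <- 1:50
y <- 2 + 0.5 * x + rnorm(50, sd = 2)

neg_log_lik <- function(par) {
  beta0 <- par[1]; beta1 <- par[2]; sigma <- exp(par[3])   # exp() keeps sigma positive
  -sum(dnorm(y, mean = beta0 + beta1 * x, sd = sigma, log = TRUE))
}

mle_fit <- optim(c(mean(y), 0, log(sd(y))), neg_log_lik)   # minimize the negative log-likelihood
mle_fit$par[1:2]                                           # ML estimates of beta0 and beta1
coef(lm(y ~ x))                                            # least squares estimates agree closely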

10. Which commands are used in regression analysis in R?

1) Linear Regression: To perform linear regression you use the lm() function, which fits a linear model.
Example:
# Load necessary library
library(tidyverse)
# Load example dataset
data(mtcars)
# Fit a simple linear regression model (illustrative formula)
model <- lm(mpg ~ wt, data = mtcars)
2) Multiple Linear Regression: Similar to simple linear regression but with multiple predictors.
Example:
# Fit a multiple linear regression model
model_multi <- lm(mpg ~ wt + hp + qsec, data = mtcars)
# Summary of the model
summary(model_multi)
3) Logistic Regression: For binary outcomes, use the glm() function with family = binomial.
Example:
# Treat the binary outcome variable as a factor
mtcars$am <- as.factor(mtcars$am)
# Fit a logistic regression model (illustrative predictors)
logit_model <- glm(am ~ hp + wt, data = mtcars, family = binomial)
# Summary of the model
summary(logit_model)
4) Time Series Analysis: For time series data, you can use the ts() function to create a time series object, and auto.arima() from the forecast package for ARIMA modeling.
Example:
# Load the forecast package
library(forecast)
# Create a time series object
ts_data <- ts(mtcars$mpg, start = c(1976, 1), frequency = 12)
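The example above only creates the time series object; an illustrative continuation showing the auto.arima() call the text refers to (the forecast horizon of 12 is an assumption):

# Fit an ARIMA model to the series (continuation sketch)
arima_fit <- auto.arima(ts_data)
summary(arima_fit)
forecast(arima_fit, h = 12)   # forecast the next 12 periods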
5) Panel Data Models: For panel data, you might use the plm package. It allows for fixed effects or random effects models.
Example:
# Load the plm package
library(plm)
# Load example panel data
data("Grunfeld", package = "plm")
