
Forecasting models for economic and environmental applications


Material Information

Title:
Forecasting models for economic and environmental applications
Physical Description:
Book
Language:
English
Creator:
Shih, Shou Hsing
Publisher:
University of South Florida
Place of Publication:
Tampa, Fla
Publication Date:
2008
Subjects

Subjects / Keywords:
Time Series
Global Warming
Stock
S&P Price Index
Temperature
Dissertations, Academic -- Mathematics -- Doctoral -- USF   ( lcsh )
Genre:
non-fiction   ( marcgt )

Notes

Summary:
ABSTRACT: The object of the present study is to introduce three analytical time series models for the purpose of developing more effective economic and environmental forecasting models, among others. Given a stochastic realization, stationary or nonstationary in nature, one can utilize existing methodology to develop an autoregressive, moving average, or combined model for short- and long-term forecasting. In the present study we analytically modify the stochastic realization utilizing (a) a k-th moving average, (b) a k-th weighted moving average, and (c) a k-th exponential weighted moving average process. We then develop the appropriate forecasting models with the new (modified) time series using the more recent methodologies in the subject matter. Once the proposed statistical forecasting models have been developed, we transform the analytical process back into the original stochastic realization. The proposed methods have been successfully applied to real stock data from a Fortune 500 company. A similar forecasting model was developed and evaluated for the daily closing price of the S&P Price Index of the New York Stock Exchange. The proposed forecasting model was developed along with the statistical model using classical and more recent methods; the effectiveness of the two models was compared using various statistical criteria, and the proposed models gave better results. Atmospheric temperature and carbon dioxide, CO₂, are the two variables most attributable to global warming. Using the proposed methods we have developed forecasting statistical models of both atmospheric temperature and carbon dioxide for the continental United States; these models performed much better than models built with the classical Box-Jenkins type of methodology. Finally, we developed an effective statistical model that relates CO₂ and temperature; that is, knowing the atmospheric temperature at a specific location we can estimate the carbon dioxide, and vice versa.
Thesis:
Dissertation (Ph.D.)--University of South Florida, 2008.
Bibliography:
Includes bibliographical references.
System Details:
Mode of access: World Wide Web.
System Details:
System requirements: World Wide Web browser and PDF reader.
Statement of Responsibility:
by Shou Hsing Shih.
General Note:
Title from PDF of title page.
General Note:
Document formatted into pages; contains 142 pages.

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
aleph - 001994141
oclc - 317402311
usfldc doi - E14-SFE0002425
usfldc handle - e14.2425
System ID:
SFS0026743:00001




Full Text

Forecasting Models for Economic and Environmental Applications

by

Shou Hsing Shih

A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy, Department of Mathematics and Statistics, College of Arts and Sciences, University of South Florida

Major Professor: Chris P. Tsokos, Ph.D.
Gangaram Ladde, Ph.D.
Kandethody Ramachandran, Ph.D.
Wonkuk Kim, Ph.D.
Marcus McWaters, Ph.D.

Date of Approval: April 3, 2008

Keywords: Time Series, Global Warming, Stock, S&P Price Index, Temperature, Carbon Dioxide

Copyright 2008, Shou Hsing Shih

Acknowledgement

I would like to express deep appreciation to my major professor, Dr. Chris P. Tsokos, for his patient advice, generous support, friendship, and the countless hours of personal time he has spent assisting me throughout my graduate career. His guidance fostered an environment that helped me complete this work diligently and with greater ease than would otherwise have been possible. I would like to thank Professor Ladde for his assistance and advisement on the theoretical aspects of this work. I would also like to express my appreciation to Professors Ramachandran, McWaters, and Kim for their support and guidance in completing this work.

Table of Contents

List of Tables   iii
List of Figures   v
Abstract   viii

Chapter 1  Review and Basic Concepts   1
  1.0 Introduction   1
  1.1 Literature Review   1
  1.2 Fundamental Concepts   4
    1.2.1 Stochastic Process   5
    1.2.2 White Noise Process   7
    1.2.3 Estimations on a Single Realization   8
  1.3 The Stationary AR, MA, and ARMA Models   9
    1.3.1 The Autoregressive (AR) Models   10
    1.3.2 The Moving Average (MA) Models   10
    1.3.3 The General Mixed ARMA Models   11
  1.4 Aims of Present Study   12
  1.5 Conclusion   13

Chapter 2  Analytical Formulations of Modified Time Series   14
  2.0 Introduction   14
  2.1 The General ARIMA Model   14
  2.2 The k-th Moving Average Time Series Model   18
  2.3 The k-th Weighted Moving Average Time Series Model   20
  2.4 The k-th Exponential Moving Average Time Series Model   22
  2.5 The Multiplicative ARIMA Model   24
  2.6 Conclusion   26

Chapter 3  Proposed Econometrics Forecasting Models   28
  3.0 Introduction   28
  3.1 Forecasting Models on Stock XYZ   28
    3.1.1 Data Preparation   29
    3.1.2 The General ARIMA Model   31
    3.1.3 The k-th Moving Average Time Series Model   35
    3.1.4 The k-th Weighted Moving Average Time Series Model   39
    3.1.5 The k-th Exponential Moving Average Time Series Model   44
  3.2 Forecasting Models on S&P Price Index   48
    3.2.1 Data Preparation   49
    3.2.2 The General ARIMA Model   49
    3.2.3 The k-th Moving Average Time Series Model   53
    3.2.4 The k-th Weighted Moving Average Time Series Model   57
    3.2.5 The k-th Exponential Moving Average Time Series Model   62
  3.3 Evaluations on Proposed Models vs. Classical Method   66
  3.4 Conclusion   67

Chapter 4  Global Warming: Atmospheric Temperature Forecasting Model   68
  4.0 Introduction   68
  4.1 Analytical Procedure   70
  4.2 Development of Forecasting Models   72
  4.3 Comparison   73
  4.4 Conclusion   79

Chapter 5  Global Warming: Carbon Dioxide Proposed Forecasting Model   80
  5.0 Introduction   80
  5.1 Carbon Dioxide Emission Modeling   80
  5.2 Atmospheric Carbon Dioxide Modeling   85
  5.3 Conclusion   91

Chapter 6  Global Warming: Temperature & Carbon Dioxide Prediction Modeling   92
  6.0 Introduction   92
  6.1 Relationship between Carbon Dioxide & Temperature   92
  6.2 Carbon Dioxide & Temperature Modeling-1   98
  6.3 Carbon Dioxide & Temperature Modeling-2   100
  6.4 Temperature Model   102
  6.5 Conclusion   104

Chapter 7  Future Research   106

References   107

Appendices   114
  Appendix A1: Daily Closing Price of Stock XYZ   115
  Appendix A2: MA3 Series on Daily Closing Price of Stock XYZ   117
  Appendix A3: WMA3 Series on Daily Closing Price of Stock XYZ   119
  Appendix A4: EWMA3 Series on Daily Closing Price of Stock XYZ   121
  Appendix B1: Daily Closing Price of S&P Price Index   123
  Appendix B2: MA3 Series on Daily Closing Price of S&P Price Index   125
  Appendix B3: WMA3 Series on Daily Closing Price of S&P Price Index   127
  Appendix B4: EWMA3 Series on Daily Closing Price of S&P Price Index   129
  Appendix C1: Monthly Temperature for 1895-2007 (Version 1 Dataset)   131
  Appendix C2: Monthly Temperature for 1895-2007 (Version 2 Dataset)   135
  Appendix D: Monthly CO2 Emission 1981-2003   139
  Appendix E: Monthly CO2 in the Atmosphere 1965-2004   141

List of Tables

Table 3.1  Basic Evaluation Statistics for Classical ARIMA   33
Table 3.2  Actual and Predicted Price for Classical ARIMA   34
Table 3.3  Basic Evaluation Statistics for MA3 ARIMA   38
Table 3.4  Actual and Predicted Price for MA3 ARIMA   38
Table 3.5  Basic Evaluation Statistics for WMA3 ARIMA   42
Table 3.6  Actual and Predicted Price for WMA3 ARIMA   43
Table 3.7  Basic Evaluation Statistics for EWMA3 ARIMA   46
Table 3.8  Actual and Predicted Price for EWMA3 ARIMA   47
Table 3.9  Basic Evaluation Statistics for Classical ARIMA   51
Table 3.10  Actual and Predicted Price for Classical ARIMA   52
Table 3.11  Basic Evaluation Statistics for MA3 ARIMA   56
Table 3.12  Actual and Predicted Price for MA3 ARIMA   56
Table 3.13  Basic Evaluation Statistics for WMA3 ARIMA   60
Table 3.14  Actual and Predicted Price for WMA3 ARIMA   61
Table 3.15  Basic Evaluation Statistics for EWMA3 ARIMA   64
Table 3.16  Actual and Predicted Price for EWMA3 ARIMA   65
Table 3.17  Ranking Comparison on Stock XYZ   66
Table 3.18  Ranking Comparison on S&P Price Index   67
Table 4.1  Basic Evaluation Statistics (Version 1 Dataset)   76
Table 4.2  Basic Evaluation Statistics (Version 2 Dataset)   76
Table 4.3  Original vs. Forecast Values (Version 1 Dataset)   77
Table 4.4  Original vs. Forecast Values (Version 2 Dataset)   78
Table 5.1  Basic Evaluation on CO2 Emissions Model   83
Table 5.2  Original vs. Forecasting Values on CO2 Emissions Model   84
Table 5.3  Basic Evaluation on Atmospheric CO2 Model   89
Table 5.4  Original vs. Forecasting Values on Atmospheric CO2 Model   90
Table 6.1  Comparison between Two Series   96
Table 6.2  Residual Analysis for Combined Model-1   99
Table 6.3  Comparison between Actual and Forecast for Model-1   100
Table 6.4  Residual Analysis for Combined Model-2   101
Table 6.5  Comparison between Actual and Forecast for Model-2   102
Table 6.6  Basic Evaluation Statistics for the Temperature Model-2   104

List of Figures

Figure 3.1  Daily Closing Price for Stock XYZ   29
Figure 3.2  Comparisons on Classical ARIMA Model vs. Original Time Series   32
Figure 3.3  Time Series Plot of the Residuals for Classical Model   33
Figure 3.4  Classical ARIMA Forecasting on the Last 25 Observations   35
Figure 3.5  MA3 Series vs. the Original Time Series   35
Figure 3.6  Comparisons on MA3 ARIMA Model vs. Original Time Series   36
Figure 3.7  Time Series Plot of the Residuals for MA3 Model   37
Figure 3.8  MA3 ARIMA Forecasting on the Last 25 Observations   39
Figure 3.9  WMA3 Series vs. the Original Time Series   40
Figure 3.10  Comparisons on WMA3 ARIMA Model vs. Original Time Series   41
Figure 3.11  Time Series Plot of the Residuals for WMA3 Model   42
Figure 3.12  WMA3 ARIMA Forecasting on the Last 25 Observations   44
Figure 3.13  EWMA3 Series vs. the Original Time Series   44
Figure 3.14  Comparisons on EWMA3 ARIMA Model vs. Original Time Series   45
Figure 3.15  Time Series Plot of the Residuals for EWMA3 Model   46
Figure 3.16  EWMA3 ARIMA Forecasting on the Last 25 Observations   48
Figure 3.17  Daily Closing Price for S&P Price Index   48
Figure 3.18  Comparisons on Classical ARIMA Model vs. Original Time Series   50
Figure 3.19  Time Series Plot of the Residuals for Classical Model   51
Figure 3.20  Classical ARIMA Forecasting on the Last 25 Observations   53
Figure 3.21  MA3 Series vs. the Original Time Series   53
Figure 3.22  Comparisons on MA3 ARIMA Model vs. Original Time Series   54
Figure 3.23  Time Series Plot of the Residuals for MA3 Model   55
Figure 3.24  MA3 ARIMA Forecasting on the Last 25 Observations   57
Figure 3.25  WMA3 Series vs. the Original Time Series   58
Figure 3.26  Comparisons on WMA3 ARIMA Model vs. Original Time Series   59
Figure 3.27  Time Series Plot of the Residuals for WMA3 Model   60
Figure 3.28  WMA3 ARIMA Forecasting on the Last 25 Observations   62
Figure 3.29  EWMA3 Series vs. the Original Time Series   62
Figure 3.30  Comparisons on EWMA3 ARIMA Model vs. Original Time Series   63
Figure 3.31  Time Series Plot of the Residuals for EWMA3 Model   64
Figure 3.32  EWMA3 ARIMA Forecasting on the Last 25 Observations   66
Figure 4.1  Time Series Plot for Monthly Temperature for 1895-2007 (Version 1)   69
Figure 4.2  Time Series Plot for Monthly Temperature for 1895-2007 (Version 2)   70
Figure 4.3  Actual vs. Predicted Values for Version 1 Dataset   74
Figure 4.4  Actual vs. Predicted Values for Version 2 Dataset   74
Figure 4.5  Residual Plot for Monthly Temperature (Version 1 Dataset)   75
Figure 4.6  Residual Plot for Monthly Temperature (Version 2 Dataset)   75
Figure 4.7  Actual vs. Predicted Values for the Last 12 Observations (Version 1)   78
Figure 4.8  Actual vs. Predicted Values for the Last 12 Observations (Version 2)   79
Figure 5.1  Time Series Plot on CO2 Emission 1981-2003   81
Figure 5.2  Actual vs. Predicted Values for CO2 Emission 1981-2003   82
Figure 5.3  Residuals Plot for CO2 Emissions   83
Figure 5.4  Monthly CO2 Emission vs. Forecast Values for the Last 12 Observations   85
Figure 5.5  Geographical Location of Mauna Loa   86
Figure 5.6  Time Series Plot for Monthly CO2 in the Atmosphere 1965-2004   87
Figure 5.7  Actual vs. Predicted Values for Atmospheric CO2 1965-2004   88
Figure 5.8  Residuals Plot for Atmospheric CO2   89
Figure 5.9  Actual vs. Predicted Values for CO2 in the Atmosphere   90
Figure 6.1  Monthly Atmospheric Carbon Dioxide from 1965 to 2004   93
Figure 6.2  Monthly Temperature of the Continental United States from 1965 to 2004   93
Figure 6.3  First Order Differencing Monthly Atmospheric CO2 Series   95
Figure 6.4  The Temperature Series vs. the Differenced CO2 Series   95
Figure 6.5  Comparison between Both Series after Transformation   97
Figure 6.6  Cross Correlation at Different Lags   98

Forecasting Models for Economic and Environmental Applications

Shou Hsing Shih

ABSTRACT

The object of the present study is to introduce three analytical time series models for the purpose of developing more effective economic and environmental forecasting models, among others. Given a stochastic realization, stationary or nonstationary in nature, one can utilize existing methodology to develop an autoregressive, moving average, or combined model for short- and long-term forecasting. In the present study we analytically modify the stochastic realization utilizing (a) a k-th moving average, (b) a k-th weighted moving average, and (c) a k-th exponential weighted moving average process. We then develop the appropriate forecasting models with the new (modified) time series using the more recent methodologies in the subject matter. Once the proposed statistical forecasting models have been developed, we transform the analytical process back into the original stochastic realization.

The proposed methods have been successfully applied to real stock data from a Fortune 500 company. A similar forecasting model was developed and evaluated for the daily closing price of the S&P Price Index of the New York Stock Exchange. The proposed forecasting model was developed along with the statistical model using classical and more recent methods; the effectiveness of the two models was compared using various statistical criteria, and the proposed models gave better results.

Atmospheric temperature and carbon dioxide, CO₂, are the two variables most attributable to global warming. Using the proposed methods we have developed forecasting statistical models of both atmospheric temperature and carbon dioxide for the continental United States. These forecasting models performed much better than models built with the classical Box-Jenkins type of methodology. Finally, we developed an effective statistical model that relates CO₂ and temperature; that is, knowing the atmospheric temperature at a specific location we can estimate the carbon dioxide, and vice versa.

Chapter 1
Literature Review and Fundamental Concepts

1.0 Introduction

Time series analysis is one of the major areas of statistics that can be applied to many realistic problems. In the present chapter, we begin by summarizing the development of time series modeling and introducing some methodologies that have been developed recently. We then introduce some fundamental concepts that are essential for dealing with time series models. Most real-world time series are nonstationary in nature. Thus, before we deal with those nonstationary phenomena in later chapters, we first define the stochastic process and the white noise process, then survey some of the most popular stationary time series forecasting methodologies, and finally give a brief outline of the basic structure of each model.

1.1 Literature Review

During the preparation of the present study we surveyed the literature in the area of time series forecasting. Although the present study is concerned mainly with economic and environmental applications, our literature survey extended to forecasting in general.

In 1960, Muth presented a method for forecasting by an exponentially weighted moving average. Muth's study included the simple case, seasonal effects, and seasonal effects with a linear trend. Actual sales data were used in the study to examine the proposed modeling scheme. D. W. Trigg initiated a study of automatically monitoring a forecasting process to ensure that the forecast remains in control; Trigg utilized a first-order exponential model with data that contained jumps.

A study of short-term sales forecasting was carried out by Harrison in 1965. The study examined several methods for short-term sales prediction, including the method of Box and Jenkins in 1970 and that of Brown in 1965. It is argued that multi-parameter procedures for short-term forecasting of sales do not give significantly better results. Harrison recommends the one-parameter exponential procedure for nonseasonal sales forecasting, and some methods are recommended for short-term sales processes with seasonal effects.

A two-stage model combining exponential smoothing and multiple regression was introduced by Crane and Eeatly in 1967. They first applied the exponential process forecasts, along with other independent variables, to develop a multiple regression model. This combined forecasting procedure was applied in modeling some economic data series related to bank deposits and produced good forecasting results.

In 1971, Tsokos studied some forecasting models for short-term forecasting of economic data. He studied the autoregressive, moving average, and mixed autoregressive moving average processes as short-term forecasting models for commodity contracts. Several models were developed for soybean, silver, and the New York Times daily averages of fifty combined stocks for the first 100 business days. Their short-term forecasts were evaluated using the minimum residual variance criterion.

The investigation of outliers in time series was the aim of Fox in 1972. Two types of outliers were considered: a gross error in the sample of observations and a single extreme innovation. The second type of outlier affects not only the present observation but also subsequent observations. A likelihood ratio criterion was developed to study the problem of the two types of outliers.

In 1994, Box and Jenkins introduced the multiplicative ARIMA model on airline data. This type of model addresses the issue of seasonal variation that is not identifiable by the classical ARIMA model. Thus, once we identify the period of a time series, we can use the multiplicative ARIMA model to forecast the series much better than with the classical method. We investigate this model further in Chapters 4 and 5.

In 1982, Engle introduced the autoregressive conditional heteroskedasticity (ARCH) model, which considers the variance of the current term to be a function of the variances of the error terms in previous time periods; ARCH relates the error variance to the square of a previous period's error. In 1986, the generalized autoregressive conditional heteroskedasticity (GARCH) model was first proposed by Bollerslev, and it is commonly employed in modeling financial time series that exhibit time-varying volatility clustering. For further discussion of GARCH modeling, see (Enders 1995, Engle 2001, Gujarati 2003, and Nelson 1991).

The state space representation of a system is related to the Kalman filter and was originally developed by the control engineers Kalman 1960, Kalman and Bucy 1961, and Kalman, Falb, and Arbib 1969. The system is also known as a state space model and is defined to be the minimum set of information from the present and past such that the future behavior of the system can be completely described by the knowledge of the present state and future input. For further discussion and development of the state space model, see (Chen 1999, Khalil, Nise 2004, Hinrichsen & Pritchard 2005, Sontag 1999, and Durbin & Koopman 2001).

1.2 Fundamental Concepts

A time series can be thought of as comprising a sequence of measurements, almost certainly intercorrelated, representing some phenomenon. Each measurement is associated with a moment of time, with some measurements incorporating other parameters. A time series is classified as continuous or discrete depending upon whether the sequence is continuous or discrete. In the present study, we are concerned only with finite discrete time series measured at equidistant time intervals, or with continuous time series that have been digitized to a finite discrete time series.

1.2.1 Stochastic Process

A stochastic process is a family of time-indexed random variables X(ω, t), where ω belongs to a sample space and t belongs to an index set. For a fixed t, X(ω, t) is a random variable. For a given ω, X(ω, t), as a function of t, is called a sample function or realization. The population that consists of all possible realizations is called the ensemble in stochastic processes and time series analysis; thus, a time series is a realization or sample function from a certain stochastic process {X(ω, t) : t = 0, ±1, ±2, ...}.

Consider a finite set of random variables {X_{t_1}, X_{t_2}, ..., X_{t_n}} from a stochastic process. The n-dimensional distribution function is defined by

$F_{X_{t_1},\ldots,X_{t_n}}(x_1,\ldots,x_n) = P\{\omega : X_{t_1} \le x_1, \ldots, X_{t_n} \le x_n\}$   (1.2.1)

where x_i, i = 1, ..., n, are any real numbers. A process is said to be first-order stationary if its one-dimensional distribution function is time invariant, that is, if $F_{X_{t_1}}(x_1) = F_{X_{t_1+k}}(x_1)$ for any integers t_1, k, and t_1 + k; second-order stationary if $F_{X_{t_1},X_{t_2}}(x_1,x_2) = F_{X_{t_1+k},X_{t_2+k}}(x_1,x_2)$ for any integers t_1, t_2, k, t_1 + k, and t_2 + k; and nth-order stationary if

$F_{X_{t_1},\ldots,X_{t_n}}(x_1,\ldots,x_n) = F_{X_{t_1+k},\ldots,X_{t_n+k}}(x_1,\ldots,x_n)$   (1.2.2)

for any n-tuple (t_1, ..., t_n) and integer k. A process is said to be strictly stationary if (1.2.2) is true for any n = 1, 2, .... The terms strongly stationary and completely stationary are also used to denote a strictly stationary process. If (1.2.2) is true for some n = m, it is also true for n < m, because the mth-order distribution function determines all distribution functions of lower order; therefore, a higher order of stationarity always implies a lower order of stationarity.

For a real-valued process {X_t : t = 0, ±1, ±2, ...} we define the mean function of the process

$\mu_t = E(X_t)$   (1.2.3)

the variance function of the process

$\sigma_t^2 = E[(X_t - \mu_t)^2]$   (1.2.4)

the covariance function

$\gamma(t_1, t_2) = E[(X_{t_1} - \mu_{t_1})(X_{t_2} - \mu_{t_2})]$   (1.2.5)

and the correlation function

$\rho(t_1, t_2) = \frac{\gamma(t_1, t_2)}{\sqrt{\sigma_{t_1}^2\, \sigma_{t_2}^2}}$   (1.2.6)

Since the distribution function of a strictly stationary process is the same for all t, the mean function μ_t = μ is a constant, provided E(|X_t|) < ∞; likewise σ_t² = σ² for all t, provided E(X_t²) < ∞. In addition, since $F_{X_{t_1},X_{t_2}}(x_1,x_2) = F_{X_{t_1+k},X_{t_2+k}}(x_1,x_2)$ for any integers t_1, t_2, and k, we have γ(t_1, t_2) = γ(t_1+k, t_2+k) and ρ(t_1, t_2) = ρ(t_1+k, t_2+k). Letting t_1 = t − k and t_2 = t, we get

$\gamma(t_1, t_2) = \gamma(t-k, t) = \gamma(t, t+k) = \gamma_k$   (1.2.7)

and

$\rho(t_1, t_2) = \rho(t-k, t) = \rho(t, t+k) = \rho_k$   (1.2.8)

Thus, for a strictly stationary process with its first two moments finite, the covariance and correlation between X_t and X_{t+k} depend only on the time difference k.

A process is said to be nth-order weakly stationary if all of its joint moments up to the nth order exist and are time invariant; hence a second-order weakly stationary process has constant mean and variance, and covariance and correlation functions that depend only on the time difference. Moreover, a strictly stationary process with finite first two moments is also second-order weakly stationary, or covariance stationary. A strictly stationary process, however, does not always have finite moments, and when this occurs it need not be covariance stationary; it is therefore possible for a strictly stationary process, with no joint moments existing, not to be weakly stationary of any order.

1.2.2 White Noise Process

A process {W_t} is said to be a white noise process if it is a sequence of uncorrelated random variables from a fixed distribution with constant mean E(W_t) = μ_W, constant variance Var(W_t) = σ_W², and Cov(W_t, W_{t+k}) = 0 for all k ≠ 0. It follows that a white noise process {W_t} is stationary, with autocovariance function γ_k = σ_W² for k = 0 and γ_k = 0 for k ≠ 0, autocorrelation function ρ_k = 1 for k = 0 and ρ_k = 0 for k ≠ 0, and partial autocorrelation function φ_kk = 1 for k = 0 and φ_kk = 0 for k ≠ 0. The basic phenomenon of white noise is that both its autocorrelation function (ACF) and partial autocorrelation function (PACF) are equal to zero for k ≠ 0. In addition, a white noise process is Gaussian if its joint distribution is normal. This concept plays an important role in constructing the time series models of later chapters.

1.2.3 Estimations on a Single Realization

A stationary time series is characterized by its mean μ, variance σ², autocorrelation ρ_k, and partial autocorrelation φ_kk. We could calculate the exact values of these parameters if the ensemble of all possible realizations were available; however, for most time series only a single realization is available, which makes the ensemble average impossible to compute. In the following discussion we consider estimators based on time averages, which are useful when only a single realization is at hand.

The mean μ can be estimated by the sample mean

$\bar{x} = \frac{1}{n} \sum_{t=1}^{n} x_t$   (1.2.9)

which is the time average of the n observations. To see whether this is a valid estimator, we take the expectation of x̄ and get

$E(\bar{x}) = \frac{1}{n} \sum_{t=1}^{n} E(x_t) = \mu$   (1.2.10)

which indicates that x̄ is an unbiased estimator of μ. Since

$\mathrm{Var}(\bar{x}) = \frac{1}{n^2} \sum_{t=1}^{n} \sum_{s=1}^{n} \mathrm{Cov}(x_t, x_s) = \frac{\sigma^2}{n^2} \sum_{t=1}^{n} \sum_{s=1}^{n} \rho_{s-t} = \frac{\sigma^2}{n} \sum_{k=-(n-1)}^{n-1} \left(1 - \frac{|k|}{n}\right) \rho_k$   (1.2.11)

choosing k = s − t, we know that if $\lim_{n\to\infty} \sum_{k=-(n-1)}^{n-1} \left(1 - |k|/n\right) \rho_k$ is finite, then Var(x̄) → 0 as n → ∞ and x̄ is a consistent estimator of μ; that is,

$\mu = \lim_{n\to\infty} \frac{1}{n} \sum_{t=1}^{n} x_t$   (1.2.12)

Similarly, the variance σ² can be estimated by the sample variance, using the time average

$\hat{\sigma}^2 = \frac{1}{n} \sum_{t=1}^{n} (x_t - \bar{x})^2$   (1.2.13)

the sample autocovariance function is given by

$\hat{\gamma}_k = \frac{1}{n} \sum_{t=1}^{n-k} (x_t - \bar{x})(x_{t+k} - \bar{x})$   (1.2.14)

the sample ACF ρ̂_k can be estimated by

$\hat{\rho}_k = \frac{\hat{\gamma}_k}{\hat{\gamma}_0} = \frac{\sum_{t=1}^{n-k} (x_t - \bar{x})(x_{t+k} - \bar{x})}{\sum_{t=1}^{n} (x_t - \bar{x})^2}, \quad k = 0, 1, 2, \ldots$   (1.2.15)

and the sample PACF φ̂_kk can be estimated by the recursive method published by Durbin (1960), starting with φ̂_{11} = ρ̂_1:

$\hat{\phi}_{k+1,k+1} = \frac{\hat{\rho}_{k+1} - \sum_{j=1}^{k} \hat{\phi}_{kj}\, \hat{\rho}_{k+1-j}}{1 - \sum_{j=1}^{k} \hat{\phi}_{kj}\, \hat{\rho}_j}, \qquad \hat{\phi}_{k+1,j} = \hat{\phi}_{kj} - \hat{\phi}_{k+1,k+1}\, \hat{\phi}_{k,k+1-j}$   (1.2.16)
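The estimators (1.2.9)-(1.2.16) are straightforward to compute from a single realization. The short sketch below is an illustration only, not code from the present study; the helper names are ours, and the recursion follows the standard Durbin-type form of (1.2.16):

    import numpy as np

    def sample_acf(x, max_lag):
        """Sample ACF of (1.2.15): rho_hat_k for k = 0, 1, ..., max_lag."""
        x = np.asarray(x, dtype=float)
        xbar = x.mean()
        denom = ((x - xbar) ** 2).sum()
        return np.array([((x[:len(x) - k] - xbar) * (x[k:] - xbar)).sum() / denom
                         for k in range(max_lag + 1)])

    def sample_pacf(rho):
        """Sample PACF via the recursion (1.2.16), starting from phi_11 = rho_1."""
        K = len(rho) - 1
        pacf = [1.0, rho[1]]
        phi = {1: {1: rho[1]}}
        for k in range(1, K):
            num = rho[k + 1] - sum(phi[k][j] * rho[k + 1 - j] for j in range(1, k + 1))
            den = 1.0 - sum(phi[k][j] * rho[j] for j in range(1, k + 1))
            phi_next = num / den
            phi[k + 1] = {j: phi[k][j] - phi_next * phi[k][k + 1 - j]
                          for j in range(1, k + 1)}
            phi[k + 1][k + 1] = phi_next
            pacf.append(phi_next)
        return np.array(pacf)

For a white noise realization, both sequences should be close to zero at every nonzero lag, in agreement with Section 1.2.2.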


1.3 The Stationary AR, MA, and ARMA Models

In time series analysis there are two very useful representations for expressing a time series process, namely the autoregressive process and the moving average process. Before discussing these models, it is important to define some notation for simplifying purposes. The backshift operator is defined as

$B^j x_t = x_{t-j}$   (1.3.1)

and we let

$\phi_p(B) = 1 - \phi_1 B - \phi_2 B^2 - \cdots - \phi_p B^p$   (1.3.2)

and

$\theta_q(B) = 1 - \theta_1 B - \theta_2 B^2 - \cdots - \theta_q B^q$   (1.3.3)

With (1.3.2) and (1.3.3) we can write the models that follow in compact form.

1.3.1 The Autoregressive (AR) Models

The autoregressive process of order p, denoted AR(p), was first introduced by Yule in 1926. It is useful for describing situations in which the present value of a time series is explained by its past values plus a random shock. The pth-order autoregressive process is defined as

$x_t = \phi_1 x_{t-1} + \phi_2 x_{t-2} + \cdots + \phi_p x_{t-p} + \varepsilon_t$   (1.3.4)

or

$\phi_p(B)\, x_t = \varepsilon_t$   (1.3.5)

where {ε_t} is a zero-mean white noise process. The AR(p) process contains p + 1 parameters, namely φ_1, φ_2, ..., φ_p and σ_ε². A finite autoregressive process is always invertible; to be stationary, the roots of φ_p(B) = 0 must lie outside the unit circle.

1.3.2 The Moving Average (MA) Models

The moving average process of order q, denoted MA(q), was first presented by Slutzky in 1927. It is useful for describing phenomena that produce an immediate effect lasting only a short period of time. The qth-order moving average process is defined as

$x_t = \varepsilon_t - \theta_1 \varepsilon_{t-1} - \theta_2 \varepsilon_{t-2} - \cdots - \theta_q \varepsilon_{t-q}$   (1.3.6)

or

$x_t = \theta_q(B)\, \varepsilon_t$   (1.3.7)

The MA(q) process contains q + 1 parameters, namely θ_1, θ_2, ..., θ_q and σ_ε². A finite moving average process is always stationary, since 1 + θ_1² + θ_2² + ... + θ_q² < ∞; the process is invertible if the roots of θ_q(B) = 0 lie outside the unit circle.

1.3.3 The General Mixed ARMA Models

The mixed autoregressive moving average process, denoted ARMA(p, q), is produced by combining the autoregressive and moving average processes. The process was put together by Wold in 1938, and it is defined as

$x_t = \phi_1 x_{t-1} + \cdots + \phi_p x_{t-p} + \varepsilon_t - \theta_1 \varepsilon_{t-1} - \cdots - \theta_q \varepsilon_{t-q}$   (1.3.8)

or

$\phi_p(B)\, x_t = \theta_q(B)\, \varepsilon_t$   (1.3.9)

where φ_p(B) and θ_q(B) are as defined in (1.3.2) and (1.3.3). The general mixed ARMA(p, q) process has p + q + 1 parameters, namely φ_1, ..., φ_p, θ_1, ..., θ_q, and σ_ε². The process is invertible if the roots of θ_q(B) = 0 lie outside the unit circle, and it is stationary if the roots of φ_p(B) = 0 lie outside the unit circle, under the assumption that φ_p(B) = 0 and θ_q(B) = 0 share no common roots.
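The stationarity and invertibility conditions of Sections 1.3.1-1.3.3 are easy to verify numerically by examining the roots of the operator polynomials. The sketch below is a hedged illustration (the coefficient values are hypothetical and the helper name is ours), not part of the dissertation:

    import numpy as np

    def roots_outside_unit_circle(coefs):
        """coefs = [1, -c1, ..., -cm] for the polynomial 1 - c1*B - ... - cm*B^m."""
        roots = np.polynomial.polynomial.polyroots(coefs)
        return np.all(np.abs(roots) > 1.0)

    # Hypothetical ARMA(2,1): x_t = 1.3 x_{t-1} - 0.4 x_{t-2} + e_t - 0.5 e_{t-1}
    phi = [1.3, -0.4]                      # AR coefficients phi_1, phi_2
    theta = [0.5]                          # MA coefficient theta_1

    ar_poly = [1.0] + [-c for c in phi]    # phi_p(B) = 1 - phi_1 B - phi_2 B^2
    ma_poly = [1.0] + [-c for c in theta]  # theta_q(B) = 1 - theta_1 B

    print("stationary:", roots_outside_unit_circle(ar_poly))
    print("invertible:", roots_outside_unit_circle(ma_poly))

For the hypothetical coefficients above, the AR polynomial has roots 1.25 and 2.0, both outside the unit circle, so the process is stationary as well as invertible.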


1.4 Aims of Present Study

The aim of the present study is to propose several new time series methodologies that can be applied to many economic and environmental problems. We begin by summarizing some fundamental concepts that are essential for time series modeling and by introducing some preliminary processes that are necessary for model building. After defining the necessities for stationary time series, we introduce the classical nonstationary time series modeling; the Box-Jenkins methodology is one of the most widely used time series approaches among researchers working in many different areas. The idea of our proposed models is to filter a nonstationary time series and smooth the edges of that series so that we can better forecast the phenomenon. The smoothing components that we implement into the time series are removed once we have obtained the model. Because of the complexity of the developmental processes, we summarize the methodologies for both the classical approach and our proposed models by providing step-by-step procedures in Chapter 2. In Chapters 3, 4, 5, and 6 we introduce several economic and environmental applications, developing time series forecasting models using both the classical approach and our proposed methodologies. We show the quality and usefulness of the proposed models in those applications by comparing them with the classical methods. In each application we examine the proposed models by looking numerically into some essential statistical properties and identifying their usefulness under different circumstances. Finally, we evaluate the proposed models by ranking their efficiency according to their performance in forecasting the future phenomena.

1.5 Conclusion

In the present chapter, we began by surveying the major literature in the area of time series forecasting and providing references for the development of the methodologies in the subject area. We then defined some fundamental concepts such as the stochastic process, the white noise process, and the stationary ARMA process; these processes are not only essential for our proposed model-building procedures but also play a major role in time series forecasting. Finally, we briefly discussed the approach of our proposed methodologies, whose usefulness and effectiveness are demonstrated with the applications in the present study.

Chapter 2
Analytical Formulations of Modified Time Series

2.0 Introduction

In the present chapter, we summarize an updated step-by-step process for developing a forecasting model from a nonstationary stochastic realization. In addition, we introduce three additional models, namely the simple moving average, the weighted moving average, and the exponential moving average. The basic structure of these models begins with the actual nonstationary realization, from which we formulate a new time series based on the models mentioned above.

2.1 The General ARIMA Model

The time series processes we have discussed so far are all stationary processes, but most time series in reality are nonstationary, and the roots of their AR polynomials do not all lie outside the unit circle. Hence, we cannot apply the general mixed ARMA(p, q) of the previous chapter directly to a nonstationary time series. To resolve this issue, we first introduce the difference filter

$\nabla^d = (1 - B)^d$   (2.1.1)

where $B^j x_t = x_{t-j}$ and d is the degree of differencing of the series.

Because the local behavior of such a homogeneous nonstationary time series is independent of its level, we let ϕ(B) be the autoregressive operator describing this behavior; then

$\varphi(B)(x_t + C) = \varphi(B)\, x_t$   (2.1.2)

for any constant C. This equation implies that ϕ(B) must be of the form

$\varphi(B) = \phi(B)(1 - B)^d$   (2.1.3)

for some d > 0, where φ(B) is a stationary autoregressive operator. Hence, we can reduce a nonstationary time series to a stationary time series by taking a proper degree of differencing. Having defined the difference filter, we can now introduce the autoregressive integrated moving average model, ARIMA(p, d, q):

$\phi_p(B)(1 - B)^d x_t = \theta_q(B)\, \varepsilon_t$   (2.1.4)

where d is the degree of differencing, $\phi_p(B) = 1 - \phi_1 B - \cdots - \phi_p B^p$, and $\theta_q(B) = 1 - \theta_1 B - \cdots - \theta_q B^q$.

Consider the simplest case, ARIMA(0,1,1):

$(1 - B)\, x_t = (1 - \theta_1 B)\, \varepsilon_t$

or, expanding the model,

$x_t = x_{t-1} + \varepsilon_t - \theta_1 \varepsilon_{t-1}$

which is an MA(1) model on the first difference of the nonstationary time series. Consider another simple case, ARIMA(1,1,0):

$(1 - \phi_1 B)(1 - B)\, x_t = \varepsilon_t$

or, expanding the model,

$x_t = (1 + \phi_1)\, x_{t-1} - \phi_1 x_{t-2} + \varepsilon_t$

which is an AR(1) model on the first difference of the nonstationary time series.

In time series analysis it is sometimes very difficult to select the best order of the ARIMA(p, d, q) model when several models all adequately represent a given time series. Hence, Akaike's information criterion (AIC), [1], plays a major role in model selection. The AIC was first introduced by Akaike in 1973 and is defined as

$\mathrm{AIC}(M) = -2 \ln[\text{maximum likelihood}] + 2M$   (2.1.5)

where M is the number of parameters in the model. The unconditional log-likelihood function suggested by Box, Jenkins, and Reinsel in 1994, [4], is given by

$\ln L(\phi, \mu, \theta, \sigma_\varepsilon^2) = -\frac{n}{2} \ln 2\pi\sigma_\varepsilon^2 - \frac{S(\phi, \mu, \theta)}{2\sigma_\varepsilon^2}$   (2.1.6)

where S(φ, μ, θ) is the unconditional sum-of-squares function

$S(\phi, \mu, \theta) = \sum_{t=-\infty}^{n} \left[E(\varepsilon_t \mid \phi, \mu, \theta, x)\right]^2$   (2.1.7)

and E(ε_t | φ, μ, θ, x) is the conditional expectation of ε_t given φ, μ, θ, and x. The quantities φ̂, μ̂, and θ̂ that maximize (2.1.6) are called unconditional maximum likelihood estimators. Since ln L(φ, μ, θ, σ_ε²) involves the data only through S(φ, μ, θ), these unconditional maximum likelihood estimators are equivalent to the unconditional least squares estimators obtained by minimizing S(φ, μ, θ). In practice, the summation in (2.1.7) is approximated by the finite form

$S(\phi, \mu, \theta) \cong \sum_{t=-M}^{n} \left[E(\varepsilon_t \mid \phi, \mu, \theta, x)\right]^2$   (2.1.8)

where M is a sufficiently large integer such that the backcast increment |E(ε_t | φ, μ, θ, x) − E(ε_{t−1} | φ, μ, θ, x)| is less than any arbitrary predetermined small value for t ≤ −(M + 1); the expectations E(ε_t | φ, μ, θ, x) are then negligible for t ≤ −(M + 1). After obtaining the parameter estimates φ̂, μ̂, and θ̂, the estimate of σ_ε² can be calculated from

$\hat{\sigma}_\varepsilon^2 = \frac{S(\hat{\phi}, \hat{\mu}, \hat{\theta})}{n}$   (2.1.9)

For an ARMA(p, q) model based on n observations, the log-likelihood function is

$\ln L = -\frac{n}{2} \ln 2\pi\sigma_\varepsilon^2 - \frac{S(\phi, \mu, \theta)}{2\sigma_\varepsilon^2}$   (2.1.10)

Maximizing (2.1.10) with respect to the parameters φ, μ, and θ and using (2.1.9), we have

$\ln \hat{L} = -\frac{n}{2} \ln \hat{\sigma}_\varepsilon^2 - \frac{n}{2}(1 + \ln 2\pi)$   (2.1.11)

Since the second term in (2.1.11) is a constant, the AIC reduces to

$\mathrm{AIC}(M) = n \ln \hat{\sigma}_\varepsilon^2 + 2M$   (2.1.12)

Thus, we fit the candidate time series models and select the statistical process with the smallest AIC; the model so identified will possess the smallest average mean square error. In addition, we summarize the development of the model as follows:

- Check for stationarity of the time series by determining the order of differencing d, d = 0, 1, 2, ..., according to the KPSS test [12], until stationarity is achieved.
- Decide the order m of the process; in our case, we let m = 5, where p + q ≤ m.
- After (d, m) have been selected, list all possible sets of (p, q) with p + q ≤ m.
- For each set (p, q), estimate the parameters of the model, that is, φ_1, ..., φ_p, θ_1, ..., θ_q.
- Compute the AIC for each model and choose the one with the smallest AIC.

According to the criterion mentioned above, we obtain the ARIMA(p, d, q) model that best fits the given time series, with coefficients φ_1, ..., φ_p, θ_1, ..., θ_q.
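The selection procedure above lends itself to a short script. The sketch below is only an illustration under stated assumptions — Python with statsmodels' kpss and ARIMA routines; the dissertation does not specify the software actually used:

    import itertools
    import numpy as np
    from statsmodels.tsa.stattools import kpss
    from statsmodels.tsa.arima.model import ARIMA

    def choose_d(x, alpha=0.05, d_max=2):
        """Difference until the KPSS test no longer rejects stationarity."""
        d = 0
        y = np.asarray(x, dtype=float)
        while d < d_max:
            stat, p_value, _, _ = kpss(y, regression="c", nlags="auto")
            if p_value > alpha:          # fail to reject the stationarity null
                break
            y = np.diff(y)
            d += 1
        return d

    def best_arima(x, m=5):
        """Grid search over p + q <= m; return the (p, d, q) with smallest AIC."""
        d = choose_d(x)
        best = None
        for p, q in itertools.product(range(m + 1), repeat=2):
            if p + q == 0 or p + q > m:
                continue
            try:
                fit = ARIMA(x, order=(p, d, q)).fit()
            except Exception:
                continue                 # skip orders that fail to converge
            if best is None or fit.aic < best[1]:
                best = ((p, d, q), fit.aic)
        return best

The same loop is reused for the modified series of Sections 2.2-2.4, since only the input series changes.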


2.2 The k-th Moving Average Time Series Model

In this section, we propose a model constructed from the concept of the simple moving average. Before discussing the proposed model, we first define the simple moving average. In time series analysis, the primary use of the k-th simple moving average is to smooth a time series; it is very useful for discovering short-term and long-term trends and seasonal components. The k-th simple moving average process of a time series {x_t} is defined as

$y_t = \frac{1}{k} \sum_{j=0}^{k-1} x_{t-k+1+j}, \quad t = k, k+1, \ldots, n$   (2.2.1)

It is clear that as k increases, the number of observations of {y_t} decreases and the series {y_t} gets closer and closer to the mean of {x_t}. In particular, when k = n the series {y_t} reduces to a single observation equal to the sample mean, since

$y_n = \frac{1}{n} \sum_{j=1}^{n} x_j$   (2.2.2)

On the other hand, if we choose a fairly small k, we can smooth the edges of the series without losing too much of the general information.

We develop the proposed model by transforming the original time series {x_t} into {y_t} by applying (2.2.1). After establishing the new time series, usually still nonstationary, we reduce it to a stationary time series by selecting the appropriate differencing order. We then proceed with the model-building procedure, calculating the AIC at each stage; the best model is the one with the smallest AIC. Using the model developed for {y_t}, subject to the AIC criterion, we forecast values of {y_t} and apply the back-shift relation to obtain estimates of the original phenomenon {x_t}; that is,

$x_t = k\, y_t - x_{t-1} - x_{t-2} - \cdots - x_{t-k+1}$   (2.2.3)

In addition, we summarize the development of the model as follows:

- Transform the original time series {x_t} into {y_t} by applying (2.2.1).
- Check for stationarity of {y_t} by determining the order of differencing d, d = 0, 1, 2, ..., according to the KPSS test, until stationarity is achieved.
- Decide the order m of the process; in our case, we let m = 5, where p + q ≤ m.
- After (d, m) have been selected, list all possible sets of (p, q) with p + q ≤ m.
- For each set (p, q), estimate the parameters of the model, that is, φ_1, ..., φ_p, θ_1, ..., θ_q.
- Compute the AIC for each model and choose the one with the smallest AIC.
- Recover the estimates of the original time series by applying (2.2.3).

The proposed model and the corresponding procedure will be illustrated with a real economic application, and the results will be compared with the classical time series model.
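The transform (2.2.1) and its inverse (2.2.3) are simple to implement. The following is a minimal sketch; the function names are ours and NumPy is an assumption of the illustration:

    import numpy as np

    def kth_moving_average(x, k):
        """y_t = (x_{t-k+1} + ... + x_t) / k, defined for t = k, ..., n; eq. (2.2.1)."""
        x = np.asarray(x, dtype=float)
        return np.convolve(x, np.ones(k) / k, mode="valid")

    def back_transform(y_forecast, x_recent, k):
        """Invert (2.2.3): x_t = k*y_t - x_{t-1} - ... - x_{t-k+1}."""
        history = list(x_recent[-(k - 1):])     # the k-1 most recent observed values
        x_hat = []
        for y in y_forecast:
            x_new = k * y - sum(history)
            x_hat.append(x_new)
            history = history[1:] + [x_new]     # roll the window forward
        return np.array(x_hat)

For multi-step forecasts the inversion uses previously recovered values of x in place of unobserved ones, which is why the window is rolled forward inside the loop.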


2.3 The k-th Weighted Moving Average Time Series Model

Compared with the model proposed in the previous section, the model proposed here is more useful for time series that behave more aggressively. Before presenting the model, we first introduce the weighted moving average process. The k-th weighted moving average is also a very good smoothing tool; however, its structure differs slightly from the simple moving average in that it puts the most weight on the most recent observation, with the weight decreasing steadily back to the first observation in the window. It therefore tracks the original time series more closely than the simple moving average, which suits analysts who believe that recent observations should carry more weight than old ones. The k-th weighted moving average process of a time series {x_t} is defined as

$z_t = \frac{1}{k(k+1)/2} \sum_{j=0}^{k-1} (j+1)\, x_{t-k+1+j}, \quad t = k, k+1, \ldots, n$   (2.3.1)

Similar to the simple moving average process, as k increases the number of observations of {z_t} decreases, and when k = n the series reduces to the single value

$z_n = \frac{1}{n(n+1)/2} \sum_{j=1}^{n} j\, x_j$   (2.3.2)

If we choose a fairly small k, we can smooth the edges of the series, and {z_t} stays closer to {x_t} than the simple moving average discussed in the last section.

The proposed model begins by transforming the original time series {x_t} into {z_t} by applying (2.3.1). After establishing the new time series, we reduce it to a stationary time series by selecting the appropriate differencing order, proceed with the model-building procedure, and calculate the AIC at each stage; the best model is the one with the smallest AIC. Using the model developed for {z_t}, subject to the AIC criterion, we forecast values of {z_t} and recover estimates of the original phenomenon {x_t} from

$x_t = \frac{[k(k+1)/2]\, z_t - (k-1)\, x_{t-1} - (k-2)\, x_{t-2} - \cdots - x_{t-k+1}}{k}$   (2.3.3)

In addition, we summarize the development of the model as follows:

- Transform the original time series {x_t} into {z_t} by applying (2.3.1).
- Check for stationarity of {z_t} by determining the order of differencing d, d = 0, 1, 2, ..., according to the KPSS test, until stationarity is achieved.
- Decide the order m of the process; in our case, we let m = 5, where p + q ≤ m.
- After (d, m) have been selected, list all possible sets of (p, q) with p + q ≤ m.
- For each set (p, q), estimate the parameters of the model, that is, φ_1, ..., φ_p, θ_1, ..., θ_q.
- Compute the AIC for each model and choose the one with the smallest AIC.
- Recover the estimates of the original time series by applying (2.3.3).

The goodness of fit of the proposed model is illustrated in a later chapter with numerical examples.
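A minimal sketch of the weighted transform (2.3.1) follows; the triangular weights 1, 2, ..., k put the most weight on the newest observation. The function name is ours, and forecasts on the z-scale are mapped back to the x-scale by solving (2.3.3) in the same rolling fashion as the moving-average sketch above:

    import numpy as np

    def kth_weighted_moving_average(x, k):
        """z_t = (1*x_{t-k+1} + 2*x_{t-k+2} + ... + k*x_t) / (k(k+1)/2); eq. (2.3.1)."""
        x = np.asarray(x, dtype=float)
        w = np.arange(1, k + 1) / (k * (k + 1) / 2.0)
        return np.array([np.dot(w, x[t - k + 1:t + 1]) for t in range(k - 1, len(x))])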


2.4 The k-th Exponential Weighted Moving Average Time Series Model

As in the last two sections, before presenting the proposed model we first introduce the exponential weighted moving average method. The k-day exponential weighted moving average serves much the same purpose as the two averaging processes discussed above; instead of decreasing the weights linearly, as the weighted moving average does, it decreases them exponentially. This suits analysts who care most about the recent observations and pay little attention to old ones. In addition, it is not difficult to see that the exponential weighted moving average catches the original series faster than either of the previous averages, which makes it very useful for very aggressive time series. The k-th exponential weighted moving average process of a time series {x_t} is defined as

$v_t = \frac{\sum_{j=0}^{k-1} (1-\alpha)^{k-1-j}\, x_{t-k+1+j}}{\sum_{j=0}^{k-1} (1-\alpha)^j}, \quad t = k, k+1, \ldots, n$   (2.4.1)

where the smoothing factor is defined as α = 2/(k + 1). If we let k = n, we have α = 2/(n + 1). Moreover, $\sum_{j=0}^{k-1} (1-\alpha)^j$ reaches its maximum when k = 3 and gets closer and closer to 1 as k increases. As k increases, the number of observations of the series {v_t} decreases, and it eventually reduces to a single observation when k = n; in that case the series becomes

$v_n = \frac{\sum_{j=0}^{n-1} (1-\alpha)^{n-1-j}\, x_{j+1}}{\sum_{j=0}^{n-1} (1-\alpha)^j}$   (2.4.2)

From (2.4.2) it is obvious that the exponential weighted moving average weighs the most recent observation most heavily and decreases the weights exponentially as time recedes. If we choose a fairly small k, we can smooth the edges of the series, and {v_t} stays closest to {x_t} among the moving average processes discussed in the previous two sections.

We develop the proposed model by transforming the original time series {x_t} into {v_t} by applying (2.4.1). After obtaining the new time series, we select the appropriate differencing order to reduce {v_t} to a stationary time series, proceed with the model-building procedure, and calculate the AIC at each stage; the best model is the one with the smallest AIC. Using the model developed for {v_t}, subject to the AIC criterion, we forecast values of {v_t} and recover estimates of the original phenomenon {x_t} from

$x_t = v_t \sum_{j=0}^{k-1} (1-\alpha)^j - (1-\alpha)\, x_{t-1} - (1-\alpha)^2 x_{t-2} - \cdots - (1-\alpha)^{k-1} x_{t-k+1}$   (2.4.3)

In addition, we summarize the development of the model as follows:

- Transform the original time series {x_t} into {v_t} by applying (2.4.1).
- Check for stationarity of {v_t} by determining the order of differencing d, d = 0, 1, 2, ..., according to the KPSS test, until stationarity is achieved.
- Decide the order m of the process; in our case, we let m = 5, where p + q ≤ m.
- After (d, m) have been selected, list all possible sets of (p, q) with p + q ≤ m.
- For each set (p, q), estimate the parameters of the model, that is, φ_1, ..., φ_p, θ_1, ..., θ_q.
- Compute the AIC for each model and choose the one with the smallest AIC.
- Recover the estimates of the original time series by applying (2.4.3).

The proposed model and the corresponding procedure will be illustrated with a real economic application, and the results will be compared with the classical time series model.
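A minimal sketch of the exponential transform (2.4.1), with smoothing factor α = 2/(k + 1), is given below; the inverse relation (2.4.3) can be applied in the same rolling fashion as the earlier sketches. The names are illustrative only:

    import numpy as np

    def kth_exponential_weighted_moving_average(x, k):
        """v_t per (2.4.1): weights (1-a)^{k-1}, ..., (1-a), 1 from oldest to newest."""
        x = np.asarray(x, dtype=float)
        alpha = 2.0 / (k + 1)
        w = (1 - alpha) ** np.arange(k - 1, -1, -1)
        w = w / w.sum()                     # normalize by sum_{j=0}^{k-1} (1-a)^j
        return np.array([np.dot(w, x[t - k + 1:t + 1]) for t in range(k - 1, len(x))])

For k = 3 this gives α = 0.5 and window weights 0.25, 0.5, 1 (normalized by 1.75), which is exactly the EWMA3 series used in Chapter 3.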


2.5 The Multiplicative ARIMA Model

In time series analysis, seasonal variations sometimes dominate the variation of the original nonstationary time series. This occurs very commonly in environmental applications, where some kind of periodic trend is usually easy to recognize. We can treat a seasonal time series as a nonstationary time series that varies along a seasonal periodic trend; hence, addressing the seasonal variation in the model becomes extremely useful when we deal with these types of problems.

The multiplicative seasonal autoregressive integrated moving average (ARIMA) model is defined by

$\phi_p(B)\, \Phi_P(B^s)\, (1 - B)^d (1 - B^s)^D x_t = \theta_q(B)\, \Theta_Q(B^s)\, \varepsilon_t$   (2.5.1)

where p is the order of the autoregressive process, d is the order of regular differencing, q is the order of the moving average process, P is the order of the seasonal autoregressive process, D is the order of the seasonal differencing, Q is the order of the seasonal moving average process, and the subindex s refers to the seasonal period. We denote the subject model by ARIMA(p, d, q)×(P, D, Q)_s, with the operators defined as

$\phi_p(B) = 1 - \phi_1 B - \phi_2 B^2 - \cdots - \phi_p B^p$

$\theta_q(B) = 1 - \theta_1 B - \theta_2 B^2 - \cdots - \theta_q B^q$

$\Phi_P(B^s) = 1 - \Phi_1 B^s - \Phi_2 B^{2s} - \cdots - \Phi_P B^{Ps}$

$\Theta_Q(B^s) = 1 - \Theta_1 B^s - \Theta_2 B^{2s} - \cdots - \Theta_Q B^{Qs}$

The order of the multiplicative ARIMA model determines the structure of the model, and a good identification methodology is essential for developing the forecasting model. Below we summarize the model-identifying procedure:

- Determine the seasonal period s.
- Check for stationarity of the given time series {x_t} by determining the order of differencing d, d = 0, 1, 2, ..., according to the KPSS test, until stationarity is achieved.
- Decide the order m of the process; in our case, we let m = 5, where p + q + P + Q ≤ m.
- After (d, m) have been selected, list all possible configurations of (p, q, P, Q) with p + q + P + Q ≤ m.
- For each set (p, q, P, Q), estimate the parameters of the model, that is, φ_1, ..., φ_p, θ_1, ..., θ_q, Φ_1, ..., Φ_P, Θ_1, ..., Θ_Q.
- Compute the AIC for each model and choose the one with the smallest AIC.
- After (p, d, q, P, Q) are selected, determine the seasonal differencing filter by choosing the smaller AIC between the models with D = 0 and D = 1.

The final model will have the identified order (p, d, q, P, D, Q). The goodness of fit of the multiplicative ARIMA model is illustrated in a later chapter with environmental applications.
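The last step of the procedure — comparing D = 0 against D = 1 by AIC for a chosen (p, d, q, P, Q) — can be sketched as follows. The use of statsmodels' SARIMAX is an assumption of this illustration, not the software used in the study, and the example series name is hypothetical:

    from statsmodels.tsa.statespace.sarimax import SARIMAX

    def fit_seasonal(y, order, seasonal, s):
        """Fit ARIMA(p,d,q)x(P,D,Q)_s for D = 0 and D = 1; keep the smaller AIC."""
        p, d, q = order
        P, Q = seasonal
        fits = {}
        for D in (0, 1):
            model = SARIMAX(y, order=(p, d, q),
                            seasonal_order=(P, D, Q, s),
                            enforce_stationarity=False,
                            enforce_invertibility=False)
            fits[D] = model.fit(disp=False)
        best_D = min(fits, key=lambda D: fits[D].aic)
        return best_D, fits[best_D]

    # Example with monthly data (s = 12) and a candidate ARIMA(1,1,1)x(0,D,1)_12:
    # best_D, result = fit_seasonal(monthly_series, (1, 1, 1), (0, 1), 12)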


2.6 Conclusion

In the present chapter, we began by developing a step-by-step forecasting procedure for the classical ARIMA model. We then introduced our three proposed models, namely the k-th moving average time series model, the k-th weighted moving average time series model, and the k-th exponential weighted moving average time series model, which we believe perform better than the classical methodology. In addition, we developed a step-by-step procedure for each of the proposed models; these procedures should be very helpful when one needs to develop a model following our methodologies. Finally, we introduced the multiplicative ARIMA model, which addresses the seasonal variation encountered in the environmental applications of later chapters, together with a step-by-step procedure for that model.

Chapter 3
Econometrics Forecasting Models

3.0 Introduction

The object of the present chapter is to illustrate the results of our proposed forecasting models for two nonstationary stochastic realizations. The subject models are based on modifying each given time series into a new k-th moving average, k-th weighted moving average, and k-th exponential weighted moving average time series before beginning the development of the model. The study is based on the autoregressive integrated moving average process along with its analytical constraints; the analytical procedure of the proposed models is given in Chapter 2. The first series is a stock, XYZ, selected from the Fortune 500 companies, and its daily closing price constitutes the time series. The second series is the closing price of the S&P Price Index. Both the classical and the proposed forecasting models were developed, and a comparison of the accuracy of their responses is given.

3.1 Forecasting Models on Stock XYZ

In this section, we begin by collecting 500 observations for a stock from the Fortune 500 companies and forecast its daily closing value using the traditional approach versus the different forecasting models proposed in Chapter 2. We then examine each model by looking into the properties of its plots and residuals and rank each model based on the results of our findings. Finally, we hide the last 25 observations and forecast them without using any future information; the coefficients of the models are updated every time new information arrives. We then examine the last 25 residuals so that we can determine the goodness of fit of the models and draw conclusions based on these results.

3.1.1 Data Preparation

In order to properly illustrate the different types of moving average ARIMA models, we first create three additional time series, namely the 3-day moving average time series (MA3), the 3-day weighted moving average time series (WMA3), and the 3-day exponential weighted moving average time series (EWMA3), using the methodology discussed in Chapter 2. Let {x_t} (see Appendix A1) be the daily closing values of stock XYZ collected from the Fortune 500 companies mentioned earlier. A plot of the actual information is given in Figure 3.1.

[Figure 3.1 — Daily Closing Price for Stock XYZ (price vs. time, 500 daily observations).]

In order to apply the fitting procedure for our first proposed model, we must transform {x_t} into a 3-day moving average series {y_t} (see Appendix A2) by referring to (2.2.1). Hence, y_1 and y_2 are not available, and

$y_3 = \frac{x_1 + x_2 + x_3}{3}, \quad y_4 = \frac{x_2 + x_3 + x_4}{3}, \quad \ldots, \quad y_t = \frac{x_{t-2} + x_{t-1} + x_t}{3}$   (3.1.1)

Similarly, we create another series {z_t} by referring to (2.3.1), transforming {x_t} into a 3-day weighted moving average series (see Appendix A3). Hence, z_1 and z_2 are not available, and

$z_3 = \frac{x_1 + 2x_2 + 3x_3}{6}, \quad z_4 = \frac{x_2 + 2x_3 + 3x_4}{6}, \quad \ldots, \quad z_t = \frac{x_{t-2} + 2x_{t-1} + 3x_t}{6}$   (3.1.2)

The last transformation turns the original time series {x_t} into a 3-day exponential weighted moving average series {v_t} (see Appendix A4) by referring to (2.4.1); that is, v_1 and v_2 are not available, and

$v_3 = \frac{0.25x_1 + 0.5x_2 + x_3}{1.75}, \quad v_4 = \frac{0.25x_2 + 0.5x_3 + x_4}{1.75}, \quad \ldots, \quad v_t = \frac{0.25x_{t-2} + 0.5x_{t-1} + x_t}{1.75}$   (3.1.3)

After finishing all of the above transformations, we end up with four sets of time series observations: {x_t}, {y_t}, {z_t}, and {v_t}. We are now ready for the model-fitting procedures. In the following subsections, we follow the step-by-step procedures discussed in Chapter 2 for both the classical time series approach and our proposed methods.
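The three k = 3 transformations (3.1.1)-(3.1.3) can be reproduced with a few lines of code. The sketch below assumes pandas and uses our own helper name; it is an illustration, not the workflow of the study:

    import pandas as pd

    def build_smoothed_series(close, k=3):
        """Return the MA3, WMA3 and EWMA3 series defined by (3.1.1)-(3.1.3)."""
        x = pd.Series(close, dtype=float)
        ma = x.rolling(k).mean()                                   # y_t
        wma_w = pd.Series(range(1, k + 1), dtype=float)            # weights 1, 2, 3
        wma = x.rolling(k).apply(
            lambda w: (w * wma_w.values).sum() / wma_w.sum(), raw=True)   # z_t
        alpha = 2.0 / (k + 1)
        ew_w = pd.Series([(1 - alpha) ** (k - 1 - j) for j in range(k)])  # 0.25, 0.5, 1
        ewma = x.rolling(k).apply(
            lambda w: (w * ew_w.values).sum() / ew_w.sum(), raw=True)     # v_t
        return ma, wma, ewma

The first k − 1 entries of each returned series are undefined, matching the statement above that y_1, y_2, z_1, z_2, v_1, and v_2 are not available.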


Setting $\varepsilon_t = 0$, the one-day-ahead forecasting model for the closing price of stock XYZ is

$\hat{x}_t = 1.9631x_{t-1} - 0.9631x_{t-2} - 1.531\varepsilon_{t-1} + 0.0581\varepsilon_{t-2}$.   (3.1.5)

Using the above equation, we graph the forecasting values obtained with the classical approach on top of the original time series, shown as Figure 3.2.

Figure 3.2 Comparisons on Classical ARIMA Model VS. Original Time Series

The basic statistics that reflect the accuracy of model (3.1.5) are the mean $\bar{r}$, the variance $S_r^2$, the standard deviation $S_r$, and the standard error $S_r/\sqrt{n}$ of the residuals. Figure 3.3 gives a time series plot of the residuals, and Table 3.1 provides the basic statistics.
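For illustration, the order selection by smallest AIC described above can be sketched with statsmodels; the search grid and data source below are illustrative assumptions, not the author's exact implementation.

```python
import itertools
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

def best_arima_by_aic(x, max_p=3, d=1, max_q=3):
    """Fit ARIMA(p, d, q) over a small grid and keep the order with the lowest AIC."""
    best = None
    for p, q in itertools.product(range(max_p + 1), range(max_q + 1)):
        try:
            fit = ARIMA(x, order=(p, d, q)).fit()
        except Exception:
            continue  # some candidate orders may fail to converge
        if best is None or fit.aic < best[0]:
            best = (fit.aic, (p, d, q), fit)
    return best

# x = pd.read_csv("stock_xyz.csv")["close"]      # hypothetical file and column names
# aic, order, fit = best_arima_by_aic(x)         # e.g. order == (1, 1, 2) for stock XYZ
```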


Figure 3.3 Time Series Plot of the Residuals for Classical Model

Table 3.1 Basic Evaluation Statistics for Classical ARIMA
  Mean: 0.02209169   Variance (S_r^2): 0.1445187   Std. Dev. (S_r): 0.3801562   Std. Error (S_r/√n): 0.0170011

Furthermore, we restructure model (3.1.5) with n = 475 data points to forecast the last 25 observations using only previous information. The purpose is to see how accurate our forecast prices are with respect to the 25 actual values that have not been used. Table 3.2 gives the actual price, predicted price, and residuals between the forecasts and the 25 hidden values.
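The hidden tail is forecast one step at a time, re-estimating the coefficients as each new observation becomes available. A minimal sketch of that rolling scheme, assuming statsmodels and an array x of the 500 closing prices, is shown below before turning to Table 3.2.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def rolling_one_step(x, order=(1, 1, 2), n_train=475):
    """Re-fit the model each day and forecast one step ahead for the hidden tail."""
    forecasts = []
    for t in range(n_train, len(x)):
        fit = ARIMA(x[:t], order=order).fit()    # use only information available up to day t
        forecasts.append(fit.forecast(1)[0])     # one-day-ahead prediction
    return np.array(forecasts)

# preds = rolling_one_step(np.asarray(x))         # x: the 500 observed closing prices
# residuals = np.asarray(x[475:]) - preds         # compare against the 25 hidden values
```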


Table 3.2 Actual and Predicte d Price for Classical ARIMA N Actual Price Predicted Price Residuals 476 26.78 26.8473 -0.0673 477 26.75 26.7976 -0.0476 478 26.67 26.7673 -0.0972 479 26.8 26.6922 0.1078 480 26.73 26.8064 -0.0764 481 26.78 26.7490 0.0310 482 26.27 26.7911 -0.5211 483 26.12 26.3277 -0.2077 484 26.32 26.1631 0.1569 485 25.98 26.3364 -0.3564 486 25.86 26.0349 -0.1749 487 25.65 25.9068 -0.2568 488 25.67 25.6670 0.0031 489 26.02 25.7119 0.3081 490 26.01 26.0335 -0.0235 491 26.11 26.0427 0.0674 492 26.18 26.1343 0.0457 493 26.28 26.2032 0.0768 494 26.39 26.2986 0.0914 495 26.46 26.4043 0.0557 496 26.18 26.4743 -0.2943 497 26.32 26.2219 0.0981 498 26.16 26.3354 -0.1754 499 26.24 26.1953 0.0447 500 26.07 26.2602 -0.1902 The average of these residuals is 05608.0 r and the Figure 3.4 is a graph presentation of the forecasting result. 34


Figure 3.4 Classical ARIMA Forecasting on the Last 25 Observations

3.1.3 The 3-Day Moving Average ARIMA Models

Figure 3.5 shows the new time series {y_t} along with the original time series {x_t} that we shall use to develop the proposed forecasting model.

Figure 3.5 MA3 Series VS. The Original Time Series


Following the procedure we discussed in Section 2.2, we found the best model that characterizes the behavior of {y_t} to be the ARIMA(2,1,3); that is,

$(1 - 0.8961B - 0.0605B^2)(1 - B)y_t = (1 - 0.0056B^2 - 0.0056B^3)\varepsilon_t$.   (3.1.6)

Expanding the autoregressive operator and the first-difference filter, we have

$(1 - 1.8961B + 0.8356B^2 + 0.0605B^3)y_t = (1 - 0.0056B^2 - 0.0056B^3)\varepsilon_t$,

and the model can be written as

$y_t = 1.8961y_{t-1} - 0.8356y_{t-2} - 0.0605y_{t-3} + \varepsilon_t - 0.0056\varepsilon_{t-2} - 0.0056\varepsilon_{t-3}$.

The final analytical form of the proposed forecasting model can be written as

$\hat{y}_t = 1.8961y_{t-1} - 0.8356y_{t-2} - 0.0605y_{t-3} - 0.0056\varepsilon_{t-2} - 0.0056\varepsilon_{t-3}$.   (3.1.7)

Using the above equation, we graph the forecasting values obtained with the MA3 ARIMA approach on top of the original time series, shown as Figure 3.6.

Figure 3.6 Comparisons on MA3 ARIMA Model VS. Original Time Series

Note how close the two plots are, which reflects the quality of the proposed model.


Similar to the classical model approach discussed earlier, we use the first 475 observations {y_1, ..., y_475} to forecast y_476. Then we use the observations {y_1, ..., y_476} to forecast y_477, and we continue this process until we obtain forecasts for all of the remaining observations, that is, {y_476, y_477, ..., y_500}. From equation (2.2.3), the relationship between the forecasting values of the original series {x_t} and the forecasting values of the 3-day moving average series {y_t} is

$\hat{x}_t = 3\hat{y}_t - x_{t-1} - x_{t-2}$.   (3.1.8)

Hence, after estimating {y_476, ..., y_500}, we can use (3.1.8) to solve for the forecasting values of {x_t}. Figure 3.7 is the residual plot generated by the MA3 ARIMA model, followed by Table 3.3, which includes the basic evaluation statistics.

Figure 3.7 Time Series Plot of the Residuals for MA3 Model
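Equation (3.1.8), and its analogues (3.1.11) and (3.1.14) for the WMA3 and EWMA3 models, simply invert the smoothing step. A small sketch under those definitions:

```python
def back_transform_ma3(y_hat, x_prev1, x_prev2):
    """Invert y_t = (x_t + x_{t-1} + x_{t-2})/3, i.e. x_hat_t = 3*y_hat_t - x_{t-1} - x_{t-2}."""
    return 3.0 * y_hat - x_prev1 - x_prev2

def back_transform_wma3(z_hat, x_prev1, x_prev2):
    """Invert z_t = (x_{t-2} + 2 x_{t-1} + 3 x_t)/6, i.e. x_hat_t = (6*z_hat_t - 2*x_{t-1} - x_{t-2})/3."""
    return (6.0 * z_hat - 2.0 * x_prev1 - x_prev2) / 3.0

def back_transform_ewma3(v_hat, x_prev1, x_prev2):
    """Invert v_t = (0.25 x_{t-2} + 0.5 x_{t-1} + x_t)/1.75, i.e. x_hat_t = 1.75*v_hat_t - 0.5*x_{t-1} - 0.25*x_{t-2}."""
    return 1.75 * v_hat - 0.5 * x_prev1 - 0.25 * x_prev2
```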


Table 3.3 Basic Evaluation Statistics for MA3 ARIMA r 2 rS rS n Sr 0.01016814 0.1437259 0.3791119 0.01698841 Both of the above displayed evaluations reflect on accuracy of the proposed model. The actual daily closing pr ices of stock XYZ from the 476 th day along with the forecasted prices and residua ls are provided in Table 3.4. Table 3.4 Actual and Predicted Price for MA3 ARIMA N Actual Price Predicted Price Residuals 476 26.78 26.8931 -0.1131 477 26.75 26.7715 -0.0215 478 26.67 26.7121 -0.0421 479 26.8 26.7239 0.0761 480 26.73 26.7854 -0.0554 481 26.78 26.6892 0.0908 482 26.27 26.8292 -0.5592 483 26.12 26.3027 -0.1827 484 26.32 26.0808 0.2392 485 25.98 26.3603 -0.3803 486 25.86 25.9868 -0.1268 487 25.65 25.8443 -0.1943 488 25.67 25.7115 -0.0414 489 26.02 25.6499 0.3701 490 26.01 25.9650 0.0450 491 26.11 26.0526 0.0574 492 26.18 26.0912 0.0888 493 26.28 26.1449 0.1351 494 26.39 26.3090 0.0810 495 26.46 26.3752 0.0848 496 26.18 26.4223 -0.2423 497 26.32 26.2461 0.0739 498 26.16 26.2964 -0.1364 499 26.24 26.1437 0.0963 500 26.07 26.2678 -0.1978 38


The results given above attest to the good forecasting estimates for the hidden data. The average of these residuals is $\bar{r} = 0.0342$, and Figure 3.8 is a graphical presentation of the forecasting result.

Figure 3.8 MA3 ARIMA Forecasting on the Last 25 Observations

3.1.4 The 3-Day Weighted Moving Average ARIMA Models

Figure 3.9 shows the new time series {z_t} along with the original time series {x_t} that we shall use to develop the proposed forecasting model.


Figure 3.9 WMA3 Series VS. The Original Time Series

Following the procedure we have stated in Section 2.3, we found the best model that characterizes the behavior of {z_t} to be the ARIMA(1,1,3); that is,

$(1 - 0.9073B)(1 - B)z_t = (1 - 1.5084B + 0.8348B^2 - 0.2456B^3)\varepsilon_t$.   (3.1.9)

Expanding the autoregressive operator and the difference filter, we have

$(1 - 1.9073B + 0.9073B^2)z_t = (1 - 1.5084B + 0.8348B^2 - 0.2456B^3)\varepsilon_t$,

and the model can be written as

$z_t = 1.9073z_{t-1} - 0.9073z_{t-2} + \varepsilon_t - 1.5084\varepsilon_{t-1} + 0.8348\varepsilon_{t-2} - 0.2456\varepsilon_{t-3}$.

Setting $\varepsilon_t = 0$, the one-day-ahead forecasting series is

$\hat{z}_t = 1.9073z_{t-1} - 0.9073z_{t-2} - 1.5084\varepsilon_{t-1} + 0.8348\varepsilon_{t-2} - 0.2456\varepsilon_{t-3}$.   (3.1.10)

Using the above equation, we graph the forecasting values obtained with the WMA3 ARIMA approach on top of the original time series, shown as Figure 3.10.


Figure 3.10 Comparisons on WMA3 ARIMA Model VS. Original Time Series

Similarly, we use the first 475 observations {z_1, ..., z_475} to forecast z_476, then the observations {z_1, ..., z_476} to forecast z_477, and we continue this process until we obtain forecasts for all of the remaining observations, that is, {z_476, z_477, ..., z_500}. From equation (2.3.3), the relationship between the forecasting values of the original series {x_t} and the forecasting values of the 3-day weighted moving average series {z_t} is

$\hat{x}_t = \frac{6\hat{z}_t - 2x_{t-1} - x_{t-2}}{3}$.   (3.1.11)

Hence, after estimating {z_476, ..., z_500}, we can use (3.1.11) to solve for the forecasting values of {x_t}. Figure 3.11 is the residual plot generated by our proposed model, followed by Table 3.5, which includes the basic evaluation statistics.


Figure 3.11 Time Series Plot of the Residuals for WMA3 Model

Table 3.5 Basic Evaluation Statistics for WMA3 ARIMA
  Mean: 0.00866631   Variance (S_r^2): 0.1578446   Std. Dev. (S_r): 0.3972966   Std. Error (S_r/√n): 0.01776764

A detailed ranking comparison between the models is given in a later section, and Table 3.6 shows a head-to-head comparison between the actual and predicted prices for the WMA3 ARIMA model.


Table 3.6 Actual and Predicted Price for WMA3 ARIMA N Actual Price Predicted Price Residuals 476 26.78 26.8435 -0.0635 477 26.75 26.77416 -0.0242 478 26.67 26.76265 -0.0926 479 26.8 26.6686 0.1314 480 26.73 26.79599 -0.0660 481 26.78 26.72857 0.0514 482 26.27 26.7817 -0.5117 483 26.12 26.30523 -0.1852 484 26.32 26.13652 0.1835 485 25.98 26.30601 -0.3260 486 25.86 25.98931 -0.1293 487 25.65 25.89196 -0.2420 488 25.67 25.65289 0.0171 489 26.02 25.67122 0.3488 490 26.01 25.9955 0.0145 491 26.11 26.00049 0.1095 492 26.18 26.11326 0.0667 493 26.28 26.1717 0.1083 494 26.39 26.26891 0.1211 495 26.46 26.38754 0.0725 496 26.18 26.44883 -0.2688 497 26.32 26.20532 0.1147 498 26.16 26.31132 -0.1513 499 26.24 26.16914 0.0709 500 26.07 26.23146 -0.1615 The average of these residuals is 0325.0 r and the Figure 3.12 is a graph presentation of the forecasting result. 43


Figure 3.12 WMA3 ARIMA Forecasting on the Last 25 Observations

3.1.5 The 3-Day Exponential Weighted Moving Average ARIMA Models

Figure 3.13 shows the new time series {v_t} along with the original time series {x_t} that we shall use to develop the proposed forecasting model.

Figure 3.13 EWMA3 Series VS. The Original Time Series


Following the procedure we have stated in Section 2.4, we found the best model that characterizes the behavior of {v_t} to be the ARIMA(3,1,2); that is,

$(1 - 0.4766B - 0.9045B^2)(1 - B)v_t = (1 - 0.4362B + 0.0728B^2 - 0.9071B^3)\varepsilon_t$.   (3.1.12)

Expanding the autoregressive operator and the difference filter, we have

$(1 - 1.4766B - 0.4279B^2 + 0.9045B^3)v_t = (1 - 0.4362B + 0.0728B^2 - 0.9071B^3)\varepsilon_t$,

and we rewrite the model as

$v_t = 1.4766v_{t-1} + 0.4279v_{t-2} - 0.9045v_{t-3} + \varepsilon_t - 0.4362\varepsilon_{t-1} + 0.0728\varepsilon_{t-2} - 0.9071\varepsilon_{t-3}$.

Setting $\varepsilon_t = 0$, the one-day-ahead forecasting series is

$\hat{v}_t = 1.4766v_{t-1} + 0.4279v_{t-2} - 0.9045v_{t-3} - 0.4362\varepsilon_{t-1} + 0.0728\varepsilon_{t-2} - 0.9071\varepsilon_{t-3}$.   (3.1.13)

Using the above equation, we graph the forecasting values obtained with the EWMA3 ARIMA approach on top of the original time series, shown as Figure 3.14.

Figure 3.14 Comparisons on EWMA3 ARIMA Model VS. Original Time Series

Similarly, we use the first 475 observations {v_1, ..., v_475} to forecast v_476, then the observations {v_1, ..., v_476} to forecast v_477, and we continue this process


until we obtain forecasts for all of the remaining observations, that is, {v_476, v_477, ..., v_500}. From equation (2.4.3), the relationship between the forecasting values of the original series {x_t} and the forecasting values of the 3-day exponential weighted moving average series {v_t} can be derived as

$\hat{x}_t = 1.75\hat{v}_t - 0.5x_{t-1} - 0.25x_{t-2}$.   (3.1.14)

Hence, after estimating {v_476, ..., v_500}, we can use (3.1.14) to solve for the forecasting values of {x_t}. Figure 3.15 is the residual plot generated by our proposed model, followed by Table 3.7, which includes the basic evaluation statistics.

Figure 3.15 Time Series Plot of the Residuals for EWMA3 Model

Table 3.7 Basic Evaluation Statistics for EWMA3 ARIMA
  Mean: 0.008076663   Variance (S_r^2): 0.1573456   Std. Dev. (S_r): 0.3966682   Std. Error (S_r/√n): 0.01773954


A detail ranking comparison between models will be illustrated in latter section, and Table 3.8 shows a head to head comparison be tween the actual and predicted price for WMA3 ARIMA model. Table 3.8 Actual and Predicted Price for EWMA3 ARIMA N Actual Price Predicted Price Residuals 476 26.78 26.86504 -0.0850 477 26.75 26.80201 -0.0520 478 26.67 26.79553 -0.1255 479 26.8 26.69121 0.1088 480 26.73 26.81608 -0.0861 481 26.78 26.75107 0.0289 482 26.27 26.81218 -0.5422 483 26.12 26.3397 -0.2197 484 26.32 26.16653 0.1535 485 25.98 26.30794 -0.3279 486 25.86 26.02086 -0.1609 487 25.65 25.94825 -0.2983 488 25.67 25.68467 -0.0147 489 26.02 25.71181 0.3082 490 26.01 26.03114 -0.0211 491 26.11 26.05992 0.0501 492 26.18 26.18416 -0.0042 493 26.28 26.22193 0.0581 494 26.39 26.31487 0.0751 495 26.46 26.43036 0.0296 496 26.18 26.48979 -0.3098 497 26.32 26.25195 0.0680 498 26.16 26.34178 -0.1818 499 26.24 26.19074 0.0493 500 26.07 26.2655 -0.1955 The average of these residuals is 0678.0 r and the Figure 3.16 is a graph presentation of the forecasting result. 47


Figure 3.16 EWMA3 ARIMA Forecasting on the Last 25 Observations

3.2 Forecasting Models on S&P Price Index

In the following application, we use the S&P Price Index and consider its daily closing price for 500 days to constitute the time series {x_t}. A plot of the actual data is given by Figure 3.17.

Figure 3.17 Daily Closing Price for S&P Price Index


3.2.1 Data Preparation

To proceed with the model building process, we first create another three time series, namely the 3-day moving average series (MA3), the 3-day weighted moving average series (WMA3), and the 3-day exponential weighted moving average series (EWMA3), using the methodologies discussed in Chapter 2. Suppose we let {x_t} (see Appendix B1) be the daily closing values of the S&P Price Index mentioned earlier. In order to proceed with the fitting procedure for our proposed models, we transform {x_t} into a 3-day moving average series {y_t} (see Appendix B2), a 3-day weighted moving average series {z_t} (see Appendix B3), and a 3-day exponential weighted moving average series {v_t} (see Appendix B4) by referring to (3.1.1), (3.1.2), and (3.1.3), respectively.

After finishing all of the above transformations, we end up with four sets of time series observations, namely {x_t}, {y_t}, {z_t}, and {v_t}. Hence, we are now ready to proceed with the model fitting procedures. In the following subsections, we follow the step-by-step procedures discussed in Chapter 2 for both the classical time series approach and our proposed methods.

3.2.2 The General ARIMA Model

Following the step-by-step procedure introduced in Section 2.1, the classical forecasting model with the best AIC score is the ARIMA(0,1,2); that is, a second-order moving average (MA) with a first-difference filter. Thus we can write it as

$(1 - B)x_t = (1 - 0.331B + 0.1104B^2)\varepsilon_t$.   (3.2.1)


After expanding the difference filter, we have

$x_t - x_{t-1} = (1 - 0.331B + 0.1104B^2)\varepsilon_t$,

and the model can be written as

$x_t = x_{t-1} + \varepsilon_t - 0.331\varepsilon_{t-1} + 0.1104\varepsilon_{t-2}$.

Setting $\varepsilon_t = 0$, the one-day-ahead forecasting model for the closing price of the S&P Price Index is

$\hat{x}_t = x_{t-1} - 0.331\varepsilon_{t-1} + 0.1104\varepsilon_{t-2}$.   (3.2.2)

Using the above equation, we graph the forecasting values obtained with the classical approach on top of the original time series, shown as Figure 3.18.

Figure 3.18 Comparisons on Classical ARIMA Model VS. Original Time Series

The basic statistics that reflect the accuracy of model (3.2.2) are the mean $\bar{r}$, the variance $S_r^2$, the standard deviation $S_r$, and the standard error $S_r/\sqrt{n}$ of the residuals. Figure 3.19 gives a time series plot of the residuals, and Table 3.9 provides the basic statistics.
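These residual summaries are used for every model in the chapter. A small helper, given here only as a sketch (the MSE column appears in the later chapters):

```python
import numpy as np

def residual_stats(actual, predicted):
    """Mean, variance, standard deviation, standard error, and MSE of the residuals."""
    r = np.asarray(actual) - np.asarray(predicted)
    n = len(r)
    return {
        "mean": r.mean(),
        "variance": r.var(ddof=1),              # sample variance S_r^2
        "std_dev": r.std(ddof=1),               # S_r
        "std_error": r.std(ddof=1) / np.sqrt(n),
        "mse": np.mean(r ** 2),
    }
```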


Figure 3.19 Time Series Plot of the Residuals for Classical Model

Table 3.9 Basic Evaluation Statistics for Classical ARIMA
  Mean: 0.5621225   Variance (S_r^2): 60.76364   Std. Dev. (S_r): 7.795104   Std. Error (S_r/√n): 0.3493070

Furthermore, we restructure model (3.2.2) with n = 475 data points to forecast the last 25 observations using only previous information. The purpose is to see how accurate our forecast prices are with respect to the 25 actual values that have not been used. Table 3.10 gives the actual price, predicted price, and residuals between the forecasts and the 25 hidden values.


Table 3.10 Actual and Predicte d Price for Classical ARIMA N Actual Price Predicted Price Residuals 476 1422.95 1430.7690 -7.8190 477 1427.99 1422.8510 5.1390 478 1440.13 1428.5800 11.5500 479 1423.9 1439.1420 -15.2420 480 1422.18 1423.3170 -1.1370 481 1420.62 1423.8770 -3.2570 482 1428.82 1420.8820 7.9380 483 1438.24 1428.8140 9.4260 484 1445.94 1436.9820 8.9580 485 1448.39 1444.5870 3.8030 486 1446.99 1447.3060 -0.3160 487 1448 1446.6020 1.3980 488 1450.02 1447.9800 2.0400 489 1448.31 1449.7950 -1.4850 490 1438.06 1448.1510 -10.0910 491 1433.37 1438.5980 -5.2280 492 1444.26 1434.6300 9.6300 493 1455.3 1444.4730 10.8270 494 1456.81 1453.8570 2.9530 495 1455.54 1455.5140 0.0260 496 1459.68 1455.2130 4.4670 497 1457.63 1459.5290 -1.8990 498 1456.38 1457.2000 -0.8200 499 1451.19 1456.6180 -5.4280 500 1449.37 1451.4620 -2.0920 The average of these residuals is 9336.0 r and the Figure 3.20 is a graph presentation of the forecasting result. 52


Figure 3.20 Classical ARIMA Forecasting on the Last 25 Observations

3.2.3 The 3-Day Moving Average ARIMA Models

Figure 3.21 shows the new time series {y_t} along with the original time series {x_t} that we shall use to develop the proposed forecasting model.

Figure 3.21 MA3 Series VS. The Original Time Series


Following the procedure we discussed in Section 2.2, we found the best model that characterizes the behavior of {y_t} to be the ARIMA(0,1,4); that is,

$(1 - B)y_t = (1 - 0.9608B + 0.8441B^2 - 0.1562B^3 + 0.1132B^4)\varepsilon_t$.   (3.2.3)

Expanding the difference filter, we have

$y_t - y_{t-1} = (1 - 0.9608B + 0.8441B^2 - 0.1562B^3 + 0.1132B^4)\varepsilon_t$,

and the model can be written as

$y_t = y_{t-1} + \varepsilon_t - 0.9608\varepsilon_{t-1} + 0.8441\varepsilon_{t-2} - 0.1562\varepsilon_{t-3} + 0.1132\varepsilon_{t-4}$.

The final analytical form of the proposed forecasting model can be written as

$\hat{y}_t = y_{t-1} - 0.9608\varepsilon_{t-1} + 0.8441\varepsilon_{t-2} - 0.1562\varepsilon_{t-3} + 0.1132\varepsilon_{t-4}$.   (3.2.4)

Using the above equation, we graph the forecasting values obtained with the MA3 ARIMA approach on top of the original time series, shown as Figure 3.22.

Figure 3.22 Comparisons on MA3 ARIMA Model VS. Original Time Series

Note how close the two plots are, which reflects the quality of the proposed model.


Similar to the classical model approach discussed earlier, we use the first 475 observations {y_1, ..., y_475} to forecast y_476, then the observations {y_1, ..., y_476} to forecast y_477, and we continue this process until we obtain forecasts for all of the remaining observations, that is, {y_476, y_477, ..., y_500}. From equation (2.2.3), the relationship between the forecasting values of the original series {x_t} and the forecasting values of the 3-day moving average series {y_t} is again

$\hat{x}_t = 3\hat{y}_t - x_{t-1} - x_{t-2}$.   (3.1.8)

Hence, after estimating {y_476, ..., y_500}, we can use (3.1.8) to solve for the forecasting values of {x_t}. Figure 3.23 is the residual plot generated by the MA3 ARIMA model, followed by Table 3.11, which includes the basic evaluation statistics.

Figure 3.23 Time Series Plot of the Residuals for MA3 Model


Table 3.11 Basic Evaluation St atistics for MA3 ARIMA r 2 rS rS n Sr 0.0006352779 60.60183 7.784718 0.3488415 Both of the above displayed evaluations reflect on accuracy of the proposed model. The actual daily closing pr ices of stock XYZ from the 476 th day along with the forecasted prices and residua ls are provided in Table 3.12. Table 3.12 Actual and Predicted Price for MA3 ARIMA N Actual Price Predicted Price Residuals 476 1422.95 1430.3760 -7.4260 477 1427.99 1422.7950 5.1950 478 1440.13 1429.0290 11.1010 479 1423.9 1438.7110 -14.8110 480 1422.18 1423.2030 -1.0230 481 1420.62 1424.5170 -3.8970 482 1428.82 1420.4510 8.3690 483 1438.24 1428.5580 9.6820 484 1445.94 1437.5200 8.4200 485 1448.39 1444.1100 4.2800 486 1446.99 1447.1560 -0.1660 487 1448 1447.2460 0.7540 488 1450.02 1447.5280 2.4920 489 1448.31 1449.6110 -1.3010 490 1438.06 1448.7910 -10.7310 491 1433.37 1438.1860 -4.8160 492 1444.26 1434.5520 9.7080 493 1455.3 1444.9280 10.3720 494 1456.81 1453.3290 3.4810 495 1455.54 1455.4640 0.0760 496 1459.68 1455.7900 3.8900 497 1457.63 1459.0210 -1.3910 498 1456.38 1457.0970 -0.7170 499 1451.19 1457.2440 -6.0540 500 1449.37 1450.9800 -1.6100 56


The results given above attest to the good forecasting estimates for the hidden data. The average of these residuals is $\bar{r} = 0.9551$, and Figure 3.24 is a graphical presentation of the forecasting result.

Figure 3.24 MA3 ARIMA Forecasting on the Last 25 Observations

3.2.4 The 3-Day Weighted Moving Average ARIMA Models

Figure 3.25 shows the new time series {z_t} along with the original time series {x_t} that we shall use to develop the proposed forecasting model.


Figure 3.25 WMA3 Series VS. The Original Time Series

Following the procedure we have stated in Section 2.3, we found the best model that characterizes the behavior of {z_t} to be the ARIMA(0,1,2); that is,

$(1 - B)z_t = (1 - 0.6339B + 0.2249B^2)\varepsilon_t$.   (3.2.5)

Expanding the difference filter, we have

$z_t - z_{t-1} = (1 - 0.6339B + 0.2249B^2)\varepsilon_t$,

and the model can be written as

$z_t = z_{t-1} + \varepsilon_t - 0.6339\varepsilon_{t-1} + 0.2249\varepsilon_{t-2}$.

Setting $\varepsilon_t = 0$, the one-day-ahead forecasting series is

$\hat{z}_t = z_{t-1} - 0.6339\varepsilon_{t-1} + 0.2249\varepsilon_{t-2}$.   (3.2.6)

Using the above equation, we graph the forecasting values obtained with the WMA3 ARIMA approach on top of the original time series, shown as Figure 3.26.


Figure 3.26 Comparisons on WMA3 ARIMA Model VS. Original Time Series

Similarly, we use the first 475 observations {z_1, ..., z_475} to forecast z_476, then the observations {z_1, ..., z_476} to forecast z_477, and we continue this process until we obtain forecasts for all of the remaining observations, that is, {z_476, z_477, ..., z_500}. From equation (2.3.3), the relationship between the forecasting values of the original series {x_t} and the forecasting values of the 3-day weighted moving average series {z_t} is again

$\hat{x}_t = \frac{6\hat{z}_t - 2x_{t-1} - x_{t-2}}{3}$.   (3.1.11)

Hence, after estimating {z_476, ..., z_500}, we can use (3.1.11) to solve for the forecasting values of {x_t}. Figure 3.27 is the residual plot generated by our proposed model, followed by Table 3.13, which includes the basic evaluation statistics.


Figure 3.27 Time Series Plot of the Residuals for WMA3 Model

Table 3.13 Basic Evaluation Statistics for WMA3 ARIMA
  Mean: -0.005440737   Variance (S_r^2): 60.87297   Std. Dev. (S_r): 7.802113   Std. Error (S_r/√n): 0.3496211

A detailed ranking comparison between the models is given in a later section, and Table 3.14 shows a head-to-head comparison between the actual and predicted prices for the WMA3 ARIMA model.


Table 3.14 Actual and Predicted Price for WMA3 ARIMA N Actual Price Predicted Price Residuals 476 1422.95 1430.643 -7.6930 477 1427.99 1422.655 5.3350 478 1440.13 1428.753 11.3770 479 1423.9 1438.699 -14.7990 480 1422.18 1423.973 -1.7930 481 1420.62 1424.451 -3.8310 482 1428.82 1419.478 9.3420 483 1438.24 1428.872 9.3680 484 1445.94 1437.154 8.7860 485 1448.39 1445.274 3.1160 486 1446.99 1448.124 -1.1340 487 1448 1447.094 0.9060 488 1450.02 1448.109 1.9110 489 1448.31 1449.747 -1.4370 490 1438.06 1448.302 -10.2420 491 1433.37 1438.673 -5.3030 492 1444.26 1434.242 10.0180 493 1455.3 1443.687 11.6130 494 1456.81 1453.938 2.8720 495 1455.54 1456.578 -1.0380 496 1459.68 1455.873 3.8070 497 1457.63 1459.531 -1.9010 498 1456.38 1457.274 -0.8940 499 1451.19 1456.895 -5.7050 500 1449.37 1451.229 -1.8590 The average of these residuals is 8329.0 r and the Figure 3.28 is a graph presentation of the forecasting result. 61


Figure 3.28 WMA3 ARIMA Forecasting on the Last 25 Observations

3.2.5 The 3-Day Exponential Weighted Moving Average ARIMA Models

Figure 3.29 shows the new time series {v_t} along with the original time series {x_t} that we shall use to develop the proposed forecasting model.

Figure 3.29 EWMA3 Series VS. The Original Time Series


Following the procedure we have stated in Section 2.4, we found the best model that characterizes the behavior of {v_t} to be the ARIMA(3,1,2); that is,

$(1 - 0.9617B + 0.1296B^2 - 0.0671B^3)(1 - B)v_t = (1 - 0.5144B + 0.1946B^2)\varepsilon_t$.   (3.2.7)

Expanding the autoregressive operator and the difference filter, we have

$(1 - 1.9617B + 1.0913B^2 - 0.1967B^3 + 0.0671B^4)v_t = (1 - 0.5144B + 0.1946B^2)\varepsilon_t$,

and we rewrite the model as

$v_t = 1.9617v_{t-1} - 1.0913v_{t-2} + 0.1967v_{t-3} - 0.0671v_{t-4} + \varepsilon_t - 0.5144\varepsilon_{t-1} + 0.1946\varepsilon_{t-2}$.

Setting $\varepsilon_t = 0$, the one-day-ahead forecasting series is

$\hat{v}_t = 1.9617v_{t-1} - 1.0913v_{t-2} + 0.1967v_{t-3} - 0.0671v_{t-4} - 0.5144\varepsilon_{t-1} + 0.1946\varepsilon_{t-2}$.   (3.2.8)

Using the above equation, we graph the forecasting values obtained with the EWMA3 ARIMA approach on top of the original time series, shown as Figure 3.30.

Figure 3.30 Comparisons on EWMA3 ARIMA Model VS. Original Time Series

Similarly, we use the first 475 observations {v_1, ..., v_475} to forecast v_476, then the observations {v_1, ..., v_476} to forecast v_477, and we continue this process


until we obtain forecasts for all of the remaining observations, that is, {v_476, v_477, ..., v_500}. From equation (2.4.3), the relationship between the forecasting values of the original series {x_t} and the forecasting values of the 3-day exponential weighted moving average series {v_t} is again

$\hat{x}_t = 1.75\hat{v}_t - 0.5x_{t-1} - 0.25x_{t-2}$.   (3.1.14)

Hence, after estimating {v_476, ..., v_500}, we can use (3.1.14) to solve for the forecasting values of {x_t}. Figure 3.31 is the residual plot generated by our proposed model, followed by Table 3.15, which includes the basic evaluation statistics.

Figure 3.31 Time Series Plot of the Residuals for EWMA3 Model

Table 3.15 Basic Evaluation Statistics for EWMA3 ARIMA
  Mean: 0.01651362   Variance (S_r^2): 60.42038   Std. Dev. (S_r): 7.773055   Std. Error (S_r/√n): 0.3483189


A detail ranking comparison between models will be illustrated in latter section, and Table 3.16 shows a head to head comparison between the actual and predicted price for WMA3 ARIMA model. Table 3.16 Actual and Predic ted Price for EWMA3 ARIMA N Actual Price Predicted Price Residuals 476 1422.95 1430.186 -7.2360 477 1427.99 1422.34 5.6500 478 1440.13 1428.754 11.3760 479 1423.9 1438.368 -14.4680 480 1422.18 1423.878 -1.6980 481 1420.62 1424.578 -3.9580 482 1428.82 1419.391 9.4290 483 1438.24 1428.912 9.3280 484 1445.94 1437.024 8.9160 485 1448.39 1445.242 3.1480 486 1446.99 1447.595 -0.6050 487 1448 1446.488 1.5120 488 1450.02 1447.386 2.6340 489 1448.31 1449.043 -0.7330 490 1438.06 1447.769 -9.7090 491 1433.37 1438.381 -5.0110 492 1444.26 1434.177 10.0830 493 1455.3 1443.617 11.6830 494 1456.81 1453.801 3.0090 495 1455.54 1456.459 -0.9190 496 1459.68 1455.642 4.0380 497 1457.63 1458.982 -1.3520 498 1456.38 1456.648 -0.2680 499 1451.19 1456.533 -5.3430 500 1449.37 1450.962 -1.5920 The average of these residuals is 1166.1 r and the Figure 3.32 is a graph presentation of the forecasting result. 65


Figure 3.32 EWMA3 ARIMA Forecasting on the Last 25 Observations

3.3 Evaluations on Proposed Models VS. Classical Method

In the previous sections, we discussed the classical ARIMA, MA3 ARIMA, WMA3 ARIMA, and EWMA3 ARIMA models for both stock XYZ and the S&P Price Index. We shall now compare the performance of each model by examining their basic statistical properties. Table 3.17 shows the comparison between the models for stock XYZ.

Table 3.17 Ranking Comparison on Stock XYZ

  Model             Mean         Variance (S_r^2)  Std. Dev. (S_r)  Std. Error (S_r/√n)  Order
  Classical ARIMA   0.02209169   0.1445187         0.3801562        0.0170011            (1,1,2)
  MA3 ARIMA         0.01016814   0.1437259         0.3791119        0.01698841           (2,1,3)
  WMA3 ARIMA        0.00866631   0.1578446         0.3972966        0.01776764           (1,1,3)
  EWMA3 ARIMA       0.008076663  0.1573456         0.3966682        0.01773954           (3,1,2)


According to Table 3.17, the MA3 ARIMA model performs best in forecasting stock XYZ compared with the other models.

Table 3.18 Ranking Comparison on S&P Price Index

  Model             Mean           Variance (S_r^2)  Std. Dev. (S_r)  Std. Error (S_r/√n)  Order
  Classical ARIMA   0.5621225      60.76364          7.795104         0.3493070            (0,1,2)
  MA3 ARIMA         -0.0006352779  60.60183          7.784718         0.3488415            (0,1,4)
  WMA3 ARIMA        -0.005440737   60.87297          7.802113         0.3496211            (0,1,2)
  EWMA3 ARIMA       0.01651362     60.42038          7.773055         0.3483189            (3,1,2)

According to Table 3.18, the EWMA3 ARIMA model performs best in forecasting the S&P Price Index among all of the models. In addition, the MA3 ARIMA model also performs better than the classical ARIMA model, which speaks to the quality of our proposed models.

3.4 Conclusion

In the present chapter, we showed the usefulness and effectiveness of our proposed models by applying them to two different economic time series, namely the daily closing price of stock XYZ and the daily closing price of the S&P Price Index. In both cases, once we obtained our proposed models, we compared them with the classical approach and ranked their efficiency by examining basic statistical criteria of the residuals. In addition, by hiding the last 25 observations and predicting them, we showed that our proposed models perform well without knowing future information. The encouraging results speak to the quality of the models.


Chapter 4
Global Warming: Atmospheric Temperature Forecasting Model

4.0 Introduction

Temperature plays a very important role in global warming and in its relation with carbon dioxide. The aim of the present chapter is to develop a statistical forecasting model for the temperature in the Continental United States. There are two methods being used in recording temperatures, and we shall refer to them as the Version 1 (see Appendix C1) and Version 2 (see Appendix C2) data sets. Thus, an additional aim of the present study is to determine whether the two methods of recording temperatures are indeed different. Version 1 data was collected by the United States Climate Division, USCD, and Version 2 data by the United States Historical Climatology Network, USHCN.

The Version 1 dataset consists of monthly mean temperature and precipitation for all 344 climate divisions in the contiguous U.S. from January 1895 to June 2007. The data is adjusted for time-of-observation bias; however, no other adjustments are made for inhomogeneities. These inhomogeneities include changes in instrumentation, observer, and observation practices, station and instrumentation moves, and changes in station composition resulting from stations closing and opening over time within a division. For additional information concerning Version 1 of the data, see (Easterling & Peterson, 1995; Karl et al., 1986; Karl & Williams, 1987; Karl et al., 1988; Karl et al., 1990; Peterson & Easterling, 1994; Quayle et al., 1991). A graphical presentation of the Version 1 dataset is given by Figure 4.1.


Figure 4.1 Time Series Plot for Monthly Temperature for 1895-2007 (Version 1)

The Version 2 dataset first became available in July 2007, and it consists of data from a network of 1219 stations in the contiguous United States that were defined by scientists at the Global Change Research Program of the U.S. Department of Energy at the National Climate Data Center. A methodology was developed and applied to test known station changes for their impact on homogeneity, and necessary adjustments were made if the changes caused a statistically significant response in the time series. They claim that the data set is a consistent network through time, which minimizes any biasing due to network changes over time. For information on Version 2 of the time series, see (Alexandersson & Moberg, 1997; Baker, 1975; Easterling et al., 1996; Easterling et al., 1999; Hughes et al., 1992; Karl et al., 1990; Karl et al., 1988; Karl et al., 1986; Karl & Williams, 1987; Lund & Reeves, 2002; Menne & Williams, 2005; Quinlan et al., 1987; Vose et al., 2003; Wang, 2003). A graphical presentation of the Version 2 dataset is given by Figure 4.2.


Figure 4.2 Time Series Plot for Monthly Temperature for 1895-2007 (Version 2)

4.1 Analytical Procedure

The multiplicative seasonal autoregressive integrated moving average (ARIMA) model is defined by

$\phi_p(B)\Phi_P(B^s)(1 - B)^d(1 - B^s)^D x_t = \theta_q(B)\Theta_Q(B^s)\varepsilon_t$,   (4.1.1)

where p is the order of the autoregressive process, d is the order of regular differencing, q is the order of the moving average process, P is the order of the seasonal autoregressive process, D is the order of the seasonal differencing, Q is the order of the seasonal moving average process, and the subindex s refers to the seasonal period. We shall denote the subject model by ARIMA(p,d,q)x(P,D,Q)_s, with the operators $\phi_p(B)$, $\theta_q(B)$, $\Phi_P(B^s)$, and $\Theta_Q(B^s)$ defined as follows:

$\phi_p(B) = 1 - \phi_1 B - \phi_2 B^2 - \cdots - \phi_p B^p$,

$\theta_q(B) = 1 - \theta_1 B - \theta_2 B^2 - \cdots - \theta_q B^q$,

$\Phi_P(B^s) = 1 - \Phi_1 B^s - \Phi_2 B^{2s} - \cdots - \Phi_P B^{Ps}$,


and

$\Theta_Q(B^s) = 1 - \Theta_1 B^s - \Theta_2 B^{2s} - \cdots - \Theta_Q B^{Qs}$.

The order of the multiplicative ARIMA model determines the structure of the model, and it is essential to have a good methodology for developing the forecasting model. In the present study, we start by addressing the issue of the seasonal subindex s. After examining the original data, shown by Figures 4.1 and 4.2, we have reason to believe the monthly temperature of the Continental United States behaves as a periodic function with a cycle of 12 months; hence, we let the seasonal subindex s = 12. In time series analysis, one cannot proceed with a model building procedure without confirming the stationarity of a given stochastic realization, so we test the overall stationarity of the series using the method introduced by Kwiatkowski, Phillips, Schmidt, and Shin in 1992 (Kwiatkowski et al., 1992). Once the order of differencing is identified, it is common to have several sets of (p, q, P, Q) that all adequately represent a given time series. Akaike's information criterion, AIC (Akaike, 1974), plays a major role in our model selection process; we choose the set of (p, q, P, Q) that produces the smallest AIC.

Another important aspect of our model selection process is determining the seasonal differencing D; the goal is to select a smaller AIC without overcomplicating the selected model. Hence, we compute the AIC for both D = 0 and D = 1 based on our previous selection of the orders (p, d, q, P, Q), and choose the model with the smaller AIC as our final model. Below we summarize the model identification procedure:


- Determine the seasonal period s.
- Check for stationarity of the given time series {x_t} by determining the order of differencing d, where d = 0, 1, 2, ..., according to the KPSS test, until we achieve stationarity.
- Decide the order m of the process; in our case we let m = 5, where p + q + P + Q <= m.
- After (d, m) is selected, list all possible configurations of (p, q, P, Q) with p + q + P + Q <= m.
- For each set of (p, q, P, Q), estimate the parameters of the model, that is, phi_1, ..., phi_p, theta_1, ..., theta_q, Phi_1, ..., Phi_P, Theta_1, ..., Theta_Q.
- Compute the AIC for each model, and choose the one with the smallest AIC.
- After (p, d, q, P, Q) is selected, determine the seasonal differencing filter by selecting the smaller AIC between the models with D = 0 and D = 1.

Our final model will then have the identified order (p, d, q, P, D, Q); a sketch of this search in code is given below.
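The following is a minimal illustration of the identification procedure above using statsmodels. The KPSS test and the search bound m = 5 are as described in the text; the file and column names are hypothetical, and the D-selection is folded into the same loop rather than done as a separate final step.

```python
import itertools
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import kpss
from statsmodels.tsa.statespace.sarimax import SARIMAX

def choose_d(x, max_d=2, alpha=0.05):
    """Difference until the KPSS test no longer rejects stationarity."""
    for d in range(max_d + 1):
        series = np.diff(x, n=d) if d else np.asarray(x)
        if kpss(series, regression="c")[1] > alpha:
            return d
    return max_d

def best_seasonal_arima(x, s=12, m=5):
    """Search (p, q, P, Q) with p+q+P+Q <= m and D in {0, 1}, keeping the smallest AIC."""
    d = choose_d(x)
    best = None
    for p, q, P, Q in itertools.product(range(m + 1), repeat=4):
        if p + q + P + Q > m:
            continue
        for D in (0, 1):
            try:
                fit = SARIMAX(x, order=(p, d, q), seasonal_order=(P, D, Q, s)).fit(disp=False)
            except Exception:
                continue
            if best is None or fit.aic < best[0]:
                best = (fit.aic, (p, d, q), (P, D, Q, s))
    return best

# temp = pd.read_csv("monthly_temperature.csv")["temperature"]   # hypothetical names
# aic, order, seasonal_order = best_seasonal_arima(temp)         # e.g. (2,1,1)x(1,1,1,12)
```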


4.2 Development of Forecasting Models

The historical temperature data for the Continental United States that we shall use are shown by Figures 4.1 and 4.2. A visual inspection does not show any obvious trends. Thus, we let the seasonal period s = 12. Following the step-by-step procedure described above, we found that the model that best characterizes the average monthly temperature of the Continental United States, for both Version 1 and Version 2, is an ARIMA(2,1,1)x(1,1,1)_12 process, analytically given by

$(1 - \phi_1 B - \phi_2 B^2)(1 - \Phi_1 B^{12})(1 - B)(1 - B^{12})x_t = (1 - \theta_1 B)(1 - \Theta_1 B^{12})\varepsilon_t$.   (4.2.1)

Expanding both sides of (4.2.1), simplifying, and estimating the coefficients, the one-step-ahead forecasting model for the Version 1 data is given by

$\hat{x}_t = 1.0941x_{t-1} - 0.057x_{t-2} - 0.0371x_{t-3} + 0.9954x_{t-12} - 1.0891x_{t-13} + 0.0567x_{t-14} + 0.0369x_{t-15} + 0.0046x_{t-24} - 0.0895x_{t-25} + 0.0004x_{t-26} + 0.00017x_{t-27} - 0.9861\varepsilon_{t-1} - 0.9742\varepsilon_{t-12} + 0.9607\varepsilon_{t-13}$,   (4.2.2)

and the one-step-ahead forecasting model for the Version 2 data is given by

$\hat{x}_t = 1.0952x_{t-1} - 0.0556x_{t-2} - 0.0396x_{t-3} + 0.9964x_{t-12} - 0.9009x_{t-13} + 0.0554x_{t-14} + 0.0395x_{t-15} + 0.0036x_{t-24} - 0.0916x_{t-25} + 0.0002x_{t-26} + 0.00014x_{t-27} - 0.9855\varepsilon_{t-1} - 0.9741\varepsilon_{t-12} + 0.9599\varepsilon_{t-13}$.   (4.2.3)

Note the closeness of the two forecasting models.

4.3 Evaluation of the Proposed Models

We begin by forecasting the last one hundred observations of the monthly average temperature in the Continental United States for both Version 1 and Version 2, using the models given by expressions (4.2.2) and (4.2.3). A graphical presentation of the results is presented below by Figures 4.3 and 4.4.


Figure 4.3 Actual VS. Predicted Values for Version 1 Dataset

Figure 4.4 Actual VS. Predicted Values for Version 2 Dataset

As can be observed, both models are similar and the one-step-ahead forecasting is quite good, except that the temperature of January 2006 took an unexpected turn. We identify this inconsistency as a possible outlier.


We proceed to calculate the residual estimates, $r_t = x_t - \hat{x}_t$, for both forecasting processes given by (4.2.2) and (4.2.3). The results are graphically presented below by Figures 4.5 and 4.6.

Figure 4.5 Residual Plot for Monthly Temperature (Version 1 Dataset)

Figure 4.6 Residual Plot for Monthly Temperature (Version 2 Dataset)


We observe that the residuals are quite small and oscillate around the zero axis, as expected. This indicates that both models are good models for predicting the Version 1 and Version 2 time series. Next, we evaluate the mean of the residuals $\bar{r}$, the variance $S_r^2$, the standard deviation $S_r$, the standard error SE, and the mean square error MSE. The results are presented below by Tables 4.1 and 4.2 for the Version 1 and Version 2 data, respectively.

Table 4.1 Basic Evaluation Statistics (Version 1 Dataset)
  Mean: -0.008512476   Variance (S_r^2): 4.331902   Std. Dev. (S_r): 2.081322   SE: 0.05673052   MSE: 4.328756

Table 4.2 Basic Evaluation Statistics (Version 2 Dataset)
  Mean: -0.01310953   Variance (S_r^2): 4.323726   Std. Dev. (S_r): 2.079357   SE: 0.05667696   MSE: 4.320685

We observe that all evaluation criteria support the quality of the proposed forecasting models. We can also conclude that the two models are very similar, which raises the question of whether the effort of collecting two data sets under two different procedures by two agencies is necessary.

We have demonstrated that our proposed models are capable of representing the past monthly average temperature of the Continental United States; it is also essential to show that these models are capable of forecasting future values of the temperature. Therefore, we hide the last 12 months of the temperature record, restructure the


models (4.2.2) and (4.2.3) and try to predict the following mo nths only using the previous information. For example, we used the first 1334 observations to forecast Then we use the observations to forecast and continue this process until we obtain the forecasting values of the last 12 observations, that is, Table 4.3, gives the actual, forecasting and residual data for the subject 12 months. } ,...,,{1334 21xxx 1335x } ,...,,{1335 21xxx 1336x },...,,{1346 1336 1335 xxx Table 4.3 Original VS. Forecast Values (Version 1 Dataset) Original Values Forecast Values Residuals March 2006 43.31 44.0291 -0.7191 April 2006 56.03 53.1361 2.89395 May 2006 63.06 62.5318 0.52821 June 2006 71.44 70.6153 0.82467 July 2006 77.1 75.5855 1.51453 August 2006 74.1 74.2054 -0.1054 September 2006 63.69 66.6904 -3.0004 October 2006 52.97 55.4991 -2.5291 November 2006 44.68 43.2673 1.41275 December 2006 36.64 34.6357 2.00433 January 2007 31.39 32.58 -1.19 February 2007 32.86 36.2024 -3.3424 Figure 4.7 below gives a graphical presentation of the information presented in Table 4.3 for Version 1 observed time series. 77


MonthTemperature 024681 01 2 304050607080 024681 01 2 304050607080 Original Data Predicted Value Figure 4.7. Actual VS. Predicted Values fo r the Last 12 Observations (Version 1) Similarly, for Version 2 of the data se t, we have calculated the estimates presented by Table 4.4. Table 4.4 Original VS. Forecast Values (Version 2 Dataset) Original Values Forecast Values Residuals March 2006 43.45 44.1812 -0.7312 April 2006 56.12 53.2506 2.86942 May 2006 63.12 62.6351 0.48486 June 2006 71.55 70.7152 0.83478 July 2006 77.22 75.6947 1.52532 August 2006 74.19 74.3167 -0.1267 September 2006 63.86 66.8069 -2.9469 October 2006 53.13 55.6137 -2.4837 November 2006 44.58 43.3947 1.18529 December 2006 36.79 34.7224 2.06761 January 2007 31.46 32.6854 -1.2254 February 2007 32.86 36.3025 -3.4425 78


A graphical presentation of the results given in Table 4.4 is shown below by Figure 4.8.

Figure 4.8 Actual VS. Predicted Values for the Last 12 Observations (Version 2)

We remark on the similarity of the results of both models and the good forecast values.

4.4 Conclusion

In the present study, we have developed two seasonal autoregressive integrated moving average models to forecast the monthly average temperature in degrees Fahrenheit in the Continental United States using historical monthly data from 1895-2007. The two statistical models are based on two different methods of recording and weighting the subject temperature, namely USCD (Version 1) and USHCN (Version 2). Although the two sets of data are somewhat similar, we believe from a statistical perspective that the one from USHCN is more appropriate to use. The two developed statistical models were evaluated using various statistical criteria, and it was shown that both forecasting processes produced good estimates of the subject matter.


Chapter 5
Global Warming: Carbon Dioxide Proposed Forecasting Model

5.0 Introduction

Global warming is one of the most compelling and difficult problems facing our society. It is well understood that carbon dioxide (CO2), along with temperature, is a primary cause of global warming. The present study is concerned with developing analytical statistical models to predict CO2. Jim Verhulst, Perspective Editor of the St. Petersburg Times, writes, "Carbon dioxide is invisible - no color, no odor, no taste. It puts out fires, puts the fizz in seltzer and is to plants what oxygen is to us. It is hard to think of it as a poison." (Verhulst, 2007). The United States emits approximately 5.91221 billion metric tons of carbon dioxide into the atmosphere, which makes us one of the world leaders. In addition to CO2 in the atmosphere, we have CO2 emissions related to gas, liquid, and solid fuels, along with gas flares and cement production. The aim of the present chapter is to develop two different statistical models, for the carbon emissions and for the atmospheric carbon dioxide in the United States, using historical data on the subject matter.

5.1 Carbon Dioxide Emission Modeling

The CO2 emissions data set that we used to develop the proposed model contains the monthly emissions data from 1981 to 2003 (see Appendix D). It was published by the Carbon Dioxide Information Analysis Center (CDIAC), which is supported by the United


States Department of Energy. The CDIAC is a well-known organization that responds to data and information requests from users worldwide investigating the greenhouse effect and global climate change. For detailed information, see (United States Environmental Protection Agency (EPA), 2004; Marland et al., 2003). A graphical presentation of the emissions data is given by Figure 5.1.

Figure 5.1 Time Series Plot on CO2 Emission 1981-2003

In forecasting the CO2 emissions, we start by addressing the issue of the seasonal subindex s. After examining the original dataset, shown by Figure 5.1, we note that the monthly CO2 emissions behave as a periodic function with a cycle of 12 months and contain a small upward trend. Thus, we let the seasonal subindex s = 12. Following the step-by-step procedure described in Section 4.2, we found that the model that best characterizes the monthly emissions of the United States is an ARIMA(1,1,2)x(1,1,1)_12 process, analytically given by

$(1 - \phi_1 B)(1 - \Phi_1 B^{12})(1 - B)(1 - B^{12})x_t = (1 - \theta_1 B - \theta_2 B^2)(1 - \Theta_1 B^{12})\varepsilon_t$.   (5.1.1)
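As an illustration, the coefficients of (5.1.1) can be estimated with statsmodels as sketched below; the file and column names are hypothetical, and the exact estimates will depend on the estimation settings.

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

def fit_emission_model(co2):
    """Fit the ARIMA(1,1,2)x(1,1,1)_12 model of equation (5.1.1)."""
    model = SARIMAX(co2, order=(1, 1, 2), seasonal_order=(1, 1, 1, 12))
    fit = model.fit(disp=False)
    return fit                     # fit.params holds phi_1, theta_1, theta_2, Phi_1, Theta_1, sigma^2

# co2 = pd.read_csv("cdiac_monthly_emissions.csv")["emission"]   # hypothetical names
# fit = fit_emission_model(co2)
# one_step = fit.get_prediction().predicted_mean                 # in-sample one-step-ahead forecasts
```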


After expanding both sides of model (5.1.1) and estimating its coefficients, the final statistical model for CO2 emissions, with the appropriate estimates of the weights, is given by

$\hat{E}_{CO_2,t} = 1.5203x_{t-1} - 0.5203x_{t-2} + 1.0049x_{t-12} - 1.527749x_{t-13} + 0.5228495x_{t-14} - 0.0049x_{t-24} + 0.007449x_{t-25} - 0.002549x_{t-26} - 0.9988\varepsilon_{t-1} - 0.1234\varepsilon_{t-2} - 0.8523\varepsilon_{t-12} + 0.8512772\varepsilon_{t-13} + 0.10517\varepsilon_{t-14}$.   (5.1.2)

Once we have obtained the coefficients, we proceed to evaluate the proposed model and illustrate its quality. The forecasting values obtained from the proposed statistical model (5.1.2) for CO2 emissions in the United States are graphically presented in Figure 5.2.

Figure 5.2 Actual VS. Predicted Values for CO2 Emission 1981-2003

As can be observed, the predicted values follow the actual values of CO2 emissions closely, which indicates that the overall quality of the model is good.


We proceed to calculate the residual estimates, $r_t = x_t - \hat{x}_t$, and the results are graphically presented below by Figure 5.3.

Figure 5.3 Residuals Plot for CO2 Emissions

The residuals are quite small and oscillate around the zero axis, as expected. This indicates that the proposed model forecasts the CO2 emissions in the United States closely. The mean of the residuals $\bar{r}$, the variance $S_r^2$, the standard deviation $S_r$, the standard error SE, and the mean square error MSE are presented below by Table 5.1.

Table 5.1 Basic Evaluation on CO2 Emissions Model
  Mean: 0.2339641   Variance (S_r^2): 8.055668   Std. Dev. (S_r): 2.838251   SE: 0.1708426   MSE: 8.08122


We observe that all evaluation criter ia support the quality of the proposed forecasting model for CO 2 emissions. We now proceed to further evaluate mode l (5.1.1) hiding the last 12 months of the CO 2 recordings and re-estimating the coefficients of the model (5.1.1). Having restructured the model (5.1.1) we proceed to estimate the hidden recordings. For example, we used the first 264 observations to forecast Then we use the observations to forecast and continue this process until we obtain the forecasting values of the la st 12 observations, that is, Table 5.2, gives the actual, forecasting and residual data for the subject 12 months. } ,...,,{264 21xxx 265x } ,...,,{265 21xxx 266x },...,,{276 266265 xxx Table 5.2 Original VS. Forecasting Values on CO 2 Emissions Model Original Values Forecast Values Residuals January 2003 147.6298 145.2361 2.3937 February 2003 134.1716 132.6554 1.5162 March 2003 133.6979 137.3912 -3.6933 April 2003 121.0047 124.5518 -3.5471 May 2003 120.4789 122.4091 -1.9302 June 2003 120.7394 123.101 -2.3616 July 2003 132.4187 129.3481 3.0706 August 2003 135.1314 132.787 2.3444 September 2003 121.7753 123.8295 -2.0542 October 2003 125.2487 125.9811 -0.7324 November 2003 126.2127 126.812 -0.5993 December 2003 143.1509 141.1834 1.9675 84


Note the closeness between the original and forecast values. A graphical presentation of the results given in Table 5.2 is shown below by Figure 5.4.

Figure 5.4 Monthly CO2 Emission VS. Forecast Values for the Last 12 Observations

It can be seen that the predicted values produced by our proposed model follow the actual values of CO2 emissions closely. This not only shows that our proposed model is capable of forecasting CO2 emissions without using any future information, but also speaks to the usefulness of the model.

5.2 Atmospheric Carbon Dioxide Modeling

The data set that we used to develop our second proposed model consists of monthly CO2 concentrations in the atmosphere from 1958 to 2004 (see Appendix E). The data was collected at Mauna Loa by the Carbon Dioxide Research Group, Scripps Institution of Oceanography, University of California. A map of the geographical location of Mauna Loa is provided by Figure 5.5.


Figure 5.5 Geographical Location of Mauna Loa

At an early stage of our model building process, we spotted several missing values in the early 1960s. To address this problem, we decided to use the data from 1965 to 2004, a period which contains no missing values. For additional information concerning the data set on CO2 concentrations in the atmosphere, see (Bacastow, 1979; Bacastow & Keeling, 1981; Bacastow et al., 1980; Bacastow et al., 1985; Keeling, 1960; Keeling, 1984; Keeling, 1998; Keeling et al., 1976; Keeling et al., 1982; Keeling et al., 1989; Keeling et al., 1996; Keeling et al., 1995; Pales & Keeling, 1965; Keeling et al., 2002; Whorf & Keeling, 1998). A plot of the actual CO2 concentration in the atmosphere is given by Figure 5.6, which provides a visual presentation of the time series of CO2 concentrations in the atmosphere.


Figure 5.6 Time Series Plot for Monthly CO2 in the Atmosphere 1965-2004

In forecasting the atmospheric CO2, we begin by addressing the issue of the seasonal subindex s. After examining the original data set, shown by Figure 5.6, we note that the atmospheric CO2 data has a more obvious upward trend compared with the CO2 emissions, and that the shape of its pattern is almost identical every year. Thus, we set the seasonal period s = 12. According to the procedure discussed in Section 4.2, we have identified that the model that best describes the monthly CO2 concentrations in the atmosphere is an ARIMA(2,1,0)x(2,1,1)_12 process, analytically given by

$(1 - \phi_1 B - \phi_2 B^2)(1 - \Phi_1 B^{12} - \Phi_2 B^{24})(1 - B)(1 - B^{12})x_t = (1 - \Theta_1 B^{12})\varepsilon_t$.   (5.2.1)

After expanding both sides of model (5.2.1) and estimating its coefficients, the final statistical model for atmospheric CO2, with the appropriate estimates of the weights, is given by


$\hat{A}_{CO_2,t} = 0.6887x_{t-1} + 0.1989x_{t-2} + 0.1124x_{t-3} + 1.0759x_{t-12} - 0.74097x_{t-13} - 0.213997x_{t-14} - 0.12093x_{t-15} - 0.0683x_{t-24} + 0.047038x_{t-25} + 0.013585x_{t-26} + 0.00768x_{t-27} - 0.0076x_{t-36} + 0.005234x_{t-37} + 0.0015116x_{t-38} + 0.00085x_{t-39} - 0.8787\varepsilon_{t-12}$.   (5.2.2)

We shall proceed to evaluate this model and illustrate its quality.

Figure 5.7 Actual VS. Predicted Values for Atmospheric CO2 1965-2004

This graphical presentation attests to the quality of the proposed model. A plot of the residuals is given by Figure 5.8 below.


Figure 5.8 Residuals Plot for Atmospheric CO2

The residuals of our proposed model are very small and oscillate around the zero axis, which illustrates the quality of the model. Table 5.3 gives the basic evaluation statistics of the proposed model.

Table 5.3 Basic Evaluation on Atmospheric CO2 Model
  Mean: 0.01140137   Variance (S_r^2): 0.08460756   Std. Dev. (S_r): 0.2908738   SE: 0.01327651   MSE: 0.08456128

These results also confirm the effectiveness of the proposed model for forecasting CO2 in the atmosphere. We shall use the same technique as in the previous application to illustrate the quality of our proposed model in terms of forecasting future values. Again, we hide the last 12 months of atmospheric CO2 recordings and predict them using only information from the past. Table 5.4 gives the numerical comparison between the original series and the forecasts.


Table 5.4 Original VS. Forecasting Values on Atmospheric CO 2 Model Original Values Forecast Values Residuals January 2004 376.79 376.7963 -0.0063 February 2004 377.37 377.609 -0.239 March 2004 378.41 378.1837 0.2263 April 2004 380.52 379.6653 0.8547 May 2004 380.63 380.8268 -0.1968 June 2004 379.57 380.2339 -0.6639 July 2004 377.79 378.3489 -0.5589 August 2004 375.86 375.837 0.023 September 2004 374.06 374.1871 -0.1271 October 2004 374.24 374.1482 0.0918 November 2004 375.86 375.6897 0.1703 December 2004 377.48 377.2186 0.2614 The residuals that were calculated are s hown by Table 5.4 are all very small, and a graphical presentation of the resu lts is given below by Figure 5.9. MonthAtmospheric CO2 Concentration 024681 01 2 374376378380382 024681 01 2 374376378380382 Original Data Predicted Value Figure 5.9 Actual VS. Predicted Values for CO 2 in the Atmosphere 90


Thus, we can conclude that the proposed model (5.2.1) forecasts the future behavior of CO2 in the atmosphere very well.

5.3 Conclusion

We have developed two nonstationary time series statistical models with trend and seasonal effects to predict future estimates of carbon dioxide emissions and of carbon dioxide in the atmosphere. We used actual CO2 recordings in both situations to develop the subject statistical models. The developed processes were evaluated using various statistical criteria to attest to their quality. Finally, we tested the accuracy of the proposed models by predicting and analyzing the CO2 emissions and atmospheric CO2 for 12 months. The results are very encouraging.


Chapter 6
Global Warming: Temperature & Carbon Dioxide Prediction Modeling

6.0 Introduction

The object of the present chapter is to propose forecasting models for the monthly carbon dioxide in the atmosphere and the monthly temperature of the Continental United States. The approach of the subject models is to use regression analysis on the monthly temperature of the Continental United States to explain the difference of the monthly carbon dioxide, and vice versa. Therefore, the final form of the subject models is a combination of a regression model, based on monthly temperature predicting the difference of the monthly carbon dioxide, and a time series term for the previous month's atmospheric carbon dioxide.

6.1 Relationship between Carbon Dioxide & Temperature

Many studies have been done on the subject of global warming. In fact, it is common to use time series analysis to form a forecasting model when historical information is available. Shih & Tsokos introduce the time series approach to forecasting both the temperature of the Continental United States and the carbon dioxide in the atmosphere. In the present study, we take the monthly temperature of the Continental United States from 1965 to 2004 along with the monthly atmospheric CO2 from 1965 to 2004 and explore the relationship between the two.


Figure 6.1 illustrates the time series plot of the monthly atmospheric CO2 from 1965 to 2004.

[Figure 6.1 Monthly Atmospheric Carbon Dioxide from 1965 to 2004]

Obviously, the monthly atmospheric CO2 behaves as a periodic function and has a stable upward trend present over the entire time period. Since there are a total of 480 observations, we denote them as $x_1, x_2, \ldots, x_{480}$. Figure 6.2 is the time series plot of the mean temperature of the Continental United States from 1965 to 2004.

[Figure 6.2 Monthly Temperature of the Continental United States from 1965 to 2004]


It is expected that the average monthly temperature of the Continental United States behaves as a periodic function with a clear seasonal variation that is easy to identify. Similar to the CO2 data, we denote the 480 temperature observations as $y_1, y_2, \ldots, y_{480}$. The correlation coefficient is defined as

$$ r = \frac{1}{n-1} \sum \left( \frac{x - \bar{x}}{s_x} \right) \left( \frac{y - \bar{y}}{s_y} \right) \qquad (6.1.1) $$

By using (6.1.1), we calculated the correlation coefficient between the temperature and the CO2 and found $r = 0.04704972$, which does not show much of a relationship between the two variables. In order for our proposed model to be accurate, it is important to have a high correlation coefficient to drive the regression part of the subject model. Hence, a filtering process becomes necessary during our model building procedure. Consider the following difference filter

$$ (1 - B)x_t = x_t - x_{t-1} \qquad (6.1.2) $$

It is obvious that the atmospheric CO2 series contains an upward trend, but the temperature series does not. Hence, removing the upward trend from the atmospheric CO2 series is the first step of our model building procedure. By applying (6.1.2), we can produce the differenced series of the atmospheric CO2, denoted as $z_1, z_2, \ldots, z_{479}$, where

$$ z_t = (1 - B)x_t \qquad (6.1.3) $$
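To make the filtering step concrete, here is a minimal Python sketch of (6.1.1) through (6.1.3). The arrays co2 and temp mentioned in the usage comments are hypothetical stand-ins for the 480 monthly observations; this is illustrative code, not the dissertation's own.

```python
import numpy as np

def pearson_r(x, y):
    """Correlation coefficient as in (6.1.1)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = x.size
    return np.sum((x - x.mean()) / x.std(ddof=1) *
                  (y - y.mean()) / y.std(ddof=1)) / (n - 1)

def difference(x):
    """First order difference filter (6.1.2)-(6.1.3): z_t = x_t - x_{t-1}."""
    x = np.asarray(x, float)
    return x[1:] - x[:-1]

# co2, temp: hypothetical arrays of the 480 monthly observations.
# print(pearson_r(co2, temp))   # near 0.047 on the raw series
# z = difference(co2)           # the 479-point differenced CO2 series
```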


Figure 6.3 gives the time series plot of the first order differenced atmospheric CO2 series.

[Figure 6.3 First Order Differencing Monthly Atmospheric CO2 Series]

It can be seen that the differencing filter has removed the upward trend from the atmospheric CO2 series and turned it into a periodic function similar to the temperature series. After examining the temperature series and the differenced series, we want to compare the two by graphing both on a single time series plot. Figure 6.4 illustrates this comparison.

[Figure 6.4 The Temperature Series VS. The Differencing CO2 Series]
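A comparison plot such as Figure 6.4 can be produced with a few lines of matplotlib; the sketch below is purely illustrative, and temp and z are the hypothetical temperature and differenced CO2 arrays introduced earlier.

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_comparison(temp, z):
    """Overlay the temperature series and the differenced CO2 series,
    in the spirit of Figure 6.4."""
    fig, ax = plt.subplots()
    ax.plot(np.arange(len(temp)), temp, label="Temperature")
    ax.plot(np.arange(len(z)), z, label="CO2 (first difference)")
    ax.set_xlabel("Time (months)")
    ax.legend()
    plt.show()
```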


The purpose of our next several transformations is to bring the temperature series to the same level as the CO2 series, so that we can force a correlation between the two series. We begin by computing the mean of both series using (6.1.4):

$$ \bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i \qquad (6.1.4) $$

We found that the mean of the temperature series is 53.03569 and the mean of the differenced CO2 series is 0.1211691. Hence, the first transformation is to subtract 52.91452 from the temperature series. That is,

$$ u_t = y_t - d, \qquad d = 52.91452 \qquad (6.1.5) $$

Therefore, after the transformation (6.1.5), both series have the same mean as periodic functions. We then examine the minimum and maximum of both series. Table 6.1 gives a comparison of the two series.

Table 6.1 Comparison between Two Series

Series        Minimum     Maximum     Difference
Temperature   -30.5357    23.51431    54.05
CO2           -2.53       2.11        4.64
Ratio                                 11.64871

After the examination of both series, it is important to determine the ratio between them, because it plays a major role in our next transformation:

$$ v_t = \frac{u_t}{11.64871} \qquad (6.1.6) $$
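The two leveling transformations (6.1.5) and (6.1.6) can be sketched in a few lines of Python. The function name level_temperature and the arrays temp and z are illustrative assumptions, not the dissertation's code; the constants are recomputed from the data rather than hard-coded.

```python
import numpy as np

def level_temperature(temp, z):
    """Transformations (6.1.5)-(6.1.6): shift and rescale the temperature
    series so that it sits on the same level as the differenced CO2 series.
    temp: 480 monthly temperatures, z: 479-point differenced CO2 series."""
    temp = np.asarray(temp, float)
    z = np.asarray(z, float)
    d = temp.mean() - z.mean()        # 53.03569 - 0.1211691 = 52.91452
    u = temp - d                      # (6.1.5): equalize the means
    ratio = np.ptp(u) / np.ptp(z)     # range ratio, about 11.64871
    v = u / ratio                     # (6.1.6): equalize the spread
    return v, d, ratio
```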


After the second transformation is made, we examine the time series plot of both series, shown as Figure 6.5.

[Figure 6.5 Comparison between Both Series After Transformations]

It is obvious that both series behave as periodic functions at the same level. By examining the cross correlation function of the two, we can find the lag that gives us the highest correlation. Consider two series $x_i$ and $y_i$, where $i = 0, 1, 2, \ldots$. The cross correlation $r$ at delay $d$ is defined as

$$ r = \frac{\sum_i (x_i - \bar{x})(y_{i-d} - \bar{y})}{\sqrt{\sum_i (x_i - \bar{x})^2}\,\sqrt{\sum_i (y_i - \bar{y})^2}} \qquad (6.1.7) $$
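A direct implementation of (6.1.7) and of the lag scan behind Figure 6.6 might look as follows; this is a hedged sketch, and the usage comment assumes the hypothetical arrays z (differenced CO2) and v (transformed temperature) of equal length.

```python
import numpy as np

def cross_correlation(x, y, d):
    """Cross correlation at delay d, as in (6.1.7): correlate x_i with y_{i-d}."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    if d >= 0:
        num = np.sum(xm[d:] * ym[:len(ym) - d])
    else:
        num = np.sum(xm[:d] * ym[-d:])
    return num / np.sqrt(np.sum(xm ** 2) * np.sum(ym ** 2))

# Scan lags -20..20 and pick the lag with the largest magnitude of r:
# best_lag = max(range(-20, 21), key=lambda d: abs(cross_correlation(z, v, d)))
```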


We then plot the cross correlation at different lags between the differenced atmospheric CO2 and the temperature in Figure 6.6.

[Figure 6.6 Cross Correlation at Different Lag]

After reviewing the cross correlation function shown in Figure 6.6, it is clear that there exists a negative correlation between the difference of the atmospheric CO2 and the temperature at lag 0, because the magnitude of $r$ is largest at lag 0. Hence, we can now calculate the correlation coefficient between these two series by using (6.1.1), and we obtain $r = -0.8109993$, which indicates a negative linear relationship between the two series.

6.2 Carbon Dioxide & Temperature Model-01

The aim of this section is to develop a model for the atmospheric CO2 knowing the monthly temperature. Since all of the transformations made are invertible, we can build the regression model first, and then use the backward filter to solve back for the original series. Let the first 479 observations of the temperature series after transformations be $v_1, v_2, \ldots, v_{479}$. The simple regression model formulated between the series $z_t$ and $v_t$ is


$$ \hat{z}_t = 0.1319 - 0.7640\, v_t \qquad (6.2.1) $$

The simple regression model given by equation (6.2.1) indicates that the difference of the atmospheric CO2 is explained by the transformed temperature series. In addition, we can solve back for the original atmospheric CO2 and temperature series by using equations (6.1.3), (6.1.5), and (6.1.6). Therefore, the final analytical form of the proposed model is

$$ \hat{x}_t = 0.1319 - 0.7620 \left( \frac{y_t - 52.91452}{11.64871} \right) + x_{t-1} $$

and we can simplify the model as

$$ \hat{x}_t = 3.5933 - 0.06541\, y_t + x_{t-1} \qquad (6.2.2) $$

Table 6.2 illustrates the residual analysis of (6.2.2).

Table 6.2 Residual Analysis for Combined Model-1

$\bar{r}$        $S_r^2$      $S_r$       $S_r/\sqrt{n}$
2.650722e-17     0.5242516    0.7240522   0.03280994

After developing our proposed model, it is essential to evaluate its performance. In this section, we shall illustrate the evaluation and usefulness of our proposed model by using the one-step ahead forecasting technique. That is, we hide the last 12 months of atmospheric CO2 data and try to forecast them using only the previous information. For example, we forecast $x_{469}$ using only $x_1, x_2, \ldots, x_{468}$ and $y_1, y_2, \ldots, y_{469}$; we forecast $x_{470}$ using only $x_1, x_2, \ldots, x_{469}$ and $y_1, y_2, \ldots, y_{470}$; and so on, until we forecast $x_{480}$ using only $x_1, x_2, \ldots, x_{479}$ and $y_1, y_2, \ldots, y_{480}$. In addition, we update the coefficients of the model once we obtain new information.
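The regression-plus-backward-filter construction and the rolling one-step-ahead evaluation can be sketched as follows in Python. The function names fit_model_01 and one_step_ahead, the exact pairing of $z_t$ with $v_t$, and the re-estimation scheme (which here uses only complete past months) are illustrative assumptions rather than the dissertation's own code, so the fitted coefficients will only approximate those in (6.2.1) and (6.2.2).

```python
import numpy as np

def fit_model_01(x, y):
    """Regress the differenced CO2 on the transformed temperature, then invert
    the filters so month-t CO2 is predicted from the month-t temperature and
    the month t-1 CO2.  x: monthly CO2, y: monthly temperature (equal length)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    z = x[1:] - x[:-1]                       # differenced CO2, as in (6.1.3)
    d = y.mean() - z.mean()                  # shift constant, roughly 52.91452
    ratio = np.ptp(y - d) / np.ptp(z)        # scale constant, roughly 11.64871
    v = (y[1:] - d) / ratio                  # transformed temperature, (6.1.5)-(6.1.6)
    slope, intercept = np.polyfit(v, z, 1)   # simple regression, as in (6.2.1)

    def forecast(y_t, x_prev):
        # the inverted form behind (6.2.2): x_t = a + b*(y_t - d)/ratio + x_{t-1}
        return intercept + slope * (y_t - d) / ratio + x_prev

    return forecast

def one_step_ahead(x, y, n_holdout=12):
    """Rolling one-step-ahead evaluation: re-estimate the coefficients on the
    data available before month t, then forecast the month-t CO2."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    forecasts = []
    for t in range(n - n_holdout, n):
        f = fit_model_01(x[:t], y[:t])       # coefficients updated every month
        forecasts.append(f(y[t], x[t - 1]))  # month-t temperature, month t-1 CO2
    forecasts = np.array(forecasts)
    return forecasts, x[-n_holdout:] - forecasts
```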


Table 6.3 gives the monthly comparison between the actual values and the forecasts produced by our proposed model.

Table 6.3 Comparison between Actual and Forecast for Model-1

Month (2004)   Temperature   CO2 (Actual)   CO2 (Forecast)   Residuals
January        30.34         376.79         376.9535         -0.1635
February       33.91         377.37         378.3881         -1.0181
March          47.94         378.41         378.7315         -0.3215
April          53.53         380.52         378.8619         1.6581
May            62.87         380.63         380.6131         0.0169
June           68.87         379.57         380.1183         -0.5483
July           73.59         377.79         378.6683         -0.8783
August         70.8          375.86         376.5785         -0.7185
September      66.26         374.06         374.8254         -0.7654
October        55.84         374.24         373.3174         0.9226
November       44.34         375.86         374.1783         1.6817
December       35.15         377.48         376.5517         0.9283

From Table 6.3, it can be seen that the forecasts fit closely to the actual CO2 values, which speaks to the quality of the model.

6.3 Carbon Dioxide & Temperature Model-02

The aim of this section is to develop a model for the monthly average temperature knowing the CO2 in the atmosphere. We shall use the transformations discussed in Section 6.1 to build the regression model first, and then use the backward filter to solve back for the original series.


Let the first 479 observations of the temperature series after transformations be $v_1, v_2, \ldots, v_{479}$. The simple regression model formulated between the series $v_t$ and $z_t$ is

$$ \hat{v}_t = 0.1180 - 0.8578\, z_t \qquad (6.3.1) $$

The simple regression model given by equation (6.3.1) indicates that the transformed temperature series is explained by the difference of the atmospheric CO2. In addition, we can solve back for the original temperature and atmospheric CO2 series by using equations (6.1.3), (6.1.5), and (6.1.6). Therefore, the final analytical form of the proposed model is

$$ \frac{\hat{y}_t - 52.91452}{11.64871} = 0.1180 - 0.8578\,(x_t - x_{t-1}) $$

and we can simplify the model as

$$ \hat{y}_t = 54.2891 - 9.99226\, x_t + 9.99226\, x_{t-1} \qquad (6.3.2) $$

Table 6.4 illustrates the residual analysis of (6.3.2).

Table 6.4 Residual Analysis for Combined Model-2

$\bar{r}$        $S_r^2$     $S_r$     $S_r/\sqrt{n}$
-1.646039e-06    80.08299    8.94891   0.4088861
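As a quick arithmetic check, using only the constants already given above, the coefficients in (6.3.2) follow from (6.3.1) once the transformations (6.1.5) and (6.1.6) are undone:

$$ 11.64871 \times 0.8578 \approx 9.99226, \qquad 52.91452 + 11.64871 \times 0.1180 \approx 54.2891 $$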


After developing our proposed model, it is essential to evaluate its performance. In this section, we shall illustrate the evaluation and usefulness of our proposed model by using the one-step ahead forecasting technique. That is, we hide the last 12 months of the temperature data and try to forecast them using only the existing information. For example, we forecast $y_{469}$ using only $x_1, x_2, \ldots, x_{469}$ and $y_1, y_2, \ldots, y_{468}$; we forecast $y_{470}$ using only $x_1, x_2, \ldots, x_{470}$ and $y_1, y_2, \ldots, y_{469}$; and so on, until we forecast $y_{480}$ using only $x_1, x_2, \ldots, x_{480}$ and $y_1, y_2, \ldots, y_{479}$. In addition, we update the coefficients of the model once we obtain new information. Table 6.5 gives the monthly comparison between the actual values and the forecasts produced by our proposed model.

Table 6.5 Comparison between Actual and Forecast for Model-2

Month (2004)   CO2       Temperature   Temperature (Forecast)   Residuals
January        376.79    30.34         40.70947                 -10.36947
February       377.37    33.91         43.27709                 -9.36709
March          378.41    47.94         48.37591                 -0.43591
April          380.52    53.53         43.7548                  9.7752
May            380.63    62.87         33.13691                 29.73309
June           379.57    68.87         53.18363                 15.68637
July           377.79    73.59         64.91555                 8.67445
August         375.86    70.8          72.14047                 -1.34047
September      374.06    66.26         73.62139                 -7.36139
October        374.24    55.84         72.27571                 -16.43571
November       375.86    44.34         52.4835                  -8.1435
December       377.48    35.15         38.11712                 -2.96712

From Table 6.5, it can be seen that the forecasts follow the actual temperature, which speaks to the quality of the model.

6.4 Temperature Model

It is obvious that the average monthly temperature of the Continental United States contains a seasonal pattern. We consider the temperature series $x_1, x_2, \ldots, x_{480}$ and take the average of each month as $m_1, m_2, \ldots, m_{12}$ by using the following transformation.


$$ m_1 = \frac{x_1 + x_{13} + \cdots + x_{469}}{40}, \quad
   m_2 = \frac{x_2 + x_{14} + \cdots + x_{470}}{40}, \quad \ldots, \quad
   m_{12} = \frac{x_{12} + x_{24} + \cdots + x_{480}}{40} \qquad (6.4.1) $$

We then create a new series $\{\mu_t\}$ simply by repeating the series $m_1, m_2, \ldots, m_{12}$, as shown in (6.4.2):

$$ \{\mu_1, \ldots, \mu_{12}\} = \{m_1, \ldots, m_{12}\}, \quad
   \{\mu_{13}, \ldots, \mu_{24}\} = \{m_1, \ldots, m_{12}\}, \quad \ldots, \quad
   \{\mu_{469}, \ldots, \mu_{480}\} = \{m_1, \ldots, m_{12}\} \qquad (6.4.2) $$

Let $\{\varepsilon_t\}$ be the difference between the temperature series $\{x_t\}$ and the new series $\{\mu_t\}$, as shown in (6.4.3):

$$ \varepsilon_t = x_t - \mu_t \qquad (6.4.3) $$

By using the methodology that we discussed in Section 2.1, we found that the best ARIMA model for the series $\{\varepsilon_t\}$ is an ARIMA(2,1,2), that is,

$$ (1 - 0.7755B + 0.1938B^2)(1 - B)\,\varepsilon_t = (1 + 0.01811B + 0.9819B^2)\,a_t \qquad (6.4.4) $$

Expanding the autoregressive operator and the first difference filter, we have

$$ (1 - 1.7755B + 0.9693B^2 - 0.1938B^3)\,\varepsilon_t = (1 + 0.01811B + 0.9819B^2)\,a_t $$

and the model can be written as

$$ \varepsilon_t = 1.7755\,\varepsilon_{t-1} - 0.9693\,\varepsilon_{t-2} + 0.1938\,\varepsilon_{t-3} + a_t + 0.0181\,a_{t-1} + 0.9819\,a_{t-2} $$


The final analytical form of the proposed forecasting model can be written as

$$ \hat{\varepsilon}_t = 1.7755\,\varepsilon_{t-1} - 0.9693\,\varepsilon_{t-2} + 0.1938\,\varepsilon_{t-3} + 0.0181\,a_{t-1} + 0.9819\,a_{t-2} \qquad (6.4.5) $$

We can solve back for the original temperature series by combining (6.4.2) and (6.4.3), and we have

$$ \hat{x}_t = 1.7755\,\varepsilon_{t-1} - 0.9693\,\varepsilon_{t-2} + 0.1938\,\varepsilon_{t-3} + 0.0181\,a_{t-1} + 0.9819\,a_{t-2} + \mu_t \qquad (6.4.6) $$

After we obtain the predicted values for the temperature series, we proceed to evaluate the residuals of the model by calculating $\bar{r}$, $S_r^2$, $S_r$ and $S_r/\sqrt{n}$, shown in Table 6.6.

Table 6.6 Basic Evaluation Statistics for the Temperature Model

$\bar{r}$     $S_r^2$    $S_r$      $S_r/\sqrt{n}$
0.1290907     3.75889    1.938786   0.08849305

It can be seen that the evaluation in Table 6.6 supports the quality of the proposed temperature model.
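The full pipeline of Section 6.4 (removing the seasonal mean function, fitting an ARIMA(2,1,2) to the deseasonalized series, and adding the mean function back to the forecasts) can be sketched as follows. This is an illustrative sketch using NumPy and statsmodels rather than the dissertation's procedure, so the estimated coefficients will generally differ somewhat from those reported in (6.4.4); it also assumes the series starts in January and has a length that is a multiple of 12.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def deseasonalize(temp):
    """(6.4.1)-(6.4.3): subtract the repeated monthly-mean function from the
    480-month temperature series (hypothetical array `temp`)."""
    temp = np.asarray(temp, float)
    monthly = temp.reshape(-1, 12)        # 40 years x 12 months
    m = monthly.mean(axis=0)              # m_1, ..., m_12
    mu = np.tile(m, monthly.shape[0])     # repeated mean series mu_t
    return temp - mu, mu                  # eps_t = x_t - mu_t, and mu_t

def forecast_temperature(temp, steps=12):
    """Fit an ARIMA(2,1,2) to the deseasonalized series, in the spirit of
    (6.4.4), then add the seasonal mean back, as in (6.4.6)."""
    eps, mu = deseasonalize(temp)
    fit = ARIMA(eps, order=(2, 1, 2)).fit()
    eps_hat = fit.forecast(steps)
    mu_future = mu[:steps]                # valid for steps <= 12, January start
    return eps_hat + mu_future
```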


6.5 Conclusion

In the present chapter, we look into the trend and seasonal patterns of the CO2 and temperature time series and examine the relationship between the two series. We discover that there exists a strong linear relationship between the two series after we apply several transformations to both series. Thus, the first two proposed statistical models that relate CO2 to temperature are semi-regression based models; that is, knowing the atmospheric temperature we can, at the specific location, estimate the carbon dioxide and vice versa. Our last proposed model begins with removing the seasonal variation of the temperature series by subtracting the mean function from the series. We found that the new series becomes much easier to predict than the original series. Since all transformations applied to the series are invertible, we can predict the original series by applying a back shift operator. The comparisons between the actual values and the forecasts of each model were provided, and they attest to the quality of the proposed models.


Chapter 7

Future Research

As a result of the present study, we will continue the research on the subject area by studying the following problems.

Investigate the selection of the best ARIMA model utilizing AIC versus BIC with respect to small, medium and large sample sizes.

In the proposed forecasting models with k-th moving average, k-th weighted moving average, and k-th exponential weighted moving average processes, we want to be able to determine the optimal k that will produce the smallest residuals. We would also like to study the robustness and sensitivity of the selected models when k changes.

Once a particular model has been identified with an actual sample size, we want to study the consistency of the orders of the model when the sample size changes.

In addition, we will study the Fourier transform of the developed models so that we can investigate the behavior of the variance as a function of time for a given set of data. This information will assist us in improving the selected model.

Finally, we will restructure the proposed models so that they would be instantly updated as new information becomes available. We will obtain confidence limits for short term and long term forecasting and compare the confidence range with other acceptable and useful models.


References

Akaike, H. (1974). A New Look at the Statistical Model Identification, IEEE Transactions on Automatic Control, AC-19, 716-723.

Alexandersson, H. & A. Moberg, (1997). Homogenization of Swedish temperature data. Part I: Homogeneity test for linear trends. Int. J. Climatol., 17, 25-34.

Bacastow, R.B. (1979). Dip in the atmospheric CO2 level during the mid-1960s. Journal of Geophysical Research 80:3109-14.

Bacastow, R.B., & C.D. Keeling. (1981). Atmospheric carbon dioxide concentration and the observed airborne fraction. In B. Bolin (ed.), Carbon Cycle Modelling, SCOPE 16. John Wiley and Sons, New York.

Bacastow, R.B., J.A. Adams, Jr., C.D. Keeling, D.J. Moss, T.P. Whorf, & C.S. Wong. (1980). Atmospheric carbon dioxide, the Southern Oscillation, and the weak 1975 El Niño. Science 210:66-68.

Bacastow, R.B., C.D. Keeling, & T.P. Whorf. (1985). Seasonal amplitude increase in atmospheric CO2 concentration at Mauna Loa, Hawaii, 1959-1982. Journal of Geophysical Research 90(D6):10529-40.

Baker, D. G., (1975). Effect of observation time on mean temperature estimation. J. Appl. Meteor., 14, 471-476.

Banerjee, A., J. J. Dolado, J. W. Galbraith, & D. F. Hendry (1993). Cointegration, Error Correction, and the Econometric Analysis of Non-Stationary Data, Oxford University Press, Oxford.

Bollerslev, T. (1986). Generalized autoregressive conditional heteroskedasticity, Journal of Econometrics, 31, 307-327.

Box, G. E. P. and G. M. Jenkins, (1970). Time Series Analysis Forecasting and Control. Holden-Day, San Francisco.

Box, G. E. P., G. M. Jenkins, & G. C. Reinsel, (1994). Time Series Analysis: Forecasting and Control, 3rd ed., Prentice Hall, Englewood Cliffs, NJ., 89-99.

Box, G. E. P., G. M. Jenkins, & G. C. Reinsel, (1994). Time Series Analysis: Forecasting and Control, 3rd ed., Prentice Hall, Englewood Cliffs, NJ., 224-247.


Brockwell, P. J., & R. A. Davis, (1996). Introduction to Time Series and Forecasting, Springer, New York, Sections 3.3 and 8.3.

Brown, R. G. (1962). Smoothing, Forecasting and Prediction of Discrete Time Series, Prentice-Hall, New Jersey.

Chen, C. T. (1999). Linear System Theory and Design, 3rd ed., Oxford University Press.

Crane, D. B. and J. R. Eratty, (1967). A two stage forecasting model: exponential smoothing and multiple regression, Management Science, Vol. 13, No. 8.

Dickey, D. A., & W. A. Fuller, (1979). Distribution and the Estimators for Autoregressive Time Series With a Unit Root, Journal of the American Statistical Association, Vol. 74, No. 366, 427-431.

Dickey, D. A., D. P. Hasza, & W. A. Fuller, (1984). Testing for Unit Roots in Seasonal Time Series, Journal of the American Statistical Association, Vol. 79, No. 386, 355-367.

Durbin, J., & S. J. Koopman, (2001). Time Series Analysis by State Space Methods, Oxford University Press.

Easterling, D.R., & T.C. Peterson, (1995). A new method of detecting undocumented discontinuities in climatological time series, Int. J. of Climatol., 15, 369-377.

Easterling, D. R., T. R. Karl, E.H. Mason, P. Y. Hughes, & D. P. Bowman. (1996). United States Historical Climatology Network (U.S. HCN) Monthly Temperature and Precipitation Data. ORNL/CDIAC-87, NDP-019/R3. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee.

Easterling, D. R., T. R. Karl, J. H. Lawrimore, & S. A. Del Greco. (1999). United States Historical Climatology Network Daily Temperature, Precipitation, and Snow Data for 1871-1997. ORNL/CDIAC-118, NDP-070. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee.

Enders, W. (1995). Applied Econometrics Time Series, John-Wiley & Sons, 139-149.

Engle, R. F. (1982). Autoregressive conditional heteroskedasticity with estimates of variance of United Kingdom Inflation, Econometrica, 50, 987-1007.


Engle, R. F. (2001). "GARCH 101: The Use of ARCH/GARCH Models in Applied Econometrics", Journal of Economic Perspectives 15(4):157-168.

EPA (U.S. Environmental Protection Agency) (2004). Inventory of U.S. Greenhouse Gas Emissions and Sinks: 1990-2002, EPA 430-R-04-003, U.S. Environmental Protection Agency, Washington, D.C., 308 pp. plus annexes (291 pp.). Available electronically from: http://yosemite.epa.gov/oar/globalwarming.nsf/content/ResourceCenterPublicationsGHGEmissionsUSEmissionsInventory2003.html

Fox, A. J. (1972). Outliers in time series, Journal of the Royal Stat. Society, Series B, Vol. 34, No. 3.

Gardner, G., A. C. Harvey, & G. D. A. Phillips, (1980). Algorithm AS154. An algorithm for exact maximum likelihood estimation of autoregressive-moving average models by means of Kalman filtering, Applied Statistics, 29, 311-322.

Gujarati, D. N. (2003). Basic Econometrics, 856-862.

Harrison, P. J. (1965). Short-term sales forecasting. Applied Statistics, 14, 102-139.

Harvey, A. C., (1993). Time Series Models, 2nd Edition, Harvester Wheatsheaf, sections 3.3 and 4.4.

Hinrichsen, D. and Pritchard, A. J. (2005). Mathematical Systems Theory I, Modelling, State Space Analysis, Stability and Robustness. Springer.

Hughes, P. Y., E. H. Mason, T. R. Karl & W. A. Brower. (1992). United States Historical Climatology Network Daily Temperature and Precipitation Data. ORNL/CDIAC-50, NDP-042. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee.

Jones, R. H., (1980). Maximum likelihood fitting of ARMA models to time series with missing observations, Technometrics, 20, 389-395.

Karl, T. R., C. N. Williams, Jr., & F. T. Quinlan. (1990). United States Historical Climatology Network (HCN) Serial Temperature and Precipitation Data. ORNL/CDIAC-30, NDP-019/R1. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee.

Karl, T.R., C.N. Williams, Jr., P.J. Young, & W.M. Wendland, (1986). A model to estimate the time of observation bias associated with monthly mean maximum, minimum, and mean temperature for the United States, J. Climate Appl. Meteor., 25, 145-160.

Karl, T.R., & C.W. Williams, Jr., (1987). An approach to adjusting climatological time series for discontinuous inhomogeneities, J. Climate Appl. Meteor., 26, 1744-1763.


Karl, T.R., H.F. Diaz, & G. Kukla, (1988). Urbanization: its detection and effect in the United States climate record, J. Climate, 1, 1099-1123.

Karl, T.R., C.N. Williams, Jr., F.T. Quinlan, & T.A. Boden, (1990). United States Historical Climatology Network (HCN) Serial Temperature and Precipitation Data, Environmental Science Division, Publication No. 3404, Carbon Dioxide Information and Analysis Center, Oak Ridge National Laboratory, Oak Ridge, TN, 389 pp.

Keeling, C.D. (1960). The concentration and isotopic abundance of carbon dioxide in the atmosphere. Tellus 12:200-203.

Keeling, C.D. (1984). Atmospheric and oceanographic measurements needed for establishment of a data base for carbon dioxide from fossil fuels. In The Potential Effects of Carbon Dioxide-Induced Climatic Changes in Alaska. (Miscellaneous, etc.). The Proceedings of a Conference. Fairbanks, Alaska, April 7-8, 1982. School of Agriculture and Land Resources Management, University of Alaska, Fairbanks.

Keeling, C.D. (1998). Rewards and penalties of monitoring the earth. Annual Review of Energy and the Environment 23:25-82. Annual Reviews Inc., Palo Alto.

Keeling, C.D., R.B. Bacastow, A.E. Bainbridge, C.A. Ekdahl, Jr., P.R. Guenther, L.S. Waterman, & J.F.S. Chin. (1976). Atmospheric carbon dioxide variations at Mauna Loa Observatory, Hawaii. Tellus 28(6):538-51.

Keeling, C.D., R.B. Bacastow, & T.P. Whorf. (1982). Measurements of the concentration of carbon dioxide at Mauna Loa Observatory, Hawaii. In W.C. Clark (ed.), Carbon Dioxide Review: 1982. Oxford University Press, New York.

Keeling, C.D., R.B. Bacastow, A.F. Carter, S.C. Piper, T.P. Whorf, M. Heimann, W.G. Mook, & H. Roeloffzen. (1989). A three-dimensional model of atmospheric CO2 transport based on observed winds: 1. Analysis of observational data. In D.H. Peterson (ed.), Aspects of Climate Variability in the Pacific and the Western Americas. Geophysical Monograph 55:165-235.

Keeling, C.D., J.F.S. Chin, & T.P. Whorf. (1996). Increased activity of northern vegetation inferred from atmospheric CO2 measurements. Nature 382:(6587) 146-49. MacMillan Magazines Ltd., London.

Keeling, C.D., T.P. Whorf, M. Wahlen, & J. van der Plicht. (1995). Interannual extremes in the rate of rise of atmospheric carbon dioxide since 1980. Nature 375:666-670.

Keeling, C.D., P.R. Guenther, G. Emanuele III, A. Bollenbacher, & D.J. Moss. (2002). Scripps Reference Gas Calibration System for Carbon Dioxide-in-Nitrogen and Carbon Dioxide-in-Air Standards: Revision of 1999 (with Addendum). SIO Reference Series No. 01-11.


Khalil, H. K. Nonlinear Systems, 3rd ed., Prentice Hall.

Kwiatkowski, D., P. C. B. Phillips, P. Schmidt, & Y. Shin. (1992). Testing the Null Hypothesis of Stationarity against the Alternative of a Unit Root, Journal of Econometrics, 54, 159-178.

Lund, R., & J. Reeves, (2002). Detection of undocumented changepoints: a revision of the two-phase regression model. J. Climate, 15, 2547-2554.

Marland, G., T.A. Boden, & R. J. Andres (2003). Global, Regional, and National CO2 Emissions. In Trends: A Compendium of Data on Global Change. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, TN, USA.

Menne, M.J., & C.N. Williams, Jr., (2005). Detection of undocumented changepoints using multiple test statistics and composite reference series. J. Climate, 18, 4271-4286.

Muth, J. (1960). Forecasting sales by exponentially weighted moving averages. Journal American Statistical Assoc., 55, 299.

Nelson, D. B. (1991). Conditional heteroskedasticity in asset returns: A new approach, Econometrica 59: 347-370.

Nise, N. S. (2004). Control Systems Engineering, 4th ed., John Wiley & Sons, Inc.

Pales, J.C., & C.D. Keeling. (1965). The concentration of atmospheric carbon dioxide in Hawaii. Journal of Geophysical Research 24:6053-76.

Peterson, T.C., & D.R. Easterling, (1994). Creation of homogeneous composite climatological reference series. Int. J. Climatol., 14, 671-680.

Quayle, R.G., D.R. Easterling, T.R. Karl, & P.Y. Hughes, (1991). Effects of recent thermometer changes in the cooperative station network, Bull. Am. Meteorol. Soc., 72, 1718-1724.

Quinlan, F. T., T. R. Karl, & C. N. Williams, Jr. (1987). United States Historical Climatology Network (HCN) Serial Temperature and Precipitation Data. NDP-019. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee.

Rogers, A. J., (1986). Modified Lagrange Multiplier Tests for Problems with One-Sided Alternatives, Journal of Econometrics, North-Holland, 31, 341-361.

Said, S. E., & D. A. Dickey, (1984). Testing for Unit Roots in Autoregressive-Moving Average Models of Unknown Order, Biometrika, 71, 599-607.


Sakamoto, Y., M. Ishiguro, & G. Kitagawa, (1986). Akaike Information Criterion Statistics, D. Reidel Publishing Company.

Shih, S. H., & Tsokos, C. P., (2008). A Weighted Moving Average Process for Forecasting, To Appear, Journal of Modern Applied Statistical Methods.

Shih, S. H., & Koutras, D., (2007). Analytical Model for Economic Forecasting, To Appear, Proceedings of The 5th International Conference on Dynamic Systems and Applications.

Shih, S. H., & Tsokos, C. P., (2007). New Nonstationary Time Series Models with Economic Applications, To Appear, Proceedings of The 5th International Conference on Dynamic Systems and Applications.

Shih, S. H., & Tsokos, C. P., (2008). A Temperature Forecasting Model for the Continental United States, To Appear, The International Journal Neural, Parallel & Scientific Computations, Volume 16.

Shih, S. H., & Tsokos, C. P., (2008). Prediction Models for Carbon Dioxide Emissions and the Atmosphere, To Appear, The International Journal Neural, Parallel & Scientific Computations, Volume 16.

Shumway, R. H., & D. S. Stoffer, (2006). Time Series Analysis and Its Applications: with R Examples, 2nd ed., Springer, New York.

Slutzky, E. (1927). The summation of random causes as the source of cyclical processes, Econometria, 5, 105-146 (1937). Translated from the earlier paper of the same title in Problems of Economic Conditions (Ed. Conjuncture Institute Moscow).

Sontag, E. D. (1999). Mathematical Control Theory: Deterministic Finite Dimensional Systems. Second Edition. Springer.

Trigg, D. W. and A. G. Leach, (1967). Exponential smoothing with an adaptive response rate. Operations Research Quarterly, Vol. 18, No. 1.

Tsokos, C. P., (1973). Forecasting Models from Non-Stationary Time Series: Short Term Predictability of Stocks, Mathematical Methods in Investment and Finance, North Holland Publishing Co., 520-63.

Verhulst, J. (April 22, 2007). Feeling The Heat. St. Petersburg Times. St. Petersburg, Florida.

Vose, R.S., C.N. Williams, T.C. Peterson, T.R. Karl, & D.R. Easterling, (2003). An evaluation of the time of observation bias adjustment in the US Historical Climatology Network. Geophysical Research Letters, 30(20), 2046, doi:10.1029/2003GL018111.


Wang, X.L., (2003). Comments on "Detection of undocumented changepoints: A revision of the two-phase model". J. Climate, 16, 3383-3385.

Whorf, T.P., & C.D. Keeling. (1998). Rising carbon. New Scientist 157:(2124) 54-54. New Scientist Publ Expediting Inc., Elmont.

Wei, W. W. S., (2006). Time Series Analysis: Univariate and Multivariate Methods, 2nd ed., Pearson Education, Inc.

Wold, H. O. A. (1938). A study in the analysis of stationary time series, Almqvist and Wiksell, Uppsala.


Appendices


Appendix A1 Daily Closing Price of Stock XYZ [1] 24.96 24.53 24.14 24.63 24.07 24.19 23.61 23.58 23.70 23.54 23.07 23.44 23.27 23.56 22.95 22.80 23.39 24.67 24.86 25.93 26.04 26.32 26.25 26.57 25.29 [26] 25.49 25.25 25.04 24.67 24.21 24.35 25.72 25.73 25.09 25.17 25.58 24.57 24.59 24.20 24.76 24.60 24.92 25.50 25.21 25.76 25.72 25.49 25.22 25.74 25.80 [51] 25.57 25.72 26.10 25.86 26.37 25.99 25.74 26.36 26.20 25.99 25.62 25.79 25.57 24.76 24.63 24.03 24.16 24.22 24.79 24.41 24.40 24.61 24.64 24.88 24.87 [76] 24.09 23.67 23.75 24.68 24.88 24.99 24.65 25.39 25.96 26.07 26.07 26.33 25.78 25.70 25.26 25.75 25.63 25.64 26.15 26.88 26.50 27.42 27.70 27.47 26.91 [101] 27.31 27.40 27.27 27.52 26.69 26.89 26.04 26.38 26.45 26.00 26.89 26.61 26.47 26.23 26.41 26.17 26.18 25.66 25.65 25.71 25.58 25.61 25.73 25.60 25.63 [126] 25.54 25.70 26.62 26.45 26.24 26.22 26.50 26.57 26.42 26.51 26.52 27.26 28.30 28.43 28.38 28.84 28.37 27.55 27.84 28.34 28.36 28.90 28.50 29.50 29.96 [151] 29.07 29.60 28.46 28.24 28.19 28.83 27.80 28.52 28.50 29.08 29.19 29.14 28.82 28.94 28.91 28.78 28.68 29.07 29.23 28.93 29.35 29.35 28.89 28.91 26.61 [176] 26.91 27.20 26.74 26.12 26.14 26.68 26.07 26.10 26.23 26.10 26.00 25.80 25.98 25.69 25.50 25.15 25.15 25.35 25.10 25.11 25.73 25.40 25.45 25.71 25.84 [201] 25.66 25.67 26.20 25.98 26.24 26.38 26.59 26.61 26.65 26.74 27.06 27.04 27.40 27.36 27.18 27.15 27.04 27.21 27.46 27.52 27.37 27.45 28.14 28.24 28.21 [226] 28.16 27.66 27.57 27.43 27.70 27.54 27.81 28.10 28.30 28.01 28.48 28.80 28.25 27.71 27.91 27.65 27.40 27.29 27.01 26.96 27.08 26.90 27.02 27.15 26.95 [251] 26.59 26.99 26.77 26.46 26.57 26.61 26.88 26.70 26.50 26.53 26.70 26.39 26.37 26.37 26.35 25.83 25.72 25.37 25.09 25.38 25.16 25.18 25.13 24.89 24.63 [276] 24.50 24.15 24.41 25.19 25.03 25.31 25.20 24.93 25.08 25.85 25.95 25.80 25.59 25.48 25.61 25.45 25.51 25.22 25.16 25.53 25.33 25.45 25.95 27.54 27.24 [301] 27.22 26.56 26.48 26.13 26.35 26.33 26.30 26.12 25.78 25.93 25.94 25.94 26.10 25.86 25.54 25.83 25.62 25.73 25.89 25.76 26.10 26.14 26.19 26.23 26.11 [326] 26.13 25.89 25.95 26.43 26.60 26.47 26.77 26.90 27.41 27.32 27.77 28.35 28.35 28.29 28.30 28.39 28.57 28.28 28.50 28.56 28.63 28.57 28.02 28.10 27.64 [351] 27.86 27.89 27.60 28.13 27.87 27.48 27.95 28.32 28.86 29.00 28.03 28.66 28.44 28.58 28.48 28.49 28.52 28.07 28.06 27.53 27.14 27.18 27.72 27.41 26.88 115


Appendix A1: (Continued) [376] 27.02 27.09 27.05 27.46 27.12 27.20 27.24 27.24 27.55 27.44 27.46 27.30 27.30 27.39 27.62 27.11 27.36 27.26 27.28 27.49 27.25 27.44 27.19 27.26 27.51 [401] 27.51 27.26 27.12 27.35 27.29 27.19 27.27 27.58 27.65 28.25 28.12 28.38 28.53 28.17 27.99 28.06 28.03 28.03 27.80 27.99 28.41 28.18 28.70 28.56 27.74 [426] 27.63 27.90 28.15 28.01 27.97 28.08 28.24 28.47 29.00 29.31 29.28 29.77 29.73 29.98 29.97 27.39 27.12 27.17 27.07 26.86 26.65 26.53 26.64 26.60 26.77 [451] 26.81 27.25 27.09 27.23 27.33 27.07 27.36 27.23 27.08 27.25 27.23 27.11 27.16 26.96 26.95 27.07 26.97 27.01 26.85 26.95 26.90 26.76 26.72 26.74 26.84 [476] 26.78 26.75 26.67 26.80 26.73 26.78 26.27 26.12 26.32 25.98 25.86 25.65 25.67 26.02 26.01 26.11 26.18 26.28 26.39 26.46 26.18 26.32 26.16 26.24 26.07 116


Appendix A2: MA3 Series on Dail y Closing Price of Stock XYZ [1] na na 24.54 24.43 24.28 24.30 23.96 23.79 23.63 23.61 23.44 23.35 23.26 23.42 23.26 23.10 23.05 23.62 24.31 25.15 25.61 26.10 26.20 26.38 26.04 [26] 25.78 25.34 25.26 24.99 24.64 24.41 24.76 25.27 25.51 25.33 25.28 25.11 24.91 24.45 24.52 24.52 24.76 25.01 25.21 25.49 25.56 25.66 25.48 25.48 25.59 [51] 25.70 25.70 25.80 25.89 26.11 26.07 26.03 26.03 26.10 26.18 25.94 25.80 25.66 25.37 24.99 24.47 24.27 24.14 24.39 24.47 24.53 24.47 24.55 24.71 24.80 [76] 24.61 24.21 23.84 24.03 24.44 24.85 24.84 25.01 25.33 25.81 26.03 26.16 26.06 25.94 25.58 25.57 25.55 25.67 25.81 26.22 26.51 26.93 27.21 27.53 27.36 [101] 27.23 27.21 27.33 27.40 27.16 27.03 26.54 26.44 26.29 26.28 26.45 26.50 26.66 26.44 26.37 26.27 26.25 26.00 25.83 25.67 25.65 25.63 25.64 25.65 25.65 [126] 25.59 25.62 25.95 26.26 26.44 26.30 26.32 26.43 26.50 26.50 26.48 26.76 27.36 28.00 28.37 28.55 28.53 28.25 27.92 27.91 28.18 28.53 28.59 28.97 29.32 [151] 29.51 29.54 29.04 28.77 28.30 28.42 28.27 28.38 28.27 28.70 28.92 29.14 29.05 28.97 28.89 28.88 28.79 28.84 28.99 29.08 29.17 29.21 29.20 29.05 28.14 [176] 27.48 26.91 26.95 26.69 26.33 26.31 26.30 26.28 26.13 26.14 26.11 25.97 25.93 25.82 25.72 25.45 25.27 25.22 25.20 25.19 25.31 25.41 25.53 25.52 25.67 [201] 25.74 25.72 25.84 25.95 26.14 26.20 26.40 26.53 26.62 26.67 26.82 26.95 27.17 27.27 27.31 27.23 27.12 27.13 27.24 27.40 27.45 27.45 27.65 27.94 28.20 [226] 28.20 28.01 27.80 27.55 27.57 27.56 27.68 27.82 28.07 28.14 28.26 28.43 28.51 28.25 27.96 27.76 27.65 27.45 27.23 27.09 27.02 26.98 27.00 27.02 27.04 [251] 26.90 26.84 26.78 26.74 26.60 26.55 26.69 26.73 26.69 26.58 26.58 26.54 26.49 26.38 26.36 26.18 25.97 25.64 25.39 25.28 25.21 25.24 25.16 25.07 24.88 [276] 24.67 24.43 24.35 24.58 24.88 25.18 25.18 25.15 25.07 25.29 25.63 25.87 25.78 25.62 25.56 25.51 25.52 25.39 25.30 25.30 25.34 25.44 25.58 26.31 26.91 [301] 27.33 27.01 26.75 26.39 26.32 26.27 26.33 26.25 26.07 25.94 25.88 25.94 25.99 25.97 25.83 25.74 25.66 25.73 25.75 25.79 25.92 26.00 26.14 26.19 26.18 [326] 26.16 26.04 25.99 26.09 26.33 26.50 26.61 26.71 27.03 27.21 27.50 27.81 28.16 28.33 28.31 28.33 28.42 28.41 28.45 28.45 28.56 28.59 28.41 28.23 27.92 [351] 27.87 27.80 27.78 27.87 27.87 27.83 27.77 27.92 28.38 28.73 28.63 28.56 28.38 28.56 28.50 28.52 28.50 28.36 28.22 27.89 27.58 27.28 27.35 27.44 27.34 [376] 27.10 27.00 27.05 27.20 27.21 27.26 27.19 27.23 27.34 27.41 27.48 27.40 27.35 27.33 27.44 27.37 27.36 27.24 27.30 27.34 27.34 27.39 27.29 27.30 27.32 117


Appendix A2: (Continued) [401] 27.43 27.43 27.30 27.24 27.25 27.28 27.25 27.35 27.50 27.83 28.01 28.25 28.34 28.36 28.23 28.07 28.03 28.04 27.95 27.94 28.07 28.19 28.43 28.48 28.33 [426] 27.98 27.76 27.89 28.02 28.04 28.02 28.10 28.26 28.57 28.93 29.20 29.45 29.59 29.83 29.89 29.11 28.16 27.23 27.12 27.03 26.86 26.68 26.61 26.59 26.67 [451] 26.73 26.94 27.05 27.19 27.22 27.21 27.25 27.22 27.22 27.19 27.19 27.20 27.17 27.08 27.02 26.99 27.00 27.02 26.94 26.94 26.90 26.87 26.79 26.74 26.77 [476] 26.79 26.79 26.73 26.74 26.73 26.77 26.59 26.39 26.24 26.14 26.05 25.83 25.73 25.78 25.90 26.05 26.10 26.19 26.28 26.38 26.34 26.32 26.22 26.24 26.16 118


Appendix A3: WMA3 Series on Da ily Closing Price of Stock XYZ [1] na na 24.41 24.45 24.27 24.22 23.88 23.69 23.64 23.60 23.33 23.33 23.29 23.44 23.21 22.98 23.12 23.93 24.55 25.36 25.81 26.16 26.24 26.42 25.88 [26] 25.60 25.34 25.18 24.89 24.50 24.36 25.01 25.50 25.41 25.24 25.36 25.01 24.75 24.39 24.54 24.59 24.79 25.16 25.26 25.53 25.65 25.61 25.39 25.52 25.68 [51] 25.68 25.68 25.89 25.92 26.16 26.09 25.93 26.09 26.18 26.12 25.84 25.77 25.65 25.20 24.83 24.35 24.20 24.17 24.50 24.50 24.47 24.51 24.59 24.75 24.83 [76] 24.48 24.01 23.78 24.20 24.62 24.90 24.80 25.08 25.55 25.92 26.05 26.20 26.01 25.83 25.49 25.58 25.61 25.66 25.89 26.43 26.57 27.02 27.41 27.54 27.23 [101] 27.20 27.29 27.32 27.42 27.06 26.93 26.43 26.35 26.36 26.21 26.52 26.60 26.59 26.37 26.36 26.26 26.21 25.92 25.74 25.68 25.63 25.62 25.66 25.65 25.64 [126] 25.58 25.63 26.13 26.38 26.37 26.26 26.36 26.49 26.48 26.49 26.50 26.89 27.66 28.19 28.38 28.62 28.53 28.04 27.83 28.04 28.27 28.63 28.61 29.07 29.56 [151] 29.44 29.48 28.94 28.54 28.25 28.52 28.21 28.33 28.39 28.79 29.04 29.15 28.99 28.93 28.91 28.85 28.75 28.89 29.09 29.05 29.19 29.28 29.12 28.98 27.76 [176] 27.14 27.00 26.92 26.51 26.23 26.41 26.29 26.19 26.16 26.14 26.07 25.92 25.92 25.80 25.64 25.36 25.21 25.25 25.19 25.15 25.42 25.46 25.48 25.57 25.73 [201] 25.73 25.70 25.93 26.00 26.15 26.27 26.46 26.56 26.63 26.69 26.88 27.00 27.22 27.32 27.28 27.19 27.10 27.14 27.31 27.45 27.44 27.43 27.78 28.07 28.21 [226] 28.19 27.92 27.70 27.52 27.59 27.57 27.70 27.91 28.15 28.12 28.29 28.56 28.47 28.07 27.90 27.75 27.57 27.39 27.17 27.03 27.03 26.97 26.99 27.07 27.03 [251] 26.80 26.85 26.81 26.65 26.57 26.57 26.74 26.74 26.63 26.55 26.61 26.52 26.43 26.37 26.36 26.09 25.86 25.56 25.29 25.28 25.22 25.21 25.15 25.02 24.80 [276] 24.61 24.35 24.34 24.76 24.98 25.20 25.21 25.08 25.05 25.44 25.77 25.86 25.72 25.57 25.56 25.51 25.51 25.36 25.24 25.36 25.37 25.42 25.68 26.66 27.12 [301] 27.28 26.89 26.63 26.32 26.30 26.30 26.32 26.21 25.98 25.91 25.91 25.94 26.02 25.95 25.74 25.74 25.68 25.71 25.79 25.80 25.95 26.06 26.16 26.20 26.16 [326] 26.14 26.01 25.96 26.18 26.43 26.51 26.64 26.78 27.13 27.28 27.56 27.99 28.25 28.32 28.30 28.34 28.46 28.39 28.44 28.49 28.58 28.59 28.30 28.15 27.86 [351] 27.83 27.84 27.74 27.91 27.91 27.72 27.78 28.06 28.53 28.84 28.49 28.51 28.45 28.55 28.51 28.50 28.50 28.29 28.14 27.80 27.42 27.23 27.44 27.48 27.20 [376] 27.04 27.03 27.06 27.26 27.22 27.22 27.21 27.23 27.39 27.44 27.47 27.38 27.33 27.35 27.49 27.33 27.32 27.27 27.29 27.38 27.34 27.39 27.28 27.27 27.37 119


Appendix A3: (Continued) [401] 27.47 27.39 27.23 27.26 27.28 27.25 27.25 27.41 27.56 27.94 28.09 28.27 28.41 28.33 28.14 28.05 28.03 28.04 27.91 27.93 28.17 28.22 28.48 28.54 28.17 [426] 27.82 27.78 27.98 28.04 28.01 28.03 28.14 28.33 28.70 29.07 29.24 29.53 29.67 29.86 29.93 28.68 27.69 27.19 27.11 26.98 26.79 26.62 26.61 26.60 26.69 [451] 26.76 27.02 27.10 27.19 27.26 27.18 27.26 27.25 27.18 27.19 27.21 27.17 27.16 27.05 26.99 27.01 27.00 27.01 26.92 26.93 26.91 26.84 26.76 26.74 26.79 [476] 26.79 26.77 26.72 26.75 26.74 26.77 26.52 26.28 26.25 26.12 25.98 25.77 25.70 25.84 25.96 26.06 26.13 26.22 26.32 26.41 26.31 26.30 26.22 26.23 26.14 120


Appendix A4: EWMA3 Series on Daily Closing Price of Stock XYZ [1] na na 24.37 24.48 24.24 24.22 23.84 23.68 23.65 23.59 23.29 23.35 23.29 23.46 23.17 22.95 23.16 24.04 24.60 2 5.44 25.84 26.18 26.24 26.44 25.79 [26] 25.59 25.32 25.16 24.86 24.46 24.36 25.11 25.53 25.36 25.23 25.39 24.94 24.73 24.36 24.58 24.59 24.81 25.21 25.25 25.57 25.66 25.59 25.37 25.56 25.70 [51] 25.66 25.69 25.92 25.91 26.19 26.08 25.90 26.13 26.18 26.10 25.81 25.77 25.64 25.14 24.80 24.31 24.19 24.18 24.54 24.49 24.46 24.52 24.60 24.77 24.84 [76] 24.43 23.96 23.78 24.27 24.66 24.91 24.78 25.12 25.61 25.94 26.05 26.22 25.98 25.81 25.46 25.60 25.61 25.65 25.93 26.49 26.56 27.08 27.45 27.53 27.18 [101] 27.22 27.30 27.31 27.43 27.01 26.92 26.38 26.36 26.37 26.18 26.57 26.60 26.57 26.35 26.37 26.25 26.21 25.88 25.73 25.69 25.63 25.62 25.67 25.64 25.64 [126] 25.57 25.64 26.20 26.39 26.35 26.26 26.38 26.50 26.47 26.49 26.50 26.94 27.75 28.23 28.38 28.65 28.51 27.97 27.83 28.08 28.28 28.67 28.59 29.13 29.62 [151] 29.39 29.50 28.87 28.50 28.24 28.56 28.15 28.36 28.41 28.83 29.06 29.15 28.96 28.93 28.91 28.84 28.74 28.92 29.11 29.04 29.21 29.29 29.09 28.97 27.59 [176] 27.11 27.03 26.90 26.45 26.22 26.45 26.25 26.17 26.17 26.14 26.06 25.90 25.93 25.79 25.62 25.33 25.20 25.26 25.18 25.14 2 5.46 25.45 25.48 25.59 25.75 [201] 25.72 25.69 25.97 26.00 26.16 26.28 26.48 26.57 26.63 26.70 26.91 27.00 27.25 27.33 27.26 27.19 27.09 27.15 27.33 27.46 27.43 27.44 27.83 28.10 28.21 [226] 28.19 27.88 27.68 27.50 27.60 27.57 27.72 27.94 28.17 28.11 28.32 28.60 28.44 28.02 27.90 27.73 27.54 27.37 27.15 27.02 27.04 26.96 26.99 27.08 27.02 [251] 26.77 26.87 26.81 26.62 26.57 26.58 26.76 26.74 26.61 26.55 26.62 26.50 26.42 26.37 26.36 26.06 25.84 25.54 25.26 25.30 25.21 25.20 25.15 25.00 24.78 [276] 24.59 24.32 24.35 24.82 24.99 25.21 25.21 25.06 25.05 25.50 25.80 25.85 25.70 25.56 25.57 25.50 25.51 25.34 25.23 25.38 25.36 25.43 25.72 26.79 27.14 [301] 27.27 26.85 26.61 26.29 26.31 26.31 26.32 26.20 25.95 25.91 25.91 25.94 26.03 25.94 25.71 25.75 25.67 25.71 25.81 25.79 25.97 26.07 26.16 26.21 26.16 [326] 26.14 25.99 25.96 26.22 26.46 26.50 26.66 26.80 27.17 27.29 27.59 28.04 28.27 28.32 28.30 28.35 28.48 28.38 28.45 28.50 28.59 28.59 28.26 28.14 27.83 [351] 27.83 27.85 27.72 27.94 27.91 27.68 27.80 28.09 28.58 28.86 28.43 28.53 28.44 28.55 28.50 28.50 28.51 28.26 28.13 27.76 27.38 27.22 27.48 27.47 27.15 [376] 27.04 27.04 27.06 27.29 27.21 27.21 27.21 27.23 27.42 27.44 27.47 27.37 27.32 27.35 27.51 27.30 27.33 27.27 27.29 27.40 27.32 27.39 27.27 27.27 27.39 121


Appendix A4: (Continued) [401] 27.47 27.37 27.22 27.27 27.28 27.24 27.25 27.44 27.58 27.98 28.09 28.29 28.43 28.30 28.12 28.06 28.03 28.03 27.90 27.94 28.20 28.22 28.51 28.55 28.11 [426] 27.79 27.80 28.00 28.03 28.01 28.04 28.16 28.35 28.74 29.10 29.25 29.56 29.68 29.88 29.94 28.50 27.60 27.19 27.11 26.96 26.77 26.61 26.61 26.60 26.70 [451] 26.77 27.06 27.10 27.19 27.27 27.17 27.27 27.24 27.16 27.20 27.21 27.16 27.16 27.04 26.98 27.02 27.00 27.01 26.91 26.93 26.91 26.83 26.76 26.74 26.79 [476] 26.79 26.77 26.71 26.76 26.74 26.77 26.48 26.26 26.26 26.10 25.96 25.76 25.69 25.87 25.96 26.07 26.14 26.23 26.33 26.41 26.29 26.30 26.21 26.23 26.13 122


Appendix B1: Daily Closing Price of S&P Price Index [1] 1210.08 1210.47 1222.12 1225.31 1219.43 1207.01 1209.25 1200.08 1206.83 1197.75 1188.07 1190.21 1189.65 1183.78 1171.71 1172.53 1171.42 [18] 1174.28 1165.36 1181.41 1180.59 1172.92 1176.12 1181.39 1184.07 1191.14 1181.20 1181.21 1187.76 1173.79 1162.05 1142.62 1145.98 1152.78 [35] 1137.50 1159.95 1152.12 1162.10 1151.83 1156.38 1143.22 1156.85 1162.16 1161.17 1175.65 1172.63 1171.35 1178.84 1166.22 1171.11 1159.36 [52] 1154.05 1165.69 1173.80 1185.56 1191.08 1189.28 1193.86 1194.07 1190.01 1197.62 1198.78 1191.50 1202.22 1204.29 1196.02 1197.51 1197.26 [69] 1194.67 1200.93 1198.11 1200.82 1203.91 1206.58 1210.96 1216.96 1216.10 1213.61 1213.88 1200.73 1191.57 1190.69 1201.57 1199.85 1191.33 [86] 1194.44 1204.99 1194.94 1197.87 1211.86 1219.44 1222.21 1223.29 1226.50 1227.92 1221.13 1229.35 1235.20 1227.04 1233.68 1229.03 1231.16 [103] 1236.79 1243.72 1234.18 1235.35 1244.12 1245.04 1235.86 122 6.42 1223.13 1231.38 1229.13 1237.81 1230.39 1233.87 1219.34 1 220.24 1219.02 [120] 1219.71 1221.73 1217.59 1209.59 1212.37 1205.10 1212.28 120 8.41 1220.33 1221.59 1218.02 1233.39 1236.36 1231.67 1241.48 1 240.56 1231.20 [137] 1227.16 1227.73 1237.91 1231.02 1221.34 1210.20 1214.62 121 5.29 1215.63 1215.66 1216.89 1227.68 1228.81 1226.70 1214.47 1 196.39 1191.49 [154] 1195.90 1187.33 1184.87 1177.68 1176.84 1186.57 1190.10 1178.14 1195.76 1177.80 1179.59 1199.38 1196.54 1191.38 1178.90 1 198.41 1207.01 [171] 1202.76 1214.76 1219.94 1220.14 1222.81 1218.59 1220.65 123 0.96 1234.72 1233.76 1229.01 1231.21 1242.80 1248.27 1254.85 1 261.23 1265.61 [188] 1268.25 1257.46 1257.48 1249.48 1264.67 1265.08 1262.09 126 3.70 1257.37 1255.84 1259.37 1260.43 1267.43 1272.74 1270.94 1 267.32 1259.92 [205] 1259.62 1262.79 1268.12 1268.66 1256.54 1258.17 1254.42 124 8.29 1268.80 1273.46 1273.48 1285.45 1290.15 1289.69 1294.18 1 286.06 1287.61 [222] 1283.03 1277.93 1285.04 1261.49 1263.82 1266.86 1264.68 127 3.83 1283.72 1285.19 1280.08 1282.46 1270.84 1264.03 1265.02 1 254.78 1265.65 [239] 1263.78 1266.99 1262.86 1275.53 1280.00 1289.38 1287.24 128 3.03 1292.67 1287.79 1289.43 1294.12 1280.66 1291.24 1289.14 1 287.23 1278.26 [256] 1275.88 1278.47 1272.23 1281.42 1284.13 1297.48 1303.02 130 5.33 1307.25 1305.08 1297.23 1305.04 1301.67 1302.95 1301.61 1 293.23 1302.89 123


Appendix B1: (Continued) [273] 1300.25 1294.87 1297.81 1305.93 1311.56 1309.04 1295.50 129 6.62 1286.57 1288.12 1289.12 1285.33 1307.28 1309.93 1311.46 1 311.28 1308.11 [290] 1301.74 1305.41 1309.72 1310.61 1305.19 1313.21 1308.12 131 2.25 1325.76 1324.66 1325.14 1322.85 1305.92 1291.24 1294.50 1 292.08 1270.32 [307] 1261.81 1267.03 1262.07 1256.58 1258.57 1272.88 1280.16 125 9.87 1270.09 1285.71 1288.22 1265.29 1263.85 1256.15 1257.93 1 252.30 1237.44 [324] 1223.69 1230.04 1256.16 1251.54 1240.13 1240.12 1252.20 124 5.60 1244.50 1250.56 1239.20 1246.00 1272.87 1270.20 1280.19 1 270.91 1274.08 [341] 1265.48 1267.34 1272.43 1258.60 1242.28 1236.20 1234.49 123 6.86 1259.81 1249.13 1240.29 1260.91 1268.88 1268.40 1263.20 1 278.55 1276.66 [358] 1270.92 1277.41 1280.27 1279.36 1275.77 1271.48 1265.95 127 1.81 1266.74 1268.21 1285.58 1295.43 1297.48 1302.30 1297.52 1 298.82 1292.99 [375] 1296.06 1295.09 1301.78 1304.28 1305.37 1303.82 1311.01 1 313.25 1300.26 1294.02 1298.92 1299.54 1313.00 1318.07 1316.28 1 319.66 1321.18 [392] 1317.64 1325.18 1318.03 1314.78 1326.37 1336.35 1336.59 133 8.88 1335.85 1331.32 1334.11 1350.20 1353.22 1349.59 1350.66 1 353.42 1349.95 [409] 1362.83 1365.62 1369.06 1364.05 1365.80 1366.96 1368.60 137 7.02 1377.38 1382.22 1389.08 1377.34 1377.93 1377.94 1367.81 1 367.34 1364.30 [426] 1379.78 1382.84 1385.72 1378.33 1380.90 1384.42 1393.22 139 6.57 1399.76 1401.20 1400.50 1402.81 1406.09 1400.95 1381.96 1 386.72 1399.48 [443] 1400.63 1396.71 1409.12 1414.76 1412.90 1407.29 1409.84 141 3.04 1411.56 1413.21 1425.49 1427.09 1422.48 1425.55 1423.53 1 418.30 1410.76 [460] 1416.90 1426.84 1424.73 1418.30 1416.60 1418.34 1409.71 141 2.84 1412.11 1414.85 1423.82 1430.73 1431.90 1430.62 1426.37 1 430.50 1422.95 [477] 1427.99 1440.13 1423.90 1422.18 1420.62 1428.82 1438.24 144 5.94 1448.39 1446.99 1448.00 1450.02 1448.31 1438.06 1433.37 1 444.26 1455.30 [494] 1456.81 1455.54 1459.68 1457.63 1456.38 1451.19 1449.37 124


Appendix B2: MA3 Series on Daily Closing Price of S&P Price Index [1] NA NA 1214.223 1219.300 1222.287 1217.250 1211.897 1205.447 1205.387 1201.553 1197.550 1192.010 1189.310 11 87.880 1181.713 [16] 1176.007 1171.887 1172.743 1170.353 1173.683 1175.787 1178 .307 1176.543 1176.810 1180.527 1185.533 1185.470 1184.517 118 3.390 1180.920 [31] 1174.533 1159.487 1150.217 1147.127 1145.420 1150.077 1149 .857 1158.057 1155.350 1156.770 1150.477 1152.150 1154.077 116 0.060 1166.327 [46] 1169.817 1173.210 1174.273 1172.137 1172.057 1165.563 1161 .507 1159.700 1164.513 1175.017 1183.480 1188.640 1191.407 119 2.403 1192.647 [61] 1193.900 1195.470 1195.967 1197.500 1199.337 1200.843 11 99.273 1196.930 1196.480 1197.620 1197.903 1199.953 1200.947 120 3.770 1207.150 [76] 1211.500 1214.673 1215.557 1214.530 1209.407 1202.060 1194.330 1194.610 1197.370 1197.583 1195.207 1196.920 1198.123 119 9.267 1201.557 [91] 1209.723 1217.837 1221.647 1224.000 1225.903 1225.183 1226.133 1228.560 1230.530 1231.973 1229.917 1231.290 1232.327 1237.223 1238.230 [106] 1237.750 1237.883 1241.503 1241.673 1235.773 1228.470 122 6.977 1227.880 1232.773 1232.443 1234.023 1227.867 1224.483 1219.533 1219.657 [121] 1220.153 1219.677 1216.303 1213.183 1209.020 1209.917 120 8.597 1213.673 1216.777 1219.980 1224.333 1229.257 1233.807 1236.503 1237.903 [136] 1237.747 1232.973 1228.697 1230.933 1232.220 1230.090 122 0.853 1215.387 1213.370 1215.180 1215.527 1216.060 1220.077 1224.460 1227.730 [151] 1223.327 1212.520 1200.783 1194.593 1191.573 1189.367 1183.293 1179.797 1180.363 1184.503 1184.937 1188.000 1183.900 1184 .383 1185.590 [166] 1191.837 1195.767 1188.940 1189.563 1194.773 1202.727 1208 .177 1212.487 1218.280 1220.963 1220.513 1220.683 1223.400 1228 .777 1233.147 [181] 1232.497 1231.327 1234.340 1240.760 1248.640 1254.783 126 0.563 1265.030 1263.773 1261.063 1254.807 1257.210 1259.743 1263.947 1263.623 [196] 1261.053 1258.970 1257.527 1258.547 1262.410 1266.867 127 0.370 1270.333 1266.060 1262.287 1260.777 1263.510 1266.523 1264.440 1261.123 [211] 1256.377 1253.627 1257.170 1263.517 1271.913 1277.463 1283.027 1288.430 1291.340 1289.977 1289.283 1285.567 1282.857 1282.000 1274.820 [226] 1270.117 1264.057 1265.120 1268.457 1274.077 1280.913 1282.997 1282.577 1277.793 1272.443 1266.630 1261.277 1261.817 1261 .403 1265.473 125


Appendix B2: (Continued) [241] 1264.543 1268.460 1272.797 1281.637 1285.540 1286.550 128 7.647 1287.830 1289.963 1290.447 1288.070 1288.673 1287.013 1289.203 1284.877 [256] 1280.457 1277.537 1275.527 1277.373 1279.260 1287.677 129 4.877 1301.943 1305.200 1305.887 1303.187 1302.450 1301.313 1303.220 1302.077 [271] 1299.263 1299.243 1298.790 1299.337 1297.643 1299.537 130 5.100 1308.843 1305.367 1300.387 1292.897 1290.437 1287.937 1287.523 1293.910 [286] 1300.847 1309.557 1310.890 1310.283 1307.043 1305.087 1305.623 1308.580 1308.507 1309.670 1308.840 1311.193 1315.377 1320 .890 1325.187 [301] 1324.217 1317.970 1306.670 1297.220 1292.607 1285.633 127 4.737 1266.387 1263.637 1261.893 1259.073 1262.677 1270.537 1270.970 1270.040 [316] 1271.890 1281.340 1279.740 1272.453 1261.763 1259.310 125 5.460 1249.223 1237.810 1230.390 1236.630 1245.913 1249.277 1243.930 1244.150 [331] 1245.973 1247.433 1246.887 1244.753 1245.253 1252.690 126 3.023 1274.420 1273.767 1275.060 1270.157 1268.967 1268.417 1266.123 1257.770 [346] 1245.693 1237.657 1235.850 1243.720 1248.600 1249.743 1250.110 1256.693 1266.063 1266.827 1270.050 1272.803 1275.377 1274 .997 1276.200 [361] 1279.013 1278.467 1275.537 1271.067 1269.747 1268.167 126 8.920 1273.510 1283.073 1292.830 1298.403 1299.100 1299.547 1296.443 1295.957 [376] 1294.713 1297.643 1300.383 1303.810 1304.490 1306.733 130 9.360 1308.173 1302.510 1297.733 1297.493 1303.820 1310.203 1315.783 1318.003 [391] 1319.040 1319.493 1321.333 1320.283 1319.330 1319.727 132 5.833 1333.103 1337.273 1337.107 1335.350 1333.760 1338.543 1345.843 1351.003 [406] 1351.157 1351.223 1351.343 1355.400 1359.467 1365.837 136 6.243 1366.303 1365.603 1367.120 1370.860 1374.333 1378.873 1382.893 1382.880 [421] 1381.450 1377.737 1374.560 1371.030 1366.483 1370.473 137 5.640 1382.780 1382.297 1381.650 1381.217 1386.180 1391.403 1396.517 1399.177 [436] 1400.487 1401.503 1403.133 1403.283 1396.333 1389.877 138 9.387 1395.610 1398.940 1402.153 1406.863 1412.260 1411.650 1410.010 1410.057 [451] 1411.480 1412.603 1416.753 1421.930 1425.020 1425.040 1423.853 1422.460 1417.530 1415.320 1418.167 1422.823 1423.290 1419.877 1417.747 [466] 1414.883 1413.630 1411.553 1413.267 1416.927 1423.133 1428.817 1431.083 1429.630 1429.163 1426.607 1427.147 1430.357 1430.673 1428.737 [481] 1422.233 1423.873 1429.227 1437.667 1444.190 1447.107 144 7.793 1448.337 1448.777 1445.463 1439.913 1438.563 1444.310 1452.123 1455.883 [496] 1457.343 1457.617 1457.897 1455.067 1452.313 126


Appendix B3: WMA3 Series on Daily Closing Price of S&P Price Index [1] NA NA 1216.230 1221.773 1221.838 1214.200 1210.200 1204.292 1204.983 1201.165 1194.423 1190.753 1189.573 11 86.808 1178.723 [16] 1174.132 1171.838 1173.035 1169.343 1174.872 1178.325 1176 .892 1175.798 1178.222 1181.852 1187.158 1184.992 1182.862 118 4.483 1179.683 [31] 1170.248 1154.292 1147.538 1148.820 1144.007 1151.272 1152 .293 1158.415 1155.302 1155.817 1149.042 1152.228 1157.233 116 0.780 1168.575 [46] 1171.727 1172.493 1175.308 1171.282 1170.768 1164.420 1158 .663 1160.755 1167.805 1178.328 1186.360 1189.260 1191.870 119 3.202 1192.005 [61] 1194.492 1196.932 1194.947 1198.073 1201.468 1199.810 11 98.143 1197.137 1196.007 1198.232 1198.477 1199.935 1201.913 120 4.730 1208.325 [76] 1213.230 1215.530 1214.998 1214.160 1207.260 1198.342 1192.657 1196.277 1198.897 1195.877 1194.305 1199.197 1198.207 1198.080 1204.377 [91] 1213.318 1219.562 1222.288 1224.715 1226.675 1224.288 1226.372 1230.905 1230.145 1231.720 1230.248 1230.870 1233.620 1239.317 1237.795 [106] 1236.355 1239.540 1243.118 1240.297 1232.670 1226.348 1227.803 1228.880 1233.845 1232.653 1233.367 1226.025 1222.212 1219.480 1219.568 [121] 1220.605 1219.323 1214.280 1212.313 1208.272 1209.902 120 9.148 1215.015 1218.973 1219.595 1226.300 1232.313 1233.520 1237.357 1239.385 [136] 1236.033 1230.740 1228.118 1232.725 1232.768 1227.328 1217.383 1214.267 1214.218 1215.348 1215.588 1216.270 1222.080 1226.447 1227.567 [151] 1220.937 1207.468 1196.953 1194.512 1190.880 1187.528 1181.685 1178.458 1181.845 1186.713 1183.532 1188.943 1183.843 1181 .688 1189.187 [166] 1194.662 1194.433 1186.000 1190.735 1199.458 1203.452 1209 .468 1215.350 1219.177 1221.442 1220.255 1220.323 1225.462 1231 .122 1233.613 [181] 1231.545 1230.902 1236.638 1243.603 1250.648 1256.943 126 2.357 1266.200 1262.415 1259.268 1253.477 1258.408 1262.343 1263.517 1263.393 [196] 1260.267 1257.660 1257.860 1259.312 1263.753 1268.918 127 0.955 1269.430 1264.223 1261.003 1261.255 1264.927 1267.502 1262.510 1259.375 [211] 1256.023 1251.980 1259.567 1267.712 1272.693 1279.462 1285.805 1289.137 1292.012 1289.372 1288.188 1285.062 1281.243 1282.335 1272.080 [226] 1266.580 1264.952 1265.263 1269.618 1277.250 1282.807 128 2.390 1282.122 1276.253 1269.372 1265.660 1259.735 1261.922 1262.903 1265.697 127


Appendix B3: (Continued) [241] 1264.390 1269.883 1275.653 1283.945 1286.747 1285.492 128 8.552 1288.623 1289.423 1291.502 1286.608 1288.193 1288.427 1288.535 1283.063 [256] 1278.565 1277.572 1274.918 1277.865 1281.243 1290.353 129 8.025 1303.252 1305.905 1305.845 1301.517 1302.443 1302.053 1302.872 1302.067 [271] 1297.643 1299.457 1299.960 1298.000 1297.237 1301.380 130 7.392 1309.362 1302.690 1298.317 1291.408 1289.020 1288.362 1287.058 1296.937 [286] 1304.947 1310.253 1311.115 1309.725 1305.453 1304.637 1306.953 1309.447 1307.752 1310.103 1309.328 1311.033 1318.317 1322 .958 1325.083 [301] 1323.915 1314.767 1301.402 1295.317 1292.747 1281.603 126 9.692 1265.838 1263.680 1260.152 1258.490 1265.393 1274.135 1268.802 1268.362 [316] 1276.197 1284.362 1276.337 1268.392 1260.240 1258.323 125 4.818 1245.808 1233.042 1229.157 1242.042 1249.497 1246.605 1242.027 1246.162 [331] 1246.887 1246.150 1247.713 1243.870 1244.493 1258.302 126 7.057 1275.640 1273.885 1274.042 1269.252 1267.843 1269.575 1264.667 1252.745 [346] 1241.960 1236.358 1235.960 1247.940 1250.645 1246.490 125 2.073 1261.458 1267.312 1265.880 1271.742 1275.047 1274.105 1275.122 1277.758 [361] 1279.338 1277.717 1274.223 1269.430 1269.802 1268.298 126 8.320 1276.650 1287.610 1294.813 1299.548 1299.107 1298.967 1295.688 1295.497 [376] 1295.063 1298.597 1301.915 1304.408 1304.413 1307.673 131 0.932 1306.382 1299.305 1297.510 1298.413 1306.167 1313.292 1316.330 1318.268 [391] 1319.857 1319.157 1322.000 1320.348 1317.597 1321.117 1329.428 1334.807 1337.695 1336.983 1334.090 1333.470 1341.690 1349 .028 1350.902 [406] 1350.730 1351.862 1351.225 1356.968 1362.078 1366.875 136 5.982 1365.760 1366.088 1367.587 1372.537 1375.797 1379.740 1384.843 1382.067 [421] 1379.592 1377.837 1372.873 1369.263 1365.898 1372.547 137 8.730 1383.770 1381.545 1380.847 1382.232 1388.233 1393.428 1397.607 1399.948 [436] 1400.610 1401.772 1404.065 1402.973 1392.312 1387.505 139 2.307 1397.928 1398.478 1403.568 1409.872 1412.890 1410.405 1409 .500 1411.015 [451] 1411.767 1412.632 1419.075 1424.243 1424.518 1424.783 1424.028 1421.252 1415.402 1415.087 1420.847 1424.128 1421.867 1418.522 1417.753 [466] 1413.735 1412.713 1411.953 1413.602 1418.878 1425.780 1430.163 1431.065 1428.708 1429.143 1426.037 1426.728 1433.220 1429.992 1425.745 [481] 1421.687 1424.980 1432.163 1440.520 1445.882 1447.282 144 7.728 1448.842 1448.828 1443.470 1437.423 1439.597 1447.965 1454.215 1455.923 [496] 1457.822 1457.965 1457.347 1453.993 1451.145 128


Appendix B4: EWMA3 Series on Daily Closing Price of S&P Price Index [1] NA NA 1217.071 1222.279 1221.494 1213.173 1210.064 1203.690 1205.247 1200.677 1193.516 1190.676 1189.584 11 86.376 1177.721 [16] 1173.903 1171.779 1173.213 1168.774 1175.806 1178.649 1176 .324 1175.844 1178.674 1182.169 1187.727 1184.450 1182.626 118 4.951 1178.841 [31] 1169.077 1152.624 1147.316 1149.386 1143.077 1152.511 1152.269 1158.941 1154.806 1155.897 1148.210 1152.889 1157.937 116 0.836 1169.586 [46] 1171.856 1172.330 1175.813 1170.559 1170.817 1163.697 1158 .004 1161.460 1168.661 1179.361 1187.034 1189.263 1192.154 119 3.326 1191.720 [61] 1194.939 1197.196 1194.454 1198.666 1201.871 1199.269 11 98.053 1197.154 1195.816 1198.617 1198.424 1200.061 1202.199 120 4.994 1208.701 [76] 1213.763 1215.611 1214.800 1214.120 1206.327 1197.374 1192.376 1197.033 1199.033 1195.227 1194.324 1200.024 1197.740 1198.050 1205.446 [91] 1214.193 1219.940 1222.431 1224.970 1226.853 1223.837 1226.797 1231.519 1229.701 1232.000 1230.074 1230.911 1234.073 123 9.946 1237.279 [106] 1236.211 1240.194 1243.393 1239.663 1231.777 1225.889 1228.314 1228.916 1234.411 1232.330 1233.439 1225.070 1221.930 1219 .414 1219.589 [121] 1220.766 1219.076 1213.610 1212.321 1207.819 1210.241 120 9.043 1215.774 1219.347 1219.370 1227.313 1232.891 1233.256 1237.946 1239.553 [136] 1235.343 1230.229 1228.063 1233.466 1232.519 1226.473 121 6.357 1214.317 1214.371 1215.389 1215.599 1216.359 1222.880 1226.784 1227.443 [151] 1220.013 1205.886 1196.173 1194.710 1190.373 1187.149 1181.113 1178.227 1182.520 1187.197 1182.761 1189.917 1182.980 1181 .389 1190.643 [166] 1194.930 1193.997 1184.986 1191.831 1200.537 1203.353 1210.224 1216.006 1219.314 1221.637 1220.017 1220.370 1226.247 1231 .636 1233.634 [181] 1231.183 1230.946 1237.519 1244.270 1251.249 1257.556 126 2.821 1266.493 1261.707 1259.013 1252.906 1259.303 1262.734 1263.313 1263.437 [196] 1259.853 1257.400 1258.076 1259.471 1264.279 1269.464 127 0.953 1269.129 1263.609 1260.806 1261.474 1265.383 1267.667 1261.657 1259.203 [211] 1255.794 1251.453 1260.886 1268.533 1272.806 1280.317 1286.426 1289.216 1292.321 1288.899 1288.106 1284.771 1280.770 1282.721 1270.567 [226] 1266.186 1265.224 1265.180 1270.220 1278.174 1283.147 128 2.060 1282.170 1275.480 1268.609 1265.569 1259.027 1262.454 1263.029 1265.881 129

Appendix B4: (Continued)
[241] 1264.171 1270.690 1276.274 1284.721 1286.817 1285.140 1289.140 1288.504 1289.424 1291.876 1285.759 1288.629 1288.529 1288.349 1282.377
[256] 1278.181 1277.700 1274.534 1278.373 1281.656 1291.371 1298.739 1303.549 1306.097 1305.736 1300.904 1302.814 1301.999 1302.883 1302.001
[271] 1297.013 1299.947 1300.001 1297.553 1297.319 1302.030 1307.987 1309.316 1301.663 1298.074 1290.717 1288.891 1288.470 1286.811 1298.414
[286] 1305.659 1310.426 1311.139 1309.494 1304.923 1304.747 1307.349 1309.613 1307.386 1310.547 1309.156 1311.207 1319.380 1323.201 1325.091
[301] 1323.763 1313.503 1299.950 1295.200 1292.651 1279.991 1268.566 1266.009 1263.450 1259.641 1258.501 1266.463 1274.996 1267.526 1268.609
[316] 1277.556 1284.913 1274.759 1267.743 1259.656 1258.267 1254.459 1244.613 1231.706 1229.283 1244.059 1249.789 1245.680 1241.754 1247.024
[331] 1246.703 1245.914 1248.120 1243.203 1244.709 1260.383 1267.506 1276.290 1273.460 1274.047 1268.713 1267.771 1269.983 1263.800 1251.250
[346] 1241.137 1236.091 1236.089 1249.636 1250.429 1245.604 1253.336 1262.519 1267.467 1265.497 1272.714 1275.277 1273.650 1275.449 1278.117
[361] 1279.341 1277.439 1273.831 1268.933 1270.089 1268.076 1268.304 1277.926 1288.727 1295.194 1299.941 1298.880 1298.946 1295.303 1295.577
[376] 1295.067 1299.051 1302.253 1304.546 1304.329 1308.150 1311.263 1305.507 1298.550 1297.711 1298.574 1307.143 1313.974 1316.323 1318.467
[391] 1320.046 1318.940 1322.454 1320.017 1317.194 1321.867 1330.417 1335.061 1337.864 1336.821 1333.694 1333.561 1342.906 1349.627 1350.714
[406] 1350.720 1352.084 1351.043 1357.806 1362.584 1367.187 1365.706 1365.766 1366.213 1367.731 1373.177 1376.023 1380.094 1385.449 1381.391
[421] 1379.354 1377.851 1372.150 1368.989 1365.670 1373.580 1379.317 1384.049 1381.086 1380.854 1382.544 1388.946 1393.877 1397.914 1400.127
[436] 1400.594 1401.920 1404.354 1402.684 1390.833 1387.393 1393.331 1398.314 1398.226 1404.361 1410.570 1412.891 1409.960 1409.549 1411.304
[451] 1411.737 1412.714 1419.991 1424.650 1424.227 1424.893 1423.957 1420.830 1414.739 1415.346 1421.703 1424.214 1421.357 1418.247 1417.837
[466] 1413.160 1412.731 1411.976 1413.780 1419.584 1426.487 1430.411 1431.001 1428.374 1429.337 1425.596 1426.909 1434.207 1429.121 1425.236
[481] 1421.534 1425.529 1433.031 1441.294 1446.240 1447.240 1447.767 1449.010 1448.754 1442.697 1436.844 1440.263 1449.013 1454.586 1455.869
[496] 1458.087 1457.917 1457.209 1453.593 1450.891
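Note: the EWMA3 series listed in Appendix B4 is a three-term exponentially weighted moving average of the daily closing prices, which is why its first two entries are printed as NA (a full window of three observations is not yet available). The short Python sketch below illustrates one way such a k-term weighted series can be generated; the smoothing constant alpha, its normalization, and the helper name ewma_k are illustrative assumptions, not the exact scheme used to produce this table.

import math

def ewma_k(prices, k=3, alpha=0.5):
    """Return a k-term exponentially weighted moving average series.

    Weights are proportional to (1 - alpha)**i for lags i = 0, ..., k-1
    (most recent observation weighted heaviest) and are normalized to sum to 1.
    The first k-1 entries are NaN, matching the NA entries in Appendix B4.
    """
    weights = [(1 - alpha) ** i for i in range(k)]
    total = sum(weights)
    weights = [w / total for w in weights]

    series = []
    for t in range(len(prices)):
        if t < k - 1:
            series.append(float("nan"))          # printed as NA in the appendix
        else:
            window = prices[t - k + 1 : t + 1][::-1]  # most recent observation first
            series.append(sum(w * x for w, x in zip(weights, window)))
    return series

# Example with a few hypothetical closing prices:
closing = [1202.1, 1195.9, 1186.2, 1190.8, 1197.3]
print([round(v, 3) if not math.isnan(v) else "NA" for v in ewma_k(closing)])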

Appendix C1: Monthly Temperature for 1895-2007 (Version 1 Dataset)
[1] 27.88 27.63 41.14 53.89 61.00 68.68 72.08 72.38 66.75 51.23 40.29 33.09 32.60 36.07 38.97 53.51 63.38 70.30 74.24 73.06 63.60 52.55 39.79
[24] 36.31 29.11 34.56 40.20 52.06 61.69 68.73 74.00 71.82 67.21 56.05 41.91 31.65 31.62 36.15 42.32 51.56 60.36 69.79 73.87 73.39 66.07 52.32
[47] 39.09 29.54 30.83 26.43 38.74 51.67 60.71 69.15 73.40 72.49 65.27 55.13 45.89 32.65 35.02 31.63 42.09 52.40 62.27 70.60 73.60 73.63 65.91
[70] 57.34 42.51 35.28 32.90 30.77 41.67 50.19 61.54 69.08 76.84 73.73 63.82 56.16 42.52 32.33 30.83 32.47 42.85 51.81 62.73 68.62 72.66 72.00
[93] 63.22 55.08 44.70 31.73 31.85 29.47 43.34 51.05 60.22 66.43 72.44 71.56 63.24 54.59 40.85 31.28 28.20 32.60 42.60 50.19 60.73 67.84 71.86
[116] 71.20 65.46 54.64 43.91 32.86 27.82 27.60 46.24 51.37 59.86 68.76 72.44 72.91 66.48 52.28 43.08 31.90 33.89 34.42 37.00 53.55 60.23 68.00
[139] 72.82 72.56 66.78 53.05 41.45 35.91 31.85 36.24 46.72 48.15 56.53 66.41 73.09 71.40 64.62 54.42 41.69 35.57 33.25 33.93 44.80 53.46 59.35
[162] 67.55 73.67 71.21 65.99 52.31 43.24 34.15 32.88 35.90 41.28 49.70 58.64 69.51 73.18 73.53 64.80 53.40 45.73 26.42 30.41 30.21 50.39 54.45
[185] 59.78 68.85 74.59 71.63 66.16 56.25 41.93 32.73 33.77 34.59 45.10 50.87 61.87 71.37 73.18 71.47 66.58 53.31 38.02 33.03 25.59 32.00 36.75
[208] 51.52 61.22 67.35 72.87 70.77 63.23 53.88 43.09 33.95 31.32 30.45 39.74 52.33 60.87 68.94 73.66 74.07 64.24 52.33 46.02 34.22 35.56 30.66
[231] 41.91 51.92 61.94 70.08 74.69 72.32 65.11 55.95 44.42 27.72 29.43 37.02 38.10 55.53 58.89 66.78 71.58 70.71 65.00 56.15 44.07 33.94 29.02
[254] 33.90 43.31 51.18 59.99 67.02 74.93 72.48 64.17 52.87 41.10 29.76 29.23 30.93 39.46 49.54 55.88 67.68 75.20 71.58 64.78 51.11 44.47 30.35
[277] 24.76 34.80 46.74 49.88 61.53 71.75 72.97 73.16 62.39 56.87 41.25 35.18 32.65 33.12 41.82 52.12 60.68 70.00 74.89 72.78 66.14 53.30 40.07
[300] 29.49 29.95 34.45 41.25 47.59 59.97 68.03 73.31 71.27 65.85 55.26 40.43 33.78 35.10 37.82 47.53 51.74 60.71 71.00 75.23 72.60 67.02 55.95
[323] 43.35 35.80 27.65 32.79 41.39 51.15 61.83 70.91 73.54 73.38 67.79 55.65 42.65 34.24 35.42 30.41 39.31 50.77 60.07 68.73 74.37 71.70 65.56
[346] 51.91 43.54 36.86 27.06 36.03 38.16 51.10 58.55 69.22 72.45 72.71 63.26 55.66 43.27 28.21 29.84 39.20 45.17 55.50 60.66 70.53 74.73 72.05

Appendix C1: (Continued)
[369] 67.17 49.45 41.33 33.32 31.15 38.59 40.68 50.89 61.70 68.91 74.02 72.95 64.54 55.53 41.37 32.16 32.33 38.73 43.25 52.40 60.38 67.87 73.31
[392] 69.98 65.53 56.57 45.14 29.02 32.79 35.26 44.10 48.88 62.11 66.55 74.00 72.40 63.74 55.66 42.52 33.77 27.15 27.84 44.65 51.99 59.52 68.17
[415] 74.28 73.27 63.93 54.53 39.21 34.67 24.43 41.02 41.40 55.23 59.92 69.15 75.67 73.57 66.28 52.43 42.01 32.08 33.88 38.91 40.43 51.90 60.13
[438] 71.45 76.36 72.67 68.70 57.27 44.28 36.02 32.41 37.63 38.37 52.42 61.28 69.98 74.83 73.22 65.25 53.04 41.64 30.32 35.25 30.56 42.93 50.78
[461] 60.62 72.34 75.85 72.55 68.25 55.89 43.46 37.18 36.00 35.18 43.79 54.63 65.38 71.48 77.09 74.25 64.43 57.22 45.91 33.55 32.33 36.94 44.62
[484] 50.54 58.02 68.37 75.86 73.66 65.32 54.42 40.39 31.71 28.20 26.08 44.52 51.04 64.15 71.35 77.52 75.32 66.89 54.11 40.69 35.37 25.77 32.72
[507] 39.90 51.03 62.67 69.56 75.28 75.36 66.26 54.58 41.69 33.52 32.98 36.95 46.16 52.60 60.56 69.54 74.47 74.46 67.24 57.15 40.89 35.14 34.91
[530] 30.98 43.56 52.37 63.43 69.58 75.51 73.30 67.95 55.47 43.20 38.55 24.56 34.85 42.57 51.07 61.57 70.29 74.82 72.79 65.94 56.88 39.98 37.01
[553] 33.20 34.53 40.35 53.07 63.15 68.71 74.74 72.82 65.16 56.02 43.97 37.13 30.79 31.80 42.51 53.99 59.88 68.72 74.68 72.51 64.30 55.36 44.06
[576] 33.84 30.28 37.28 39.61 53.86 60.48 69.70 75.01 74.25 64.84 54.42 41.90 33.55 33.04 35.62 39.69 49.97 62.82 69.16 73.49 72.82 65.81 56.04
[599] 42.56 31.57 31.34 36.26 46.70 51.17 59.00 66.63 73.65 72.89 65.50 54.86 43.01 30.01 32.34 35.93 47.93 55.56 59.46 69.16 74.54 71.83 65.13
[622] 54.03 42.73 37.08 32.72 32.38 39.80 51.94 60.83 67.54 73.54 74.84 67.12 59.58 39.20 33.90 28.83 32.59 39.83 54.31 61.04 69.56 73.97 72.81
[645] 66.67 53.97 42.70 33.55 27.07 32.95 41.93 53.03 62.54 70.05 74.88 72.82 64.33 54.84 46.22 34.31 31.41 36.51 39.95 49.65 60.24 68.74 71.89
[668] 70.72 64.14 58.17 41.58 33.88 31.16 35.43 39.62 50.69 61.50 67.35 74.29 72.56 64.61 54.75 39.07 32.18 32.96 36.50 38.84 52.61 61.34 71.55
[691] 74.74 73.50 66.47 53.66 41.00 34.36 37.26 36.90 44.79 49.66 59.94 70.92 74.78 72.78 66.69 56.95 44.85 34.66 31.64 42.30 40.32 55.04 59.31
[714] 69.61 76.15 72.87 67.07 55.43 44.98 34.86 30.36 32.07 41.10 53.80 62.09 67.26 75.12 74.54 66.46 55.77 39.10 32.45 31.78 33.16 41.80 50.34
[737] 62.58 71.09 73.77 72.36 65.67 56.90 41.31 37.05 27.95 38.93 42.37 51.64 60.50 69.52 74.99 72.41 64.77 52.43 41.26 37.95 32.51 32.89 38.75
[760] 51.31 63.28 68.74 73.26 73.98 66.08 55.48 43.96 32.98 30.26 33.40 42.58 52.56 61.70 70.72 74.24 73.89 65.35 54.02 39.52 36.82 30.30 31.55

Appendix C1: (Continued)
[783] 36.84 53.54 60.10 69.84 74.52 72.86 66.98 55.49 43.62 31.24 30.51 38.38 44.56 49.57 59.68 70.26 73.89 73.37 64.06 54.57 41.45 31.86 27.75
[806] 35.86 38.95 52.77 63.13 68.70 72.75 72.78 64.76 56.83 44.03 34.17 25.14 35.03 44.84 52.88 61.83 69.49 74.34 72.69 66.94 60.03 44.83 28.50
[829] 32.34 32.76 39.89 52.26 62.29 68.78 75.35 71.37 64.31 54.43 43.19 32.76 32.43 33.35 36.49 52.85 61.68 67.65 73.33 71.74 62.38 55.61 45.40
[852] 36.51 27.11 32.17 44.00 50.93 61.02 68.76 75.78 71.50 65.15 53.53 43.96 33.41 33.81 33.81 44.66 52.03 58.50 68.38 73.22 71.92 64.68 54.60
[875] 42.30 32.93 29.55 33.81 44.66 51.63 59.05 69.30 73.60 71.36 64.50 55.31 41.93 30.74 29.86 33.67 37.13 53.62 62.27 67.65 74.81 73.87 66.16
[898] 51.74 42.44 34.17 27.83 36.23 39.59 50.52 61.97 69.59 74.69 74.07 65.21 52.99 42.11 34.35 30.07 34.33 40.76 51.33 59.27 70.34 73.16 72.78
[921] 65.00 55.82 42.41 34.76 30.56 34.37 45.07 51.82 61.44 69.11 72.91 72.46 65.10 52.98 39.74 30.46 30.27 34.52 45.44 49.86 60.16 69.80 74.03
[944] 73.29 64.99 56.74 43.58 34.36 31.61 35.63 45.44 52.60 61.31 69.29 74.87 70.99 62.90 54.55 43.11 34.33 32.95 33.52 39.76 47.88 61.34 68.34
[967] 74.30 72.26 63.51 55.29 43.34 34.77 30.49 40.35 43.39 52.93 59.97 68.64 73.67 71.46 64.85 50.70 39.16 31.52 23.66 36.86 44.32 55.12 62.64
[990] 71.17 75.21 72.51 66.51 54.57 43.54 33.81 26.36 28.69 42.22 52.62 60.64 69.81 74.59 72.51 66.52 55.05 42.13 30.46 22.50 28.74 43.16 51.17
[1013] 60.13 68.73 73.91 71.72 66.72 55.91 40.70 36.60 31.62 33.50 40.37 52.46 61.21 69.55 76.49 73.53 66.82 53.27 43.03 35.66 32.96 37.43 43.91
[1036] 56.19 59.92 70.67 74.66 72.78 65.50 52.95 45.41 34.14 26.60 33.19 43.01 49.71 61.80 67.14 73.96 72.80 64.65 53.89 41.86 36.41 33.59 37.68
[1059] 43.43 48.09 58.91 68.00 74.81 75.85 66.21 55.63 43.78 25.85 29.28 38.17 41.08 50.65 61.08 69.51 73.93 73.89 63.76 54.31 42.29 35.07 26.72
[1082] 31.56 44.53 55.38 63.03 68.95 74.81 71.71 63.38 54.94 40.13 29.05 35.11 36.59 47.17 53.91 62.08 71.34 74.33 72.39 64.44 54.62 42.01 34.93
[1105] 31.95 38.23 43.58 54.51 64.03 71.08 74.04 72.54 65.76 53.34 44.65 35.23 28.21 34.26 43.47 52.84 62.23 71.71 75.51 74.38 65.13 54.29 44.00
[1128] 33.94 34.40 29.68 43.49 53.76 61.42 68.99 74.91 72.02 64.59 54.92 43.45 27.99 37.15 37.46 45.63 53.47 59.88 70.97 74.05 72.97 67.90 54.59
[1151] 45.54 31.28 29.74 40.44 44.43 53.61 63.00 69.95 74.70 73.73 65.48 55.25 40.02 36.12 33.96 40.43 45.62 53.48 62.07 68.29 72.02 70.46 65.17
[1174] 55.00 40.75 32.22 30.75 31.69 42.52 50.95 61.95 68.19 73.05 72.71 63.88 53.60 39.69 34.77 29.91 32.55 45.42 53.73 61.73 71.77 74.32 72.90

Appendix C1: (Continued)
[1197] 66.06 54.86 43.37 36.73 33.59 38.06 44.36 50.52 59.69 68.54 74.38 74.99 65.27 55.05 42.50 34.01 30.26 36.25 39.85 51.70 61.94 70.60 74.30
[1220] 73.02 64.13 54.31 40.05 34.45 30.46 36.93 45.49 48.97 60.33 69.24 73.98 72.44 67.07 54.50 41.42 34.40 35.45 39.68 42.09 51.99 63.51 69.01
[1243] 76.18 74.62 69.59 55.66 45.45 36.03 34.20 40.30 43.64 52.55 61.06 69.18 75.13 73.73 64.80 54.94 48.35 36.48 34.04 40.36 46.84 53.35 64.05
[1266] 69.81 74.53 74.70 66.06 55.55 38.32 28.48 31.50 34.40 42.04 53.92 63.62 70.00 75.26 74.86 66.07 54.76 47.83 36.53 34.92 36.74 40.02 54.39
[1289] 60.19 71.63 76.55 73.49 67.44 52.23 42.14 35.43 33.00 33.48 44.02 52.85 61.90 68.84 76.13 75.38 65.41 56.87 42.76 35.68 30.34 33.91 47.94
[1312] 53.53 62.87 68.87 73.59 70.80 66.26 55.84 44.34 35.15 33.23 38.22 42.90 52.98 60.34 70.01 75.89 73.88 67.84 55.75 45.28 32.64 39.29 35.16
[1335] 43.31 56.03 63.06 71.44 77.10 74.10 63.69 52.97 44.68 36.64 31.39 32.86

Appendix C2: Monthly Temperature for 1895-2007 (Version 2 Dataset)
[1] 27.63 27.54 40.95 54.05 61.11 68.61 71.97 72.24 66.43 51.57 40.25 32.73 32.38 36.04 39.21 53.27 63.23 70.57 74.44 73.05 63.63 52.79 39.17
[24] 36.44 28.92 34.29 39.90 52.24 62.03 68.75 74.03 72.03 67.04 55.69 41.72 31.62 31.59 36.21 42.03 51.80 60.51 69.93 73.92 73.45 66.01 52.13
[47] 39.25 29.49 30.55 26.27 38.57 51.60 60.62 69.28 73.65 72.42 65.33 54.68 45.98 32.63 34.93 31.42 42.18 52.56 62.52 71.01 73.76 73.55 65.62
[70] 57.17 42.44 35.37 32.54 30.71 41.81 50.31 61.83 69.11 76.94 73.87 63.80 56.16 42.66 32.32 30.85 32.37 42.78 51.93 62.88 68.74 72.74 72.09
[93] 63.36 55.13 44.58 31.51 31.72 29.43 43.40 51.20 60.30 66.58 72.61 71.64 63.32 54.71 41.13 31.44 28.29 32.40 42.70 50.35 60.88 67.99 72.02
[116] 71.36 65.56 54.67 44.18 32.83 27.83 27.76 46.24 51.53 59.95 68.71 72.57 73.04 66.52 52.36 43.17 31.96 33.76 34.59 37.20 53.74 60.37 68.06
[139] 73.00 72.48 66.87 53.20 41.45 35.74 31.55 36.34 46.68 48.43 56.65 66.46 73.21 71.47 64.79 54.46 41.86 35.59 33.34 33.94 44.96 53.73 59.59
[162] 67.65 73.83 71.34 66.08 52.41 43.41 34.15 33.02 35.86 41.50 49.91 58.86 69.62 73.25 73.58 64.93 53.55 45.96 26.52 30.47 30.22 50.61 54.67
[185] 60.02 68.98 74.69 71.73 66.20 56.29 42.04 32.68 33.61 34.56 45.21 51.04 62.01 71.41 73.29 71.56 66.63 53.41 38.18 33.13 25.55 32.06 36.90
[208] 51.67 61.39 67.44 72.90 70.80 63.32 53.90 43.17 33.95 31.32 30.47 39.86 52.42 60.93 68.97 73.67 74.06 64.22 52.36 46.01 34.20 35.39 30.48
[231] 41.93 51.98 62.01 70.11 74.67 72.30 65.09 55.90 44.33 27.68 29.33 36.92 38.18 55.60 58.94 66.71 71.53 70.70 64.97 56.03 44.01 33.96 28.94
[254] 33.80 43.24 51.25 60.07 66.96 74.92 72.44 64.06 52.81 41.06 29.73 29.13 30.89 39.54 49.64 55.92 67.65 75.22 71.63 64.71 51.05 44.44 30.33
[277] 24.69 34.74 46.84 50.03 61.62 71.80 73.02 73.21 62.41 56.86 41.37 35.26 32.62 33.10 41.94 52.28 60.80 70.04 74.97 72.86 66.19 53.27 40.15
[300] 29.55 30.00 34.46 41.38 47.73 60.12 68.07 73.35 71.31 65.90 55.32 40.54 33.93 35.12 37.81 47.64 51.95 60.87 71.08 75.34 72.68 67.20 56.07
[323] 43.45 35.88 27.70 32.84 41.55 51.33 62.02 70.94 73.64 73.45 67.83 55.70 42.78 34.23 35.40 30.48 39.44 50.92 60.27 68.82 74.44 71.79 65.63
[346] 51.97 43.66 36.92 27.06 36.08 38.33 51.25 58.69 69.29 72.51 72.75 63.34 55.69 43.34 28.25 29.82 39.19 45.35 55.67 60.80 70.57 74.76 72.05

Appendix C2: (Continued)
[369] 67.19 49.46 41.36 33.33 31.20 38.59 40.82 50.99 61.84 68.92 74.03 72.98 64.56 55.55 41.40 32.14 32.33 38.71 43.36 52.53 60.47 67.92 73.33
[392] 70.01 65.54 56.58 45.15 29.03 32.77 35.29 44.25 49.02 62.24 66.60 74.03 72.46 63.81 55.73 42.57 33.82 27.17 27.88 44.80 52.15 59.61 68.20
[415] 74.28 73.26 63.99 54.58 39.31 34.69 24.42 41.02 41.52 55.33 60.01 69.14 75.68 73.56 66.31 52.42 42.07 32.09 33.90 38.94 40.55 52.00 60.23
[438] 71.44 76.38 72.65 68.74 57.31 44.36 36.03 32.44 37.58 38.48 52.51 61.34 70.00 74.82 73.25 65.29 53.06 41.67 30.33 35.20 30.56 43.02 50.87
[461] 60.70 72.35 75.80 72.60 68.27 55.91 43.47 37.11 35.97 35.15 43.86 54.71 65.47 71.45 77.08 74.25 64.40 57.22 45.92 33.48 32.27 36.91 44.71
[484] 50.65 58.10 68.36 75.85 73.67 65.34 54.41 40.40 31.66 28.14 26.06 44.58 51.12 64.25 71.34 77.49 75.28 66.88 54.10 40.72 35.33 25.70 32.64
[507] 39.96 51.08 62.74 69.57 75.27 75.33 66.26 54.53 41.73 33.49 32.97 36.91 46.22 52.67 60.62 69.51 74.43 74.44 67.25 57.17 40.88 35.13 34.87
[530] 30.91 43.62 52.45 63.50 69.56 75.50 73.31 67.96 55.45 43.27 38.56 24.55 34.80 42.61 51.13 61.62 70.28 74.78 72.75 65.93 56.84 39.97 36.95
[553] 33.14 34.47 40.38 53.10 63.17 68.62 74.69 72.77 65.16 55.99 43.92 37.05 30.71 31.70 42.55 54.04 59.91 68.62 74.62 72.47 64.27 55.33 44.03
[576] 33.73 30.20 37.18 39.63 53.87 60.48 69.58 74.94 74.18 64.85 54.42 41.84 33.48 32.97 35.49 39.68 49.99 62.82 69.08 73.44 72.79 65.82 56.03
[599] 42.58 31.55 31.28 36.17 46.71 51.17 59.01 66.54 73.59 72.83 65.48 54.83 42.99 29.95 32.28 35.80 47.96 55.59 59.49 69.04 74.46 71.79 65.05
[622] 53.97 42.67 36.99 32.63 32.29 39.83 51.94 60.85 67.43 73.44 74.77 67.09 59.55 39.16 33.80 28.77 32.47 39.87 54.30 61.05 69.49 73.92 72.73
[645] 66.64 53.97 42.68 33.45 26.96 32.84 41.98 53.08 62.59 70.02 74.87 72.81 64.32 54.81 46.23 34.20 31.30 36.45 40.02 49.71 60.29 68.70 71.89
[668] 70.68 64.13 58.15 41.59 33.83 31.11 35.40 39.67 50.72 61.53 67.32 74.27 72.50 64.60 54.74 39.06 32.12 32.85 36.46 38.89 52.65 61.38 71.50
[691] 74.70 73.45 66.45 53.61 40.97 34.30 37.20 36.84 44.85 49.69 59.97 70.87 74.74 72.76 66.67 56.94 44.83 34.58 31.56 42.26 40.38 55.08 59.33
[714] 69.57 76.11 72.83 67.02 55.39 44.96 34.81 30.28 32.00 41.13 53.83 62.12 67.26 75.09 74.53 66.44 55.74 39.09 32.38 31.75 33.08 41.81 50.37
[737] 62.56 71.07 73.73 72.35 65.63 56.89 41.28 36.96 27.90 38.86 42.41 51.68 60.53 69.48 74.98 72.39 64.77 52.42 41.27 37.88 32.47 32.86 38.81
[760] 51.33 63.29 68.72 73.29 73.98 66.07 55.47 43.94 32.91 30.23 33.36 42.61 52.59 61.70 70.71 74.25 73.88 65.30 53.98 39.50 36.77 30.24 31.51

Appendix C2: (Continued)
[783] 36.88 53.57 60.12 69.83 74.52 72.86 66.94 55.45 43.59 31.16 30.39 38.30 44.58 49.59 59.71 70.25 73.88 73.36 64.01 54.53 41.41 31.80 27.65
[806] 35.77 38.96 52.78 63.11 68.69 72.73 72.78 64.71 56.77 44.01 34.12 25.11 34.97 44.86 52.89 61.85 69.49 74.34 72.67 66.91 60.00 44.81 28.46
[829] 32.27 32.70 39.91 52.29 62.33 68.79 75.34 71.37 64.30 54.42 43.20 32.71 32.37 33.29 36.52 52.86 61.70 67.66 73.31 71.75 62.38 55.60 45.39
[852] 36.49 27.07 32.11 44.03 50.96 61.04 68.77 75.76 71.51 65.14 53.52 43.95 33.37 33.74 33.75 44.65 52.02 58.51 68.36 73.25 71.93 64.67 54.58
[875] 42.30 32.87 29.52 33.77 44.66 51.63 59.06 69.28 73.59 71.33 64.49 55.29 41.92 30.71 29.81 33.61 37.11 53.64 62.28 67.64 74.81 73.87 66.13
[898] 51.71 42.41 34.15 27.79 36.18 39.56 50.52 61.96 69.58 74.68 74.04 65.19 52.95 42.06 34.27 30.02 34.30 40.73 51.31 59.25 70.32 73.14 72.77
[921] 64.98 55.80 42.38 34.70 30.53 34.31 45.03 51.80 61.46 69.11 72.92 72.47 65.08 52.97 39.73 30.44 30.26 34.47 45.47 49.87 60.17 69.78 74.02
[944] 73.29 64.99 56.71 43.55 34.33 31.61 35.60 45.43 52.62 61.30 69.32 74.86 70.99 62.88 54.53 43.10 34.32 32.96 33.54 39.79 47.89 61.35 68.36
[967] 74.31 72.26 63.50 55.30 43.35 34.77 30.49 40.36 43.37 52.93 59.98 68.67 73.67 71.46 64.85 50.69 39.14 31.50 23.64 36.86 44.32 55.12 62.65
[990] 71.17 75.19 72.48 66.52 54.58 43.54 33.86 26.45 28.75 42.24 52.62 60.68 69.82 74.58 72.50 66.52 55.05 42.15 30.46 22.57 28.82 43.15 51.17
[1013] 60.11 68.74 73.93 71.72 66.72 55.91 40.70 36.62 31.67 33.58 40.38 52.49 61.24 69.58 76.51 73.54 66.83 53.26 43.03 35.71 33.01 37.49 43.93
[1036] 56.19 59.94 70.68 74.68 72.77 65.52 52.97 45.43 34.20 26.65 33.29 42.98 49.69 61.78 67.12 73.95 72.78 64.66 53.94 41.86 36.47 33.64 37.75
[1059] 43.45 48.08 58.90 67.99 74.79 75.84 66.22 55.63 43.78 25.95 29.35 38.24 41.08 50.66 61.03 69.46 73.91 73.89 63.75 54.33 42.31 35.16 26.80
[1082] 31.65 44.54 55.36 63.01 68.92 74.78 71.70 63.39 54.94 40.14 29.11 35.17 36.64 47.15 53.88 62.05 71.30 74.33 72.38 64.42 54.63 42.03 34.93
[1105] 31.99 38.26 43.56 54.47 63.97 71.09 74.07 72.55 65.80 53.43 44.69 35.24 28.26 34.27 43.43 52.80 62.19 71.73 75.57 74.42 65.21 54.34 44.05
[1128] 33.97 34.43 29.69 43.48 53.79 61.42 69.03 74.94 72.07 64.66 54.96 43.47 28.00 37.17 37.46 45.64 53.47 59.86 71.03 74.06 73.02 67.96 54.65
[1151] 45.59 31.31 29.74 40.41 44.41 53.59 62.95 69.98 74.69 73.73 65.51 55.24 40.05 36.15 34.00 40.44 45.60 53.45 62.05 68.29 72.03 70.47 65.22
[1174] 55.01 40.79 32.26 30.76 31.67 42.48 50.89 61.86 68.16 73.02 72.68 63.86 53.57 39.74 34.76 29.84 32.56 45.38 53.65 61.67 71.70 74.29 72.88

Appendix C2: (Continued)
[1197] 66.07 54.87 43.42 36.74 33.58 37.95 44.33 50.45 59.62 68.50 74.36 74.98 65.28 55.04 42.52 33.99 30.22 36.21 39.83 51.63 61.87 70.62 74.28
[1220] 73.03 64.13 54.28 40.11 34.49 30.48 36.91 45.48 48.96 60.28 69.27 74.02 72.48 67.13 54.55 41.46 34.51 35.53 39.75 42.13 52.03 63.51 69.03
[1243] 76.25 74.70 69.63 55.76 45.58 36.10 34.29 40.36 43.68 52.55 61.07 69.23 75.17 73.75 64.88 55.06 48.46 36.59 34.11 40.47 46.89 53.41 64.03
[1266] 69.83 74.62 74.75 66.18 55.66 38.49 28.59 31.63 34.49 42.15 53.98 63.63 70.05 75.32 74.93 66.17 54.89 47.98 36.67 35.03 36.95 40.20 54.33
[1289] 60.13 71.67 76.62 73.54 67.49 52.31 42.33 35.65 33.01 33.53 44.05 52.84 61.88 68.89 76.21 75.44 65.51 57.03 42.92 35.91 30.45 34.07 48.05
[1312] 53.57 62.88 68.93 73.69 70.95 66.40 56.02 44.54 35.42 33.45 38.45 43.10 53.18 60.47 70.13 76.02 73.98 67.98 55.93 45.46 32.91 39.53 35.35
[1335] 43.45 56.12 63.12 71.55 77.22 74.19 63.86 53.13 44.58 36.79 31.46 32.86

Appendix D: Monthly CO₂ Emission 1981-2003
[1] 125.4146 105.9197 107.1396 94.9415 95.2932 96.8175 101.4611 99.1624 94.7263 100.2223 99.5731 115.2323 120.7643 104.9733 104.5716
[16] 96.7353 89.5450 88.4151 93.3538 93.7069 88.7784 91.4506 95.8438 103.4164 109.8079 95.9954 102.0473 92.6503 89.7150 90.0224
[31] 96.1035 99.8732 92.1287 91.9029 95.5590 111.6509 123.0677 105.0796 109.5487 99.4655 97.1149 96.6004 98.8165 102.1989 92.3669
[46] 96.5541 99.2773 107.1774 118.5610 109.6339 104.9004 97.7826 95.9263 93.8377 99.2186 100.9150 91.7477 96.6222 97.1792 116.6590
[61] 119.0228 106.0296 107.9398 96.3769 96.1893 96.1216 102.9323 99.8575 92.0380 95.7214 97.5536 112.3446 119.0369 107.0488 107.1302
[76] 100.2641 98.5374 101.2423 107.7090 105.7530 97.7900 102.1821 101.9277 117.4361 125.3272 117.3372 116.2031 102.3907 100.7750 104.1288
[91] 108.4163 112.9515 100.5518 105.7330 107.9122 122.1634 121.3611 116.2543 120.0471 105.3687 103.7629 105.2683 107.0433 109.7045 101.6754
[106] 106.8046 109.6547 132.4097 122.7457 109.3402 114.3499 106.5081 105.7040 106.4926 110.1887 114.3188 103.9005 108.1243 106.9206 118.8626
[121] 126.0021 106.8170 110.8317 101.5886 102.7917 103.2419 109.4145 110.2427 102.5933 107.0639 110.2982 124.6845 123.6433 112.3913 115.6407
[136] 107.3133 104.4047 102.8995 111.4495 108.3450 103.9191 107.7787 109.6073 124.7419 122.3291 116.2767 123.0597 107.9980 103.0110 106.5694
[151] 115.2173 115.0036 107.1554 110.2313 114.9989 127.2304 136.3117 123.5147 121.0627 110.0240 108.2796 113.2922 115.1996 117.1827 107.7320
[166] 110.8785 111.2215 125.6154 128.5581 119.5553 120.9017 109.9400 110.1609 111.9193 117.5974 123.0613 110.9679 112.1697 118.6156 132.1565
[181] 137.3295 127.4181 127.3773 114.5245 114.7983 114.7599 120.0355 122.1473 111.5481 119.6606 123.3725 132.9810 139.4169 122.3559 124.9457
[196] 117.3911 116.4253 116.2448 126.3987 123.1898 116.6263 121.5586 123.3120 137.4636 136.2017 121.1348 128.8153 118.6826 118.2228 121.3047
[211] 129.6144 129.3553 120.1381 120.3204 119.3293 134.2790 139.6680 122.5791 131.1086 120.0453 118.1919 121.0179 130.0581 130.0953 119.2973
[226] 122.7435 120.5918 138.3181 140.7672 132.3686 129.4892 118.8712 123.8231 125.0370 128.5651 134.8708 123.4820 126.2876 128.2345 149.5021
[241] 147.4769 128.7643 135.0432 121.4761 121.8473 121.3621 131.0966 133.6699 118.2129 122.7140 119.8921 130.9794 138.8554 123.7673 132.4239

Appendix D: (Continued)
[256] 122.5456 122.7795 124.5207 133.8627 133.4808 123.0272 125.6926 128.2429 141.2141 147.6298 134.1716 133.6979 121.0047 120.4789 120.7394
[271] 132.4187 135.1314 121.7753 125.2487 126.2127 143.1509

Appendix E: Monthly CO₂ in the Atmosphere 1965-2004
[1] 319.44 320.44 320.89 322.13 322.16 321.87 321.21 318.87 317.81 317.30 318.87 319.42 320.62 321.59 322.39 323.70 324.07 323.75 322.40 320.37 318.64
[22] 318.10 319.79 321.03 322.33 322.50 323.04 324.42 325.00 324.09 322.55 320.92 319.26 319.39 320.72 321.96 322.57 323.15 323.89 325.02 325.57 325.36
[43] 324.14 322.11 320.33 320.25 321.32 322.90 324.00 324.42 325.64 326.66 327.38 326.70 325.89 323.67 322.38 321.78 322.85 324.12 325.06 325.98 326.93
[64] 328.13 328.07 327.66 326.35 324.69 323.10 323.07 324.01 325.13 326.17 326.68 327.18 327.78 328.92 328.57 327.37 325.43 323.36 323.56 324.80 326.01
[85] 326.77 327.63 327.75 329.72 330.07 329.09 328.05 326.32 324.84 325.20 326.50 327.55 328.54 329.56 330.30 331.50 332.48 332.07 330.87 329.31 327.51
[106] 327.18 328.16 328.64 329.35 330.71 331.48 332.65 333.09 332.25 331.18 329.40 327.44 327.37 328.46 329.58 330.40 331.41 332.04 333.31 333.96 333.59
[127] 331.91 330.06 328.56 328.34 329.49 330.76 331.74 332.56 333.50 334.58 334.87 334.34 333.05 330.94 329.30 328.94 330.31 331.68 332.92 333.42 334.70
[148] 336.07 336.74 336.27 334.93 332.75 331.58 331.16 332.40 333.85 334.97 335.39 336.64 337.76 338.01 337.89 336.54 334.68 332.76 332.54 333.92 334.95
[169] 336.23 336.76 337.96 338.89 339.47 339.29 337.73 336.09 333.91 333.86 335.29 336.73 338.01 338.36 340.08 340.77 341.46 341.17 339.56 337.60 335.88
[190] 336.01 337.10 338.21 339.23 340.47 341.38 342.51 342.91 342.25 340.49 338.43 336.69 336.85 338.36 339.61 340.75 341.61 342.70 343.56 344.13 343.35
[211] 342.06 339.82 337.97 337.86 339.26 340.49 341.37 342.52 343.10 344.94 345.75 345.32 343.99 342.39 339.86 339.99 341.16 342.99 343.70 344.51 345.28
[232] 347.08 347.43 346.79 345.40 343.28 341.07 341.35 342.98 344.22 344.97 346.00 347.43 348.35 348.93 348.25 346.56 344.69 343.09 342.80 344.24 345.56
[253] 346.29 346.96 347.86 349.55 350.21 349.54 347.94 345.91 344.86 344.17 345.66 346.90 348.02 348.47 349.42 350.99 351.84 351.25 349.52 348.10 346.44
[274] 346.36 347.81 348.96 350.43 351.72 352.22 353.59 354.22 353.79 352.39 350.44 348.72 348.88 350.07 351.34 352.76 353.07 353.68 355.42 355.67 355.13
[295] 353.90 351.67 349.80 349.99 351.30 352.53 353.66 354.70 355.39 356.20 357.16 356.22 354.82 352.91 350.96 351.18 352.83 354.21 354.72 355.75 357.16
[316] 358.60 359.34 358.24 356.17 354.03 352.16 352.21 353.75 354.99 355.98 356.72 357.81 359.15 359.66 359.25 357.03 355.00 353.01 353.31 354.16 355.40
[337] 356.70 357.16 358.38 359.46 360.28 359.60 357.57 355.52 353.70 353.98 355.33 356.80 358.36 358.91 359.97 361.26 361.68 360.95 359.55 357.49 355.84

Appendix E: (Continued)
[358] 355.99 357.58 359.04 359.96 361.00 361.64 363.45 363.79 363.26 361.90 359.46 358.06 357.75 359.56 360.70 362.05 363.25 364.03 364.72 365.41 364.97
[379] 363.65 361.49 359.46 359.60 360.76 362.33 363.18 364.00 364.57 366.35 366.79 365.62 364.47 362.51 360.19 360.77 362.43 364.28 365.32 366.15 367.31
[400] 368.61 369.29 368.87 367.64 365.77 363.90 364.23 365.46 366.97 368.15 368.87 369.59 371.14 371.00 370.35 369.27 366.94 364.63 365.12 366.67 368.01
[421] 369.14 369.46 370.52 371.66 371.82 371.70 370.12 368.12 366.62 366.73 368.29 369.53 370.28 371.50 372.12 372.87 374.02 373.30 371.62 369.55 367.96
[442] 368.09 369.68 371.24 372.43 373.09 373.52 374.86 375.55 375.40 374.02 371.49 370.71 370.24 372.08 373.78 374.68 375.63 376.11 377.65 378.35 378.13
[463] 376.62 374.50 372.99 373.00 374.35 375.70 376.79 377.37 378.41 380.52 380.63 379.57 377.79 375.86 374.06 374.24 375.86 377.48