Global Warming Prediction Project
About the Data
14.01.2010
Viewed from systems theory, the Earth's climate is a complex system. Developing predictive models for such complex systems from theory alone is difficult, time-consuming, and problematic, because the a priori knowledge about the system required to formulate the model is usually incomplete. Therefore, based on the well-accepted assumption that observational data of a system contain hidden information about the system's behavior, modeling technologies based on knowledge extraction from noisy data have become a substantial alternative, and they are increasingly important in many fields.
Data are thus a potential source of knowledge, and the quality of the data (representativity, reliability, consistency) plays an important role. The basic raw data used in this project are taken from the site of the Climatic Research Unit of the University of East Anglia and are described as follows:
"Over land regions of the world over 3000 monthly station temperature time series are used. Coverage is denser over the more populated parts of the world, particularly, the United States, southern Canada, Europe and Japan. Coverage is sparsest over the interior of the South American and African continents and over the Antarctic. The number of available stations was small during the 1850s, but increases to over 3000 stations during the 1951-90 period. For marine regions sea surface temperature (SST) measurements taken on board merchant and some naval vessels are used. As the majority come from the voluntary observing fleet, coverage is reduced away from the main shipping lanes and is minimal over the Southern Oceans. Maps/tables giving the density of coverage through time are given for land regions by Jones and Moberg (2003) and for the oceans by Rayner et al. (2003). Both these sources also extensively discuss the issue of consistency and homogeneity of the measurements through time and the steps that have been made to ensure all non-climatic inhomogeneities have been removed."
(source)
In climate modeling, it is common practice to use temperature anomalies instead of absolute temperature values. Anomalies are calculated relative to the average temperatures of the 1961-90 reference period. This approach has several reasons and advantages:
"Stations on land are at different elevations, and different countries estimate average monthly temperatures using different methods and formulae. To avoid biases that could result from these problems, monthly average temperatures are reduced to anomalies from the period with best coverage (1961-90). For stations to be used, an estimate of the base period average must be calculated. Because many stations do not have complete records for the 1961-90 period several methods have been developed to estimate 1961-90 averages from neighbouring records or using other sources of data. Over the oceans, where observations are generally made from mobile platforms, it is impossible to assemble long series of actual temperatures for fixed points. However it is possible to interpolate historical data to create spatially complete reference climatologies (averages for 1961-90) so that individual observations can be compared with a local normal for the given day of the year."
(source)
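The anomaly calculation described above can be sketched in a few lines. The sketch below assumes a simplified in-memory record format, a dictionary mapping (year, month) to a monthly mean temperature; the function names and the toy data are illustrative only and are not the CRU processing pipeline, which additionally handles missing base-period data and spatial interpolation.

```python
def monthly_baseline(records, base_start=1961, base_end=1990):
    """Average temperature per calendar month over the base period."""
    sums = {m: 0.0 for m in range(1, 13)}
    counts = {m: 0 for m in range(1, 13)}
    for (year, month), temp in records.items():
        if base_start <= year <= base_end:
            sums[month] += temp
            counts[month] += 1
    # Only months with at least one base-period observation get a normal.
    return {m: sums[m] / counts[m] for m in range(1, 13) if counts[m]}

def anomalies(records, baseline):
    """Each observation minus the monthly normal for its calendar month."""
    return {
        (year, month): temp - baseline[month]
        for (year, month), temp in records.items()
        if month in baseline
    }

# Toy example: three January values; the 1961-90 January normal is 1.0 degC,
# so the 2009 observation of 2.3 degC becomes an anomaly of +1.3 degC.
records = {(1961, 1): 0.5, (1990, 1): 1.5, (2009, 1): 2.3}
base = monthly_baseline(records)
anom = anomalies(records, base)
```

Computing the baseline per calendar month, rather than one overall mean, is what removes the seasonal cycle and station-elevation offsets the quoted passage describes.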
The objective of this project is the monthly modeling and prediction of global temperature anomalies through self-organizing knowledge extraction from public data. The project is impartial and has no hidden personal, financial, political, or other interests. It is entirely independent and transparent, and its results are open.