Abstract
Reconstructions of historical sea level in the Arctic Ocean are fraught with difficulties related to a lack of data, the uneven distribution of tide gauges, and seasonal ice cover. Considering the period from 1950 to the present, we attempt to identify conspicuous tide gauges in an automated way, using the statistical leverage of each individual gauge. This may help in determining appropriate data-preprocessing procedures, which is of particular importance for the Arctic area because the glacial isostatic adjustment (GIA) is hard to constrain and many gauges are located on rivers. We use a reconstruction model based on empirical orthogonal functions (EOFs) from a calibration period, in this preliminary case Drakkar ocean model data, which is then forced with historical tide gauge data from the PSMSL database. A high leverage for a tide gauge may indicate either that it represents a distinct mode of variability or that its time series is perturbed in a way that is inappropriate for the reconstruction, in which case it should be removed from the reconstruction model altogether. The characteristics of the high-leverage gauges are therefore examined in detail.
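The leverage screening described above can be illustrated with a minimal sketch; this is not the authors' code, and it assumes the reconstruction is a least-squares fit of EOF loadings (taken from the calibration model) to the gauge records, so that each gauge's leverage is the corresponding diagonal element of the hat matrix. The variable names (`cal`, `gauge_cols`, `n_modes`) are illustrative and do not appear in the original abstract.

```python
# Minimal sketch, assuming: `cal` is a calibration field of shape (n_times, n_grid),
# e.g. modelled sea level on a grid, and `gauge_cols` indexes the grid column
# nearest each tide gauge. All names here are illustrative assumptions.
import numpy as np

def gauge_leverage(cal, gauge_cols, n_modes=10):
    """Leverage of each tide gauge in an EOF-based reconstruction.

    EOFs come from the calibration field; the design matrix G holds the EOF
    loadings at the gauge locations, and the leverage of gauge i is the i-th
    diagonal element of the hat matrix H = G (G^T G)^{-1} G^T.
    """
    anomalies = cal - cal.mean(axis=0)           # remove the time mean
    # SVD of the calibration anomalies: rows of vt are spatial EOFs
    _, _, vt = np.linalg.svd(anomalies, full_matrices=False)
    eofs = vt[:n_modes].T                        # (n_grid, n_modes)
    G = eofs[gauge_cols, :]                      # EOF loadings at the gauges
    H = G @ np.linalg.pinv(G.T @ G) @ G.T        # least-squares projector
    return np.diag(H)                            # one leverage value per gauge
```

Under these assumptions, a gauge whose leverage approaches 1 dominates the fit of some mode: it either captures a genuinely distinct mode of variability or carries a perturbed record that should be screened before the reconstruction, which is the distinction the abstract sets out to examine.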
Original language | English |
---|---|
Publication date | 2014 |
Number of pages | 1 |
Publication status | Published - 2014 |
Event | European Geosciences Union General Assembly 2014, Vienna, Austria. Duration: 27 Apr 2014 → 2 May 2014 |
Conference
Conference | European Geosciences Union General Assembly 2014 |
---|---|
Country/Territory | Austria |
City | Vienna |
Period | 27/04/2014 → 02/05/2014 |