The aim of this thesis is the application of GPS technology to the investigation of crustal deformations, in order to exploit data from a GPS network to improve the knowledge of seismic phenomena. We attack this problem at different levels, from weekly GPS data processing to normal equation stacking, in order to compute the velocity field and the deformation field of the crust. As a EUREF LAC (Local Analysis Centre) we carry out the weekly processing following the EUREF guidelines: we process data from a cluster of 34 stations. Besides this, we process data from another (non-EUREF) cluster of stations deployed in the Italian and Austrian regions. The aim of this second processing activity is to densify the EUREF network using stations whose quality is comparable to that of the EUREF stations. For both networks the raw data consist of daily phase and code measurements, sampled at 30 seconds and stored as RINEX observation files. We download these data every week, using Perl scripts to automate the procedure. The data processing is also done automatically, through Bernese 5 and the Bernese Processing Engine. To produce daily normal equations and coordinate estimates we use the processing strategy defined by the RNX2SNX PCF. This Process Control File consists of a list of statements activating the Bernese programs; the main features of this procedure are 1) rejection of RINEX files containing large gaps or large residuals, 2) ambiguity resolution through the quasi-ionosphere-free strategy, 3) alignment of the network to ITRF2000 (now ITRF2005) using a set of fiducial stations. Weekly coordinates are estimated by stacking the daily normal equations; daily tropospheric delays are estimated by fixing the coordinates of each site to their weekly mean values. The velocity field is computed by a multi-year solution. At this level we must ensure that all discontinuities and outliers are eliminated from the time series of each station.
This is done by editing a Bernese input file (.STA) which stores all the information needed to remove discontinuities and reject outliers. Once continuous EUREF time series have been obtained, we densify the velocity field using the information stored in the normal equation files of the Italian and Austrian networks. In order to estimate a reliable velocity field we solve the problem of multi-year normal equation stacking. First of all, we stack the EUREF weekly normal equations; to do so it is necessary to screen the time series of each site in order to find all the offsets and outliers. The information collected is stored in an input file (the .STA file) for the Bernese 5 stacking programme ADDNEQ2. The stacking of the EUREF normal equations (neq's) is essential for the definition of the reference frame. The second issue is the densification of the EUREF velocity field. This step is essential because the EUREF network is a cartographic network, so its data alone are not suitable for crustal deformation studies. But this network can be densified using the data belonging to other local GPS networks through the stacking of their normal equations. In this view, each weekly (local) normal equation must be stacked with the related EUREF weekly normal equation, and the resulting neq's must then be stacked in a multi-year solution. We use weekly neq's belonging to two networks, namely UPA and GP, to densify the velocity field in the Italian and Austrian areas. During the EUREF multi-year stacking the programme ADDNEQ2 showed poor speed performance and repeatedly crashed, so we decided to use faster and more stable software. We moved to CATREF (Combination and Analysis of Terrestrial Reference Frames, see <cite>altamimiitrf2000</cite>, <cite>altamimiitrf2005</cite> and <cite>noquet</cite>).
The stacking strategy we used to carry out the multi-year densified solution consists of 4 steps: 1. removal of the a-priori constraints from the EUREF, UPA and GP weekly neq's, and imposition of new constraints; 2. a preliminary multi-year solution for each network, in order to find offsets and outliers; 3. stacking of each UPA and GP neq with the related EUREF neq; 4. multi-year stacking of the resulting neq's. Step 1 is very important to define a reference frame which is common to each neq involved in the stacking procedure. It is well known that the neq's deriving from GPS observables are not of full rank: the rank deficit is 7 and is strictly related to the poor definition of the reference frame given by the GPS observables themselves, since they are not sensitive to changes due to translations and rotations and are also scale invariant. So the information used to define the reference frame must be introduced by means of pseudo-observation equations called constraints. Several kinds of constraints can be imposed: some of them act on the coordinates and velocities of the sites, others on the translation, rotation and scale parameters. CATREF is based on the second kind of constraints (the so-called minimum constraints). Offsets detected in the time series are treated through a piecewise approach: two sets of coordinates and velocities are estimated for the same site, using the data before and after the offset epoch and constraining the velocities (before and after) to the same value. The weighted RMS (WRMS) of the weekly solution versus the stacked solution is used to check the correctness of the constraint and outlier removal. Following A. Kenyeres <footnote>http://www.epncb.oma.be/_organisation/projects/series_sp/cumulative_solution.php</footnote> we can say that a WRMS of 1 cm or greater reveals the presence of outliers, and that a very small WRMS value (under 1 mm for the vertical residuals) reveals a wrong constraint removal.
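The minimum-constraint idea can be sketched as follows: given approximate site coordinates, one builds the design matrix A of the 7 Helmert parameters (3 translations, scale, 3 rotations) and adds the pseudo-observation contribution B^T B / sigma^2, with B = (A^T A)^{-1} A^T, to the normal matrix. The snippet below is a minimal illustration in Python/NumPy, not the CATREF implementation; the rotation sign convention, the coordinate units (1000 km, to keep the system well conditioned) and the constraint sigma are assumptions.

```python
import numpy as np

def helmert_design(coords):
    """Design matrix A of the 7 Helmert parameters (Tx, Ty, Tz, scale, Rx, Ry, Rz)
    evaluated at approximate site coordinates, shape (n, 3).
    The small-rotation sign convention below is an assumption (IERS-style)."""
    rows = []
    for x, y, z in coords:
        rows.append([1, 0, 0, x,  0,  z, -y])
        rows.append([0, 1, 0, y, -z,  0,  x])
        rows.append([0, 0, 1, z,  y, -x,  0])
    return np.array(rows, dtype=float)

def minimum_constraint_matrix(coords, sigma=1e-4):
    """Pseudo-observation contribution B^T B / sigma^2 to be added to the
    normal matrix, with B = (A^T A)^{-1} A^T; sigma is a hypothetical
    (tight) constraint sigma."""
    A = helmert_design(np.asarray(coords, dtype=float))
    B = np.linalg.solve(A.T @ A, A.T)   # shape (7, 3n)
    return (B.T @ B) / sigma**2         # shape (3n, 3n), rank 7
```

Because B projects a coordinate perturbation onto the 7 Helmert parameters, a rigid displacement of the whole network (e.g. a pure translation) is mapped exactly onto the corresponding parameters, which is what the constraint then pins down.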
In our case we have WRMS_{EUREF} ∈ [2.0, 3.5] mm, WRMS_{UPA} ∈ [2.0, 4.5] mm and WRMS_{GP} ∈ [2.0, 5.0] mm. The WRMS of the vertical residuals is greater than 2 mm for all three multi-year solutions. We expected the WRMS of the UPA and GP stackings to be greater than the EUREF one, because the raw data coming from UPA and GP are not checked as strictly as the EUREF data. Moreover, the plots of the Helmert parameters related to the UPA and GP neq stacking reveal the inadequacy of the reference frame definition, especially for the UPA neq's. However, the weekly stacking of the EUREF, UPA and GP neq's seems to compensate for this fact. The combined reference frame is aligned to ITRF05 by minimum constraints imposed on the following subset of sites: <K1.1/>. <K1.1 ilk="TABLE" > BRUS A 13101M004, JOZE A 12204M001, POTS A 14106M003, TRAB A 20808M001, GLSV A 12356M001, MAS1 A 31303M002, ZIMM A 14001M004, TRO1 A 10302M006, GRAS A 10002M006, METS A 10503S011, RABT A 35001M002, VILL A 13406M001, HOFN A 10204M002, NOT1 A 12717M004 </K1.1> A further check of the correctness of the alignment of the combined frame to ITRF05 is performed by computing the velocity residuals on a subset of sites common to the combined frame and ITRF05. There are 29 common sites, and the averaged velocity residuals in the East and North directions are: <K1.1/> <K1.1 ilk="TABLE" > East: -0.11 mm/yr, North: 0.07 mm/yr </K1.1> The total number of sites contained in the EUREF, UPA and GP multi-year solution is 247. ETRF velocities are computed using the transformation parameters reported in <cite>boucheraltamimi</cite>. Later we processed 9 annual and biennial measurement campaigns related to the CEGRN network (time span 1994-2007), using reprocessed satellite orbits and EOP files (see <cite>steigenberger</cite>).
The related neq's, stacked with the corresponding neq's computed by the other analysis centres participating in the CEGRN project, were then stacked with the EUREF, UPA and GP neq's, resulting in a multi-year solution which contains 296 sites. From this velocity field we can infer that the central European area is rigid (it shows horizontal ETRF velocity values smaller than 1 mm/yr) and that the Mediterranean area is characterized by ETRF velocity values greater than 2 mm/yr. Moreover, we verified that the effect of the annual term, for a time span greater than 2.5 years, is negligible (see <cite>blewittlavalle</cite>): the averaged effect is 0.01 mm/yr and 0.06 mm/yr for the East and North components of the horizontal velocity. The stacking strategy (with CATREF) is fully explained in chapter 9, where offset tables, Helmert parameters and WRMS plots are given. The velocity field was used to infer the strain rate. The horizontal strain-rate tensor can be expressed as a linear combination of the partial derivatives of the velocity field with respect to the East and North directions: \dot{\varepsilon}_{ee} = \frac{\partial v_e}{\partial e}, \quad \dot{\varepsilon}_{en} = \frac{1}{2}\left(\frac{\partial v_n}{\partial e} + \frac{\partial v_e}{\partial n}\right), \quad \dot{\varepsilon}_{nn} = \frac{\partial v_n}{\partial n}. The horizontal velocity field can be approximated analytically by means of least squares collocation, which is based on a knowledge of the statistical properties of the field. The interpolation formula can be written as (see <cite>moritz</cite>): s = C_{st}(C_{tt} + C_{nn})^{-1} l, where s is the vector of the values of the field at the interpolation grid points, l is the vector of the observed (centred) values of the field, C_{nn} is the noise covariance matrix of the measurements (multiplied by a factor 10 to compensate for the effects of random and flicker noise), C_{tt} is the covariance matrix of the observed values of the field and C_{st} is the cross-covariance matrix between the interpolation points and the observation points.
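Once the velocity field has been interpolated onto a regular grid, the three strain-rate components above can be evaluated numerically by finite differences. A minimal Python/NumPy sketch, with hypothetical grid spacings and illustrative function names, is:

```python
import numpy as np

def strain_rate(ve, vn, de, dn):
    """Horizontal strain-rate components from gridded velocities.

    ve, vn : 2-D arrays of East/North velocity on a regular grid
             (rows along North, columns along East)
    de, dn : grid spacing in the East and North directions
    """
    # np.gradient returns derivatives along axis 0 (North) and axis 1 (East)
    dve_dn, dve_de = np.gradient(ve, dn, de)
    dvn_dn, dvn_de = np.gradient(vn, dn, de)
    eps_ee = dve_de                       # d v_e / d e
    eps_en = 0.5 * (dvn_de + dve_dn)      # (1/2)(d v_n / d e + d v_e / d n)
    eps_nn = dvn_dn                       # d v_n / d n
    return eps_ee, eps_en, eps_nn
```

With velocities in mm/yr and spacings in mm the result is dimensionless strain per year; in practice one converts to nstrain/yr as quoted in the text.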
If we suppose the velocity field is isotropic and homogeneous, the elements of the covariance and cross-covariance matrices can be represented by the Cauchy function: C_{ij}^{e,n} = \frac{w_{e,n}}{1 + (d_{ij}/d_0)^2}, where d_{ij} is the spherical distance between sites i and j, w_{e,n} are the variances of the East and North components of the velocity field and d_0 is the correlation length. The w_{e,n} values were computed through the empirical variance formula, and the d_0 value was estimated by fitting the (normalized) Cauchy function to the empirical correlogram. We found d_0 = 105 km. This value is one third smaller than the values reported by <cite>altamimilegrand</cite>, <cite>kahlestraub</cite>, <cite>kahlecocard1</cite> and <cite>kahlecocard2</cite>. The strain-rate tensor is built using the partial derivatives of the analytical approximation given by the least squares collocation formula, computed with respect to the East and North directions. To check the correctness of the algorithm we compare our strain-rate values with those estimated by <cite>serpe1</cite>. The South-Eastern Alps are characterized by a deformation rate of 10÷15 nstrain/yr, while Central Italy shows an extension rate of 25 nstrain/yr. Sardinia and Corsica do not show significant deformation rates. The agreement between our results and <cite>serpe1</cite> is better where our velocity field is denser. This agreement is important because our processing and stacking software, our GPS network configuration and our strain-rate computation algorithm differ from those of <cite>serpe1</cite>. Finally, this agreement suggests that the value of the correlation length we found is correct. Since we did not have the strain-rate computation software, we coded the least squares collocation algorithm into a Python module (see appendix A.8).
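Putting the interpolation formula and the Cauchy covariance together, a minimal Python/NumPy sketch of the collocation interpolator is given below. It is an illustration, not the thesis module of appendix A.8: planar distances replace spherical ones for brevity, and the function and argument names are hypothetical.

```python
import numpy as np

def cauchy_cov(d, w, d0=105.0):
    # Cauchy covariance model C(d) = w / (1 + (d/d0)^2), with d and d0 in km
    return w / (1.0 + (d / d0) ** 2)

def collocate(obs_xy, obs_val, obs_var, grid_xy, w, d0=105.0):
    """Least squares collocation: s = C_st (C_tt + C_nn)^{-1} l,
    with l the centred observations (the mean is restored at the end)."""
    mean = obs_val.mean()
    l = obs_val - mean
    # pairwise planar distances (a simplification of the spherical distances)
    d_tt = np.linalg.norm(obs_xy[:, None, :] - obs_xy[None, :, :], axis=-1)
    d_st = np.linalg.norm(grid_xy[:, None, :] - obs_xy[None, :, :], axis=-1)
    C_tt = cauchy_cov(d_tt, w, d0)
    C_nn = np.diag(10.0 * obs_var)   # noise variances scaled by 10 (random + flicker noise)
    C_st = cauchy_cov(d_st, w, d0)
    return C_st @ np.linalg.solve(C_tt + C_nn, l) + mean
```

The same routine is called once for the East component (with w = w_e) and once for the North component (with w = w_n); the resulting grids feed the strain-rate partial derivatives.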
Campo di velocità europeo dedotto da misure GPS / Nardo, Andrea. - (2008 Jul 31).