Results 1 - 4 of 4
1.
Opt Express; 29(11): 17011-17022, 2021 May 24.
Article in English | MEDLINE | ID: mdl-34154252

ABSTRACT

Rigorous statistical testing of deformation using a terrestrial laser scanner (TLS) can help prevent events such as structural collapses. Such a procedure necessitates an accurate description of the noise of the TLS measurements, which should include the correlations between angles. Unfortunately, these correlations are often unaccounted for due to a lack of knowledge. This contribution addresses this challenge. We combine (i) a least-squares approximation to extract the geometry of the TLS point cloud, with the aim of analyzing the residuals of the fit, and (ii) a specific filtering coupled with maximum likelihood estimation to quantify the amount of flicker noise versus white noise. As a result, we can set up fully populated variance-covariance matrices of the TLS noise.
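The first step of the pipeline described above — a least-squares fit of the point-cloud geometry whose residuals are then handed to the noise analysis — can be illustrated with a minimal sketch. The plane coefficients, noise level, and point count below are assumed values for illustration, not taken from the paper:

```python
import numpy as np

# Minimal sketch: least-squares fit of a planar TLS point cloud,
# retaining the residuals for subsequent white-vs-flicker noise analysis.
# All numeric values here are illustrative assumptions.
rng = np.random.default_rng(0)

# Simulated scan of a plane z = 0.2*x - 0.1*y + 1.0 plus white noise
n_points = 500
x = rng.uniform(0.0, 10.0, n_points)
y = rng.uniform(0.0, 10.0, n_points)
z = 0.2 * x - 0.1 * y + 1.0 + rng.normal(0.0, 0.01, n_points)

# Design matrix for the plane model z = a*x + b*y + c
A = np.column_stack([x, y, np.ones_like(x)])
params, *_ = np.linalg.lstsq(A, z, rcond=None)

# Residuals of the fit: the input to the noise quantification step
residuals = z - A @ params
```

With a flat surface and uncorrelated noise, `params` recovers the simulated plane closely; in the paper's setting the residuals would instead carry the correlated (flicker) component to be quantified.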

2.
Sensors (Basel); 19(17), 2019 Aug 21.
Article in English | MEDLINE | ID: mdl-31438564

ABSTRACT

B-spline surfaces possess attractive properties such as a high degree of continuity and the local support of their basis functions. One of the major applications of B-spline surfaces in engineering geodesy is the least-squares (LS) fitting of surfaces from, e.g., 3D point clouds obtained with terrestrial laser scanners (TLS). Such mathematical approximations allow one to rigorously test, at a given significance level, the deformation magnitude between point clouds taken at different epochs. Indeed, statistical tests cannot be applied when point clouds are processed in commonly used software such as CloudCompare, which restricts the analysis of deformation to simple deformation maps based on distance computation. For a trustworthy test decision and the resulting risk management, however, the stochastic model of the underlying observations needs to be optimally specified. Since B-spline surface approximations necessitate Cartesian coordinates of the TLS observations, the diagonal variance-covariance matrix (VCM) of the raw TLS measurements has to be transformed by means of the error propagation law. Unfortunately, this procedure induces mathematical correlations which, if neglected, can strongly affect the test statistics chosen to analyse deformation. This may potentially lead to wrongly rejecting the null hypothesis of no deformation, with risky and expensive consequences. In this contribution, we investigate the impact of mathematical correlations on test statistics, using real TLS observations from a bridge under load. Since, besides the TLS, a highly precise laser tracker (LT) was used, the significance of the difference in the test statistics when the stochastic model is misspecified can be assessed. However, the underlying test distribution is hardly tractable, so that only an adapted bootstrapping allows the computation of trustworthy p-values. Subsequently, the extent to which heteroscedasticity and mathematical correlations can be neglected or simplified without impacting the test decision is shown in a rigorous way, paving the way for a simplification based on the intensity model.
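The error-propagation step described above — transforming the diagonal VCM of the raw polar measurements into a Cartesian VCM, thereby inducing mathematical correlations — can be sketched for a single measurement. The measurement values and standard deviations below are assumptions for illustration:

```python
import numpy as np

# Minimal sketch of the error propagation law for one TLS measurement
# (range r, horizontal angle phi, vertical angle theta -> x, y, z).
# The measurement values and standard deviations are assumed.
r, phi, theta = 10.0, 0.3, 1.2
sigma_polar = np.diag(np.array([1e-3, 1e-5, 1e-5]) ** 2)  # diagonal VCM

# Jacobian of the polar-to-Cartesian mapping:
# x = r sin(theta) cos(phi), y = r sin(theta) sin(phi), z = r cos(theta)
J = np.array([
    [np.sin(theta) * np.cos(phi), -r * np.sin(theta) * np.sin(phi), r * np.cos(theta) * np.cos(phi)],
    [np.sin(theta) * np.sin(phi),  r * np.sin(theta) * np.cos(phi), r * np.cos(theta) * np.sin(phi)],
    [np.cos(theta),                0.0,                             -r * np.sin(theta)],
])

# Error propagation law: the Cartesian VCM is fully populated,
# i.e. the off-diagonal (correlation) terms are nonzero.
vcm_xyz = J @ sigma_polar @ J.T
```

The nonzero off-diagonal entries of `vcm_xyz` are exactly the mathematical correlations that, per the abstract, must not be neglected in the test statistics.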

3.
Sensors (Basel); 18(9), 2018 Sep 05.
Article in English | MEDLINE | ID: mdl-30189695

ABSTRACT

For a trustworthy least-squares (LS) solution, a good description of the stochastic properties of the measurements is indispensable. For a terrestrial laser scanner (TLS), the range variance can be described by a power law function of the intensity of the reflected signal. The power and scaling factors depend on the laser scanner under consideration and can be accurately determined by means of calibrations in 1D mode or by residual analysis of an LS adjustment. However, such procedures significantly complicate the use of empirical intensity models (IM). Moreover, the extent to which point-wise weighting is suitable when the derived variance-covariance matrix (VCM) is further used in an LS adjustment remains questionable. Thanks to closed-loop simulations, where both the true geometry and the stochastic model are under control, we investigate how variations in the parameters of the IM affect the results of an LS adjustment. As a case study, we consider the determination of the Cartesian coordinates of the control points (CP) of a B-spline curve. We show that a constant variance can be assigned to all the points of an object having homogeneous properties without affecting the a posteriori variance factor or the efficiency of the LS solution. The results from a real case scenario highlight that the conclusions of the simulations remain valid even for more challenging geometries. A procedure to determine the range variance is proposed to simplify the computation of the VCM.
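The power-law intensity model and the constant-variance simplification discussed above can be contrasted in a short sketch. The power and scaling factors and the intensity values below are illustrative assumptions, not calibrated values for any particular scanner:

```python
import numpy as np

# Minimal sketch of an empirical intensity model (IM): the range variance
# follows a power law of the reflected intensity, var_r = a * I**b.
# The factors a, b and the intensities are assumed, not calibrated.
a, b = 2.5e-5, -0.7
intensity = np.array([0.2, 0.5, 0.8, 1.0])

# Point-wise weighting: a diagonal VCM built from the intensity model
var_range = a * intensity ** b
vcm_pointwise = np.diag(var_range)

# Simplification suggested by the study: for a surface with homogeneous
# properties, a single constant variance can replace the point-wise values
vcm_constant = np.diag(np.full_like(var_range, var_range.mean()))
```

With a negative power `b`, weaker returns get larger range variances; the constant-variance VCM keeps the overall noise level while discarding the point-wise differentiation.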

4.
J Geod; 97(2): 14, 2023.
Article in English | MEDLINE | ID: mdl-36760754

ABSTRACT

Global navigation satellite system (GNSS) daily position time series are often described as the sum of stochastic processes and geophysical signals, which allows the study of global and local geodynamical effects such as plate tectonics, earthquakes, or groundwater variations. In this work, we propose to extend the Generalized Method of Wavelet Moments (GMWM) to estimate the parameters of linear models with correlated residuals. This statistical inferential framework is applied to GNSS daily position time series to jointly estimate functional (geophysical) as well as stochastic noise models. Our method is called GMWMX, with X standing for eXogenous variables: it is semi-parametric, computationally efficient, and scalable. Unlike standard methods such as the widely used maximum likelihood estimator (MLE), our methodology offers statistical guarantees, such as consistency and asymptotic normality, without relying on strong parametric assumptions. Under the Gaussian model, our theoretical and simulation results show that the estimated parameters are similar to those obtained with the MLE. The computational performance of our approach has important practical implications. Indeed, the estimation of the parameters of large networks of thousands of GNSS stations (some of them recorded over several decades) quickly becomes computationally prohibitive. Compared to standard likelihood-based methods, the GMWMX has a considerably reduced algorithmic complexity of order O(n log(n)) for a time series of length n. Thus, the GMWMX provides a reduction in processing time by a factor of 10-1000 compared to likelihood-based methods, depending on the considered stochastic model, the length of the time series, and the amount of missing data. As a consequence, the proposed method allows the estimation of large-scale problems within minutes on a standard computer. We validate the performance of our method via Monte Carlo simulations by generating GNSS daily position time series with missing observations, considering composite stochastic noise models that include long-range-dependent processes such as power-law or Matérn processes. The advantages of our method are also illustrated using real time series from GNSS stations located in the eastern USA.
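The moment used by GMWM-type estimators is the wavelet variance of the (detrended) series, typically at Haar scales, which is then matched against the model-implied values. A minimal sketch of an empirical Haar wavelet variance is shown below; the estimator definition (difference of adjacent block means, halved mean square) and the white-noise test signal are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

# Minimal sketch: empirical Haar wavelet variance, the moment matched by
# GMWM-type estimators. This simplified overlapping estimator is an
# illustrative assumption, not the GMWMX implementation.
def haar_wavelet_variance(x, scale):
    """Haar wavelet variance: half the mean squared difference of
    adjacent block means of length `scale`."""
    n = len(x)
    coeffs = [
        np.mean(x[t:t + scale]) - np.mean(x[t + scale:t + 2 * scale])
        for t in range(n - 2 * scale + 1)
    ]
    return np.mean(np.square(coeffs)) / 2.0

rng = np.random.default_rng(1)
white = rng.normal(0.0, 1.0, 4096)  # white noise with unit variance

# For white noise this estimator decays as sigma^2 / scale, so plotting it
# against scale separates white noise from long-range-dependent processes.
wv = {s: haar_wavelet_variance(white, s) for s in (1, 2, 4, 8)}
```

Long-memory processes such as power-law noise decay more slowly across scales, which is what lets the method jointly identify composite noise models.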
