Results 1 - 20 of 159
1.
Stat Appl Genet Mol Biol ; 23(1)2024 Jan 01.
Article in English | MEDLINE | ID: mdl-38736398

ABSTRACT

Longitudinal time-to-event analysis is a statistical method for analyzing data in which covariates are measured repeatedly. In survival studies, the risk of an event is estimated using the Cox proportional hazards model or, for exogenous time-dependent covariates, the extended Cox model. However, these models are inappropriate for endogenous time-dependent covariates such as longitudinally measured biomarkers, e.g., carcinoembryonic antigen (CEA). Joint models that simultaneously model the longitudinal covariates and the time-to-event data have been proposed as an alternative. The present study highlights the importance of choosing the baseline hazard to obtain more accurate risk estimates. The study used colon cancer patient data to illustrate and compare four joint models that differ in the choice of baseline hazard: piecewise-constant Gauss-Hermite (GH), piecewise-constant pseudo-adaptive GH, Weibull accelerated failure time with GH, and B-spline GH. We conducted a simulation study to assess model consistency with varying sample sizes (N = 100, 250, 500) and censoring proportions (20%, 50%, 70%). In the colon cancer patient data, based on the Akaike information criterion (AIC) and Bayesian information criterion (BIC), the piecewise-constant pseudo-adaptive GH model was the best-fitting model. Despite differences in model fit, the hazards obtained from the four models were similar. The study identified composite stage as a prognostic factor for time-to-event and the longitudinal outcome CEA as a dynamic predictor of overall survival in colon cancer patients. In the simulation study, the piecewise-constant pseudo-adaptive GH model (Piecewise-PH-aGH) was also the best model, with the lowest AIC and BIC values and the highest coverage probability (CP), while bias and RMSE were competitive across all models. Piecewise-PH-aGH showed the lowest bias and RMSE in most combinations and the shortest computation time, demonstrating its computational efficiency. This study is the first of its kind to discuss the choice of baseline hazards.
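As a rough illustration of how a piecewise-constant baseline hazard enters the survival part of such a model, the sketch below evaluates the cumulative hazard and survival function under proportional hazards. It is not the authors' implementation; the breakpoints, rates, and covariate effect are invented for illustration.

```python
import numpy as np

def cumulative_hazard(t, breakpoints, rates, linpred=0.0):
    """Cumulative hazard at time t for a piecewise-constant baseline hazard
    with a proportional-hazards covariate effect exp(linpred).

    breakpoints: increasing interval endpoints, e.g. [12.0, 36.0] splits
                 [0, inf) into [0,12), [12,36), [36, inf).
    rates:       baseline hazard on each interval (len(breakpoints)+1 values).
    """
    edges = np.concatenate(([0.0], np.asarray(breakpoints, float), [np.inf]))
    H0 = 0.0
    for lo, hi, lam in zip(edges[:-1], edges[1:], rates):
        # time spent in this interval before t
        H0 += lam * max(0.0, min(t, hi) - lo)
    return H0 * np.exp(linpred)

def survival(t, breakpoints, rates, linpred=0.0):
    return np.exp(-cumulative_hazard(t, breakpoints, rates, linpred))

# Hypothetical example: hazard 0.05/month on [0,12), 0.10 on [12,36), 0.02 after,
# with a covariate effect beta*x = 0.4 (e.g., an advanced composite stage).
print(survival(24.0, breakpoints=[12.0, 36.0], rates=[0.05, 0.10, 0.02], linpred=0.4))
```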


Subject(s)
Colonic Neoplasms, Proportional Hazards Models, Humans, Longitudinal Studies, Colonic Neoplasms/mortality, Colonic Neoplasms/genetics, Survival Analysis, Computer Simulation, Statistical Models, Bayes Theorem, Carcinoembryonic Antigen/blood
2.
J Electrocardiol ; 86: 153783, 2024.
Article in English | MEDLINE | ID: mdl-39213712

ABSTRACT

Analyzing electrocardiogram (ECG) signals is imperative for diagnosing cardiovascular diseases. However, evaluating ECG analysis techniques is challenging because of noise and artifacts in real signals, and machine learning for automatic diagnosis faces data acquisition hurdles due to medical data privacy constraints. To address these issues, ECG modeling plays a crucial role in biomedical research, and parametric spline-based methods have garnered significant attention for their ability to accurately represent the complex temporal dynamics of ECG signals. This study conducts a comparative analysis of two parametric spline-based methods, the B-spline and the Hermite cubic spline, for ECG modeling, aiming to identify the most effective approach for accurate and reliable ECG representation. The Hermite cubic spline is one of the most effective interpolation methods, whereas the B-spline is an approximation method. The comparative analysis includes both qualitative and quantitative evaluations. Qualitative assessment involves visually inspecting the generated spline-based models, comparing their resemblance to the original ECG signals, and employing power spectrum analysis. Quantitative analysis incorporates metrics such as root mean square error (RMSE), percentage root mean square difference (PRD), and cross-correlation, offering a more objective measure of model performance. Preliminary results indicate promising capabilities for both spline-based methods in representing ECG signals. However, the analysis reveals specific strengths and weaknesses for each method: the B-spline method offers greater flexibility and smoothness, while the cubic Hermite spline demonstrates superior waveform-capturing ability with preservation of control points, a critical aspect in the medical field. The presented research provides valuable insights for researchers and practitioners in selecting the most appropriate method for their specific ECG modeling requirements. Adjusting control points and parameterization enables the generation of diverse ECG waveforms, enhancing the versatility of this modeling technique. The approach has the potential to extend to other medical signals, presenting a promising avenue for advancing biomedical research.
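The comparison described here can be sketched with SciPy: a smoothing B-spline (approximation) versus a cubic Hermite spline (interpolation from values and slopes at control samples), evaluated with RMSE and PRD. The synthetic waveform, smoothing factor, and control-sample spacing are assumptions for illustration, not the study's data or settings.

```python
import numpy as np
from scipy.interpolate import splrep, BSpline, CubicHermiteSpline

# Synthetic stand-in for an ECG trace (a real study would use recorded signals).
t = np.linspace(0.0, 1.0, 200)
signal = np.sin(2 * np.pi * 5 * t) + 0.3 * np.exp(-((t - 0.5) ** 2) / 0.001)

# Approximation: smoothing B-spline (s > 0 trades fidelity for smoothness).
tck = splrep(t, signal, k=3, s=0.05)
b_spline_fit = BSpline(*tck)(t)

# Interpolation: cubic Hermite spline built from values and finite-difference
# slopes at a coarser set of "control" samples.
idx = np.arange(0, len(t), 10)
slopes = np.gradient(signal[idx], t[idx])
hermite_fit = CubicHermiteSpline(t[idx], signal[idx], slopes)(t)

def rmse(x, y):
    return np.sqrt(np.mean((x - y) ** 2))

def prd(x, y):
    # percentage root mean square difference
    return 100.0 * np.sqrt(np.sum((x - y) ** 2) / np.sum(x ** 2))

for name, fit in [("B-spline", b_spline_fit), ("Hermite", hermite_fit)]:
    print(name, "RMSE:", rmse(signal, fit), "PRD:", prd(signal, fit))
```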


Subject(s)
Electrocardiography, Computer-Assisted Signal Processing, Electrocardiography/methods, Humans, Algorithms, Machine Learning, Reproducibility of Results
3.
J Anim Breed Genet ; 141(4): 365-378, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38217261

ABSTRACT

The current study sought to genetically assess the lactation curve of Alpine × Beetal crossbred goats through the application of random regression models (RRM). The objective was to estimate genetic parameters of first-lactation test-day milk yield (TDMY) for devising a practical breeding strategy within the nucleus breeding programme. To model variations in lactation curves, 25,998 TDMY records were used. For estimating genetic parameters, orthogonal Legendre polynomials (LEG) and B-splines (BS) were examined to generate suitable and parsimonious models, and a single-trait RRM approach was used for the analysis. The average first-lactation TDMY was 1.22 ± 0.03 kg, and peak yield (1.35 ± 0.02 kg) was achieved around the 7th test day (TD). The investigation demonstrated the superiority of the B-spline model for the genetic evaluation of Alpine × Beetal dairy goats. The optimal random regression model was a quadratic B-spline function with six knots to represent the central trend. This model effectively captured the patterns of additive genetic effects, animal-specific permanent environmental effects (c2) and 22 distinct classes of (heterogeneous) residual variance. Additive variances and heritability (h2) estimates were lower in early lactation but moderate across most of the lactation studied, ranging from 0.09 ± 0.04 to 0.33 ± 0.06. The moderate heritability estimates indicate the potential for selection using favourable combinations of test days throughout the lactation period. A high proportion of the total variance was attributed to the animal's permanent environment. Positive genetic correlations were observed for adjacent TDMY values, while the correlations became less pronounced for more distant TDMY values. Given the better fit of the lactation curve, the use of B-spline functions for the genetic evaluation of Alpine × Beetal goats using RRM is recommended.
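For readers unfamiliar with how the two function families enter a random regression model, the sketch below builds the regression covariables: Legendre polynomial covariates on standardized days in milk and a quadratic B-spline design matrix. The test-day ages, the equally spaced six-knot placement, and the polynomial order are assumptions for illustration; they are not taken from the study.

```python
import numpy as np
from numpy.polynomial import legendre
from scipy.interpolate import BSpline

dim = np.array([10.0, 45.0, 90.0, 150.0, 210.0, 280.0])   # days in milk of the test days
x = 2 * (dim - dim.min()) / (dim.max() - dim.min()) - 1    # standardize to [-1, 1]

# Legendre covariables up to order 2 (columns P0, P1, P2).
leg = np.column_stack([legendre.legval(x, np.eye(3)[j]) for j in range(3)])

# Quadratic B-spline basis with six equally spaced knots spanning the lactation.
degree = 2
interior = np.linspace(dim.min(), dim.max(), 6)[1:-1]      # 4 interior knots
knots = np.concatenate(([dim.min()] * (degree + 1), interior,
                        [dim.max()] * (degree + 1)))
bs = BSpline.design_matrix(dim, knots, degree).toarray()

print("Legendre covariables:\n", np.round(leg, 3))
print("B-spline design matrix:\n", np.round(bs, 3))
```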


Subject(s)
Goats, Lactation, Animals, Goats/genetics, Goats/physiology, Lactation/genetics, Female, Breeding, Regression Analysis, Genetic Models, Milk/metabolism, Dairy Industry
4.
Sensors (Basel) ; 24(12)2024 Jun 18.
Article in English | MEDLINE | ID: mdl-38931720

ABSTRACT

This paper addresses the critical need for advanced real-time vehicle detection methodologies in Vehicle Intelligence Systems (VIS), especially in the context of using Unmanned Aerial Vehicles (UAVs) for data acquisition in severe weather conditions, such as heavy snowfall typical of the Nordic region. Traditional vehicle detection techniques, which often rely on custom-engineered features and deterministic algorithms, fall short in adapting to diverse environmental challenges, leading to a demand for more precise and sophisticated methods. The limitations of current architectures, particularly when deployed in real-time on edge devices with restricted computational capabilities, are highlighted as significant hurdles in the development of efficient vehicle detection systems. To bridge this gap, our research focuses on the formulation of an innovative approach that combines the fractional B-spline wavelet transform with a tailored U-Net architecture, operational on a Raspberry Pi 4. This method aims to enhance vehicle detection and localization by leveraging the unique attributes of the NVD dataset, which comprises drone-captured imagery under the harsh winter conditions of northern Sweden. The dataset, featuring 8450 annotated frames with 26,313 vehicles, serves as the foundation for evaluating the proposed technique. The comparative analysis of the proposed method against state-of-the-art detectors, such as YOLO and Faster RCNN, in both accuracy and efficiency on constrained devices, emphasizes the capability of our method to balance the trade-off between speed and accuracy, thereby broadening its utility across various domains.

5.
Sensors (Basel) ; 24(12)2024 Jun 18.
Article in English | MEDLINE | ID: mdl-38931735

ABSTRACT

Autonomous exploration in unknown environments is a fundamental problem for the practical application of unmanned ground vehicles (UGVs). However, existing exploration methods face difficulties when directly applied to UGVs due to limited sensory coverage, conservative exploration strategies, inappropriate decision frequencies, and the non-holonomic constraints of wheeled vehicles. In this paper, we present IB-PRM, a hierarchical planning method that combines Incremental B-splines with a probabilistic roadmap, which can support rapid exploration by a UGV in complex unknown environments. We define a new frontier structure that includes both information-gain guidance and a B-spline curve segment with different arrival orientations to satisfy the non-holonomic constraint characteristics of UGVs. We construct and maintain local and global graphs to generate and store filtered frontiers. By jointly solving the Traveling Salesman Problem (TSP) using these frontiers, we obtain the optimal global path traversing feasible frontiers. Finally, we optimize the global path based on the Time Elastic Band (TEB) algorithm to obtain a smooth, continuous, and feasible local trajectory. We conducted comparative experiments with existing advanced exploration methods in simulation environments of different scenarios, and the experimental results demonstrate that our method can effectively improve the efficiency of UGV exploration.
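One ingredient described above is a B-spline curve segment that reaches a frontier with a prescribed arrival orientation. The sketch below shows one simple way to do this with a clamped cubic B-spline, exploiting the fact that such a spline starts and ends tangent to its first and last control-polygon edges. The geometry, the `lead` parameter, and the control-point layout are invented for illustration and are not the IB-PRM construction.

```python
import numpy as np
from scipy.interpolate import BSpline

def frontier_segment(start, start_yaw, goal, goal_yaw, lead=0.5, n_eval=50):
    """Clamped cubic B-spline from `start` to a frontier point `goal` whose
    end tangent matches the desired arrival orientation `goal_yaw`."""
    start, goal = np.asarray(start, float), np.asarray(goal, float)
    d_start = lead * np.array([np.cos(start_yaw), np.sin(start_yaw)])
    d_goal = lead * np.array([np.cos(goal_yaw), np.sin(goal_yaw)])
    ctrl = np.vstack([start,
                      start + d_start,          # sets departure tangent
                      0.5 * (start + goal),
                      goal - d_goal,            # sets arrival tangent
                      goal])
    k = 3
    n = len(ctrl)
    knots = np.concatenate(([0.0] * (k + 1),
                            np.linspace(0, 1, n - k + 1)[1:-1],
                            [1.0] * (k + 1)))
    spline = BSpline(knots, ctrl, k)
    return spline(np.linspace(0.0, 1.0, n_eval))

# Hypothetical: robot at the origin heading east, frontier at (3, 2) reached heading north.
path = frontier_segment([0, 0], 0.0, [3, 2], np.pi / 2)
print(path[0], path[-1])
```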

6.
Sensors (Basel) ; 24(16)2024 Aug 17.
Article in English | MEDLINE | ID: mdl-39205026

ABSTRACT

This paper proposes a method for solving the path planning problem for a collaborative robot. A time-optimal, smooth, collision-free B-spline path is obtained by applying a nature-inspired optimization algorithm. The proposed approach can be especially useful when a robotic arm moves items that are delicate or contain a liquid in an open container. The goal of the optimization is to obtain the shortest execution time of the production cycle, taking into account the velocity, acceleration, and jerk limits and the derivative continuity of the final trajectory. For this purpose, a velocity-profiling algorithm for B-spline paths is proposed. The methodology has been applied to production cycle optimization of a pick-and-place process using a collaborative robot. Compared with point-to-point movement and with the solution provided by the RRT* algorithm under the same velocity profiling (to ensure the same motion limits), the proposed path planning algorithm decreased the entire production cycle time by 11.28% and 57.5%, respectively. The obtained results were examined in a simulation with visualization of the entire production cycle. Moreover, the smoothness of the robotic arm's movement was validated experimentally.
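A minimal sketch of the idea behind velocity profiling of a spline path is shown below: if the path is reparameterized as t = T·s, its velocity, acceleration, and jerk scale as 1/T, 1/T², and 1/T³, so the smallest feasible duration under uniform time scaling follows directly from the limits. This is a simplification of a full profiling algorithm, with invented waypoints and actuator limits.

```python
import numpy as np
from scipy.interpolate import make_interp_spline

# Illustrative joint-space waypoints (rows: samples, columns: joints).
s = np.linspace(0.0, 1.0, 8)
waypoints = np.array([[0.0, 0.0], [0.2, 0.5], [0.5, 0.8], [0.9, 0.9],
                      [1.2, 0.7], [1.4, 0.4], [1.5, 0.1], [1.5, 0.0]])
path = make_interp_spline(s, waypoints, k=5)   # quintic spline => continuous jerk

u = np.linspace(0.0, 1.0, 400)
d1 = np.abs(path.derivative(1)(u)).max()       # max |dq/ds|
d2 = np.abs(path.derivative(2)(u)).max()       # max |d2q/ds2|
d3 = np.abs(path.derivative(3)(u)).max()       # max |d3q/ds3|

v_max, a_max, j_max = 1.0, 2.0, 10.0           # assumed actuator limits

# With t = T*s, time derivatives scale by 1/T, 1/T^2, 1/T^3.
T = max(d1 / v_max, np.sqrt(d2 / a_max), np.cbrt(d3 / j_max))
print("Minimum feasible duration under uniform time scaling:", T)
```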

7.
Lifetime Data Anal ; 2024 Apr 16.
Article in English | MEDLINE | ID: mdl-38625444

ABSTRACT

In studies with time-to-event outcomes, multiple, inter-correlated, and time-varying covariates are commonly observed. It is of great interest to model their joint effects by allowing a flexible functional form and to delineate their relative contributions to survival risk. The class of semiparametric transformation (ST) models offers flexible specifications of the intensity function and can serve as a general framework to accommodate nonlinear covariate effects. In this paper, we propose a partial-linear single-index (PLSI) transformation model that reduces the dimensionality of multiple covariates into a single index and provides interpretable estimates of the covariate effects. We develop an iterative algorithm using the regression spline technique to model the nonparametric single-index function for possibly nonlinear joint effects, followed by nonparametric maximum likelihood estimation. We also propose a nonparametric testing procedure to formally examine the linearity of covariate effects. We conduct Monte Carlo simulation studies to compare the PLSI transformation model with the standard ST model and apply it to NYU Langone Health de-identified electronic health record data on COVID-19 hospitalized patients' mortality and to a Veterans Administration lung cancer trial.

8.
Entropy (Basel) ; 26(6)2024 Jun 07.
Article in English | MEDLINE | ID: mdl-38920507

ABSTRACT

Semiparametric spatial autoregressive (SSAR) models have been widely used to analyze spatial data in a variety of applications; however, heteroscedasticity commonly arises in spatial data analysis. The SSAR models considered in this paper therefore allow the variance parameters to depend on the explanatory variables; these are called heterogeneous semiparametric spatial autoregressive models. To estimate the model parameters, a Bayesian estimation method is proposed for heterogeneous SSAR models based on B-spline approximations of the nonparametric function. We then develop an efficient Markov chain Monte Carlo sampling algorithm, based on the Gibbs sampler and the Metropolis-Hastings algorithm, that can be used to generate samples from the posterior distributions and perform posterior inference. Finally, simulation studies and a real data analysis of the Boston housing data demonstrate the excellent performance of the proposed Bayesian method.

9.
Entropy (Basel) ; 26(7)2024 Jul 04.
Article in English | MEDLINE | ID: mdl-39056939

ABSTRACT

This paper proposes a two-dimensional steady-state field prediction approach that combines B-spline functions and a fully connected neural network. In this approach, field data, which are determined by corresponding control vectors, are fitted by a selected B-spline function set, yielding the corresponding best-fitting weight vectors; a fully connected neural network is then trained on those weight vectors and control vectors. The trained neural network first predicts a weight vector from a given control vector, and the corresponding field is then restored via the selected B-spline set. The method was applied to learn and predict two-dimensional steady advection-diffusion physical fields with absorption and source terms, and its accuracy and performance were tested and verified by a series of numerical experiments with different B-spline sets, boundary conditions, field gradients, and field states. The proposed method was finally compared with a generative adversarial network (GAN) and a physics-informed neural network (PINN). The results indicate that the B-spline neural network can predict the tested physical fields well, and the overall error can be reduced by expanding the selected B-spline set. Compared with the GAN and PINN, the proposed method also offers high prediction accuracy, lower demand for training data, and high training efficiency.
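The first stage of such a pipeline, projecting a sampled 2D field onto a tensor-product B-spline basis to obtain a weight vector and reconstructing the field from those weights, can be sketched as below. The grid size, basis sizes, and test field are assumptions, and the neural-network stage that maps control vectors to weights is omitted.

```python
import numpy as np
from scipy.interpolate import BSpline

def bspline_basis(x, n_basis, degree=3):
    """Clamped (open-uniform) B-spline design matrix of shape (len(x), n_basis)."""
    interior = np.linspace(x.min(), x.max(), n_basis - degree + 1)[1:-1]
    knots = np.concatenate(([x.min()] * (degree + 1), interior,
                            [x.max()] * (degree + 1)))
    return BSpline.design_matrix(x, knots, degree).toarray()

# Illustrative smooth field on a 40x40 grid (stands in for an advection-diffusion solution).
x = np.linspace(0.0, 1.0, 40)
y = np.linspace(0.0, 1.0, 40)
X, Y = np.meshgrid(x, y, indexing="ij")
field = np.exp(-20 * ((X - 0.3) ** 2 + (Y - 0.6) ** 2)) + 0.2 * X

Bx = bspline_basis(x, n_basis=10)          # (40, 10)
By = bspline_basis(y, n_basis=10)          # (40, 10)

# Least-squares weight matrix W minimizing ||Bx W By^T - field||_F
# (flattened, W is the "weight vector" a network would learn to predict).
W = np.linalg.pinv(Bx) @ field @ np.linalg.pinv(By).T
reconstruction = Bx @ W @ By.T
print("max abs reconstruction error:", np.abs(reconstruction - field).max())
```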

10.
Entropy (Basel) ; 26(6)2024 Jun 15.
Article in English | MEDLINE | ID: mdl-38920526

ABSTRACT

When using traditional Euler deconvolution optimization strategies, it is difficult to distinguish between anomalies and their corresponding Euler tails (solutions that are distributed outside the anomaly source, forming "tail"-shaped spurious solutions, i.e., misplaced Euler solutions, which must be removed or marked) using the structural index alone. A nonparametric estimation method based on the normalized B-spline probability density (BSS) is used to separate the Euler solution clusters and mark different anomaly sources according to the similarity and density characteristics of the Euler solutions. For display purposes, the BSS needs to map the samples onto the estimation grid, at the points where density will be estimated, in order to obtain the probability density distribution. However, if the samples or the estimation grid are too large, this process can lead to high memory consumption and excessive computation times. To address this issue, a fast linear binning approximation algorithm is introduced into the BSS to speed up the computation and save time: the sample data are quickly projected onto the estimation grid, and the discrete convolution between the grid and the density function is then performed using a fast Fourier transform. A method for multivariate B-spline probability density estimation based on the FFT (BSSFFT), in conjunction with fast linear binning approximation, is proposed in this paper. The results for two random normal distributions confirm the correctness of the BSS and BSSFFT algorithms, verified via comparison with the true probability density function (pdf) and with Gaussian kernel smoothing estimation. The Euler solutions of two synthetic models are then analyzed using the BSS and BSSFFT algorithms; the results are consistent with their theoretical values, verifying the correctness of the Euler solutions. Finally, BSSFFT is applied to the Bishop 5X data, and the numerical results show that a comprehensive analysis of the 3D probability density distributions obtained with the BSSFFT algorithm, derived from the Euler solution subset (x0, y0, z0), can effectively separate and locate adjacent anomaly sources, demonstrating strong adaptability.
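The two computational ingredients named above, linear binning of samples onto an estimation grid followed by FFT-based convolution with a smoothing kernel, can be sketched in one dimension as below. A Gaussian kernel stands in for the normalized B-spline kernel, and the multivariate extension is left out; the mixture data and bandwidth are invented.

```python
import numpy as np

def linear_binning(samples, grid):
    """Distribute each sample's unit mass linearly between its two neighbouring grid nodes."""
    counts = np.zeros(grid.size)
    dx = grid[1] - grid[0]
    pos = np.clip((samples - grid[0]) / dx, 0, grid.size - 1)
    left = np.clip(np.floor(pos).astype(int), 0, grid.size - 2)
    frac = pos - left
    np.add.at(counts, left, 1.0 - frac)
    np.add.at(counts, left + 1, frac)
    return counts

def fft_density(samples, grid, bandwidth):
    """Density estimate: binned counts convolved with a kernel via FFT (circular edges ignored)."""
    dx = grid[1] - grid[0]
    counts = linear_binning(samples, grid)
    # Kernel sampled on a zero-centred grid, shifted for circular convolution.
    offsets = (np.arange(grid.size) - grid.size // 2) * dx
    kernel = np.exp(-0.5 * (offsets / bandwidth) ** 2)
    kernel /= kernel.sum()
    density = np.real(np.fft.ifft(np.fft.fft(counts) * np.fft.fft(np.fft.ifftshift(kernel))))
    return density / (samples.size * dx)

rng = np.random.default_rng(0)
samples = np.concatenate([rng.normal(-2, 0.5, 5000), rng.normal(1.5, 1.0, 5000)])
grid = np.linspace(-6, 6, 512)
pdf = fft_density(samples, grid, bandwidth=0.25)
print("integrates to ~1:", pdf.sum() * (grid[1] - grid[0]))
```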

11.
Biometrics ; 79(3): 2232-2245, 2023 09.
Article in English | MEDLINE | ID: mdl-36065564

ABSTRACT

Functional data analysis has emerged as a powerful tool in response to the ever-increasing resources and efforts devoted to collecting information about response curves or anything that varies over a continuum. However, limited progress has been made with regard to linking the covariance structures of response curves to external covariates, as most functional models assume a common covariance structure. We propose a new functional regression model with covariate-dependent mean and covariance structures. Particularly, by allowing variances of random scores to be covariate-dependent, we identify eigenfunctions for each individual from the set of eigenfunctions that govern the variation patterns across all individuals, resulting in high interpretability and prediction power. We further propose a new penalized quasi-likelihood procedure that combines regularization and B-spline smoothing for model selection and estimation and establish the convergence rate and asymptotic normality of the proposed estimators. The utility of the developed method is demonstrated via simulations, as well as an analysis of the Avon Longitudinal Study of Parents and Children concerning parental effects on the growth curves of their offspring, which yields biologically interesting results.
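As a generic illustration of combining a B-spline basis with regularization, the sketch below fits a penalized B-spline (P-spline) smoother with a difference penalty to noisy growth-curve-like data. It is not the authors' penalized quasi-likelihood procedure, and the basis size, penalty, and data are assumptions.

```python
import numpy as np
from scipy.interpolate import BSpline

def pspline_fit(x, y, n_basis=20, degree=3, lam=1.0, diff_order=2):
    """Penalized least-squares B-spline fit: min ||y - B a||^2 + lam ||D a||^2."""
    interior = np.linspace(x.min(), x.max(), n_basis - degree + 1)[1:-1]
    knots = np.concatenate(([x.min()] * (degree + 1), interior, [x.max()] * (degree + 1)))
    B = BSpline.design_matrix(x, knots, degree).toarray()
    D = np.diff(np.eye(n_basis), n=diff_order, axis=0)   # difference penalty matrix
    coef = np.linalg.solve(B.T @ B + lam * (D.T @ D), B.T @ y)
    return BSpline(knots, coef, degree)

# Illustrative noisy growth-curve-like data.
rng = np.random.default_rng(1)
age = np.sort(rng.uniform(0.0, 18.0, 150))
height = 75 + 6 * age - 0.12 * age ** 2 + rng.normal(0.0, 2.0, age.size)
smooth = pspline_fit(age, height, lam=5.0)
print(smooth(np.array([2.0, 10.0, 16.0])))
```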


Subject(s)
Longitudinal Studies, Child, Humans, Likelihood Functions
12.
Article in English | MEDLINE | ID: mdl-39044771

ABSTRACT

In this paper we propose a new semiparametric function-on-function quantile regression model with time-dynamic single-index interactions. Our model is very flexible in accounting for the nonlinear time-dynamic interaction effects of multivariate longitudinal/functional covariates on the longitudinal response; indeed, most existing quantile regression models for longitudinal data are special cases of our proposed model. We propose to approximate the bivariate nonparametric coefficient functions by tensor-product B-splines and employ a check loss minimization approach to estimate the bivariate coefficient functions and the index parameter vector. Under mild conditions, we establish the asymptotic normality of the estimated single-index coefficients using a projection orthogonalization technique, and we obtain the convergence rates of the estimated bivariate coefficient functions. Furthermore, we propose a score test to examine whether interaction effects exist between the covariates. The finite-sample performance of the proposed method is illustrated by Monte Carlo simulations and an empirical data analysis.
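The core estimation device mentioned, minimizing a check (pinball) loss over B-spline coefficients, is sketched below for a univariate nonparametric quantile curve rather than the paper's function-on-function single-index setting. The data, quantile level, basis size, and use of a derivative-free optimizer are all assumptions for illustration.

```python
import numpy as np
from scipy.interpolate import BSpline
from scipy.optimize import minimize

def check_loss(residual, tau):
    """Pinball / check loss used in quantile regression."""
    return np.sum(residual * (tau - (residual < 0)))

def quantile_bspline(x, y, tau=0.5, n_basis=12, degree=3):
    interior = np.linspace(x.min(), x.max(), n_basis - degree + 1)[1:-1]
    knots = np.concatenate(([x.min()] * (degree + 1), interior, [x.max()] * (degree + 1)))
    B = BSpline.design_matrix(x, knots, degree).toarray()
    a0 = np.linalg.lstsq(B, y, rcond=None)[0]            # least-squares warm start
    res = minimize(lambda a: check_loss(y - B @ a, tau), a0, method="Nelder-Mead",
                   options={"maxiter": 20000, "xatol": 1e-6, "fatol": 1e-6})
    return BSpline(knots, res.x, degree)

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 1, 300))
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_t(df=3, size=x.size)
q75 = quantile_bspline(x, y, tau=0.75)
print(q75(np.array([0.1, 0.5, 0.9])))
```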

13.
Sensors (Basel) ; 23(19)2023 Sep 28.
Article in English | MEDLINE | ID: mdl-37836981

ABSTRACT

To meet the real-time path planning requirements of intelligent vehicles in dynamic traffic scenarios, a path planning and evaluation method is proposed in this paper. Firstly, based on the B-spline algorithm and four-stage lane-changing theory, an obstacle avoidance path planning algorithm framework is constructed. Then, to obtain the optimal real-time path, a comprehensive real-time path evaluation mechanism that includes path safety, smoothness, and comfort is established. Finally, to verify the proposed approach, co-simulation and real-vehicle testing are conducted. In the dynamic obstacle avoidance simulation scenario, the lateral acceleration, yaw angle, yaw rate, and roll angle of the ego vehicle fluctuate within ±2.39 m/s², ±13.31°, ±13.26°/s, and ±0.938°, respectively. The results show that the proposed algorithm can generate available obstacle avoidance paths in real time and that the proposed evaluation mechanism can find the optimal path for the current scenario.

14.
Sensors (Basel) ; 23(7)2023 Mar 28.
Article in English | MEDLINE | ID: mdl-37050593

ABSTRACT

To deal with the problem of optimal path planning in 2D space, this paper introduces a new toolbox named "Navigation with Polytopes" and explains the algorithms behind it. The toolbox allows one to create a polytopic map from a standard grid map, search for an optimal corridor, and plan a safe B-spline reference path used for mobile robot navigation. Specifically, the B-spline path is converted into its equivalent Bézier representation via a novel calculation method in order to reduce the conservativeness of the constrained path planning problem. The conversion can handle the differences between the curve intervals and allows for efficient computation. Furthermore, two different constraint formulations used for enforcing a B-spline path to stay within the sequence of connected polytopes are proposed, one with a guaranteed solution. The toolbox was extensively validated through simulations and experiments.
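The constraint idea underlying "keep a spline path inside a sequence of polytopes" rests on the convex-hull property: if every control point of a Bézier/B-spline segment satisfies the polytope inequalities A p ≤ b, the whole segment does. The sketch below illustrates that sufficient (conservative) test with invented geometry; it is not the toolbox's formulation.

```python
import numpy as np

def segment_inside_polytope(control_points, A, b, tol=1e-9):
    """Sufficient (conservative) test: a Bezier/B-spline segment lies in
    {p : A p <= b} if every control point does, by the convex-hull property."""
    return bool(np.all(A @ np.asarray(control_points, float).T <= (b[:, None] + tol)))

# Axis-aligned box [0, 4] x [0, 2] written as A p <= b.
A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
b = np.array([4.0, 0.0, 2.0, 0.0])
# Hypothetical cubic segment control points.
ctrl = np.array([[0.5, 0.5], [1.5, 1.8], [2.5, 1.8], [3.5, 0.5]])
print(segment_inside_polytope(ctrl, A, b))   # True => the curve segment stays in the box
```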

15.
Sensors (Basel) ; 23(6)2023 Mar 17.
Article in English | MEDLINE | ID: mdl-36991936

ABSTRACT

High-precision geometric measurement of free-form surfaces has become key to high-performance manufacturing. A well-designed sampling plan enables economical measurement of free-form surfaces. This paper proposes an adaptive hybrid sampling method for free-form surfaces based on geodesic distance. The free-form surface is divided into segments, and the sum of the geodesic distances of the surface segments is taken as the global fluctuation index of the surface; the number and location of sampling points for each segment are then allocated accordingly. Compared with common methods, this approach significantly reduces the reconstruction error for the same number of sampling points. It overcomes the shortcomings of the commonly used approach of taking curvature as the local fluctuation index of free-form surfaces and provides a new perspective on adaptive sampling of free-form surfaces.

16.
Sensors (Basel) ; 23(5)2023 Feb 22.
Article in English | MEDLINE | ID: mdl-36904621

ABSTRACT

Text regions in natural scenes have complex and variable shapes. Directly using contour coordinates to describe text regions makes the modeling inadequate and leads to low text detection accuracy. To address the problem of irregular text regions in natural scenes, we propose an arbitrary-shaped text detection model based on Deformable DETR, called BSNet. Unlike the traditional approach of directly predicting contour points, the model uses a B-spline curve to make the text contour more accurate while simultaneously reducing the number of predicted parameters. The proposed model eliminates manually designed components and dramatically simplifies the design. It achieves F-measures of 86.8% and 87.6% on CTW1500 and Total-Text, respectively, demonstrating its effectiveness.

17.
Sensors (Basel) ; 23(21)2023 Oct 27.
Article in English | MEDLINE | ID: mdl-37960455

ABSTRACT

"Three straight and two flat" is the inevitable demand when realizing the intelligent mining of a fully mechanized mining face. To address the crucial technical issue of lacking accurate perception of the shape of the scraper conveyor during intelligent coal mining, a three-dimensional curvature sensor involving fiber Bragg grating (FBG) is used as a perceptive tool to conduct curve reconstruction research based on different local motion frames and to reconstruct the shape of the scraper conveyor. Firstly, the formation process of the 'S'-shaped bending section of the scraper conveyor during the pushing process is determined. Based on the FBG sensing principle, a mathematical model between the variation in the central wavelength and the strain and curvature is established, and the cubic B-spline interpolation method is employed to continuously process the obtained discrete curvature. Secondly, based on differential geometry, a spatial curve reconstruction algorithm based on the Frenet moving frame is derived, and the shape curve prediction interpolation model is built based on a gated recurrent unit (GRU) model, which reduces the impact of the decrease in curve reconstruction accuracy caused by damage to some grating measuring points. Finally, an experimental platform was designed and built, and sensors with curvature radii of 6 m, 7 m, and 8 m were tested. The experimental results showed that the reconstructed curve was essentially consistent with the actual shape, and the absolute error at the end was about 2 mm. The feasibility of this reconstruction algorithm in engineering has been proven, and this is of great significance in achieving shape curve perception and straightness control for scraper conveyors.

18.
J Xray Sci Technol ; 31(3): 555-572, 2023.
Article in English | MEDLINE | ID: mdl-36911966

ABSTRACT

BACKGROUND: In medical applications, computed tomography (CT) is widely used to evaluate various sample characteristics; however, the quality of CT reconstructions can be degraded by artifacts.
OBJECTIVE: To propose and test a truncated total variation (truncated TV) model that addresses the large penalties of the standard total variation (TV) model.
METHODS: A truncated TV image denoising model in the fractional B-spline wavelet domain is developed to obtain the best solution. The method is validated on CT reconstructed images of actual biological pigeon samples. Several indices, including the peak signal-to-noise ratio (PSNR), structural similarity index (SSIM), and mean square error (MSE), are used to evaluate image quality.
RESULTS: Compared with the conventional truncated TV model, which yields PSNR, SSIM, and MSE of 22.55, 0.688, and 361.17, respectively, the proposed fractional B-spline truncated TV model yields 24.24, 0.898, and 244.98, indicating a substantial reduction of image noise with higher PSNR and SSIM and lower MSE.
CONCLUSIONS: Compared with many classic image denoising methods, the proposed denoising algorithm more effectively suppresses reconstructed CT image artifacts while maintaining detailed image structure.
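The three evaluation indices used above can be computed with scikit-image as in the sketch below, run here on synthetic images since the study's CT data are not available; the denoising model itself is not implemented.

```python
import numpy as np
from skimage.metrics import (peak_signal_noise_ratio,
                             structural_similarity,
                             mean_squared_error)

rng = np.random.default_rng(3)
# Synthetic "clean" reference image and a noisy version of it.
reference = np.clip(rng.random((128, 128)) * 0.2 +
                    np.tile(np.linspace(0, 0.8, 128), (128, 1)), 0, 1)
noisy = np.clip(reference + rng.normal(0, 0.05, reference.shape), 0, 1)

print("PSNR:", peak_signal_noise_ratio(reference, noisy, data_range=1.0))
print("SSIM:", structural_similarity(reference, noisy, data_range=1.0))
print("MSE :", mean_squared_error(reference, noisy))
```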


Subject(s)
Algorithms, Wavelet Analysis, X-Ray Microtomography, Signal-to-Noise Ratio, Artifacts, Computer-Assisted Image Processing/methods
19.
Entropy (Basel) ; 25(2)2023 Jan 17.
Article in English | MEDLINE | ID: mdl-36832552

ABSTRACT

Output probability density function (PDF) tracking control of stochastic systems has always been a challenging problem in both theoretical development and engineering practice. Focused on this challenge, this work proposes a novel stochastic control framework so that the output PDF can track a given time-varying PDF. Firstly, the output PDF is characterised by the weight dynamics following the B-spline model approximation. As a result, the PDF tracking problem is transferred to a state tracking problem for weight dynamics. In addition, the model error of the weight dynamics is described by the multiplicative noises to more effectively establish its stochastic dynamics. Moreover, to better reflect the practical applications in the real world, the given tracking target is set to be time-varying rather than static. Thus, an extended fully probabilistic design (FPD) is developed based on the conventional FPD to handle multiplicative noises and to track the time-varying references in a superior way. Finally, the proposed control framework is verified by a numerical example, and a comparison simulation with the linear-quadratic regulator (LQR) method is also included to illustrate the superiority of our proposed framework.

20.
Extremes (Boston) ; 26(1): 101-138, 2023.
Article in English | MEDLINE | ID: mdl-36751468

ABSTRACT

A bivariate extreme-value copula is characterized by its Pickands dependence function, i.e., a convex function defined on the unit interval satisfying boundary conditions. This paper investigates the large-sample behavior of a nonparametric estimator of this function due to Cormier et al. (Extremes 17:633-659, 2014). These authors showed how to construct this estimator through constrained quadratic median B-spline smoothing of pairs of pseudo-observations derived from a random sample. Their estimator is shown here to exist whatever the order m ≥ 3 of the B-spline basis, and its consistency is established under minimal conditions. The large-sample distribution of this estimator is also determined under the additional assumption that the underlying Pickands dependence function is a B-spline of given order with a known set of knots.
