Results 1 - 20 of 588
1.
Stat Methods Med Res ; : 9622802241268466, 2024 Oct 07.
Article in English | MEDLINE | ID: mdl-39373068

ABSTRACT

In this article, we present a joint modeling approach for zero-inflated longitudinal count measurements and time-to-event outcomes. For the longitudinal sub-model, a mixed effects Hurdle model is utilized, incorporating various distributional assumptions such as zero-inflated Poisson, zero-inflated negative binomial, or zero-inflated generalized Poisson. For the time-to-event sub-model, a Cox proportional hazard model is applied. For the functional form linking the longitudinal outcome history to the hazard of the event, a linear combination is used. This combination is derived from the current values of the linear predictors of Hurdle mixed effects. Some other forms are also considered, including a linear combination of the current slopes of the linear predictors of Hurdle mixed effects as well as the shared random effects. A Markov chain Monte Carlo method is implemented for Bayesian parameter estimation. Dynamic prediction using joint modeling is highly valuable in personalized medicine, as discussed here for joint modeling of zero-inflated longitudinal count measurements and time-to-event outcomes. We assess and demonstrate the effectiveness of the proposed joint models through extensive simulation studies, with a specific emphasis on parameter estimation and dynamic predictions for both over-dispersed and under-dispersed data. We finally apply the joint model to longitudinal microbiome pregnancy and HIV data sets.
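The hurdle structure of the longitudinal sub-model can be illustrated with a minimal sketch (not the authors' implementation): the log-likelihood of a Poisson hurdle model, where `pi` is the probability of a zero and positive counts follow a zero-truncated Poisson.

```python
import numpy as np
from scipy import stats

def hurdle_poisson_loglik(y, pi, lam):
    """Log-likelihood of a Poisson hurdle model: P(y=0) = pi, and positive
    counts follow a zero-truncated Poisson(lam)."""
    y = np.asarray(y)
    ll = np.where(
        y == 0,
        np.log(pi),
        np.log1p(-pi)
        + stats.poisson.logpmf(y, lam)
        - np.log1p(-stats.poisson.pmf(0, lam)),  # renormalize over y > 0
    )
    return ll.sum()

y = np.array([0, 0, 3, 1, 5, 0, 2])  # toy longitudinal counts
print(hurdle_poisson_loglik(y, pi=0.4, lam=2.0))
```

In the full joint model the linear predictors behind `pi` and `lam` carry the mixed effects that are then linked to the Cox hazard.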

2.
Adv Mater ; : e2409389, 2024 Oct 02.
Article in English | MEDLINE | ID: mdl-39358940

ABSTRACT

Water-repellent superhydrophobic surfaces are ubiquitous in nature. A fundamental understanding of biological and bio-inspired structures facilitates practical applications by surmounting metastable superhydrophobicity. Typically, the hierarchical structure and/or reentrant morphology have been employed hitherto to suppress the Cassie-Baxter to Wenzel transition (CWT). Herein, a new design concept is reported: the effect of a concave structure, which is vital for a stable superhydrophobic surface. The thermodynamic and kinetic stabilities of the concave pillars are evaluated by continuous exposure to various hydrostatic pressures and sudden impacts of water droplets with various Weber numbers (We), comparing them to the standard superhydrophobic normal pillars. Specifically, the concave pillar exhibits reinforced impact resistance preventing CWT below a critical We of ≈27.6, which is ≈1.6 times higher than that of the normal pillar (≈17.0). Subsequently, the stability of the underwater air film (plastron) is investigated at various hydrostatic pressures. The results show that convex air caps formed at the concave cavities generate downward Laplace pressure opposing the exerted hydrostatic pressure between the pillars, thus impeding the hydrostatic pressure-dependent underwater air diffusion. Hence, the effects of trapped air caps contributing to the stable Cassie-Baxter state can offer a pioneering strategy for the exploration and utilization of superhydrophobic surfaces.
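As a back-of-envelope illustration of the Weber numbers quoted above, the critical impact velocity implied by a critical We follows from We = ρv²D/γ. The droplet diameter below is an assumption for illustration, not a value from the abstract.

```python
rho = 998.0    # water density, kg/m^3 (standard assumption)
gamma = 0.072  # water surface tension, N/m (standard assumption)
D = 2e-3       # droplet diameter, m (assumed; not stated in the abstract)

def critical_velocity(we_crit):
    """Impact velocity for a critical Weber number We = rho * v**2 * D / gamma."""
    return (we_crit * gamma / (rho * D)) ** 0.5

for label, we in [("concave pillar", 27.6), ("normal pillar", 17.0)]:
    print(f"{label}: We_crit = {we} -> v_crit = {critical_velocity(we):.2f} m/s")
```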

3.
Brief Bioinform ; 25(6)2024 Sep 23.
Article in English | MEDLINE | ID: mdl-39356327

ABSTRACT

Single-cell cross-modal joint clustering has been extensively utilized to investigate the tumor microenvironment. Although numerous approaches have been suggested, accurate clustering remains the main challenge. First, the gene expression matrix frequently contains numerous missing values due to measurement limitations. The majority of existing clustering methods treat it as a typical multi-modal dataset without further processing. The few methods that do conduct recovery before clustering do not sufficiently engage with the underlying research, leading to suboptimal outcomes. Additionally, existing cross-modal information fusion strategies do not ensure consistency of representations across different modalities, potentially leading to the integration of conflicting information, which could degrade performance. To address these challenges, we propose the 'Recover then Aggregate' strategy and introduce the Unified Cross-Modal Deep Clustering model. Specifically, we have developed a data augmentation technique based on neighborhood similarity, iteratively imposing rank constraints on the Laplacian matrix, thus updating the similarity matrix and recovering dropout events. Concurrently, we integrate cross-modal features and employ contrastive learning to align modality-specific representations with consistent ones, enhancing the effective integration of diverse modal information. Comprehensive experiments on five real-world multi-modal datasets have demonstrated this method's superior effectiveness in single-cell clustering tasks.
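The rank constraint on the Laplacian exploits a standard fact: the number of zero eigenvalues of a graph Laplacian equals the number of connected components of the similarity graph. A toy sketch (not the paper's code) verifies this on a block-structured similarity matrix:

```python
import numpy as np

S = np.array([  # block-diagonal similarity: two obvious clusters
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 0, 0, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)

L = np.diag(S.sum(axis=1)) - S  # unnormalized graph Laplacian
eigvals = np.linalg.eigvalsh(L)
n_components = int(np.sum(eigvals < 1e-10))
print("zero eigenvalues (connected components):", n_components)  # -> 2
```

Constraining rank(L) to n - c therefore pushes the learned similarity matrix toward exactly c clean clusters.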


Subject(s)
Single-Cell Analysis , Cluster Analysis , Single-Cell Analysis/methods , Humans , Algorithms , Tumor Microenvironment , Computational Biology/methods
4.
Heliyon ; 10(17): e36513, 2024 Sep 15.
Article in English | MEDLINE | ID: mdl-39286179

ABSTRACT

This paper provides a comprehensive analysis, using nonlocal stress-driven integral theory, of the static behavior of a nanoscale beam of bidirectionally graded materials. After a brief explanation of the mathematical formulation of BDFGMs, the work done and strain energy expressions derived from the displacement field are discussed. Variational formulations and Hamilton's principle are used to develop the equilibrium equation. The nonlocal kernel for stress-driven integral theory is developed analytically, and the governing equation is formulated and then nondimensionalized. Explicit equations for displacement and moment are obtained by solving this equation using the Laplace transformation. Three different boundary conditions are examined, and variations in the maximum displacement with respect to the nonlocal parameter and the two material FGM parameters are displayed both visually and in table form. When closely compared with existing numerical data, the results exhibit excellent agreement and provide a benchmark for further research. This work contributes to the knowledge of BDFGMs under nonlocal effects generated by stress-driven integral theory and offers validated solutions for further investigation.

5.
Stat Med ; 2024 Sep 15.
Article in English | MEDLINE | ID: mdl-39278641

ABSTRACT

Trivariate joint modeling for longitudinal count data, recurrent events, and a terminal event for family data has attracted increasing interest in medical studies. For example, families with Lynch syndrome (LS) are at high risk of developing colorectal cancer (CRC), where the number of polyps and the frequency of colonoscopy screening visits are highly associated with the risk of CRC among individuals and families. To assess how screening visits influence polyp detection, which in turn influences time to CRC, we propose a clustered trivariate joint model. The proposed model facilitates longitudinal count data that are zero-inflated and over-dispersed and invokes individual-specific and family-specific random effects to account for dependence among individuals and families. We formulate our proposed model as a latent Gaussian model to use the Bayesian estimation approach with the integrated nested Laplace approximation algorithm and evaluate its performance using simulation studies. Our trivariate joint model is applied to a series of 18 families from Newfoundland, with the occurrence of CRC taken as the terminal event, the colonoscopy screening visits as recurrent events, and the number of polyps detected at each visit as zero-inflated count data with overdispersion. We showed that our trivariate model fits better than alternative bivariate models and that the cluster effects should not be ignored when analyzing family data. Finally, the proposed model enables us to quantify heterogeneity across families and individuals in polyp detection and CRC risk, thus helping to identify individuals and families who would benefit from more intensive screening visits.

6.
JBMR Plus ; 8(10): ziae116, 2024 Oct.
Article in English | MEDLINE | ID: mdl-39315381

ABSTRACT

High-resolution peripheral quantitative computed tomography (HR-pQCT) has emerged as a powerful imaging technique for characterizing bone microarchitecture in the human peripheral skeleton. The second-generation HR-pQCT scanner provides improved spatial resolution and a shorter scan time. However, the transition from the first-generation (XCTI) to second-generation HR-pQCT scanners (XCTII) poses challenges for longitudinal studies, multi-center trials, and comparison to historical data. Cross-calibration, an established approach for determining relationships between measurements obtained from different devices, can bridge this gap and enable the utilization and comparison of legacy data. The goal of this study was to establish cross-calibration equations to estimate XCTII measurements from XCTI data, using both the standard and Laplace-Hamming (LH) binarization approaches. Thirty-six volunteers (26-85 yr) were recruited and their radii and tibiae were scanned on both XCTI and XCTII scanners. XCTI images were analyzed using the manufacturer's standard protocol. XCTII images were analyzed twice: using the manufacturer's standard protocol and the LH segmentation approach previously developed and validated by our team. Linear regression analysis was used to establish cross-calibration equations. Results demonstrated strong correlations between XCTI and XCTII density and geometry outcomes. For most microstructural outcomes, although there were considerable differences in absolute values, correlations between measurements obtained from different scanners were strong, allowing for accurate cross-calibration estimations. For some microstructural outcomes with a higher sensitivity to spatial resolution (eg, trabecular thickness, cortical pore diameter), XCTII standard protocol resulted in poor correlations between the scanners, while our LH approach improved these correlations and decreased the difference in absolute values and the proportional bias for other measurements. 
For these reasons and due to the improved accuracy of our LH approach compared with the standard approach, as established in our previous study, we propose that investigators should use the LH approach for analyzing XCTII scans, particularly when comparing to XCTI data.
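Cross-calibration of this kind reduces to fitting a linear map from XCTI to XCTII measurements. The sketch below uses synthetic numbers (the "true" coefficients and noise level are assumptions) purely to show the form of such an equation:

```python
import numpy as np

rng = np.random.default_rng(0)
xcti = rng.uniform(100, 300, size=36)             # e.g. a density outcome, 36 volunteers
xctii = 1.08 * xcti - 5.0 + rng.normal(0, 3, 36)  # assumed relationship + scanner noise

slope, intercept = np.polyfit(xcti, xctii, deg=1)  # cross-calibration equation
r = np.corrcoef(xcti, xctii)[0, 1]
print(f"XCTII ~= {slope:.3f} * XCTI + {intercept:.2f}  (r = {r:.3f})")
```

In the study, one such regression is fit per outcome (density, geometry, microstructure), and the strength of r determines whether the estimation is usable.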

7.
Proc Natl Acad Sci U S A ; 121(38): e2404169121, 2024 Sep 17.
Article in English | MEDLINE | ID: mdl-39254998

ABSTRACT

In interval reproduction tasks, animals must remember the event starting the interval and anticipate the time of the planned response to terminate the interval. The interval reproduction task thus allows for studying both memory for the past and anticipation of the future. We analyzed previously published recordings from the rodent medial prefrontal cortex [J. Henke et al., eLife 10, e71612 (2021)] during an interval reproduction task and identified two cell groups by modeling their temporal receptive fields using hierarchical Bayesian models. The firing in the "past cells" group peaked at the start of the interval and relaxed exponentially back to baseline. The firing in the "future cells" group increased exponentially and peaked right before the planned action at the end of the interval. Contrary to the previous assumption that timing information in the brain has one or two time scales for a given interval, we found strong evidence for a continuous distribution of the exponential rate constants for both past and future cell populations. The real Laplace transformation of time predicts exponential firing with a continuous distribution of rate constants across the population. Therefore, the firing pattern of the past cells can be identified with the Laplace transform of time since the past event while the firing pattern of the future cells can be identified with the Laplace transform of time until the planned future event.
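The coding scheme described above can be sketched numerically (an illustration of the theory, not the paper's analysis): a population of "past cells" firing exp(-s·t) with rate constants s drawn from a continuous spread is, at each elapsed time t*, exactly the Laplace transform of a delta at t* evaluated across the population.

```python
import numpy as np

t = np.linspace(0, 5, 501)        # time since the start event (s)
s = np.logspace(-1, 1, 50)        # continuously spread rate constants (1/s)
firing = np.exp(-np.outer(s, t))  # row i: one "past cell" firing exp(-s_i * t)

# At elapsed time t*, the population vector equals exp(-s * t*), i.e. the
# Laplace transform of delta(t - t*) evaluated at each cell's rate constant.
print(np.allclose(firing[:, 100], np.exp(-s * t[100])))  # -> True
```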


Subject(s)
Neurons , Prefrontal Cortex , Prefrontal Cortex/physiology , Prefrontal Cortex/cytology , Animals , Rats , Neurons/physiology , Bayes Theorem , Male , Models, Neurological , Memory/physiology , Time Perception/physiology , Action Potentials/physiology
8.
Sci Total Environ ; 953: 176138, 2024 Nov 25.
Article in English | MEDLINE | ID: mdl-39260476

ABSTRACT

In an era marked by unprecedented anthropogenic change, marine systems are increasingly subjected to interconnected and dynamic external stressors, which profoundly reshape the behavior and resilience of marine ecological components. Nevertheless, despite widespread recognition of the significance of stressor interactions, there persist notable knowledge deficits in quantifying their interactions and the specific biological consequences that result. To bridge this crucial gap, this research detected and examined the causal relationships between five key exogenous stressors in a complex estuarine ecosystem. Furthermore, a Bayesian Hierarchical Spatio-temporal modeling framework was proposed to quantitatively evaluate the distinct, interactive, and globally sensitive effects of multiple stressors on the population dynamics of a crucial fish species: Harpadon nehereus. The results showed that interactions were detected between fisheries pressure (FP), the Pacific Decadal Oscillation index (PDO), runoff volume (RV), and sediment load (SL), with five of these interactions producing significant synergistic effects on H. nehereus biomass. The SL*PDO and RV*PDO interactions had positive synergistic effects, albeit through differing processes. The former interaction amplified the individual effects of each stressor, while the latter reversed the direction of the original impact. Overall, the synergistic effect of multiple stressors was not favorable, with FP in particular posing the greatest threat to the H. nehereus population. This threat was more pronounced at high SL or negative PDO phases. Therefore, local management efforts aimed at addressing multiple stressors and protecting resources should take these findings into account. Additionally, although the velocity of climate change (VoCC) failed to produce significant interactions, changes in this stressor had the most sensitive impacts on the response of the H. nehereus population. 
This research strives to enhance the dimensionality, generalizability, and flexibility of the quantification framework for marine multi-stressor interactions, aiming to foster broader research collaboration and jointly tackle the intricate pressures facing marine ecosystems.


Subject(s)
Estuaries , Animals , Environmental Monitoring , Ecosystem , Population Dynamics , Fisheries , Bayes Theorem , Stress, Physiological
9.
Ultrason Imaging ; : 1617346241271240, 2024 Sep 10.
Article in English | MEDLINE | ID: mdl-39257166

ABSTRACT

In this research work, a Semantic-Preserved Generative Adversarial Network optimized by Piranha Foraging Optimization for Thyroid Nodule Classification in Ultrasound Images (SPGAN-PFO-TNC-UI) is proposed. Initially, ultrasound images are gathered from the DDTI dataset. Then the input image is sent to the pre-processing step. During the pre-processing stage, the Multi-Window Savitzky-Golay Filter (MWSGF) is employed to reduce the noise and improve the quality of the ultrasound (US) images. The pre-processed output is supplied to the Generalized Intuitionistic Fuzzy C-Means Clustering (GIFCMC), where the ultrasound image's Region of Interest (ROI) is segmented. The segmentation output is supplied to the Fully Numerical Laplace Transform (FNLT) to extract features, such as geometric features (solidity, orientation, roundness, major axis length, minor axis length, bounding box, and convex area) and morphological features (area, perimeter, aspect ratio, and AP ratio). The Semantic-Preserved Generative Adversarial Network (SPGAN) classifies the nodules in the image as benign or malignant. On its own, SPGAN does not include an optimization strategy for determining the best parameters to ensure accurate classification of thyroid nodules. Therefore, the Piranha Foraging Optimization (PFO) algorithm is proposed to improve the SPGAN classifier and accurately identify thyroid nodules. Metrics such as F-score, accuracy, error rate, precision, sensitivity, specificity, ROC, and computing time are examined. 
The proposed SPGAN-PFO-TNC-UI method attains 30.54%, 21.30%, 27.40%, and 18.92% higher precision and 26.97%, 20.41%, 15.09%, and 18.27% lower error rates compared with existing techniques, such as Thyroid detection and classification using DNN with Hybrid Meta-Heuristic and LSTM (TD-DL-HMH-LSTM), Quantum-Inspired convolutional neural networks for optimized thyroid nodule categorization (QCNN-OTNC), Thyroid nodules classification under Follow the Regularized Leader Optimization based Deep Neural Networks (CTN-FRL-DNN), and Automatic classification of ultrasound thyroids images using vision transformers and generative adversarial networks (ACUTI-VT-GAN), respectively.
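The MWSGF itself is not available in standard libraries; as a hedged illustration of its building block, a plain Savitzky-Golay filter from SciPy, applied along both image axes, shows the kind of denoising performed in the pre-processing stage (synthetic data, assumed window and polynomial order):

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(1)
image = np.sin(np.linspace(0, 3, 64))[:, None] * np.ones(64)  # smooth "tissue" structure
noisy = image + rng.normal(0, 0.2, image.shape)               # additive speckle-like noise

# Smooth along rows, then columns (single fixed window; the paper's filter
# combines multiple windows).
denoised = savgol_filter(noisy, window_length=11, polyorder=3, axis=0)
denoised = savgol_filter(denoised, window_length=11, polyorder=3, axis=1)

rms_before = np.sqrt(np.mean((noisy - image) ** 2))
rms_after = np.sqrt(np.mean((denoised - image) ** 2))
print("noise RMS before:", rms_before, "after:", rms_after)
```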

10.
Biometrics ; 80(3)2024 Jul 01.
Article in English | MEDLINE | ID: mdl-39282733

ABSTRACT

Benchmark dose analysis aims to estimate the level of exposure to a toxin associated with a clinically significant adverse outcome and quantifies uncertainty using the lower limit of a confidence interval for this level. We develop a novel framework for benchmark dose analysis based on monotone additive dose-response models. We first introduce a flexible approach for fitting monotone additive models via penalized B-splines and Laplace-approximate marginal likelihood. A reflective Newton method is then developed that employs de Boor's algorithm for computing splines and their derivatives for efficient estimation of the benchmark dose. Finally, we develop a novel approach for calculating benchmark dose lower limits based on an approximate pivot for the nonlinear equation solved by the estimated benchmark dose. The favorable properties of this approach compared to the Delta method and a parametric bootstrap are discussed. We apply the new methods to make inferences about the level of prenatal alcohol exposure associated with clinically significant cognitive defects in children using data from six NIH-funded longitudinal cohort studies. Software to reproduce the results in this paper is available online and makes use of the novel semibmd R package, which implements the methods in this paper.
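The penalized B-spline building block can be sketched as follows (monotonicity constraint and the Laplace-approximate marginal likelihood are omitted; the smoothing parameter is fixed by assumption, and the data are synthetic):

```python
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 1, 80))
y = 1 / (1 + np.exp(-8 * (x - 0.5))) + rng.normal(0, 0.05, x.size)  # dose-response shape

k, n_basis = 3, 12
# Clamped knot vector: len(t) = n_basis + k + 1
t = np.r_[[0] * (k + 1), np.linspace(0, 1, n_basis - k + 1)[1:-1], [1] * (k + 1)]
B = BSpline(t, np.eye(n_basis), k)(x)       # design matrix, one column per basis
D = np.diff(np.eye(n_basis), n=2, axis=0)   # second-order difference (P-spline) penalty
lam = 1.0                                   # fixed smoothing parameter (assumption)
coef = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
fit = B @ coef
rmse = np.sqrt(np.mean((fit - y) ** 2))
print("RMSE of penalized fit:", rmse)
```

The benchmark dose is then the exposure at which this fitted curve crosses the benchmark response, which the paper locates with a reflective Newton method.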


Subject(s)
Dose-Response Relationship, Drug , Models, Statistical , Humans , Benchmarking , Female , Algorithms , Pregnancy , Prenatal Exposure Delayed Effects/chemically induced , Computer Simulation , Child , Data Interpretation, Statistical , Likelihood Functions
11.
Adv Mater ; : e2403316, 2024 Sep 17.
Article in English | MEDLINE | ID: mdl-39286894

ABSTRACT

Quick-drying fabrics, renowned for their rapid sweat evaporation, have witnessed various applications in strenuous exercise. Profiled fiber textiles exhibit enhanced quick-drying performance, which is attributed to the excellent wicking effect within fibrous bundles, facilitating the rapid transport of sweat. However, the evaporation process is not solely influenced by macroscopic liquid transport but also by microscopic liquid spreading on the fibers, where periodic liquid knots induced by spontaneous fluidic instability significantly reduce the evaporation area. Here, a cross-shaped profiled fiber with high off-circularity, featuring multiple concavities along the fiber's longitudinal axis, is developed; it enables the formation of a homogeneous thin liquid film on a single fiber without any periodic liquid knots. The high off-circularity cross-sections help overcome the Plateau-Rayleigh instability by tuning the Laplace pressure difference, further facilitated by capillary flow along the concave surface. The homogeneous thin liquid film on a single fiber is responsible for maximizing the evaporation area, resulting in excellent overall evaporation capacity. Consequently, fabrics made from such fibers exhibit rapid evaporation behavior, with evaporation rates ≈50% higher than those of cylindrical fabrics. It is envisioned that profiled fibers may provide inspiration for manipulating homogeneous liquid films for applications in fluid coatings and functional textiles.

12.
J Cardiovasc Dev Dis ; 11(8)2024 Aug 15.
Article in English | MEDLINE | ID: mdl-39195157

ABSTRACT

The clinical significance of measuring vessel wall thickness is widely acknowledged. Recent advancements have enabled high-resolution 3D scans of arteries and precise segmentation of their lumens and outer walls; however, most existing methods for assessing vessel wall thickness are 2D. Despite being valuable, the reproducibility and accuracy of 2D techniques depend on the extracted 2D slices. Additionally, these methods fail to fully account for variations in wall thickness in all dimensions. Furthermore, most existing approaches are difficult to extend to 3D, and their measurements lack spatial localization and are primarily confined to lumen boundaries. We advocate for a shift in perspective towards recognizing vessel wall thickness measurement as inherently a 3D challenge and propose adapting the Laplacian method as an outstanding alternative. The Laplacian method is implemented using convolutions, ensuring its efficient and rapid execution on deep learning platforms. Experiments using digital phantoms and vessel wall imaging data are conducted to showcase the accuracy, reproducibility, and localization capabilities of the proposed approach. The proposed method produces consistent outcomes that remain independent of centerlines and 2D slices. Notably, this approach is applicable in both 2D and 3D scenarios. It allows for voxel-wise quantification of wall thickness, enabling precise identification of wall volumes exhibiting abnormal wall thickness. Our research highlights the urgency of transitioning to 3D methodologies for vessel wall thickness measurement. Such a transition not only acknowledges the intricate spatial variations of vessel walls, but also opens doors to more accurate, localized, and insightful diagnostic insights.
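The convolution-based Laplacian idea can be sketched as follows (a simplified 2D illustration, not the authors' implementation): solve Laplace's equation between the two wall boundaries by repeated neighbor averaging expressed as a convolution, then read the local thickness off the gradient magnitude. On a straight slab of known height the method recovers that height everywhere.

```python
import numpy as np
from scipy.ndimage import convolve

H, W = 21, 40                  # slab "wall": inner boundary at row 0, outer at row H-1
u = np.zeros((H, W))
u[-1, :] = 1.0                 # outer boundary potential
kernel = np.array([[0, .25, 0], [.25, 0, .25], [0, .25, 0]])

for _ in range(5000):          # Jacobi iteration as a convolution
    u = convolve(u, kernel, mode="nearest")
    u[0, :], u[-1, :] = 0.0, 1.0   # re-impose Dirichlet boundary conditions

gy, gx = np.gradient(u)
thickness = 1.0 / np.hypot(gy, gx)[H // 2, W // 2]  # local thickness, grid units
print("recovered thickness:", thickness, "(expected ~", H - 1, ")")
```

In the full method this is done voxel-wise in 3D between the segmented lumen and outer wall, giving a spatially localized thickness map.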

13.
Ann N Y Acad Sci ; 1538(1): 98-106, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39091080

ABSTRACT

Scientific progress within the last few decades has revealed the functional morphology of an insect's sticky footpads-a compliant pad that secretes thin liquid films. However, the physico-chemical mechanisms underlying their adhesion remain elusive. Here, we explore these underlying mechanisms by simultaneously measuring adhesive force and contact geometry of the adhesive footpads of live, tethered Indian stick insects, Carausius morosus, spanning more than two orders of magnitude in body mass. We find that the adhesive force we measure is similar to the previous measurements that use a centrifuge. Our measurements afford us the opportunity to directly probe the adhesive stress in vivo and use existing theory on capillary adhesion to predict the surface tension of the secreted liquid and compare it to previous assumptions. From our predictions, we find that the surface tension required to generate the adhesive stresses we observed ranges between 0.68 and 12 mN m⁻¹. The low surface tension of the liquid would enhance the wetting of the stick insect's footpads and promote their ability to conform to various substrates. Our insights may inform the biomimetic design of capillary-based, reversible adhesives and motivate future studies on the physico-chemical properties of the secreted liquid.
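The inversion from measured stress to surface tension can be sketched with a standard thin-film capillary model (an assumption here; the paper's exact theory may differ): for a wetting film of thickness h and contact angle θ, the Laplace adhesive stress is approximately 2γcos(θ)/h, which can be solved for γ. All numbers below are illustrative, not the paper's measurements.

```python
import math

def surface_tension(stress_pa, film_h_m, theta_deg=0.0):
    """Invert the thin-film Laplace stress 2*gamma*cos(theta)/h for gamma."""
    return stress_pa * film_h_m / (2 * math.cos(math.radians(theta_deg)))

sigma = 1.0e3  # assumed adhesive stress, Pa
h = 100e-9     # assumed film thickness, 100 nm
gamma = surface_tension(sigma, h)
print(f"implied surface tension: {gamma * 1e3:.3f} mN/m")
```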


Subject(s)
Insecta , Surface Tension , Animals , Insecta/physiology , Adhesiveness , Capillaries/physiology , Biomechanical Phenomena
14.
Heliyon ; 10(14): e34061, 2024 Jul 30.
Article in English | MEDLINE | ID: mdl-39108875

ABSTRACT

This work presents an accurate and efficient method for solving a two-dimensional time-fractional Oldroyd-B fluid model. The proposed method couples the Laplace transform (LT) with a radial-basis-function-based local meshless method (LRBFM). The suggested numerical scheme first applies the LT, which transforms the given equation into an elliptic equation in LT space; it then utilizes the LRBFM to solve the transformed equation in LT space, and finally the solution is converted back into the time domain via the improved Talbot's scheme. Local meshless methods are widely recognized for scattered data interpolation and for solving PDEs in complex-shaped domains; their adaptability, simplicity, and ease of use have led to their popularity, and they require only the solution of a linear system of equations. The main objective of using the LT is to avoid the computation of the costly convolution integral in the time-fractional derivative and the effect of time stepping on the accuracy and stability of the numerical solution. The stability and convergence of the proposed numerical scheme are discussed. Further, the Ulam-Hyers (UH) stability of the proposed model is discussed. The accuracy and efficiency of the suggested numerical approach are demonstrated using numerical experiments on five different domains with regular node distributions.
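The Talbot inversion step can be illustrated with mpmath, whose `invertlaplace` implements a fixed-Talbot scheme (a basic variant, not the improved scheme used in the paper). Here we invert F(s) = 1/(s+1), whose exact inverse transform is exp(-t):

```python
import mpmath as mp

F = lambda s: 1 / (s + 1)  # Laplace-space solution (toy example)
t = 1.0
approx = mp.invertlaplace(F, t, method="talbot")  # numerical inverse at time t
print(approx, "vs exact", mp.e ** -t)
```

In the paper's scheme, F at each required s-node is not known in closed form but is obtained by an LRBFM solve of the transformed elliptic problem.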

15.
Am J Epidemiol ; 2024 Jul 10.
Article in English | MEDLINE | ID: mdl-38988237

ABSTRACT

The incubation period is of paramount importance in infectious disease epidemiology as it informs about the transmission potential of a pathogenic organism and helps to plan public health strategies to keep an epidemic outbreak under control. Estimation of the incubation period distribution from reported exposure times and symptom onset times is challenging as the underlying data is coarse. We develop a new Bayesian methodology using Laplacian-P-splines that provides a semi-parametric estimation of the incubation density based on a Langevinized Gibbs sampler. A finite mixture density smoother informs a set of parametric distributions via moment matching and an information criterion arbitrates between competing candidates. Algorithms underlying our method find a natural nest within the EpiLPS package, which has been extended to cover estimation of incubation times. Various simulation scenarios accounting for different levels of data coarseness are considered with encouraging results. Applications to real data on COVID-19, MERS and Mpox reveal results that are in alignment with what has been obtained in recent studies. The proposed flexible approach is an interesting alternative to classic Bayesian parametric methods for estimation of the incubation distribution.
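The moment-matching step mentioned above amounts to solving for parametric-family parameters that reproduce the smoothed density's moments. For a lognormal candidate (with illustrative moment values, roughly in the range reported for COVID-19 incubation, not the paper's estimates):

```python
import math

mean, var = 5.7, 10.9  # assumed moments of the estimated incubation density (days)

# Lognormal with matching first two moments:
#   mean = exp(mu + sigma^2/2),  var = (exp(sigma^2) - 1) * exp(2*mu + sigma^2)
sigma2 = math.log(1 + var / mean ** 2)
mu = math.log(mean) - sigma2 / 2
print(f"lognormal fit: mu = {mu:.3f}, sigma = {math.sqrt(sigma2):.3f}")
```

An information criterion then arbitrates between the lognormal, gamma, Weibull, and other matched candidates.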

16.
J Indian Soc Probab Stat ; 25: 17-45, 2024 Jun.
Article in English | MEDLINE | ID: mdl-39070705

ABSTRACT

Studies/trials assessing status and progression of periodontal disease (PD) usually focus on quantifying the relationship between the clustered (tooth within subjects) bivariate endpoints, such as probed pocket depth (PPD), and clinical attachment level (CAL) with the covariates. Although assumptions of multivariate normality can be invoked for the random terms (random effects and errors) under a linear mixed model (LMM) framework, violations of those assumptions may lead to imprecise inference. Furthermore, the response-covariate relationship may not be linear, as assumed under a LMM fit, and the regression estimates obtained therein do not provide an overall summary of the risk of PD, as obtained from the covariates. Motivated by a PD study on Gullah-speaking African-American Type-2 diabetics, we cast the asymmetric clustered bivariate (PPD and CAL) responses into a non-linear mixed model framework, where both random terms follow the multivariate asymmetric Laplace distribution (ALD). In order to provide a one-number risk summary, the possible non-linearity in the relationship is modeled via a single-index model, powered by polynomial spline approximations for index functions, and the normal mixture expression for ALD. To proceed with a maximum-likelihood inferential setup, we devise an elegant EM-type algorithm. Moreover, the large sample theoretical properties are established under some mild conditions. Simulation studies using synthetic data generated under a variety of scenarios were used to study the finite-sample properties of our estimators, and demonstrate that our proposed model and estimation algorithm can efficiently handle asymmetric, heavy-tailed data, with outliers. Finally, we illustrate our proposed methodology via application to the motivating PD study.
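SciPy offers no multivariate asymmetric Laplace distribution, but its univariate `laplace_asymmetric` illustrates the skewness the ALD random terms are meant to capture (κ = 1 recovers the symmetric Laplace; κ ≠ 1 skews the density):

```python
from scipy.stats import laplace_asymmetric

for kappa in (0.5, 1.0, 2.0):
    d = laplace_asymmetric(kappa)
    print(f"kappa={kappa}: mean={float(d.mean()):+.3f}, "
          f"skew={float(d.stats(moments='s')):+.3f}")
```

The normal-mixture representation of the ALD exploited in the EM algorithm writes such a variable as a normal with exponentially distributed mixing on its mean and variance.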

17.
Sci Total Environ ; 949: 174989, 2024 Nov 01.
Article in English | MEDLINE | ID: mdl-39053553

ABSTRACT

Queensland is the main coal mining state in Australia, where populations in coal mining areas have been historically exposed to coal mining emissions. Although a higher risk of chronic circulatory and respiratory diseases has been associated with coal mining globally, few studies have investigated these associations in the Queensland general population. This study estimates the association of coal production with hospitalisations for chronic circulatory and respiratory diseases in Queensland, considering spatial and temporal variations during 1997-2014. An ecological analysis used a Bayesian hierarchical spatiotemporal model to estimate the association of coal production with standardised rates of chronic circulatory and of chronic respiratory diseases, adjusting for sociodemographic factors and considering the spatial structure of Queensland's statistical areas (SA2) over the 18-year period. Two specifications, with and without a space-time interaction effect, were compared using the integrated nested Laplace approximation (INLA) approach. The posterior mean of the best-fit model was used to map the spatial, temporal and spatiotemporal trends of risk. The analysis considered 2,831,121 hospitalisation records. Coal mining was associated with a 4% (2.4-5.5) higher risk of hospitalisation for chronic respiratory diseases in the model with a space-time interaction effect, which had the best fit. An emerging higher risk of both chronic circulatory and respiratory diseases was identified in eastern areas and some coal-mining areas in central and southeast Queensland. There were important disparities in the spatiotemporal trend of risk between coal- and non-coal-mining areas for both chronic circulatory and respiratory diseases. Coal mining is associated with an increased risk of chronic respiratory diseases in the Queensland general population. Bayesian spatiotemporal analyses are robust methods to identify environmental determinants of morbidity in exposed populations. 
This methodology helps identify at-risk populations, which can be useful for supporting health decision-making. Future research is required to investigate the causal links between coal mining and these diseases.


Subject(s)
Bayes Theorem , Cardiovascular Diseases , Coal Mining , Hospitalization , Respiratory Tract Diseases , Queensland/epidemiology , Hospitalization/statistics & numerical data , Respiratory Tract Diseases/epidemiology , Humans , Cardiovascular Diseases/epidemiology , Environmental Exposure/statistics & numerical data , Chronic Disease/epidemiology , Respiration Disorders/epidemiology
18.
Sensors (Basel) ; 24(12)2024 Jun 11.
Article in English | MEDLINE | ID: mdl-38931573

ABSTRACT

The visual measurement of deep holes in composite material workpieces constitutes a critical step in the robotic assembly of aerospace components. The positioning accuracy of assembly holes significantly impacts the assembly quality of components. However, the complex texture of the composite material surface and mutual interference between the imaging of the inlet and outlet edges of deep holes significantly challenge hole detection. A visual measurement method for deep holes in composite materials based on the radial penalty Laplacian operator is proposed to address the issues by suppressing visual noise and enhancing the features of hole edges. Coupled with a novel inflection-point-removal algorithm, this approach enables the accurate detection of holes with a diameter of 10 mm and a depth of 50 mm in composite material components, achieving a measurement precision of 0.03 mm.
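The radial penalty Laplacian operator is specific to this paper; as a baseline illustration of the underlying idea, the plain Laplacian operator responds only at intensity edges, such as a hole boundary, while remaining zero in flat regions:

```python
import numpy as np
from scipy.ndimage import laplace

img = np.zeros((32, 32))
yy, xx = np.mgrid[:32, :32]
img[(yy - 16) ** 2 + (xx - 16) ** 2 < 8 ** 2] = 1.0  # bright "hole" disk

edge_response = np.abs(laplace(img))
print("response at the disk center (flat):", edge_response[16, 16])
print("maximum response (at the hole edge):", edge_response.max())
```

The paper's radial penalty then suppresses responses away from the expected circular geometry, separating the inlet edge from the interfering outlet edge.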

19.
Viruses ; 16(6)2024 Jun 03.
Article in English | MEDLINE | ID: mdl-38932198

ABSTRACT

Our study examines how dengue fever incidence is associated with spatial (demographic and socioeconomic) alongside temporal (environmental) factors at multiple scales in the city of Ibagué, located in the Andean region of Colombia. We used the dengue incidence in Ibagué from 2013 to 2018 to examine the associations with climate, socioeconomic, and demographic factors from the national census and satellite imagery at four levels of local spatial aggregation. We used geographically weighted regression (GWR) to identify the relevant socioeconomic and demographic predictors, and we then integrated them with environmental variables into hierarchical models using integrated nested Laplace approximation (INLA) to analyze the spatio-temporal interactions. Our findings show a significant effect of spatial variables across the different levels of aggregation, including human population density, gas and sewage connection, percentage of women and children, and percentage of population with a higher education degree. Lagged temporal variables displayed consistent patterns across all levels of spatial aggregation, with higher temperatures and lower precipitation at short lags showing an increase in the relative risk (RR). A comparative evaluation of the models at different levels of aggregation revealed that, while higher aggregation levels often yield a better overall model fit, finer levels offer more detailed insights into the localized impacts of socioeconomic and demographic variables on dengue incidence. Our results underscore the importance of considering macro- and micro-level factors in epidemiological modeling, and they highlight the potential for targeted public health interventions based on localized risk factor analyses. Notably, the intermediate levels emerged as the most informative, thereby balancing spatial heterogeneity and case distribution density, as well as providing a robust framework for understanding the spatial determinants of dengue.


Subject(s)
Dengue , Spatio-Temporal Analysis , Colombia/epidemiology , Dengue/epidemiology , Humans , Incidence , Socioeconomic Factors , Climate , Female , Male
20.
Commun Stat Theory Methods ; 53(13): 4819-4840, 2024.
Article in English | MEDLINE | ID: mdl-38895616

ABSTRACT

Two new nonconvex penalty functions - Laplace and arctan - were recently introduced in the literature to obtain sparse models for high-dimensional statistical problems. In this paper, we study the theoretical properties of Laplace and arctan penalized ordinary least squares linear regression models. We first illustrate the near-unbiasedness of the nonzero regression weights obtained by the new penalty functions, in the orthonormal design case. In the general design case, we present theoretical results in two asymptotic settings: (a) the number of features p is fixed but the sample size n → ∞, and (b) both n and p tend to infinity. The theoretical results shed light onto the differences between the solutions based on the new penalty functions and those based on existing convex and nonconvex Bridge penalty functions. Our theory also shows that both Laplace and arctan penalties satisfy the oracle property. Finally, we also present results from a brief simulation study illustrating the performance of Laplace and arctan penalties based on the gradient descent optimization algorithm.
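The near-unbiasedness discussed above is easy to see numerically. The following sketch (an illustrative subgradient-descent fit with fixed, assumed hyperparameters, not the authors' implementation) minimizes least squares plus the Laplace penalty λ(1 - exp(-|b|/γ)), whose gradient shrinks small coefficients strongly but is nearly flat for large ones:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 200, 10
X = rng.normal(size=(n, p))
beta_true = np.array([3.0, -2.0] + [0.0] * 8)
y = X @ beta_true + rng.normal(0, 0.5, n)

lam, gamma, lr = 5.0, 0.1, 2e-4  # assumed penalty and step-size settings
b = np.zeros(p)
for _ in range(5000):
    grad = X.T @ (X @ b - y)                                         # squared-error part
    grad += (lam / gamma) * np.exp(-np.abs(b) / gamma) * np.sign(b)  # Laplace penalty part
    b -= lr * grad

print("estimated weights:", np.round(b, 2))  # large weights near 3 and -2, rest near 0
```

Because exp(-|b|/γ) vanishes for large |b|, the two true signals are recovered nearly unbiased, in contrast to the uniform shrinkage of the lasso.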
