Results 1 - 20 of 809
1.
Biostatistics ; 2024 Aug 05.
Article in English | MEDLINE | ID: mdl-39103178

ABSTRACT

The under-5 mortality rate (U5MR), a critical health indicator, is typically estimated from household surveys in low- and middle-income countries. Spatio-temporal disaggregation of household survey data can lead to highly variable estimates of U5MR, necessitating the use of smoothing models which borrow information across space and time. The assumptions of common smoothing models may be unrealistic when certain time periods or regions are expected to have shocks in mortality relative to their neighbors, which can lead to oversmoothing of U5MR estimates. In this paper, we develop a spatial and temporal smoothing approach based on Gaussian Markov random field models which incorporate knowledge of these expected shocks in mortality. In a simulation study, we demonstrate the potential for these models to improve upon alternatives that do not incorporate knowledge of expected shocks. We apply these models to estimate U5MR in Rwanda at the national level from 1985 to 2019, a time period which includes the Rwandan civil war and genocide.
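
As a rough illustration of the idea (not the authors' implementation), the sketch below smooths a yearly mortality series with a second-order random-walk (RW2) penalty whose weight is relaxed across years flagged as shocks, so the fit can jump there instead of oversmoothing. The data, `lam`, and the shock downweight are all illustrative assumptions.

```python
# Minimal sketch: RW2 temporal smoothing with a relaxed penalty at shock years.
import numpy as np

def rw2_smooth(y, shock_idx, lam=50.0, shock_weight=0.01):
    n = len(y)
    # Second-difference operator: each row penalizes curvature at one point.
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    w = np.ones(n - 2)
    # Downweight penalty rows touching a shock year, permitting a local jump.
    for s in shock_idx:
        for i in range(max(0, s - 2), min(n - 2, s + 1)):
            w[i] = shock_weight
    K = D.T @ (w[:, None] * D)          # weighted RW2 precision
    return np.linalg.solve(np.eye(n) + lam * K, y)

years = np.arange(1985, 2020)
rng = np.random.default_rng(0)
truth = 120 - 2.0 * (years - 1985) + 60 * (years == 1994)  # shock-year spike
y = truth + rng.normal(0, 5, len(years))
smooth = rw2_smooth(y, shock_idx=[int(np.where(years == 1994)[0][0])])
```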

2.
ISA Trans ; 2024 Aug 05.
Article in English | MEDLINE | ID: mdl-39142930

ABSTRACT

A supervised probabilistic dynamic-controlled latent-variable (SPDCLV) model is proposed for online prediction and real-time optimisation of process quality indicators. Compared to existing probabilistic latent-variable models, the key advantage of the proposed method lies in explicitly modelling the dynamic causality from the manipulated inputs to the quality pattern. This is achieved using a well-designed, dynamic-controlled Bayesian network. Algorithms for expectation-maximisation, forward filtering, and backward smoothing are designed for learning the SPDCLV model. For engineering applications, a framework for pattern-based quality prediction and optimisation is proposed, under which pattern filtering and a pattern-based soft sensor are explored for online quality prediction. Quality optimisation can then be realised by directly controlling the pattern to the desired condition. Finally, case studies on both an industrial primary milling circuit and a numerical example illustrate the benefits of the SPDCLV method: it can fully model the process dynamics, effectively predict and optimise the quality indicators, and monitor the process.
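
For intuition, here is the forward-filter / backward-smoother pair the abstract refers to, in the linear-Gaussian special case with a control input u_t (a Kalman filter with control term plus an RTS smoother). The actual SPDCLV model is a more general Bayesian network; all matrices and data below are illustrative assumptions.

```python
# Sketch: Kalman filtering and RTS smoothing for a controlled latent model.
import numpy as np

def kalman_filter(y, u, A, B, C, Q, R, m0, P0):
    n, d = len(y), len(m0)
    m = np.zeros((n, d)); P = np.zeros((n, d, d))
    m_pred, P_pred = m0, P0
    for t in range(n):
        if t > 0:                                  # predict with control input
            m_pred = A @ m[t - 1] + B @ u[t]
            P_pred = A @ P[t - 1] @ A.T + Q
        S = C @ P_pred @ C.T + R                   # innovation covariance
        K = P_pred @ C.T @ np.linalg.inv(S)        # Kalman gain
        m[t] = m_pred + K @ (y[t] - C @ m_pred)
        P[t] = P_pred - K @ C @ P_pred
    return m, P

def rts_smoother(m, P, u, A, B, Q):
    n, d = m.shape
    ms, Ps = m.copy(), P.copy()
    for t in range(n - 2, -1, -1):                 # backward pass
        m_pred = A @ m[t] + B @ u[t + 1]
        P_pred = A @ P[t] @ A.T + Q
        G = P[t] @ A.T @ np.linalg.inv(P_pred)     # smoother gain
        ms[t] = m[t] + G @ (ms[t + 1] - m_pred)
        Ps[t] = P[t] + G @ (Ps[t + 1] - P_pred) @ G.T
    return ms, Ps

A = np.array([[0.9]]); B = np.array([[0.5]]); C = np.array([[1.0]])
Q = R = np.array([[0.1]]); m0 = np.zeros(1); P0 = np.eye(1)
u = np.random.default_rng(1).normal(size=(50, 1))
y = np.cumsum(u, 0) * 0.1 + 0.3                    # stand-in measurements
m, P = kalman_filter(y, u, A, B, C, Q, R, m0, P0)
ms, Ps = rts_smoother(m, P, u, A, B, Q)
```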

3.
Neural Netw ; 179: 106599, 2024 Aug 06.
Article in English | MEDLINE | ID: mdl-39142176

ABSTRACT

Dealing with high-dimensional problems has always been a key and challenging issue in the field of fuzzy systems. Traditional Takagi-Sugeno-Kang (TSK) fuzzy systems face the curse of dimensionality and high computational complexity when applied to high-dimensional data. To overcome these challenges, this paper proposes a novel approach for optimizing TSK fuzzy systems that integrates the spectral Dai-Yuan conjugate gradient (SDYCG) algorithm with the smoothing group L0 regularization technique. The smoothing group L0 regularization technique introduces sparsity, selects relevant features, and improves the generalization ability of the model. The SDYCG algorithm effectively accelerates convergence and enhances the learning performance of the network. Furthermore, we prove the weak and strong convergence of the new algorithm under the strong Wolfe criterion: the gradient norm of the error function with respect to the weight vector converges to zero, and the weight sequence approaches a fixed point.
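
The abstract does not give the authors' exact smoothing function, so the sketch below uses a common choice from the smoothed-L0 literature: each group's 0/1 "is-nonzero" indicator is replaced by 1 - exp(-||w_g||^2 / (2*sigma^2)), which is differentiable and tends to the true group-L0 count as sigma -> 0. Group layout and sigma are assumptions.

```python
# Hedged sketch of a smoothed group L0 penalty and its gradient.
import numpy as np

def smoothed_group_l0(w, groups, sigma=0.1):
    """w: weight vector; groups: list of index arrays, one per group."""
    return sum(1.0 - np.exp(-np.dot(w[g], w[g]) / (2 * sigma**2))
               for g in groups)

def smoothed_group_l0_grad(w, groups, sigma=0.1):
    grad = np.zeros_like(w)
    for g in groups:
        grad[g] = w[g] * np.exp(-np.dot(w[g], w[g]) / (2 * sigma**2)) / sigma**2
    return grad

w = np.array([0.0, 0.0, 0.8, -0.3, 0.0, 0.05])
groups = [np.array([0, 1]), np.array([2, 3]), np.array([4, 5])]
print(smoothed_group_l0(w, groups))   # roughly counts the active groups
```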

4.
Neural Netw ; 179: 106625, 2024 Aug 12.
Article in English | MEDLINE | ID: mdl-39168072

ABSTRACT

In this paper, a smoothing approximation-based adaptive neurodynamic approach is proposed for a nonsmooth resource allocation problem (NRAP) with multiple constraints. The smoothing approximation method is combined with multi-agent systems to avoid introducing set-valued subgradient terms, thereby facilitating the practical implementation of the neurodynamic approach. In addition, private inequality constraints are handled using an adaptive penalty technique, which eliminates the need for additional quantitative estimation of penalty parameters and significantly reduces the computational cost. Moreover, to reduce the impact of the smoothing approximation on the convergence of the neurodynamic approach, time-varying control parameters are introduced. Owing to the parallel computing characteristics of multi-agent systems, the proposed neurodynamic approach is completely distributed. Theoretical analysis shows that the state solution of the neurodynamic approach converges to the optimal solution of the NRAP. Finally, two application examples are used to validate the feasibility of the approach.
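
A centralized toy sketch of two ingredients named in the abstract, under stated assumptions: (i) a smoothing approximation of a nonsmooth cost (|x - a| ~ sqrt((x - a)^2 + mu^2)) with a time-varying smoothing parameter mu(t) decaying toward 0, and (ii) gradient dynamics projected onto a resource constraint sum(x) = d. The paper's method is distributed over multi-agent systems with adaptive penalties; this single-loop integration only shows the smoothing mechanism.

```python
# Toy sketch: smoothed nonsmooth cost + projected gradient flow on sum(x)=d.
import numpy as np

def grad_smoothed(x, a, mu):
    # d/dx [ sqrt((x - a)^2 + mu^2) ]  -- smooth surrogate of |x - a|
    return (x - a) / np.sqrt((x - a) ** 2 + mu ** 2)

a = np.array([1.0, 3.0, -2.0, 0.5])     # per-agent cost centers (illustrative)
d = 4.0                                  # total resource to allocate
x = np.full(4, d / 4)                    # feasible starting allocation
dt = 0.01
for k in range(20000):
    mu = 1.0 / (1.0 + 0.01 * k)          # time-varying smoothing parameter
    g = grad_smoothed(x, a, mu)
    g -= g.mean()                        # project onto {sum(x) = d}
    x -= dt * g
print(x, x.sum())                        # allocation still sums to d
```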

5.
Hum Brain Mapp ; 45(12): e26813, 2024 Aug 15.
Article in English | MEDLINE | ID: mdl-39185695

ABSTRACT

Advances in neuroimaging acquisition protocols and denoising techniques, along with increasing magnetic field strengths, have dramatically improved the temporal signal-to-noise ratio (tSNR) in functional magnetic resonance imaging (fMRI). This permits spatial resolution with submillimeter voxel sizes and ultrahigh temporal resolution and opens a route toward performing precision fMRI in the brains of individuals. Yet ultrahigh spatial and temporal resolution comes at a cost: it reduces tSNR and, therefore, the sensitivity to the blood oxygen level-dependent (BOLD) effect and other functional contrasts across the brain. Here we investigate the potential of various smoothing filters to improve BOLD sensitivity while preserving the spatial accuracy of activated clusters in single-subject analysis. We introduce adaptive-weight smoothing with optimized metrics (AWSOM), which addresses this challenge extremely well. AWSOM employs a local inference approach that is as sensitive as cluster-corrected inference of data smoothed with large Gaussian kernels, but it preserves spatial details across multiple tSNR levels. This is essential for examining whole-brain fMRI data because tSNR varies across the entire brain, depending on the distance of a brain region from the receiver coil, the type of setup, acquisition protocol, preprocessing, and resolution. We found that cluster correction in single subjects results in inflated family-wise error and false positive rates. AWSOM effectively suppresses false positives while remaining sensitive even to small clusters of activated voxels. Furthermore, it preserves signal integrity, that is, the relative activation strength of significant voxels, making it a valuable asset for a wide range of fMRI applications. Here we demonstrate these features and make AWSOM freely available to the research community for download.
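
AWSOM's optimized weighting metrics are not specified in the abstract; as a generic stand-in, the sketch below blends each voxel between unsmoothed and Gaussian-smoothed data with weights derived from local tSNR, so low-tSNR regions receive more smoothing. The kernel width, reference tSNR, and data are illustrative assumptions.

```python
# Generic sketch of tSNR-adaptive smoothing (not the AWSOM algorithm itself).
import numpy as np
from scipy.ndimage import gaussian_filter

def adaptive_smooth(vol, tsnr, fwhm_vox=3.0, tsnr_ref=40.0):
    sigma = fwhm_vox / 2.355                       # FWHM -> Gaussian sigma
    smoothed = gaussian_filter(vol, sigma)
    w = np.clip(tsnr / tsnr_ref, 0.0, 1.0)         # high tSNR -> keep detail
    return w * vol + (1.0 - w) * smoothed

rng = np.random.default_rng(2)
vol = rng.normal(size=(32, 32, 20))                # one fMRI volume (fake)
tsnr = np.linspace(5, 80, 20)[None, None, :] * np.ones((32, 32, 20))
out = adaptive_smooth(vol, tsnr)                   # more smoothing where tSNR low
```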


Subject(s)
Brain Mapping , Brain , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Humans , Magnetic Resonance Imaging/methods , Brain/diagnostic imaging , Brain/physiology , Brain Mapping/methods , Image Processing, Computer-Assisted/methods , Signal-To-Noise Ratio , Oxygen/blood , Cluster Analysis , Adult
6.
ACS Appl Mater Interfaces ; 16(29): 38744-38756, 2024 Jul 24.
Article in English | MEDLINE | ID: mdl-38981068

ABSTRACT

Glass ceramic (GC) is the most promising material for objective lenses in extreme ultraviolet lithography, which must meet subnanometer precision requirements characterized by low high-spatial-frequency surface roughness (HSFR). However, the HSFR of GC is typically degraded during ion beam figuring (IBF). Herein, a method for constructing molecular dynamics (MD) models of GC is presented, and the formation mechanisms of surface morphologies are investigated. The results indicated that the dot-like microstructure arises from the difference in erosion rate caused by the differing intrinsic properties of the ceramic phases (CPs) and glass phases (GPs). Further, the difference in the microstructure of the IBF surface under different beam angles was mainly caused by the difference between the two types of sputtering. Quantum mechanical calculations showed that the presence of interstitial atoms results in electron rearrangement and that electron localization can reduce CP stability. To obtain a homogeneous surface, the effects of beam parameters on the heterogeneous surface were systematically investigated based on the proposed MD model. A novel ion beam modification (IBM) method was then proposed and demonstrated by TEM and GIXRD. The range of ion beam smoothing parameters that could effectively converge the HSFR of the modified surface was determined through extensive experiments. Using the optimized beam parameters, an ultrathin homogeneous modified surface within 3 nm was obtained. The HSFR of GC smoothed by ion beam modification-assisted smoothing (IBMS) dropped from 0.348 to 0.090 nm, a 74% reduction. These results offer a deeper understanding of the morphology formation mechanisms of GC surfaces during ion beam processing and may point to a new approach for achieving ultrasmooth heterostructure surfaces down to the subnanometer scale.

7.
Sci Total Environ ; 948: 174843, 2024 Oct 20.
Article in English | MEDLINE | ID: mdl-39019285

ABSTRACT

Freshwater ecosystems offer a variety of ecosystem services, and water quality is essential information for understanding their environment, biodiversity, and functioning. Interpolation by smoothing methods is a widely used approach to obtain temporal and/or spatial patterns of water quality from sampled data. However, when these methods are applied to freshwater systems, ignoring terrestrial areas that act as physical barriers may affect the structure of spatial autocorrelation and introduce bias into the estimates. In this study, we applied stochastic partial differential equation (SPDE) smoothing methods with barriers to spatial interpolation and spatiotemporal interpolation of water quality indices (chemical oxygen demand, phosphate phosphorus, and nitrite nitrogen) in a freshwater system in Japan. Then, we compared the estimation bias and accuracy with those of conventional non-barrier models. The results showed that the estimation bias of spatial interpolation of snapshot data was improved by considering physical barriers (by 5.8 % for chemical oxygen demand, 22.5 % for phosphate phosphorus, and 21.6 % for nitrite nitrogen). The prediction accuracy was comparable to that of the non-barrier model. These results were consistent with the expectation that accounting for physical barriers would capture realistic spatial correlations and reduce estimation bias, but would increase the variance of the estimates due to the limited information that can be gained from the neighbourhood. On the other hand, for spatiotemporal smoothing, the barrier model was comparable to the non-barrier model in terms of both estimation bias and prediction accuracy. This may be due to the availability of information in the time direction for interpolation. These results demonstrate the advantage of considering barriers when the available data are limited, such as snapshot data. SPDE smoothing methods can be widely applied to the interpolation of various environmental and biological indices in river systems and are expected to be powerful tools for studying freshwater systems spatially and temporally.
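
The paper uses continuous SPDE barrier models; a discrete analogue conveys the core idea, under stated assumptions: water cells form a grid graph, land cells are removed, and unobserved water cells are filled by solving the graph Laplace equation, so information propagates only through water and never across land. The grid, mask, and observations below are synthetic.

```python
# Discrete sketch of barrier-aware interpolation via a graph Laplacian.
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import spsolve

def barrier_interpolate(values, observed, water):
    """values/observed/water: 2D arrays (float, bool, bool) of equal shape."""
    ny, nx = water.shape
    idx = -np.ones((ny, nx), dtype=int)
    free = np.argwhere(water & ~observed)          # cells to solve for
    for k, (i, j) in enumerate(free):
        idx[i, j] = k
    A = lil_matrix((len(free), len(free)))
    b = np.zeros(len(free))
    for k, (i, j) in enumerate(free):
        nbrs = [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= i + di < ny and 0 <= j + dj < nx and water[i + di, j + dj]]
        A[k, k] = max(len(nbrs), 1)                # guard isolated cells
        for (ii, jj) in nbrs:
            if observed[ii, jj]:
                b[k] += values[ii, jj]             # known neighbour
            else:
                A[k, idx[ii, jj]] = -1.0           # unknown neighbour
    out = values.copy()
    out[water & ~observed] = spsolve(A.tocsr(), b)
    return out

ny = nx = 20
water = np.ones((ny, nx), bool); water[:, 9:11] = False   # land strip barrier
observed = np.zeros_like(water); observed[::5, ::5] = water[::5, ::5]
vals = np.where(observed, np.fromfunction(lambda i, j: i / ny, (ny, nx)), 0.0)
filled = barrier_interpolate(vals, observed, water)       # no leakage over land
```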

8.
BMC Public Health ; 24(1): 1893, 2024 Jul 15.
Article in English | MEDLINE | ID: mdl-39010038

ABSTRACT

BACKGROUND: Fatal opioid-involved overdose rates in Massachusetts increased precipitously from 5.0 to 33.5 per 100,000 population between 1999 and 2022. METHODS: We used spatial rate smoothing techniques to identify persistent opioid-involved overdose fatality clusters at the ZIP Code Tabulation Area (ZCTA) level. Rate smoothing techniques were employed to identify locations of high fatal opioid overdose rates where population counts were low; in Massachusetts, this included areas with both sparse data and low population density. We used Local Indicators of Spatial Association (LISA) cluster analyses with the raw incidence rates and with Empirical Bayes smoothed rates to identify clusters from 2011 to 2021. We constructed measures of the socio-built environment and potentially inappropriate prescribing using principal components analysis (PCA). The resulting measures were used as covariates in Conditional Autoregressive Bayesian models that acknowledge spatial autocorrelation, to predict both whether a ZCTA was part of an opioid-involved cluster for fatal overdose rates and the number of times it was part of a cluster of high incidence rates. RESULTS: LISA clusters based on smoothed data identified whether a ZCTA was part of an opioid-involved fatality incidence cluster earlier in the study period than LISA clusters based on raw rates. PCA helped identify unique socio-environmental factors, such as minoritized populations and poverty, potentially inappropriate prescribing, access to amenities, and rurality, by combining socioeconomic, built-environment, and prescription variables that were highly correlated with each other. In all models except those that used raw rates to estimate whether a ZCTA was part of a high-fatality cluster, opioid overdose fatality clusters in Massachusetts had high percentages of Black and Hispanic residents and households experiencing poverty. The models fitted on Empirical Bayes LISA identified this phenomenon earlier in the study period than the raw-rate LISA. However, all the models identified minoritized populations and poverty as significant factors in predicting the persistence of a ZCTA being part of a high opioid overdose cluster during this period. CONCLUSION: Conducting spatially robust analyses may help inform policies to identify community-level risks for opioid-involved overdose deaths sooner than relying on raw incidence rates alone. The results can help inform policy makers and planners about locations of persistent risk.
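
The Empirical Bayes smoothing step can be sketched with the standard method-of-moments shrinkage estimator: small-population ZCTAs are pulled toward the overall rate before cluster detection, so unstable raw rates do not masquerade as hotspots. (Libraries such as PySAL's esda/libpysal provide this plus LISA; the counts below are synthetic.)

```python
# Method-of-moments Empirical Bayes rate smoothing (global shrinkage).
import numpy as np

def eb_smooth(events, population):
    raw = events / population
    theta = events.sum() / population.sum()            # overall rate
    s2 = np.average((raw - theta) ** 2, weights=population)
    var_prior = max(s2 - theta / np.mean(population), 0.0)
    # Shrinkage weight: large populations trust the raw rate,
    # small populations are pulled toward the global rate theta.
    w = var_prior / (var_prior + theta / population)
    return w * raw + (1 - w) * theta

rng = np.random.default_rng(3)
pop = rng.integers(200, 50000, size=100).astype(float)   # fake ZCTA populations
deaths = rng.poisson(pop * 33.5e-5).astype(float)
print(eb_smooth(deaths, pop)[:5] * 1e5)                  # rates per 100,000
```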


Subject(s)
Bayes Theorem , Opiate Overdose , Socioeconomic Factors , Spatial Analysis , Humans , Massachusetts/epidemiology , Risk Factors , Opiate Overdose/mortality , Opiate Overdose/epidemiology , Cluster Analysis , Health Services Accessibility/statistics & numerical data , Analgesics, Opioid/poisoning , Female , Adult , Male , Drug Overdose/mortality , Drug Overdose/epidemiology
9.
Front Public Health ; 12: 1359167, 2024.
Article in English | MEDLINE | ID: mdl-39022425

ABSTRACT

Nowadays, epidemiological modeling is applied to a wide range of communicable and non-communicable diseases, including AIDS, Ebola, influenza, dengue, malaria, and Zika. More recently, in the context of the last pandemic declared by the World Health Organization (WHO), several studies applied these models to SARS-CoV-2. Despite the increasing number of studies using spatial analysis, some constraints persist that prevent more complex modeling, such as capturing local epidemiological dynamics or the real spatial patterns and dynamics. Examples include the unavailability of: (i) epidemiological information, such as the frequency with which it is made available; (ii) sociodemographic and environmental factors (e.g., population density and population mobility) at the finer scales that influence the evolution patterns of infectious diseases; and (iii) case counts, which depend heavily on the degree of testing performed, often with severe territorial disparities and influenced by context factors. Moreover, delays in case reporting and the lack of quality control in epidemiological information introduce biases into the data, leaving many results subject to the ecological fallacy and making it difficult to identify causal relationships. Other important methodological limitations are the control of spatiotemporal dependence and the management of non-linearity, ergodicity, and related issues, which can introduce inconsistencies into the results. In addition, social contact is still difficult to quantify and hence to incorporate into modeling processes. This study aims to explore a modeling framework that can overcome some of these methodological limitations and allow more accurate epidemiological modeling. Based on Geographic Information Systems (GIS) and spatial analysis, our model is developed to identify groups of municipalities where population density (vulnerability) has a stronger relationship with incidence (hazard) and commuting movements (exposure). Specifically, our framework shows how to operate a model over data with no clear trend or seasonal pattern, making it suitable for short-term forecasting of cases based on few determinants. Our tested models provide a good alternative when explanatory data are scarce and the time component is not available, as they showed a good fit and good short-term forecasting ability.


Subject(s)
COVID-19 , SARS-CoV-2 , Spatio-Temporal Analysis , Humans , COVID-19/epidemiology , Epidemiological Models , Pandemics
10.
Sensors (Basel) ; 24(13)2024 Jun 23.
Article in English | MEDLINE | ID: mdl-39000855

ABSTRACT

Traditional methods for 3D rib reconstruction mainly rely on image processing techniques or deep learning segmentation models for rib extraction, followed by post-processing to achieve voxel-based reconstruction. However, these methods suffer from limited reconstruction accuracy and low computational efficiency. To overcome these limitations, this paper proposes a 3D rib reconstruction method based on point cloud adaptive smoothing and denoising. We converted voxel data from CT images to multi-attribute point cloud data, then applied point cloud adaptive smoothing and denoising methods to eliminate noise and non-rib points. Additionally, efficient 3D reconstruction and post-processing techniques were employed to achieve high-accuracy and comprehensive 3D rib reconstruction. Experiments demonstrated that, compared to voxel-based methods, the 3D rib models generated by the proposed method achieved a 40% improvement in reconstruction accuracy and twice the computational efficiency.
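
The paper's adaptive variant is not detailed in the abstract; a standard statistical outlier-removal step of the kind such pipelines use can be sketched as follows: points whose mean distance to their k nearest neighbours is far above the global average are treated as noise or non-rib points and dropped. The k, threshold, and stand-in data are assumptions.

```python
# Sketch of k-NN statistical outlier removal for a point cloud.
import numpy as np
from scipy.spatial import cKDTree

def remove_statistical_outliers(pts, k=16, std_mult=2.0):
    tree = cKDTree(pts)
    d, _ = tree.query(pts, k=k + 1)        # first column is the point itself
    mean_d = d[:, 1:].mean(axis=1)
    keep = mean_d < mean_d.mean() + std_mult * mean_d.std()
    return pts[keep]

rng = np.random.default_rng(4)
ribs = rng.normal(size=(5000, 3)) * [50, 5, 5]   # stand-in rib-like cloud
noise = rng.uniform(-200, 200, size=(200, 3))    # scattered non-rib points
clean = remove_statistical_outliers(np.vstack([ribs, noise]))
```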

11.
Sensors (Basel) ; 24(13)2024 Jul 05.
Article in English | MEDLINE | ID: mdl-39001161

ABSTRACT

This study aimed to measure the differences in commonly used summary acceleration metrics during elite Australian football games under three different data processing protocols (raw, custom-processed, manufacturer-processed). Estimates of distance, speed and acceleration were collected with a 10-Hz GNSS tracking technology device from fourteen matches of 38 elite Australian football players from one team. Raw and manufacturer-processed data were exported from the respective proprietary software, and two common summary acceleration metrics (number of efforts and distance within the medium/high-intensity zone) were calculated for the three processing methods. Linear mixed models were used to estimate the effect of the three data processing methods on the summary metrics. The main findings demonstrated substantial differences between the three processing methods: the manufacturer-processed acceleration data had the lowest reported distance (up to 184 times lower) and efforts (up to 89 times lower), followed by the custom-processed distance (up to 3.3 times lower) and efforts (up to 4.3 times lower), while raw data had the highest reported distance and efforts. The results indicated that different processing methods changed the metric output and, in turn, altered the quantification of the demands of a sport (volume, intensity and frequency of the metrics). Coaches, practitioners and researchers need to understand that various processing methods alter the summary metrics of acceleration data. By being informed about how these metrics are affected by processing methods, they can better interpret the available data and effectively tailor their training programs to match the demands of competition.
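
To see why processing choices move the summary metrics, the sketch below derives acceleration "efforts" from the same 10-Hz speed trace twice: once from raw speed and once after low-pass filtering. The 1-Hz cutoff and 1.5 m/s^2 threshold are illustrative assumptions, not the manufacturer's proprietary pipeline.

```python
# Illustration: effort counts depend strongly on the speed-processing step.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 10.0                                        # GNSS sample rate (Hz)
rng = np.random.default_rng(5)
t = np.arange(0, 60, 1 / fs)
speed = 3 + 2 * np.sin(0.3 * t) + rng.normal(0, 0.4, t.size)  # noisy speed

def count_efforts(acc, thr=1.5, min_dur=0.4):
    above = np.r_[False, acc > thr, False]       # pad so starts/ends pair up
    starts = np.where(np.diff(above.astype(int)) == 1)[0]
    ends = np.where(np.diff(above.astype(int)) == -1)[0]
    return int(np.sum((ends - starts) / fs >= min_dur))

acc_raw = np.gradient(speed, 1 / fs)
b, a = butter(4, 1.0 / (fs / 2))                 # 4th-order, 1-Hz low-pass
acc_filt = np.gradient(filtfilt(b, a, speed), 1 / fs)
print(count_efforts(acc_raw), count_efforts(acc_filt))  # counts differ markedly
```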

12.
PeerJ ; 12: e17408, 2024.
Article in English | MEDLINE | ID: mdl-38948203

ABSTRACT

Background: Over the last few decades, diabetes-related mortality risks (DRMR) have increased in Florida. Although there is evidence of geographic disparities in pre-diabetes and diabetes prevalence, little is known about disparities of DRMR in Florida. Understanding these disparities is important for guiding control programs and allocating health resources to the communities most in need. Therefore, the objective of this study was to investigate geographic disparities and temporal changes of DRMR in Florida. Methods: Retrospective mortality data for deaths that occurred from 2010 to 2019 were obtained from the Florida Department of Health. International Classification of Diseases, Tenth Revision (ICD-10) codes E10-E14 were used to identify diabetes-related deaths. County-level mortality risks were computed and presented as the number of deaths per 100,000 persons. Spatial Empirical Bayesian (SEB) smoothing was performed to adjust for spatial autocorrelation and the small number problem. High-risk spatial clusters of DRMR were identified using Tango's flexible spatial scan statistics. Geographic distribution and high-risk mortality clusters were displayed using ArcGIS, whereas seasonal patterns were visualized in Excel. Results: A total of 54,684 deaths were reported during the study period. There was an increasing temporal trend as well as seasonal patterns in diabetes mortality risks, with high risks occurring during the winter. The highest mortality risk (8.1 per 100,000 persons) was recorded during the winter of 2018, while the lowest (6.1 per 100,000 persons) was in the fall of 2010. County-level SEB smoothed mortality risks varied by geographic location, ranging from 12.6 to 81.1 deaths per 100,000 persons. Counties in the northern and central parts of the state tended to have high mortality risks, whereas southern counties consistently showed low mortality risks. Consistent with this geographic distribution, significant high-risk spatial clusters were also identified in the central and northern parts of Florida. Conclusion: Geographic disparities of DRMR exist in Florida, with high-risk spatial clusters observed in rural central and northern areas of the state. There is also evidence of both an increasing temporal trend and winter peaks of DRMR. These findings are helpful for guiding the allocation of resources to control the disease, reduce disparities, and improve population health.
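
Spatial Empirical Bayes differs from global EB shrinkage in that each county's raw rate is shrunk toward the rate of its own contiguity neighbourhood rather than the state-wide rate; a sketch of that variant follows, with synthetic neighbour lists and counts as assumptions.

```python
# Sketch of Spatial Empirical Bayes smoothing toward neighbourhood rates.
import numpy as np

def spatial_eb(events, pop, neighbors):
    smoothed = np.empty(len(events))
    for i, nbrs in enumerate(neighbors):
        ring = [i] + list(nbrs)                          # county + neighbours
        theta_i = events[ring].sum() / pop[ring].sum()   # local reference rate
        raw = events[i] / pop[i]
        s2 = np.average((events[ring] / pop[ring] - theta_i) ** 2,
                        weights=pop[ring])
        var_prior = max(s2 - theta_i / pop[ring].mean(), 0.0)
        w = var_prior / (var_prior + theta_i / pop[i])   # shrinkage weight
        smoothed[i] = w * raw + (1 - w) * theta_i
    return smoothed

events = np.array([3.0, 40.0, 12.0, 1.0])
pop = np.array([4e3, 5e5, 9e4, 2e3])
neighbors = [[1], [0, 2], [1, 3], [2]]              # toy contiguity structure
print(spatial_eb(events, pop, neighbors) * 1e5)     # rates per 100,000
```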


Subject(s)
Diabetes Mellitus , Humans , Florida/epidemiology , Retrospective Studies , Diabetes Mellitus/mortality , Diabetes Mellitus/epidemiology , Female , Male , Bayes Theorem , Health Status Disparities , Middle Aged , Risk Factors , Seasons , Aged , Adult
13.
J Appl Stat ; 51(7): 1287-1317, 2024.
Article in English | MEDLINE | ID: mdl-38835826

ABSTRACT

The area of functional principal component analysis (FPCA) has seen relatively few contributions from Bayesian inference. A Bayesian method for FPCA is developed for both continuous and binary observations with sparse and irregularly spaced data. In the proposed Markov chain Monte Carlo (MCMC) method, a Gibbs sampler approach is adopted to update the different variables based on their conditional posterior distributions. In FPCA, a set of eigenfunctions is specified on the Stiefel manifold, and samples are drawn from a Langevin-Bingham matrix variate distribution. Penalized splines are used to model the mean trajectory and eigenfunction trajectories in generalized functional mixed models, and the proposed model is cast into a mixed-effects model framework for Bayesian inference. To determine the number of principal components, a reversible jump Markov chain Monte Carlo (RJ-MCMC) algorithm is implemented. Four different simulation settings demonstrate competitive performance against non-Bayesian approaches to FPCA. Finally, the proposed method is illustrated through an analysis of body mass index (BMI) data by gender and ethnicity.
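
The penalized-spline building block used for the mean and eigenfunction trajectories can be sketched outside the MCMC machinery (which a snippet cannot reproduce): a penalized least-squares fit of a clamped cubic B-spline basis to sparse, irregular observations. Knot layout and penalty strength are assumptions; `BSpline.design_matrix` requires SciPy >= 1.8.

```python
# Sketch: penalized-spline fit of a mean trajectory to sparse functional data.
import numpy as np
from scipy.interpolate import BSpline

k = 3                                               # cubic splines
interior = np.linspace(0, 1, 8)[1:-1]
t = np.r_[[0] * (k + 1), interior, [1] * (k + 1)]   # clamped knot vector
rng = np.random.default_rng(6)
x = np.sort(rng.uniform(0, 1, 120))                 # sparse, irregular times
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, x.size)

B = BSpline.design_matrix(x, t, k).toarray()        # n x n_basis design matrix
nb = B.shape[1]
D = np.diff(np.eye(nb), 2, axis=0)                  # second-difference penalty
lam = 1.0
coef = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
mean_traj = BSpline(t, coef, k)                     # smooth mean trajectory
```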

14.
PeerJ Comput Sci ; 10: e2011, 2024.
Article in English | MEDLINE | ID: mdl-38855226

ABSTRACT

Mixup is an effective data augmentation method that generates new augmented samples as linear combinations of different original samples. However, if there is noise or there are aberrant features in the original samples, mixup may propagate them to the augmented samples, making the model over-sensitive to these outliers. To solve this problem, this paper proposes a new mixup method called AMPLIFY. This method uses the attention mechanism of the Transformer itself to reduce the influence of noise and aberrant values in the original samples on the prediction results, without adding trainable parameters, and at very low computational cost, thereby avoiding the high resource consumption of common mixup methods such as Sentence Mixup. The experimental results show that, at a smaller computational cost, AMPLIFY outperforms other mixup methods in text classification tasks on seven benchmark datasets, providing new ideas and approaches for further improving the performance of pre-trained models based on the attention mechanism, such as BERT, ALBERT, RoBERTa, and GPT. Our code can be obtained at https://github.com/kiwi-lilo/AMPLIFY.
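
For reference, vanilla mixup looks like the sketch below: new samples are convex combinations of embedding/label pairs, which is exactly where noisy features in either parent can propagate. AMPLIFY instead carries out the mixing via the Transformer's attention (see the authors' repository); this snippet only shows the baseline being improved upon, with illustrative shapes.

```python
# Vanilla mixup on a batch of embeddings (baseline, not AMPLIFY itself).
import numpy as np

def mixup(x, y_onehot, alpha=0.2, rng=np.random.default_rng(7)):
    lam = rng.beta(alpha, alpha)                 # mixing coefficient
    perm = rng.permutation(len(x))               # random partner for each sample
    x_mix = lam * x + (1 - lam) * x[perm]
    y_mix = lam * y_onehot + (1 - lam) * y_onehot[perm]
    return x_mix, y_mix

x = np.random.default_rng(8).normal(size=(32, 128))    # batch of embeddings
y = np.eye(4)[np.random.default_rng(9).integers(0, 4, 32)]
x_mix, y_mix = mixup(x, y)
```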

15.
Materials (Basel) ; 17(12)2024 Jun 18.
Article in English | MEDLINE | ID: mdl-38930346

ABSTRACT

Pinch milling is a new technique for machining slender, long blades that can simultaneously improve machining quality and efficiency. However, planning the orientations of the two cutters is a major challenge due to the irregular blade surfaces and the structural constraints of nine-axis machine tools. In this paper, a method for determining smooth twin-tool orientations is proposed for thin-walled blades in pinch milling. Considering the processing status of the two cutters and the workpiece, the feasible domain of the twin-tool axis vector and its characterization method are defined. An evaluation algorithm combining global and local optimization is proposed, and a smoothing algorithm is explored within the feasible domain along the two tool paths. Finally, a set of smoothly aligned tool orientations is generated, with overall smoothness that is nearly globally optimal. A preliminary simulation of the proposed algorithm on a turbine blade model shows that the planned tool orientations are stable, smooth, and well formed, avoiding collision interference and ultimately improving the machining accuracy of blades made of difficult-to-machine materials.
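
A much-simplified stand-in for orientation smoothing (the paper's method optimizes within a feasible domain under collision constraints, which this ignores): a windowed average of neighbouring unit tool-axis vectors, re-normalized so each result is again a valid orientation. Window size and passes are assumptions.

```python
# Toy sketch: smoothing a sequence of unit tool-axis vectors.
import numpy as np

def smooth_orientations(axes, half_window=2, n_pass=3):
    axes = axes / np.linalg.norm(axes, axis=1, keepdims=True)
    for _ in range(n_pass):
        out = axes.copy()
        for i in range(len(axes)):
            lo = max(0, i - half_window)
            hi = min(len(axes), i + half_window + 1)
            v = axes[lo:hi].mean(axis=0)         # average neighbouring axes
            out[i] = v / np.linalg.norm(v)       # re-normalize to unit length
        axes = out
    return axes

rng = np.random.default_rng(10)
raw = np.tile([0.0, 0.0, 1.0], (50, 1)) + rng.normal(0, 0.1, (50, 3))
smoothed = smooth_orientations(raw)              # jitter-free axis sequence
```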

16.
Stat Med ; 43(19): 3578-3594, 2024 Aug 30.
Article in English | MEDLINE | ID: mdl-38881189

ABSTRACT

In health and clinical research, medical indices (eg, BMI) are commonly used for monitoring and/or predicting health outcomes of interest. While single-index modeling can be used to construct such indices, methods to use single-index models for analyzing longitudinal data with multiple correlated binary responses are underdeveloped, although there are abundant applications with such data (eg, prediction of multiple medical conditions based on longitudinally observed disease risk factors). This article aims to fill the gap by proposing a generalized single-index model that can incorporate multiple single indices and mixed effects for describing observed longitudinal data of multiple binary responses. Compared to the existing methods focusing on constructing marginal models for each response, the proposed method can make use of the correlation information in the observed data about different responses when estimating different single indices for predicting response variables. Estimation of the proposed model is achieved by using a local linear kernel smoothing procedure, together with methods designed specifically for estimating single-index models and traditional methods for estimating generalized linear mixed models. Numerical studies show that the proposed method is effective in various cases considered. It is also demonstrated using a dataset from the English Longitudinal Study of Aging project.
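
The estimation rests on local linear kernel smoothing; the one-dimensional textbook version is sketched below (the paper applies it inside a generalized single-index mixed model, which is not reproduced here). Bandwidth and data are illustrative.

```python
# Sketch: local linear kernel regression at a point x0.
import numpy as np

def local_linear(x0, x, y, h=0.3):
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)       # Gaussian kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])
    WX = w[:, None] * X
    beta = np.linalg.solve(X.T @ WX, WX.T @ y)   # weighted least squares
    return beta[0]                               # fitted value at x0

rng = np.random.default_rng(11)
x = rng.uniform(-2, 2, 300)
y = np.tanh(2 * x) + rng.normal(0, 0.2, x.size)  # a single-index-style link
grid = np.linspace(-2, 2, 9)
fit = [local_linear(g, x, y) for g in grid]      # smooth link estimate
```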


Subject(s)
Models, Statistical , Longitudinal Studies , Humans , Linear Models , Computer Simulation , Data Interpretation, Statistical
17.
Brief Bioinform ; 25(4)2024 May 23.
Article in English | MEDLINE | ID: mdl-38877886

ABSTRACT

Single-cell sequencing has revolutionized our ability to dissect the heterogeneity within tumor populations. In this study, we present LoRA-TV (Low Rank Approximation with Total Variation), a novel method for clustering tumor cells based on the read depth profiles derived from single-cell sequencing data. Traditional analysis pipelines process read depth profiles of each cell individually. By aggregating shared genomic signatures distributed among individual cells using low-rank optimization and robust smoothing, the proposed method enhances clustering performance. Results from analyses of both simulated and real data demonstrate its effectiveness compared with state-of-the-art alternatives, as supported by improvements in the adjusted Rand index and computational efficiency.
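
One alternating "low rank + total variation" pass in the spirit of the method's name can be sketched as follows (the authors' exact optimization is in the paper): a truncated SVD pools read-depth structure shared across cells, then a TV-denoising step smooths each cell's profile toward piecewise-constant segments, which suits copy-number-like signals. skimage's Chambolle TV solver is used here as a stand-in proximal step; rank, weight, and data are assumptions.

```python
# Sketch: alternating low-rank projection and per-cell TV smoothing.
import numpy as np
from skimage.restoration import denoise_tv_chambolle

def lowrank_tv(X, rank=3, tv_weight=0.5, n_iter=5):
    Z = X.copy()
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        Z = (U[:, :rank] * s[:rank]) @ Vt[:rank]        # low-rank projection
        Z = np.stack([denoise_tv_chambolle(row, weight=tv_weight)
                      for row in Z])                    # per-cell TV smoothing
    return Z

rng = np.random.default_rng(12)
base = np.repeat([2.0, 3.0, 1.0, 2.0], 50)              # shared segment profile
X = base + rng.normal(0, 0.7, size=(40, 200))           # 40 noisy cell profiles
Z = lowrank_tv(X)                                       # denoised, segmented
```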


Subject(s)
Neoplasms , Single-Cell Analysis , Single-Cell Analysis/methods , Humans , Neoplasms/genetics , Neoplasms/pathology , Cluster Analysis , Algorithms , Computational Biology/methods , High-Throughput Nucleotide Sequencing/methods , Genomics/methods
18.
Sci Total Environ ; 943: 173748, 2024 Sep 15.
Article in English | MEDLINE | ID: mdl-38857793

ABSTRACT

In many coastal cities around the world, continuing water degradation threatens the living environment of humans and aquatic organisms. To assess and control water pollution, this study estimated the Biochemical Oxygen Demand (BOD) concentration of Hong Kong's marine waters using remote sensing and an improved machine learning (ML) method. The scheme was derived from four ML algorithms (RBF, SVR, RF, XGB) and calibrated using a large amount (N > 1000) of in-situ BOD5 data. Three types of models were trained and evaluated, based on labeled datasets with different preprocessing: the original BOD5, log10(BOD5), and label distribution smoothing (LDS). The results highlight the superior potential of the LDS-based model to improve BOD5 estimation by handling an imbalanced training dataset. Additionally, XGB and RF outperformed RBF and SVR when the model was developed using log10(BOD5) or LDS(BOD5). Over two decades, the autumn (Sep. to Nov.) BOD5 concentration of Hong Kong's marine waters shows a downward trend, with significant decreases in Deep Bay, Western Buffer, Victoria Harbour, Eastern Buffer, Junk Bay, Port Shelter, and the Tolo Harbour and Channel. Principal component analysis revealed that nutrient levels were the predominant factor in Victoria Harbour and the interior of Deep Bay, while chlorophyll-related and physical parameters were dominant in Southern, Mirs Bay, Northwestern, and the outlet of Deep Bay. LDS provides a new perspective for improving ML-based water quality estimation by alleviating imbalance in the labeled dataset. Overall, the remotely sensed BOD5 can offer insight into the spatial-temporal distribution of organic matter in Hong Kong's coastal waters and valuable guidance for pollution control.
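
Label distribution smoothing, as used in the imbalanced-regression literature, can be sketched as follows: convolve the empirical label histogram with a Gaussian kernel to obtain an "effective" density, then reweight training samples by its inverse so rare high-concentration samples count more in the loss. Bin count and kernel width are illustrative assumptions.

```python
# Sketch of Label Distribution Smoothing (LDS) sample weights.
import numpy as np
from scipy.ndimage import gaussian_filter1d

def lds_weights(labels, n_bins=50, sigma=2.0):
    hist, edges = np.histogram(labels, bins=n_bins)
    eff_density = gaussian_filter1d(hist.astype(float), sigma=sigma)
    idx = np.clip(np.digitize(labels, edges) - 1, 0, n_bins - 1)
    w = 1.0 / np.maximum(eff_density[idx], 1e-8)    # inverse effective density
    return w * len(w) / w.sum()                     # normalize to mean weight 1

rng = np.random.default_rng(13)
bod5 = rng.lognormal(mean=0.0, sigma=0.6, size=1000)   # skewed stand-in labels
weights = lds_weights(bod5)                            # feed to the ML loss
```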


Subject(s)
Environmental Monitoring , Machine Learning , Seawater , Hong Kong , Environmental Monitoring/methods , Seawater/chemistry , Remote Sensing Technology , Biological Oxygen Demand Analysis , Water Pollution/statistics & numerical data , Water Pollution/analysis , Water Pollutants, Chemical/analysis
19.
Sensors (Basel) ; 24(11)2024 May 23.
Article in English | MEDLINE | ID: mdl-38894126

ABSTRACT

Prefabricated construction has pioneered a new model in the construction industry, where prefabricated component modules are produced in factories and assembled on-site by construction workers, resulting in a highly efficient and convenient production process. Within the construction industry value chain, the smoothing and roughening of precast concrete components are critical processes. Currently, these tasks are predominantly performed manually, often failing to achieve the desired level of precision. This paper designs and develops a robotic system for smoothing and roughening precast concrete surfaces, along with a multi-degree-of-freedom integrated intelligent end-effector for smoothing and roughening. Point-to-point path planning methods are employed to achieve comprehensive path planning for both smoothing and roughening, enhancing the diversity of textural patterns using B-spline curves. In the presence of embedded obstacles, a biologically inspired neural network method is introduced for precise smoothing operation planning, and the A* algorithm is incorporated to enable the robot's escape from dead zones. Experimental validation further confirms the feasibility of the entire system and the accuracy of the machining path planning methods. The experimental results demonstrate that the proposed system meets the precision requirements for smoothing and offers diversity in roughening, affirming its practicality in the precast concrete process and expanding the automation level and application scenarios of robots in the field of prefabricated construction.
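
The B-spline step used to smooth and diversify planned paths can be sketched with scipy's parametric spline fit: a jagged point-to-point coverage path becomes a smooth curve, with the smoothing factor s trading fidelity for smoothness. The panel dimensions, waypoints, and s are illustrative assumptions.

```python
# Sketch: B-spline smoothing of a point-to-point coverage path.
import numpy as np
from scipy.interpolate import splprep, splev

rng = np.random.default_rng(14)
# Boustrophedon path over a 1 m x 0.5 m panel, with waypoint jitter.
xs = np.tile([0.0, 1.0, 1.0, 0.0], 3) + rng.normal(0, 0.01, 12)
ys = np.repeat(np.linspace(0, 0.5, 6), 2)
tck, u = splprep([xs, ys], s=1e-3, k=3)         # cubic B-spline fit
u_fine = np.linspace(0, 1, 400)
x_s, y_s = splev(u_fine, tck)                   # smooth machining path
```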

20.
Sensors (Basel) ; 24(12)2024 Jun 18.
Article in English | MEDLINE | ID: mdl-38931718

ABSTRACT

In dynamic environments, real-time trajectory planners are required to generate smooth trajectories. However, trajectory planners based on real-time sampling often produce jerky trajectories that necessitate post-processing steps for smoothing. Existing local smoothing methods may result in trajectories that collide with obstacles due to the lack of a direct connection between the smoothing process and trajectory optimization. To address this limitation, this paper proposes a novel trajectory-smoothing method that considers obstacle constraints in real time. By introducing virtual attractive forces from original trajectory points and virtual repulsive forces from obstacles, the resultant force guides the generation of smooth trajectories. This approach enables parallel execution with the trajectory-planning process and requires low computational overhead. Experimental validation in different scenarios demonstrates that the proposed method not only achieves real-time trajectory smoothing but also effectively avoids obstacles.
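
A minimal sketch of the virtual-force idea, under stated assumptions: every waypoint feels an attractive pull toward its original position and toward its path neighbours (which smooths the path), plus a repulsive push from nearby obstacles (which keeps the smoothed path collision-free). Gains, radii, and iteration counts below are illustrative, not the paper's.

```python
# Sketch: force-based trajectory smoothing with obstacle repulsion.
import numpy as np

def force_smooth(path, obstacles, r_obs=0.5, k_att=0.2, k_smooth=0.4,
                 k_rep=1.0, n_iter=200):
    p = path.copy()
    for _ in range(n_iter):
        f = k_att * (path - p)                               # pull to original
        f[1:-1] += k_smooth * (p[:-2] + p[2:] - 2 * p[1:-1])  # neighbour pull
        for ob in obstacles:                                 # obstacle push
            d = p - ob
            dist = np.linalg.norm(d, axis=1, keepdims=True)
            mask = dist < r_obs
            f += np.where(mask, k_rep * (r_obs - dist) * d / (dist + 1e-9), 0)
        p[1:-1] += f[1:-1]                                   # endpoints fixed
    return p

path = np.column_stack([np.linspace(0, 5, 40),
                        0.3 * np.sin(np.linspace(0, 6 * np.pi, 40))])  # jerky
smoothed = force_smooth(path, obstacles=np.array([[2.5, 0.0]]))
```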
