Results 1 - 20 of 5,182
1.
J R Soc Interface ; 21(214): 20230732, 2024 May.
Article in English | MEDLINE | ID: mdl-38774958

ABSTRACT

The concept of an autocatalytic network of reactions that can form and persist, starting from just an available food source, has been formalized by the notion of a reflexively autocatalytic and food-generated (RAF) set. The theory and algorithmic results concerning RAFs have been applied to a range of settings, from metabolic questions arising at the origin of life, to ecological networks, and cognitive models in cultural evolution. In this article, we present new structural and algorithmic results concerning RAF sets, by studying more complex modes of catalysis that allow certain reactions to require multiple catalysts (or to not require catalysis at all), and discuss the differing ways catalysis has been viewed in the literature. We also focus on the structure and analysis of minimal RAFs and derive structural results and polynomial-time algorithms. We then apply these new methods to a large metabolic network to gain insights into possible biochemical scenarios near the origin of life.
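To make the RAF notion concrete, here is a minimal Python sketch of the standard iterative pruning that computes the maximal RAF of a catalytic reaction system. The reaction and molecule names are hypothetical; an empty catalyst set models an uncatalysed reaction, one of the catalysis modes discussed in the article.

```python
def closure(food, reactions):
    """Molecules reachable from the food set, ignoring catalysis."""
    avail = set(food)
    grew = True
    while grew:
        grew = False
        for reactants, products, _ in reactions.values():
            if set(reactants) <= avail and not set(products) <= avail:
                avail |= set(products)
                grew = True
    return avail

def max_raf(food, reactions):
    """Iteratively discard reactions that are not food-generated or lack
    an available catalyst; the fixed point is the maximal RAF (possibly empty).
    reactions: name -> (reactants, products, catalysts); empty catalysts
    means the reaction needs no catalyst."""
    current = dict(reactions)
    changed = True
    while changed:
        changed = False
        avail = closure(food, current)
        for name in list(current):
            reactants, _, catalysts = current[name]
            if not set(reactants) <= avail or (catalysts and not set(catalysts) & avail):
                del current[name]
                changed = True
    return set(current)
```

Removing one reaction can strand others, which is why the loop must re-run the closure until nothing changes; this repeated pruning is what keeps the algorithm polynomial-time.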


Subjects
Algorithms, Catalysis, Models, Biological, Biochemistry, Origin of Life
2.
Materials (Basel) ; 17(9)2024 May 06.
Article in English | MEDLINE | ID: mdl-38730966

ABSTRACT

In this article, the practical issues connected with guided wave measurement are studied: (1) the influence of the gluing of PZT plate actuators (NAC2013) on generated elastic wave propagation, (2) the repeatability of PZT transducer attachment, and (3) the assessment of the possibility of comparing the results of Laser Doppler Vibrometry (LDV) measurements performed on different 2D samples. These questions are crucial when assessing whether the guided wave phenomenon can be applied to structural health-monitoring systems, e.g., in civil engineering. In the examination, laboratory tests on the web of steel I-section specimens were conducted. The size and shape of the specimens were chosen to resemble the elements typically used in civil engineering structures. It was shown that the highest amplitude of the generated wave was obtained when the exciters were glued using wax; however, the repeatability and durability of this connection type were the weakest, so it was not suitable for practical use outside the laboratory. Permanent glue gave a stable connection between the exciter and the specimen, but the generated signal had the lowest amplitude. A new procedure dedicated to the objective analysis and comparison of elastic waves propagating on the surfaces of different specimens is proposed, in which genetic algorithms determine a new coordinate system that makes it possible to assess the quality of wave propagation in different directions.

3.
J Clin Med ; 13(9)2024 Apr 29.
Article in English | MEDLINE | ID: mdl-38731156

ABSTRACT

Background: The drug reaction with eosinophilia and systemic symptoms (DRESS) syndrome represents a severe form of drug hypersensitivity reaction characterized by significant morbidity, mortality, and long-term sequelae, coupled with limited therapeutic avenues. Accurate identification of the causative drug(s) is paramount for acute management, exploration of safe therapeutic alternatives, and prevention of future occurrences. However, the absence of a standardized diagnostic test and a specific causality algorithm tailored to DRESS poses a significant challenge in its clinical management. Methods: We conducted a retrospective case-control study involving 37 DRESS patients to validate a novel causality algorithm, the ALDRESS, designed explicitly for this syndrome, comparing it against the current standard algorithm, SEFV. Results: The ALDRESS algorithm showcased superior performance, exhibiting an 85.7% sensitivity and 93% specificity with comparable negative predictive values (80.6% vs. 97%). Notably, the ALDRESS algorithm yielded a substantially higher positive predictive value (75%) compared to SEFV (51.40%), achieving an overall accuracy rate of 92%. Conclusions: Our findings underscore the efficacy of the ALDRESS algorithm in accurately attributing causality to drugs implicated in DRESS syndrome. However, further validation studies involving larger, diverse cohorts are warranted to consolidate its clinical utility and broaden its applicability. This study lays the groundwork for a refined causality assessment tool, promising advancements in the diagnosis and management of DRESS syndrome.
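The performance figures quoted above come from a standard 2x2 contingency analysis; a small sketch of those computations (the counts below are illustrative, not the study's data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 contingency metrics used to compare causality algorithms
    such as ALDRESS vs SEFV against a reference classification."""
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on how many truly causative drugs are in the evaluated set, which is why the two algorithms can share similar NPVs while differing sharply in PPV.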

4.
Sci Rep ; 14(1): 10809, 2024 May 11.
Article in English | MEDLINE | ID: mdl-38734734

ABSTRACT

Given the current environmental situation and concerns for human health, green manufacturing systems are essential in the manufacturing world. Several researchers have developed various types of green manufacturing models considering green products, green investments, carbon emission taxes, etc. Motivated by this topic, a green production model is formulated in which demand depends on selling price, time, warranty period and green level, under a carbon emission tax policy. The production rate of the system is an unknown function of time, the per-unit production cost is an increasing function of the production rate and the green level of the products, and the carbon emission rate is a linear function of time. An optimization problem for the production model is then constructed. To validate the proposed model, a numerical example is considered and solved by AHA. Five further metaheuristic algorithms (AEFA, FA, GWOA, WOA and EOA) are used to benchmark the results obtained from AHA. The concavity of the average profit function and the convergence graphs of the different metaheuristic algorithms are presented. Finally, a sensitivity analysis is carried out to investigate the impact of different system parameters on the optimal policy and draw conclusions from the study.
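AHA, AEFA and the other metaheuristics are beyond a short sketch, but the shape of the optimisation they perform can be illustrated with a plain random search over a toy concave profit function of selling price and green level. The profit function and bounds here are hypothetical, not the paper's model.

```python
import random

def random_search(profit, bounds, iters=5000, seed=1):
    """Baseline metaheuristic: sample candidate decision vectors uniformly
    within bounds and keep the best one found."""
    rng = random.Random(seed)
    best_x, best_f = None, float("-inf")
    for _ in range(iters):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        f = profit(x)
        if f > best_f:
            best_x, best_f = x, f
    return best_x, best_f

# Toy concave average profit in selling price p = x[0] and green level g = x[1]
profit = lambda x: -(x[0] - 30) ** 2 - 2 * (x[1] - 0.6) ** 2 + 100
```

Because the average profit function is concave (as the paper verifies graphically), any reasonable stochastic search converges to the same neighbourhood; the comparison across six algorithms is essentially about speed and robustness of that convergence.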

5.
Cureus ; 16(5): e60134, 2024 May.
Article in English | MEDLINE | ID: mdl-38736767

ABSTRACT

BACKGROUND: Large gatherings often involve extended and intimate contact among individuals, creating environments conducive to the spread of infectious diseases. Despite this, there is limited research utilizing outbreak detection algorithms to analyze real syndrome data from such events. This study sought to address this gap by examining the implementation and efficacy of outbreak detection algorithms for syndromic surveillance during mass gatherings in Iraq. METHODS: For the study, 10 data collectors conducted field data collection over 10 days from August 25, 2023, to September 3, 2023. Data were gathered from 10 healthcare clinics situated along Ya Hussein Road, a major route from Najaf to Karbala in Iraq. Various outbreak detection algorithms, such as moving average, cumulative sum, and exponentially weighted moving average, were applied to analyze the reported syndromes. RESULTS: During the 10 days from August 25, 2023, to September 3, 2023, 12,202 pilgrims visited 10 health clinics along a route in Iraq. Most pilgrims were between 20 and 59 years old (77.4%, n=9444), with more than half being foreigners (58.1%, n=7092). Among the pilgrims, 40.5% (n=4938) exhibited syndromes, with influenza-like illness (ILI) being the most common (48.8%, n=2411). Other prevalent syndromes included food poisoning (21.2%, n=1048), heatstroke (17.7%, n=875), febrile rash (9.0%, n=446), and gastroenteritis (3.2%, n=158). The cumulative sum (CUSUM) algorithm was more effective than exponentially weighted moving average (EWMA) and moving average (MA) algorithms for detecting small shifts. CONCLUSION: Effective public health surveillance systems are crucial during mass gatherings to swiftly identify and address emerging health risks. Utilizing advanced algorithms and real-time data analysis can empower authorities to improve their readiness and response capacity, thereby ensuring the protection of public health during these gatherings.
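Of the algorithms compared, CUSUM is the simplest to sketch: it accumulates positive deviations of the daily syndrome count above a baseline and alarms when the sum crosses a threshold, which is what makes it sensitive to small sustained shifts. The reference value k and decision threshold h below are illustrative.

```python
def cusum_alarms(daily_counts, baseline_mean, k=0.5, h=4.0):
    """One-sided CUSUM over daily syndrome counts: accumulate excess above
    baseline_mean + k, alarm (and reset) when the sum exceeds h."""
    s, alarms = 0.0, []
    for day, count in enumerate(daily_counts):
        s = max(0.0, s + (count - baseline_mean) - k)
        if s > h:
            alarms.append(day)
            s = 0.0
    return alarms
```

A moving average would need the same excess to appear within its window to trigger, whereas CUSUM integrates evidence indefinitely, which is consistent with the study's finding that it detects small shifts best.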

6.
Food Chem X ; 22: 101412, 2024 Jun 30.
Article in English | MEDLINE | ID: mdl-38707779

ABSTRACT

Identifying the geographic origin of a wine is of great importance, as origin fakery is commonplace in the wine industry. This study analyzed the mineral elements, volatile components, and metabolites in wine using inductively coupled plasma-mass spectrometry, headspace solid phase microextraction gas chromatography-mass spectrometry, and ultra-high-performance liquid chromatography-quadrupole-Exactive Orbitrap mass spectrometry. The most critical variables (5 mineral elements, 13 volatile components, and 51 metabolites) for wine origin classification were selected via principal component analysis and orthogonal partial least squares discriminant analysis. Subsequently, three algorithms (K-nearest neighbors, support vector machine, and random forest) were used to model single and fused datasets for origin identification. These results indicated that fused datasets, based on feature variables (mineral elements, volatile components, and metabolites), achieved the best performance, with predictive rates of 100% for all three algorithms. This study demonstrates the effectiveness of a multi-source data fusion strategy for authenticity identification of Chinese wine.
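As an illustration of the classification step, here is a minimal k-nearest-neighbours vote over fused feature vectors (element, volatile, and metabolite features concatenated into one vector). The tiny training set below is invented; the study used cross-validated KNN, SVM, and random forest on real spectra.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify a fused feature vector by majority vote of its k nearest
    training samples (Euclidean distance).
    train: list of (feature_vector, origin_label) pairs."""
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```

Data fusion here simply means concatenating the per-instrument feature blocks before computing distances, so each modality contributes to the same neighbourhood structure.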

7.
J Pak Med Assoc ; 74(4 (Supple-4)): S5-S9, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38712403

ABSTRACT

OBJECTIVE: To segment dental implants on PA radiographs using a Deep Learning (DL) algorithm. To compare the performance of the algorithm relative to ground truth determined by the human annotator. METHODOLOGY: Three hundred PA radiographs were retrieved from the radiographic database and consequently annotated to label implants as well as teeth on the LabelMe annotation software. The dataset was augmented to increase the number of images in the training data and a total of 1294 images were used to train, validate and test the DL algorithm. An untrained U-net was downloaded and trained on the annotated dataset to allow detection of implants using polygons on PA radiographs. RESULTS: A total of one hundred and thirty unseen images were run through the trained U-net to determine its ability to segment implants on PA radiographs. The performance metrics are as follows: accuracy of 93.8%, precision of 90%, recall of 83%, F-1 score of 86%, Intersection over Union of 86.4% and a loss of 21%. CONCLUSIONS: The trained DL algorithm segmented implants on PA radiographs with high performance similar to that of the humans who labelled the images forming the ground truth.
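The reported metrics all derive from pixel-wise true/false positives between the predicted and annotated masks; a sketch for binary masks, flattened to 1-D lists for brevity:

```python
def mask_metrics(pred, truth):
    """Pixel-wise precision, recall, F1 and Intersection over Union for
    binary segmentation masks (1 = implant pixel, 0 = background)."""
    tp = sum(p and t for p, t in zip(pred, truth))
    fp = sum(p and not t for p, t in zip(pred, truth))
    fn = sum(not p and t for p, t in zip(pred, truth))
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    iou = tp / (tp + fp + fn)
    return precision, recall, f1, iou
```

IoU penalises both kinds of error in a single ratio, which is why it is the usual headline metric for segmentation even when precision and recall are also reported.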


Subjects
Deep Learning, Dental Implants, Humans, Algorithms, Artificial Intelligence, Radiography, Dental/methods
9.
Cureus ; 16(4): e57728, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38711724

ABSTRACT

Clinical Decision Support Systems (CDSS) are essential tools in contemporary healthcare, enhancing clinicians' decisions and patient outcomes. The integration of artificial intelligence (AI) is now revolutionizing CDSS even further. This review delves into AI technologies transforming CDSS, their applications in healthcare decision-making, associated challenges, and the potential trajectory toward fully realizing AI-CDSS's potential. The review begins by laying the groundwork with a definition of CDSS and its function within the healthcare field. It then highlights the increasingly significant role that AI is playing in enhancing CDSS effectiveness and efficiency, underlining its evolving prominence in shaping healthcare practices. It examines the integration of AI technologies into CDSS, including machine learning algorithms like neural networks and decision trees, natural language processing, and deep learning. It also addresses the challenges associated with AI integration, such as interpretability and bias. We then shift to AI applications within CDSS, with real-life examples of AI-driven diagnostics, personalized treatment recommendations, risk prediction, early intervention, and AI-assisted clinical documentation. The review emphasizes user-centered design in AI-CDSS integration, addressing usability, trust, workflow, and ethical and legal considerations. It acknowledges prevailing obstacles and suggests strategies for successful AI-CDSS adoption, highlighting the need for workflow alignment and interdisciplinary collaboration. The review concludes by summarizing key findings, underscoring AI's transformative potential in CDSS, and advocating for continued research and innovation. It emphasizes the need for collaborative efforts to realize a future where AI-powered CDSS optimizes healthcare delivery and improves patient outcomes.

10.
Front Physiol ; 15: 1329313, 2024.
Article in English | MEDLINE | ID: mdl-38711954

ABSTRACT

Introduction: The availability of proactive techniques for health monitoring is essential to reducing fetal mortality and avoiding complications in fetal wellbeing. In harsh circumstances such as pandemics, earthquakes, and low-resource settings, the incompetence of many healthcare systems worldwide in providing essential services, especially for pregnant women, is critical. Being able to continuously monitor the fetus in hospitals and homes in a direct and fast manner is very important in such conditions. Methods: Monitoring the health of the baby can potentially be accomplished through the computation of vital bio-signal measures using a clear fetal electrocardiogram (ECG) signal. The aim of this study is to develop a framework to detect and identify the R-peaks of the fetal ECG directly from a 12 channel abdominal composite signal. Thus, signals were recorded noninvasively from 70 pregnant (healthy and with health conditions) women with no records of fetal abnormalities. The proposed model employs a recurrent neural network architecture to robustly detect the fetal ECG R-peaks. Results: To test the proposed framework, we performed both subject-dependent (5-fold cross-validation) and independent (leave-one-subject-out) tests. The proposed framework achieved average accuracy values of 94.2% and 88.8%, respectively. More specifically, the leave-one-subject-out test accuracy was 86.7% during the challenging period of vernix caseosa layer formation. Furthermore, we computed the fetal heart rate from the detected R-peaks, and the demonstrated results highlight the robustness of the proposed framework. Discussion: This work has the potential to cater to the critical industry of maternal and fetal healthcare as well as advance related applications.

11.
Sci Rep ; 14(1): 10515, 2024 May 07.
Article in English | MEDLINE | ID: mdl-38714848

ABSTRACT

Reliable and comprehensive predictive tools for the frictional pressure drop (FPD) are of particular importance for systems involving two-phase flow condensation. However, the available models are only applicable to specific operating conditions and channel sizes. Thus, this study aims at developing universal models to estimate the FPD during condensation inside smooth mini/micro and conventional (macro) channels. An extensive databank, comprising 8037 experimental samples and 23 working fluids from 50 reliable sources, was prepared to achieve this target. A comprehensive investigation of the literature models showed that all of them are associated with high deviations, with average absolute relative errors (AAREs) exceeding 26%. Hence, after identifying the most effective input variables through Spearman's correlation analysis, three soft-computing paradigms, i.e., multilayer perceptron (MLP), Gaussian process regression (GPR) and radial basis function (RBF), were employed to establish intelligent and dimensionless predictive tools for the FPD based on the separated model suggested by Lockhart and Martinelli. Among them, the most accurate results were presented by the GPR approach, with AARE and R² values of 4.10% and 99.23%, respectively, in the testing step. The truthfulness and applicability of the models were explored through an array of statistical and visual analyses, and the results affirmed the clear superiority of the newly proposed approaches over the literature correlations. Furthermore, the novel predictive tools excellently described the physical variations of the condensation FPD versus the operating parameters. Ultimately, the order of importance of factors in controlling the condensation FPD was clarified by a sensitivity analysis.
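For context, the separated-flow baseline that these dimensionless models build on is the Lockhart-Martinelli formulation with Chisholm's closure; a sketch (C = 20 is Chisholm's constant for turbulent-turbulent flow, and the single-phase pressure-gradient inputs are illustrative):

```python
def martinelli_parameter(dp_liquid, dp_vapor):
    """X: square root of the ratio of the single-phase liquid to vapor
    frictional pressure gradients (same units for both inputs)."""
    return (dp_liquid / dp_vapor) ** 0.5

def two_phase_multiplier(X, C=20.0):
    """Chisholm's liquid-phase two-phase multiplier: phi_l^2 = 1 + C/X + 1/X^2."""
    return 1.0 + C / X + 1.0 / X ** 2

def frictional_pressure_drop(dp_liquid, dp_vapor, C=20.0):
    """Two-phase FPD as the single-phase liquid drop scaled by phi_l^2."""
    X = martinelli_parameter(dp_liquid, dp_vapor)
    return two_phase_multiplier(X, C) * dp_liquid
```

The paper's contribution is effectively replacing the fixed Chisholm constant with data-driven mappings (MLP/GPR/RBF) from dimensionless inputs, so the classical structure above is the right mental model for what the learned correlations predict.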

12.
Interdiscip Sci ; 2024 May 11.
Article in English | MEDLINE | ID: mdl-38733473

ABSTRACT

Cancer remains a severe illness, and current research indicates that tumor homing peptides (THPs) play an important part in cancer therapy. The identification of THPs can provide crucial insights for the drug-discovery and pharmaceutical industries, as THPs allow for tailored medication delivery towards cancer cells. These peptides have a high affinity for particular receptors present on tumor surfaces, allowing for the creation of precision medications that reduce off-target consequences and enhance treatment results for cancer patients. Wet-lab techniques are considered essential tools for studying THPs; however, they are labor-intensive and time-consuming, which makes prediction of THPs a challenging task for researchers. Computational techniques, on the other hand, are significant tools for identifying THPs from sequence data. Although many strategies have been presented to predict new THPs, there is still a need for a robust method with higher rates of success. In this paper, we developed a novel framework, THP-DF, for accurately identifying THPs on a large scale. Firstly, the peptide sequences are encoded through various sequential features. Secondly, each feature is passed to BiLSTM and attention layers to extract simplified deep features. Finally, an ensemble framework is formed by integrating the sequential and deep features, which are fed to a support vector machine, and 10-fold cross-validation is carried out to validate the efficiency. The experimental results showed that THP-DF performed better on both [Formula: see text] and [Formula: see text] datasets, achieving accuracies of > 95%, which are higher than those of existing predictors on both datasets. This indicates that the proposed predictor could be a beneficial tool to precisely and rapidly identify THPs and will contribute to cutting-edge cancer treatment strategies and pharmaceuticals.
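As an example of the "sequential features" such predictors start from, the simplest encoding is amino-acid composition: the fraction of each of the 20 standard residues in the peptide. Whether THP-DF uses exactly this encoding is an assumption; it is shown here only to illustrate the feature-extraction step.

```python
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard residues

def aa_composition(sequence):
    """Encode a peptide as a 20-dimensional vector of residue fractions,
    a typical sequence-derived feature for peptide classifiers."""
    n = len(sequence)
    return [sequence.count(a) / n for a in AMINO_ACIDS]
```

Fixed-length vectors like this can be concatenated with the learned BiLSTM features to form the fused representation fed to the SVM.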

13.
Article in English | MEDLINE | ID: mdl-38703167

ABSTRACT

BACKGROUND: Assessment of the origin of ventricular tachycardias (VTs) arising from epicardial vs endocardial sites is largely challenged by the available criteria and the etiology of cardiomyopathy. Current electrocardiographic (ECG) criteria based on the 12-lead ECG have varying sensitivity and specificity depending on the site of origin and the etiology of cardiomyopathy. OBJECTIVES: This study sought to test the hypothesis that epicardial VT has a slower initial rate of depolarization than endocardial VT. METHODS: We developed a method that takes advantage of the fact that electrical conduction is faster through the cardiac conduction system than through the myocardium, and that the conduction system is primarily an endocardial structure. The technique calculated the rate of change in the initial VT depolarization from a signal-averaged 12-lead ECG. We hypothesized that the rate of change of depolarization in endocardial VT would be faster than in epicardial VT. We assessed this by applying the technique in 26 patients with VT and nonischemic cardiomyopathy. RESULTS: When comparing patients with VTs ablated using epicardial and endocardial approaches, the rate of change of depolarization was found to be significantly slower in epicardial VT (mean ± SD 6.3 ± 3.1 mV/s vs 11.4 ± 3.7 mV/s; P < 0.05). Statistical significance was found when averaging all 12 ECG leads and the limb leads, but not the precordial leads. Follow-up analysis by calculation of a receiver-operating characteristic curve demonstrated that this analysis provides a strong prediction of whether a VT is epicardial in origin (AUC range 0.72-0.88). A slower rate of change of depolarization had high sensitivity and specificity for the prediction of epicardial VT. CONCLUSIONS: This study demonstrates that depolarization rate analysis is a potential technique to predict whether a VT is epicardial in nature.
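The "rate of change of initial depolarization" can be estimated as a least-squares slope over the first samples of the signal-averaged QRS onset; a sketch (the window length and the exact onset definition are illustrative, not the paper's protocol):

```python
def initial_slope(samples, fs):
    """Least-squares slope (mV/s) of the voltage over the onset window.
    samples: voltages in mV at the start of depolarization; fs: sampling rate in Hz."""
    n = len(samples)
    t = [i / fs for i in range(n)]
    t_mean = sum(t) / n
    y_mean = sum(samples) / n
    num = sum((ti - t_mean) * (yi - y_mean) for ti, yi in zip(t, samples))
    den = sum((ti - t_mean) ** 2 for ti in t)
    return num / den
```

Averaging this slope across leads, as the study does, smooths lead-specific projection effects; the reported result is that a lower average slope points toward an epicardial exit.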

14.
Mark Theory ; 24(2): 211-232, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38774190

ABSTRACT

While critical marketing studies have discussed algorithm-driven marketing's role in governmentality, subjectivity formation and capitalist accumulation, its role in shaping class inequalities is less studied. Drawing on the performativity of marketing, 'classification situations' and critical algorithm studies, this paper uses the case of credit marketing to propose a twofold framework to analyse how algorithmic marketing shapes the cultural and economic inequalities of class. First, algorithms used for categorizing consumers and matching them with marketing messages and products provide access (1) to different symbolic resources and (2) to credit products with different financial consequences to different consumers depending on their categorization, which contribute to the creation of cultural and economic inequalities, respectively. Second, algorithms of financial advice devices overtake parts of consumer choice. Insofar as different financial preferences and rationalities are scripted into the devices for different client groups, these technologies constitute an additional process that affects social divisions.

15.
MethodsX ; 12: 102747, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38774685

ABSTRACT

The Internet of Things (IoT) has radically reformed various sectors and industries, enabling unprecedented levels of connectivity and automation. However, the surge in the number of IoT devices has also widened the attack surface, rendering IoT networks potentially susceptible to a plethora of security risks. Addressing the critical challenge of enhancing security in IoT networks is of utmost importance. Moreover, there is a considerable lack of datasets designed exclusively for IoT applications. To bridge this gap, a customized dataset that accurately mimics real-world IoT scenarios impacted by four different types of attacks (blackhole, sinkhole, flooding, and version number attacks) was generated using the Contiki-OS Cooja Simulator in this study. The resulting dataset is then employed to evaluate the efficacy of several metaheuristic algorithms, in conjunction with a Convolutional Neural Network (CNN), for IoT networks.
• The study's goal is to identify optimal hyperparameters for CNNs, ensuring their peak performance in intrusion detection tasks.
• The study not only deepens our understanding of IoT network security but also provides practical guidance for implementing robust security measures in real-world IoT applications.

16.
JMIR Form Res ; 8: e51013, 2024 May 22.
Article in English | MEDLINE | ID: mdl-38776539

ABSTRACT

BACKGROUND: Patient adherence to medications can be assessed using interactive digital health technologies such as electronic monitors (EMs). Changes in treatment regimens and deviations from EM use over time must be characterized to establish the actual level of medication adherence. OBJECTIVE: We developed the computer script CleanADHdata.R to clean raw EM adherence data, and this tutorial is a guide for users. METHODS: In addition to raw EM data, we collected adherence start and stop monitoring dates and identified the prescribed regimens, the expected number of EM openings per day based on the prescribed regimen, EM use deviations, and patients' demographic data. The script formats the data longitudinally and calculates each day's medication implementation. RESULTS: We provided a simulated data set for 10 patients, for which 15 EMs were used over a median period of 187 (IQR 135-342) days. The median patient implementation before and after EM raw data cleaning was 83.3% (IQR 71.5%-93.9%) and 97.3% (IQR 95.8%-97.6%), respectively (Δ+14%). This difference is substantial enough to consider EM data cleaning to be capable of avoiding data misinterpretation and providing a cleaned data set for the adherence analysis in terms of implementation and persistence. CONCLUSIONS: The CleanADHdata.R script is a semiautomated procedure that increases standardization and reproducibility. This script has broader applicability within the realm of digital health, as it can be used to clean adherence data collected with diverse digital technologies.
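The implementation metric can be sketched as the observed electronic-monitor openings capped at the regimen's expected count, averaged over monitored days. This is a simplified Python rendering of the idea; the actual logic in CleanADHdata.R (regimen changes, EM-use deviations, longitudinal formatting) is richer.

```python
def daily_implementation(openings, expected):
    """Per-day implementation (%): observed EM openings capped at the
    prescribed regimen's expected openings for that day."""
    return [min(o, e) / e * 100 for o, e in zip(openings, expected)]

def overall_implementation(openings, expected):
    """Mean daily implementation over the monitored period."""
    days = daily_implementation(openings, expected)
    return sum(days) / len(days)
```

Cleaning matters precisely because uncorrected days (e.g., monitoring not yet started, or the EM set aside during travel) enter the denominator as apparent non-adherence; the tutorial's +14-point shift after cleaning illustrates how large that bias can be.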

17.
J Comput Biol ; 2024 May 23.
Article in English | MEDLINE | ID: mdl-38781420

ABSTRACT

The thresholding problem is studied in the context of graph theoretical analysis of gene co-expression data. A number of thresholding methodologies are described, implemented, and tested over a large collection of graphs derived from real high-throughput biological data. Comparative results are presented and discussed.
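The core operation being studied can be sketched as: given a gene-gene correlation matrix, keep an edge wherever the absolute correlation meets the chosen threshold.

```python
def threshold_graph(corr, t):
    """Build an edge list from a symmetric gene-gene correlation matrix:
    connect genes i and j when |corr[i][j]| >= t."""
    n = len(corr)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if abs(corr[i][j]) >= t]
```

Every thresholding methodology in the paper is ultimately a rule for choosing t (e.g., from spectral properties, density, or clustering structure of the resulting graph), after which this construction is applied.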

18.
Emerg Infect Dis ; 30(6): 1096-1103, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38781684

ABSTRACT

Viral respiratory illness surveillance has traditionally focused on single pathogens (e.g., influenza) and required fever to identify influenza-like illness (ILI). We developed an automated system applying both laboratory test and syndrome criteria to electronic health records from 3 practice groups in Massachusetts, USA, to monitor trends in respiratory viral-like illness (RAVIOLI) across multiple pathogens. We identified RAVIOLI syndrome using diagnosis codes associated with respiratory viral testing or positive respiratory viral assays or fever. After retrospectively applying RAVIOLI criteria to electronic health records, we observed annual winter peaks during 2015-2019, predominantly caused by influenza, followed by cyclic peaks corresponding to SARS-CoV-2 surges during 2020-2024, spikes in RSV in mid-2021 and late 2022, and recrudescent influenza in late 2022 and 2023. RAVIOLI rates were higher and fluctuations more pronounced compared with traditional ILI surveillance. RAVIOLI broadens the scope, granularity, sensitivity, and specificity of respiratory viral illness surveillance compared with traditional ILI surveillance.
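A sketch of the case definition as a boolean rule, based on my reading of the stated criteria (the operational definition inside the surveillance system may differ):

```python
def is_ravioli_case(has_resp_dx_code, viral_test_ordered,
                    viral_test_positive, has_fever):
    """RAVIOLI syndrome: a respiratory diagnosis code combined with
    associated viral testing, a positive respiratory viral assay, or fever."""
    return has_resp_dx_code and (viral_test_ordered
                                 or viral_test_positive
                                 or has_fever)
```

The key contrast with traditional ILI is that fever is one qualifying signal among several rather than a mandatory criterion, which is what broadens sensitivity across pathogens.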


Subjects
Algorithms, Electronic Health Records, Respiratory Tract Infections, Humans, Respiratory Tract Infections/virology, Respiratory Tract Infections/epidemiology, Respiratory Tract Infections/diagnosis, Retrospective Studies, Influenza, Human/epidemiology, Influenza, Human/diagnosis, Influenza, Human/virology, COVID-19/epidemiology, COVID-19/diagnosis, Population Surveillance/methods, Massachusetts/epidemiology, Adult, Middle Aged, SARS-CoV-2, Male, Adolescent, Child, Aged, Female, Seasons, Virus Diseases/epidemiology, Virus Diseases/diagnosis, Virus Diseases/virology, Child, Preschool, Young Adult
19.
Food Chem ; 453: 139633, 2024 May 11.
Article in English | MEDLINE | ID: mdl-38781896

ABSTRACT

Smilax glabra Roxb. (SGR) is known for its high nutritional and therapeutic value. However, the frequent appearance of counterfeit products causes confusion and inconsistent quality among SGR varieties. In this study, samples covering a range of SGR adulteration proportions were collected, and high-performance liquid chromatography (HPLC) was used to measure the astilbin content of SGR. Fourier-transform near-infrared (FT-NIR) technology, combined with multivariate intelligent algorithms, was then used to establish partial least squares regression quantitative models for detecting SGR adulteration and measuring astilbin content, respectively. The method performs a quantitative analysis of dual indicators through single-spectrum data acquisition (QADS) to comprehensively evaluate the authenticity and superiority of SGR. The coefficients of determination (R²) for both the calibration and prediction sets exceeded 0.96, showing that FT-NIR combined with multivariate intelligent algorithms considerably enhances the accuracy and reliability of the quantitative models. Overall, this research holds substantial value for the comprehensive quality evaluation of functional health foods.

20.
Heliyon ; 10(10): e31152, 2024 May 30.
Article in English | MEDLINE | ID: mdl-38784542

ABSTRACT

Image segmentation is a computer vision technique that involves dividing an image into distinct and meaningful regions or segments. The objective was to partition the image into areas that share similar visual characteristics. Noise and undesirable artifacts introduce inconsistencies and irregularities in image data. These inconsistencies severely affect the ability of most segmentation algorithms to distinguish between true image features, leading to less reliable and lower-quality results. Cellular Automata (CA) is a computational concept that consists of a grid of cells, each of which can be in a finite number of states. These cells evolve over discrete time steps based on a set of predefined rules that dictate how a cell's state changes according to its own state and the states of its neighboring cells. In this paper, a new segmentation approach based on the CA model was introduced. The proposed approach consisted of three phases. In the initial two phases of the process, the primary objective was to eliminate noise and undesirable artifacts that can interfere with the identification of regions exhibiting similar visual characteristics. To achieve this, a set of rules is designed to modify the state value of each cell or pixel based on the states of its neighboring elements. In the third phase, each element is assigned a state that is chosen from a set of predefined states. These states directly represent the final segmentation values for the corresponding elements. The proposed method was evaluated using different images, considering important quality indices. The experimental results indicated that the proposed approach produces better-segmented images in terms of quality and robustness.
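The noise-removal phases can be illustrated with a classic majority-vote CA rule on a grid of pixel states: each cell synchronously adopts the most common state in its 3x3 neighbourhood, which smooths isolated noisy pixels. The paper's actual rule sets are more elaborate; this is a minimal sketch.

```python
def ca_step(grid):
    """One synchronous cellular-automaton update: each cell takes the
    majority state of its 3x3 neighbourhood (borders use the cells that exist)."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            votes = {}
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        state = grid[rr][cc]
                        votes[state] = votes.get(state, 0) + 1
            out[r][c] = max(votes, key=votes.get)
    return out
```

Iterating such a step before assigning final segment labels is what lets the approach suppress artifacts without a separate pre-filtering algorithm.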
