ABSTRACT
Sheep were among the first animals domesticated by humans, and to this day, small ruminants are raised primarily for their meat, milk, and wool. This study evaluated the goodness of fit of growth curve models using observed age and weight data from crossbred lambs of various breeds, based on the mean values between paired breeds. We employed SAGAC, a hybrid metaheuristic that combines a simulated annealing (SA) algorithm with a genetic algorithm (GA), to determine the optimal parameter values for the growth models, ensuring the best alignment between simulated and observed curves. Goodness of fit and model accuracy were assessed using the coefficient of determination (R2), Mean Absolute Error (MAE), Root Mean Square Error (RMSE), and Mean Absolute Percentage Error (MAPE). Errors were measured by comparing these criteria between simulated and observed data. Thirty crossbred combinations were simulated, considering the average weight. Analysis of the observed and simulated growth curves indicated that specific crossbreeding scenarios produced promising results. This simulation approach is expected to assist geneticists in predicting potential crossbreeding outcomes, thereby saving time and financial resources in field research.
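The abstract does not specify SAGAC's operators, so the following is only a minimal sketch of the general idea: a genetic algorithm whose replacement step uses a simulated-annealing acceptance rule, fitting a Gompertz growth curve to synthetic lamb weights. The data, bounds, and hyperparameters are all illustrative, not the authors' values.

```python
import numpy as np

rng = np.random.default_rng(0)

def gompertz(t, A, b, k):
    # Gompertz growth curve: mature weight A, shape b, maturation rate k
    return A * np.exp(-b * np.exp(-k * t))

# Synthetic "observed" lamb weights (age in days); values are illustrative
t_obs = np.arange(0, 211, 30)
w_obs = gompertz(t_obs, 40.0, 3.0, 0.02) + rng.normal(0, 0.3, t_obs.size)

def rmse(p):
    return np.sqrt(np.mean((gompertz(t_obs, *p) - w_obs) ** 2))

lo = np.array([10.0, 0.5, 0.001])
hi = np.array([80.0, 6.0, 0.1])
pop = rng.uniform(lo, hi, (40, 3))          # GA population of parameter vectors
best, best_e = None, np.inf
T = 1.0                                     # SA temperature
for gen in range(300):
    fit = np.array([rmse(p) for p in pop])
    if fit.min() < best_e:
        best_e, best = fit.min(), pop[fit.argmin()].copy()
    parents = pop[np.argsort(fit)[:20]]     # GA selection: keep the fitter half
    i, j = rng.integers(0, 20, (2, 40))
    alpha = rng.random((40, 1))
    children = alpha * parents[i] + (1 - alpha) * parents[j]     # blend crossover
    children += rng.normal(0, 0.02, children.shape) * (hi - lo)  # mutation
    children = np.clip(children, lo, hi)
    # SA-style replacement: a worse child may still enter with prob e^(-dE/T)
    for c in children:
        idx = rng.integers(0, 40)
        dE = rmse(c) - rmse(pop[idx])
        if dE < 0 or rng.random() < np.exp(-dE / T):
            pop[idx] = c
    T *= 0.99                               # geometric cooling schedule

print("fitted (A, b, k):", np.round(best, 4), "RMSE:", round(best_e, 3))
```

The SA acceptance rule lets the population escape local optima early on, while the cooling schedule makes replacement increasingly greedy, which is the usual motivation for SA-GA hybrids.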
Subjects
Algorithms , Animals , Sheep, Domestic/growth & development , Sheep, Domestic/physiology , Breeding , Body Weight , Models, Biological , Male , Animal Husbandry/methods , Female , Sheep/growth & development , Computer Simulation
ABSTRACT
The aim of this study was to improve the diagnostic ability of fall risk classifiers using a Bayesian approach and the Simulated Annealing (SA) algorithm. A total of 47 features from 181 records (40 Center of Pressure (CoP) indices and 7 patient descriptive variables) were analyzed. A wrapper method of feature selection based on the SA algorithm was applied to optimize a cost function defined as the mean minus the standard deviation of the Area Under the Curve (AUC) of the fall risk classifiers across multiple dimensions. A stratified 60-20-20% hold-out split was used for the training, test, and validation sets, respectively. The results showed that although the highest performance was observed with 31 features (0.815 ± 0.110), lower variability and higher explainability were achieved with only 15 features (0.780 ± 0.055). These findings suggest that the SA algorithm is a valuable tool for feature selection toward acceptable fall risk diagnosis, offering an alternative or complementary resource in situations where clinical tools are difficult to apply.
ABSTRACT
INTRODUCTION AND OBJECTIVES: TIPS placement is an effective, possibly life-saving, treatment for complications of portal hypertension. The pressure shift induced by the stent can lead to cardiac decompensation (CD). We investigated the incidence of CD, possible variables associated with CD, and the validity of the Toulouse algorithm for risk prediction of CD post-TIPS. PATIENTS AND METHODS: A total of 106 patients receiving TIPS for variceal bleeding (VB, 41.5%) or refractory ascites (RA, 58.5%) with available echocardiography and NT-proBNP results were included and retrospectively reviewed. Development of CD between the time of TIPS placement and the occurrence of liver transplantation, death, or loss to follow-up was recorded. Competing risk regression analysis was performed to assess which baseline variables predicted occurrence of CD post-TIPS. RESULTS: A total of 12 patients (11.3%) developed CD after a median of 11.5 days (IQR 4 to 56.5) post-TIPS. Multivariate regression showed that age (HR 1.06, p = 0.019), albumin (HR 1.10, p = 0.009), and NT-proBNP (HR 1.00, p = 0.023) at baseline predicted CD in the RA group. No clear predictors were found in those receiving TIPS for VB. Correspondingly, the Toulouse algorithm successfully identified patients at risk for CD, although only in the RA population (zero risk 0% vs. low risk 12.5% vs. high risk 35.3% with CD; p = 0.003). CONCLUSIONS: CD is not an infrequent complication post-TIPS, occurring in roughly 1 in 10 patients. The Toulouse algorithm can identify patients at risk of CD, though only in patients receiving TIPS for RA. Allocation to the high-risk category warrants close monitoring but should not preclude TIPS placement.
ABSTRACT
Envenomation by reptiles, particularly lizards, poses significant health risks and can lead to physiological and cardiovascular changes. The venom of Heloderma horridum horridum, endemic to Colima, Mexico, was tested on Wistar rats. Electrocardiographic (ECG) data were collected pre-treatment and at 5-min intervals for 1 h post-envenomation. A specially designed computational linear regression algorithm (LRA) was used for segmentation analysis of the ECG data to improve the detection of fiducial points (P, Q, R, S, and T) in the ECG waves. Additionally, heart tissue was analyzed for macroscopic and microscopic changes. The results revealed significant electrocardiographic alterations, including pacemaker migration, junctional extrasystoles, and intraventricular conduction aberrations. By applying the linear regression algorithm, the study compensated for noise and anomalies in the isoelectric line of the ECG signal, improving the detection of the P and T waves and the QRS complex with an efficiency of 97.5%. Cardiac enzyme evaluation indicated no statistically significant differences between the control and experimental groups. Macroscopic and microscopic examination revealed no apparent signs of damage or inflammatory responses in heart tissues. This study enhances our understanding of the cardiovascular impact of Heloderma venom, suggesting a greater influence on conduction changes and arrhythmias than on direct damage to the myocardium.
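The abstract does not give the LRA itself; one common way linear regression compensates an ECG isoelectric line is to fit and subtract a line per window before detecting fiducial points. The sketch below does that on a synthetic ECG-like trace (all signal parameters invented for illustration) and then detects R peaks by thresholding.

```python
import numpy as np

rng = np.random.default_rng(2)

fs = 360                                  # sampling rate (Hz), illustrative
t = np.arange(0, 10, 1 / fs)
# Synthetic ECG-like trace: one narrow R spike per second plus baseline drift
ecg = np.zeros(t.size)
for beat in np.arange(0.5, 10, 1.0):
    ecg += np.exp(-((t - beat) ** 2) / (2 * 0.01 ** 2))
drift = 0.4 * t / 10 + 0.1                # wandering isoelectric line
noisy = ecg + drift + rng.normal(0, 0.02, t.size)

# Windowed linear regression estimates the isoelectric line; subtracting the
# fitted line compensates baseline wander before fiducial-point detection.
corrected = np.empty_like(noisy)
for start in range(0, t.size, fs):
    seg = slice(start, min(start + fs, t.size))
    x = t[seg]
    A = np.vstack([x, np.ones_like(x)]).T
    slope, intercept = np.linalg.lstsq(A, noisy[seg], rcond=None)[0]
    corrected[seg] = noisy[seg] - (slope * x + intercept)

# R-peak detection: group above-threshold samples and keep one apex per group
thr = 0.5 * corrected.max()
idx = np.flatnonzero(corrected > thr)
runs = np.split(idx, np.flatnonzero(np.diff(idx) > 1) + 1)
peaks = [int(r[np.argmax(corrected[r])]) for r in runs]
print("R peaks detected:", len(peaks))
```

On this synthetic trace the detector recovers one R peak per simulated beat; the same detrend-then-threshold structure extends to P- and T-wave windows around each detected R.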
Subjects
Algorithms , Electrocardiography , Rats, Wistar , Animals , Rats , Linear Models , Heart/drug effects , Lizards , Male , Venoms/toxicity , Mexico , Animals, Poisonous
ABSTRACT
We introduce a new model for long-term survival, assuming that the number of competing causes follows a mixture of the Poisson and Birnbaum-Saunders distributions. In this context, we present some statistical properties of our model and demonstrate that the promotion time model emerges as a limiting case. We delve into detailed discussions of specific models within this class. Notably, we examine the expected number of competing causes, which depends on covariates; this allows direct modeling of the cure rate as a function of covariates. We present an Expectation-Maximization (EM) algorithm for maximum likelihood (ML) parameter estimation and provide insights into parameter inference for this model. Additionally, we outline sufficient conditions ensuring the consistency and asymptotic normality of the ML estimators. To evaluate the performance of our estimation method, we conduct a Monte Carlo simulation to assess the asymptotic properties and carry out a power study of the likelihood ratio (LR) test, contrasting our methodology against the promotion time model. To demonstrate the practical applicability of our model, we apply it to a real medical dataset from a population-based study of the incidence of breast cancer in São Paulo, Brazil. Our results illustrate that the proposed model can outperform traditional approaches in terms of model fitting, highlighting its potential utility in real-world scenarios.
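The paper's Poisson-Birnbaum-Saunders formulation is not reproduced here; as a minimal illustration of how EM handles a cure fraction, the sketch below fits the simplest two-component (Bernoulli) mixture cure model, with exponential survival for the uncured and administrative censoring, on simulated data. All distributions and parameter values are illustrative stand-ins.

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulate a standard mixture cure model: a fraction pi is cured (never fails),
# the rest fail with Exp(lam) times; follow-up is censored at tau.
n, pi_true, lam_true, tau = 2000, 0.4, 0.5, 8.0
cured = rng.random(n) < pi_true
t_latent = rng.exponential(1 / lam_true, n)
time = np.where(cured, tau, np.minimum(t_latent, tau))
event = (~cured) & (t_latent <= tau)          # True = failure observed

# EM over the latent cure indicator
pi, lam = 0.5, 1.0                            # starting values
for _ in range(200):
    # E-step: posterior cure probability (zero for subjects with an event)
    p_cure = np.where(event, 0.0,
                      pi / (pi + (1 - pi) * np.exp(-lam * time)))
    # M-step: closed-form updates of the cure fraction and the hazard rate
    pi = p_cure.mean()
    lam = event.sum() / ((1 - p_cure) * time).sum()

print(f"pi = {pi:.3f} (true {pi_true}), lambda = {lam:.3f} (true {lam_true})")
```

The E-step only has to reweight the censored subjects, which is why the M-step updates stay closed-form; in the promotion-time and mixture models discussed above, the same alternation applies with a different complete-data likelihood.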
Subjects
Biometry , Breast Neoplasms , Models, Statistical , Breast Neoplasms/epidemiology , Breast Neoplasms/therapy , Humans , Biometry/methods , Female , Monte Carlo Method , Likelihood Functions , Survival Analysis , Algorithms
ABSTRACT
Quantum computing is tipped to lead the future of global technological progress. However, the obstacles related to quantum software development remain a real challenge to overcome. In this scenario, this work presents an implementation of the quantum search algorithm in Atos Quantum Assembly Language (AQASM) using the quantum software stack my Quantum Learning Machine (myQLM) and the programming development platform Quantum Learning Machine (QLM). We present the creation of a virtual quantum processor whose configurable architecture allows the analysis of induced quantum noise effects on quantum algorithms. The codes are available throughout the manuscript so that readers can replicate them and apply the methods discussed in this article to their own quantum computing projects. The presented results are consistent with theoretical predictions and demonstrate that AQASM and QLM are powerful tools for building, implementing, and simulating quantum hardware.
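The AQASM/myQLM code itself lives in the cited manuscript; as a framework-free reference point, Grover's search can be checked with a plain NumPy statevector simulation. The marked item and qubit count below are arbitrary choices for illustration.

```python
import numpy as np

n = 3                      # qubits
N = 2 ** n                 # search-space size
marked = 5                 # index the oracle flags (arbitrary choice)

# Uniform superposition produced by Hadamards on |000>
state = np.full(N, 1 / np.sqrt(N))

oracle = np.eye(N)
oracle[marked, marked] = -1                  # phase flip on the marked item

s = np.full(N, 1 / np.sqrt(N))
diffusion = 2 * np.outer(s, s) - np.eye(N)   # inversion about the mean

iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))   # 2 iterations for N = 8
for _ in range(iterations):
    state = diffusion @ (oracle @ state)

probs = state ** 2
print("most probable item:", probs.argmax(), "p =", round(probs.max(), 3))
```

For N = 8 the optimal two iterations concentrate about 94.5% of the probability on the marked item, which is the theoretical sin²((2k+1)θ) value that a noiseless AQASM/QLM run should reproduce.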
ABSTRACT
Therapeutic cancer vaccines have been considered in recent decades as important immunotherapeutic strategies capable of leading to tumor regression. In the development of these vaccines, the identification of neoepitopes plays a critical role, and different computational methods have been proposed and employed to direct and accelerate this process. In this context, this review identified and systematically analyzed the most recent studies published in the literature on the computational prediction of epitopes for the development of therapeutic vaccines, outlining critical steps along with the strengths and limitations of the associated programs. A scoping review was conducted following the PRISMA extension for scoping reviews (PRISMA-ScR). Searches were performed in databases (Scopus, PubMed, Web of Science, Science Direct) using the keywords neoepitope, epitope, vaccine, prediction, algorithm, cancer, and tumor. Forty-nine articles published from 2012 to 2024 were synthesized and analyzed. Most of the identified studies focus on the prediction of epitopes with affinity for MHC class I molecules in solid tumors, such as lung carcinoma. Predicting epitopes with MHC class II affinity has been relatively underexplored. Besides neoepitope prediction from high-throughput sequencing data, additional steps were identified, such as the prioritization of neoepitopes and validation. Mutect2 is the most used tool for variant calling, while NetMHCpan is favored for neoepitope prediction. Artificial/convolutional neural networks are the preferred methods for neoepitope prediction, and for prioritizing immunogenic epitopes, the random forest algorithm is the most used for classification. The performance values reported for the computational models for prediction and prioritization of neoepitopes are high; however, a large part of the studies still use microbiome databases for training. In vitro/in vivo validation of the predicted neoepitopes was reported in 55% of the analyzed studies.
Clinical trials that led to successful tumor remission were identified, highlighting that this immunotherapeutic approach can benefit patients. By integrating high-throughput sequencing, sophisticated bioinformatics tools, and rigorous validation through in vitro/in vivo assays as well as clinical trials, the tumor neoepitope-based vaccine approach holds promise for developing personalized therapeutic vaccines that target specific tumors.
ABSTRACT
Candida auris and Candida haemulonii are two emerging opportunistic pathogens that have caused an increase in clinical cases worldwide in recent years. Differentiating some Candida species is highly laborious, difficult, costly, and time-consuming, depending on the similarity between the species. Thus, this study aimed to develop a new, faster, and less expensive methodology for differentiating between C. auris and C. haemulonii based on near-infrared (NIR) spectroscopy and multivariate analysis. C. auris CBS10913 and C. haemulonii CH02 were each grown on 15 plates per species, and three isolated colonies from each plate were selected for Fourier transform near-infrared (FT-NIR) analysis, totaling 90 spectra. Subsequently, principal component analysis (PCA) and variable selection algorithms, including the successive projections algorithm (SPA) and genetic algorithm (GA) coupled with linear discriminant analysis (LDA), were employed to discern distinctive patterns among the samples. The use of the PCA, SPA, and GA algorithms associated with LDA achieved 100% sensitivity and specificity for the discriminations. The SPA-LDA and GA-LDA algorithms were essential in selecting the variables (infrared wavelengths) of most importance for the models, which could be attributed to bonds in cell wall structures such as polysaccharides, peptides, and proteins, or to molecules resulting from the yeasts' metabolism. These results show the high potential of combined FT-NIR and multivariate analysis techniques for the classification of Candida-like fungi, which can contribute to faster and more effective diagnosis and treatment of patients affected by these microorganisms.
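The PCA-LDA half of the pipeline can be reproduced compactly. The sketch below runs PCA via SVD and a two-class Fisher LDA on synthetic FT-NIR-like spectra (the band positions, widths, and noise level are invented for illustration; the real study used measured colonies, and SPA/GA variable selection is not shown).

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic FT-NIR-like spectra: 45 colonies per "species", 200 channels.
# The two classes differ by a small band shifted between channels ~85 and ~95.
wn = np.arange(200)
base = np.exp(-((wn - 90) ** 2) / (2 * 15 ** 2))
X_a = base + 0.05 * np.exp(-((wn - 85) ** 2) / 50) + rng.normal(0, 0.01, (45, 200))
X_b = base + 0.05 * np.exp(-((wn - 95) ** 2) / 50) + rng.normal(0, 0.01, (45, 200))
X = np.vstack([X_a, X_b])
y = np.array([0] * 45 + [1] * 45)

# PCA via SVD on mean-centred spectra, keeping 5 principal components
Xc = X - X.mean(0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:5].T

# Two-class Fisher LDA in the PCA score space
m0, m1 = scores[y == 0].mean(0), scores[y == 1].mean(0)
Sw = np.cov(scores[y == 0].T) + np.cov(scores[y == 1].T)   # within-class scatter
w = np.linalg.solve(Sw, m1 - m0)                           # discriminant axis
proj = scores @ w
thr = (proj[y == 0].mean() + proj[y == 1].mean()) / 2
pred = (proj > thr).astype(int)
accuracy = (pred == y).mean()
print("training accuracy:", accuracy)
```

Projecting onto a handful of principal components before LDA is what keeps the within-class scatter matrix well-conditioned when there are far more wavelengths than spectra, the usual situation in NIR work.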
ABSTRACT
The first step in comprehending the properties of Au10 clusters is understanding the lowest energy structure at low and high temperatures. Functional materials operate at finite temperatures; however, energy computations employing density functional theory (DFT) methodology are typically carried out at zero temperature, leaving many properties unexplored. This study explored the potential and free energy surface of the neutral Au10 nanocluster at finite temperature, employing a genetic algorithm coupled with DFT and nanothermodynamics. Furthermore, we computed the thermal population and infrared Boltzmann spectrum at finite temperature and compared them with validated experimental data. Moreover, we performed chemical bonding analysis using the quantum theory of atoms in molecules (QTAIM) approach and the adaptive natural density partitioning method (AdNDP) to shed light on the bonding of Au atoms in the low-energy structures. In the calculations, we account for relativistic effects through the zero-order regular approximation (ZORA) and for dispersion through Grimme's dispersion with Becke-Johnson damping (D3BJ), and we employed nanothermodynamics to include temperature contributions. Small Au clusters prefer the planar shape, and the transition from 2D to 3D could take place at atomic clusters consisting of ten atoms, which could be affected by temperature, relativistic effects, and dispersion. We analyzed the energetic ordering of structures calculated using DFT with ZORA against single-point energies computed with the DLPNO-CCSD(T) methodology. Our findings indicate that the planar lowest energy structure computed with DFT is not the lowest energy structure at the DLPNO-CCSD(T) level of theory. The computed thermal population indicates that the 2D elongated hexagon configuration strongly dominates in the temperature range of 50-800 K.
Based on the thermal population, at a temperature of 100 K, the computed IR Boltzmann spectrum agrees with the experimental IR spectrum. The chemical bonding analysis of the lowest energy structure indicates that the cluster bonding is due only to electrons of the 6s orbital, and the Au d orbitals do not participate in the bonding of this system.
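Given a set of isomer energies, the thermal population itself is a one-line Boltzmann weight. The sketch below uses invented relative energies (plain electronic energies, ignoring the vibrational free-energy corrections the study includes via nanothermodynamics) to show how one isomer can dominate across a temperature range.

```python
import numpy as np

# Hypothetical relative isomer energies for a 10-atom cluster (eV); the
# 2D elongated hexagon is taken as the global minimum. Illustrative only.
isomers = ["2D elongated hexagon", "2D isomer B", "3D isomer A", "3D isomer B"]
dE = np.array([0.0, 0.10, 0.15, 0.25])       # eV above the minimum

kB = 8.617333262e-5                          # Boltzmann constant (eV/K)

def population(T):
    # Boltzmann weights p_i = exp(-dE_i / kB T) / Z
    w = np.exp(-dE / (kB * T))
    return w / w.sum()

for T in (100, 400, 800):
    p = population(T)
    print(f"T = {T:3d} K:", dict(zip(isomers, np.round(p, 3))))
```

With a 0.10 eV gap the ground-state isomer is essentially the whole ensemble at 100 K, and it still holds the plurality at 800 K; in the full treatment each dE would be a temperature-dependent free-energy difference.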
ABSTRACT
Diverse computational approaches have been widely used to assist in designing antimicrobial peptides with enhanced activities. This tactic has also been used to address the need for new treatment alternatives to combat resistant bacterial infections. Herein, we have designed eight variants of a natural peptide, pro-adrenomedullin N-terminal 20 peptide (PAMP), using an in silico pattern insertion approach, the Joker algorithm. All the variants show an α-helical conformation, but with differences in helix percentages according to circular dichroism (CD) results. We found that the C-terminal portion of PAMP may be relevant for its antimicrobial activities, as revealed by the molecular dynamics, CD, and antibacterial results. The analogs showed variable antibacterial potential, but most were not cytotoxic. Nevertheless, PAMP2 exhibited the most potent activities against human- and animal-isolated bacteria, showing cytotoxicity only at a concentration substantially higher than its minimal inhibitory concentration (MIC). Our results suggest that the enhanced activity profile of PAMP2 may be related to its particular physicochemical properties, along with the adoption of an amphipathic α-helical arrangement with a conserved C-terminal portion. Finally, the peptides designed in this study can serve as scaffolds for the design of improved sequences.
Subjects
Adrenomedullin , Circular Dichroism , Microbial Sensitivity Tests , Molecular Dynamics Simulation , Humans , Adrenomedullin/chemistry , Adrenomedullin/pharmacology , Amino Acid Sequence , Anti-Bacterial Agents/pharmacology , Anti-Bacterial Agents/chemistry , Anti-Bacterial Agents/chemical synthesis , Animals , Computer Simulation , Protein Precursors/chemistry , Protein Precursors/pharmacology , Protein Precursors/metabolism , Antimicrobial Peptides/chemistry , Antimicrobial Peptides/pharmacology , Protein Structure, Secondary
ABSTRACT
INTRODUCTION: The limited access to temporal fine structure (TFS) cues is a reason for reduced speech-in-noise recognition in cochlear implant (CI) users. CI signal processing schemes like electroacoustic stimulation (EAS) and fine structure processing (FSP) encode TFS in the low frequencies, whereas theoretical strategies such as the frequency amplitude modulation encoder (FAME) encode TFS in all bands. OBJECTIVE: The present study compared the effect of simulated CI signal processing schemes that encode no TFS, TFS in all bands, or TFS only in low-frequency bands on concurrent vowel identification (CVI) and Zebra speech perception (ZSP). METHODS: TFS information was systematically manipulated using a 30-band sine-wave (SV) vocoder. The TFS was either absent (SV), presented in all bands as frequency modulations simulating the FAME algorithm, or presented only in bands below 525 Hz to simulate EAS. CVI and ZSP were measured under each condition in 15 adults with normal hearing. RESULTS: The CVI scores did not differ between the three schemes (F(2, 28) = 0.62, p = 0.55, ηp² = 0.04). An effect of encoding TFS was observed for ZSP (F(2, 28) = 5.73, p = 0.008, ηp² = 0.29). Perception of Zebra speech was significantly better with EAS and FAME than with SV. There was no significant difference in ZSP scores obtained with EAS and FAME (p = 1.00). CONCLUSION: For ZSP, the TFS cues from FAME and EAS resulted in equivalent improvements in performance compared to the SV scheme. The presence or absence of TFS did not affect the CVI scores.
ABSTRACT
In the complex and dynamic landscape of cyber threats, organizations require sophisticated strategies for managing Cybersecurity Operations Centers and deploying Security Information and Event Management systems. Our study enhances these strategies by integrating the precision of well-known biomimetic optimization algorithms (Particle Swarm Optimization, the Bat Algorithm, the Gray Wolf Optimizer, and the Orca Predator Algorithm) with the adaptability of Deep Q-Learning, a reinforcement learning technique that leverages deep neural networks to teach algorithms optimal actions through trial and error in complex environments. This hybrid methodology targets the efficient allocation and deployment of network intrusion detection sensors while balancing cost-effectiveness with essential network security imperatives. Comprehensive computational tests show that the versions enhanced with Deep Q-Learning significantly outperform their native counterparts, especially in complex infrastructures. These results highlight the efficacy of integrating metaheuristics with reinforcement learning to tackle complex optimization challenges, underscoring Deep Q-Learning's potential to boost cybersecurity measures in rapidly evolving threat environments.
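The Deep Q-Learning layer is beyond a short sketch, but the native metaheuristic side can be illustrated. Below, a standard Particle Swarm Optimization loop places two sensors to minimize the worst node-to-nearest-sensor distance, a toy stand-in for the sensor-deployment cost; the node layout, swarm size, and coefficients are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical network nodes where intrusions must be observable
nodes = np.array([[0, 0], [1, 0], [0, 1], [4, 4], [5, 4], [4, 5]], float)
K = 2                                   # number of sensors to deploy

def cost(x):
    # x encodes K sensor coordinates; cost = worst node-to-nearest-sensor distance
    sensors = x.reshape(K, 2)
    d = np.linalg.norm(nodes[:, None, :] - sensors[None, :, :], axis=2)
    return d.min(axis=1).max()

# Standard PSO: inertia plus cognitive (pbest) and social (gbest) pulls
P, D, iters = 30, 2 * K, 200
pos = rng.uniform(-1, 6, (P, D))
vel = np.zeros((P, D))
pbest = pos.copy()
pbest_c = np.array([cost(p) for p in pos])
gbest = pbest[pbest_c.argmin()].copy()
for _ in range(iters):
    r1, r2 = rng.random((2, P, D))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    c = np.array([cost(p) for p in pos])
    improved = c < pbest_c
    pbest[improved], pbest_c[improved] = pos[improved], c[improved]
    gbest = pbest[pbest_c.argmin()].copy()

print("best worst-case distance:", round(pbest_c.min(), 3))
```

In the hybrid scheme described above, a learned policy would adjust parameters like the inertia and pull coefficients during the run instead of leaving them fixed.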
ABSTRACT
This article addresses the diagnostic challenges of palmoplantar dermatoses (PPD) within the scope of Primary Health Care (PHC). These common skin conditions, encountered in daily practice, exhibit a diverse range of symptoms and morphologies, complicating their diagnosis. They are etiologically classified into infectious inflammatory, non-infectious inflammatory, and hereditary keratodermas. While various dermatoses may affect the palms and soles, few are specific to this area. Notable examples include palmoplantar pustulosis, dyshidrosis, erythema pernio, and Bazex syndrome. Given the high prevalence of dermatological consultations in PHC, this article underscores the significance of PHC professionals' knowledge regarding these conditions. It proposes a diagnostic algorithm to facilitate their management and timely referral.
ABSTRACT
BACKGROUND: Digital technologies have positively impacted the availability and usability of clinical algorithms through advances in mobile health. Therefore, this study aimed to determine whether a web-based algorithm designed to support the decision-making process of cancer care providers (CCPs) differentially impacted their self-reported self-efficacy and practices for providing smoking prevention and cessation services in Peru and Colombia. METHODS: A simple decision-tree algorithm was built in REDCap using information from an extensive review of the currently available smoking prevention and cessation resources. We employed a pre-post study design with a mixed-methods approach among 53 CCPs in Peru and Colombia to pilot-test the web-based algorithm during a 3-month period. The Wilcoxon signed-rank test was used to compare the CCPs' self-efficacy and practices before and after using the web-based algorithm. The usability of the web-based algorithm was quantitatively measured with the System Usability Scale (SUS), as well as qualitatively through the analysis of four focus groups conducted among the participating CCPs. RESULTS: The pre-post assessments indicated that the CCPs significantly improved their self-efficacy and practices toward smoking prevention and cessation services after using the web-based algorithm. The overall average SUS score among study participants was 82.9 (± 9.33) [Peru 81.5; Colombia 84.1]. From the qualitative analysis of the focus group transcripts, four themes emerged: limited resources currently available for smoking prevention and cessation in oncology settings, merits of the web-based algorithm, challenges with the web-based algorithm, and suggestions for improving this web-based decision-making tool.
CONCLUSION: The web-based algorithm showed high usability and was well-received by the CCPs in Colombia and Peru, promoting a preliminary improvement in their smoking prevention and cessation self-efficacy and practices.
Subjects
Algorithms , Self Efficacy , Smoking Cessation , Humans , Smoking Cessation/methods , Colombia , Male , Female , Peru , Adult , Middle Aged , Smoking Prevention/methods , Internet , Health Personnel , Neoplasms/prevention & control
ABSTRACT
Although healthcare and medical technology have advanced significantly over the past few decades, heart disease continues to be a major cause of mortality globally. Electrocardiography (ECG) is one of the most widely used tools for the detection of heart diseases. This study presents a mathematical model based on transfer functions that allows for the exploration and optimization of heart dynamics in Laplace space using a genetic algorithm (GA). The transfer function parameters were fine-tuned using the GA, with clinical ECG records serving as reference signals. The proposed model, which is based on polynomials and delays, approximates a real ECG with a root-mean-square error of 4.7% and an R2 value of 0.72. The model achieves the periodic nature of an ECG signal by using a single periodic impulse input. Its simplicity makes it possible to adjust waveform parameters with a predetermined understanding of their effects, which can be used to generate both arrhythmic patterns and healthy signals. This is a notable advantage over other models that are burdened by a large number of differential equations and many parameters.
ABSTRACT
This paper studies a variant of the Pollution Traveling Salesman Problem (PTSP) focused on fuel consumption and pollution emissions (PTSPC). The PTSPC generalizes the well-known Traveling Salesman Problem (TSP), which is NP-hard. In the PTSPC, a vehicle must deliver a load to each customer through a Hamiltonian cycle, minimizing an objective function that considers the speed on each edge, the mass of the truck, the mass of the load pending delivery, and the distance traveled. We propose a three-phase algorithm for the PTSPC. The first phase solves the TSP exactly, with a time limit, and heuristically using a nearest neighborhood search approach; this phase handles the constraints associated with the PTSPC using commercial software. In the second phase, both the solutions obtained in the initial phase and their inverse sequences are improved using metaheuristic algorithms tailored to the PTSPC, namely Variable Neighborhood Search (VNS), Tabu Search (TS), and Simulated Annealing (SA). In the third phase, the best solution identified in the second phase (the one with the minimum PTSPC objective value) is refined by a mathematical model designed for the PTSPC, exploiting the heuristic emphasis of commercial software. The efficiency of the proposed algorithm has been validated through experiments adapting instances from the Pollution Routing Problem (PRP) to the PTSPC. This approach demonstrates the capacity to yield high-quality solutions within acceptable computing times.
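The first two phases, heuristic construction followed by metaheuristic improvement, can be sketched on a plain TSP (plain edge distance stands in for the mass- and speed-dependent PTSPC objective, and the instance is random rather than a PRP adaptation): nearest-neighbour construction, then simulated annealing over 2-opt moves.

```python
import numpy as np

rng = np.random.default_rng(5)

# Random customer coordinates; distance stands in for the fuel/emission cost
pts = rng.uniform(0, 100, (25, 2))
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)

def length(tour):
    return sum(D[tour[i], tour[(i + 1) % len(tour)]] for i in range(len(tour)))

# Phase 1: nearest-neighbour construction
tour = [0]
left = set(range(1, 25))
while left:
    nxt = min(left, key=lambda j: D[tour[-1], j])
    tour.append(nxt)
    left.remove(nxt)
nn_len = length(tour)

# Phase 2: simulated annealing over 2-opt moves
cur, cur_len = tour[:], nn_len
best, best_len = cur[:], cur_len
T = 100.0
for _ in range(20000):
    i, j = sorted(rng.integers(1, 25, 2))
    if i == j:
        continue
    cand = cur[:i] + cur[i:j + 1][::-1] + cur[j + 1:]   # 2-opt reversal
    d = length(cand) - cur_len
    if d < 0 or rng.random() < np.exp(-d / T):
        cur, cur_len = cand, length(cand)
        if cur_len < best_len:
            best, best_len = cur[:], cur_len
    T *= 0.9995

print(f"nearest-neighbour: {nn_len:.1f}, after SA: {best_len:.1f}")
```

In the PTSPC the segment reversal also changes the load profile along the route, so the candidate cost would be recomputed from the mass-and-speed objective rather than from symmetric distances; the phase-3 refinement by an exact model is not shown.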
ABSTRACT
BACKGROUND: The current version of the Fetal Medicine Foundation competing risks model for preeclampsia prediction has not been previously validated in Brazil. OBJECTIVE: This study aimed (1) to validate the Fetal Medicine Foundation combined algorithm for the prediction of preterm preeclampsia in the Brazilian population and (2) to describe the accuracy and calibration of the Fetal Medicine Foundation algorithm when considering the prophylactic use of aspirin by clinical criteria. STUDY DESIGN: This was a cohort study, including consecutive singleton pregnancies undergoing preeclampsia screening at 11 to 14 weeks of gestation, examining maternal characteristics, medical history, and biophysical markers between October 2010 and December 2018 in a university hospital in Brazil. Risks were calculated using the 2018 version of the algorithm available on the Fetal Medicine Foundation website, and cases were classified as low or high risk using a cutoff of 1/100 to evaluate predictive performance. Expected and observed cases with preeclampsia according to the Fetal Medicine Foundation-estimated risk range (≥1 in 10; 1 in 11 to 1 in 50; 1 in 51 to 1 in 100; 1 in 101 to 1 in 150; and <1 in 150) were compared. After identifying high-risk pregnant women who used aspirin, the treatment effect of 62% reduction in preterm preeclampsia identified in the Combined Multimarker Screening and Randomized Patient Treatment with Aspirin for Evidence-Based Preeclampsia Prevention trial was used to evaluate the predictive performance adjusted for the effect of aspirin. The number of potentially unpreventable cases in the group without aspirin use was estimated. RESULTS: Among 2749 pregnancies, preterm preeclampsia occurred in 84 (3.1%). With a risk cutoff of 1/100, the screen-positive rate was 25.8%. The detection rate was 71.4%, with a false positive rate of 24.4%. The area under the curve was 0.818 (95% confidence interval, 0.773-0.863). 
In the risk range ≥1/10, there was agreement between the number of expected and observed cases; in the other ranges, the predicted risk was lower than the observed rates. Accounting for the effect of aspirin resulted in an increase in the detection rate and positive predictive values and a slight decrease in the false positive rate. With 27 cases of preterm preeclampsia in the high-risk group without aspirin use, we estimated that 16 of these cases would have been avoided if this group had received prophylaxis. CONCLUSION: In a high-prevalence setting, the Fetal Medicine Foundation algorithm can identify women who are more likely to develop preterm preeclampsia. Not accounting for the effect of aspirin underestimates the screening performance.
ABSTRACT
Saimiri cassiquiarensis cassiquiarensis (Cebidae) is a primate subspecies with a wide distribution in the Amazonian region of Brazil, Colombia, and Venezuela. However, the boundaries of its geographic range remain poorly defined. This study presents new occurrence localities for this subspecies and updates its distribution using a compiled data set of 140 occurrence records based on the literature, specimens vouchered in scientific collections, and new field data to produce model-based range maps. After cleaning our data set, we updated the subspecies' extent of occurrence, which was used in model calibration. We then modeled the subspecies' range using a maximum entropy algorithm (MaxEnt). The final model was adjusted using a fixed threshold, and we revised this polygon based on known geographic barriers and parapatric congeneric ranges. Our findings indicate that this subspecies is strongly associated with lowland areas with consistently high daily temperatures. We propose modifications to all range boundaries and estimate that 3% of the area of occupancy (AOO, as defined by the IUCN) has already been lost to deforestation, resulting in a current range of 224,469 km2. We also found that 54% of its AOO is currently covered by protected areas (PAs). Based on these results, we consider that this subspecies is currently properly classified as Least Concern, because it occupies an extensive range, which is relatively well covered by PAs, and is currently experiencing low rates of deforestation.
Subjects
Animal Distribution , Saimiri , Animals , Saimiri/physiology , Venezuela , Brazil , Colombia , Conservation of Natural Resources , Ecosystem
ABSTRACT
BACKGROUND AND OBJECTIVE: The Modified Rankin Scale (mRS) is a widely adopted scale for assessing stroke recovery. Despite its limitations, the mRS has been adopted as the primary outcome in most recent acute stroke clinical trials. Designed to be used by multidisciplinary clinical staff, the scale is not applied with consistent congruency, which may lead to mistakes in clinical or research application. We aimed to develop and validate an interactive and automated digital tool for assessing the mRS, the iRankin. METHODS: A panel of five board-certified and mRS-trained vascular neurologists developed an automated flowchart based on the current mRS literature. Two international experts were consulted on content and provided feedback on the prototype platform. The platform contained five vignettes and five real video cases, representing mRS grades 0-5. For validation, we invited neurological staff from six comprehensive stroke centers to complete an online assessment. Participants were randomized into two equal groups: usual practice versus iRankin. The participants were randomly allocated in pairs for the congruency analysis. Weighted kappa (kw) and proportions were used to describe agreement. RESULTS: A total of 59 professionals completed the assessment. The kw improved markedly among nurses, 0.76 (95% confidence interval (CI) = 0.55-0.97) vs. 0.30 (0.07-0.67), and among vascular neurologists, 0.87 (0.72-1) vs. 0.82 (0.66-0.98). In the accuracy analysis, after a panel of experts determined the standard mRS values for the vignettes and videos, and with each correct answer counting as 1 point on a scale of 0-15, the iRankin group showed a higher mean score of 10.6 (±2.2) points than the control group's 8.2 (±2.3) points (p = 0.02). In an adjusted analysis, iRankin adoption was independently associated with the score of congruencies between reported and standard scores (beta coefficient = 2.22, 95% CI = 0.64-3.81, p = 0.007).
CONCLUSION: The iRankin adoption led to a substantial or near-perfect agreement in all analyzed professional categories. More trials are needed to generalize our findings. Our user-friendly and free platform is available at https://www.irankinscale.com/.
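The agreement statistic reported above can be computed in a few lines. Below is a minimal NumPy implementation of weighted Cohen's kappa for two raters scoring on the 0-5 mRS; the rater data are invented for illustration, and the linear weighting shown is one common choice (the study does not state which weighting it used).

```python
import numpy as np

def weighted_kappa(a, b, k=6, weights="linear"):
    # Weighted Cohen's kappa for two raters scoring on 0..k-1 (mRS is 0-5)
    a, b = np.asarray(a), np.asarray(b)
    O = np.zeros((k, k))
    for i, j in zip(a, b):
        O[i, j] += 1                          # observed agreement matrix
    O /= O.sum()
    E = np.outer(O.sum(1), O.sum(0))          # chance-agreement matrix
    diff = np.abs(np.subtract.outer(np.arange(k), np.arange(k)))
    W = diff / (k - 1) if weights == "linear" else (diff / (k - 1)) ** 2
    return 1 - (W * O).sum() / (W * E).sum()

rater1 = [0, 1, 2, 3, 4, 5, 2, 3]
rater2 = [0, 1, 2, 3, 4, 5, 3, 3]             # one near-miss disagreement
print("kappa_w =", round(weighted_kappa(rater1, rater2), 3))
```

Because the weights grow with the distance between grades, a one-grade disagreement costs far less than a 0-versus-5 disagreement, which is why weighted kappa suits ordinal scales like the mRS better than plain percent agreement.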