ABSTRACT
BACKGROUND: Polymerase chain reaction (PCR) testing of upper respiratory tract swab samples was established as the gold-standard procedure for diagnosing SARS-CoV-2 during the COVID-19 pandemic. However, saliva collection has attracted attention as an alternative diagnostic collection method. The goal of this study was to compare the use of saliva and nasopharyngeal swab (NPS) samples for the detection of SARS-CoV-2. METHODS: Ninety-nine paired saliva and swab samples were evaluated for qualitative diagnosis of SARS-CoV-2 and quantitative comparison of viral particles. Furthermore, the detection limits of each collection technique were determined. The cycle threshold (CT) values of the saliva samples, vaccination status, and the financial costs associated with each collection technique were compared. RESULTS: The results showed qualitative equivalence in diagnosis (96.96%) between saliva and swab collection, although quantitative agreement was low. Furthermore, the detection limit test demonstrated equivalence for both collection methods. We did not observe a statistically significant association between CT values and vaccination status, indicating that the vaccine had no influence on viral load at diagnosis. Finally, we observed that saliva collection incurs lower financial costs and requires less plastic material, making it more sustainable. CONCLUSIONS: These findings support the adoption of saliva collection as a feasible and sustainable alternative for the diagnosis of COVID-19.
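As a worked illustration of the qualitative comparison above, the sketch below computes overall percent agreement and Cohen's kappa for paired saliva/NPS calls. Only the total (n = 99) and the agreement level (96/99 ≈ 96.96%) come from the abstract; the 2x2 breakdown of concordant positives/negatives and discordant pairs is hypothetical.

```python
# Minimal sketch: percent agreement and Cohen's kappa for paired qualitative
# results (saliva vs. NPS). The split below is hypothetical; only n = 99 and
# 96 concordant pairs come from the abstract.
from sklearn.metrics import cohen_kappa_score

# Hypothetical paired calls: 1 = SARS-CoV-2 detected, 0 = not detected.
# 60 concordant positives, 36 concordant negatives, 3 discordant pairs (assumed).
saliva = [1] * 60 + [0] * 36 + [1, 0, 1]
nps = [1] * 60 + [0] * 36 + [0, 1, 0]

agreement = sum(s == n for s, n in zip(saliva, nps)) / len(saliva)
kappa = cohen_kappa_score(saliva, nps)
print(f"overall agreement: {agreement:.2%}")  # 96/99, ~96.97%
print(f"Cohen's kappa:     {kappa:.3f}")
```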
ABSTRACT
In recent years, the scientific community has worked intensively on the search for and development of new drugs to suppress viral infections such as COVID-19. A number of active compounds have been tested; however, the absence of significant structure-activity relationships hinders the production of optimized drugs. In this study, molecular modeling techniques were employed to investigate the electronic, structural, and chemical reactivity properties of a set of α-ketoamides whose antiviral activities have been reported in the literature, aiming to propose new promising derivatives. The local reactivity of the compounds was evaluated via condensed-to-atom Fukui indices and the molecular electrostatic potential. Multivariate data analysis and random forest machine learning techniques were employed to correlate antiviral properties with electronic and structural descriptors and to identify relevant variables. A series of new derivatives were then proposed and evaluated via density functional theory-based calculations and docking/molecular dynamics with the target protein of the virus. The results suggest that active derivatives present reduced reactivity towards electrophilic agents on the central core of the molecules and high reactivity on R1 ligands. Derivatives with higher predicted antiviral activities were proposed based on simple electronic descriptors, and their efficacies are reinforced by docking and molecular dynamics simulations.
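A minimal sketch of the descriptor-to-activity modeling step described above: fit a random forest regressor and rank variable importances. This is not the authors' code; the descriptor names, dataset size, and activity values are hypothetical stand-ins.

```python
# Illustrative sketch: relating electronic/structural descriptors to antiviral
# activity with a random forest and ranking feature importance.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Rows: alpha-ketoamide derivatives; columns: hypothetical descriptors
# (e.g., condensed Fukui index f+ on the central core, HOMO-LUMO gap, dipole).
X = rng.normal(size=(40, 3))
y = 1.5 * X[:, 0] - 0.7 * X[:, 1] + rng.normal(scale=0.3, size=40)  # toy activity

model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)
for name, imp in zip(["fukui_f_plus_core", "homo_lumo_gap", "dipole"],
                     model.feature_importances_):
    print(f"{name}: importance = {imp:.2f}")
```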
ABSTRACT
The rising popularity of herbal medicines as weight loss remedies, fueled by misleading advertising, raises concerns about their manufacturing processes and the potential inclusion of controlled substances such as fluoxetine (FLU). The objective of this work was to develop and evaluate the performance of an electrochemical device built by modifying a glassy carbon electrode (GC) with a nanocomposite based on reduced graphene oxide (rGO) and copper nanoparticles (CuNPs) for detecting FLU in manipulated herbal medicines. Scanning electron microscopy (FEG-SEM) and cyclic voltammetry (CV) were applied for morphological and electrochemical characterization and for analysis of the composite's electrochemical behavior. Under optimized conditions, the proposed sensor successfully detected FLU within the range of 0.6 to 1.6 µmol L-1, showing a limit of detection (LOD) of 0.14 µmol L-1. To determine the presence of FLU in herbal samples, known amounts of the analytical standard were added to the samples and the analyses were performed using the standard addition method, yielding recovery deviations between -2.13% and 2.0%.
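For readers unfamiliar with the standard addition method referenced above, here is a hedged sketch: peak current is regressed on spiked concentration and the line is extrapolated to its x-intercept to recover the analyte already present; the LOD follows the common 3σ/slope convention. All numeric values are hypothetical, not the paper's measurements.

```python
# Standard-addition sketch: fit current vs. spiked concentration, extrapolate
# the x-intercept to estimate the analyte already in the sample.
import numpy as np

spiked = np.array([0.0, 0.2, 0.4, 0.6, 0.8])        # added FLU, umol/L
current = np.array([0.52, 0.71, 0.93, 1.12, 1.31])  # peak current, uA (hypothetical)

slope, intercept = np.polyfit(spiked, current, 1)
c_sample = intercept / slope                        # x-intercept magnitude
print(f"estimated FLU in cell: {c_sample:.2f} umol/L")

# LOD via the common 3*sigma/slope convention (sigma of blank is hypothetical):
sigma_blank = 0.046
print(f"LOD ~ {3 * sigma_blank / slope:.2f} umol/L")
```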
Subject(s)
Anti-Obesity Agents , Graphite , Humans , Fluoxetine , Weight Loss , Plant Extracts
ABSTRACT
Ethanol (EtOH) alters many cellular processes in yeast. An integrated view of different EtOH-tolerant phenotypes and their long noncoding RNAs (lncRNAs) is not yet available. Here, large-scale data integration showed the core EtOH-responsive pathways, lncRNAs, and triggers of higher (HT) and lower (LT) EtOH-tolerant phenotypes. LncRNAs act in a strain-specific manner in the EtOH stress response. Network and omics analyses revealed that cells prepare for stress relief by favoring activation of life-essential systems. Therefore, longevity, peroxisomal, energy, lipid, and RNA/protein metabolisms are the core processes that drive EtOH tolerance. By integrating omics, network analysis, and several other experiments, we showed how the HT and LT phenotypes may arise: (1) the divergence occurs after cell signaling reaches the longevity and peroxisomal pathways, with CTA1 and ROS playing key roles; (2) signals reaching essential ribosomal and RNA pathways via SUI2 enhance the divergence; (3) specific lipid metabolism pathways also act on phenotype-specific profiles; (4) HTs take greater advantage of degradation and membraneless structures to cope with EtOH stress; and (5) our EtOH stress-buffering model suggests that diauxic shift drives EtOH buffering through an energy burst, mainly in HTs. Finally, critical genes, pathways, and the first models including lncRNAs to describe nuances of EtOH tolerance are reported here.
Subject(s)
RNA, Long Noncoding , Saccharomyces cerevisiae , Saccharomyces cerevisiae/genetics , Saccharomyces cerevisiae/metabolism , RNA, Long Noncoding/genetics , Ethanol/pharmacology , Ethanol/metabolism
ABSTRACT
INTRODUCTION: To understand current practices in stroke evaluation, the main clinical decision support systems and artificial intelligence (AI) technologies must be mapped so that therapists can obtain better insights into impairments and into the level of activity and participation of persons with stroke during rehabilitation. METHODS: This scoping review maps the use of AI for the functional evaluation of persons with stroke in any rehabilitation setting. Data were extracted from CENTRAL, MEDLINE, EMBASE, LILACS, CINAHL, PEDro, Web of Science, IEEE Xplore, AAAI Publications, ACM Digital Library, MathSciNet, and arXiv up to January 2021. The data obtained from the literature review were summarized in a single dataset in which each reference paper was treated as an instance and the study characteristics as attributes. The attributes used for the multiple correspondence analysis were publication year, study type, sample size, age, stroke phase, stroke type, functional status, AI type, and AI function. RESULTS: Forty-four studies were included. The analysis showed that spasticity analysis based on machine learning (ML) techniques was used in cases of stroke with moderate functional status. Deep learning techniques and pressure sensors were used for gait analysis. ML techniques and algorithms were used for upper limb and reaching analyses. The inertial measurement unit technique was applied in studies where the functional status ranged from mild to severe. The fuzzy logic technique was used for activity classifiers. CONCLUSION: The prevailing research themes demonstrated the growing utility of AI algorithms for stroke evaluation.
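To make the instances-by-attributes design concrete, the sketch below builds a small categorical table and projects it into two dimensions. For a dependency-light illustration it applies SVD to the one-hot indicator matrix, which approximates multiple correspondence analysis (full MCA is available in packages such as `prince`); the study rows are hypothetical, not records from the review.

```python
# Papers-as-instances, characteristics-as-attributes, projected to 2-D.
import pandas as pd
from sklearn.decomposition import TruncatedSVD

# Hypothetical study records with a subset of the review's attributes.
studies = pd.DataFrame({
    "stroke_phase": ["chronic", "acute", "subacute", "chronic"],
    "functional_status": ["moderate", "severe", "mild", "moderate"],
    "ai_type": ["ML", "deep_learning", "fuzzy_logic", "ML"],
    "ai_function": ["spasticity", "gait", "activity_classifier", "upper_limb"],
})

indicator = pd.get_dummies(studies)              # one-hot indicator matrix
coords = TruncatedSVD(n_components=2).fit_transform(indicator)
print(coords)                                    # 2-D coordinates for plotting
```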
Subject(s)
Artificial Intelligence , Stroke , Algorithms , Humans , International Classification of Functioning, Disability and Health , Muscle Spasticity
ABSTRACT
Hepatitis C virus has infected over 71 million people worldwide, and it is the main cause of cirrhosis in the Western world. Currently, treatment involves direct-acting antiviral agents (DAAs), and its main goal is to achieve sustained virologic response (SVR). The aim of this study was to evaluate the impact of DAA-induced SVR on liver fibrosis, assessed by indirect scoring methods, liver function scores, and indirect inflammation biomarkers. Patients with cirrhosis who achieved SVR after treatment (n = 104) were evaluated using liver function scores, indirect fibrosis methods, alpha-fetoprotein, and ferritin at t-base and t-SVR. Statistically significant improvements were observed in all parameters: 54 patients were classified as Child-Pugh (CP) score 5 at t-base versus 77 at t-SVR, and significant decreases were observed in MELD score, alpha-fetoprotein, ferritin, APRI, FIB-4, and liver stiffness on elastography. We did not observe differences in liver function scores between regressors and non-regressors of liver stiffness, nor in indirect inflammation biomarkers. Fibrosis measurements by indirect methods decreased significantly in treated patients with cirrhosis who achieved SVR, in association with decreased indirect inflammation biomarkers and improved liver function scores.
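The APRI and FIB-4 scores cited above have standard closed-form definitions, so a worked example helps: APRI = (AST / upper limit of normal) / platelets(10^9/L) x 100, and FIB-4 = (age x AST) / (platelets(10^9/L) x sqrt(ALT)). The lab values below are hypothetical, chosen only to illustrate a post-treatment drop.

```python
# Indirect fibrosis scores with their standard formulas (hypothetical values).
import math

def apri(ast: float, ast_uln: float, platelets: float) -> float:
    """APRI = (AST / upper limit of normal) / platelets(10^9/L) * 100."""
    return (ast / ast_uln) / platelets * 100

def fib4(age: float, ast: float, alt: float, platelets: float) -> float:
    """FIB-4 = (age * AST) / (platelets(10^9/L) * sqrt(ALT))."""
    return (age * ast) / (platelets * math.sqrt(alt))

# Hypothetical labs at t-base vs. t-SVR showing the expected decrease.
print(f"APRI  t-base: {apri(ast=80, ast_uln=40, platelets=110):.2f}")
print(f"APRI  t-SVR:  {apri(ast=35, ast_uln=40, platelets=140):.2f}")
print(f"FIB-4 t-base: {fib4(age=58, ast=80, alt=70, platelets=110):.2f}")
print(f"FIB-4 t-SVR:  {fib4(age=58, ast=35, alt=30, platelets=140):.2f}")
```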
Subject(s)
Hepatitis C, Chronic , Antiviral Agents/therapeutic use , Biomarkers , Ferritins , Fibrosis , Hepacivirus , Hepatitis C, Chronic/complications , Hepatitis C, Chronic/drug therapy , Humans , Inflammation/complications , Liver Cirrhosis/drug therapy , Sustained Virologic Response , alpha-Fetoproteins
ABSTRACT
Gene regulatory networks (GRNs) play key roles in development, phenotype plasticity, and evolution. Although graph theory has been used to explore GRNs, the associations among topological features, transcription factors (TFs), and systems essentiality are poorly understood. Here we sought the relationships among the main GRN topological features that influence the control of essential and specific subsystems. We found that Knn, PageRank, and degree are the most relevant GRN features: they are conserved throughout evolution and are also relevant in pluripotent cells. Interestingly, life-essential subsystems are governed mainly by TFs with intermediate Knn and high PageRank or degree, whereas specialized subsystems are mainly regulated by TFs with low Knn. Hence, we suggest that a high probability of TFs being visited by a random signal, together with a high probability of the signal propagating to target genes, ensures the robustness of life-essential subsystems. Gene/genome duplication is the main evolutionary process that raises Knn, the most relevant feature. Herein, we shed light on unexplored topological GRN features to assess how they relate to subsystems and how duplications shaped regulatory systems throughout evolution. The classification model generated can be found here: https://github.com/ivanrwolf/NoC/ .
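The three features named above are standard graph metrics, so a small sketch can show how they are computed in practice. The toy network below is hypothetical and unrelated to the study's GRNs; only the metrics themselves (degree, PageRank, and Knn, the average nearest-neighbor degree) match the abstract.

```python
# Computing degree, PageRank, and Knn on a toy directed regulatory network.
import networkx as nx

grn = nx.DiGraph([
    ("TF1", "geneA"), ("TF1", "geneB"), ("TF1", "TF2"),
    ("TF2", "geneB"), ("TF2", "geneC"), ("TF3", "geneC"),
])

degree = dict(grn.degree())
pagerank = nx.pagerank(grn)
knn = nx.average_neighbor_degree(grn)  # Knn: mean degree of a node's neighbors

for node in grn:
    print(f"{node}: degree={degree[node]}, "
          f"pagerank={pagerank[node]:.3f}, knn={knn[node]:.2f}")
```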
ABSTRACT
The literature shows no consensus regarding the best resin composite photoactivation protocol. This study evaluated the efficiency of the conventional, soft-start, pulse-delay, and exponential protocols for the photoactivation of resin composites in reducing shrinkage stress and temperature variation during photopolymerisation. The photoactivation processes were performed using a photocuring unit and a smartphone app developed to control the irradiance according to each photoactivation protocol. These photoactivation methods were evaluated applying the photoactivation energies recommended by the resin manufacturers. Three brands of resin composites were analysed: Z-250, Charisma and Ultrafill. Cure effectiveness was evaluated through depth-of-cure experiments. All results were statistically evaluated using one-way and multi-factor analysis of variance (ANOVA). The exponential and pulse-delay methods resulted in a significant reduction of shrinkage stress for all evaluated resins; however, the pulse-delay method required too long a photoactivation time. Temperature increases were lower when exponential photoactivation was applied; however, the temperature variation under all photoactivation protocols was not enough to cause damage to the restoration area. The depth-of-cure evaluation showed that all photoactivation protocols resulted in cured resins with equivalent hardness, indicating that choosing an alternative photoactivation protocol did not harm the polymerisation. The results therefore indicate the exponential protocol as the best photoactivation technique for practical applications.
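Since all four protocols must deliver the manufacturer-recommended energy (the dose is the time integral of irradiance), they differ only in how irradiance is shaped over time. The sketch below parameterizes the four profiles and rescales each to the same dose; the dose, exposure time, and shape parameters are illustrative, not values from the study.

```python
# Four irradiance-vs-time shapes, each rescaled to the same delivered energy.
import numpy as np

DOSE = 16.0                     # J/cm^2, illustrative target energy
T = 20.0                        # s, total exposure time
t = np.linspace(0.0, T, 2001)
dt = t[1] - t[0]

def rescale(profile):
    """Scale a shape so its time integral equals the target dose."""
    return profile * DOSE / (profile.sum() * dt)

conventional = rescale(np.ones_like(t))                               # constant
soft_start = rescale(np.where(t < 5.0, 0.3, 1.0))                     # step up
pulse_delay = rescale(np.where((t < 2.0) | (t > 12.0), 1.0, 0.0))     # pulse, dark pause, cure
exponential = rescale(1.0 - np.exp(-t / 4.0))                         # smooth ramp-up

for name, p in [("conventional", conventional), ("soft-start", soft_start),
                ("pulse-delay", pulse_delay), ("exponential", exponential)]:
    print(f"{name}: delivered energy = {p.sum() * dt:.2f} J/cm^2")
```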
ABSTRACT
Cirrhotic patients with chronic hepatitis C should be monitored for evaluation of liver function and screening for hepatocellular carcinoma even after sustained virological response (SVR). Inflammatory resolution and regression of fibrosis are likely once treatment and viral clearance are achieved. However, liver examination by elastography shows that 30-40% of patients do not exhibit a reduction in liver stiffness. This cohort study of cirrhotic patients aimed to identify immunological factors involved in the regression of liver stiffness in chronic hepatitis C and to characterize possible serum biomarkers with prognostic value. The sample universe consisted of 31 cirrhotic patients who underwent leukocyte immunophenotyping and quantification of cytokines/chemokines and metalloproteinase inhibitors at pretreatment (M1) and at SVR evaluation (M2). After application of the exclusion criteria, the 16 included patients were evaluated once more at M3 (as at M1) and classified as regressors (R) or non-regressors (NR) according to whether liver stiffness decreased by ≥ 25% or not. The results from ROC curves, machine learning (ML), and linear discriminant analysis showed that absolute CD4+ T-lymphocyte counts are the most important biomarker for predicting regression (AUC = 0.90). NR patients presented lower liver stiffness than R patients from baseline, whereas NK cells were increased in NR. Therefore, we conclude that the profile of circulating immune cells differs between R and NR, allowing the development of a predictive model of liver stiffness regression after SVR. These findings should be validated in larger numbers of patients.
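A minimal sketch of the predictive step described above: score a single biomarker (absolute CD4+ T-lymphocyte count) against the R/NR label via ROC AUC, then fit a linear discriminant classifier. The cohort size matches the abstract (16 patients), but the labels and cell counts are simulated, not patient data.

```python
# ROC AUC and LDA for a single-biomarker regression predictor (simulated data).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
# 16 patients: 1 = regressor (stiffness decrease >= 25%), 0 = non-regressor.
regressor = np.array([1] * 9 + [0] * 7)
cd4_abs = np.where(regressor == 1,
                   rng.normal(900, 120, size=16),   # hypothetical cells/mm^3
                   rng.normal(650, 120, size=16))

print("AUC:", roc_auc_score(regressor, cd4_abs))

lda = LinearDiscriminantAnalysis().fit(cd4_abs.reshape(-1, 1), regressor)
print("predicted classes:", lda.predict([[880], [600]]))  # illustrative R vs. NR
```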
Subject(s)
Hepatitis C, Chronic , Liver Neoplasms , Antiviral Agents/therapeutic use , Cohort Studies , Hepatitis C, Chronic/complications , Hepatitis C, Chronic/drug therapy , Hepatitis C, Chronic/pathology , Humans , Inflammation/pathology , Liver/pathology , Liver Cirrhosis/pathology , Liver Neoplasms/pathology
ABSTRACT
The lack of consensus concerning the biological meaning of the entropy and complexity of genomes, and the different ways of assessing these data, hampers conclusions about the causes of genomic entropy variation among species. This study aims to evaluate the entropy and complexity of genomic sequences of several species, without using homologies, to assess the relationships among these variables and non-molecular data (e.g., the number of individuals) and to seek a trigger of interspecific genomic entropy variation. The results indicate a relationship among genomic entropy, genome size, genomic complexity, and the number of individuals: species with a small number of individuals harbor large genomes and, hence, low entropy but higher complexity. We define the complexity of a genome as relying on the entropy of each DNA segment within the genome; thus, the entropy and complexity of a genome reflect its organization alone. Exons of vertebrates harbor smaller entropies than non-exon regions (likely owing to repeats accumulated from duplications), whereas other taxonomic groups do not present this pattern. Our findings suggest that small initial populations might have defined current genomic entropy and complexity: present-day genomes are less complex than ancestral ones. Besides, our data disagree with the relationship between phenotype and genomic entropies established previously. Finally, by establishing the relationship of genomic entropy/complexity with the number of individuals and genome size under an evolutionary perspective, ideas concerning genomic variability may emerge.
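Shannon entropy over k-mer frequencies is one common homology-free way to score a sequence, and a short example makes the repetitive-versus-mixed contrast concrete. The abstract does not specify its exact estimator, so this is an illustration of the general approach rather than the study's code.

```python
# Shannon entropy of a DNA sequence's k-mer distribution.
import math
from collections import Counter

def kmer_entropy(seq: str, k: int = 3) -> float:
    """Shannon entropy (bits) of the k-mer distribution of `seq`."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    counts = Counter(kmers)
    n = len(kmers)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(kmer_entropy("ATATATATATATATAT"))   # repetitive: low entropy (~1 bit)
print(kmer_entropy("ATCGGACTTAGCCGTA"))   # mixed: higher entropy
```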
Subject(s)
Genetic Variation , Sequence Analysis, DNA/methods , Vertebrates/growth & development , Animals , Entropy , Evolution, Molecular , Genome , Humans , Models, Genetic
ABSTRACT
The skin prick test is used to diagnose patients' sensitization to antigens through an IgE-mediated response. It is a practical and quick exam, but its diagnosis depends on instruments for measuring the allergic response and on the observer's interpretation. The conventional method infers the allergic reaction from the dimensions of the wheals, which are measured using a ruler or a caliper. To make this diagnosis less dependent on human interpretation, the present study proposes two alternative methods for assessing the allergic reaction: computational determination of the wheal area and a study of the temperature variation of the patient's skin in the puncture region. For this purpose, a histamine prick test was performed on 20 randomly selected patients. The areas were determined by the conventional method using the dimensions of the wheals measured with a digital caliper 30 min after the puncture. The wheal areas were also determined by a Python algorithm using photographs of the puncture region obtained with a smartphone. A variable named circularity deviation was also determined for each analyzed wheal. The temperature variation was monitored using an infrared temperature sensor, which collected temperature data for 30 min. All results were statistically compared or correlated. The results showed that the wheal areas inferred by the computational method did not differ significantly from the areas determined by the conventional method (p-value = 0.07585). Temperature monitoring revealed a consistent temperature increase in the first minutes after the puncture, followed by stabilization, such that the data could be fitted by a logistic equation (R² = 0.96). This fit showed that the optimal time to measure the temperature is 800 s after the puncture, when temperature stabilization occurs. The results also showed that this stabilization temperature has a significant positive correlation with wheal area (p-value = 0.0015). Thus, we conclude that the proposed computational method is more accurate for inferring the wheal area than the traditional method, and that temperature may be used as an alternative parameter for assessing the allergic reaction.
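The sketch below illustrates the two quantitative pieces described above. The abstract does not give formulas, so the circularity definition (4πA/P², with deviation defined as its departure from 1) and the three-parameter logistic curve are standard choices, not necessarily the authors'; all data points are simulated.

```python
# Circularity deviation of a wheal contour, and a logistic fit of the
# temperature rise after puncture (simulated readings).
import numpy as np
from scipy.optimize import curve_fit

def circularity_deviation(area: float, perimeter: float) -> float:
    """0 for a perfect circle; grows as the contour departs from circular."""
    return 1.0 - (4.0 * np.pi * area) / perimeter ** 2

print(circularity_deviation(area=78.5, perimeter=31.4))  # ~0: circle-like

def logistic(t, L, k, t0):
    """Temperature rise toward a plateau L (logistic in time)."""
    return L / (1.0 + np.exp(-k * (t - t0)))

t = np.linspace(0, 1800, 60)   # s, 30 min of readings
temp_rise = logistic(t, 1.8, 0.01, 300) \
    + np.random.default_rng(2).normal(0, 0.05, t.size)
(L, k, t0), _ = curve_fit(logistic, t, temp_rise, p0=[2.0, 0.01, 200.0])
print(f"plateau ~ {L:.2f} C above baseline, midpoint near t ~ {t0:.0f} s")
```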
Subject(s)
Hypersensitivity/diagnosis , Image Interpretation, Computer-Assisted , Immunoglobulin E/immunology , Intradermal Tests , Photography , Skin Temperature , Skin/immunology , Thermography , Humans , Hypersensitivity/immunology , Hypersensitivity/pathology , Hypersensitivity/physiopathology , Image Interpretation, Computer-Assisted/instrumentation , Intradermal Tests/instrumentation , Mobile Applications , Photography/instrumentation , Predictive Value of Tests , Reproducibility of Results , Skin/pathology , Skin/physiopathology , Smartphone , Thermography/instrumentation , Time Factors
ABSTRACT
HIV subtype B is the most frequent in Brazil. The HIV subtype B' variant codes the amino acids glycine-tryptophan-glycine (GWG) instead of glycine-proline-glycine (GPG) at the tip of the gp120 V3 loop. This variant has been associated with slower HIV progression in mono-infected patients; however, there is no information on coinfected patients. This study evaluated the progression of infection with the HIV B' variant in the presence of hepatitis C virus. RNA isolated from the plasma of 601 infected patients was used for human immunodeficiency virus (HIV) subtyping and to classify the virus according to its syncytium-inducing ability. HIV infection progression was evaluated using clinical and laboratory data. The results showed a significant association between the HIV B' variant and CD4 count and time to AIDS in HIV mono-infected patients. We did not find a direct association between the GWG variant and AIDS, and no mitigating effect of the GWG variant was found in HIV-coinfected patients. We did observe that the association between the GWG variant and CD4 counts is lost in coinfected patients. This is the first work showing the influence of the HIV GWG variant in coinfected patients. Nevertheless, the presence of the GWG variant may indicate a better prognosis in mono-infected patients.
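As a small illustration of the B/B' distinction described above, the sketch below classifies a V3-loop amino-acid sequence by its crown tripeptide (GPG vs. GWG). The example sequence is a hypothetical V3 loop; a real subtyping pipeline would translate env reads and align them before inspecting the crown.

```python
# Classify a V3 loop by its crown motif: GPG (subtype B) vs. GWG (B' variant).
def v3_crown_variant(v3_aa: str) -> str:
    """Return which tripeptide occupies the V3 tip of the given sequence."""
    if "GWG" in v3_aa:
        return "B' (GWG) variant"
    if "GPG" in v3_aa:
        return "B (GPG) variant"
    return "other/undetermined"

# Hypothetical V3 loop carrying the classic GPGR crown:
print(v3_crown_variant("CTRPNNNTRKSIHIGPGRAFYTTGEIIGDIRQAHC"))
```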
Subject(s)
HIV Envelope Protein gp120/genetics , HIV Infections , HIV-1/genetics , Hepatitis C , Adult , Brazil/epidemiology , CD4 Lymphocyte Count/methods , Coinfection/epidemiology , Coinfection/virology , Disease Progression , Female , HIV Infections/epidemiology , HIV Infections/immunology , HIV Infections/virology , Hepacivirus/isolation & purification , Hepatitis C/epidemiology , Hepatitis C/virology , Humans , Male , Prognosis , RNA, Viral/analysis
ABSTRACT
INTRODUCTION: High levels of shrinkage stress caused by volumetric variations during the activation process are one of the main problems in the practical application of composite resins. OBJECTIVE: The aim of this study is to reduce shrinkage stress and minimize the effects caused by composite resin volumetric variation due to photopolymerization. To this end, this work proposes a systematic study to determine the optimal dimming function to be applied to light curing processes. MATERIAL AND METHODS: The study was performed by applying mathematical techniques for the optimization of nonlinear objective functions. The effectiveness of the dimming function was evaluated by monitoring the polymerization shrinkage stress during the curing process of five brands/models of composites. This monitoring was performed on a universal testing machine using two steel bases coupled to the arms of the machine, between which the resin was inserted and polymerized. The quality of the composites cured by the proposed method was analyzed and compared with the conventional photoactivation method through experiments to determine their degree of conversion (DC). Absorbance measurements were performed using Fourier-transform infrared spectroscopy (FT-IR). A t-test was performed on the DC results to compare the photoactivation techniques. We also used scanning electron microscopy (SEM) to analyze in vitro the adhesion interface of the resin in human teeth. RESULTS: Our results showed that the use of the optimal dimming function, termed exponential, resulted in a significant reduction of shrinkage stress (~36.88 ± 6.56% compared with the conventional method) without affecting the DC (t = 0.86, p-value = 0.44). The SEM analyses show that the proposed process can minimize or even eliminate adhesion failures between the tooth and the resin in dental restorations. CONCLUSION: The results of this study can improve the composite resin light curing process by minimizing polymerization shrinkage effects, given an operational standardization of the photoactivation process.
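To make the optimization framing concrete, here is a toy sketch: choose the time constant of an exponential dimming ramp that minimizes a stress surrogate while a rescaling step keeps the delivered energy fixed at the recommended dose. The surrogate (early, fast irradiance drives stress; very long ramps incur a penalty) and all numbers are hypothetical; the study's actual objective was measured shrinkage stress.

```python
# Toy nonlinear optimization of an exponential dimming function under a
# fixed-energy constraint (all parameters hypothetical).
import numpy as np
from scipy.optimize import minimize_scalar

DOSE, T = 16.0, 20.0            # J/cm^2 and seconds (illustrative)
t = np.linspace(0.0, T, 2001)
dt = t[1] - t[0]

def stress_surrogate(tau: float) -> float:
    profile = 1.0 - np.exp(-t / tau)          # exponential ramp-up
    profile *= DOSE / (profile.sum() * dt)    # enforce the energy constraint
    early_stress = np.sum(profile * np.exp(-t / 5.0)) * dt  # early light penalized
    slow_cure_penalty = 0.1 * tau             # toy cost of overly long ramps
    return float(early_stress + slow_cure_penalty)

res = minimize_scalar(stress_surrogate, bounds=(0.5, 15.0), method="bounded")
print(f"optimal tau ~ {res.x:.1f} s (under this toy surrogate)")
```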
Subject(s)
Composite Resins/chemistry , Composite Resins/radiation effects , Light-Curing of Dental Adhesives/methods , Polymerization/radiation effects , Adhesiveness , Dental Stress Analysis , Materials Testing , Microscopy, Electron, Scanning , Phase Transition/radiation effects , Reference Values , Spectroscopy, Fourier Transform Infrared , Stress, Mechanical , Time Factors