1.
Nature ; 630(8016): 493-500, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38718835

ABSTRACT

The introduction of AlphaFold 2 has spurred a revolution in modelling the structure of proteins and their interactions, enabling a huge range of applications in protein modelling and design. Here we describe our AlphaFold 3 model with a substantially updated diffusion-based architecture that is capable of predicting the joint structure of complexes including proteins, nucleic acids, small molecules, ions and modified residues. The new AlphaFold model demonstrates substantially improved accuracy over many previous specialized tools: far greater accuracy for protein-ligand interactions compared with state-of-the-art docking tools, much higher accuracy for protein-nucleic acid interactions compared with nucleic-acid-specific predictors and substantially higher antibody-antigen prediction accuracy compared with AlphaFold-Multimer v.2.3. Together, these results show that high-accuracy modelling across biomolecular space is possible within a single unified deep-learning framework.


Subject(s)
Deep Learning , Ligands , Models, Molecular , Proteins , Software , Humans , Antibodies/chemistry , Antibodies/metabolism , Antigens/metabolism , Antigens/chemistry , Deep Learning/standards , Ions/chemistry , Ions/metabolism , Molecular Docking Simulation , Nucleic Acids/chemistry , Nucleic Acids/metabolism , Protein Binding , Protein Conformation , Proteins/chemistry , Proteins/metabolism , Reproducibility of Results , Software/standards
2.
Nucleic Acids Res ; 52(6): 2821-2835, 2024 Apr 12.
Article in English | MEDLINE | ID: mdl-38348970

ABSTRACT

A key attribute of some long noncoding RNAs (lncRNAs) is their ability to regulate expression of neighbouring genes in cis. However, such 'cis-lncRNAs' are presently defined using ad hoc criteria that, we show, are prone to false-positive predictions. The resulting lack of cis-lncRNA catalogues hinders our understanding of their extent, characteristics and mechanisms. Here, we introduce TransCistor, a framework for defining and identifying cis-lncRNAs based on enrichment of targets amongst proximal genes. TransCistor's simple and conservative statistical models are compatible with functionally defined target gene maps generated by existing and future technologies. Using transcriptome-wide perturbation experiments for 268 human and 134 mouse lncRNAs, we provide the first large-scale survey of cis-lncRNAs. Known cis-lncRNAs are correctly identified, including XIST, LINC00240 and UMLILO, and predictions are consistent across analysis methods, perturbation types and independent experiments. We detect cis-activity in a minority of lncRNAs, primarily involving activators over repressors. Cis-lncRNAs are detected by both RNA interference and antisense oligonucleotide perturbations. Mechanistically, cis-lncRNA transcripts are observed to physically associate with their target genes and are weakly enriched with enhancer elements. In summary, TransCistor establishes a quantitative foundation for cis-lncRNAs, opening a path to elucidating their molecular mechanisms and biological significance.
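TransCistor's core idea — enrichment of functionally defined targets among genes proximal to the lncRNA locus — can be sketched as a one-sided hypergeometric test. The sketch below is an illustrative reconstruction under that assumption; the function name and counts are hypothetical, not TransCistor's actual statistical model.

```python
from math import comb

def proximal_target_enrichment(n_genes, n_targets, n_proximal, n_proximal_targets):
    """One-sided hypergeometric p-value: probability of observing at least
    n_proximal_targets regulated genes among the n_proximal genes closest
    to the lncRNA locus, given n_targets regulated genes genome-wide."""
    p = 0.0
    for k in range(n_proximal_targets, min(n_targets, n_proximal) + 1):
        p += (comb(n_targets, k)
              * comb(n_genes - n_targets, n_proximal - k)
              / comb(n_genes, n_proximal))
    return p
```

A small p-value would flag a lncRNA as a candidate cis-regulator; in practice a conservative multiple-testing correction would be applied across all lncRNAs surveyed.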


Subject(s)
Computational Biology , Genetic Techniques , RNA, Long Noncoding , Animals , Humans , Mice , RNA, Long Noncoding/genetics , RNA, Long Noncoding/isolation & purification , Transcription Factors/genetics , Transcriptome , Software/standards , Computational Biology/methods
3.
Nucleic Acids Res ; 52(6): 2836-2847, 2024 Apr 12.
Article in English | MEDLINE | ID: mdl-38412249

ABSTRACT

The field of synthetic nucleic acids with novel backbone structures [xenobiotic nucleic acids (XNAs)] has flourished due to the increased importance of XNA antisense oligonucleotides and aptamers in medicine, as well as the development of XNA processing enzymes and new XNA genetic materials. Molecular modeling of XNA structures can accelerate rational design in the field of XNAs, as it contributes to understanding and predicting how changes in the sugar-phosphate backbone affect the complementation properties of the nucleic acids. To support the development of novel XNA polymers, we present a first-in-class open-source program (Ducque) to build duplexes of nucleic acid analogs with customizable chemistry. A detailed procedure is described to extend the Ducque library with new user-defined XNA fragments using quantum mechanics (QM) and to generate QM-based force field parameters for molecular dynamics simulations within standard packages such as AMBER. The tool was used within a molecular modeling workflow to accurately reproduce a selection of experimental structures for nucleic acid duplexes with ribose-based as well as non-ribose-based nucleosides. Additionally, it was challenged to build duplexes of morpholino nucleic acids bound to complementary RNA sequences.


Subject(s)
Molecular Dynamics Simulation , Morpholinos , Nucleic Acids , RNA , Software , Morpholinos/chemistry , Nucleic Acid Conformation , Nucleic Acids/chemistry , Oligonucleotides/chemistry , RNA/chemistry , Software/standards
4.
Nucleic Acids Res ; 52(6): e31, 2024 Apr 12.
Article in English | MEDLINE | ID: mdl-38364867

ABSTRACT

Proteins are crucial in regulating every aspect of RNA life, yet understanding their interactions with coding and noncoding RNAs remains limited. Experimental studies are typically restricted to a small number of cell lines and a limited set of RNA-binding proteins (RBPs). Although computational methods based on physico-chemical principles can predict protein-RNA interactions accurately, they often lack the ability to consider cell-type-specific gene expression and the broader context of gene regulatory networks (GRNs). Here, we assess the performance of several GRN inference algorithms in predicting protein-RNA interactions from single-cell transcriptomic data, and propose a pipeline, called scRAPID (single-cell transcriptomic-based RnA Protein Interaction Detection), that integrates these methods with the catRAPID algorithm, which can identify direct physical interactions between RBPs and RNA molecules. Our approach demonstrates that RBP-RNA interactions can be predicted from single-cell transcriptomic data, with performance comparable or superior to that achieved for the well-established task of inferring transcription factor-target interactions. The incorporation of catRAPID significantly enhances the accuracy of identifying interactions, particularly with long noncoding RNAs, and enables the identification of hub RBPs and RNAs. Additionally, we show that interactions between RBPs can be detected based on their inferred RNA targets. The software is freely available at https://github.com/tartaglialabIIT/scRAPID.
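Integrating network-inferred edge scores with physical interaction propensities can be sketched, for illustration only, as a simple rank-product combination of two rankings over candidate RBP-RNA edges. The function and scoring scheme below are hypothetical and are not the scRAPID pipeline's actual integration method.

```python
def rank_product(scores_a, scores_b):
    """Combine two edge rankings (e.g., GRN-inferred RBP->RNA scores and
    catRAPID-style interaction propensities) by rank product; lower is
    better. Only edges present in both rankings are kept."""
    def ranks(scores):
        ordered = sorted(scores, key=scores.get, reverse=True)
        return {edge: r + 1 for r, edge in enumerate(ordered)}
    ra, rb = ranks(scores_a), ranks(scores_b)
    common = set(ra) & set(rb)
    return {edge: ra[edge] * rb[edge] for edge in common}
```

Edges ranked highly by both the expression-based and the physics-based predictor end up with the smallest products, which is the intuition behind combining the two evidence types.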


Subject(s)
RNA-Binding Proteins , RNA , Single-Cell Gene Expression Analysis , Software , Algorithms , RNA/genetics , RNA/metabolism , RNA-Binding Proteins/metabolism , Software/standards , Gene Regulatory Networks , Humans , Cell Line
5.
Plant Physiol ; 195(1): 378-394, 2024 Apr 30.
Article in English | MEDLINE | ID: mdl-38298139

ABSTRACT

Automated guard cell detection and measurement are vital for understanding plant physiological performance and ecological functioning in global water and carbon cycles. Most current methods for measuring guard cells and stomata are laborious, time-consuming, prone to bias, and limited in scale. We developed StoManager1, a high-throughput tool utilizing geometrical, mathematical algorithms, and convolutional neural networks to automatically detect, count, and measure over 30 guard cell and stomatal metrics, including guard cell and stomatal area, length, width, stomatal aperture area/guard cell area, orientation, stomatal evenness, divergence, and aggregation index. Combined with leaf functional traits, some of these StoManager1-measured guard cell and stomatal metrics explained 90% and 82% of tree biomass and intrinsic water use efficiency (iWUE) variances in hardwoods, making them substantial factors in leaf physiology and tree growth. StoManager1 demonstrated exceptional precision and recall (mAP@0.5 over 0.96), effectively capturing diverse stomatal properties across over 100 species. StoManager1 facilitates the automation of measuring leaf stomatal and guard cells, enabling broader exploration of stomatal control in plant growth and adaptation to environmental stress and climate change. This has implications for global gross primary productivity (GPP) modeling and estimation, as integrating stomatal metrics can enhance predictions of plant growth and resource usage worldwide. Easily accessible open-source code and standalone Windows executable applications are available on a GitHub repository (https://github.com/JiaxinWang123/StoManager1) and Zenodo (https://doi.org/10.5281/zenodo.7686022).
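Among the metrics listed, the aggregation index has a classical spatial-statistics formulation (the Clark-Evans index). The sketch below computes an illustrative version from detected stomatal centroids; it is not StoManager1's exact implementation, and the centroid coordinates are hypothetical.

```python
import math

def clark_evans_index(centroids, leaf_area):
    """Clark-Evans aggregation index R for stomatal centroids:
    observed mean nearest-neighbour distance over the value expected
    for a random pattern of the same density. R ~ 1 random,
    R < 1 clustered, R > 1 evenly spaced."""
    n = len(centroids)
    nearest = []
    for i, (xi, yi) in enumerate(centroids):
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(centroids) if j != i)
        nearest.append(d)
    observed = sum(nearest) / n
    expected = 0.5 / math.sqrt(n / leaf_area)
    return observed / expected
```

For a perfectly regular grid of stomata the index exceeds 1, reflecting the even spacing that stomatal development tends to enforce.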


Subject(s)
Botany , Cell Biology , Plant Cells , Plant Stomata , Software , Plant Stomata/cytology , Plant Stomata/growth & development , Plant Cells/physiology , Botany/instrumentation , Botany/methods , Cell Biology/instrumentation , Image Processing, Computer-Assisted/standards , Algorithms , Plant Leaves/cytology , Neural Networks, Computer , High-Throughput Screening Assays/instrumentation , High-Throughput Screening Assays/methods , High-Throughput Screening Assays/standards , Software/standards
6.
Value Health Reg Issues ; 42: 100980, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38677062

ABSTRACT

OBJECTIVES: The study aimed to evaluate the cost-effectiveness of the Pare de Fumar Conosco software compared with the standard of care adopted in Brazil for the treatment of smoking cessation. METHODS: In a cohort of smokers with multiple chronic conditions, we developed a decision tree model for the benefit measures of smoking cessation. We adopted the perspectives of the Brazilian Unified Health System and the service provider. Resources and costs were measured from primary and secondary sources, and effectiveness from a randomized clinical trial. The incremental cost-effectiveness ratio (ICER) was calculated, followed by deterministic and probabilistic sensitivity analyses. No willingness-to-pay threshold was adopted. RESULTS: The software had a lower cost and greater effectiveness than its comparator. The ICER was dominant for all of the benefits examined (-R$2 585 178.29 to -R$325 001.20). The cost of the standard of care, followed by that of the electronic tool, most affected the ICER of the benefit measures. In all probabilistic analyses, the software was superior to the standard of care (53.6%-82.5%). CONCLUSION: The Pare de Fumar Conosco software results in cost savings in smoking cessation treatment.
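The ICER reported above has a simple closed form; the sketch below illustrates the calculation and the dominance condition with hypothetical numbers, not the study's data.

```python
def icer(cost_new, effect_new, cost_std, effect_std):
    """Incremental cost-effectiveness ratio: extra cost per extra unit
    of effect of the new intervention versus the standard of care."""
    return (cost_new - cost_std) / (effect_new - effect_std)

def dominates(cost_new, effect_new, cost_std, effect_std):
    """A strategy is dominant when it is both cheaper and more effective
    than its comparator, as reported for the software above."""
    return cost_new < cost_std and effect_new > effect_std
```

A dominant strategy yields a negative ICER (cost saved per unit of effect gained), which is why no willingness-to-pay threshold is needed to conclude in its favour.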


Subject(s)
Smoking Cessation , Standard of Care , Adult , Female , Humans , Male , Middle Aged , Brazil , Cost-Effectiveness Analysis , Decision Making , Decision Trees , Smoking Cessation/methods , Smoking Cessation/economics , Software/standards , Standard of Care/economics
7.
Rev Bras Enferm ; 77(3): e20230435, 2024.
Article in English, Portuguese | MEDLINE | ID: mdl-39082546

ABSTRACT

OBJECTIVES: to evaluate the technical quality of software for collecting data from patients under palliative care. METHODS: this is methodological technology evaluation research, according to technical standard International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) 25040:2011, developed from August 2021 to August 2023. Eight nurses and eight information technology professionals participated as judges, evaluating six quality characteristics and 23 subcharacteristics. Items that reached an agreement percentage greater than 70% were considered suitable. RESULTS: the characteristics evaluated by nurses/information technology professionals received the following percentages of agreement, respectively: functional suitability (94%/84%); reliability (100%/70%); usability (89.9%/66.8%); performance efficiency (95.8%/86.1%); compatibility (95.8%/79.6%); and security (96%/83.4%). CONCLUSIONS: the software was considered suitable in the quality evaluation to support nurses in collecting data from patients under palliative care, with the potential to operationalize the first stage of the Nursing Process.


Subject(s)
Palliative Care , Software , Humans , Palliative Care/standards , Palliative Care/methods , Software/standards , Data Collection/methods , Data Collection/standards , Reproducibility of Results
8.
Neuroinformatics ; 22(3): 297-315, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38861098

ABSTRACT

Pooling data across diverse sources acquired by multisite consortia requires compliance with a predefined reference protocol, i.e. ensuring that the different sites and scanners for a given project have used identical or compatible MR physics parameter values. Traditionally, this has been an arduous and manual process due to the difficulty of working with the complicated DICOM standard and the lack of resources allocated towards protocol compliance. Moreover, protocol compliance is often overlooked because it is not realized that parameter values are routinely improvised or modified locally at various sites. Inconsistencies in acquisition protocols can reduce SNR and statistical power and, in the worst case, may invalidate the results altogether. An open-source tool, mrQA, was developed to automatically assess protocol compliance on standard dataset formats such as DICOM and BIDS, and to study the patterns of non-compliance in over 20 open neuroimaging datasets, including the large ABCD study. The results demonstrate that lack of compliance is rather pervasive. Frequent sources of non-compliance include, but are not limited to, deviations in Repetition Time, Echo Time, Flip Angle, and Phase Encoding Direction. It was also observed that GE and Philips scanners exhibited higher rates of non-compliance than Siemens scanners in the ABCD dataset. Continuous monitoring for protocol compliance is strongly recommended before any pre/post-processing, ideally right after acquisition, to avoid the silent propagation of severe or subtle issues. Although this study focuses on neuroimaging datasets, the proposed tool mrQA can work with any DICOM-based dataset.
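The core compliance check — comparing each session's acquisition parameters against a reference protocol — can be sketched as below. The parameter names, values, and tolerance are illustrative; mrQA itself reads these values from DICOM/BIDS metadata rather than plain dictionaries.

```python
REFERENCE = {"RepetitionTime": 2000.0, "EchoTime": 30.0, "FlipAngle": 90.0}

def check_compliance(session_params, reference=REFERENCE, rel_tol=0.01):
    """Return the parameters that deviate from the reference protocol by
    more than rel_tol (1% here), mapped to (observed, expected) pairs.
    Missing parameters are also flagged, with observed value None."""
    deviations = {}
    for key, ref in reference.items():
        val = session_params.get(key)
        if val is None or abs(val - ref) > rel_tol * abs(ref):
            deviations[key] = (val, ref)
    return deviations
```

An empty result means the session is compliant; running such a check right after acquisition is the kind of continuous monitoring the abstract recommends.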


Subject(s)
Magnetic Resonance Imaging , Humans , Magnetic Resonance Imaging/methods , Magnetic Resonance Imaging/standards , Software/standards , Guideline Adherence/statistics & numerical data , Guideline Adherence/standards , Image Processing, Computer-Assisted/methods , Image Processing, Computer-Assisted/standards , Brain/diagnostic imaging
9.
Pain ; 165(8): 1793-1805, 2024 Aug 01.
Article in English | MEDLINE | ID: mdl-39024163

ABSTRACT

Facial grimacing is used to quantify spontaneous pain in mice and other mammals, but scoring relies on humans with different levels of proficiency. Here, we developed a cloud-based software platform called PainFace ( http://painface.net ) that uses machine learning to detect 4 facial action units of the mouse grimace scale (orbitals, nose, ears, whiskers) and score facial grimaces of black-coated C57BL/6 male and female mice on a 0 to 8 scale. Platform accuracy was validated in 2 different laboratories, with 3 conditions that evoke grimacing: laparotomy surgery, bilateral hindpaw injection of carrageenan, and intraplantar injection of formalin. PainFace can generate up to 1 grimace score per second from a standard 30 frames/s video, making it possible to quantify facial grimacing over time, and operates at a speed that scales with computing power. By analyzing the frequency distribution of grimace scores, we found that mice spent 7x more time in a "high grimace" state following laparotomy surgery relative to sham surgery controls. Our study shows that PainFace reproducibly quantifies facial grimaces indicative of nonevoked spontaneous pain and enables laboratories to standardize and scale up facial grimace analyses.
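The "time in a high-grimace state" summary can be computed directly from a per-second score stream. The threshold below is illustrative, not PainFace's published cut-off.

```python
def high_grimace_fraction(scores, threshold=6):
    """Fraction of video time spent at or above a 'high grimace' score,
    given one score per second on the 0-8 mouse grimace scale."""
    if not scores:
        return 0.0
    return sum(s >= threshold for s in scores) / len(scores)
```

Comparing this fraction between surgery and sham-control videos gives the kind of "7x more time in a high grimace state" contrast the study reports.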


Subject(s)
Facial Expression , Mice, Inbred C57BL , Pain Measurement , Software , Animals , Mice , Female , Software/standards , Pain Measurement/methods , Pain Measurement/standards , Male , Pain/diagnosis
10.
Neuroinformatics ; 22(3): 269-283, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38763990

ABSTRACT

Magnetic resonance imaging of the brain is a useful tool in both clinical and research settings, aiding the diagnosis and treatment of neurological disease and expanding our knowledge of the brain. However, there are many challenges inherent in managing and analyzing MRI data, due in large part to the heterogeneity of data acquisition. To address this, we have developed MRIO, the Magnetic Resonance Imaging Acquisition and Analysis Ontology. MRIO provides well-reasoned classes and logical axioms for several MRI acquisition types and for well-known, peer-reviewed analysis software, facilitating the use of MRI data. These classes provide a common language for the neuroimaging research process and help standardize the organization and analysis of MRI data for reproducible datasets. We also provide queries for the automated assignment of analyses to given MRI types. MRIO aids researchers in managing neuroimaging studies by helping organize and annotate MRI data and by integrating with existing standards such as Digital Imaging and Communications in Medicine and the Brain Imaging Data Structure, enhancing reproducibility and interoperability. MRIO was constructed according to Open Biomedical Ontologies Foundry principles and has contributed several classes to the Ontology for Biomedical Investigations to help bridge neuroimaging data to other domains. MRIO thus addresses the need for a "common language" for MRI, enabling researchers to identify appropriate analyses for sets of scans and facilitating data organization and reporting.


Subject(s)
Biological Ontologies , Magnetic Resonance Imaging , Humans , Magnetic Resonance Imaging/methods , Magnetic Resonance Imaging/standards , Brain/diagnostic imaging , Software/standards , Image Processing, Computer-Assisted/methods , Image Processing, Computer-Assisted/standards , Neuroimaging/methods , Neuroimaging/standards , Databases, Factual/standards
12.
J. health inform ; 13(1): 31-37, jan.-mar. 2021. ilus, tab
Article in Portuguese | LILACS | ID: biblio-1363869

ABSTRACT

IEC 62304 provides requirements for manufacturers of healthcare systems to demonstrate their ability to deliver software developed with processes, activities, and tasks, associated with safety risks, that must be demonstrated for regulatory purposes in several countries. This work presents a systematic literature mapping of works that report uses, advantages, and difficulties in the use of IEC 62304 over its almost 15 years of existence.




Subject(s)
Humans , Software/standards , Computer Security/standards , Health Information Systems
13.
Rev. cuba. inform. méd ; 12(1)ene.-jun. 2020. graf
Article in Spanish | CUMED, LILACS | ID: biblio-1126559

ABSTRACT

In Cuba, most computer security specialists have little knowledge of the essential tools of ethical hacking. To carry out the present investigation, national and international bibliographic searches were performed. From the large number of tools available in distributions specialized in computer security (Parrot Security, Black Arch and Kali Linux), those that best fit the characteristics of Cuban networks were selected. Our research aimed to describe some of the selected tools for scanning and exploiting vulnerabilities, along with fundamental concepts on the subject. We believe this study will be useful for Cuban computer security specialists, serving them not only to learn about ethical hacking but also to know which tools can be used, taking into account the characteristics of our national networks.


Subject(s)
Humans , Software/standards , Computer Security/standards , Cuba
14.
Rev. cuba. inform. méd ; 12(1)ene.-jun. 2020. tab, graf
Article in Spanish | CUMED, LILACS | ID: biblio-1126556

ABSTRACT

End-user training is one of the stages of the deployment process of a computer system. The depth of the impact of technical changes on the organization and its employees when implementing a computer system is sometimes underestimated, and the training service is not contracted. This article describes a training and support strategy for users during the implementation of the XAVIA HIS Hospital Information System. The main results relate to the definition of actions, methods, and techniques that allow training and support services to be planned and executed with greater efficiency, as well as basic training programs for computer specialists and for health technicians and professionals.


Subject(s)
Humans , Technology/methods , Technology Assessment, Biomedical , Software/standards , Medical Informatics Applications
15.
Rev. cuba. med. mil ; 49(2): e482, abr.-jun. 2020. tab
Article in Spanish | LILACS, CUMED | ID: biblio-1149906

ABSTRACT

Introduction: The Cuban National Toxicology Center supervises and controls information on events attributable to immunization, vaccination, and poisoning with medications and pesticides. The cases that arrive at the center mostly involve pesticides, which have a high level of toxicity and risk of death. Toxicology specialists need easy access to safety data sheets, the official list of pesticides authorized in Cuba, and previous cases; this allows them to analyze and issue a diagnosis that can save the life of the affected person. Objective: To present a system for the management and analysis of pesticide poisoning cases. Methods: Development was based on the Extreme Programming software methodology, modeled with the CASE tool Visual Paradigm 8.0 and the UML 2.0 language. Java was used with NetBeans 8.0.2, with PostgreSQL 9.3 as the database manager. Results: A toxicological information management tool was developed, together with a case base of symptoms, pesticides, and diagnoses per pesticide. Toxicology specialists now have a decision-support tool that reduces the occurrence of human error.


Subject(s)
Humans , Male , Female , Pesticides/adverse effects , Pesticides/poisoning , Software/standards , Cuba
16.
Rev. cuba. inform. méd ; 12(1)ene.-jun. 2020. tab, graf
Article in Spanish | CUMED, LILACS | ID: biblio-1126554

ABSTRACT

Techniques such as positron emission tomography (PET) and computed tomography (CT) make it possible, respectively, to determine the malignant or benign nature of a tumor and to study the anatomical structures of the body with high-resolution images. Researchers internationally have used different techniques for the fusion of PET and CT images, because fusion allows metabolic functions to be observed in correlation with anatomical structures. The present investigation proposes an analysis and selection of algorithms for neuroimaging fusion, based on their precision, and thereby contributes to the development of fusion software without the need to purchase expensive high-performance imaging equipment. For the study, documentary analysis, logical-historical, and inductive-deductive methods were applied. The best algorithm variants and techniques for fusion were analyzed and identified according to the reported literature. From this analysis, the Wavelet-based fusion scheme is identified as the best variant for image fusion. Bicubic interpolation is proposed for co-registration, and the Haar transform is used as the discrete Wavelet transform. In addition, the research led to the development of a fusion scheme based on these techniques. The analysis confirmed the applications and usefulness of fusion techniques as a substitute for the high cost of acquiring multifunction PET/CT scanners in Cuba.


Subject(s)
Humans , Male , Female , Image Processing, Computer-Assisted/methods , Software/standards , Tomography, X-Ray Computed/methods , Positron-Emission Tomography/methods , Wavelet Analysis , Cuba
17.
RFO UPF ; 25(3): 443-451, 20201231. ilus, tab
Article in English | LILACS, BBO - dentistry (Brazil) | ID: biblio-1357828

ABSTRACT

Objective: this study evaluated the accuracy and reliability of linear measurements on cone-beam CT (CBCT) scans in two software programs, using different voxel sizes and varying mandible positioning. Material and methods: CBCT images of 10 human mandibles with 25 markers were obtained using different acquisition protocols (0.250, 0.300 and 0.400-mm voxels) and mandible orientations (centered, rotated 10° laterally to the right and left, tilted 10° up and down); fourteen measurements were carried out on the multiplanar reconstructions in the XoranCat and OsiriX Lite software programs. The findings were compared to physical measurements made with a digital caliper. ANOVA and correlation coefficient tests were used, at α = 0.05. Results: there was no statistically significant difference when measurements from acquisitions with different voxel sizes were compared in either software program. Changes in mandibular positioning did not influence the measurements. No differences were found when the values were compared between the software programs and the digital caliper. Conclusion: linear measurements in both programs were reliable and accurate compared with physical measurements under all acquisition protocols. The accuracy and reliability of the measurements were not influenced by variations in mandible positioning.


Subject(s)
Humans , Software/standards , Cone-Beam Computed Tomography/standards , Dimensional Measurement Accuracy , Mandible/diagnostic imaging , Reference Values , Reproducibility of Results , Analysis of Variance , Cone-Beam Computed Tomography/methods
18.
Rev. bras. enferm ; 73(3): e20180411, 2020. graf
Article in English | LILACS, BDENF - nursing (Brazil) | ID: biblio-1092579

ABSTRACT

Objectives: to report the user experience of the webQDA software in supporting qualitative data analysis on the health literacy of older adults. Methods: quasi-experimental study conducted from January 2014 to January 2015 with 118 older adults, all of whom were interviewed to assess their level of health literacy. Interviews were carried out before and after four educational interventions, following Freire's problem-posing method known as the Culture Circle. The interviews were transcribed and entered into the software, which brought out the analytical categories. Results: the source, interpretative coding and data-questioning systems available in the software allowed the construction of three categories for the literacy levels and four categories for their dimensions. Final considerations: we concluded that the webQDA software enables structured coding of qualitative materials, ensuring faster and more effective data management with systematization and analytical transparency.


Subject(s)
Humans , Research Personnel/psychology , Software/standards , Nursing Research/methods , Qualitative Research , Data Management/instrumentation , Research Design , Software/trends , Nursing Research/trends , Data Management/methods , Life Change Events
19.
Rev. chil. infectol ; 37(1): 69-75, feb. 2020. tab, graf
Article in Spanish | LILACS | ID: biblio-1092724

ABSTRACT

Acute respiratory infections (ARI) are an important cause of morbidity and mortality worldwide, affecting mainly children and the elderly. They are associated with a high economic burden and an increased number of medical visits and hospitalizations. Surveillance of circulating respiratory viruses can reduce health-care-associated costs and optimize the health response. A platform based on R and its Shiny package was designed to create an interactive, user-friendly web interface for gathering, analysing and publishing the data. The data from the Chilean metropolitan respiratory virus surveillance network, available since 2006, were uploaded to the platform. Using this platform, a researcher spends less than one minute uploading data, and analysis and publication are immediate, available to any user with an Internet-connected device, who can choose the variables to be displayed. At very low cost, in a short time, and using the R programming language, it was possible to create a simple, interactive platform, considerably decreasing upload and analysis time and increasing the impact and availability of this surveillance.
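The platform described above is an R/Shiny dashboard; its core data operation is filtering and aggregating detection records by user-chosen variables. A minimal Python sketch of that idea (field names and sample data are hypothetical, not taken from the Chilean network):

```python
# Illustrative sketch of the filter-and-aggregate step behind an interactive
# surveillance dashboard. Record fields and sample values are hypothetical.
def weekly_counts(records, virus=None, year=None):
    """Aggregate positive detections per (year, week), optionally filtered
    by virus and/or year, as a dashboard user would select."""
    counts = {}
    for r in records:
        if virus is not None and r["virus"] != virus:
            continue
        if year is not None and r["year"] != year:
            continue
        key = (r["year"], r["week"])
        counts[key] = counts.get(key, 0) + r["positives"]
    return counts

sample = [
    {"virus": "RSV", "year": 2006, "week": 23, "positives": 12},
    {"virus": "influenza A", "year": 2006, "week": 23, "positives": 5},
    {"virus": "RSV", "year": 2006, "week": 24, "positives": 20},
]
rsv = weekly_counts(sample, virus="RSV")
```

In the Shiny version, the filter arguments would be bound to reactive inputs and the resulting counts rendered as an interactive plot; the aggregation logic itself is the same.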


Subject(s)
Humans , Child , Aged , Respiratory Tract Infections/economics , Respiratory Tract Infections/epidemiology , Software/economics , Software/standards , Virus Diseases/epidemiology , Health Care Costs , Models, Theoretical , Viruses , Chile/epidemiology , Internet
20.
Braz. oral res. (Online) ; 34: e017, 2020. tab, graf
Article in English | LILACS | ID: biblio-1089399

ABSTRACT

Prevention and health promotion are considered important strategies to control oral diseases. Dental caries is a preventable disease, yet it remains the most common chronic disease, mainly affecting low-income children, and is still considered the main cause of tooth loss in adulthood in Brazil. The aim of this study is to present a System Dynamics model (SDM), developed with the Stella Architect software, to estimate the cost and clinical hours required to control the evolution of dental caries in preschool children in Maringá, Brazil. Two main strategies to control caries were considered in the model: the application of fluoride varnish on teeth presenting white spots, and the use of Atraumatic Restorative Treatment (ART) in cavitated carious lesions without pulp involvement. The parameters used in the model were: number of people covered by a local oral health team = 4,000; number of children up to 5 years = 7% of the population; children's decayed, missing, filled teeth (dmft) index = 2.4; time/cost of 4 applications of fluoride varnish = 5 minutes/US$ 0.716; and time/cost of each ART restoration = 15 minutes/US$ 1.475. The SDM generated an estimated total cost of US$ 698.00 and a total of 112 clinical hours to treat the population in question. The SDM presented here has the potential to assist decision-making by quantifying the material and human resources required to prevent and control dental caries at an early age.
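The cost/time arithmetic behind such a model can be sketched directly from the parameters in the abstract. Note that the split between white-spot lesions (varnish) and cavitated lesions (ART) is not reported in the abstract, so the mix used below is a hypothetical illustration and is not expected to reproduce the published US$ 698.00 / 112 h totals:

```python
# Back-of-the-envelope sketch of the model's cost/time arithmetic.
# Parameters are taken from the abstract; the varnish/ART lesion mix is
# a hypothetical illustration, not the study's figure.
POPULATION = 4000          # people covered by one oral health team
CHILD_FRACTION = 0.07      # children up to 5 years, as a share of population
DMFT = 2.4                 # mean decayed/missing/filled teeth per child
VARNISH_MIN, VARNISH_USD = 5, 0.716   # 4 fluoride varnish applications
ART_MIN, ART_USD = 15, 1.475          # one ART restoration

def treatment_totals(varnish_teeth, art_teeth):
    """Return (clinical hours, cost in US$) for a given treatment mix."""
    minutes = varnish_teeth * VARNISH_MIN + art_teeth * ART_MIN
    cost = varnish_teeth * VARNISH_USD + art_teeth * ART_USD
    return minutes / 60.0, cost

children = POPULATION * CHILD_FRACTION      # ~280 children under 5
affected_teeth = children * DMFT            # ~672 teeth with caries experience
hours, cost = treatment_totals(400, 272)    # hypothetical 400 varnish / 272 ART
```

The full System Dynamics model adds the time dimension (lesions progressing from white spot to cavity between appointments), which is what the stock-and-flow simulation in Stella Architect captures beyond this static calculation.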


Subject(s)
Humans , Male , Female , Child, Preschool , Systems Analysis , Dental Caries/economics , Dental Caries/therapy , Dental Atraumatic Restorative Treatment/economics , Time Factors , Software/standards , Brazil , DMF Index , Fluorides, Topical/economics , Dental Materials/economics , Dental Atraumatic Restorative Treatment/methods