ABSTRACT
1. Challenges, strategies and new technologies in the field of biotransformation were presented and discussed at the fourth European Biotransformation Workshop, which was held in collaboration with the joint ISSX/DMDG meeting on June 15, 2023 at the University of Hertfordshire in Hatfield, UK. 2. In this meeting report we summarise the presentations and discussions from this workshop. 3. The topics covered were: unusual biotransformation reactions; biotransformation workflows in discovery utilising various software tools for structure elucidation; biotransformation software for the identification of peptide metabolites; accelerator mass spectrometry (AMS) for endogenous and xenobiotic metabolite profiling; and metabolite profiling using quantitative nuclear magnetic resonance (NMR) and liquid chromatography coupled to inductively coupled plasma mass spectrometry (LC-ICP-MS).
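As a minimal illustration of the rule-based reasoning that biotransformation software applies when assigning metabolite structures, the sketch below screens observed metabolite masses against common monoisotopic biotransformation mass shifts. The parent mass, the shift table and the tolerance are illustrative assumptions, not values from the workshop.

```python
# Hypothetical example: flag candidate biotransformations by monoisotopic mass shift.
PARENT_MASS = 285.1364  # assumed parent drug monoisotopic mass (Da), for illustration only

# Common phase I/II biotransformation mass shifts (Da, monoisotopic).
SHIFTS = {
    "oxidation (+O)": +15.9949,
    "demethylation (-CH2)": -14.0157,
    "glucuronidation (+C6H8O6)": +176.0321,
    "sulfation (+SO3)": +79.9568,
}

def assign_shifts(observed_masses, parent=PARENT_MASS, tol=0.005):
    """Return (observed mass, matching transformation) pairs within the mass tolerance."""
    hits = []
    for m in observed_masses:
        delta = m - parent
        for name, shift in SHIFTS.items():
            if abs(delta - shift) <= tol:
                hits.append((m, name))
    return hits

# Example: two assumed metabolite masses from a profiling run.
print(assign_shifts([301.1313, 461.1685]))
```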
Subjects
Biotransformation, Xenobiotics, Humans, Liquid Chromatography, Europe (Continent), Mass Spectrometry, Metabolomics, Xenobiotics/metabolism
ABSTRACT
PURPOSE: To compare four software packages, quantitative gated SPECT (QGS), myometrix (MX), corridor 4DM (4DM) and Emory toolbox (ECTb), against cardiac magnetic resonance (CMR) for the evaluation of left ventricular ejection fraction (LVEF), end-systolic volume (ESV) and end-diastolic volume (EDV) by gated MPI CZT-SPECT. METHODS: 48 patients underwent MPI CZT-SPECT and CMR 6 weeks after STEMI. LV parameters were measured with the four software packages on MPI CZT-SPECT and compared with CMR. We evaluated (i) concordance and correlation between MPI CZT-SPECT and CMR, (ii) MPI CZT-SPECT/CMR concordance for the categorical evaluation of left ventricular dysfunction, and (iii) the impact of perfusion defects > 3 segments on concordance. RESULTS: LVEF: LCC QGS/CMR = 0.81 [+ 2.2% (± 18%)], LCC MX/CMR = 0.83 [+ 1% (± 17.5%)], LCC 4DM/CMR = 0.73 [+ 3.9% (± 21%)], LCC ECTb/CMR = 0.69 [+ 6.6% (± 21.1%)]. ESV: LCC QGS/CMR = 0.90 [- 8 mL (± 40 mL)], LCC MX/CMR = 0.90 [- 9 mL (± 36 mL)], LCC 4DM/CMR = 0.89 [+ 4 mL (± 45 mL)], LCC ECTb/CMR = 0.87 [- 3 mL (± 45 mL)]. EDV: LCC QGS/CMR = 0.70 [- 16 mL (± 67 mL)], LCC MX/CMR = 0.68 [- 21 mL (± 63 mL)], LCC 4DM/CMR = 0.72 [+ 9 mL (± 73 mL)], LCC ECTb/CMR = 0.69 [+ 10 mL (± 70 mL)]. CONCLUSION: QGS and MX were the two best-performing software packages for evaluating LVEF after recent STEMI.
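The agreement statistics reported above can be reproduced from paired measurements. The sketch below, assuming LCC denotes Lin's concordance correlation coefficient and the bracketed values are Bland-Altman bias ± 1.96 SD, computes both for one software/CMR pair using NumPy; the arrays are placeholders, not the study data.

```python
import numpy as np

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between two paired measurement series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    cov = np.cov(x, y, bias=True)[0, 1]
    return 2 * cov / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

def bland_altman(x, y):
    """Bias (mean difference) and 95% limits of agreement (bias ± 1.96 SD of differences)."""
    d = np.asarray(x, float) - np.asarray(y, float)
    bias, sd = d.mean(), d.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Placeholder LVEF values (%) for one software package and CMR.
lvef_spect = np.array([45, 52, 38, 60, 41, 55, 48, 35])
lvef_cmr   = np.array([43, 50, 40, 58, 39, 57, 46, 37])

print("CCC:", round(lins_ccc(lvef_spect, lvef_cmr), 2))
print("Bias, LoA:", bland_altman(lvef_spect, lvef_cmr))
```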
Subjects
Cadmium, Cardiac-Gated Imaging Techniques/methods, Computer-Assisted Image Processing/instrumentation, Computer-Assisted Image Processing/methods, Magnetic Resonance Imaging/methods, ST-Elevation Myocardial Infarction/diagnostic imaging, Tellurium, Single-Photon Emission Computed Tomography/methods, Left Ventricular Function, Zinc, Adult, Aged, Diastole, Female, Humans, Male, Middle Aged, Software, Stroke Volume, Technetium Tc 99m Sestamibi, Left Ventricular Dysfunction
ABSTRACT
AgWH50C, an exo-β-agarase of GH50 isolated from Agarivorans gilvus WH0801, plays a key role in the enzymatic production of neoagarobiose, which has great application prospects in the cosmetics and pharmaceutical industries. However, poor thermostability is the main limiting factor for glycoside hydrolase (GH) family 50 agarases, including AgWH50C. Herein, based on the AgWH50C crystal structure, we designed several mutants using a multiple cross-linked rational design protocol based on the thermostability prediction software tools ETSS, PoPMuSiC, and HotMuSiC. To our surprise, the mutant K621F showed a relative activity increase of as much as 45%, and its optimal temperature rose to 38 °C compared with that of the wild-type AgWH50C (30 °C). The thermostability of K621F also exhibited a substantial improvement. Considering that the gelling temperature of agarose is higher than 35 °C, K621F can be used to hydrolyse agarose for neoagarobiose production.
Subjects
Alteromonadaceae/enzymology, Alteromonadaceae/genetics, Disaccharides/biosynthesis, Glycoside Hydrolases/genetics, Alteromonadaceae/metabolism, Bacterial Proteins/metabolism, X-Ray Crystallography, Glycoside Hydrolases/metabolism, High Temperature, Mutagenesis, Mutation/genetics, Secondary Protein Structure
ABSTRACT
The therapeutic properties of plants have been recognised since time immemorial. Many pathological conditions have been treated using plant-derived medicines. These medicines are used as concoctions or concentrated plant extracts without isolation of active compounds. Modern medicine, however, requires the isolation and purification of one or two active compounds. There are nevertheless many global health challenges, with diseases such as cancer, degenerative diseases, HIV/AIDS and diabetes for which modern medicine struggles to provide cures. In many cases, isolating the "active compound" has rendered it ineffective. Drug discovery is a multidimensional problem in which several parameters of both natural and synthetic compounds, such as safety, pharmacokinetics and efficacy, must be evaluated during drug candidate selection. The advent of the latest technologies that enhance drug design hypotheses, such as artificial intelligence and the use of 'organ-on-chip' and microfluidics technologies, means that automation has become part of drug discovery. This has increased the speed of drug discovery and of evaluating the safety, pharmacokinetics and efficacy of candidate compounds, while allowing novel ways of designing and synthesising drugs based on natural compounds. Recent advances in analytical and computational techniques have opened new avenues to process complex natural products and to use their structures to derive new and innovative drugs. Indeed, we are in the era of computational molecular design as applied to natural products. Predictive computational software has contributed to the discovery of molecular targets of natural products and their derivatives. In the future, the use of quantum computing, computational software and databases for modelling molecular interactions and predicting the features and parameters needed for drug development, such as pharmacokinetics and pharmacodynamics, will result in fewer false-positive leads in drug development. This review discusses plant-based natural product drug discovery and how innovative technologies play a role in next-generation drug discovery.
Subjects
Biological Products/analysis, Computational Biology/methods, Drug Design, Drug Discovery/methods, Medicinal Plants/chemistry, Artificial Intelligence, Laboratory Automation, Biological Products/chemistry, Computer Simulation, Pharmaceutical Industry, Humans, Chemical Models, Phytotherapy/methods, Robotics, Software
ABSTRACT
Herein, we report the synthesis and characterization of graphene for the immobilization of β-galactosidase for improved galacto-oligosaccharide (GOS) production. The size of the synthesized graphene was determined to be 25 nm by TEM analysis, while the interaction of the enzyme with the nanosupport was confirmed by FTIR spectroscopy. Docking was performed using the molecular docking program Dock v.6.5, while visual analysis and illustration of the protein-ligand complex were carried out using the Chimera v.1.6.2 and PyMOL v.1.3 software packages. Immobilized β-galactosidase (IβG) showed improved stability against various physical and chemical denaturants. The Km of IβG increased to 6.41 mM, compared with 2.38 mM for the soluble enzyme, without a significant change in Vmax. The maximum GOS content, along with lactose conversion, also increased. Maximum GOS production was achieved by the immobilized enzyme at a specific temperature and time. Hence, the developed nanosupport can be further exploited for developing a biosensor involving β-galactosidase or for immobilization of other industrially or therapeutically important enzymes.
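Km and Vmax values of the kind quoted above for the soluble and immobilized enzyme are typically obtained by fitting initial-rate data to the Michaelis-Menten equation; a minimal sketch with SciPy is shown below. The substrate concentrations and rates are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    """v = Vmax * [S] / (Km + [S])"""
    return vmax * s / (km + s)

# Hypothetical initial-rate data: lactose concentration (mM) vs reaction rate (a.u.).
s = np.array([0.5, 1, 2, 5, 10, 20, 50])
v = np.array([0.16, 0.27, 0.42, 0.63, 0.76, 0.86, 0.93])

(vmax, km), _ = curve_fit(michaelis_menten, s, v, p0=[1.0, 2.0])
print(f"Vmax ~ {vmax:.2f} a.u., Km ~ {km:.2f} mM")
```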
Subjects
Graphite/metabolism, beta-Galactosidase/metabolism, Aspergillus oryzae/enzymology, Kinetics, Transmission Electron Microscopy, Molecular Docking Simulation, Fourier Transform Infrared Spectroscopy
ABSTRACT
OpenSPIM is an Open Access platform for Selective Plane Illumination Microscopy (SPIM) and allows hundreds of laboratories around the world to generate and process light-sheet data in a cost-effective way thanks to open-source hardware and software. While a basic OpenSPIM configuration can be set up expeditiously, correctly assembling and operating more complex OpenSPIM configurations can be challenging for routine OpenSPIM users. Detailed instructions on how to equip an OpenSPIM with two illumination sides and two detection axes (X-OpenSPIM) are provided, along with a solution for controlling the temperature in the sample chamber. Additionally, it is demonstrated how to operate the system with an Arduino UNO microcontroller, and µOpenSPIM, a new software plugin for OpenSPIM that facilitates image acquisition, is introduced. The new software works on any OpenSPIM configuration, comes with drift-correction functionality and on-the-fly image processing, and gives users more options in how time-lapse movies are initially set up and saved. Step-by-step guides on how to align the lasers, configure the hardware, and acquire images using µOpenSPIM are also provided within the Supporting Information and on the website. With this, current OpenSPIM users are supported in various ways, and newcomers striving to use more advanced OpenSPIM systems are assisted.
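Drift correction of the kind mentioned for µOpenSPIM is commonly implemented by registering each time point against a reference frame. The sketch below shows one generic way to do this with scikit-image's phase cross-correlation; it is a simplified illustration of the idea, not the plugin's actual code, and the toy movie is invented.

```python
import numpy as np
from skimage.registration import phase_cross_correlation
from scipy.ndimage import shift as nd_shift

def correct_drift(frames):
    """Register each frame to the first one by estimating and undoing the XY drift."""
    reference = frames[0]
    corrected = [reference]
    for frame in frames[1:]:
        drift, _, _ = phase_cross_correlation(reference, frame, upsample_factor=10)
        corrected.append(nd_shift(frame, drift))  # shift the frame back by the estimated drift
    return np.stack(corrected)

# Toy example: a bright spot that drifts by one pixel per frame, plus a little noise.
rng = np.random.default_rng(0)
base = np.zeros((64, 64)); base[30:34, 30:34] = 1.0
movie = np.stack([np.roll(base, (t, t), axis=(0, 1)) + 0.01 * rng.standard_normal((64, 64))
                  for t in range(5)])
print(correct_drift(movie).shape)  # (5, 64, 64)
```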
Subjects
Computer-Assisted Image Processing, Software, Computer-Assisted Image Processing/methods, Lasers, Fluorescence Microscopy/methods
ABSTRACT
OBJECTIVE: To compare two clinically available MR volumetry software packages, NeuroQuant® (NQ) and Inbrain® (IB), and examine the inter-method reliabilities and differences between them. MATERIALS AND METHODS: This study included 172 subjects (age range, 55-88 years; mean age, 71.2 years), comprising 45 normal healthy subjects, 85 patients with mild cognitive impairment, and 42 patients with Alzheimer's disease. Magnetic resonance imaging scans were analyzed with IB and NQ. Mean differences were compared with the paired t test. Inter-method reliability was evaluated with Pearson's correlation coefficients and intraclass correlation coefficients (ICCs). Effect sizes were also obtained to document the standardized mean differences. RESULTS: The paired t test showed significant volume differences between the two methods in most regions except the amygdala. Nevertheless, inter-method measurements between IB and NQ showed good to excellent reliability (0.72 < r < 0.96, 0.83 < ICC < 0.98) except for the pallidum, which showed poor reliability (left: r = 0.03, ICC = 0.06; right: r = -0.05, ICC = -0.09). For the measurements of effect size, volume differences were large in most regions (0.05 < r < 6.15). The effect size was largest in the pallidum and smallest in the cerebellum. CONCLUSION: Comparisons between IB and NQ showed significantly different volume measurements with large effect sizes. However, they showed good to excellent inter-method reliability in volumetric measurements for all brain regions, with the exception of the pallidum. Clinicians using these commercial software packages should take into consideration that different volume measurements may be obtained depending on the software used.
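For readers wanting to reproduce this kind of inter-method comparison, the sketch below computes a paired t test, Pearson's r and a two-way ICC for one brain region from paired volume measurements. The arrays are illustrative placeholders, and the ICC shown is one common variant (ICC(2,1), absolute agreement, single measurement), which may differ from the exact formulation used by the authors.

```python
import numpy as np
from scipy.stats import pearsonr, ttest_rel

def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.
    data: (n_subjects, k_raters) array."""
    data = np.asarray(data, float)
    n, k = data.shape
    grand = data.mean()
    ssb = k * ((data.mean(axis=1) - grand) ** 2).sum()   # between-subject sum of squares
    ssc = n * ((data.mean(axis=0) - grand) ** 2).sum()   # between-method sum of squares
    sse = ((data - grand) ** 2).sum() - ssb - ssc         # residual sum of squares
    msb, msc, mse = ssb / (n - 1), ssc / (k - 1), sse / ((n - 1) * (k - 1))
    return (msb - mse) / (msb + (k - 1) * mse + k * (msc - mse) / n)

# Placeholder hippocampal volumes (mL) measured by two software packages.
ib = np.array([3.1, 2.8, 3.4, 2.5, 3.0, 2.7, 3.3, 2.9])
nq = np.array([3.0, 2.7, 3.5, 2.4, 2.9, 2.8, 3.2, 2.8])

print("paired t test:", ttest_rel(ib, nq))
print("Pearson r:", round(pearsonr(ib, nq)[0], 2))
print("ICC(2,1):", round(icc_2_1(np.column_stack([ib, nq])), 2))
```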
Subjects
Alzheimer Disease/physiopathology, Brain/physiology, Cognitive Dysfunction/physiopathology, Software, Aged, Aged 80 and over, Alzheimer Disease/diagnostic imaging, Brain/diagnostic imaging, Case-Control Studies, Cognitive Dysfunction/diagnostic imaging, Female, Humans, Magnetic Resonance Imaging, Male, Middle Aged, Reproducibility of Results
ABSTRACT
The process of tooth development is both a fascinating and a well-described aspect of embryology. Although a great deal of the dental literature focuses on understanding the early stages of tooth development, a huge gap still exists in our knowledge of how the dental hard tissues are formed, based on the available images and descriptions. The tooth development process takes place in three dimensions, inside the body. Therefore, histology should also be explained with the help of additional 3D images and video, which have not been reported so far. Methodology: This brief article is a technical note and a preliminary attempt to showcase 3D animation images and a video of the stages of tooth development, designed by the author herself using 3D animation software such as 3D Max (Autodesk Media and Entertainment, San Rafael, California) and the video-editing software Adobe Premiere Pro 5.5 (Adobe Systems, San Rafael, California).
ABSTRACT
BACKGROUND: Monoclonal antibodies (mAbs) represent one of the most important classes of biotherapeutic agents. They are used to treat many diseases, including cancer, autoimmune diseases, cardiovascular diseases, angiogenesis-related diseases and, more recently, also haemophilia. They can be highly varied in terms of format, source and specificity to improve efficacy and to obtain more targeted applications, which can be achieved while leaving the basic structural components for paratope clustering substantially unchanged. OBJECTIVES: The objective was to trace the most relevant findings that have earned prestigious awards over the years, to report the most important clinical applications and to emphasize the latest emerging therapeutic trends. RESULTS: We report the most relevant milestones and new technologies adopted for antibody development. Recent efforts in generating new engineered antibody-based formats are briefly reviewed. The most important antibody-based molecules that are (or are going to be) used in pharmacological practice have been collected in useful tables. CONCLUSION: The topics discussed here confirm the undisputed role of mAbs as innovative biopharmaceutical molecules and as vital components of targeted pharmacological therapies.
Subjects
Monoclonal Antibodies, Biological Products, Monoclonal Antibodies/therapeutic use, Humans, Prospective Studies, Protein Engineering, Retrospective Studies
ABSTRACT
In drug discovery, in silico methods have become a very important part of the process. These approaches impact the entire development process by discovering and identifying new target proteins as well as by designing potential ligands, with a significant reduction in time and cost. Furthermore, in silico approaches are also preferred because they reduce the experimental use of animals in in vivo testing, support safer drug design, and aid the repositioning of known drugs. Novel software-based discovery and development approaches, such as direct/indirect drug design, molecular modelling, docking, screening, drug-receptor interaction, and molecular simulation studies, are very important tools for predicting ligand-target interaction patterns as well as the pharmacodynamic and pharmacokinetic properties of ligands. On the other hand, the computational approaches can be numerous, requiring interdisciplinary studies and the application of advanced computer technology to design effective and commercially feasible drugs. This review mainly focuses on the various databases and software used in drug design and development to speed up the process.
Subjects
Pharmaceutical Databases, Drug Development, Software, Animals, Humans
ABSTRACT
BACKGROUND: Computational or in silico studies are undertaken to assess the drug-like properties of lead compounds. These studies enable fast prediction of relevant properties. OBJECTIVE: Through this review, an effort is made to encapsulate some of the important parameters that should be met by a compound for it to be considered a potential drug candidate, along with an overview of automated software tools that can be used to make such predictions. METHODS: Drug uptake, absorption, excretion and associated hazardous effects are important factors for consideration in drug design and should be known in the early stages of drug development. Several important physicochemical properties, such as molecular weight, polar surface area (PSA) and molecular flexibility, have to be taken into consideration in drug design. Toxicological assessment is another important aspect of drug discovery, which predicts the safety and adverse effects of a drug. RESULTS: Additionally, bioactivity scores of probable drug leads against various human receptors can be predicted to evaluate their likelihood of acting as potential drug candidates. The in vivo biological targets of a molecule can also be efficiently predicted by molecular docking studies. CONCLUSION: Some important software tools, such as iGEMDOCK, AutoDock, OSIRIS Property Explorer, Molinspiration, MetaPrint2D and admetSAR, together with their working methodology and operating principles, are summarized in this review.
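As an illustration of the kind of physicochemical screening described above, the sketch below uses RDKit (a freely available toolkit, not one of the tools named in the review) to compute molecular weight, polar surface area and a flexibility-related descriptor and to apply a simple rule-of-five style filter. The SMILES string and cut-offs are illustrative choices.

```python
from rdkit import Chem
from rdkit.Chem import Descriptors

def druglikeness_report(smiles):
    """Compute a few common drug-likeness descriptors and a rule-of-five style check."""
    mol = Chem.MolFromSmiles(smiles)
    props = {
        "MW": Descriptors.MolWt(mol),
        "TPSA": Descriptors.TPSA(mol),                   # topological polar surface area
        "LogP": Descriptors.MolLogP(mol),
        "HBD": Descriptors.NumHDonors(mol),
        "HBA": Descriptors.NumHAcceptors(mol),
        "RotBonds": Descriptors.NumRotatableBonds(mol),  # proxy for molecular flexibility
    }
    props["rule_of_five_ok"] = (props["MW"] <= 500 and props["LogP"] <= 5
                                and props["HBD"] <= 5 and props["HBA"] <= 10)
    return props

# Example: aspirin as an illustrative lead compound.
print(druglikeness_report("CC(=O)Oc1ccccc1C(=O)O"))
```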
Assuntos
Desenho de Fármacos , Preparações Farmacêuticas/química , Animais , Simulação por Computador , Desenvolvimento de Medicamentos/métodos , Descoberta de Drogas/métodos , Efeitos Colaterais e Reações Adversas Relacionados a Medicamentos/etiologia , Humanos , Modelos Biológicos , Simulação de Acoplamento Molecular , Preparações Farmacêuticas/metabolismo , Farmacocinética , SoftwareRESUMO
The history of dialysis, which started only half a century ago, is rich in developments and technological innovations. Thanks to scientific progress and the development of knowledge in the field of dialysis, patient survival will continue to increase and quality of life will continue to improve. More precise purification, reductions in the size and weight of equipment, and refinement of filtration membranes are a few of the recent and current breakthroughs, and remain challenges for further development. Dialysis has a world of opportunities ahead in terms of process optimization and new innovations, continuing the progress made in improving patient treatment conditions. This article is part of the supplement issue Innovations en Néphrologie, produced with the institutional support of Vifor Fresenius Medical Care Renal Pharma.
Subjects
Artificial Kidneys, Renal Dialysis/instrumentation, Dialysis Solutions, Humans, Inventions
ABSTRACT
OBJECTIVES: This study aims to systematically review the different methods used for wear measurement of dental tissues and materials in clinical studies, their relevance and reliability in terms of accuracy and precision, and the performance of the different steps of the workflow taken independently. METHODS: An exhaustive search of clinical studies related to wear of dental tissues and materials reporting a quantitative measurement method was conducted. The MedLine, Embase, Scopus, Cochrane Library and Web of Science databases were used. Prospective studies, pilot studies and case series (>10 patients) were included, as long as they contained a description of the wear measurement methodology. Only studies published after 1995 were considered. RESULTS: After removal of duplicates, 495 studies were identified, and 41 remained for quantitative analysis. Thirty-four described wear measurement protocols using digital profilometry and superimposition, whereas seven used alternative protocols. A specific form was designed to analyze the risk of bias. The methods were described in terms of material analyzed; study design; device used for surface acquisition; matching software details and settings; type of analysis (vertical height-loss measurement vs volume-loss measurement); type of area investigated (entire occlusal area or selective areas); and results. SIGNIFICANCE: There is a need for standardization of clinical wear measurement. The accuracy of current methods is not sufficient to monitor wear of restorative materials and tooth tissues. Their performance could be improved, notably by limiting the use of replicas, using standardized calibration procedures and positive controls, optimizing the settings of scanners and matching software, and taking unusable data into account.
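To make the distinction between vertical height-loss and volume-loss measurements concrete, the sketch below computes both from two aligned height maps (baseline and follow-up scans of the same occlusal area). It assumes the surfaces have already been superimposed, which in practice is the critical and error-prone matching step discussed above; the grids and pixel size are invented for illustration.

```python
import numpy as np

def wear_metrics(baseline, followup, pixel_area_mm2):
    """Vertical height loss (max and mean, in µm) and volume loss (in mm³) between
    two aligned height maps given in micrometres."""
    loss = baseline - followup          # positive values = material lost
    loss = np.clip(loss, 0, None)       # ignore apparent gains (noise, deposits)
    max_height_loss = loss.max()
    mean_height_loss = loss.mean()
    volume_loss_mm3 = loss.sum() * pixel_area_mm2 * 1e-3  # µm * mm² -> mm³
    return max_height_loss, mean_height_loss, volume_loss_mm3

# Illustrative 100x100 height maps (µm) with a shallow simulated wear facet.
rng = np.random.default_rng(1)
baseline = rng.normal(0, 1, (100, 100))
followup = baseline.copy()
followup[40:60, 40:60] -= 30            # 30 µm deep facet over part of the surface

print(wear_metrics(baseline, followup, pixel_area_mm2=0.0001))
```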
Subjects
Dental Materials/chemistry, Dental Restoration Wear, Tooth Wear, Humans, Materials Testing, Surface Properties
ABSTRACT
BACKGROUND: We have developed a quantitative structure-activity relationship (QSAR) model for predicting the larvicidal activity of 60 plant-derived molecules against Aedes aegypti L. (Diptera: Culicidae), a vector of several diseases such as dengue, yellow fever, chikungunya and Zika. The balanced subsets method (BSM), based on k-means cluster analysis (k-MCA), was employed to split the data set. The replacement method (RM) variable subset selection technique, coupled with multivariable linear regression (MLR), proved successful for exploring 18 326 molecular descriptors and fingerprints calculated with the open-source software tools PaDEL, Mold2 and EPI Suite. RESULTS: A robust QSAR model (R²_train = 0.84, S_train = 0.20 and R²_test = 0.92, S_test = 0.23) involving five non-conformational descriptors was established. The model was validated and tested through the use of an external test set of compounds, the leave-one-out (LOO) and leave-more-out (LMO) cross-validation methods, Y-randomization and applicability domain (AD) analysis. CONCLUSION: The QSAR model surpasses previously published models based on geometrical descriptors, thereby representing a suitable tool for predicting larvicidal activity against the vector A. aegypti using a conformation-independent approach. © 2018 Society of Chemical Industry.
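For readers unfamiliar with the workflow, the core of such a model, a multivariable linear regression on a handful of selected descriptors validated by leave-one-out cross-validation, can be sketched in a few lines with scikit-learn. The random data below stand in for the descriptor matrix and measured larvicidal activities; the descriptor selection step (the replacement method) is not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import r2_score

# Placeholder data: 60 compounds, 5 selected non-conformational descriptors.
rng = np.random.default_rng(42)
X = rng.normal(size=(60, 5))
true_coefs = np.array([0.8, -0.5, 0.3, 0.6, -0.2])
y = X @ true_coefs + rng.normal(scale=0.2, size=60)   # simulated activity values

model = LinearRegression().fit(X, y)
print("R2 (training):", round(r2_score(y, model.predict(X)), 2))

# Leave-one-out cross-validation: each compound is predicted by a model trained on the others.
y_loo = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
print("Q2 (LOO):", round(r2_score(y, y_loo), 2))
```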
Subjects
Aedes/drug effects, Insecticides/chemistry, Mosquito Vectors/drug effects, Phytochemicals/chemistry, Quantitative Structure-Activity Relationship, Aedes/growth & development, Animals, Larva/drug effects, Larva/growth & development, Chemical Models, Mosquito Vectors/growth & development, Zika Virus
ABSTRACT
Although the terms "sport" and "leisure" often go together in legal frameworks, scientific production and political discourse, and the affinity between them is undeniable, it is important to consider that these two manifestations should not be conflated from a theoretical-methodological point of view. This is because sport is not always leisure, and leisure is not limited to sport. With this in mind, this article aims to establish a relationship between some theoretical aspects of leisure and recreational sport and the government programmes and projects for the sector. It presents an overview that begins with theoretical conceptions of public policies on sport and leisure in Brazil and then examines some of the projects and programmes intended to implement them.
ABSTRACT
Implant placement has become a routine modality of dental care. Improvements in surgical reconstructive methods, as well as increased prosthetic demands, require highly accurate diagnosis, planning and placement. Recently, computer-aided design and manufacturing have made it possible to use data from computerised tomography not only to plan implant rehabilitation, but also to transfer this information to the surgery. A review of one such technique, called stereolithography, is presented in this article. It permits graphic and complex 3D implant placement and the fabrication of stereolithographic surgical templates, and offers many significant benefits over traditional procedures.
ABSTRACT
Objective of the review: To show the dental community, and especially the specialty of maxillofacial surgery, a valuable and modern tool, stereolithography, as an important aid in planning and treatment in the various fields covered by this specialty. Methods: The literature was reviewed in various search engines and the articles related to the topic were selected. Results: Valuable literature was found that clearly describes how stereolithography is a tool that aids treatment planning in the different areas of maxillofacial surgery. Conclusion: In contemporary surgery, the advent of innovations.
Subjects
Computer-Aided Design, Photogrammetry, Medical Illustration, Emission Computed Tomography
ABSTRACT
The traditional method of teaching human anatomy is based on the use of cadavers, textbooks and images from an atlas. Learning anatomy by means of a cadaver contributes to the understanding of the shape, location and relationships of the various organs and structures of the human body. However, the use of cadaveric material presents difficulties in terms of acquisition, conservation, quality and quantity. Thus, to improve the teaching of anatomy, other learning objects such as anatomical models, videos and software have been used. Each of these objects has its own qualities as a facilitator of knowledge, a fact represented by the impact they have on the learning of anatomy, translated into an improvement in student grades. However, such learning objects should not replace the use of cadavers; rather, all of these methods should be integrated in order to improve the performance of students. This article presents a review of the literature on the quality and the language of learning objects used to enhance the teaching of human anatomy, in addition to analyzing the influence of computers on changes to the learning objects and content of the discipline of anatomy.
Subjects
Humans, Anatomy/education, Anatomic Models, Software, Teaching Materials
ABSTRACT
Computers are present in everyday school life, and their use through educational software must be mediated and planned so that this resource contributes to student learning, including for students with intellectual disabilities (ID). Therefore, the aim was to propose specific computer activities for students with ID using educational software, and to quantify and analyze the technical and pedagogical strategies used. The participants were six students identified as having ID, enrolled in two public schools. To collect information, we used observation protocols and a field journal. Data were analyzed quantitatively and qualitatively, based on the concepts of mediation and the zone of proximal development from cultural-historical theory. The results indicated that when the content developed in the computer classes converged with the activities proposed in the classroom, students with ID had opportunities to experience different activities that enabled them to succeed. We observed that it was the teaching strategies that enabled the participating students to understand and correctly perform the proposed activities. Thus, we consider that technical knowledge about educational software and pedagogical knowledge about the content being worked on are not, by themselves, sufficient for the proposed activity to contribute to the development of students with ID.
ABSTRACT
In this intervention project, we identified the decentralization of the information forwarded, data losses caused by errors in filling out the BPAs, incorrect registrations in the system, information not being sent owing to human error, a lack of staff training and insufficient operational commitment, all of which cause failures in the information flow process. The information processed by the Health Department of the Municipality of Lagarto/SE covers the monthly reporting of all SUS care provided to residents, as well as medium- and high-complexity care, since the municipality of Lagarto is the seat of a Regional Health Office. The aim is to find a better way to integrate the information systems of the technical sectors of the Municipal Health Department by creating a data processing sector to coordinate the production of records, avoiding errors and thus making information flows faster. In addition, a training programme will be implemented to help technical staff support management decision-making regarding whether or not the Department's agreements and targets are being met. Managerial decisions must be made immediately when agreements are not being fulfilled, and gathering information in a single sector can provide better support for modern management practices, based on the information produced and on its conclusions...