1.
IEEE Trans Biomed Eng ; PP, 2024 Jul 15.
Article in English | MEDLINE | ID: mdl-39008391

ABSTRACT

OBJECTIVE: Pelvic fractures often require fixation through the iliosacral joint, typically guided by fluoroscopy using an untracked C-arm device. However, this involves ionizing radiation exposure and potentially inaccurate screw placement. We introduce the Navigated Orthopaedic Fixations using Ultrasound System (NOFUSS), a radiation-free ultrasound (US)-based end-to-end system providing real-time navigation for iliosacral screw (ISS) insertions. METHODS: We performed surgeries on 8 human cadaver specimens, inserting four ISSs per specimen to directly compare NOFUSS against conventional fluoroscopy. Six specimens yielded usable (marginal or adequate quality) US images. RESULTS: The median screw entry error, midpoint error, and angulation error for NOFUSS were 8.4 mm, 7.0 mm, and 1.4°, compared to 7.5 mm (p = 0.52), 5.7 mm (p = 0.30), and 4.4° (p = 0.001) for fluoroscopy, respectively. NOFUSS resulted in 6 (50%) breaches, compared to 2 (16.7%) for fluoroscopy (p = 0.19). The median insertion time was 7m 37s and 12m 36s per screw for NOFUSS and fluoroscopy, respectively (p = 0.002). The median radiation exposure during the fluoroscopic procedure was 2m 44s (range: 1m 44s - 3m 18s), with no radiation required for NOFUSS. When considering the three cadavers that yielded only adequate-quality US images (12 screws), the measured entry errors were 3.6 mm and 8.1 mm for NOFUSS and fluoroscopy, respectively (p = 0.06). CONCLUSION: NOFUSS achieved insertion accuracies on par with the conventional fluoroscopic method and significantly reduced insertion times and radiation exposure. SIGNIFICANCE: This study demonstrated the feasibility of an automated, radiation-free, US-based surgical navigation system for ISS insertions.
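
The accuracy metrics quoted above (entry error, midpoint error, angulation) are standard geometric comparisons between a planned and an achieved screw axis. A minimal sketch of how such metrics can be computed from two 3D axes, with made-up coordinates; this is illustrative only and is not taken from the NOFUSS software:

    import numpy as np

    def screw_errors(planned_entry, planned_tip, actual_entry, actual_tip):
        """Entry/midpoint distances (mm) and angulation (degrees) between two screw axes."""
        p0, p1 = np.asarray(planned_entry, float), np.asarray(planned_tip, float)
        a0, a1 = np.asarray(actual_entry, float), np.asarray(actual_tip, float)
        entry_error = np.linalg.norm(a0 - p0)
        midpoint_error = np.linalg.norm((a0 + a1) / 2 - (p0 + p1) / 2)
        u, v = p1 - p0, a1 - a0
        cos_angle = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        angulation = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
        return entry_error, midpoint_error, angulation

    # Example with hypothetical coordinates (mm)
    print(screw_errors([0, 0, 0], [0, 0, 80], [3, 2, 1], [4, 3, 82]))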

2.
J Fish Biol ; 2024 Jul 21.
Article in English | MEDLINE | ID: mdl-39034462

ABSTRACT

Current procedures to establish vertebral column regionalization (e.g., histology) in fish are time-consuming and difficult to apply. The aim of this study was to develop a more rapid and accurate radiology-based method for Atlantic salmon (Salmo salar). A detailed analysis of 90 animals (4 kg) led to the establishment of region-specific radiographic hallmarks. To elucidate its transferability to other salmonid species, radiography was carried out in brown trout (Salmo trutta), Arctic char (Salvelinus alpinus), rainbow trout (Oncorhynchus mykiss), pink salmon (Oncorhynchus gorbuscha), and Chinook salmon (Oncorhynchus tshawytscha). The method was also evaluated for whole, ungutted fish. The vertebral column of Atlantic salmon can be subdivided into five regions (R1-R5) based on anatomy: postcranial (R1, V1 and V2), abdominal (R2, V3-V26), transitional (R3, V27-V36), caudal (R4, V37-V53), and ural (R5, V54-V59). The following specific radiographic hallmarks allow the identification of regions: (i) lack of ribs in R1, (ii) modified parapophysis of the first vertebra of R3, (iii) prominent hemal spine of the first vertebra of R4, and (iv) the separated hemal spine of the most cranial pre-ural vertebra of R5. These hallmarks were all transferable to the other salmonid species assessed. The results include a further description of various region-specific characteristics in Atlantic salmon. The method was found applicable to sedated and whole, ungutted fish, confirming it as quick and easy compared with other regionalization methods. The regions defined by radiology in this study agree with the vertebral column regions recently defined for Chinook salmon (O. tshawytscha). Thus, considering the results obtained across the salmonid species examined, the regionalization protocol developed here can be used generally for salmonids.

3.
Environ Sci Technol ; 58(27): 12135-12146, 2024 Jul 09.
Article in English | MEDLINE | ID: mdl-38916220

ABSTRACT

Biosolids are a byproduct of wastewater treatment that can be beneficially applied to agricultural land as a fertilizer. While U.S. regulations limit metals and pathogens in biosolids intended for land application, no organic contaminants are currently regulated. Novel techniques can aid in the detection, evaluation, and prioritization of biosolid-associated organic contaminants (BOCs). For example, nontargeted analysis (NTA) can detect a broad range of chemicals, producing data sets representing thousands of measured analytes that can be combined with computational toxicological tools to support human and ecological hazard assessment and prioritization. We combined NTA with a computer-based tool from the U.S. EPA, the Cheminformatics Hazard Comparison Module (HCM), to identify and prioritize BOCs present in U.S. and Canadian biosolids (n = 16). Four hundred fifty-one features were detected in at least 80% of samples, of which 92 compounds were confirmed or assigned probable structures. These compounds were primarily categorized as endogenous compounds, pharmaceuticals, industrial chemicals, and fragrances. Examples of top prioritized compounds were p-cresol and chlorophene, based on human health end points, and fludioxonil and triclocarban, based on ecological health end points. Combining NTA results with hazard comparison data allowed us to prioritize compounds to be included in future studies of the environmental fate and transport of BOCs.


Subjects
Wastewater; Wastewater/chemistry; Environmental Monitoring/methods; Humans; Organic Compounds/analysis
4.
Phys Med ; 122: 103386, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38805762

ABSTRACT

PURPOSE: Head and neck cancer (HNC) patients in radiotherapy require adaptive treatment plans due to anatomical changes. Deformable image registration (DIR) is used in adaptive radiotherapy, e.g., for deformable dose accumulation (DDA). However, DIR's ill-posedness necessitates addressing uncertainties, which are often overlooked in clinical implementations, and its wider clinical adoption is hindered by the lack of quantitative commissioning and quality assurance tools. This study evaluates one pathway toward more quantitative DDA uncertainties. METHODS: For five HNC patients, each with multiple repeated CTs acquired during treatment, a simultaneous-integrated boost (SIB) plan was optimized. Recalculated doses were warped individually using multiple DIRs from repeated to reference CTs, and voxel-by-voxel dose ranges determined an error-bar for DDA. This was followed by evaluating, in the context of HNC, a previously proposed early-stage DDA uncertainty estimation method tested for lung cancer, which combines geometric DIR uncertainties, dose gradients, and their directional dependence. RESULTS: Applying multiple DIRs showed dose differences, most pronounced in high-dose-gradient regions. The patient with the largest anatomical changes (-13.1 % in ROI body volume) exhibited a 33 % maximum uncertainty in the contralateral parotid, with 54 % of voxels presenting an uncertainty >5 %. Accumulation over multiple CTs partially mitigated uncertainties. The estimation approach predicted 92.6 % of voxels within ±5 % of the reference dose uncertainty across all patients. CONCLUSIONS: DIR variations impact accumulated doses, emphasizing the importance of DDA uncertainty quantification for HNC patients. Dose warping with multiple DIRs aids in quantifying DDA uncertainties. An estimation approach previously described for lung cancer was successfully validated for HNC, for SIB plans presenting different dose gradients, and for accumulated treatments.
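
A minimal sketch of the voxel-by-voxel error-bar idea described in the methods: warp the same recalculated dose with several DIRs and take the per-voxel spread as a DDA uncertainty estimate. The arrays below are random stand-ins; the actual DIR algorithms and patient doses are not reproduced here:

    import numpy as np

    # Hypothetical stack of the same recalculated fraction dose warped to the
    # reference CT by five different DIR algorithms, shape (n_dirs, z, y, x), in Gy.
    rng = np.random.default_rng(0)
    warped_doses = rng.normal(2.0, 0.05, size=(5, 40, 64, 64))

    dose_min = warped_doses.min(axis=0)
    dose_max = warped_doses.max(axis=0)
    error_bar = dose_max - dose_min                 # per-voxel DDA uncertainty (Gy)

    nominal = warped_doses.mean(axis=0)             # one possible nominal accumulated dose
    relative = np.divide(error_bar, nominal,
                         out=np.zeros_like(error_bar), where=nominal > 0)
    print("fraction of voxels with >5% relative uncertainty:",
          float((relative > 0.05).mean()))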


Subjects
Head and Neck Neoplasms; Proton Therapy; Radiation Dosage; Radiotherapy Dosage; Radiotherapy Planning, Computer-Assisted; Head and Neck Neoplasms/radiotherapy; Head and Neck Neoplasms/diagnostic imaging; Humans; Uncertainty; Proton Therapy/methods; Radiotherapy Planning, Computer-Assisted/methods; Image Processing, Computer-Assisted/methods; Tomography, X-Ray Computed
5.
J Hazard Mater ; 471: 134436, 2024 Jun 05.
Article in English | MEDLINE | ID: mdl-38688221

ABSTRACT

Membrane distillation (MD) has received ample recognition for treating complex wastewater, including hypersaline oil and gas (O&G) produced water (PW). Rigorous water quality assessment is critical in evaluating PW treatment because PW contains numerous contaminants beyond the targets listed in general discharge and reuse standards. This study evaluated a novel photocatalytic membrane distillation (PMD) process, with and without a UV light source, against a standard vacuum membrane distillation (VMD) process for treating PW, using targeted analyses and a non-targeted chemical identification workflow coupled with toxicity predictions. PMD with UV light achieved better removal of dissolved organic carbon and ammoniacal nitrogen and a greater reduction in conductivity. Targeted organic analyses identified only trace amounts of acetone and 2-butanone in the distillates. According to the non-targeted analysis, the number of suspects was reduced from 65 in the feed to 25-30 across all distillate samples. Certain physicochemical properties of compounds influenced contaminant rejection in the different MD configurations. According to preliminary toxicity predictions, distillate samples from VMD and from PMD with and without UV contained 21, 22, and 23 suspects associated with critical toxicity concerns, respectively. Overall, non-targeted analysis together with toxicity prediction provides a competent supportive tool to assess treatment efficiency and potential impacts on public health and the environment during PW reuse.

6.
Anal Bioanal Chem ; 416(10): 2565-2579, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38530399

ABSTRACT

Mass-spectrometry-based non-targeted analysis (NTA), in which mass spectrometric signals are assigned chemical identities based on a systematic collation of evidence, is a growing area of interest for toxicological risk assessment. Successful NTA results in better identification of potentially hazardous pollutants within the environment, facilitating the development of targeted analytical strategies to best characterize risks to human and ecological health. A supporting component of the NTA process involves assessing whether suspected chemicals are amenable to the mass spectrometric method, which is necessary in order to assign an observed signal to a chemical structure. Prior work from this group involved the development of a random forest model for predicting the amenability of 5517 unique chemical structures to liquid chromatography-mass spectrometry (LC-MS). This work improves the interpretability of the group's prior model of the same endpoint and integrates 1348 additional data points across negative and positive ionization modes. We enhance interpretability by feature engineering, a machine learning practice that reduces the input dimensionality while attempting to preserve performance statistics. We emphasize the importance of interpretable machine learning models within the context of building confidence in NTA identification. The new data were curated by expert labeling of compounds as amenable or unamenable, resulting in an enhanced set of chemical compounds that expands the applicability domain of the prior model. The balanced accuracy benchmark of the newly developed model is comparable to previously reported performance (mean CV BA of 0.84 vs. 0.82 in positive mode, and 0.85 vs. 0.82 in negative mode), while on a novel external set derived from this work's data, the Matthews correlation coefficients (MCC) for the new models are 0.66 and 0.68 for positive and negative mode, respectively. Our group's previously published models scored MCCs of 0.55 and 0.54 on the same external sets. This demonstrates appreciable improvement across the chemical space captured by the expanded dataset. This work forms part of our ongoing efforts to develop models with higher interpretability and higher performance to support NTA efforts.
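
A minimal sketch of the kind of evaluation reported above (cross-validated balanced accuracy plus Matthews correlation on a held-out set), using scikit-learn with synthetic descriptors standing in for the curated LC-MS amenability data; it is not the published model:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score, train_test_split
    from sklearn.metrics import matthews_corrcoef

    # Synthetic stand-in for molecular descriptors labelled amenable/unamenable
    X, y = make_classification(n_samples=2000, n_features=50, n_informative=15, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

    model = RandomForestClassifier(n_estimators=300, random_state=0)
    cv_ba = cross_val_score(model, X_train, y_train, cv=5, scoring="balanced_accuracy")
    print("mean CV balanced accuracy:", cv_ba.mean())

    model.fit(X_train, y_train)
    print("external-set MCC:", matthews_corrcoef(y_test, model.predict(X_test)))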

7.
Sci Total Environ ; 927: 171153, 2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38460683

ABSTRACT

About 3 billion new tires are produced each year and about 800 million tires become waste annually. Global dependence upon tires produced from natural rubber and petroleum-based compounds represents a persistent and complex environmental problem with only partial and, oftentimes, ineffective solutions. Tire emissions may be in the form of whole tires, tire particles, and chemical compounds, each of which is transported through various atmospheric, terrestrial, and aquatic routes in the natural and built environments. Production and use of tires generate multiple heavy metals, plastics, PAHs, and other compounds that can be toxic alone or as chemical cocktails. Used tires require storage space, are energy intensive to recycle, and generally have few post-wear uses that are not also potential sources of pollutants (e.g., crumb rubber, pavements, burning). Tire particles emitted during use are a major component of microplastics in urban runoff and a source of unique and highly potent toxic substances. Thus, tires represent a ubiquitous and complex pollutant that requires a comprehensive examination to develop effective management and remediation. We approach the issue of tire pollution holistically by examining the life cycle of tires across production, emissions, recycling, and disposal. In this paper, we synthesize recent research and data about the environmental and human health risks associated with the production, use, and disposal of tires and discuss gaps in our knowledge about fate and transport, as well as the toxicology of tire particles and chemical leachates. We examine potential management and remediation approaches for addressing exposure risks across the life cycle of tires. We consider tires as pollutants across three levels: tires in their whole state, as particulates, and as a mixture of chemical cocktails. Finally, we discuss information gaps in our understanding of tires as a pollutant and outline key questions to improve our knowledge and ability to manage and remediate tire pollution.

8.
Cureus ; 16(2): e54653, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38523937

ABSTRACT

Objective The objective of this study was to compare the outcomes of hospital mortality, requirement of invasive ventilation, vasopressor requirement, duration of vasopressor requirement, and duration of intensive care unit (ICU) stay among the different causes of sepsis and to determine which cause of sepsis had the most severe outcomes. Methods A retrospective chart review was performed in critically ill adult patients who were admitted with sepsis to the ICU from July 2017 until July 2019. Acute Physiology and Chronic Health Evaluation (APACHE) IV scores were calculated on day one of ICU admission. Each patient was then evaluated for the outcomes of hospital mortality, need for invasive ventilation, requirement of vasopressors, duration of vasopressors, and duration of ICU stay. The outcomes were then compared between the different sources of sepsis to determine which source of sepsis had the highest severity. Results In total, 176 patients were included in the study: 93 were admitted with respiratory sepsis, 26 with gastrointestinal sepsis, 31 with urosepsis, and 26 with other miscellaneous causes of sepsis. Hospital mortality was highest in the respiratory sepsis group at 32%, with a trend toward statistical significance (P = 0.057). ICU stay duration was highest in patients with respiratory sepsis at six days, with a statistically significant P value of < 0.001. The need for invasive ventilation was highest in patients with respiratory sepsis at 64%, with a statistically significant P value of < 0.001. The requirement of vasopressor support was highest in patients with respiratory sepsis at 47%, and the duration of vasopressors was highest in both respiratory and gastrointestinal sepsis at three days; however, these differences were not statistically significant. Conclusion Among the different origins of sepsis, patients with respiratory sepsis had the most severe outcomes, with the highest need for invasive ventilation and the longest ICU stay.
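
A minimal sketch of the type of between-group comparison that typically underlies such P values (chi-square for categorical outcomes such as mortality, Kruskal-Wallis for durations), using invented counts rather than the study data; the study's actual statistical workflow is not reproduced here:

    from scipy import stats

    # Hypothetical hospital-mortality counts [died, survived] per sepsis source
    mortality = [[30, 63],   # respiratory
                 [5, 21],    # gastrointestinal
                 [4, 27],    # urosepsis
                 [4, 22]]    # other
    chi2, p, dof, _ = stats.chi2_contingency(mortality)
    print("mortality chi-square p =", p)

    # Hypothetical ICU length-of-stay samples (days) per group
    resp, gi, uro = [6, 7, 8, 5, 9], [3, 4, 2, 5, 3], [2, 3, 4, 2, 3]
    print("ICU stay Kruskal-Wallis p =", stats.kruskal(resp, gi, uro).pvalue)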

9.
Phys Med Biol ; 69(9)2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38537287

ABSTRACT

Objective. Online magnetic resonance imaging (MRI) guidance could be especially beneficial for pencil beam scanned (PBS) proton therapy of tumours affected by respiratory motion. For the first time to our knowledge, we investigate the dosimetric impact of respiratory motion on MRI-guided proton therapy compared to the scenario without magnetic field. Approach. A previously developed analytical proton dose calculation algorithm accounting for perpendicular magnetic fields was extended to enable 4D dose calculations. For two geometrical phantoms and three liver and two lung patient cases, static treatment plans were optimised with and without magnetic field (0, 0.5 and 1.5 T). Furthermore, plans were optimised using gantry angle corrections (0.5 T +5° and 1.5 T +15°) to reproduce similar beam trajectories compared to the 0 T reference plans. The effect of motion was then considered using 4D dose calculations without any motion mitigation and simulating 8-times volumetric rescanning, with motion for the patient cases provided by 4DCT(MRI) data sets. Each 4D dose calculation was performed for different starting phases, and the CTV dose coverage V95% and homogeneity D5%-D95% were analysed. Main results. For the geometrical phantoms with rigid motion perpendicular to the beam and parallel to the magnetic field, a comparable dosimetric effect was observed independent of the magnetic field. Also for the five 4DCT(MRI) cases, the influence of motion was comparable for all magnetic field strengths with and without gantry angle correction. On average, the motion-induced decrease in CTV V95% from the static plan was 17.0% and 18.9% for 1.5 T and 0.5 T, respectively, and 19.9% without magnetic field. Significance. For the first time, this study investigates the combined impact of magnetic fields and respiratory motion on MR-guided proton therapy. The comparable dosimetric effects irrespective of magnetic field strength indicate that the effects of motion for future MR-guided proton therapy may not be worse than for conventional PBS proton therapy.
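
The CTV coverage and homogeneity metrics quoted above (V95%, D5%-D95%) are straightforward dose-volume statistics. A minimal sketch with a synthetic dose array; prescription and voxel values are hypothetical, not from the study:

    import numpy as np

    def coverage_and_homogeneity(ctv_dose, prescription):
        """V95% (fraction of CTV voxels >= 95% of prescription) and D5%-D95% spread."""
        ctv_dose = np.asarray(ctv_dose, float)
        v95 = float((ctv_dose >= 0.95 * prescription).mean()) * 100.0
        d5 = np.percentile(ctv_dose, 95)   # dose received by the hottest 5% of the volume
        d95 = np.percentile(ctv_dose, 5)   # dose received by 95% of the volume
        return v95, d5 - d95

    rng = np.random.default_rng(1)
    ctv_dose = rng.normal(2.0, 0.06, size=10000)   # Gy per fraction, synthetic
    print(coverage_and_homogeneity(ctv_dose, prescription=2.0))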


Subjects
Lung Neoplasms; Proton Therapy; Humans; Proton Therapy/methods; Motion; Radiometry/methods; Protons; Magnetic Resonance Imaging/methods; Radiotherapy Planning, Computer-Assisted/methods; Four-Dimensional Computed Tomography/methods; Lung Neoplasms/diagnostic imaging; Lung Neoplasms/radiotherapy
10.
Anal Chem ; 96(9): 3707-3716, 2024 Mar 05.
Article in English | MEDLINE | ID: mdl-38380899

ABSTRACT

Recent advances in high-resolution mass spectrometry (HRMS) have enabled the detection of thousands of chemicals from a single sample, while computational methods have improved the identification and quantification of these chemicals in the absence of the reference standards typically required in targeted analysis. However, to determine the presence of chemicals of interest that may affect ecological and human health, prioritization strategies must be used to effectively and efficiently highlight chemicals for further investigation. Prioritization can be based on a chemical's physicochemical properties, structure, exposure, and toxicity, in addition to its regulatory status. This Perspective aims to provide a framework for the strategies used for chemical prioritization that can be implemented to facilitate high-quality research and communication of results. These strategies are categorized as either "online" or "offline" prioritization techniques. Online prioritization techniques trigger the isolation and fragmentation of ions from the low-energy mass spectra in real time, with user-defined parameters. Offline prioritization techniques, in contrast, highlight chemicals of interest after the data have been acquired; detected features can be filtered and ranked based on their relative abundance or on the predicted structure, toxicity, and concentration imputed from the tandem mass spectrum (MS2). Here we provide an overview of these prioritization techniques and how they have been successfully implemented and reported in the literature to find chemicals of elevated risk to human and ecological environments. A complete list of software and tools is available from https://nontargetedanalysis.org/.
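
A minimal sketch of an "offline" prioritization step of the kind described above: filter detected features by abundance and detection frequency, then rank by a combined score. Column names, thresholds, and weights are hypothetical and not taken from any specific tool:

    import pandas as pd

    # Hypothetical feature table from an HRMS non-targeted acquisition
    features = pd.DataFrame({
        "feature_id": ["F1", "F2", "F3", "F4"],
        "abundance": [1.2e6, 3.4e4, 8.9e5, 5.0e6],
        "detection_freq": [0.9, 0.2, 0.8, 1.0],      # fraction of samples
        "predicted_toxicity": [0.7, 0.1, 0.9, 0.4],  # 0 (low) to 1 (high), e.g. from a QSAR tool
    })

    # Offline prioritization: keep abundant, frequently detected features,
    # then rank by a simple weighted score.
    kept = features[(features.abundance > 1e5) & (features.detection_freq >= 0.5)].copy()
    kept["priority"] = 0.6 * kept.predicted_toxicity + 0.4 * kept.detection_freq
    print(kept.sort_values("priority", ascending=False))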


Subjects
Environment; Tandem Mass Spectrometry; Humans
11.
Environ Sci Technol ; 58(8): 3690-3701, 2024 Feb 27.
Article in English | MEDLINE | ID: mdl-38350027

ABSTRACT

This study investigated the presence of, and human hazards associated with, pesticides and other anthropogenic chemicals identified in kale grown in urban and rural environments. Pesticides and related compounds (i.e., surfactants and metabolites) in kale samples were evaluated using a nontargeted data acquisition for targeted analysis method, which utilized a pesticide mixture containing >1,000 compounds for suspect screening and quantification. We modeled population-level exposures and assessed noncancer hazards for DEET, piperonyl butoxide, prometon, secbumeton, terbumeton, and spinosyn A using nationally representative estimates of kale consumption across life stages in the US. Our findings indicate that even sensitive populations (e.g., pregnant women and children) are not likely to experience hazards from these select compounds were they to consume kale from this study. However, a strictly nontargeted chemical analytical approach identified a total of 1,822 features across all samples, and principal component analysis revealed that the kale chemical composition may have been affected by agricultural growing practices and environmental factors. Confidence level 2 compounds that were ≥5 times more abundant in the urban samples than in the rural samples (p < 0.05) included chemicals categorized as "flavoring and nutrients" and "surfactants" in the EPA's Chemicals and Products Database. Using the US-EPA's Cheminformatics Hazard Module, we identified that many of the nontarget compounds have predicted toxicity scores of "very high" for several end points related to human health. These aspects would have been overlooked using traditional targeted analysis methods, although more information is needed to ascertain whether the compounds identified through nontargeted analysis are of environmental or human health concern. As such, our approach enabled the identification of potentially hazardous compounds that, based on their hazard assessment scores, merit follow-up investigations.
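
A minimal sketch of the kind of principal component analysis step mentioned above for comparing chemical composition between growing environments; the feature matrix here is random, whereas real input would be NTA feature abundances per kale sample:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # Rows: samples (e.g., urban vs. rural kale); columns: NTA feature abundances
    X = rng.lognormal(mean=10, sigma=1, size=(16, 1822))
    labels = ["urban"] * 8 + ["rural"] * 8

    scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(np.log10(X)))
    for label, (pc1, pc2) in zip(labels, scores):
        print(f"{label:5s}  PC1={pc1:7.2f}  PC2={pc2:7.2f}")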


Subjects
Brassica; Pesticides; Pregnancy; Child; Female; Humans; Farms; Risk Assessment; Pesticides/analysis
12.
Environ Health Perspect ; 132(2): 26001, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38319881

ABSTRACT

BACKGROUND: Per- and polyfluoroalkyl substances (PFAS) encompass a class of chemically and structurally diverse compounds that are extensively used in industry and detected in the environment. The US Environmental Protection Agency (US EPA) 2021 PFAS Strategic Roadmap describes national research plans to address the challenge of PFAS. OBJECTIVES: Systematic Evidence Map (SEM) methods were used to survey and summarize available epidemiological and mammalian bioassay evidence that could inform human health hazard identification for a set of 345 PFAS that were identified by the US EPA's Center for Computational Toxicology and Exposure (CCTE) for in vitro toxicity and toxicokinetic assay testing and through interagency discussions on PFAS of interest. This work builds on the 2022 evidence map that collated evidence on a separate set of ∼150 PFAS. Like our previous work, this SEM does not include PFAS that are the subject of ongoing or completed assessments at the US EPA. METHODS: SEM methods were used to search, screen, and inventory mammalian bioassay and epidemiological literature from peer-reviewed and gray literature sources using manual review and machine-learning software. For each included study, study design details and the health end points examined were summarized in interactive web-based literature inventories. Some included studies also underwent study evaluation and detailed extraction of health end point data. All underlying data are publicly available online as interactive visuals with downloadable metadata. RESULTS: More than 13,000 studies were identified from scientific databases. Screening processes identified 121 mammalian bioassay and 111 epidemiological studies that met the screening criteria. Epidemiological evidence (available for 12 PFAS) mostly assessed the reproductive, endocrine, developmental, metabolic, cardiovascular, and immune systems. Mammalian bioassay evidence (available for 30 PFAS) commonly assessed effects in the reproductive, whole-body, nervous, and hepatic systems. Overall, 41 PFAS had evidence across mammalian bioassay and epidemiology data streams (roughly 11% of searched chemicals). DISCUSSION: No epidemiological or mammalian bioassay evidence was identified for most of the PFAS included in our search. Results from this SEM, our 2022 SEM on ∼150 PFAS, and other PFAS assessment products from the US EPA are compiled into a comprehensive PFAS dashboard that gives researchers and regulators an overview of the current PFAS human health landscape, including data gaps, and can serve as a scoping tool to facilitate prioritization of PFAS-related research and/or risk assessment activities. https://doi.org/10.1289/EHP13423.


Assuntos
Sistemas de Painéis , Fluorocarbonos , Animais , Estados Unidos , Humanos , United States Environmental Protection Agency , Reprodução , Medição de Risco , Fluorocarbonos/toxicidade , Mamíferos
13.
J Cheminform ; 16(1): 19, 2024 Feb 20.
Article in English | MEDLINE | ID: mdl-38378618

ABSTRACT

The rapid increase of publicly available chemical structures and associated experimental data presents a valuable opportunity to build robust QSAR models for applications in different fields. However, a common concern is the quality of both the chemical structure information and the associated experimental data. This is especially true when those data are collected from multiple sources, as chemical substance mappings can contain many duplicate structures and molecular inconsistencies. Such issues can impact the resulting molecular descriptors and their mappings to experimental data and, subsequently, the quality of the derived models in terms of accuracy, repeatability, and reliability. Herein we describe the development of an automated workflow to standardize chemical structures according to a set of standard rules and generate two- and/or three-dimensional "QSAR-ready" forms prior to the calculation of molecular descriptors. The workflow was designed in the KNIME workflow environment and consists of three high-level steps. First, a structure encoding is read, and then the resulting in-memory representation is cross-referenced with any existing identifiers for consistency. Finally, the structure is standardized using a series of operations including desalting, stripping of stereochemistry (for two-dimensional structures), standardization of tautomers and nitro groups, valence correction, neutralization when possible, and removal of duplicates. This workflow was initially developed to support collaborative QSAR modeling projects to ensure consistency of the results from the different participants. It was then updated and generalized for other modeling applications. This included modification of the "QSAR-ready" workflow to generate "MS-ready structures" to support the generation of substance mappings and searches for software applications related to non-targeted analysis mass spectrometry. Both the QSAR-ready and MS-ready workflows are freely available in KNIME, via standalone versions on GitHub, and as docker container resources for the scientific community. Scientific contribution: This work pioneers an automated workflow in KNIME, systematically standardizing chemical structures to ensure their readiness for QSAR modeling and broader scientific applications. By addressing data quality concerns through desalting, stereochemistry stripping, and normalization, it optimizes the accuracy and reliability of molecular descriptors. The freely available resources in KNIME, GitHub, and docker containers democratize access, benefiting collaborative research and advancing diverse modeling endeavors in chemistry and mass spectrometry.
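
A minimal sketch of comparable structure-standardization steps using RDKit's rdMolStandardize, shown only to illustrate the operations named above (desalting, neutralization, tautomer standardization, stereochemistry stripping, duplicate removal); it is not the published KNIME workflow and the inputs are hypothetical:

    from rdkit import Chem
    from rdkit.Chem.MolStandardize import rdMolStandardize

    def qsar_ready(smiles):
        """Rough analogue of the standardization steps described above."""
        mol = Chem.MolFromSmiles(smiles)
        if mol is None:
            return None
        mol = rdMolStandardize.Cleanup(mol)                            # valences, nitro groups, etc.
        mol = rdMolStandardize.LargestFragmentChooser().choose(mol)    # desalting
        mol = rdMolStandardize.Uncharger().uncharge(mol)               # neutralization where possible
        mol = rdMolStandardize.TautomerEnumerator().Canonicalize(mol)  # canonical tautomer
        Chem.RemoveStereochemistry(mol)                                # 2D "QSAR-ready" form
        return Chem.MolToSmiles(mol)

    inputs = ["C1=CC=CC=C1C(=O)[O-].[Na+]", "OC(=O)c1ccccc1"]          # salt and parent acid
    unique = {qsar_ready(s) for s in inputs}                           # duplicate removal
    print(unique)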

14.
bioRxiv ; 2024 Mar 15.
Article in English | MEDLINE | ID: mdl-38168451

ABSTRACT

Memory reactivation during sleep is thought to facilitate memory consolidation. Most sleep reactivation research has examined how reactivation of specific facts, objects, and associations benefits their overall retention. However, our memories are not unitary, and not all features of a memory persist in tandem over time. Instead, our memories are transformed, with some features strengthened and others weakened. Does sleep reactivation drive memory transformation? We leveraged the Targeted Memory Reactivation technique in an object category learning paradigm to examine this question. Participants (20 female, 14 male) learned three categories of novel objects, where each object had unique, distinguishing features as well as features shared with other members of its category. We used a real-time EEG protocol to cue the reactivation of these objects during sleep at moments optimized to generate reactivation events. We found that reactivation improved memory for distinguishing features while worsening memory for shared features, suggesting a differentiation process. The results indicate that sleep reactivation does not act holistically on object memories, instead supporting a transformation process where some features are enhanced over others.

15.
Phys Imaging Radiat Oncol ; 29: 100531, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38292650

ABSTRACT

Background and purpose: Respiratory suppression techniques represent an effective motion mitigation strategy for 4D-irradiation of lung tumors with protons. A magnetic resonance imaging (MRI)-based study applied and analyzed methods for this purpose, including enhanced Deep-Inspiration-Breath-Hold (eDIBH). Twenty-one healthy volunteers (41-58 years) underwent thoracic MR scans in four imaging sessions containing two eDIBH-guided MRIs per session to simulate motion-dependent irradiation conditions. The automated MRI segmentation algorithm presented here was critical in determining the lung volumes (LVs) achieved during eDIBH. Materials and methods: The study included 168 MRIs acquired under eDIBH conditions. The lung segmentation algorithm consisted of four analysis steps: (i) image preprocessing, (ii) MRI histogram analysis with thresholding, (iii) automatic segmentation, (iv) 3D-clustering. To validate the algorithm, 46 eDIBH-MRIs were manually contoured. Sørensen-Dice similarity coefficients (DSCs) and relative deviations of LVs were determined as similarity measures. Assessment of intrasessional and intersessional LV variations and their differences provided estimates of statistical and systematic errors. Results: Lung segmentation time for 100 2D-MRI planes was ∼10 s. Compared to manual lung contouring, the median DSC was 0.94 with a lower 95 % confidence level (CL) of 0.92. The relative volume deviations yielded a median value of 0.059 and 95 % CLs of -0.013 and 0.13. Artifact-based volume errors, mainly of the trachea, were estimated. Estimated statistical and systematic errors ranged between 6 and 8 %. Conclusions: The presented analytical algorithm is fast, precise, and readily available. The results are comparable to time-consuming, manual segmentations and other automatic segmentation approaches. Post-processing to remove image artifacts is under development.
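
A minimal sketch of two pieces of the pipeline described above: threshold-based segmentation with 3D connected-component clustering, and the Sørensen-Dice coefficient used for validation. The threshold, the synthetic volume, and the "keep two largest components" rule are illustrative assumptions, not the published algorithm:

    import numpy as np
    from scipy import ndimage

    def segment_lungs(volume, threshold):
        """Keep the two largest low-intensity 3D components as a crude lung mask."""
        mask = volume < threshold                      # air-like voxels in MRI
        labels, n = ndimage.label(mask)                # 3D clustering
        if n < 2:
            return mask
        sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
        keep = np.argsort(sizes)[-2:] + 1              # two largest components
        return np.isin(labels, keep)

    def dice(a, b):
        a, b = a.astype(bool), b.astype(bool)
        return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

    rng = np.random.default_rng(0)
    vol = rng.normal(100, 10, size=(40, 64, 64))
    vol[10:30, 10:30, 10:30] = 20                      # synthetic "lung"
    auto = segment_lungs(vol, threshold=50)
    manual = np.zeros_like(vol, dtype=bool)
    manual[10:30, 10:30, 10:30] = True
    print("DSC:", dice(auto, manual))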

16.
Phys Med ; 118: 103301, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38290179

ABSTRACT

PURPOSE: The aim of this work was to investigate the feasibility of the Jagiellonian Positron Emission Tomography (J-PET) scanner for intra-treatment proton beam range monitoring. METHODS: Monte Carlo simulation studies with GATE and PET image reconstruction with CASToR were performed in order to compare six J-PET scanner geometries. We simulated proton irradiation of a PMMA phantom with a Single Pencil Beam (SPB) and a Spread-Out Bragg Peak (SOBP) of various ranges. The sensitivity and precision of each scanner were calculated, and, considering the setup's cost-effectiveness, we indicated potentially optimal geometries for a J-PET scanner prototype dedicated to proton beam range assessment. RESULTS: The investigations indicate that the double-layer cylindrical and triple-layer double-head configurations are the most promising for clinical application. We found that the scanner sensitivity is of the order of 10^-5 coincidences per primary proton, while the precision of the range assessment for both SPB and SOBP irradiation plans was below 1 mm. Among the scanners with the same number of detector modules, the best results were found for the triple-layer dual-head geometry. CONCLUSIONS: We performed simulation studies demonstrating that PET-based monitoring of the proton beam range with the J-PET detector is feasible, with sensitivity and precision sufficient to enable pre-clinical tests in the clinical proton therapy environment. Considering sensitivity, precision, and cost-effectiveness, the double-layer cylindrical and triple-layer dual-head J-PET geometry configurations seem promising for future clinical application.


Subjects
Proton Therapy; Protons; Feasibility Studies; Positron-Emission Tomography; Proton Therapy/methods; Phantoms, Imaging; Monte Carlo Method
18.
Med Phys ; 51(1): 579-590, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37166067

ABSTRACT

BACKGROUND: Numerical 4D phantoms, together with associated ground truth motion, offer a flexible and comprehensive data set for realistic simulations in radiotherapy and radiology in target sites affected by respiratory motion. PURPOSE: We present an openly available upgrade to previously reported methods for generating realistic 4DCT lung numerical phantoms, which now incorporate respiratory ribcage motion and improved lung density representation throughout the breathing cycle. METHODS: Density information from reference CTs, together with motion from multiple breathing cycle 4DMRIs, has been combined to generate synthetic 4DCTs (4DCT(MRI)s). Inter-subject correspondence between the CT and MRI anatomy was first established via deformable image registration (DIR) of binary masks of the lungs and ribcage. Ribcage and lung motions were extracted independently from the 4DMRIs using DIR and applied to the corresponding locations in the CT after post-processing to preserve sliding organ motion. In addition, based on the Jacobian determinant of the resulting deformation vector fields, lung densities were scaled on a voxel-wise basis to more accurately represent changes in local lung density. To validate this process, synthetic 4DCTs, referred to as 4DCT(CT)s, were compared to the originating 4DCTs using motion extracted from the latter, and the dosimetric impact of the new features of ribcage motion and density correction was analyzed using pencil beam scanned proton 4D dose calculations. RESULTS: Lung density scaling led to a reduction of the maximum mean lung Hounsfield unit (HU) difference from 45 to 12 HU when comparing simulated 4DCT(CT)s to their originating 4DCTs. Comparing 4D dose distributions calculated on the enhanced 4DCT(CT)s to those on the original 4DCTs yielded 2%/2 mm gamma pass rates above 97%, an average improvement of 1.4% compared to previously reported phantoms. CONCLUSIONS: A previously reported 4DCT(MRI) workflow has been successfully improved, and the resulting numerical phantoms exhibit more accurate lung density representations and realistic ribcage motion.
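
The voxel-wise density scaling described above relies on the Jacobian determinant of the deformation vector field. A minimal sketch of that computation with numpy, where a small random displacement field stands in for a real DIR result and the final scaling rule is only indicative:

    import numpy as np

    def jacobian_determinant(dvf, spacing=(1.0, 1.0, 1.0)):
        """det(I + grad(u)) per voxel for a displacement field dvf of shape (3, z, y, x)."""
        grads = np.empty((3, 3) + dvf.shape[1:])
        for i in range(3):                           # displacement component u_i
            gz, gy, gx = np.gradient(dvf[i], *spacing)
            grads[i] = np.stack([gz, gy, gx])        # du_i/dz, du_i/dy, du_i/dx
        jac = np.moveaxis(grads, (0, 1), (-2, -1))   # shape (..., 3, 3)
        jac = jac + np.eye(3)                        # identity + displacement gradient
        return np.linalg.det(jac)

    rng = np.random.default_rng(0)
    dvf = rng.normal(0, 0.01, size=(3, 20, 32, 32))  # small random displacements (voxels)
    jd = jacobian_determinant(dvf)
    # Local volume change factor; density scaling could, e.g., use rho_new = rho_old / jd
    print(jd.mean(), jd.min(), jd.max())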


Subjects
Four-Dimensional Computed Tomography; Lung Neoplasms; Humans; Four-Dimensional Computed Tomography/methods; Lung/diagnostic imaging; Radiometry/methods; Respiration; Magnetic Resonance Imaging/methods; Phantoms, Imaging; Lung Neoplasms/diagnostic imaging; Lung Neoplasms/radiotherapy; Radiotherapy Planning, Computer-Assisted/methods
19.
Radiother Oncol ; 190: 109973, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37913953

ABSTRACT

BACKGROUND AND PURPOSE: This study investigates whether combined proton-photon therapy (CPPT) improves treatment plan quality compared to single-modality intensity-modulated radiation therapy (IMRT) or intensity-modulated proton therapy (IMPT) for head and neck cancer (HNC) patients. Different proton beam arrangements for CPPT and IMPT are compared, which could be of specific interest concerning potential future upright-positioned treatments. Furthermore, it is evaluated whether the benefits of CPPT remain under inter-fractional anatomical changes for HNC treatments. MATERIAL AND METHODS: Five HNC patients with a planning CT and multiple (4-7) repeated CTs were studied. CPPT with simultaneously optimized photon and proton fluence, single-modality IMPT, and IMRT treatment plans were optimized on the planning CT and then recalculated and reoptimized on each repeated CT. For CPPT and IMPT, plans with different degrees of freedom for the proton beams were optimized. Fixed horizontal proton beam line (FHB), gantry-like, and arc-like plans were compared. RESULTS: Target coverage for CPPT without adaptation is insufficient (average V95% = 88.4%), while adapted plans can recover the initial treatment plan quality for the target (average V95% = 95.5%) and organs at risk. CPPT with increased proton beam flexibility increases plan quality and reduces the normal tissue complication probability (NTCP) of xerostomia and dysphagia. On average, xerostomia NTCP reductions compared to IMRT are -2.7%/-3.4%/-5.0% for CPPT FHB/CPPT Gantry/CPPT Arc. The differences for IMPT FHB/IMPT Gantry/IMPT Arc are +0.8%/-0.9%/-4.3%. CONCLUSION: CPPT for HNC requires adaptive treatments. Increasing proton beam flexibility in CPPT, either by using a gantry or an upright-positioned patient, improves treatment plan quality. However, the photon component is then substantially reduced; therefore, the balance between improved plan quality and costs must be further determined.
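
The NTCP values compared above come from published dose-response models. A minimal, generic sketch of the Lyman-Kutcher-Burman form that is often used for such end points; the DVH and the parameter values below are placeholders and are not the models or parameters used in the study:

    import numpy as np
    from scipy.stats import norm

    def lkb_ntcp(doses, volumes, td50, m, n):
        """Lyman-Kutcher-Burman NTCP from a differential DVH (doses in Gy, volumes as fractions)."""
        volumes = np.asarray(volumes, float) / np.sum(volumes)
        eud = np.sum(volumes * np.asarray(doses, float) ** (1.0 / n)) ** n  # generalized EUD
        t = (eud - td50) / (m * td50)
        return norm.cdf(t)

    # Hypothetical parotid DVH and placeholder model parameters
    doses = [5, 15, 25, 35, 45]
    volumes = [0.3, 0.3, 0.2, 0.1, 0.1]
    print("NTCP:", lkb_ntcp(doses, volumes, td50=39.9, m=0.40, n=1.0))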


Subjects
Head and Neck Neoplasms; Proton Therapy; Radiotherapy, Intensity-Modulated; Xerostomia; Humans; Proton Therapy/adverse effects; Protons; Radiotherapy Dosage; Radiotherapy Planning, Computer-Assisted; Head and Neck Neoplasms/radiotherapy; Head and Neck Neoplasms/etiology; Radiotherapy, Intensity-Modulated/adverse effects; Organs at Risk; Xerostomia/etiology
20.
J Expo Sci Environ Epidemiol ; 34(1): 136-147, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37193773

ABSTRACT

BACKGROUND: The number of chemicals present in the environment exceeds the capacity of government bodies to characterize risk. Therefore, data-informed and reproducible processes are needed for identifying chemicals for further assessment. The Minnesota Department of Health (MDH), under its Contaminants of Emerging Concern (CEC) initiative, uses a standardized process to screen potential drinking water contaminants based on toxicity and exposure potential. OBJECTIVE: Recently, MDH partnered with the U.S. Environmental Protection Agency (EPA) Office of Research and Development (ORD) to accelerate the screening process via development of an automated workflow accessing relevant exposure data, including exposure new approach methodologies (NAMs) from ORD's ExpoCast project. METHODS: The workflow incorporated information from 27 data sources related to persistence and fate, release potential, water occurrence, and exposure potential, making use of ORD tools for harmonization of chemical names and identifiers. The workflow also incorporated data and criteria specific to Minnesota and to MDH's regulatory authority. The collected data were used to score chemicals using quantitative algorithms developed by MDH. The workflow was applied to 1867 case study chemicals, including 82 chemicals that had previously been manually evaluated by MDH. RESULTS: Evaluation of the automated and manual results for these 82 chemicals indicated reasonable agreement between the scores, although agreement depended on data availability; automated scores were lower than manual scores for chemicals with fewer available data. Case study chemicals with high exposure scores included disinfection by-products, pharmaceuticals, consumer product chemicals, per- and polyfluoroalkyl substances, pesticides, and metals. Scores were integrated with in vitro bioactivity data to assess the feasibility of using NAMs for further risk prioritization. SIGNIFICANCE: This workflow will allow MDH to accelerate exposure screening and expand the number of chemicals examined, freeing resources for in-depth assessments. The workflow will be useful for screening large libraries of chemicals as candidates for the CEC program.
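
A minimal sketch of the general pattern behind such automated scoring: harmonize per-source indicators into one exposure score per chemical. The sources, weights, and values are invented for illustration; MDH's actual quantitative algorithms are not reproduced here:

    import pandas as pd

    # Hypothetical harmonized indicators pulled from several exposure-relevant sources
    chemicals = pd.DataFrame({
        "dtxsid": ["DTXSID001", "DTXSID002", "DTXSID003"],
        "persistence": [0.8, 0.2, 0.5],         # scaled 0-1 from fate data
        "release_potential": [0.6, 0.9, 0.1],
        "water_occurrence": [0.7, 0.4, 0.0],
        "exposure_nam": [0.5, 0.8, 0.2],        # e.g., ExpoCast-style predictions
    })

    weights = {"persistence": 0.2, "release_potential": 0.3,
               "water_occurrence": 0.3, "exposure_nam": 0.2}

    chemicals["exposure_score"] = sum(w * chemicals[c] for c, w in weights.items())
    print(chemicals.sort_values("exposure_score", ascending=False))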


Subjects
Drinking Water; Humans; United States; Workflow; Algorithms; Data Collection; Minnesota