ABSTRACT
Defining prognostic variables in T-lymphoblastic lymphoma (T-LL) remains a challenge. AALL1231 was a Children's Oncology Group phase 3 clinical trial for newly diagnosed patients with T acute lymphoblastic leukemia or T-LL, randomizing children and young adults treated on a modified augmented Berlin-Frankfurt-Münster backbone to receive standard therapy (arm A) or standard therapy with the addition of bortezomib (arm B). Optional bone marrow samples to assess minimal residual disease (MRD) at the end of induction (EOI) were collected from patients with T-LL and analyzed to assess the correlation of MRD at EOI with event-free survival (EFS). Eighty-six (41%) of the 209 patients with T-LL accrued to this trial submitted samples for MRD assessment. Patients with MRD <0.1% (n = 75) at EOI had a superior 4-year EFS vs those with MRD ≥0.1% (n = 11) (89.0% ± 4.4% vs 63.6% ± 17.2%; P = .025). Overall survival did not significantly differ between the 2 groups. Cox regression for EFS using arm A as a reference demonstrated that MRD at EOI ≥0.1% was associated with a greater risk of inferior outcome (hazard ratio, 3.73; 95% confidence interval, 1.12-12.40; P = .032), which was independent of treatment arm assignment. Incorporating MRD at EOI into future trials will help establish its value in defining risk groups. CT# NCT02112916.
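The survival comparison and Cox model summarized above follow standard time-to-event methodology; the short sketch below shows how such an analysis could be set up with the lifelines library, using invented column names and a small synthetic data frame rather than the AALL1231 data.

# Illustrative sketch only: Kaplan-Meier EFS by MRD group and a Cox model on synthetic data.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical columns: years to event/censoring, event indicator, MRD >= 0.1% flag, bortezomib arm flag
df = pd.DataFrame({
    "efs_years": [4.1, 3.2, 0.9, 4.0, 2.5, 1.1, 4.2, 1.7, 3.8, 2.9],
    "event":     [0,   0,   1,   0,   1,   1,   0,   1,   0,   0],
    "mrd_high":  [0,   0,   1,   1,   1,   0,   0,   1,   0,   0],
    "arm_b":     [0,   1,   0,   1,   0,   1,   0,   1,   0,   1],
})

# Kaplan-Meier estimate of EFS at 4 years within each MRD group
for label, grp in df.groupby("mrd_high"):
    kmf = KaplanMeierFitter()
    kmf.fit(grp["efs_years"], event_observed=grp["event"], label=f"MRD_high={label}")
    print(label, float(kmf.predict(4.0)))  # estimated 4-year EFS for this group

# Cox proportional hazards model: hazard associated with MRD >= 0.1%, adjusted for treatment arm
cph = CoxPHFitter()
cph.fit(df, duration_col="efs_years", event_col="event")
cph.print_summary()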
Subject(s)
Antineoplastic Combined Chemotherapy Protocols; Neoplasm, Residual; Precursor T-Cell Lymphoblastic Leukemia-Lymphoma; Humans; Child; Female; Male; Adolescent; Child, Preschool; Precursor T-Cell Lymphoblastic Leukemia-Lymphoma/mortality; Precursor T-Cell Lymphoblastic Leukemia-Lymphoma/drug therapy; Precursor T-Cell Lymphoblastic Leukemia-Lymphoma/pathology; Precursor T-Cell Lymphoblastic Leukemia-Lymphoma/therapy; Antineoplastic Combined Chemotherapy Protocols/therapeutic use; Bortezomib/administration & dosage; Bortezomib/therapeutic use; Young Adult; Disease-Free Survival; Adult; Infant; Prognosis
ABSTRACT
The immunopathology of herpes simplex virus (HSV)-associated neuroinflammation is an intricate and actively studied field. HSV, renowned for its capacity for latent infection, gives rise to a spectrum of neurological manifestations, ranging from mild symptoms to severe encephalitis. The interplay between the virus and the host's immune responses profoundly shapes the outcome of these infections. This review examines the multifaceted immune reactions triggered by HSV within neural tissues, encompassing both innate and adaptive immunity, and considers the delicate equilibrium between immune defence and the potential for immunopathology-induced neural damage. It dissects the roles of diverse immune cells, cytokines, and chemokines in the modulation of neuroinflammation and its subsequent effects. By exploring the mechanisms through which HSV manipulates and exploits the immune response, this review aims to clarify the immunopathology of HSV-associated neuroinflammation. Such an understanding enhances our grasp of viral pathogenesis and holds promise for pioneering therapeutic strategies designed to mitigate the neurological ramifications of HSV infections.
Subject(s)
Herpes Simplex; Simplexvirus; Humans; Neuroinflammatory Diseases; Adaptive Immunity; Cytokines
ABSTRACT
OBJECTIVES: Data supporting routine infectious diseases (ID) consultation in gram-negative bloodstream infection (GN-BSI) are limited. We evaluated the association between ID consultation and mortality in patients with GN-BSI in a retrospective population-wide cohort study in Ontario using linked health administrative databases. METHODS: Hospitalized adult patients with GN-BSI between April 2017 and December 2021 were included. The primary outcome was time to all-cause mortality censored at 30 days, analyzed using a mixed effects Cox proportional hazards model with hospital as a random effect. ID consultation 1-10 days after the first positive blood culture was treated as a time-varying exposure. RESULTS: Of 30 159 patients with GN-BSI across 53 hospitals, 11 013 (36.5%) received ID consultation. Median prevalence of ID consultation for patients with GN-BSI across hospitals was 35.0%, with wide variability (range 2.7%-76.1%, interquartile range 19.6%-41.1%). In total, 1041 (9.5%) patients who received ID consultation died within 30 days, compared to 1797 (9.4%) patients without ID consultation. In the fully adjusted multivariable model, ID consultation was associated with a mortality benefit (adjusted hazard ratio [HR] 0.82, 95% confidence interval [CI] .77-.88, P < .0001; translating to an absolute risk difference of -3.8%, or a number needed to treat [NNT] of 27). Exploratory subgroup analyses of the primary outcome showed that ID consultation may have greater benefit in patients with high-risk features (nosocomial infection, polymicrobial or non-Enterobacterales infection, antimicrobial resistance, or non-urinary tract source). CONCLUSIONS: Early ID consultation was associated with reduced mortality in patients with GN-BSI. If resources permit, routine ID consultation for this patient population should be considered to improve patient outcomes.
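The reported number needed to treat follows directly from the absolute risk difference; the brief sketch below reproduces that arithmetic (the helper function is ours, applied to the point estimate quoted above, with the conventional round-up).

# Illustrative arithmetic: number needed to treat (NNT) from an absolute risk difference.
import math

def nnt_from_risk_difference(risk_difference: float) -> int:
    """NNT is the reciprocal of the absolute risk difference, conventionally rounded up."""
    return math.ceil(1.0 / abs(risk_difference))

# The study reports an adjusted absolute risk difference of -3.8% (ID consultation vs none).
print(nnt_from_risk_difference(-0.038))  # 27, matching the reported NNT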
Subject(s)
Bacteremia; Gram-Negative Bacterial Infections; Referral and Consultation; Humans; Male; Retrospective Studies; Female; Middle Aged; Aged; Gram-Negative Bacterial Infections/mortality; Gram-Negative Bacterial Infections/drug therapy; Gram-Negative Bacterial Infections/epidemiology; Referral and Consultation/statistics & numerical data; Bacteremia/mortality; Bacteremia/microbiology; Bacteremia/epidemiology; Ontario/epidemiology; Aged, 80 and over; Adult; Hospitalization/statistics & numerical data; Proportional Hazards Models; Hospital Mortality; Communicable Diseases/mortality
ABSTRACT
Chronic kidney disease (CKD) is among the leading causes of death and disability, affecting an estimated 800 million adults globally. The underlying pathophysiology of CKD is complex, creating challenges for its management. Primary risk factors for the development and progression of CKD include diabetes mellitus, hypertension, age, obesity, diet, inflammation, and physical inactivity. The high prevalence of diabetes and hypertension in patients with CKD increases the risk for secondary consequences such as cardiovascular disease and peripheral neuropathy. Moreover, the increased prevalence of obesity and chronic systemic inflammation in CKD have downstream effects on critical cellular functions regulating homeostasis. The combination of these factors results in the deterioration of health and functional capacity in those living with CKD. Exercise offers protective benefits for the maintenance of health and function with age, even in the presence of CKD. Despite accumulating data supporting the implementation of exercise for the promotion of health and function in patients with CKD, a thorough description of the responses and adaptations to exercise at the cellular, system, and whole-body levels is currently lacking. Therefore, the purpose of this article is to provide an up-to-date, comprehensive review of the effects of exercise training on vascular endothelial progenitor cells at the cellular level; cardiovascular, musculoskeletal, and neural factors at the system level; and physical function, frailty, and fatigability at the whole-body level in patients with CKD.
Subject(s)
Hypertension; Renal Insufficiency, Chronic; Adult; Humans; Renal Insufficiency, Chronic/complications; Exercise; Hypertension/complications; Obesity/complications; Inflammation
ABSTRACT
Using phylogenomic analysis, we provide a genomic epidemiologic analysis of a large blastomycosis outbreak in Ontario, Canada, caused by Blastomyces gilchristii. The outbreak occurred in a locale where blastomycosis is rarely diagnosed, signaling a possible shift in geographically associated incidence patterns. Results elucidated the fungal population genetic structure, enhancing understanding of the outbreak.
Subject(s)
Blastomyces; Blastomycosis; Disease Outbreaks; Phylogeny; Blastomycosis/epidemiology; Blastomycosis/microbiology; Ontario/epidemiology; Humans; Blastomyces/genetics; Genomics/methods; Molecular Epidemiology; Male; Genome, Fungal; Female; Middle Aged
ABSTRACT
Candida parapsilosis is a common cause of non-albicans candidemia. It can be transmitted in healthcare settings resulting in serious healthcare-associated infections and can develop drug resistance to commonly used antifungal agents. Following a significant increase in the percentage of fluconazole (FLU)-nonsusceptible isolates from sterile site specimens of patients in two Ontario acute care hospital networks, we used whole genome sequence (WGS) analysis to retrospectively investigate the genetic relatedness of isolates and to assess potential in-hospital spread. Phylogenomic analysis was conducted on all 19 FLU-resistant and seven susceptible-dose dependent (SDD) isolates from the two hospital networks, as well as 13 FLU susceptible C. parapsilosis isolates from the same facilities and 20 isolates from patients not related to the investigation. Twenty-five of 26 FLU-nonsusceptible isolates (resistant or SDD) and two susceptible isolates from the two hospital networks formed a phylogenomic cluster that was highly similar genetically and distinct from other isolates. The results suggest the presence of a persistent strain of FLU-nonsusceptible C. parapsilosis causing infections over a 5.5-year period. Results from WGS were largely comparable to microsatellite typing. Twenty-seven of 28 cluster isolates had a K143R substitution in lanosterol 14-α-demethylase (ERG11) associated with azole resistance. As the first report of a healthcare-associated outbreak of FLU-nonsusceptible C. parapsilosis in Canada, this study underscores the importance of monitoring local antimicrobial resistance trends and demonstrates the value of WGS analysis to detect and characterize clusters and outbreaks. Timely access to genomic epidemiological information can inform targeted infection control measures.
Subject(s)
Candida parapsilosis; Fluconazole; Humans; Fluconazole/pharmacology; Retrospective Studies; Microbial Sensitivity Tests; Drug Resistance, Fungal/genetics; Antifungal Agents/pharmacology; Antifungal Agents/therapeutic use; Genomics; Hospitals; Ontario
ABSTRACT
OBJECTIVES: The risk factors and outcomes associated with persistent bacteraemia in Gram-negative bloodstream infection (GN-BSI) are not well described. We conducted a follow-on analysis of a retrospective population-wide cohort to characterize persistent bacteraemia in patients with GN-BSI. METHODS: We included all hospitalized patients >18 years old with GN-BSI between April 2017 and December 2021 in Ontario who received follow-up blood culture (FUBC) 2-5 days after the index positive blood culture. Persistent bacteraemia was defined as having a positive FUBC with the same Gram-negative organism as the index blood culture. We identified variables independently associated with persistent bacteraemia in a multivariable logistic regression model. We evaluated whether persistent bacteraemia was associated with increased odds of 30- and 90-day all-cause mortality using multivariable logistic regression models adjusted for potential confounders. RESULTS: In this study, 8807 patients were included; 600 (6.8%) had persistent bacteraemia. Having a permanent catheter, antimicrobial resistance, nosocomial infection, ICU admission, respiratory or skin and soft tissue source of infection, and infection by a non-fermenter or non-Enterobacterales/anaerobic organism were associated with increased odds of having persistent bacteraemia. The 30-day mortality was 17.2% versus 9.6% in those with and without persistent bacteraemia (aOR 1.65, 95% CI 1.29-2.11), while 90-day mortality was 25.5% versus 16.9%, respectively (aOR 1.53, 95% CI 1.24-1.89). Prevalence and odds of developing persistent bacteraemia varied widely depending on causative organism. CONCLUSIONS: Persistent bacteraemia is uncommon in GN-BSI but is associated with poorer outcomes. A validated risk stratification tool may be useful to identify patients with persistent bacteraemia.
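A multivariable logistic regression of the kind described above can be expressed compactly with statsmodels' formula interface; the sketch below uses synthetic data and invented predictor names as placeholders for the Ontario cohort variables (the study's full covariate set is not reproduced).

# Illustrative sketch only: logistic regression for persistent bacteraemia on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "persistent": rng.binomial(1, 0.07, n),        # outcome: positive follow-up blood culture
    "permanent_catheter": rng.binomial(1, 0.10, n),
    "nosocomial": rng.binomial(1, 0.20, n),
    "icu_admission": rng.binomial(1, 0.15, n),
    "non_fermenter": rng.binomial(1, 0.10, n),
})

model = smf.logit(
    "persistent ~ permanent_catheter + nosocomial + icu_admission + non_fermenter",
    data=df,
).fit()

# Exponentiated coefficients correspond to adjusted odds ratios with their 95% CIs
print(np.exp(model.params))
print(np.exp(model.conf_int()))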
Subject(s)
Bacteremia; Gram-Negative Bacterial Infections; Humans; Bacteremia/epidemiology; Bacteremia/microbiology; Bacteremia/mortality; Retrospective Studies; Male; Female; Middle Aged; Aged; Ontario/epidemiology; Gram-Negative Bacterial Infections/epidemiology; Gram-Negative Bacterial Infections/mortality; Gram-Negative Bacterial Infections/microbiology; Risk Factors; Gram-Negative Bacteria/isolation & purification; Adult; Blood Culture; Cross Infection/epidemiology; Cross Infection/microbiology; Cross Infection/mortality; Aged, 80 and over; Anti-Bacterial Agents/therapeutic use; Clinical Relevance
ABSTRACT
PURPOSE OF REVIEW: The purpose of this review is to highlight the importance of a multidisciplinary thrombotic microangiopathies (TMA) Team. This goal is accomplished through a review of the complement system, a discussion of the various causes of TMA, and an overview of aspects of their diagnosis and management. In so doing, readers will gain an appreciation for the complexity of this family of disorders and realize the benefit of a dedicated multidisciplinary TMA Team. RECENT FINDINGS: TMA causes derive from multiple specialty areas, are difficult to recognize in a timely manner, pose complex challenges, and require multidisciplinary management. Hematopoietic stem cell transplant-associated TMA (TA-TMA) and TA-TMA-related multiorgan dysfunction syndrome (TA-TMA MODS) are areas of burgeoning research; use of complement testing and eculizumab precision dosing has been found to suppress complement activity in TA-TMA better than standard eculizumab dosing. Newer tests are available to risk-stratify obstetric patients at risk for severe pre-eclampsia, whose features resemble those of TA-TMA MODS. Numerous disorders may produce TMA-like findings, and a systematic approach aids in their identification. TMA Teams elevate institutional awareness of increasingly recognized TMAs, help expedite diagnostic and therapeutic interventions, create pathways to future TMA-related research, and facilitate access to clinical trials. SUMMARY: Establishment of a TMA Team is valuable in developing the institutional expertise needed to promptly recognize and appropriately manage patients with TMA.
Subject(s)
Medicine; Thrombotic Microangiopathies; Humans; Thrombotic Microangiopathies/diagnosis; Thrombotic Microangiopathies/etiology; Thrombotic Microangiopathies/therapy; Complement System Proteins
ABSTRACT
OBJECTIVE: To evaluate temporal and regional variation in biologic and targeted synthetic DMARD (b/tsDMARD) initiation for rheumatoid arthritis (RA) in England and Wales. METHODS: An observational cohort study was conducted for people with RA enrolled in the National Early Inflammatory Arthritis Audit (NEIAA) between May 2018 and April 2022 who had 12-month follow-up data. Temporal trends in escalation to b/tsDMARDs within 12 months of initial rheumatology assessment were explored, including comparisons before and after publication (July 2021) of national guidelines that lowered the threshold for b/tsDMARD initiation to include moderate-severity RA. Case-mix-adjusted, mixed-effects regression was used to evaluate regional and hospital-level variation in b/tsDMARD initiation. RESULTS: Of 6,098 RA patients with available follow-up, 508 (8.3%) initiated b/tsDMARDs within 12 months of initial assessment. b/tsDMARD escalation increased marginally towards the end of the study period (9.2% in May 2021/22); however, no significant differences were evident after guidelines were published permitting b/tsDMARDs for moderate-severity RA. The proportion of individuals escalated to b/tsDMARDs varied considerably between regions, ranging from 5.1% in Wales to 10.7% in North-West England. Following case-mix adjustment, the intraclass correlation (ICC) for hospitals within regions was 0.17, compared with a between-region ICC of 0.0, suggesting that the observed regional variation reflected hospital-level differences rather than systematic differences between regions themselves. CONCLUSION: There is marked variation in escalation to b/tsDMARDs for people newly diagnosed with RA throughout England and Wales, despite a universal healthcare system. These disparities must be addressed if we are to deliver equitable access to b/tsDMARDs, regardless of geography.
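The hospital- and region-level intraclass correlations quoted above are derived from the variance components of the mixed-effects model; one common latent-variable formulation for a logistic model with nested random intercepts is sketched below, with placeholder variance components rather than the audit's estimates (the audit's exact ICC definition may differ).

# Illustrative sketch: ICCs from a three-level logistic mixed model
# (patients within hospitals within regions), using the latent-scale residual pi^2/3.
import math

def nested_iccs(var_region: float, var_hospital: float) -> tuple[float, float]:
    var_residual = math.pi ** 2 / 3                       # logistic latent-scale residual variance
    total = var_region + var_hospital + var_residual
    icc_region = var_region / total                       # correlation for two patients in the same region
    icc_hospital = (var_region + var_hospital) / total    # correlation for two patients in the same hospital
    return icc_region, icc_hospital

# Placeholder variance components chosen only to mirror the reported pattern:
# essentially no between-region variance, appreciable between-hospital variance.
print(nested_iccs(var_region=0.0, var_hospital=0.67))     # ~(0.0, 0.17)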
ABSTRACT
PURPOSE: To determine the risk of endophthalmitis in eyes undergoing intravitreal injections (IVIs) of anti-VEGF based on the cumulative number of injections per eye. DESIGN: Retrospective cohort study. PARTICIPANTS: Patients from a single center undergoing IVIs of ranibizumab, aflibercept, or bevacizumab. METHODS: Eyes were divided into quartiles based on the injection number at which endophthalmitis developed between January 1, 2011, and June 1, 2022. MAIN OUTCOME MEASURES: Interquartile clinical outcomes and cumulative risk of endophthalmitis per injection and per eye. RESULTS: A total of 43 393 eyes received 652 421 anti-VEGF injections resulting in 231 endophthalmitis cases (0.035% per injection, 1 in 2857), of which 215 were included. The cumulative endophthalmitis risk increased from 0.0018% (1 in 55 556) after 1 injection to 0.013% (1 in 7692) after 11 injections (0.0012 percentage point change), versus 0.014% (1 in 7143) after 12 injections to 0.025% (1 in 4000) after 35 injections (0.00049 percentage point change), versus 0.025% (1 in 4000) after 36 injections to 0.031% (1 in 3226) after 66 injections (0.00017 percentage point change), versus 0.031% (1 in 3226) after 63 injections to 0.033% (1 in 3030) after 126 injections (0.000042 percentage point change) (P < 0.001). Likewise, the cumulative endophthalmitis risk per eye increased from 0.028% (1 in 3571) to 0.20% (1 in 500) between injections 1 and 11 (0.018 percentage point change), versus 0.21% (1 in 476) to 0.38% (1 in 263) between injections 12 and 35 (0.0075 percentage point change), versus 0.38% (1 in 263) to 0.46% (1 in 217) between injections 36 and 66 (0.0026 percentage point change), versus 0.46% (1 in 217) to 0.50% (1 in 200) between injections 67 and 126 (0.00063 percentage point change) (P < 0.001). CONCLUSIONS: The cumulative endophthalmitis risk per injection and per eye increased with a greater number of injections received but appeared to do so at a higher rate during earlier injections and at a lower rate further into the treatment course. FINANCIAL DISCLOSURE(S): Proprietary or commercial disclosure may be found after the references.
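The "1 in N" figures above are reciprocals of the cumulative percentages, and the "percentage point change" values are average per-injection slopes over each interval; the small sketch below illustrates both conversions with two of the reported numbers (our helper functions; the exact interval convention used by the authors is not stated).

# Illustrative arithmetic for the reported cumulative-risk figures.
def one_in_n(percent_risk: float) -> int:
    """Convert a cumulative risk expressed in percent to a '1 in N' figure."""
    return round(100.0 / percent_risk)

def per_injection_slope(start_pct: float, end_pct: float, n_injections: int) -> float:
    """Average percentage-point change in cumulative risk per additional injection."""
    return (end_pct - start_pct) / n_injections

print(one_in_n(0.035))                         # ~2857, i.e., 1 in 2857 per injection
print(per_injection_slope(0.0018, 0.013, 10))  # ~0.0011 points per injection over injections 1-11 (reported as 0.0012)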
Subject(s)
Angiogenesis Inhibitors; Bevacizumab; Endophthalmitis; Intravitreal Injections; Ranibizumab; Receptors, Vascular Endothelial Growth Factor; Recombinant Fusion Proteins; Vascular Endothelial Growth Factor A; Endophthalmitis/epidemiology; Humans; Intravitreal Injections/adverse effects; Retrospective Studies; Receptors, Vascular Endothelial Growth Factor/administration & dosage; Angiogenesis Inhibitors/administration & dosage; Female; Recombinant Fusion Proteins/administration & dosage; Vascular Endothelial Growth Factor A/antagonists & inhibitors; Male; Ranibizumab/administration & dosage; Aged; Risk Factors; Bevacizumab/administration & dosage; Middle Aged; Aged, 80 and over; Eye Infections, Bacterial/epidemiology; Incidence
ABSTRACT
Cell and gene therapy is a fast-growing field for cancer therapeutics, requiring reliable instrumentation and technologies. Key parameters essential for satisfying Chemistry, Manufacturing, and Controls criteria are routinely measured using flow cytometry. Recently, image cytometry was developed for cell characterization and cell-based assays but had not yet demonstrated sufficient sensitivity for surface marker detection. We developed the Cellaca® PLX image cytometry system and the respective methodologies required for immunophenotyping, GFP and RFP transfection/transduction efficiencies, and cell health analyses for routine cell characterization. All samples tested were compared directly to results from the CytoFLEX flow cytometer. PBMCs were stained with T-cell surface markers for immunophenotyping, and results show highly comparable CD3, CD4, and CD8 populations (within 5%). GFP- or RFP-expressing cell lines were analyzed for transfection/transduction efficiencies, and the percentage of positive cells and the respective viabilities were equivalent on both systems. Staurosporine-treated Jurkat cells were stained for apoptotic markers, where annexin V and caspase-3 positive cells were within 5% comparing both instruments. The proposed system may provide a complementary tool for performing routine cell-based experiments with improved efficiency and sensitivity compared to prior image cytometers, which may be of significant value to the cell and gene therapy field.
Subject(s)
Apoptosis; Humans; Immunophenotyping; Transfection; Cell Line; Jurkat Cells; Flow Cytometry/methods
ABSTRACT
PURPOSE: Oxalate readily binds calcium ions and is abundant in the human body, with the liver being its major source. The glycolate oxidase 1 (GOX1) gene is solely responsible for glycolate and glyoxylate metabolism, which produces oxalate. This study was designed to assess the association of genetic variants of the GOX1 gene with the risk of hyperoxaluria and renal stone disease in the Indian population. METHOD: The present study is a prospective, candidate-gene case-control study carried out on 300 participants (150 cases and 150 controls) at Muljibhai Patel Urological Hospital, Gujarat, India. Biochemical parameters, including serum calcium, creatinine, and parathyroid hormone, and 24-h urine metabolites were measured. Genotyping of the GOX1 gene variants rs6086287, rs2235250, rs2255183, and rs2294303 was performed with customized TaqMan assay probes by RT-PCR. RESULT: Parathyroid hormone, serum creatinine, and urine metabolites were significantly elevated in patients with nephrolithiasis compared with healthy individuals. All mutated homozygous genotypes, GG (rs6086287), TT (rs2235250), GG (rs2255183), and CC (rs2294303), were significantly associated with a high risk of renal stone disease. Individuals diagnosed with hyperoxaluria and carrying the TG (rs6086287), AG (rs2255183), and TT (rs2294303) genotypes had a significantly higher risk of renal stone disease. Moreover, haplotype and correlation analyses also confirmed the strong association between these genetic variants and nephrolithiasis. CONCLUSION: Genetic variants of the GOX1 gene were associated with renal stone disease. In the presence of a risk genotype and hyperoxaluria, susceptibility to renal stone disease is modulated.
Subject(s)
Alcohol Oxidoreductases; Hyperoxaluria; Kidney Calculi; Humans; Calcium; Case-Control Studies; Kidney Calculi/complications; Hyperoxaluria/genetics; Oxalates/urine; Parathyroid Hormone; Creatinine
ABSTRACT
BACKGROUND: For extended-release drug formulations, effective half-life (t1/2eff) is a relevant pharmacokinetic parameter to inform dosing strategies and time to reach steady state. Tacrolimus, an immunosuppressant commonly used for the prophylaxis of organ rejection in transplant patients, is available as both immediate- and extended-release formulations. To the best of our knowledge, the t1/2eff of tacrolimus from these different formulations has not yet been assessed. The objective of this study was to characterize the t1/2eff and terminal half-life (t1/2z) of an extended-release once-daily tacrolimus formulation (LCPT) and twice-daily immediate-release tacrolimus (IR-Tac). METHODS: A noncompartmental analysis of pharmacokinetic data obtained from a phase 2 study in de novo kidney transplant recipients receiving either LCPT or IR-Tac was conducted. Intensive blood sampling was performed on days 1, 7, and 14, and tacrolimus whole blood concentrations were measured using a validated liquid chromatography with tandem mass spectrometry method. T1/2eff was estimated using within-participant accumulation ratios. T1/2z was estimated by linear regression of the terminal phase of the concentration versus time profile. RESULTS: The median accumulation ratios of LCPT and IR-Tac on day 14 were 3.18 and 2.06, respectively. The median (interquartile range; IQR) t1/2eff for LCPT at day 14 of dosing was 48.4 (37.4-77.9) hours, whereas the t1/2z was 20.3 (17.6-22.9) hours. For IR-Tac, the median (IQR) t1/2eff and t1/2z on day 14 were 12.5 (8.8-23.0) hours and 12.2 (9.2-15.7) hours, respectively. CONCLUSIONS: Consistent with its prolonged release of tacrolimus, LCPT demonstrated a higher accumulation ratio and a longer t1/2eff compared with IR-Tac. These findings underscore the pharmacokinetic differences between different drug formulations of the same moiety and may help inform dose adjustments for LCPT in kidney transplantation.
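Under standard first-order accumulation assumptions, the effective half-life can be recovered from a steady-state accumulation ratio and the dosing interval; the sketch below applies that textbook relationship to the median ratios reported above (this is our illustration, not the study's noncompartmental analysis, and medians of ratios need not exactly reproduce medians of individually estimated half-lives).

# Illustrative sketch: effective half-life from an accumulation ratio R and dosing interval tau,
# using the first-order relationship R = 1 / (1 - 2**(-tau / t_half_eff)).
import math

def effective_half_life(accumulation_ratio: float, tau_hours: float) -> float:
    """Solve R = 1 / (1 - 2**(-tau/t)) for the effective half-life t (hours)."""
    return tau_hours * math.log(2) / math.log(accumulation_ratio / (accumulation_ratio - 1.0))

print(round(effective_half_life(2.06, 12.0), 1))  # ~12.5 h for twice-daily IR-Tac, as reported
print(round(effective_half_life(3.18, 24.0), 1))  # ~44 h for once-daily LCPT (reported median 48.4 h)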
ABSTRACT
PURPOSE: To evaluate anatomic outcomes and surgeon response following the use of microserrated (Sharkskin, Alcon, Fort Worth, TX) internal limiting membrane (ILM) forceps compared with conventional (Grieshaber; Alcon) ILM forceps for peeling of the ILM. METHODS: Patients were prospectively assigned in a 1:1 randomized fashion to undergo ILM peeling using microserrated forceps or conventional forceps. Rates of retinal hemorrhage, deep retinal grasps, and ILM regrasping; time to ILM removal; and responses to a surgeon questionnaire comparing the use of microserrated and conventional ILM forceps were analyzed. RESULTS: A total of 90 eyes of 90 patients were included in this study. The mean number of deep retinal grasps was higher in the conventional forceps group (1.51 ± 1.70 vs. 0.33 ± 0.56; P < 0.0001). The mean number of failed ILM grasps was also higher with conventional forceps (6.62 ± 3.51 vs. 5.18 ± 2.06; P = 0.019). The microserrated forceps provided greater comfort (lower questionnaire scores) in initiating the ILM flap (2.16 ± 0.85 vs. 1.56 ± 0.76, P < 0.001), regrasping the ILM flap (2.51 ± 1.01 vs. 1.98 ± 0.89, P = 0.01), and completing the ILM flap (2.42 ± 1.03 vs. 1.84 ± 1.02, P = 0.01). CONCLUSION: Surgeons using the microserrated forceps performed fewer deep retinal grasps and fewer failed ILM grasps than with the conventional ILM forceps, and they subjectively reported a more favorable experience with the microserrated forceps.
Subject(s)
Basement Membrane; Visual Acuity; Vitrectomy; Humans; Female; Male; Prospective Studies; Basement Membrane/surgery; Vitrectomy/instrumentation; Vitrectomy/methods; Aged; Middle Aged; Surgical Instruments; Epiretinal Membrane/surgery; Tomography, Optical Coherence; Equipment Design; Follow-Up Studies; Surgical Flaps
ABSTRACT
BACKGROUND: Bolus materials have been used for decades in radiotherapy. Most frequently, these materials are utilized to bring dose closer to the skin surface to cover superficial targets optimally. While filling cavities, such as the nasal cavities, is sometimes desirable, traditional commercial bolus is poorly suited to this task, requiring other solutions. Recently, investigators have worked on utilizing 3D printing technology, including commercially available solutions, which can overcome some challenges with traditional bolus. PURPOSE: To utilize failure modes and effects analysis (FMEA) to successfully implement a comprehensive 3D printed bolus solution to replace commercial bolus in our clinic using a series of open-source (or free) software products. METHODS: 3D printed molds for bespoke bolus were created by exporting the DICOM structures of the bolus designed in the treatment planning system and manipulating them to create a multipart mold for 3D printing. A silicone (Ecoflex 00-30) mixture was poured into the mold and cured to form the bolus. Molds for sheet bolus of five thicknesses were also created. A comprehensive FMEA was performed to guide workflow adjustments and QA steps. RESULTS: The process map identified 39 and 30 distinct steps for the bespoke and flat sheet bolus workflows, respectively. The corresponding FMEA highlighted 119 and 86 failure modes, with 69 shared between the processes. Misunderstanding of plan intent was a potential cause for most of the highest-scoring failure modes, indicating that early physics and dosimetry involvement in the process is paramount. CONCLUSION: FMEA informed the design and implementation of QA steps to guarantee a safe and high-quality comprehensive implementation of silicone bolus from 3D printed molds. This approach allows for greater adaptability not afforded by traditional bolus, as well as potential dissemination to other clinics due to the open-source nature of the workflow.
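FMEA implementations typically rank failure modes by a risk priority number, the product of severity, occurrence, and detectability scores; a minimal sketch of that scoring step follows, with invented failure modes and scores, since the manuscript's FMEA entries and scoring scale are not listed here.

# Illustrative FMEA scoring sketch: risk priority number (RPN) = severity x occurrence x detectability.
# The failure modes and 1-10 scores below are invented placeholders, not the published FMEA entries.
from dataclasses import dataclass

@dataclass
class FailureMode:
    description: str
    severity: int       # 1 (negligible) to 10 (catastrophic)
    occurrence: int     # 1 (rare) to 10 (frequent)
    detectability: int  # 1 (always detected) to 10 (undetectable)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detectability

failure_modes = [
    FailureMode("Plan intent misunderstood when exporting the bolus structure", 8, 4, 6),
    FailureMode("Incomplete silicone curing before QA measurement", 5, 3, 4),
    FailureMode("Wrong sheet-bolus thickness selected at the treatment unit", 7, 2, 3),
]

# Rank failure modes so the highest-risk steps drive added QA checks in the workflow.
for fm in sorted(failure_modes, key=lambda f: f.rpn, reverse=True):
    print(fm.rpn, fm.description)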
Subject(s)
Printing, Three-Dimensional; Radiotherapy Dosage; Radiotherapy Planning, Computer-Assisted; Silicones; Software; Workflow; Humans; Radiotherapy Planning, Computer-Assisted/methods; Silicones/chemistry; Radiotherapy, Intensity-Modulated/methods
ABSTRACT
PURPOSE: The United States (US) federal government uses health provider shortage areas (HPSAs) to define patient accessibility to primary care physicians. It is unclear whether HPSAs can be applied to eye care providers (ECPs). Our study determined the applicability of federal HPSA designations to ECP availability in the US. DESIGN: Cross-sectional study. PARTICIPANTS: The US general population and ophthalmologists/optometrists in the Medicare database. METHODS: The primary care HPSA score, visual impairment prevalence, and ECP location were determined for each census tract or county using data from the US Department of Health and Human Services, the Centers for Disease Control and Prevention, and the Centers for Medicare and Medicaid Services. MAIN OUTCOME MEASURES: The association of HPSA score with vision loss and ECP density was examined. The 2-step floating catchment area (FCA) approach was used to newly define eye care shortage areas (patient accessibility score [PAS]; higher scores indicate worse accessibility) for every county in the US by weighting the 2-step FCA scores by prevalence of vision loss and ECP density. Multivariable logistic regression was used to identify sociodemographic variables associated with areas of ECP shortage. RESULTS: Among 72 735 census tracts included, statistically significant but weak correlations of HPSA score with visual impairment (VI) (r = 0.38; P < 0.0001) and ECP density per county population (r = -0.18; P < 0.0001) were found. Only 54.0% of census tracts below the 25th percentile of ECP density per county were HPSAs (P < 0.0001). Of census tracts above the 75th percentile for VI, only 58.0% were HPSAs (P < 0.0001). Multivariable regression found higher odds of ECP PAS ≥ 75th percentile (worse accessibility) in rural counties (adjusted odds ratio [aOR], 2.47; 95% confidence interval [CI], 1.93-3.67; P < 0.001) and counties with a greater prevalence of residents with less than a high school education (aOR, 1.21; 95% CI, 1.19-1.25; P < 0.001), residents ≥ 65 years of age (aOR, 1.10; 95% CI, 1.07-1.13; P < 0.001), and uninsured residents (aOR, 1.04; 95% CI, 1.01-1.06; P < 0.001). Counties with a greater proportion of men (aOR, 0.93; 95% CI, 0.89-0.967; P < 0.001) or White residents (aOR, 0.99; 95% CI, 0.98-0.99) had lower odds of ECP PAS ≥ 75th percentile. CONCLUSIONS: Current HPSAs only weakly correlate with ECP supply. We propose a new approach to identify counties with high need but limited access to eye care. FINANCIAL DISCLOSURE(S): Proprietary or commercial disclosure may be found after the references.
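The 2-step floating catchment area method referenced above has a standard two-stage form: each provider site is assigned a supply-to-demand ratio over the population within its catchment, and each population unit then sums the ratios of reachable sites. The sketch below shows that basic form with toy data; the study's weighting by vision-loss prevalence and ECP density, and its PAS scaling, are not reproduced.

# Illustrative two-step floating catchment area (2SFCA) sketch with toy counties and provider sites.
catchment_miles = 30.0

counties = {          # county -> population
    "A": 50_000,
    "B": 20_000,
    "C": 80_000,
}
providers = {         # provider site -> number of eye care providers
    "P1": 10,
    "P2": 4,
}
distance = {          # (county, provider site) -> travel distance in miles
    ("A", "P1"): 10, ("A", "P2"): 40,
    ("B", "P1"): 25, ("B", "P2"): 20,
    ("C", "P1"): 50, ("C", "P2"): 15,
}

# Step 1: provider-to-population ratio for each site over the population within its catchment
ratio = {}
for site, supply in providers.items():
    demand = sum(pop for county, pop in counties.items()
                 if distance[(county, site)] <= catchment_miles)
    ratio[site] = supply / demand if demand else 0.0

# Step 2: accessibility for each county is the sum of ratios of sites within its catchment
for county in counties:
    access = sum(ratio[site] for site in providers
                 if distance[(county, site)] <= catchment_miles)
    print(county, round(access * 100_000, 1), "providers per 100,000 within catchment")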
Subject(s)
Health Services Accessibility; Medicare; Aged; Male; Humans; United States/epidemiology; Cross-Sectional Studies
ABSTRACT
INTRODUCTION: According to the US Renal Data System (USRDS), patients with end-stage kidney disease (ESKD) on maintenance dialysis had higher mortality during early COVID-19 pandemic. Less is known about the effect of the pandemic on the delivery of outpatient maintenance hemodialysis and its impact on death. We examined the effect of pandemic-related disruption on the delivery of dialysis treatment and mortality in patients with ESKD receiving maintenance hemodialysis in the Veterans Health Administration (VHA) facilities, the largest integrated national healthcare system in the USA. METHODS: Using national VHA electronic health records data, we identified 7,302 Veterans with ESKD who received outpatient maintenance hemodialysis in VHA healthcare facilities during the COVID-19 pandemic (February 1, 2020, to December 31, 2021). We estimated the average change in the number of hemodialysis treatments received and deaths per 1,000 patients per month during the pandemic by conducting interrupted time-series analyses. We used seasonal autoregressive moving average (SARMA) models, in which February 2020 was used as the conditional intercept and months thereafter as conditional slope. The models were adjusted for seasonal variations and trends in rates during the pre-pandemic period (January 1, 2007, to January 31, 2020). RESULTS: The number (95% CI) of hemodialysis treatments received per 1,000 patients per month during the pre-pandemic and pandemic periods were 12,670 (12,525-12,796) and 12,865 (12,729-13,002), respectively. Respective all-cause mortality rates (95% CI) were 17.1 (16.7-17.5) and 19.6 (18.5-20.7) per 1,000 patients per month. Findings from SARMA models demonstrate that there was no reduction in the dialysis treatments delivered during the pandemic (rate ratio: 0.999; 95% CI: 0.998-1.001), but there was a 2.3% (95% CI: 1.5-3.1%) increase in mortality. During the pandemic, the non-COVID hospitalization rate was 146 (95% CI: 143-149) per 1,000 patients per month, which was lower than the pre-pandemic rate of 175 (95% CI: 173-176). In contrast, there was evidence of higher use of telephone encounters during the pandemic (3,023; 95% CI: 2,957-3,089), compared with the pre-pandemic rate (1,282; 95% CI: 1,241-1,324). CONCLUSIONS: We found no evidence that there was a disruption in the delivery of outpatient maintenance hemodialysis treatment in VHA facilities during the COVID-19 pandemic and that the modest rise in deaths during the pandemic is unlikely to be due to missed dialysis.
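Interrupted time-series analyses of this kind are often fitted as regressions with seasonal ARMA errors, a post-interruption level term, and a post-interruption slope term; a minimal sketch using statsmodels' SARIMAX is shown below, with placeholder model orders and a synthetic monthly series rather than the VHA data.

# Illustrative interrupted time-series sketch: monthly rate regressed on a post-pandemic
# level change and slope, with seasonal ARMA errors. Orders and data are placeholders.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
months = pd.date_range("2007-01-01", "2021-12-01", freq="MS")
interruption = months >= "2020-02-01"                     # February 2020 onward

# Synthetic monthly mortality-like rate with seasonality, noise, and a small post-2020 rise
y = (17.0
     + 0.8 * np.sin(2 * np.pi * months.month / 12)
     + 0.05 * np.cumsum(interruption)
     + rng.normal(0, 0.4, len(months)))

exog = pd.DataFrame({
    "level_change": interruption.astype(float),           # conditional intercept (step at Feb 2020)
    "post_slope": np.cumsum(interruption).astype(float),  # conditional slope (months since Feb 2020)
}, index=months)

model = SARIMAX(pd.Series(y, index=months), exog=exog,
                order=(1, 0, 1), seasonal_order=(1, 0, 0, 12))
result = model.fit(disp=False)
print(result.params[["level_change", "post_slope"]])      # estimated pandemic-associated changes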
Subject(s)
COVID-19; Kidney Failure, Chronic; Veterans; Humans; Renal Dialysis; Pandemics; COVID-19/epidemiology; Retrospective Studies
ABSTRACT
PURPOSE: Approximately 80% of brain metastases originate from non-small cell lung cancer (NSCLC). Immune checkpoint inhibitors (ICI) and stereotactic radiosurgery (SRS) are frequently utilized in this setting. However, concerns remain regarding the risk of radiation necrosis (RN) when SRS and ICI are administered concurrently. METHODS: A retrospective study was conducted through the International Radiosurgery Research Foundation. Logistic regression models and competing risks analyses were utilized to identify predictors of any grade RN and symptomatic RN (SRN). RESULTS: The study included 395 patients with 2,540 brain metastases treated with single fraction SRS and ICI across 11 institutions in four countries with a median follow-up of 14.2 months. The median age was 67 years. The median margin SRS dose was 19 Gy; 36.5% of patients had a V12 Gy ≥ 10 cm3. On multivariable analysis, V12 Gy ≥ 10 cm3 was a significant predictor of developing any grade RN (OR: 2.18) and SRN (OR: 3.95). At 1-year, the cumulative incidence of any grade and SRN for all patients was 4.8% and 3.8%, respectively. For concurrent and non-concurrent groups, the cumulative incidence of any grade RN was 3.8% versus 5.3%, respectively (p = 0.35); and for SRN was 3.8% vs. 3.6%, respectively (p = 0.95). CONCLUSION: The risk of any grade RN and symptomatic RN following single fraction SRS and ICI for NSCLC brain metastases increases as V12 Gy exceeds 10 cm3. Concurrent ICI and SRS do not appear to increase this risk. Radiosurgical planning techniques should aim to minimize V12 Gy.
Subject(s)
Brain Neoplasms; Carcinoma, Non-Small-Cell Lung; Lung Neoplasms; Radiosurgery; Humans; Aged; Carcinoma, Non-Small-Cell Lung/radiotherapy; Carcinoma, Non-Small-Cell Lung/secondary; Radiosurgery/adverse effects; Radiosurgery/methods; Immune Checkpoint Inhibitors; Retrospective Studies; Lung Neoplasms/radiotherapy; Lung Neoplasms/pathology; Brain Neoplasms/pathology
ABSTRACT
Cellular therapy development and manufacturing have focused on providing novel therapeutic cell-based products for various diseases. The International Organization for Standardization (ISO) has provided guidance on critical quality attributes (CQAs) that shall be considered when testing and releasing cellular therapeutic products. Cell count and viability measurements are two of the CQAs that are determined during development, manufacturing, testing, and product release. The ISO Cell Counting Standard Parts 1 and 2 addressed the need to improve the quality of cell counting results. However, there is currently no guidance on the qualification and selection of a fit-for-purpose cell viability detection method. In this work, we present strategies for the characterization and comparison of AO/PI and AO/DAPI staining methods using the heat-killed (HK) and low temperature/nutrient-deprived (LT/ND) cell death models to evaluate the comparability of cell viability measurements and identify potential causes of differences. We compared the AO/PI and AO/DAPI staining methods using HK- and LT/ND-generated dead cells, investigated the effects of staining time on cell viability measurements, and determined their viability linearity with different mixtures of live and dead cells. Furthermore, we validated AO/PI and AO/DAPI cell viability measurements with a long-term cell proliferation assay. Finally, we demonstrate a practical example of cell viability measurement comparison using AO/PI and AO/DAPI on antibiotic-selected transduced Jurkat and THP-1 cells to select a fit-for-purpose method for functional genomics screening. The proposed strategies may enable scientists to properly characterize, compare, and select cell viability detection methods that are critical for cellular therapeutic product development and manufacturing.
ABSTRACT
PURPOSE OF REVIEW: The aim of this study was to provide updated data on visual outcomes, microbial spectrum, and complications in eyes with endophthalmitis following cataract surgery. RECENT FINDINGS: A single-institution, retrospective review was conducted of eyes treated for endophthalmitis following cataract surgery between 2 January 2014 and 10 January 2017. This study included 112 cases of endophthalmitis following cataract surgery, 58 of which were culture-positive (51.8%). The most commonly isolated organisms were coagulase-negative staphylococci (56.9%). Oral flora were present in 17.2% of cases. At 6 months, 71.7% of patients achieved visual acuity of 20/200 or better and 51.7% achieved 20/40 or better. Visual acuity was better in culture-negative vs. culture-positive cases (~20/290 vs. ~20/80, P = 0.03), and in nonoral flora-associated vs. oral flora-associated culture-positive cases (~CF [counting fingers] vs. ~20/150, P < 0.01). SUMMARY: Following postcataract surgery endophthalmitis, approximately 70% of eyes achieved vision of 20/200 or better and half achieved vision of 20/40 or better 6 months after treatment. Poor visual outcomes were seen in eyes with positive bacterial cultures and with oral flora.