ABSTRACT
Gastroschisis mortality is 75-100% in low-resource settings. In Rwanda, late deaths are often due to sepsis. We aimed to understand the effect of antimicrobial use on survival. We conducted a retrospective review of gastroschisis patients at a tertiary hospital in Kigali, Rwanda, between January 2016 and June 2019. Demographics, antimicrobial use, microbiology, and outcomes were abstracted. Descriptive and univariate analyses were conducted to assess factors associated with improved survival. Among 92 gastroschisis patients, mortality was 77% (n = 71); 23% (n = 21) died within 48 h. Antibiotics were given on arrival to 98% (n = 90) of patients. Positive blood cultures were obtained in 41% (n = 38). Patients spent 86% (SD = 20%) of their hospital stay on antibiotics, and 38% (n = 35) received second-line agents. There was no difference in age at arrival, birth weight, gestational age, silo complications, or antimicrobial selection between survivors and non-survivors. Patients who died late spent more total hospital days and post-abdominal-closure days on antibiotics than survivors (p < 0.001). There was no difference in the proportion of the hospital stay spent on second-line antibiotics (p = 0.1). CONCLUSION: We identified frequent late deaths, prolonged antibiotic courses, and regular use of second-line antibiotic agents in this retrospective cohort of Rwandan gastroschisis patients. Future studies are needed to evaluate antimicrobial resistance in pediatric surgical patients in Rwanda.
WHAT IS KNOWN:
• Global disparities in gastroschisis outcomes are extreme, with <4% mortality in high-income settings and 75-100% mortality in low-income settings.
• Antimicrobial surveillance data are sparse across Africa, but existing evidence suggests high levels of resistance to first-line antibiotics in Rwanda.
WHAT IS NEW:
• In-hospital survival for gastroschisis was 23% from 2016 to 2019, and most deaths occurred late (>48 h after admission) due to sepsis.
• Rwandan gastroschisis patients received prolonged courses of antibiotics, and second-line antibiotics were frequently used without culture data, raising concern for antimicrobial resistance.
Subjects
Gastroschisis, Humans, Child, Gastroschisis/complications, Gastroschisis/drug therapy, Retrospective Studies, Rwanda/epidemiology, Inpatients, Anti-Bacterial Agents/therapeutic use
ABSTRACT
INTRODUCTION: Neonatal surgical diseases are prime examples of the global disparity in surgical access and outcomes: survival for conditions like gastroschisis exceeds 95% in high-income settings, while the same conditions are usually fatal in low-income settings. This study aims to examine outcomes and predictors of mortality in patients with two specific neonatal surgical conditions that often require early transfer and prolonged inpatient care (gastroschisis and intestinal atresia) at Rwanda's main pediatric referral hospital. METHODS: A single-institution retrospective chart review of neonates with gastroschisis and intestinal atresia was conducted between January 2016 and June 2019. Abstracted data included demographics, referral history, admission interventions, operative details, in-hospital complications, nutrition patterns, length of stay, and mortality. Daily logs were created to evaluate feeding status, infection status, and antibiotic usage. Descriptive and univariate analyses were conducted, with the primary outcome being survival to hospital discharge. RESULTS: A total of 112 patients met inclusion criteria (82% gastroschisis [GS; n = 92] and 18% intestinal atresia [IA; n = 20]). Median age at arrival was 0 d (GS) [IQR 0-1 d] and 8.5 d (IA) [IQR 4-10 d] (P < 0.0001). Survival to discharge was 22.8% (GS; n = 21) and 60% (IA; n = 12), with a mean length of stay of 28.3 d (GS) and 18.4 d (IA). The median number of days to initiation of oral feeds was 8.5 d [IQR 7-11] for gastroschisis survivors. CONCLUSIONS: Neonatal surgical conditions that require early transfer and prolonged nutritional intervention are challenging to manage in low-resource settings, but with treatment by a comprehensive pediatric surgical service, improving survival is possible.
Subjects
Gastroschisis, Intestinal Atresia, Child, Gastroschisis/complications, Gastroschisis/surgery, Hospitalization, Humans, Infant, Newborn, Intestinal Atresia/complications, Intestinal Atresia/epidemiology, Intestinal Atresia/surgery, Retrospective Studies, Rwanda/epidemiology, Treatment Outcome
ABSTRACT
BACKGROUND: Time to hormonal control after definitive management of hyperthyroidism is unknown but may influence patient and physician decision making when choosing between treatment options. We hypothesized that a euthyroid state is achieved faster after thyroidectomy than after radioactive iodine (RAI) ablation. METHODS: A retrospective review of all patients undergoing definitive therapy for hyperthyroidism was performed. Outcomes after thyroidectomy were compared to those after RAI. RESULTS: Over 3 years, 217 patients underwent definitive therapy for hyperthyroidism at a county hospital: 121 patients received RAI, and 96 patients underwent thyroidectomy. Age was equivalent (p = 0.72). More males underwent RAI (25% vs 15%, p = 0.05). Endocrinologists referred for both treatments equally (p = 0.82). Both treatments were offered after a minimum 1-year trial of medical management (p = 0.15). RAI patients mostly had Graves disease (93%), versus 73% of thyroidectomy patients (p < 0.001). Thyroidectomy patients more frequently had eye symptoms (35% vs 13%, p < 0.001), had compressive symptoms (74% vs 15%, p < 0.001), or were pregnant/nursing (14% vs 0, p < 0.001). While all thyroidectomy patients had a documented discussion of all treatment modalities, 79% of RAI patients did not have a documented discussion of the option of surgical management (p < 0.001). Both treatment groups achieved a euthyroid state at similar rates (71% vs 65%, p = 0.39). Thyroidectomy patients became euthyroid faster [3 months (2-7 months) versus 9 months (4-14 months); p < 0.001]. CONCLUSIONS: Thyroidectomy for hyperthyroidism renders a patient euthyroid faster than RAI. This finding may be important for patients and clinicians considering definitive options for hyperthyroidism.
Subjects
Graves Disease/therapy, Iodine Radioisotopes/therapeutic use, Thyroidectomy, Adult, Communication, Female, Graves Disease/blood, Graves Disease/complications, Humans, Male, Middle Aged, Patient Education as Topic, Retrospective Studies, Time Factors, Triiodothyronine/blood
ABSTRACT
The tumor suppressor p53 can be sent to the proteasome for degradation by placing its nucleo-cytoplasmic shuttling under ligand control. Endogenous p53 is ubiquitinated by MDM2 in the nucleus, so controlling the access of p53 to the nuclear compartment regulates its ubiquitination and proteasomal degradation. This was accomplished using a protein switch that places nuclear translocation under the control of externally applied dexamethasone. Fluorescence microscopy revealed that sending protein switch p53 (PS-p53) to the nucleus produces a distinct punctate distribution in both the cytoplasm and nucleus. The nuclear role in accessing the proteasome was investigated by inhibiting classical nuclear export with leptomycin B. Trapping PS-p53 in the nucleus only allowed this punctate staining in that compartment, suggesting that PS-p53 must first translocate to the nuclear compartment for cytoplasmic punctate staining to occur. The role of MDM2 binding was explored by inhibiting the MDM2/p53 interaction with nutlin-3. Inhibition of this interaction blocked both nuclear export and cytoplasmic and nuclear punctate staining, providing evidence that any change in localization after nuclear translocation is due to MDM2 binding. Further, blocking the proteolytic activity of the proteasome maintained the nuclear localization of the construct. Truncations of p53 were made to identify smaller constructs still capable of interacting with MDM2, and their subcellular localization and degradation potential were observed. PS-p53 and a smaller construct containing the two MDM2-binding regions of p53 (Box I + V) were indeed degraded by the proteasome, as measured by loss of the enhanced green fluorescent protein that was also fused to the construct. The influence of these constructs on p53 gene transactivation was assessed, revealing that PS-p53 decreased gene transactivation, while PS-p53(Box I + V) did not significantly change baseline gene transactivation.
Subjects
Cell Nucleus/metabolism, Proteasome Endopeptidase Complex/metabolism, Proto-Oncogene Proteins c-mdm2/metabolism, Tumor Suppressor Protein p53/metabolism, Active Transport, Cell Nucleus, Antineoplastic Agents/pharmacology, Chemistry, Pharmaceutical, Cytoplasm/metabolism, Drug Evaluation, Preclinical, Fatty Acids, Unsaturated/metabolism, Genes, Reporter, Green Fluorescent Proteins/metabolism, Humans, Imidazoles/metabolism, Ligands, MCF-7 Cells, Microscopy, Fluorescence, Mutation, Piperazines/metabolism, Protein Binding, Ubiquitin/metabolism
ABSTRACT
BACKGROUND: Sarcopenia and orthostatic hypotension are growing age-related health burdens associated with adverse outcomes, including falls. Despite a possible pathophysiological link, the association between the 2 disorders is not well elucidated. We sought to investigate this relationship in The Irish Longitudinal Study on Ageing (TILDA). METHODS: Data from 2,858 participants at wave 3 of TILDA were analyzed. Probable sarcopenia was defined as per the European Working Group on Sarcopenia in Older People revised definition cutoffs (hand grip strength [HGS] <27 kg in men, <16 kg in women, and/or 5-chair stand test [5CST] time >15 seconds). Participants underwent an active stand orthostatic test with continuous blood pressure (BP) monitoring. Multilevel mixed-effects models, controlling for possible confounders, were used to assess the effect of probable sarcopenia by HGS and 5CST criteria on the change in BP after standing. RESULTS: HGS- and 5CST-defined probable sarcopenia were independently associated with an attenuated BP recovery at 10-20 seconds poststand (systolic BP: β -0.54, p < .001; β -0.25, p < .001). On average, those meeting HGS probable sarcopenia criteria had a significantly lower BP at 20, 30, and 40 seconds (differences in systolic BP: -5.01 mmHg, -3.68 mmHg, -2.32 mmHg; p < .05 for all). Those meeting 5CST probable sarcopenia criteria had a significant difference in systolic BP at 20 seconds (-1.94 mmHg, p = .002) but not at 30 or 40 seconds. CONCLUSION: Probable sarcopenia had a significant association with delayed orthostatic BP recovery, with HGS-defined probable sarcopenia showing a stronger association than 5CST-defined probable sarcopenia. Results support a modest but significant pathophysiological link between probable sarcopenia and orthostatic hypotension.
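The study's multilevel mixed-effects models are not reproduced here, but the core comparison — mean poststand systolic BP deviation from supine baseline in probable-sarcopenia versus control groups at fixed timepoints — can be sketched in a few lines. All names and numbers below are hypothetical, not TILDA data:

```python
import numpy as np

def bp_deficit_curves(bp, baseline, group):
    """Mean deviation of poststand systolic BP from supine baseline, per group.

    bp       : (n_subjects, n_timepoints) poststand systolic BP readings
    baseline : (n_subjects,) supine baseline systolic BP
    group    : (n_subjects,) bool, True = probable sarcopenia
    """
    deficit = bp - baseline[:, None]          # negative = still below baseline
    return deficit[group].mean(axis=0), deficit[~group].mean(axis=0)

# Synthetic illustration (hypothetical numbers)
rng = np.random.default_rng(0)
base = rng.normal(130, 10, size=200)          # supine baselines, mmHg
grp = np.arange(200) < 80                     # first 80 subjects "sarcopenic"
# Controls recover faster at 10/20/30/40 s poststand than the sarcopenia group
effect = np.where(grp[:, None], [-20.0, -8.0, -5.0, -3.0],
                                [-18.0, -4.0, -2.0, -1.0])
bp = base[:, None] + effect + rng.normal(0, 2, size=(200, 4))
sarc, ctrl = bp_deficit_curves(bp, base, grp)
gap = sarc - ctrl                             # attenuated recovery → negative
```

A mixed-effects model would additionally adjust for confounders and model within-subject correlation; this sketch only shows the group-level curves being compared.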
Subjects
Orthostatic Hypotension, Sarcopenia, Male, Humans, Female, Aged, Longitudinal Studies, Sarcopenia/complications, Sarcopenia/diagnosis, Sarcopenia/epidemiology, Orthostatic Hypotension/complications, Orthostatic Hypotension/diagnosis, Orthostatic Hypotension/epidemiology, Hand Strength, Aging/physiology, Hemodynamics/physiology, Blood Pressure
ABSTRACT
Frailty in older adults is associated with a greater risk of cognitive decline. Brain connectivity insights could help explain the association, but studies are lacking. We applied connectome-based predictive modeling to a 32-item self-reported Frailty Index (FI) using resting-state functional MRI data from The Irish Longitudinal Study on Ageing. A total of 347 participants were included (48.9% male, mean age 68.2 years). From connectome-based predictive modeling, we obtained 204 edges that positively correlated with the FI and composed the "frailty network", characterized by connectivity of the right visual network; and 188 edges that negatively correlated with the FI and formed the "robustness network", characterized by connectivity in the basal ganglia. The highest-degree node of both networks was the caudate, but with different patterns: from the caudate to the visual network in the frailty network, and to the default mode network in the robustness network. The FI was correlated with walking speed but not with metrics of global cognition, reinforcing the match between the FI and the brain connectivity pattern found (main predicted connectivity in the basal ganglia).
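The edge-selection step of connectome-based predictive modeling — correlating each connectivity edge with the frailty index and splitting the selected edges into a positively correlated "frailty network" and a negatively correlated "robustness network" — can be sketched as follows. This is a simplified illustration on synthetic data; the study's thresholds and cross-validation scheme are not reproduced:

```python
import numpy as np

def cpm_edge_networks(edges, phenotype, r_thresh=0.2):
    """Edge selection as in connectome-based predictive modeling (CPM).

    edges     : (n_subjects, n_edges) functional connectivity strengths
    phenotype : (n_subjects,) behavioural measure, e.g. a frailty index
    Returns boolean edge masks (positive and negative networks) and each
    subject's summed connectivity strength within each network.
    """
    x = edges - edges.mean(axis=0)
    y = phenotype - phenotype.mean()
    r = (x * y[:, None]).sum(axis=0) / (
        np.sqrt((x ** 2).sum(axis=0)) * np.sqrt((y ** 2).sum()))
    pos, neg = r > r_thresh, r < -r_thresh    # "frailty" / "robustness" edges
    return pos, neg, edges[:, pos].sum(axis=1), edges[:, neg].sum(axis=1)

# Tiny synthetic check: two edges track the toy FI, the rest are noise
rng = np.random.default_rng(1)
fi = rng.uniform(0, 0.5, 100)                 # toy frailty index per subject
E = rng.normal(size=(100, 50))                # toy edge strengths
E[:, 0] += 10 * fi                            # positively FI-coupled edge
E[:, 1] -= 10 * fi                            # negatively FI-coupled edge
pos, neg, s_pos, s_neg = cpm_edge_networks(E, fi)
```

In full CPM, the summed network strengths would then feed a linear model fitted and evaluated under cross-validation.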
Subjects
Connectome, Frailty, Humans, Male, Aged, Female, Longitudinal Studies, Frailty/diagnostic imaging, Brain/diagnostic imaging, Aging, Magnetic Resonance Imaging, Nerve Net/diagnostic imaging
ABSTRACT
Background: Frailty in older adults has been associated with reduced brain health. However, structural brain signatures of frailty remain understudied. Our aims were to: (1) explore associations between a frailty index (FI) and brain structure on magnetic resonance imaging (MRI); and (2) identify the most important FI features driving the associations. Methods: We designed a cross-sectional observational study from a population-based study (The Irish Longitudinal Study on Ageing, TILDA). Participants aged ≥50 years who underwent the wave 3 MRI sub-study were included. We measured cortex, basal ganglia, and each of the Desikan-Killiany regional volumes. Age- and sex-adjusted correlations were performed with a 32-item self-reported FI that included conditions commonly tested for frailty in research and clinical settings. A graph theory analysis of the network composed of each FI item and cortex volume was performed. White matter fiber integrity was quantified using diffusion tensor imaging (DTI). Results: In 523 participants (mean age 69, 49% men), lower cortex and thalamic volumes were independently associated with a higher FI. Sensory and functional difficulties, diabetes, polypharmacy, knee pain, and self-reported health were the FI items most strongly associated with cortex volume. In the network analysis, cortex volume had a modest influence within the frailty network. Regionally, a higher FI was significantly associated with lower volumes in both orbitofrontal and temporal cortices. DTI analyses revealed inverse associations between the FI and the integrity of some association bundles. Conclusion: The FI used had a recognizable but subtle structural brain signature in this sample. Only some FI deficits were directly associated with cortex volume, suggesting scope for developing FIs that include metrics more specifically related to brain health in future aging neuroscience studies.
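A minimal sketch of the graph-theory step — building a network from pairwise correlations among variables (FI items plus cortex volume) and ranking node influence by degree — assuming a simple absolute-correlation threshold; the study's exact network construction may differ:

```python
import numpy as np

def correlation_network_degree(data, labels, r_thresh=0.3):
    """Build a network whose nodes are variables (e.g. FI items plus cortex
    volume), with an edge wherever |pairwise correlation| exceeds a threshold,
    and return each node's degree as a measure of influence.

    data   : (n_subjects, n_vars)
    labels : list of n_vars variable names
    """
    r = np.corrcoef(data, rowvar=False)
    adj = np.abs(r) > r_thresh
    np.fill_diagonal(adj, False)              # no self-loops
    return dict(zip(labels, adj.sum(axis=0).tolist()))

# Hypothetical 3-variable toy example (not the actual TILDA FI items)
rng = np.random.default_rng(2)
n = 300
pain = rng.normal(size=n)
mobility = 0.8 * pain + 0.4 * rng.normal(size=n)   # strongly pain-linked
cortex = rng.normal(size=n)                        # weakly connected node
deg = correlation_network_degree(
    np.column_stack([pain, mobility, cortex]),
    ["pain", "mobility", "cortex_volume"])
```

Here the weakly connected "cortex_volume" node ends up with low degree, mirroring the abstract's finding that cortex volume had only a modest influence within the frailty network.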
ABSTRACT
Decreasing the time to species identification and antibiotic susceptibility determination of strains recovered from patients with bacteremia significantly decreases morbidity and mortality. Herein, we validated a method to identify Gram-negative bacteria directly from positive blood culture medium using the Bruker MALDI Biotyper and to rapidly perform susceptibility testing using the BD Phoenix.
Subjects
Bacteremia/diagnosis, Bacterial Typing Techniques/methods, Blood/microbiology, Gram-Negative Bacteria/drug effects, Gram-Negative Bacteria/isolation & purification, Gram-Negative Bacterial Infections/diagnosis, Spectrometry, Mass, Matrix-Assisted Laser Desorption-Ionization/methods, Anti-Bacterial Agents/pharmacology, Bacteremia/microbiology, Gram-Negative Bacterial Infections/microbiology, Humans, Microbial Sensitivity Tests/methods, Time Factors
ABSTRACT
PURPOSE: The estrogen receptor forms insoluble aggregates in the insoluble cytoskeletal subcellular fraction when bound to the antagonist fulvestrant. The ligand-binding domain was isolated and fused to signal sequences to target subcellular compartments. Sequestering a pro-apoptotic peptide tested the utility of a protein targeted to the insoluble fraction. METHODS: The ligand-binding domain of the estrogen receptor (ERLBD) was isolated and fused with signal sequences, either a nuclear localization signal or a nuclear export signal. The subcellular localization when bound to fulvestrant was examined, specifically the interaction with cytokeratins 8 and 18. The ability to target a therapeutic peptide to the insoluble fraction was tested by fusing a therapeutic coiled-coil derived from Bcr-Abl, in K562 cells. RESULTS: The estrogen receptor ligand-binding domain responds to fulvestrant by translocating to the insoluble fraction. Adding a signal sequence significantly limited this translocation to either the nucleus or the cytoplasm. The cytokeratin 8/18 status of the cell did not alter this response. The therapeutic coiled-coil fused to the ERLBD was inactivated upon ligand induction. CONCLUSIONS: Isolating the ligand-binding domain of the estrogen receptor creates a ligand-controllable protein capable of translocation to the insoluble fraction. This can be used to sequester an active peptide and alter its function.
Subjects
Estradiol/analogs & derivatives, Estrogen Receptor Modulators/pharmacology, Receptors, Estrogen/chemistry, Receptors, Estrogen/metabolism, Estradiol/pharmacology, Fulvestrant, Humans, Keratins/metabolism, Nuclear Export Signals, Nuclear Localization Signals, Protein Binding, Protein Structure, Tertiary, Protein Transport/drug effects, Solubility
ABSTRACT
The Sustained Attention to Response Task (SART) is a computer-based go/no-go task used to measure neurocognitive function in older adults. However, simplified average features of this complex dataset lead to loss of primary information and fail to express associations between test performance and clinically meaningful outcomes. Here, we combine a novel method to visualise individual trial (raw) information obtained from the SART test in a large population-based study of ageing in Ireland with an automatic clustering technique. We employed a thresholding method, based on the number of mistakes in individual trials, to identify poorer SART performances, and a fuzzy clustering algorithm to partition the dataset into 3 subgroups based on the evolution of SART performance after 4 years. Raw SART data were available for 3468 participants aged 50 years and over at baseline. The previously reported SART visualisation-derived feature 'bad performance', indicating the number of SART trials with at least 4 mistakes, and its evolution over time, combined with the fuzzy c-means (FCM) algorithm, identified 3 clusters corresponding to 3 degrees of physiological dysregulation. The largest cluster (94% of the cohort) consisted of healthy participants, a smaller cluster (5% of the cohort) of participants who showed improvement in cognitive and psychological status, and the smallest cluster (1% of the cohort) of participants whose mobility and cognitive functions declined dramatically after 4 years. We were thus able to identify, in a cohort of relatively high-functioning community-dwelling adults, a very small group of participants who showed clinically significant decline. This smallest subset manifested not only mobility deterioration but also cognitive decline, the latter being usually hard to detect in population-based studies.
The employed techniques could identify at-risk participants with more specificity than current methods, and help clinicians better identify and manage the small proportion of community-dwelling older adults who are at significant risk of functional decline and loss of independence.
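The fuzzy clustering step can be illustrated with a minimal NumPy implementation of fuzzy c-means. This is a simplified sketch: deterministic quantile initialisation and a toy one-dimensional feature (a hypothetical per-participant count of "bad" SART trials) stand in for the study's actual inputs and parameters:

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, iters=100):
    """Minimal fuzzy c-means (FCM). Returns (centers, membership matrix U).

    X : (n_samples, n_features); m is the fuzzifier (m = 2 is conventional).
    Centers are initialised at spread quantiles for a deterministic start.
    """
    centers = np.quantile(X, np.linspace(0.1, 0.9, c), axis=0)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        U = 1.0 / d ** (2 / (m - 1))
        U /= U.sum(axis=1, keepdims=True)     # memberships sum to 1 per sample
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
    return centers, U

# Toy feature: hypothetical counts of 'bad performance' trials per participant
X = np.array([[0.0], [0.2], [0.1], [5.0], [5.2], [4.9], [10.0], [9.8], [10.1]])
centers, U = fuzzy_c_means(X, c=3)
hard = U.argmax(axis=1)                       # defuzzified cluster labels
```

Unlike hard k-means, each participant retains a graded membership in every cluster; the `argmax` defuzzification is only applied at the end to report subgroup sizes.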
ABSTRACT
Gait speed is a measure of general fitness. Changing from usual gait speed (UGS) to maximum gait speed (MGS) requires the coordinated action of many body systems. Gait speed reserve (GSR) is defined as MGS minus UGS. From a shortlist of 88 features across five categories, including sociodemographic, cognitive, and physiological, we aimed to find and compare the sets of predictors that best describe UGS, MGS, and GSR. For this, we leveraged data from 3,925 adults aged 50+ from Wave 3 of The Irish Longitudinal Study on Ageing (TILDA). Features were selected by a stepwise feature selection pipeline based on histogram gradient boosting regression. Each model's feature importance and input-output relationships were explored using TreeExplainer from the SHapley Additive exPlanations (SHAP) explainable machine learning package. The mean adjusted R² (SD) from fivefold cross-validation on training data and the adjusted R² score on test data were 0.38 (0.04) and 0.41 for UGS, 0.45 (0.04) and 0.46 for MGS, and 0.19 (0.02) and 0.21 for GSR. Each model selected features across all categories. Features common to all models were age, grip strength, chair stands time, mean motor reaction time, and height. Exclusive to UGS and MGS were educational attainment, fear of falling, Montreal Cognitive Assessment errors, and orthostatic intolerance. Exclusive to MGS and GSR were body mass index (BMI) and number of medications. No features were selected exclusively for UGS and GSR. Features unique to UGS were resting-state pulse interval, Center for Epidemiologic Studies Depression Scale (CES-D) depression score, sit-to-stand difference in diastolic blood pressure, and left visual acuity. Unique to MGS were the standard deviation in sustained attention to response task times, resting-state heart rate, smoking status, total heartbeat power during paced breathing, and visual acuity. Unique to GSR were the accuracy proportion in a sound-induced flash illusion test, Mini-Mental State Examination errors, and number of cardiovascular conditions.
No interactions were present in the GSR model. The four features that gave the most impactful interactions overall in the UGS and MGS models were age, chair stands time, grip strength, and BMI. These findings may provide new insights into the multisystem predictors of gait speed and gait speed reserve in older adults and support a network physiology approach to their study.
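The study's pipeline used histogram gradient boosting with SHAP explanations; as a simplified stand-in, the stepwise idea can be sketched with greedy forward selection under ordinary least squares, scored by adjusted R². Feature names and data below are synthetic, not TILDA variables:

```python
import numpy as np

def adjusted_r2(y, y_hat, n_features):
    """Adjusted R² penalising the number of predictors."""
    n = len(y)
    r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
    return 1 - (1 - r2) * (n - 1) / (n - n_features - 1)

def forward_select(X, y, names, min_gain=0.01):
    """Greedy forward selection: repeatedly add the feature that most
    improves adjusted R² of an OLS fit, until no candidate adds min_gain."""
    chosen, best = [], -np.inf
    while len(chosen) < X.shape[1]:
        scores = {}
        for j in range(X.shape[1]):
            if j in chosen:
                continue
            cols = chosen + [j]
            A = np.column_stack([np.ones(len(y)), X[:, cols]])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            scores[j] = adjusted_r2(y, A @ beta, len(cols))
        j_best = max(scores, key=scores.get)
        if scores[j_best] - best < min_gain:
            break
        chosen.append(j_best)
        best = scores[j_best]
    return [names[j] for j in chosen], best

# Synthetic gait-speed-like outcome driven by "age" and "grip", plus noise
rng = np.random.default_rng(3)
n = 500
age = rng.normal(65, 8, n)
grip = rng.normal(30, 6, n)
noise = rng.normal(size=(n, 3))
ugs = 2.2 - 0.01 * age + 0.01 * grip + rng.normal(0, 0.05, n)
X = np.column_stack([age, grip, noise])
sel, score = forward_select(X, ugs, ["age", "grip", "n1", "n2", "n3"])
```

Gradient-boosted trees would additionally capture nonlinearities and interactions; this linear sketch only conveys the stepwise add-one-feature-at-a-time logic.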
ABSTRACT
The quantification of biological age in humans is an important scientific endeavor in the face of ageing populations. The frailty index (FI) methodology is based on the accumulation of health deficits and captures variations in health status within individuals of the same age. The aims of this study were to assess whether the addition of age to an FI improves its mortality prediction and whether the associations of the individual FI items differ in strength. We utilized data from The Irish Longitudinal Study on Ageing to conduct sex-stratified machine learning analyses of the ability of a 32-item FI to predict 8-year mortality in 8174 wave 1 participants aged 50 or more years. By wave 5, 559 men and 492 women had died. In the absence of age, the FI was an acceptable predictor of mortality, with AUCs of 0.7. When age was included, AUCs improved to 0.8 in men and 0.9 in women. After age, deficits related to physical function and self-rated health tended to have the highest importance scores. Not all FI variables seemed equally relevant for predicting mortality, and age was by far the most important feature. Chronological age should remain an important consideration when interpreting the prognostic significance of an FI.
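The AUC comparison above can be made concrete with a small rank-based AUC function (the Mann-Whitney formulation). The toy numbers below are illustrative only; they show how adding age to an FI-based risk score can improve discrimination:

```python
def auc(scores, labels):
    """ROC AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive case scores higher than a randomly chosen
    negative case (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical mini-cohort: one death (FI = 0.12) is misranked by FI alone
# but correctly ranked once age is added to the score.
fi   = [0.10, 0.30, 0.05, 0.12, 0.20, 0.35]
age  = [55,   80,   60,   85,   70,   78]
died = [0,    1,    0,    1,    0,    1]
auc_fi = auc(fi, died)                                  # FI alone
auc_fi_age = auc([f + 0.01 * a for f, a in zip(fi, age)], died)
```

In the study itself the scores came from fitted machine learning models rather than a fixed linear combination; the AUC computation is the same either way.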
ABSTRACT
The Sustained Attention to Response Task (SART) has been used to measure neurocognitive function in older adults. However, simplified average features of this complex dataset may result in loss of primary information and fail to express associations between test performance and clinically meaningful outcomes. Here, we describe a new method to visualise individual trial (raw) information obtained from the SART test, vis-à-vis age and groups based on mobility status, in a large population-based study of ageing in Ireland. A thresholding method, based on the number of mistakes in individual trials, was employed to better visualise poorer SART performances, and was statistically validated with binary logistic regression models predicting mobility and cognitive decline after 4 years. Raw SART data were available for 4864 participants aged 50 years and over at baseline. The novel visualisation-derived feature 'bad performance', indicating the number of SART trials with at least 4 mistakes, was the most significant predictor of mobility decline, expressed as the transition from Timed Up-and-Go (TUG) < 12 s to TUG ≥ 12 s (OR = 1.29; 95% CI 1.14-1.46; p < 0.001), and the only significant predictor of new falls (OR = 1.11; 95% CI 1.03-1.21; p = 0.011), in models adjusted for multiple covariates. However, no SART-related variables were significant predictors of cognitive decline, expressed as a decrease of ≥2 points in the Mini-Mental State Examination (MMSE) score. This novel multimodal visualisation could help clinicians easily develop clinical hypotheses. A threshold approach to the evaluation of SART performance in older adults may better identify subjects at higher risk of future mobility decline.
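The odds ratios above come from adjusted logistic regression models; as a simpler illustration of the underlying quantity, an unadjusted odds ratio with a 95% Wald confidence interval can be computed directly from a 2×2 table. The counts below are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald CI from a 2x2 table:
        a = exposed & declined,   b = exposed & stable
        c = unexposed & declined, d = unexposed & stable
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: mobility decline among high vs low 'bad performance'
or_, lo, hi = odds_ratio_ci(40, 160, 60, 740)
```

A logistic regression adjusted for covariates, as used in the study, would shrink or shift this crude estimate; the sketch only shows how an OR and its interval arise from counts.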
ABSTRACT
BACKGROUND: Evaluation of a thyroid nodule is a common referral seen by surgeons and frequently requires ultrasound-guided fine needle aspiration (US-guided FNA). While surgical residents may have sufficient exposure to thyroid surgery, many lack exposure to office-based procedures such as US-guided FNA. General surgery residents should be provided with knowledge and practical skills in the application of diagnostic and interventional neck ultrasound to manage the common workup of a thyroid nodule. METHODS: This study sought to instruct and measure surgical residents' performance in thyroid US-guided FNA and to evaluate their views regarding instituting such a formal curriculum. Twelve (n = 12) senior residents completed a written pretest and questionnaire, then watched an instructional video and practiced a simulated thyroid US-guided FNA on a model we created. Residents were then evaluated while performing actual thyroid US-guided FNAs on patients in our clinic. Finally, residents completed the same written exam and questionnaire as an objective measure. RESULTS: Eight of the chief residents (62%) felt "not comfortable" with the procedure on the pre-course survey; this was reduced to 0% on the post-course survey. Moderate comfort increased from 15% to 50%, and extreme comfort increased from 0% to 8%. Of the 11 residents who completed the pre- and post-test exam, 82% (n = 9) significantly improved their score through the curriculum (pre-test: 40.9 vs. post-test: 61.8; p = 0.05). CONCLUSION: With focused instruction, residents are able to learn ultrasound-guided thyroid biopsy with improvement in subjective confidence and on objective measures. Resident feedback was positive and emphasized the importance of such training in the surgical residency curriculum.
Subjects
Clinical Competence, Education, Medical, Graduate/methods, General Surgery/education, Image-Guided Biopsy, Thyroid Nodule/diagnostic imaging, Ultrasonography, Interventional, Biopsy, Fine-Needle, Curriculum, Humans, Internship and Residency/methods, Simulation Training, Thyroid Nodule/pathology, United States
ABSTRACT
Within social hierarchies, low social status is associated with increased vigilance, hostile expectations, and reactive aggression. We propose that societal devaluation is common across many low social status groups and produces a sense of threatened social worth. Threatened social worth may lead those of low status to be more vigilant towards social threats, thereby increasing the likelihood of hostile attributions and endorsement of aggression. Integrating theory on belongingness, social rejection, and stigma compensation, two studies test a sequential process model demonstrating that threatened social worth mediates the relationship between status, hostile attributions, and endorsement of aggression. Employing a relative status manipulation, Study 2 reveals a causal effect of status and highlights the importance of perceptions of low social status on threatened social worth. These data demonstrate the role of social worth in explaining the link between status and hostility and have implications for research in the social, health, and developmental domains.
Subjects
Aggression/psychology, Hostility, Social Class, Social Perception, Adult, Female, Social Hierarchy, Humans, Income, Male, Models, Theoretical
ABSTRACT
CONTEXT: Timely processing of blood cultures with positive results, including Gram staining and notification of clinicians, is a critical function of the clinical microbiology laboratory. Analysis of processing time in our laboratory revealed opportunities to enhance workflow efficiency. We found that the average time from positive blood culture result to removal of the bottle for processing (positive-to-removal [PR] time) was inadequate for our rapid pathogen identification program. OBJECTIVE: To determine whether increased vigilance about PR time and prioritization of laboratory resources would decrease PR time and total processing time. DESIGN: We performed a retrospective analysis of blood culture PR time 7 months before and 7 months after an in-service meeting during which the importance of PR time was emphasized, and corrective measures were implemented. RESULTS: Before the in-service meeting, the average PR time for 5057 samples was 38 minutes, with an aggregate time of 192,251 minutes. Unexpectedly, we discovered that only 51.8% (2617 of 5057) of the positive blood cultures were removed in less than 10 minutes. After the in-service meeting, for 5293 samples, the average PR time improved to 8 minutes, the aggregate time improved to 44,630 minutes, and 84.5% (4470 of 5293) of the positive blood cultures were removed in less than 10 minutes. These improvements reduced the time to telephone notification of the Gram stain results to a caregiver by 46.7% (from 105 minutes to 56 minutes). CONCLUSIONS: Increased awareness of barriers to rapid pathogen identification and interventions for improving performance time significantly enhanced care of patients with bloodstream infections.
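The PR-time metrics in this study — average and aggregate minutes from positive flag to bottle removal, and the fraction of bottles removed within 10 minutes — reduce to simple arithmetic over timestamp pairs, sketched here on a hypothetical three-bottle log:

```python
from datetime import datetime, timedelta

def pr_time_stats(events, threshold_min=10):
    """Positive-to-removal (PR) statistics from (flagged, removed) pairs.

    events : list of (positive_result_time, bottle_removed_time) datetimes
    Returns (average PR minutes, aggregate PR minutes, fraction under threshold).
    """
    minutes = [(r - p).total_seconds() / 60 for p, r in events]
    avg = sum(minutes) / len(minutes)
    under = sum(m < threshold_min for m in minutes) / len(minutes)
    return avg, sum(minutes), under

# Toy log (hypothetical): three bottles flagged positive, then removed
t0 = datetime(2024, 1, 1, 8, 0)
log = [(t0, t0 + timedelta(minutes=5)),
       (t0, t0 + timedelta(minutes=9)),
       (t0, t0 + timedelta(minutes=40))]
avg, total, frac = pr_time_stats(log)
```

Note that one outlier bottle can dominate the average, which is why the study also tracked the fraction removed in under 10 minutes as a separate target.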
Subjects
Bacteremia/diagnosis, Blood/microbiology, Fungemia/diagnosis, Microbiological Techniques/standards, Bacteremia/microbiology, Fungemia/microbiology, Humans, Laboratories, Hospital, Microbiological Techniques/instrumentation, Microbiological Techniques/methods, Retrospective Studies, Spectrometry, Mass, Matrix-Assisted Laser Desorption-Ionization, Time Factors
ABSTRACT
Lymphadenectomy is the standard of care for metastatic melanoma in the inguinal lymph node basin. Historically, open surgery was the only treatment option. In recent years, however, videoscopic inguinal lymphadenectomy (VIL) has become a popular approach, as it offers a minimally invasive alternative, provides similar oncologic control, and reduces wound complications. Even though the VIL approach is being used more frequently, the patient populations that stand to benefit most from it are still under investigation. Despite continued advances in the safety of laparoscopic surgery, many surgeons are hesitant to perform these procedures on pregnant women. In this report, we present a successful VIL in a pregnant patient, describe our technique, and demonstrate the safety of performing VIL in expectant mothers. To our knowledge, this case represents the first VIL performed in an expectant mother.
ABSTRACT
BACKGROUND: An intervention for Gram-negative bloodstream infections that integrated mass spectrometry technology for rapid diagnosis with antimicrobial stewardship oversight significantly improved patient outcomes and reduced hospital costs. As antibiotic resistance rates continue to grow at an alarming speed, the current study was undertaken to assess the impact of this intervention in a challenging patient population with bloodstream infections caused by antibiotic-resistant Gram-negative bacteria. METHODS: A total of 153 patients with antibiotic-resistant Gram-negative bacteremia hospitalized prior to the study intervention were compared to 112 patients treated post-implementation. Outcomes assessed included time to optimal antibiotic therapy, time to active treatment when inactive, hospital and intensive care unit length of stay, all-cause 30-day mortality, and total hospital expenditures. RESULTS: Integrating rapid diagnostics with antimicrobial stewardship improved time to optimal antibiotic therapy (80.9 h in the pre-intervention period versus 23.2 h in the intervention period, P < 0.001) and effective antibiotic therapy (89.7 h versus 32 h, P < 0.001). Patients in the pre-intervention period had increased duration of hospitalization compared to those in the intervention period (23.3 days versus 15.3 days, P = 0.0001) and longer intensive care unit length of stay (16 days versus 10.7 days, P = 0.008). Mortality among patients during the intervention period was lower (21% versus 8.9%, P = 0.01) and our study intervention remained a significant predictor of survival (OR, 0.3; 95% confidence interval [CI], 0.12-0.79) after multivariate logistic regression. Mean hospital costs for each inpatient survivor were reduced $26,298 in the intervention cohort resulting in an estimated annual cost savings of $2.4 million (P = 0.002). 
CONCLUSIONS: Integration of rapid identification and susceptibility techniques with antimicrobial stewardship resulted in significant improvements in clinical and financial outcomes for patients with bloodstream infections caused by antibiotic-resistant Gram-negative bacteria. The intervention decreased hospital and intensive care unit length of stay and total hospital costs, and reduced all-cause 30-day mortality.
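The reported cost figures admit a simple back-of-the-envelope consistency check: dividing the estimated annual savings by the per-survivor cost reduction gives the approximate number of inpatient survivors per year that the annual estimate assumes. A minimal sketch (the variable names and the annualization logic are our assumptions, not the study's stated method):

```python
# Figures taken from the abstract; the annualization interpretation is an assumption.
saving_per_survivor = 26_298   # mean cost reduction per inpatient survivor (USD)
annual_savings = 2_400_000     # estimated annual cost savings reported (USD)

# Implied number of inpatient survivors per year behind the annual estimate
implied_survivors_per_year = annual_savings / saving_per_survivor
print(round(implied_survivors_per_year))  # prints 91
```

The implied figure (~91 survivors per year) is broadly consistent with the intervention cohort size (112 patients, 8.9% mortality), which supports the plausibility of the reported annual estimate.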
Subjects
Acinetobacter baumannii/isolation & purification , Anti-Bacterial Agents/therapeutic use , Bacteremia/drug therapy , Escherichia coli/isolation & purification , Gram-Negative Bacterial Infections/drug therapy , Klebsiella/isolation & purification , Pseudomonas aeruginosa/isolation & purification , Acinetobacter Infections/diagnosis , Acinetobacter Infections/drug therapy , Acinetobacter Infections/microbiology , Acinetobacter Infections/mortality , Adult , Aged , Bacteremia/diagnosis , Bacteremia/microbiology , Bacteremia/mortality , Bacterial Multidrug Resistance , Escherichia coli/enzymology , Escherichia coli Infections/diagnosis , Escherichia coli Infections/drug therapy , Escherichia coli Infections/microbiology , Escherichia coli Infections/mortality , Female , Gram-Negative Bacterial Infections/diagnosis , Gram-Negative Bacterial Infections/microbiology , Gram-Negative Bacterial Infections/mortality , Hospital Costs , Hospital Mortality , Humans , Intensive Care Units , Klebsiella/enzymology , Klebsiella Infections/diagnosis , Klebsiella Infections/drug therapy , Klebsiella Infections/microbiology , Klebsiella Infections/mortality , Length of Stay , Male , Microbial Sensitivity Tests , Middle Aged , Pseudomonas Infections/diagnosis , Pseudomonas Infections/drug therapy , Pseudomonas Infections/microbiology , Pseudomonas Infections/mortality , Matrix-Assisted Laser Desorption-Ionization Mass Spectrometry , Survival Rate , Time Factors , Time-to-Treatment , Treatment Outcome , beta-Lactam Resistance , beta-Lactamases/metabolism
ABSTRACT
CONTEXT: Early diagnosis of gram-negative bloodstream infections, prompt identification of the infecting organism, and appropriate antibiotic therapy improve patient care outcomes and decrease health care expenditures. In an era of increasing antimicrobial resistance, methods to acquire and rapidly translate critical results into timely therapies for gram-negative bloodstream infections are needed. OBJECTIVE: To determine whether mass spectrometry technology coupled with antimicrobial stewardship provides a substantially improved alternative to conventional laboratory methods. DESIGN: An evidence-based intervention that integrated matrix-assisted laser desorption/ionization time-of-flight mass spectrometry, rapid antimicrobial susceptibility testing, and near-real-time antimicrobial stewardship practices was implemented. Outcomes in patients hospitalized prior to initiation of the study intervention were compared to those in patients treated after implementation. Differences in length of hospitalization and hospital costs were assessed in survivors. RESULTS: The mean hospital length of stay among survivors was 11.9 days in the preintervention group (n = 100) versus 9.3 days in the intervention group (n = 101; P = .01). After multivariate analysis, factors independently associated with decreased length of hospitalization included the intervention (hazard ratio, 1.38; 95% confidence interval, 1.01-1.88) and active therapy at 48 hours (hazard ratio, 2.9; 95% confidence interval, 1.15-7.33). Mean hospital costs per patient were $45,709 in the preintervention group and $26,162 in the intervention group (P = .009). CONCLUSIONS: Integration of rapid identification and susceptibility techniques with antimicrobial stewardship significantly improved time to optimal therapy and decreased hospital length of stay and total costs. This innovative strategy has ramifications for other areas of patient care.
Subjects
Anti-Infective Agents/therapeutic use , Bacteremia/economics , Gram-Negative Bacterial Infections/economics , Hospital Costs/statistics & numerical data , Matrix-Assisted Laser Desorption-Ionization Mass Spectrometry/methods , Adult , Aged , Aged 80 and over , Anti-Infective Agents/economics , Anti-Infective Agents/pharmacology , Bacteremia/diagnosis , Bacteremia/drug therapy , Cost-Benefit Analysis , Early Medical Intervention/economics , Evidence-Based Medicine/economics , Female , Gram-Negative Bacterial Infections/diagnosis , Gram-Negative Bacterial Infections/drug therapy , Hospitalization/economics , Humans , Length of Stay/economics , Length of Stay/statistics & numerical data , Male , Microbial Sensitivity Tests/economics , Middle Aged , Multivariate Analysis , Outcome Assessment (Health Care) , Matrix-Assisted Laser Desorption-Ionization Mass Spectrometry/economics , Texas , Time Factors
ABSTRACT
OBJECTIVE: To study the role of drains in lumbar spine fusions. METHODS: The charts of 402 patients who underwent lumbar decompression and fusion (LDF) were retrospectively reviewed. Patients were classified per International Classification of Diseases, 9th Revision (ICD-9) procedure code as 81.07 (lateral fusion, 74.9%) or 81.08 (posterior fusion, 25.1%). The investigators studied the prevalence of drain use in lumbar fusion procedures and the impact of drain use on postoperative fever, wound infection, posthemorrhagic anemia, blood transfusion, and hospital cost. RESULTS: No significant differences in wound infection rates were noted between patients with and without drains (3.5% vs 2.6%, P = 0.627). The difference in postoperative fever rates between patients with and without drains (63.2% vs 52.6%, P = 0.05) was of borderline significance. Posthemorrhagic anemia was significantly more common in patients with drains (23.5% vs 7.7%, P < 0.001), as was allogeneic blood transfusion (23.9% vs 6.8%, P < 0.001). Postoperative hemoglobin levels were lower in patients with drains who underwent one-level (9.5 g/dL vs 11.3 g/dL) or two-level (9.3 g/dL vs 10.2 g/dL) spine fusions. In this series, in which drains were liberally used, no patient had to return to the operating room because of postoperative hematoma. The rate of allogeneic blood transfusion increased from 5.6% in patients with neither drains nor posthemorrhagic anemia to 38.8% in patients with both drains and posthemorrhagic anemia as a secondary diagnosis. Drain use was associated with statistically insignificant increases in length of stay and cost in posterior procedures, but with shorter length of stay and lower hospital charges in lateral fusions of three or more levels.
CONCLUSIONS: Drain use did not increase the risk of wound infection in patients undergoing LDF, but was associated with a borderline increase in postoperative fever. Drain use was significantly associated with posthemorrhagic anemia and allogeneic blood transfusion. Drain use did not have a significant economic impact on hospital length of stay or charges, except in lateral procedures involving three or more levels.