ABSTRACT
Acetaminophen (APAP) overdose is the most common cause of acute liver failure (ALF) in the United States. Liver transplantation (LT) is potentially lifesaving for patients with ALF, but its feasibility in clinical practice is limited. Liver assist devices, such as the Molecular Adsorbent Recirculating System (MARS), are used in some centers as a "bridge" to liver transplantation or as a means of liver recovery, but their role in the treatment of ALF is not well-defined. We present the case of a 44-year-old man with APAP-associated ALF who experienced hepatic recovery after treatment with MARS.
ABSTRACT
BACKGROUND: Due to development of an immune-dysregulated phenotype, advanced liver disease in all forms predisposes patients to sepsis, including from opportunistic pathogens such as fungi. Few data exist on fungal infection within a medical intensive liver unit (MILU), particularly in relation to acute on chronic liver failure. AIM: To investigate the impact of fungal infections among critically ill patients with advanced liver disease and compare outcomes to those of patients with bacterial infections. METHODS: From our prospective registry of MILU patients from 2018-2022, we included 27 patients with culture-positive fungal infections and 183 with bacterial infections. We compared outcomes between patients admitted to the MILU with fungal infections and their bacterial counterparts. Data were extracted through chart review. RESULTS: All fungal infections were due to Candida species and were most frequently blood isolates. Mortality among patients with fungal infections was significantly worse than in the bacterial cohort (93% vs 52%, P < 0.001). The majority of the fungal cohort developed grade 2 or 3 acute on chronic liver failure (ACLF) (90% vs 64%, P = 0.02). Patients in the fungal cohort had increased use of vasopressors (96% vs 70%, P = 0.04), mechanical ventilation (96% vs 65%, P < 0.001), and dialysis due to acute kidney injury (78% vs 52%, P = 0.014). On MILU admission, the fungal cohort had significantly higher Acute Physiology and Chronic Health Evaluation (108 vs 91, P = 0.003), Acute Physiology Score (86 vs 65, P = 0.003), and Model for End-Stage Liver Disease-Sodium scores (86 vs 65, P = 0.041). There was no significant difference in the rate of central line use preceding culture (52% vs 40%, P = 0.2). Patients with fungal infections had a higher rate of transplant hold placement and lower rates of transplant; however, these differences did not achieve statistical significance. 
CONCLUSION: Mortality was worse among patients with fungal infections, likely attributable to severe ACLF development. Prospective studies examining empiric antifungals in severe ACLF and associations between fungal infections and transplant outcomes are critical.
ABSTRACT
BACKGROUND: Historically, norepinephrine has been administered through a central venous catheter (CVC) because of concerns about the risk of ischemic tissue injury if extravasation from a peripheral IV catheter (PIVC) occurs. Recently, several reports have suggested that peripheral administration of norepinephrine may be safe. RESEARCH QUESTION: Can a protocol for peripheral norepinephrine administration safely reduce the number of days a CVC is in use and the frequency of CVC placement? STUDY DESIGN AND METHODS: This was a prospective observational cohort study conducted in the medical ICU at a quaternary care academic medical center. A protocol for peripheral norepinephrine administration was developed and implemented in the medical ICU at the study site. The protocol was recommended for use in patients who met prespecified criteria but was used at the treating clinician's discretion. All adult patients admitted to the medical ICU receiving norepinephrine through a PIVC from February 2019 through June 2021 were included. RESULTS: The primary outcome was the number of days of CVC use avoided per patient; secondary safety outcomes included the incidence of extravasation events. Six hundred thirty-five patients received peripherally administered norepinephrine. The median number of CVC days avoided per patient was 1 (interquartile range, 0-2 days). Of the 603 patients who received norepinephrine peripherally as their first norepinephrine exposure, 311 (51.6%) never required CVC insertion. Extravasation of norepinephrine occurred in 35 patients (75.8 events/1,000 d of PIVC infusion [95% CI, 52.8-105.4 events/1,000 d of PIVC infusion]). Most extravasations caused no or minimal tissue injury. No patient required surgical intervention. 
INTERPRETATION: This study suggests that implementing a protocol for peripheral administration of norepinephrine can safely avoid 1 CVC day in the average patient, with 51.6% of patients never requiring CVC insertion. No patient experienced significant ischemic tissue injury under the protocol used. These data support performance of a randomized, prospective, multicenter study to characterize the net benefits of peripheral norepinephrine administration compared with administration through a CVC.
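The extravasation confidence interval above can be reproduced from the reported counts. A minimal sketch, assuming the 35 events and the roughly 462 PIVC-infusion days implied by the 75.8/1,000-day rate, using the exact (Garwood) Poisson interval with a Wilson-Hilferty chi-square quantile approximation so only the standard library is needed:

```python
from statistics import NormalDist

def chi2_ppf(p: float, df: int) -> float:
    # Wilson-Hilferty approximation to the chi-square quantile (stdlib only).
    z = NormalDist().inv_cdf(p)
    h = 2.0 / (9.0 * df)
    return df * (1.0 - h + z * h ** 0.5) ** 3

def poisson_rate_ci(events: int, exposure: float, conf: float = 0.95):
    # Exact (Garwood) Poisson CI for a rate, expressed per unit of exposure.
    alpha = 1.0 - conf
    lower = chi2_ppf(alpha / 2, 2 * events) / 2.0
    upper = chi2_ppf(1.0 - alpha / 2, 2 * (events + 1)) / 2.0
    return lower / exposure, upper / exposure

events = 35                          # reported extravasation events
days = events / 75.8 * 1000          # ~461.7 PIVC-days implied by the 75.8/1,000-d rate
lo, hi = poisson_rate_ci(events, days / 1000)  # rate per 1,000 PIVC-days
print(f"{lo:.1f}-{hi:.1f} events/1,000 d")     # ~52.8-105.4, matching the abstract
```

The implied exposure (about 461.7 catheter-days) is back-calculated from the point estimate, since the raw denominator is not stated in the abstract.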
Subjects
Catheterization, Central Venous; Central Venous Catheters; Adult; Humans; Norepinephrine; Prospective Studies; Academic Medical Centers; Catheterization, Central Venous/adverse effects
Subjects
Vitamin K; Humans; Administration, Intravenous; Infusions, Intravenous; Administration, Oral
ABSTRACT
BACKGROUND: Rifaximin is frequently administered to critically ill patients with liver disease and hepatic encephalopathy, but patients currently or recently treated with antibiotics were frequently excluded from studies of rifaximin efficacy. Due to overlapping spectra of activity, combination therapy with broad-spectrum antibiotics and rifaximin may be unnecessary. A pharmacist-driven protocol was piloted to reduce potentially overlapping therapy in critically ill patients with liver disease. It was hypothesized that withholding rifaximin during broad-spectrum antibiotic therapy would be safe and reduce healthcare costs. AIM: To determine the clinical, safety, and financial impact of discontinuing rifaximin during broad-spectrum antibiotic therapy in critically ill patients with liver disease. METHODS: This was a single-center, quasi-experimental, pre-post study based on a pilot pharmacist-driven protocol. Patients in the protocol group were prospectively identified via the medical intensive care unit (MICU) protocol to have rifaximin withheld during broad-spectrum antibiotic treatment. They were compared to a historical cohort who received combination therapy with broad-spectrum antibiotics and rifaximin. All data were collected retrospectively. The primary outcome was days alive and free of delirium and coma (DAFD) to 14 d. Safety outcomes included MICU length of stay, 48-h change in vasopressor dose, and ICU mortality. Secondary outcomes characterized rifaximin cost savings and protocol adherence. Multivariable analysis was used to evaluate the association between group assignment and the primary outcome while controlling for potential confounders. RESULTS: Each group included 32 patients. The median number of delirium- and coma-free days was similar in the control and protocol groups [3 (IQR 0, 8) vs 2 (IQR 0, 9.5), P = 0.93]. 
In multivariable analysis, group assignment was not associated with a reduced ratio of days alive and free of delirium or coma at 14 d. The protocol reduced the median duration of rifaximin use during broad-spectrum antibiotic therapy [6 d control (IQR 3, 9.5) vs 1 d protocol (IQR 0, 1); P < 0.001]. Rates of other secondary clinical and safety outcomes were similar, including ICU mortality and 48-h change in vasopressor requirements. Overall adherence to the protocol was 91.4%. The median estimated total cost of rifaximin therapy per patient was reduced from $758.40 (IQR $379.20, $1200.80) to $126.40 (IQR $0, $126.40), P < 0.01. CONCLUSION: The novel pharmacist-driven protocol for rifaximin discontinuation was associated with significant cost savings and no differences in clinical or safety outcomes, including DAFD.
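The cost figures above are internally consistent with a fixed per-day rifaximin cost: every reported dollar value equals the corresponding treatment-day value times $126.40. A quick check, assuming that hypothetical per-day acquisition cost (implied by the medians, not stated in the abstract):

```python
DAILY_COST = 126.40  # assumed per-day rifaximin cost (USD), implied by the reported medians

def therapy_cost(days: float) -> float:
    # Estimated total rifaximin cost for a given number of treatment days.
    return round(days * DAILY_COST, 2)

# Control: median 6 d (IQR 3, 9.5); protocol: median 1 d (IQR 0, 1)
control = [therapy_cost(d) for d in (3, 6, 9.5)]
protocol = [therapy_cost(d) for d in (0, 1, 1)]
print(control)   # [379.2, 758.4, 1200.8] -- the reported IQR bounds and median
print(protocol)  # [0.0, 126.4, 126.4]
```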
ABSTRACT
A retrospective review of patients unable to take medications by mouth showed short interruptions of therapy for most patients. In a secondary analysis, our data showed maintenance and/or achievement of viral suppression for most patients. A retrospective review of intensive care patients unable to take antiretrovirals by mouth showed that 56.6% of patients experienced a transient interruption in therapy. Additionally, our case series further supports previous literature on crushing dolutegravir and bictegravir regimens to maintain and achieve viral suppression.
ABSTRACT
BACKGROUND: Essential to the coagulation pathway, vitamin K (phytonadione) is used to correct clotting factor deficiencies and to reverse warfarin-induced bleeding. In practice, high-dose intravenous (IV) vitamin K is often used, despite limited evidence supporting repeated dosing. OBJECTIVE: This study sought to characterize differences between responders and nonresponders to high-dose vitamin K to guide dosing strategies. METHODS: This was a case-control study of hospitalized adults who received vitamin K 10 mg IV daily for 3 days. Cases were patients who responded to the first dose of IV vitamin K; controls were nonresponders. The primary outcome was change in international normalized ratio (INR) over time with subsequent vitamin K doses. Secondary outcomes included factors associated with response to vitamin K and the incidence of safety events. The Cleveland Clinic Institutional Review Board approved this study. RESULTS: There were 497 patients included, of whom 182 were responders. Most patients had underlying cirrhosis (91.5%). In responders, the INR decreased from 1.89 at baseline (95% CI = [1.74-2.04]) to 1.40 on day 3 (95% CI = [1.30-1.50]). In nonresponders, the INR decreased from 1.97 (95% CI = [1.83-2.13]) to 1.85 (95% CI = [1.72-1.99]). Factors associated with response included lower body weight, absence of cirrhosis, and lower bilirubin. A low incidence of safety events was observed. CONCLUSIONS: In this study of mainly patients with cirrhosis, the overall adjusted decrease in INR over 3 days was 0.3, which may have minimal clinical impact. Additional studies are needed to identify populations who may benefit from repeated daily doses of high-dose IV vitamin K.
Subjects
Vitamin K; Warfarin; Adult; Humans; Case-Control Studies; Warfarin/therapeutic use; Vitamin K 1/therapeutic use; Vitamin K 1/pharmacology; Blood Coagulation; International Normalized Ratio; Liver Cirrhosis/drug therapy; Anticoagulants/adverse effects
ABSTRACT
Critically ill patients are at an increased risk for developing stress ulcers of the mucosa of the upper gastrointestinal (GI) tract. Bleeding from stress ulcers was previously associated with a longer stay in the intensive care unit and an increased risk of death. Thus, most patients admitted to the intensive care unit receive stress ulcer prophylaxis. However, there is a growing concern that acid-suppression drugs may be associated with increased frequency of nosocomial pneumonia and Clostridioides difficile infection. In this article, the authors address controversies regarding stress ulcer prophylaxis in critically ill patients and provide guidance for its appropriate use in this setting.
Subjects
Peptic Ulcer; Stomach Ulcer; Acute Disease; Critical Illness/therapy; Humans; Intensive Care Units; Peptic Ulcer/complications; Peptic Ulcer/prevention & control; Stomach Ulcer/drug therapy; Stomach Ulcer/etiology; Stomach Ulcer/prevention & control; Ulcer
ABSTRACT
BACKGROUND: The direct comparison of twice daily (BID) and thrice daily (TID) dosing of subcutaneous low dose unfractionated heparin (LDUH) for venous thromboembolism (VTE) prophylaxis in a mixed inpatient population is not well-studied. OBJECTIVE: This study evaluated the effectiveness and safety of BID compared to TID dosing of LDUH for prevention of VTE. METHODS: Retrospective, single-center analysis of patients who received LDUH for VTE prophylaxis between July and September 2015. Outcomes were identified by ICD-9 codes. A matched cohort was created using propensity scores and multivariate analysis was conducted to identify independent risk factors for VTE. The primary outcome was incidence of symptomatic VTE. RESULTS: In the full cohort, VTE occurred in 0.71% of patients who received LDUH BID compared to 0.77% of patients who received LDUH TID (p = 0.85). There was no difference in major (p = 0.85) and minor (p = 0.52) bleeding between the BID and TID groups. For the matched cohort, VTE occurred in 1.4% of BID patients and 2.1% of TID patients (p = 0.32). Major bleed occurred in 0.36% of BID patients and 0.52% of TID patients (p = 0.7), while a minor bleed was seen in 3.4% of BID patients and 2.1% of TID patients (p = 0.13). Personal history of VTE (p = 0.002) and weight (p = 0.035) were independently associated with increased risk of VTE. CONCLUSION: This study did not demonstrate a difference in effectiveness or safety between BID and TID dosing of LDUH for VTE prevention.
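The matched cohort above rests on propensity-score matching. A minimal sketch of the matching step, greedy 1:1 nearest-neighbor matching with a caliper on precomputed scores; the patient scores and caliper width here are hypothetical illustrations, not values from the study:

```python
def greedy_match(treated: dict, control: dict, caliper: float = 0.05):
    """Greedy 1:1 nearest-neighbor matching on propensity scores.

    treated / control map patient id -> propensity score; each control
    is used at most once, and a pair's scores must differ by <= caliper.
    """
    available = dict(control)
    pairs = []
    # A common heuristic: match treated subjects in descending score order.
    for t_id, t_score in sorted(treated.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]
    return pairs

# Hypothetical scores, purely for illustration
bid_scores = {"A": 0.32, "B": 0.57, "C": 0.80}
tid_scores = {"x": 0.30, "y": 0.55, "z": 0.91, "w": 0.62}
print(greedy_match(bid_scores, tid_scores))  # [('B', 'y'), ('A', 'x')]; 'C' has no match within the caliper
```

Unmatched patients (like "C" here) are dropped from the matched analysis, which is why the matched cohort's event rates differ from the full cohort's.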
Subjects
Heparin; Venous Thromboembolism; Anticoagulants; Hemorrhage/chemically induced; Humans; Retrospective Studies; Venous Thromboembolism/epidemiology; Venous Thromboembolism/etiology; Venous Thromboembolism/prevention & control
ABSTRACT
OBJECTIVES: Studies of the use of IV N-acetylcysteine in the management of non-acetaminophen-induced acute liver failure have evaluated various dosing regimens. The only randomized trial studying this application described a 72-hour regimen. However, observational studies have reported extended duration until normalization of international normalized ratio. This study seeks to compare differences in patient outcomes based on IV N-acetylcysteine duration. DESIGN: Retrospective cohort study. SETTING: Medical ICU at a large quaternary care academic medical institution and liver transplant center. PATIENTS: Adult patients admitted to the medical ICU who received IV N-acetylcysteine for the treatment of non-acetaminophen-induced acute liver failure. INTERVENTIONS: Patients were divided into cohorts based on duration; standard duration of IV N-acetylcysteine was considered 72 hours, whereas extended duration was defined as continuation beyond 72 hours. MEASUREMENTS AND MAIN RESULTS: The primary outcome was time to normalization of international normalized ratio to less than 1.3 or less than 1.5; secondary outcomes included all-cause mortality and transplant-free survival at 3 weeks. In total, 53 patients were included: 40 in the standard duration cohort and 13 in the extended duration. There were no major differences in baseline characteristics. There was no significant difference in time to international normalized ratio normalization between cohorts. Transplant-free survival was higher with extended duration (76.9% extended vs 41.4% standard; p = 0.03). All-cause mortality at 3 weeks was numerically lower in the extended duration group (0% extended vs 24.1% standard; p = 0.08). 
CONCLUSIONS: Patients with non-acetaminophen-induced acute liver failure who received extended duration N-acetylcysteine were found to have significantly higher transplant-free survival than patients who received standard duration, although there was no significant difference in time to normalization of international normalized ratio or overall survival. A prospective, randomized, multicenter study is warranted to identify subpopulations of patients with non-acetaminophen-induced acute liver failure who could benefit from extended treatment duration as a bridge to transplant or spontaneous recovery.
ABSTRACT
The rapid spread of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) has led to a global pandemic. Coronavirus disease 2019 (COVID-19) presents with a spectrum of symptoms ranging from mild illness to critical illness requiring intensive care unit (ICU) admission. Acute respiratory distress syndrome is a major complication in patients with severe COVID-19. Currently, there are no recognized pharmacological therapies for COVID-19. However, a large number of COVID-19 patients require respiratory support, with a high percentage requiring invasive ventilation. The rapid spread of the infection has led to a surge in hospitalizations and ICU admissions, which has challenged the public health, research, and medical communities. The high demand for several therapies often utilized in the care of mechanically ventilated COVID-19 patients, including sedatives, analgesics, and paralytics, has strained the supply chain, resulting in shortages of these critical medications. This has led clinicians to develop conservation strategies and explore alternative therapies for sedation, analgesia, and paralysis in COVID-19 patients. Several of these alternative approaches have demonstrated acceptable levels of sedation, analgesia, and paralysis in different settings, but they are not commonly used in the ICU. Additionally, they have unique pharmaceutical properties, limitations, and adverse effects. This narrative review summarizes the literature on alternative drug therapies for the management of sedation, analgesia, and paralysis in COVID-19 patients. It also serves as a resource for clinicians in current and future respiratory illness pandemics in the setting of drug shortages.
Subjects
Analgesics, Opioid/administration & dosage; COVID-19/complications; Hypnotics and Sedatives/administration & dosage; Neuromuscular Blocking Agents/administration & dosage; Respiration, Artificial; Respiratory Distress Syndrome/therapy; Respiratory Distress Syndrome/virology; Critical Illness; Humans; Intensive Care Units; Pandemics; SARS-CoV-2
ABSTRACT
BACKGROUND: Empiric combination antimicrobial therapy is often used in patients with decompensating septic shock. However, the optimal duration of combination therapy is unknown. STUDY QUESTION: The goal of this study was to compare the clinical effects of a single dose of an aminoglycoside to an extended duration of aminoglycosides for combination therapy in patients with septic shock without renal dysfunction. STUDY DESIGN: Retrospective, single-center evaluation of patients with septic shock who received empiric combination therapy with an aminoglycoside. MEASURES AND OUTCOMES: Two patient cohorts were evaluated: those who received a single dose of an aminoglycoside and those who received more than 1 dose of an aminoglycoside. The primary outcome was shock-free days at day 14. Secondary outcomes included mortality, length of stay, clinical cure, and nephrotoxicity. A post hoc subgroup analysis comparing patients who received more than 2 doses of an aminoglycoside with those who received a single dose was also conducted. RESULTS: One hundred fifty-one patients were included in this evaluation, 94 in the single-dose aminoglycoside group and 57 in the extended duration group. There was no difference in shock-free days at day 14 between patients who received a single dose of an aminoglycoside and those who received an extended duration (12.0 vs. 11.6 days; P = 0.56). There were no differences in mortality, length of stay, clinical cure rates, or rates of nephrotoxicity between groups (28% for single dose vs. 26% for extended duration; P = 0.86). No differences in outcomes were detected when evaluating patients who received more than 2 doses of an aminoglycoside compared with a single dose. 
CONCLUSIONS: Patients with septic shock and normal renal function who received a single dose of an aminoglycoside for combination antimicrobial therapy had no differences detected in shock duration or nephrotoxicity development compared with those who received an extended duration of aminoglycoside combination therapy.
ABSTRACT
Background: The Centers for Disease Control and Prevention recommends 3 months of once-weekly rifapentine/isoniazid (3HP) for latent tuberculosis infection (LTBI) treatment given by directly observed therapy (DOT) or self-administered therapy (SAT) in patients ≥2 years old. 3HP has been associated with an increased incidence of hepatic, gastrointestinal, flu-like, and cutaneous adverse drug reactions (ADRs) compared with isoniazid monotherapy. Objective: This study evaluated 3HP completion rates and tolerability for LTBI treatment in a real-world setting. Methods: A single-center retrospective cohort with a nested case-control study, comparing patients experiencing ADRs with those who did not, evaluated patients ≥18 years old receiving 3HP by DOT or SAT for LTBI at Cleveland Clinic from October 2011 through July 2018. Information on baseline characteristics, 3HP administrations, and ADRs was collected. Results: Of 199 patients screened, 144 were included (111 DOT, 33 SAT). 3HP completion rates were high at 82.6% and similar between the DOT and SAT groups. During treatment, 92/144 (63.9%) patients experienced an ADR. The most common ADRs included flu-like symptoms (38.2%) and gastrointestinal (31.9%) and hepatic (2.1%) reactions. Despite the high rate of overall ADRs, the rate of significant ADRs (grade 2 or higher) was 4.2%. Overall, 9% of patients discontinued 3HP because of ADRs. After adjusting for other factors associated with ADRs at baseline, SAT was not associated with increased incidence of ADRs, but female sex was a significant predictor (odds ratio = 2.61 [95% CI, 1.23 to 5.56]). Conclusion and Relevance: This study observed high 3HP treatment completion rates, a low incidence of significant ADRs, and low discontinuation rates resulting from ADRs.
Subjects
Antitubercular Agents/therapeutic use; Drug-Related Side Effects and Adverse Reactions/etiology; Isoniazid/therapeutic use; Latent Tuberculosis/drug therapy; Rifampin/analogs & derivatives; Adult; Antitubercular Agents/administration & dosage; Antitubercular Agents/adverse effects; Case-Control Studies; Directly Observed Therapy/methods; Drug Administration Schedule; Drug Therapy, Combination; Drug-Related Side Effects and Adverse Reactions/epidemiology; Female; Gastrointestinal Tract/drug effects; Humans; Isoniazid/administration & dosage; Isoniazid/adverse effects; Latent Tuberculosis/epidemiology; Liver/drug effects; Male; Middle Aged; Retrospective Studies; Rifampin/administration & dosage; Rifampin/adverse effects; Rifampin/therapeutic use; Self Administration
ABSTRACT
BACKGROUND: Despite a lack of data from intensive care patients, bispectral index monitors are often used to measure the depth of sedation for critically ill patients with acute respiratory distress syndrome (ARDS) who require continuous neuromuscular blocking agents. OBJECTIVE: To evaluate differences in the effectiveness and safety of monitoring sedation by using bispectral index or traditional methods in patients with ARDS who are receiving continuous neuromuscular blocking agents. METHODS: This noninterventional, single-center, retrospective cohort study included adult patients with ARDS who received a neuromuscular blocking agent. Daily sedation and analgesia while a neuromuscular blocking agent was being administered were compared between patients with and patients without orders for titration based on bispectral index values. Clinical outcomes were also evaluated. RESULTS: Overall, sedation and analgesia did not differ between patients with and patients without titration based on bispectral index. Compared with patients without such titration, patients with bispectral index-based titration experienced more dose adjustments for the sedation agent (median [interquartile range], 7 [4-11] vs 1 [0-5], respectively; P < .001) and the analgesic (1 [0-2] vs 0 [0-1], respectively; P = .003) during the first 24 hours of neuromuscular blockade, but this was not associated with any difference in clinical outcomes. CONCLUSIONS: Titration based on bispectral index did not result in a significant difference in sedation or analgesia exposure, or clinical outcomes, from that achieved with traditional sedation monitoring in patients with ARDS who were receiving a neuromuscular blocking agent, despite more dose adjustments during the first 24 hours of receiving the neuromuscular blocking agent.
Subjects
Conscious Sedation/methods; Critical Care/methods; Electroencephalography/methods; Neuromuscular Blockade/methods; Pain/drug therapy; Respiratory Distress Syndrome/complications; Adult; Cohort Studies; Critical Illness; Female; Humans; Male; Middle Aged; Neuromuscular Blocking Agents/administration & dosage; Retrospective Studies
ABSTRACT
BACKGROUND: Patients with liver disease are concomitantly at increased risk of venous thromboembolism (VTE) and bleeding events due to changes in the balance of pro- and anti-hemostatic substances. As such, recommendations for the use of pharmacological VTE prophylaxis are lacking. Recent studies have found no difference in rates of VTE between those receiving and not receiving pharmacological VTE prophylaxis, though most studies have been small. Thus, our study sought to establish if pharmacological VTE prophylaxis is effective and safe in patients with liver disease. AIM: To determine if there is net clinical benefit to providing pharmacological VTE prophylaxis to cirrhotic patients. METHODS: In this retrospective study, 1806 patients were propensity matched to assess if pharmacological VTE prophylaxis is effective and safe in patients with cirrhosis. Patients were divided and evaluated based on receipt of pharmacological VTE prophylaxis. RESULTS: The composite primary outcome of VTE or major bleeding was more common in the no prophylaxis group than the prophylaxis group (8.7% vs 5.1%, P = 0.002), though this outcome was driven by higher rates of major bleeding (6.9% vs 2.9%, P < 0.001) rather than VTE (1.9% vs 2.2%, P = 0.62). There was no difference in length of stay or in-hospital mortality between groups. Pharmacological VTE prophylaxis was independently associated with lower rates of major bleeding (OR = 0.42, 95% CI: 0.25-0.68, P = 0.0005), but was not protective against VTE on multivariable analysis. CONCLUSION: Pharmacological VTE prophylaxis was not associated with a significant reduction in the rate of VTE in patients with liver disease, though no increase in major bleeding events was observed.
ABSTRACT
Treatment of suspected infections in critically ill patients requires the timely initiation of appropriate antimicrobials and rapid de-escalation of unnecessary broad-spectrum coverage. New advances in rapid diagnostic tests now offer earlier detection of pathogens and potential resistance mechanisms within hours of initial culture growth. These technologies, combined with pharmacist antimicrobial stewardship efforts, may shorten the time to adequate coverage or allow earlier de-escalation of unnecessary broad-spectrum antimicrobials, which could improve patient outcomes and lower overall treatment costs. Furthermore, de-escalation of antimicrobials may lead to decreased emergence of resistant organisms and fewer antimicrobial-associated adverse events. Clinical pharmacists should be aware of new rapid diagnostic tests, including their applications, clinical evidence, and limitations, in order to implement the most appropriate treatment strategy when patients have positive cultures. This review focuses on commercially available rapid diagnostic tests for infections routinely encountered in critically ill patients, including gram-positive and gram-negative bacterial bloodstream infections, Candida, and Clostridioides difficile.
Subjects
Communicable Diseases/diagnosis; Communicable Diseases/drug therapy; Diagnostic Tests, Routine/methods; Diagnostic Tests, Routine/standards; Intensive Care Units/standards; Anti-Bacterial Agents/therapeutic use; Antimicrobial Stewardship; Bacteremia/diagnosis; Critical Illness; Cross Infection/diagnosis; Humans; Microbial Sensitivity Tests
ABSTRACT
BACKGROUND: Vasopressin is often utilized for hemodynamic support in patients with septic shock. However, the most appropriate patients in whom to initiate therapy are unknown. This study was conducted to determine factors associated with hemodynamic response to fixed-dose vasopressin in patients with septic shock. METHODS: Single-center, retrospective cohort of patients receiving fixed-dose vasopressin for septic shock for at least 6 h with concomitant catecholamines in the medical, surgical, or neurosciences intensive care unit (ICU) at a tertiary care center. Patients were classified as responders or non-responders to fixed-dose vasopressin. Response was defined as a decrease in catecholamine dose requirements and achievement of mean arterial pressure ≥ 65 mmHg at 6 h after initiation of vasopressin. RESULTS: A total of 938 patients were included: 426 responders (45%) and 512 non-responders (55%). Responders had lower rates of in-hospital (57 vs. 72%; P < 0.001) and ICU mortality (50 vs. 68%; P < 0.001), and increased ICU-free days at day 14 and hospital-free days at day 28 (2.3 ± 3.8 vs. 1.6 ± 3.3; P < 0.001 and 4.2 ± 7.2 vs. 2.8 ± 6.0; P < 0.001, respectively). On multivariable analysis, non-medical ICU location was associated with increased response odds (OR 1.70; P = 0.0049) and lactate at vasopressin initiation was associated with decreased response odds (OR 0.93; P = 0.0003). Factors not associated with response included APACHE III score, SOFA score, corticosteroid use, and catecholamine dose. CONCLUSION: In this evaluation, 45% responded to the addition of vasopressin with improved outcomes compared to non-responders. The only factors found to be associated with vasopressin response were ICU location and lactate concentration.
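The responder definition above is a simple two-part decision rule applied at the 6-hour mark. A sketch of that rule, with hypothetical record and field names for illustration:

```python
from dataclasses import dataclass

@dataclass
class Assessment:
    """Hypothetical record; field names are illustrative, not from the study."""
    dose_baseline: float  # catecholamine dose at vasopressin initiation
    dose_6h: float        # catecholamine dose 6 h after initiation
    map_6h: float         # mean arterial pressure (mmHg) at 6 h

def is_responder(a: Assessment) -> bool:
    # Response = decreased catecholamine requirement AND MAP >= 65 mmHg at 6 h.
    return a.dose_6h < a.dose_baseline and a.map_6h >= 65

print(is_responder(Assessment(30, 18, 72)))  # True: dose fell and MAP goal met
print(is_responder(Assessment(30, 35, 70)))  # False: catecholamine dose escalated
```

Both criteria must hold; a patient whose dose falls but whose MAP stays below 65 mmHg is still classified as a non-responder.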
ABSTRACT
STUDY OBJECTIVES: To describe compliance with antibiotic recommendations based on a previously published procalcitonin (PCT)-guided algorithm in clinical practice, to compare PCT algorithm compliance rates between PCT assays ordered in the antibiotic initiation setting (PCT concentration measured less than 24 hours after antibiotic initiation or before antibiotic initiation) with those in the antibiotic continuation setting (PCT concentration measured 24 hours or more after antibiotic initiation), and to evaluate patient- and PCT-related factors independently associated with algorithm compliance in patients in the medical intensive care unit (MICU). DESIGN: Single-center retrospective cohort study. SETTING: Large MICU in a tertiary care academic medical center. PATIENTS: A total of 527 adults admitted to the MICU over a 2-year period (November 1, 2011-October 31, 2013) who had a total of 957 PCT assays performed. PCT assays whose results were determined in the MICU were allocated retrospectively to either the initiation setting cohort or the continuation setting cohort based on timing of the PCT assay. MEASUREMENTS AND MAIN RESULTS: Each PCT assay was treated as a separate episode. Antibiotic regimens were compared between the 24-hour periods before and after the results of each PCT assay and evaluated against an algorithm to determine compliance. Clinical, laboratory, PCT-related, and microbiologic variables were assessed during the 24-hour period after the PCT assay results to determine their influence on PCT algorithm compliance. A larger proportion of PCT episodes occurred in the initiation setting (540 [56.4%]) than in the continuation setting (417 [43.5%]). Overall, compliance with PCT algorithm recommendations was low (48.5%) and not significantly different between the initiation setting and the continuation setting (49.1% vs 47.7%, p=0.678). 
No patient-related or PCT-related factors were independently associated with PCT algorithm compliance on multivariable logistic regression. CONCLUSION: Compliance with PCT algorithm antibiotic recommendations in both the initiation and continuation settings was lower than that reported in published randomized studies. No factors were independently associated with PCT algorithm compliance. Institutions using PCT assays to guide antibiotic use should assess compliance with algorithm antibiotic recommendations. Inclusion of a formalized antimicrobial stewardship program along with a PCT-guided algorithm is highly recommended.
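PCT-guided algorithms of the kind described above map each assay result to a graded antibiotic recommendation, and compliance is judged by whether the subsequent regimen matches it. A sketch using the 0.25/0.5/1.0 µg/L cut-offs common in published sepsis algorithms; the study's exact thresholds are not stated in the abstract, so these values are assumptions for illustration:

```python
def pct_recommendation(pct_ug_per_l: float) -> str:
    # Graded guidance by PCT level; cut-offs (0.25/0.5/1.0 ug/L) are
    # illustrative values from commonly published sepsis algorithms.
    if pct_ug_per_l < 0.25:
        return "antibiotics strongly discouraged"
    if pct_ug_per_l < 0.5:
        return "antibiotics discouraged"
    if pct_ug_per_l < 1.0:
        return "antibiotics encouraged"
    return "antibiotics strongly encouraged"

def is_compliant(pct_ug_per_l: float, antibiotics_continued: bool) -> bool:
    # An episode is algorithm-compliant when the 24-h post-assay regimen
    # matches the recommendation (antibiotics recommended at >= 0.5 ug/L here).
    return antibiotics_continued == (pct_ug_per_l >= 0.5)

print(pct_recommendation(0.1))   # antibiotics strongly discouraged
print(is_compliant(0.1, False))  # True: antibiotics stopped as advised
print(is_compliant(2.0, False))  # False: stopped despite a high PCT
```

A compliance rate like the 48.5% reported above is then simply the fraction of episodes for which this check returns True.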