Results 1 - 20 of 28
1.
Ann Emerg Med ; 2024 Jun 11.
Article in English | MEDLINE | ID: mdl-38864781

ABSTRACT

STUDY OBJECTIVE: To evaluate whether out-of-hospital administration of fentanyl and intranasal ketamine, compared to fentanyl alone, improves early pain control after injury. METHODS: We conducted an out-of-hospital randomized, placebo-controlled, blinded, parallel group clinical trial from October 2017 to December 2021. Participants were male, aged 18 to 65 years, receiving fentanyl to treat acute traumatic pain prior to hospital arrival, treated by an urban fire-based emergency medical services agency, and transported to the region's only adult Level I trauma center. Participants randomly received 50 mg intranasal ketamine or placebo. The primary outcome was the proportion with a minimum 2-point reduction in self-described pain on the verbal numerical rating scale 30 minutes after study drug administration, assessed by 95% confidence interval overlap. Secondary outcomes were side effects, pain ratings, and additional pain medications through the first 3 hours of care. RESULTS: Among the 192 participants enrolled, 89 (46%) were White (median age, 36 years; interquartile range, 27 to 53 years), with 103 receiving ketamine and 89 receiving placebo. There was no difference in the proportion experiencing improved pain 30 minutes after treatment (46/103 [44.7%] ketamine versus 32/89 [36.0%] placebo; difference in proportions, 8.7%; 95% confidence interval, -5.1% to 22.5%; P=.22) or at any time point through 3 hours. There was no difference in secondary outcomes or side effects. CONCLUSION: In our sample, we did not detect an analgesic benefit of adding 50 mg intranasal ketamine to fentanyl in out-of-hospital trauma patients.
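The unadjusted comparison in this abstract can be reproduced from the reported counts; the sketch below (assuming a standard Wald interval for a difference of two independent proportions) recovers the published 8.7% difference and -5.1% to 22.5% confidence interval:

```python
from math import sqrt

def diff_in_proportions_ci(x1, n1, x2, n2, z=1.96):
    """Point estimate and Wald 95% CI for p1 - p2 from two independent samples."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Responders from the abstract: 46/103 ketamine vs 32/89 placebo.
diff, lo, hi = diff_in_proportions_ci(46, 103, 32, 89)
print(f"{diff:.1%} (95% CI {lo:.1%} to {hi:.1%})")  # 8.7% (95% CI -5.1% to 22.5%)
```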

2.
J Surg Res ; 281: 104-111, 2023 01.
Article in English | MEDLINE | ID: mdl-36152398

ABSTRACT

INTRODUCTION: Screening for blunt cardiac injury (BCI) includes obtaining a serum troponin level and an electrocardiogram for patients diagnosed with a sternal fracture. Our institution has transitioned to the use of a high-sensitivity troponin I (hsTnI). The aim of this study was to determine whether hsTnI is comparable to troponin I (TnI) in identifying clinically significant BCI. MATERIALS AND METHODS: Trauma patients presenting to a Level I trauma center over a 24-mo period with the diagnosis of sternal fracture were screened for BCI. Any initial TnI more than 0.04 ng/mL or hsTnI more than 18 ng/L was considered positive for potential BCI. Clinically significant BCI was defined as a new bundle branch block, ST wave change, echocardiogram change, or need for cardiac catheterization. RESULTS: Two hundred sixty-five patients with a sternal fracture were identified; 161 underwent screening with TnI and 104 with hsTnI. For TnI, the sensitivity and specificity for detection of clinically significant BCI were 0.80 and 0.79, respectively. For hsTnI, the sensitivity and specificity were 0.71 and 0.69, respectively. A multivariate analysis demonstrated that the odds ratio for significant BCI with a positive TnI was 14.4 (95% confidence interval, 3.9-55.8, P < 0.0001) versus an odds ratio of 5.48 (95% confidence interval 1.9-15.7, P = 0.002) in the hsTnI group. CONCLUSIONS: The sensitivity of hsTnI is comparable to TnI for detection of significant BCI. Additional investigation is needed to determine the necessity and interval for repeat testing and the need for additional diagnostic testing.


Subject(s)
Myocardial Contusions, Thoracic Injuries, Humans, Troponin I, Sensitivity and Specificity, Electrocardiography, Biomarkers
3.
Am J Ther ; 30(2): e95-e102, 2023.
Article in English | MEDLINE | ID: mdl-34387562

ABSTRACT

BACKGROUND: Altered drug and nutrient absorption presents a unique challenge in critically ill patients. Performing an acetaminophen absorption test (AAT) has been used as a marker for gastric motility and upper small bowel absorption; thus, it may provide objective data regarding enteral absorptive ability in critically ill patients. STUDY QUESTION: What is the clinical experience with AAT when used as a surrogate marker for enteral absorption in critically ill patients? STUDY DESIGN: This single-center, retrospective, cohort study evaluated serum acetaminophen concentrations within 180 minutes following one-time enteral administration of an AAT. Patients admitted to the surgical and medical intensive care units over a 7-year period were evaluated. Groups were defined as positive (acetaminophen concentration of ≥10 mg/L) or negative (acetaminophen concentration of <10 mg/L) AAT. MEASURES AND OUTCOMES: The outcomes were to describe the clinical experience, characteristics, and performance of AAT. RESULTS: Forty-eight patients were included. Patients were 58.5 ± 14 years of age, mostly male (58.3%), and admitted to the surgical intensive care unit (66.7%). Median hospital length of stay was 47.5 (27-78.8) days. Thirty-four patients (70.8%) had a positive AAT [median concentration, 14 (12-18) mg/L]. Median time to first detectable concentration was 37 (33-64) minutes. AAT characteristics, including total dose, weight-based dose, time to first and second assays, drug formulation, and site of administration, were similar between groups. Regression analysis identified no independent risk factors for a negative AAT. CONCLUSIONS: An acetaminophen dose of 15 mg/kg with 2 coordinated serum concentrations approximately 30 and 60 minutes after administration is a reasonable construct for AAT.
Future research is needed to assess AAT utility, safety, and clinical outcomes for predicting patient ability to absorb enteral feeds and medications.
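The test construct described in the conclusion reduces to a simple decision rule. A minimal sketch, using the 15 mg/kg dose and 10 mg/L positivity threshold stated in the abstract (the patient weight and concentrations below are hypothetical):

```python
def aat_dose_mg(weight_kg, mg_per_kg=15):
    """Enteral acetaminophen dose for the absorption test (15 mg/kg)."""
    return weight_kg * mg_per_kg

def aat_positive(concentrations_mg_l, threshold=10.0):
    """Positive AAT: any serum level at or above 10 mg/L after dosing."""
    return any(c >= threshold for c in concentrations_mg_l)

# Hypothetical 80-kg patient sampled ~30 and ~60 minutes post-dose:
print(aat_dose_mg(80))             # 1200 mg
print(aat_positive([12.0, 14.0]))  # True  -> suggests intact absorption
print(aat_positive([2.0, 4.0]))    # False
```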


Subject(s)
Acetaminophen, Critical Illness, Humans, Male, Female, Critical Illness/therapy, Cohort Studies, Retrospective Studies, Enteral Nutrition, Intensive Care Units
4.
Antimicrob Agents Chemother ; 66(1): e0161121, 2022 01 18.
Article in English | MEDLINE | ID: mdl-34662194

ABSTRACT

Patients admitted to the intensive care unit (ICU) may need continuous renal replacement therapy (CRRT) due to acute kidney injury or worsening of underlying chronic kidney disease. This will affect their antimicrobial exposure and may have a significant impact on treatment. We aimed to develop a cefepime pharmacokinetic (PK) model in ICU patients receiving CRRT, generate posterior predictions for a group of these patients, and assess their therapy outcomes. Adult patients who were admitted to the ICU, received cefepime, and had its concentration measured while on CRRT were included from three different data sets. In two data sets, samples were collected from the predialyzer ports, postdialyzer ports, and effluent fluid at different times within the same dosing interval. The third data set had only cefepime plasma concentrations measured as part of clinical service. Patients' demographics, cefepime regimens and concentrations, CRRT parameters, and therapy outcomes were recorded. NPAG was used for population PK and posterior predictions. A total of 125 patients were included. Cefepime was described by a five-compartment model, and the CRRT flow rates described the rates of cefepime transfer between compartments. The posterior predictions were generated for the third data set; the median (range) fT>MIC was 100% (27%-100%), and fT>4×MIC was 64% (0%-100%). The mortality rate was 53%. Target attainment was not associated with differences in clinical cure or 30-day mortality. This model can be used as a precision dosing tool in CRRT patients. Future studies may address other PK/PD targets in a larger population.


Subject(s)
Acute Kidney Injury, Continuous Renal Replacement Therapy, Acute Kidney Injury/drug therapy, Adult, Anti-Bacterial Agents/pharmacokinetics, Cefepime/therapeutic use, Critical Illness/therapy, Humans, Intensive Care Units, Renal Replacement Therapy
5.
J Surg Res ; 280: 234-240, 2022 12.
Article in English | MEDLINE | ID: mdl-36007482

ABSTRACT

INTRODUCTION: While the pillars of trauma resuscitation are surgical hemostasis and blood product administration, norepinephrine (NE) can be used as an adjunct. The goal of this study was to evaluate the relationship between the maximum dose of NE, timing of NE administration, and mortality in trauma patients. METHODS: Patients admitted between January 2013 and January 2021 treated with NE were reviewed. Univariate and multivariate logistic regression were used to assess whether maximum NE dose was independently associated with mortality. Optimal dosage rates for NE were determined via the Youden index. Subgroup analyses comparing those who received NE within versus after the first 24 h of admission were conducted. RESULTS: Three hundred fifty-one trauma patients were included, with 217 (62%) surviving. Patients who died received an average maximum dose of 16.7 mcg/min compared to 9.1 mcg/min in survivors (P = 0.0003). Mortality rate increased with dosage (P < 0.0001), with doses greater than 20 mcg/min having 79% mortality. Those who received NE within the first 24 h had an inflection point in mortality at 16 mcg/min (Youden = 0.45) (OR 1.06; 95% CI 1.03-1.10). For patients who received NE after the first 24 h, the inflection point in mortality was at 10 mcg/min (Youden = 0.34) (OR 1.09; 95% CI 1.04-1.14). CONCLUSIONS: Higher maximum doses of NE were associated with increased mortality. Patients initiated on NE more than 24 h into their admission displayed an inflection point at a lower dose than those initiated earlier. This suggests that trauma patients initiated on NE after 24 h from injury may have a dire prognosis.
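The Youden index used here selects the cutoff maximizing sensitivity + specificity - 1. A sketch with hypothetical dose/mortality data (not the study's), treating dose at or above the cutoff as a positive test for death:

```python
def youden_cutoff(doses, died):
    """Return (cutoff, J) maximizing Youden's J = sensitivity + specificity - 1,
    where 'dose >= cutoff' predicts mortality."""
    best_cut, best_j = None, -1.0
    for cut in sorted(set(doses)):
        tp = sum(d >= cut and y for d, y in zip(doses, died))
        fn = sum(d < cut and y for d, y in zip(doses, died))
        tn = sum(d < cut and not y for d, y in zip(doses, died))
        fp = sum(d >= cut and not y for d, y in zip(doses, died))
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        if sens + spec - 1 > best_j:
            best_cut, best_j = cut, sens + spec - 1
    return best_cut, best_j

# Hypothetical maximum NE rates (mcg/min) and death indicators:
doses = [5, 8, 12, 16, 18, 22, 25]
died = [0, 0, 0, 1, 1, 1, 1]
print(youden_cutoff(doses, died))  # (16, 1.0)
```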


Subject(s)
Norepinephrine, Vasoconstrictor Agents, Humans, Norepinephrine/therapeutic use, Resuscitation
6.
Prehosp Emerg Care ; 26(3): 422-427, 2022.
Article in English | MEDLINE | ID: mdl-34028315

ABSTRACT

Background: All medications should be stored within temperature ranges defined by manufacturers, but logistical and operational challenges of prehospital and military settings complicate adherence to these recommendations. Lorazepam and succinylcholine experience clinically relevant heat-related degradation, whereas midazolam does not. Because ketamine's stability when stored outside manufacturer recommendations is unknown, we evaluated the heat-related degradation of ketamine exposed to several temperature ranges. Methods: One hundred twenty vials of ketamine (50 mg/mL labeled concentration) from the same manufacturer lot were equally distributed and stored for six months in five environments: an active EMS unit in southwest Ohio (May-October 2019); a heat chamber at a constant 120 °F (C1); a heat chamber fluctuating over 24 hours from 86 °F-120 °F (C2); a heat chamber fluctuating over 24 hours from 40 °F-120 °F (C3); and a heat chamber kept at a constant 70 °F (manufacturer-recommended room temperature, C4). Four ketamine vials were removed every 30 days from each environment and sent to an FDA-accredited commercial lab for high-performance liquid chromatography testing. Data loggers and thermistors allowed temperature recording every minute for all environments. Cumulative heat exposure was quantified by mean kinetic temperature (MKT), which accounts for additional heat stress over time caused by temperature fluctuations and is a superior measure to simple ambient temperature. MKT was calculated for each environment at the time of ketamine removal. Descriptive statistics were used to describe the concentration changes at each time point. Results: The MKT ranged from 73.6 °F-80.7 °F in the active EMS unit and stayed constant for each chamber (C1 MKT: 120 °F, C2 MKT: 107.3 °F, C3 MKT: 96.5 °F, C4 MKT: 70 °F). No significant absolute ketamine degradation, or trends in degradation, occurred in any environment at any time point.
The lowest median concentration occurred in the EMS-stored samples removed after 6 months [48.2 mg/mL (47.75, 48.35)], or 96.4% relative strength to labeled concentration. Conclusion: Ketamine samples exhibited limited degradation after 6 months of exposure to real world and simulated extreme high temperature environments exceeding manufacturer recommendations. Future studies are necessary to evaluate ketamine stability beyond 6 months.
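Mean kinetic temperature weights each reading by Arrhenius-style heat stress, which is why it exceeds the arithmetic mean when temperatures fluctuate. A sketch using the conventional heat-of-activation constant ΔH ≈ 83.144 kJ/mol (the logger readings below are hypothetical):

```python
from math import exp, log

def mean_kinetic_temperature(temps_c, delta_h=83.144e3, r=8.3144):
    """MKT (deg C) from Celsius readings; delta_h is the conventional
    heat of activation (J/mol), r the gas constant (J/mol/K)."""
    temps_k = [t + 273.15 for t in temps_c]
    mean_stress = sum(exp(-delta_h / (r * tk)) for tk in temps_k) / len(temps_k)
    return delta_h / (-r * log(mean_stress)) - 273.15

# A day swinging between cool and hot stresses product more than
# its simple average suggests:
readings = [20.0, 25.0, 40.0]  # hypothetical logger values, deg C
mkt = mean_kinetic_temperature(readings)
print(round(mkt, 1))  # higher than the 28.3 deg C arithmetic mean
```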


Subject(s)
Emergency Medical Services, Ketamine, Drug Stability, Drug Storage, Hot Temperature, Humans, Temperature
7.
Article in English | MEDLINE | ID: mdl-33722885

ABSTRACT

Sepsis causes half of acute kidney injuries in the intensive care unit (ICU). ICU patients may need continuous renal replacement therapy (CRRT), which will affect their antimicrobial exposure. We aimed to build a cefepime population pharmacokinetic (PK) model in CRRT ICU patients and perform simulations to assess target attainment. Patients who were ≥18 years old, were admitted to the ICU, and received cefepime 2 g every 8 h as a 4-h infusion while on CRRT were enrolled prospectively. Samples were collected from the predialyzer ports, postdialyzer ports, and effluent fluid at 1, 2, 3, 4, and 8 h after the first dose and at steady state. Age, sex, weight, urine output, and CRRT parameters were recorded. Pmetrics was used for population PK and simulations. The target exposure was a free beta-lactam concentration above the MIC for 100% of the dosing interval (100% fT>MIC). Ten patients were included; their mean age was 53 years, and mean weight was 119 kg. Seventy percent were males. Cefepime was described by a five-compartment model. The downtime was applied to the CRRT flow rates, which were used to describe the rates of transfer between the compartments. At MICs of ≤8 mg/liter, intermittent infusion of 2 g cefepime every 8 h achieved good target attainment both early in therapy and at steady state. Only extended- and continuous-infusion regimens achieved good target attainment at MICs of 16 mg/liter. In conclusion, 2 g cefepime infused over 30 min followed by extended infusion of 2 g every 8 h achieved good target attainment at MICs of ≤16 mg/liter with different CRRT flow rates and may be considered in resistant bacterial infections.
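Target attainment figures like %fT>MIC can be illustrated with a much simpler model than the study's five-compartment CRRT model. The sketch below is a generic one-compartment repeated-infusion simulation; the clearance, volume, and unbound-fraction values are hypothetical placeholders, not the study's estimates:

```python
from math import exp

def percent_ft_above_mic(dose_mg, tau_h, tinf_h, cl_l_h, v_l, mic_mg_l,
                         fu=0.8, n_doses=20, dt_h=0.05):
    """%fT>MIC over the last dosing interval of a one-compartment model
    with repeated zero-order infusions; fu = unbound (free) fraction."""
    k = cl_l_h / v_l
    r0 = dose_mg / tinf_h  # infusion rate, mg/h

    def conc(t_h):  # superpose the contributions of all prior doses
        total = 0.0
        for i in range(n_doses):
            ts = t_h - i * tau_h  # time since dose i began
            if ts <= 0:
                continue
            if ts <= tinf_h:  # still infusing
                total += (r0 / cl_l_h) * (1 - exp(-k * ts))
            else:  # post-infusion monoexponential decay
                total += (r0 / cl_l_h) * (1 - exp(-k * tinf_h)) * exp(-k * (ts - tinf_h))
        return total

    start = (n_doses - 1) * tau_h  # near steady state by the last dose
    times = [start + j * dt_h for j in range(round(tau_h / dt_h))]
    return 100.0 * sum(fu * conc(t) > mic_mg_l for t in times) / len(times)

# Hypothetical CL = 7 L/h, V = 30 L; 2 g q8h at MIC 16 mg/L:
short = percent_ft_above_mic(2000, 8, 0.5, 7, 30, 16)  # 30-min infusion
long_ = percent_ft_above_mic(2000, 8, 4.0, 7, 30, 16)  # 4-h extended infusion
print(short, long_)  # the extended infusion sustains free levels longer
```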


Subject(s)
Continuous Renal Replacement Therapy, Adolescent, Anti-Bacterial Agents/therapeutic use, Cefepime, Critical Illness, Female, Humans, Male, Microbial Sensitivity Tests, Middle Aged, Monte Carlo Method, Renal Replacement Therapy
8.
J Surg Res ; 265: 139-146, 2021 09.
Article in English | MEDLINE | ID: mdl-33940236

ABSTRACT

BACKGROUND: There is no consensus on what dose of norepinephrine corresponds with futility. The purpose of this study was to investigate the maximum infusion and cumulative doses of norepinephrine associated with survival for patients in medical and surgical intensive care units (MICU and SICU). MATERIALS AND METHODS: A retrospective review was conducted of 661 critically ill patients admitted to a large academic medical center who received norepinephrine. Univariate, multivariate, and area-under-the-curve analyses were performed, with optimal cutoffs for maximum infusion rate and cumulative dosage determined by the Youden index. RESULTS: The population was 54.9% male, 75.8% white, and 58.7 ± 16.1 y old, with 384 (69.8%) admitted to the MICU and 166 (30.2%) admitted to the SICU, including 38 trauma patients. Inflection points in mortality were seen at 18 mcg/min and 17.6 mg. The inflection point was higher in MICU patients at 21 mcg/min and lower in SICU patients at 11 mcg/min. MICU patients also had a higher maximum cumulative dosage of 30.7 mg, compared to 2.7 mg in SICU patients. In trauma patients, norepinephrine infusions up to 5 mcg/min were associated with a 41.7% mortality rate. CONCLUSION: A maximum rate of 18 mcg/min and cumulative dose of 17.6 mg were the inflection points for mortality risk in ICU patients, with SICU patients tolerating lower doses. In trauma patients, even low doses of norepinephrine were associated with higher mortality. These data suggest that MICU, SICU, and trauma patients differ in need for, response to, and outcome from escalating norepinephrine doses.


Subject(s)
Adrenergic alpha-Agonists/administration & dosage, Critical Illness/therapy, Medical Futility, Norepinephrine/administration & dosage, Wounds and Injuries/mortality, Adult, Aged, Female, Humans, Male, Middle Aged, Ohio/epidemiology, Retrospective Studies, Wounds and Injuries/drug therapy
9.
Hosp Pharm ; 56(5): 560-568, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34720161

ABSTRACT

Background: Induction of antibiotic resistance is associated with increased morbidity and mortality in AmpC β-lactamase-producing Enterobacteriaceae. The use of ceftriaxone is controversial for treatment of these organisms due to concerns for inducible resistance. This study was designed to compare treatment failure rates between ceftriaxone and antipseudomonal β-lactam antibiotics when used as definitive therapy for organisms most commonly associated with chromosomal AmpC β-lactamase production. Methods: A retrospective, single-center cohort study was performed enrolling patients hospitalized with monomicrobial Enterobacter, Citrobacter, or Serratia spp. infections. The primary objective compared the proportion of treatment failure between groups. All patients received either ceftriaxone or an antipseudomonal β-lactam alone within 24 hours of culture finalization, with a duration of at least 72 hours for definitive treatment. Treatment failure was defined as either clinical failure (abnormal white blood cell count or temperature on day 7 or 14 post-antibiotics) or microbiologic failure (regrowth of the same organism at the same site within 14 or 21 days). Results: Of 192 total patients, treatment failure was observed in 24/71 patients (34%) receiving ceftriaxone and in 42/121 patients (35%) receiving an antipseudomonal β-lactam (P = .98). No difference was observed in clinical or microbiologic failure rates between groups. The ceftriaxone group had significantly more patients undergoing treatment for urinary tract infections (51% vs 17%, P < .001), but treatment failure rates remained similar between groups when comparing infections of all other sources. Conclusion: Ceftriaxone has treatment failure rates comparable to antipseudomonal β-lactams for susceptible Enterobacteriaceae infections and may be considered as a therapeutic option. Further prospective research is needed to validate optimal dosing and application in all sites of infection.

10.
J Surg Res ; 249: 225-231, 2020 05.
Article in English | MEDLINE | ID: mdl-31991331

ABSTRACT

BACKGROUND: Venous thromboembolism (VTE) risk increases with age. Scarce data exist for patients age ≥65 y. This study evaluated VTE incidence in elderly, high-risk trauma patients receiving unfractionated heparin (UFH) or enoxaparin chemoprophylaxis. MATERIALS AND METHODS: This retrospective, single-center, cohort study included trauma patients age ≥65 y with a risk assessment profile (RAP) ≥5 who received UFH or enoxaparin chemoprophylaxis. The primary outcome was VTE incidence requiring therapeutic anticoagulation. An age-modified RAP (RAP-AM) was calculated as RAP without age distribution points. Logistic regression analyses were performed to identify independent predictors of VTE development and chemoprophylactic agent selection. Bleeding incidence was compared using packed red blood cell utilization. RESULTS: A total of 1090 patients were included (UFH, n = 655; enoxaparin, n = 435). VTE occurred in 39 (3.6%) patients with no difference between groups in proximal deep vein thrombosis (2.1% versus 3.0%, P = 0.52) or pulmonary embolism (1.2% versus 1.4%, P = 0.96). Weight ≥125 kg (OR 4.12, 95% CI 1.06-16.11) and RAP-AM ≥5 (OR 6.52, 95% CI 2.65-16.03) were independently associated with VTE development. Increasing age (OR 1.04, 95% CI 1.03-1.06), initiation ≤24 h (OR 2.17, 95% CI 1.66-2.84), and creatinine clearance ≤30 mL/min (OR 1.61, 95% CI 1.17-2.21) were independent predictors of receiving UFH, whereas increasing ISS (OR 0.97, 95% CI 0.95-0.99) was associated with receiving enoxaparin. CONCLUSIONS: VTE incidence may be similar for high-risk, elderly trauma patients receiving UFH and enoxaparin chemoprophylaxis. Further research is necessary to determine noninferiority of UFH to enoxaparin in this patient population.


Subject(s)
Anticoagulants/therapeutic use, Enoxaparin/therapeutic use, Pulmonary Embolism/prevention & control, Venous Thromboembolism/prevention & control, Wounds and Injuries/complications, Age Factors, Aged, Aged, 80 and over, Aging/physiology, Female, Humans, Incidence, Male, Pulmonary Embolism/epidemiology, Pulmonary Embolism/etiology, Pulmonary Embolism/physiopathology, Registries/statistics & numerical data, Retrospective Studies, Treatment Outcome, Venous Thromboembolism/epidemiology, Venous Thromboembolism/etiology, Venous Thromboembolism/physiopathology
11.
Crit Care Nurs Q ; 42(1): 2-11, 2019.
Article in English | MEDLINE | ID: mdl-30507659

ABSTRACT

Clostridium difficile is a gram-positive, anaerobic, spore-forming bacterium that is the leading cause of nosocomial infections in hospitals in the United States. Critically ill patients are at high risk for C. difficile infection (CDI) and face potentially detrimental effects, including prolonged hospitalization, risk of recurrent disease, complicated surgery, and death. CDI requires a multidisciplinary approach to decrease hospital transmission and improve treatment outcomes. This article briefly reviews the current literature and guideline recommendations for treatment and prevention of CDI, with a focus on antibiotic treatment considerations including dosing, routes of administration, efficacy data, adverse effects, and monitoring parameters.


Subject(s)
Anti-Bacterial Agents/therapeutic use, Clostridioides difficile/isolation & purification, Clostridium Infections/diagnosis, Clostridium Infections/drug therapy, Clostridioides difficile/pathogenicity, Cross Infection/prevention & control, Hospitals, Humans, Probiotics, Proton Pump Inhibitors
12.
Crit Care Nurs Q ; 42(1): 12-29, 2019.
Article in English | MEDLINE | ID: mdl-30507660

ABSTRACT

Alcohol withdrawal syndrome (AWS) is a complex neurologic disorder that develops after an acute reduction in or cessation of chronic alcohol consumption that alters neurotransmitter conduction. The incidence of AWS in the intensive care unit varies, and AWS has been associated with poor outcomes. It is primarily driven by downregulation of gamma-aminobutyric acid (GABA) activity, leading to autonomic excitability and psychomotor agitation. No clinical assessment tools have been validated to assess for AWS in the intensive care unit, particularly for patients requiring mechanical ventilation. The Clinical Institute Withdrawal Assessment for Alcohol Scale, Revised, may be considered to gauge the extent of withdrawal, but is not tailored to acute presentations in this population. Symptom-triggered use of GABA agonists such as benzodiazepines remains the mainstay of pharmacotherapeutic intervention. Nonbenzodiazepine GABA agonists such as barbiturates and propofol, as well as non-GABA adjunctive agents such as dexmedetomidine, ketamine, and antipsychotic agents, may help reduce the need for symptom-triggered benzodiazepine dosing, but lack robust data. Agent selection should be based on patient-specific factors such as renal and hepatic metabolism, duration of action, and clearance. Institution-specific protocols directing GABA-acting medications and adjunctive medications based on excitatory, adrenergic, and delirium assessments could be considered to improve patient outcomes and caregiver satisfaction.


Subject(s)
Alcoholism, Benzodiazepines/therapeutic use, Dexmedetomidine/therapeutic use, Hypnotics and Sedatives/therapeutic use, Substance Withdrawal Syndrome/drug therapy, Benzodiazepines/pharmacology, Dexmedetomidine/pharmacology, Humans, Hypnotics and Sedatives/pharmacology, Intensive Care Units
13.
J Surg Res ; 231: 373-379, 2018 11.
Article in English | MEDLINE | ID: mdl-30278956

ABSTRACT

BACKGROUND: Minimizing the interval between diagnosis of sepsis and administration of antibiotics improves patient outcomes. We hypothesized that a commercially available bedside clinical surveillance visualization system (BSV) would hasten antibiotic administration and decrease length of stay (LOS) in surgical intensive care unit (SICU) patients. METHODS: A BSV, integrated with the electronic medical record and displayed at bedside, was implemented in our SICU in July 2016. A visual sepsis screen score (SSS) was added in July 2017. All patients admitted to SICU beds with bedside displays equipped with a BSV were analyzed to determine mean SSS, maximum SSS, time from positive SSS to antibiotic administration, SICU LOS, and mortality. RESULTS: During the study period, 232 patients were admitted to beds equipped with the clinical surveillance visualization system. Thirty patients demonstrated a positive SSS followed by confirmed sepsis (23 Pre-SSS versus 7 Post-SSS). Mean and maximum SSS were similar. Time from positive SSS to antibiotic administration was decreased in patients with a visual SSS (55.3 ± 15.5 h versus 16.2 ± 9.2 h; P < 0.05). ICU and hospital LOS were also decreased (P < 0.01). CONCLUSIONS: Implementation of a visual SSS into a BSV led to a decreased time interval between the positive SSS and administration of antibiotics and was associated with shorter SICU and hospital LOS. Integration of a visual decision support system may help providers adhere to Surviving Sepsis Guidelines.


Subject(s)
Computer Systems, Critical Care/methods, Decision Support Systems, Clinical, Point-of-Care Testing, Postoperative Complications/diagnosis, Quality Improvement/statistics & numerical data, Sepsis/diagnosis, Adult, Aged, Anti-Bacterial Agents/therapeutic use, Cohort Studies, Critical Care/standards, Female, Guideline Adherence/statistics & numerical data, Humans, Length of Stay/statistics & numerical data, Male, Middle Aged, Postoperative Complications/drug therapy, Postoperative Complications/mortality, Practice Guidelines as Topic, Sepsis/drug therapy, Sepsis/etiology, Sepsis/mortality, Time Factors, Treatment Outcome
14.
J Surg Res ; 225: 6-14, 2018 05.
Article in English | MEDLINE | ID: mdl-29605036

ABSTRACT

BACKGROUND: It is unknown whether ketamine administered via patient-controlled analgesia (PCA) provides adequate analgesia while reducing opioid consumption in the traumatically injured patient. Differences in opioid consumption, pain scores, and adverse effects between ketamine and hydromorphone PCA were studied. MATERIALS AND METHODS: This is an investigator-initiated, single-center, double-blinded, randomized, pilot trial conducted from 2014 to 2016 at a level 1 trauma center. Nonintubated trauma patients in intensive care, who were receiving PCA, were randomized to ketamine or hydromorphone PCA plus opioid analgesics for breakthrough pain. RESULTS: Twenty subjects were randomized. There was no difference in median daily breakthrough opioid use (10 [0.63-19.38] mg versus 10 [4.38-22.5] mg, P = 0.55). Subjects in the ketamine group had lower median cumulative opioid use on therapy day 1 than the hydromorphone group (4.6 [2.5-15] mg versus 41.8 [31.8-50] mg, P < 0.001), as well as in the first 48 h (10 [3.3-15] mg versus 48.5 [32.1-67.5] mg, P < 0.001) and first 72 h (10 [4.2-15] mg versus 42.5 [31.7-65.2] mg, P < 0.001) of therapy. Daily oxygen supplementation requirements were lower in the ketamine group (0.5 [0-1.5] L/min versus 2 [0.5-3] L/min, P = 0.020). Hallucinations occurred more frequently in the ketamine group (40% versus 0%, P = 0.090). CONCLUSIONS: Ketamine PCA led to lower cumulative opioid consumption and lower oxygen supplementation requirements, though hallucinations occurred more frequently with use of ketamine. Additional studies are needed to investigate the tolerability of ketamine as an alternative to traditional opioid-based PCA.


Subject(s)
Acute Pain/drug therapy, Analgesia, Patient-Controlled/methods, Analgesics/administration & dosage, Hallucinations/epidemiology, Hydromorphone/administration & dosage, Ketamine/administration & dosage, Wounds and Injuries/complications, Acute Pain/diagnosis, Acute Pain/etiology, Adult, Analgesia, Patient-Controlled/adverse effects, Double-Blind Method, Female, Hallucinations/chemically induced, Humans, Hydromorphone/adverse effects, Ketamine/adverse effects, Male, Middle Aged, Pain Measurement, Pilot Projects, Treatment Outcome, Young Adult
15.
Ann Pharmacother ; 52(12): 1204-1210, 2018 12.
Article in English | MEDLINE | ID: mdl-29871503

ABSTRACT

BACKGROUND: Continuous renal replacement therapy (CRRT) may be associated with thrombocytopenia in critically ill patients. A confounding factor is concomitant use of unfractionated heparin (UFH) and suspicion for heparin-induced thrombocytopenia (HIT). OBJECTIVE: To determine the impact of CRRT on platelet count and development of thrombocytopenia. METHODS: Retrospective analyses evaluated the intrapatient change in platelet count following CRRT initiation. Critically ill adult patients who received CRRT for at least 48 hours were included. The primary outcome was intrapatient change in platelet count from CRRT initiation through the first 5 days of therapy. Secondary outcomes included thrombocytopenia incidence, identification of concomitant factors associated with thrombocytopenia, and frequency of HIT. RESULTS: Eighty patients were included. Median platelet count at CRRT initiation (D0) was 128,000/µL (81,500-212,500/µL), which was higher than on subsequent post-CRRT days (D1: 104,500/µL [63,000-166,750/µL]; D2: 88,500/µL [53,500-136,750/µL]; D3: 91,000/µL [49,000-138,000/µL]; D4: 93,000/µL [46,000-134,000/µL]; and D5: 76,000/µL [45,500-151,000/µL]; P < 0.05 for all). Twenty-five (35%) patients had thrombocytopenia on CRRT D0, compared with D2 (56.3%), D3 (58.7%), and D5 (59.1%); P < 0.05 for all. Controlling for potential confounders, Sequential Organ Failure Assessment score at the time of CRRT initiation was the only independent factor associated with thrombocytopenia. One (1.3%) patient had confirmed HIT. CONCLUSION AND RELEVANCE: This study is the first to demonstrate serial decreases in platelet count across multiple days after CRRT initiation. These data may provide additional insight into thrombocytopenia development in critically ill patients receiving heparin while on CRRT that is not associated with HIT.


Subject(s)
Critical Illness/therapy, Renal Replacement Therapy/adverse effects, Thrombocytopenia/blood, Thrombocytopenia/etiology, Adult, Female, Heparin/adverse effects, Humans, Male, Middle Aged, Platelet Count/trends, Renal Replacement Therapy/trends, Retrospective Studies, Thrombocytopenia/diagnosis, Young Adult
16.
Ann Pharmacother ; 51(7): 614-616, 2017 Jul.
Article in English | MEDLINE | ID: mdl-28205455

ABSTRACT

Obesity presents a growing challenge in critically ill patients because of variable medication pharmacokinetics and pharmacodynamics. Vasopressors used in the treatment of septic shock, including norepinephrine, are dosed using weight-based (WB) or non-weight-based (NWB) strategies. Retrospective research has evaluated the effect of total body weight and body mass index on vasopressor requirements, consequently finding that obese patients require less total vasopressor per kilogram to obtain clinical end points such as mean arterial pressure. Although this effect is not completely understood, this may suggest that a NWB dosing strategy is preferred over a WB strategy in obese patients to minimize potential for error.
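The weight-based versus non-weight-based distinction is ultimately a units question. A trivial sketch (patient weights hypothetical) showing why the same per-kilogram order doubles the absolute dose in a patient of twice the weight:

```python
def wb_rate_mcg_min(rate_mcg_kg_min, weight_kg):
    """Convert a weight-based norepinephrine rate to an absolute rate."""
    return rate_mcg_kg_min * weight_kg

# The same weight-based order yields very different absolute doses
# (hypothetical 70-kg vs 140-kg patients):
print(round(wb_rate_mcg_min(0.1, 70), 1))   # 7.0 mcg/min
print(round(wb_rate_mcg_min(0.1, 140), 1))  # 14.0 mcg/min
```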


Subject(s)
Norepinephrine/administration & dosage, Obesity/complications, Shock, Septic/drug therapy, Vasoconstrictor Agents/administration & dosage, Arterial Pressure, Body Mass Index, Body Weight, Humans
17.
Respir Care; 67(1): 16-23, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34815325

ABSTRACT

BACKGROUND: Inhaled tobramycin can be used for empiric or definitive therapy of ventilator-associated pneumonia (VAP) in mechanically ventilated patients. This route is believed to minimize systemic exposure and potential adverse drug toxicities, including acute kidney injury (AKI). However, detectable serum tobramycin concentrations, along with AKI, have been reported after inhaled tobramycin therapy. METHODS: This retrospective, observational study evaluated mechanically ventilated adult subjects admitted to ICUs at a large, urban academic medical center who received empiric inhaled tobramycin for VAP. Subjects were separated into detectable (ie, ≥ 0.6 mg/L) or undetectable serum tobramycin concentration groups, and characteristics were compared. Independent predictors of detectable serum tobramycin concentration and of new-onset AKI during or within 48 h of therapy discontinuation were assessed. RESULTS: Fifty-nine inhaled tobramycin courses in 53 subjects were included in the analysis, of which 39 (66.1%) courses, administered to 35 (66.0%) subjects, had detectable serum tobramycin concentrations. Subjects with detectable serum tobramycin concentrations were older (57.1 ± 11.4 y vs 45.9 ± 15.0, P = .004), had higher PEEP (9.2 cm H2O [7.0-11.0] vs 8.0 [5.6-8.9], P = .049), more often had chronic kidney disease stage ≥ 2 (10 [29.4%] vs 0 [0%], P = .009), and had higher serum creatinine before inhaled tobramycin therapy (1.26 mg/dL [0.84-2.18] vs 0.76 [0.47-1.28], P = .004). Age (odds ratio 1.09 [95% CI 1.02-1.16], P = .009) and PEEP (odds ratio 1.47 [95% CI 1.08-2.0], P = .01) were independent predictors of detectable serum tobramycin concentration. Thirty-seven subjects had no previous renal disease or injury, of whom 9 (24.3%) developed AKI. Sequential Organ Failure Assessment score (odds ratio 1.72 [95% CI 1.07-2.76], P = .03) was the only independent predictor of AKI.
CONCLUSIONS: Detectable serum tobramycin concentrations were frequently observed in critically ill, mechanically ventilated subjects receiving empiric inhaled tobramycin for VAP. Subject age and PEEP were independent predictors of detectable serum tobramycin concentration. Serum monitoring and empiric dose reductions should be considered in older patients and in those requiring higher PEEP.


Subject(s)
Acute Kidney Injury, Pneumonia, Ventilator-Associated, Adult, Humans, Aged, Tobramycin/therapeutic use, Anti-Bacterial Agents/therapeutic use, Pneumonia, Ventilator-Associated/drug therapy, Retrospective Studies, Critical Illness
18.
J Trauma Acute Care Surg; 93(4): 545-551, 2022 Oct 1.
Article in English | MEDLINE | ID: mdl-35545799

ABSTRACT

BACKGROUND: The goals of sedation in the critically ill surgical patient are to minimize pain, anxiety, and agitation without hindering cardiopulmonary function. One potential benefit of tracheostomy over continued endotracheal intubation is the reduction of sedation and analgesia; however, there are few data to support this supposition. We hypothesized that patients undergoing tracheostomy would have a rapid reduction in sedation and analgesia following the procedure. METHODS: A retrospective review of tracheostomies performed at a single Level I trauma center from January 2013 to June 2018 was completed. Glasgow Coma Scale, Richmond Agitation-Sedation Scale, and Confusion Assessment Method for the intensive care unit scores were evaluated from 72 hours pretracheostomy to 72 hours posttracheostomy. The total daily doses of sedative, anxiolytic, and analgesic medications administered were recorded. Mixed-effects models were used to evaluate longitudinal drug doses over time (hours). RESULTS: Four hundred sixty-eight patients were included for analysis, with a mean age of 58.8 ± 18.3 years. There was a significant decrease in propofol and fentanyl utilization from 24 hours pretracheostomy to 24 hours posttracheostomy, in both dose and number of patients receiving these continuous intravenous medications. Similarly, total morphine milligram equivalent (MME) use and continuous midazolam use decreased significantly from 24 hours pretracheostomy to 24 hours posttracheostomy. By contrast, intermittent enteral quetiapine and methadone administration increased after tracheostomy. Importantly, Richmond Agitation-Sedation Scale, Glasgow Coma Scale, and Confusion Assessment Method scores also improved significantly as early as 24 hours posttracheostomy. Pretracheostomy, total MME use was significantly higher in patients younger than 65 years and in male compared with female patients, and patients admitted to the medical intensive care unit had significantly higher MME use than those in the surgical intensive care unit. CONCLUSION: Tracheostomy allows for a rapid and significant reduction in intravenous sedation and analgesia medication utilization. Posttracheostomy sedation can transition to intermittent enteral medications, potentially contributing to the observed improvements in postoperative mental status and agitation. LEVEL OF EVIDENCE: Therapeutic/Care Management; Level III.


Subject(s)
Analgesia, Anti-Anxiety Agents, Propofol, Adult, Aged, Analgesics, Endrin/analogs & derivatives, Female, Fentanyl, Humans, Hypnotics and Sedatives, Intensive Care Units, Male, Methadone, Midazolam, Middle Aged, Morphine Derivatives, Pain, Quetiapine Fumarate, Respiration, Artificial, Tracheostomy
19.
J Trauma Acute Care Surg; 92(2): 266-276, 2022 Feb 1.
Article in English | MEDLINE | ID: mdl-34789700

ABSTRACT

BACKGROUND: Blunt chest wall injury accounts for 15% of trauma admissions. Previous studies have shown that the number of rib fractures predicts inpatient opioid requirements, raising concerns about pharmacologic consequences, including hypotension, delirium, and opioid dependence. We hypothesized that intercostal injection of liposomal bupivacaine would reduce analgesia needs and improve spirometry metrics in trauma patients with rib fractures. METHODS: A prospective, double-blinded, randomized, placebo-controlled study was conducted at a Level I trauma center as a Food and Drug Administration investigational new drug study. Enrollment criteria included patients 18 years or older admitted to the intensive care unit with blunt chest wall trauma who could not achieve greater than 50% of goal inspiratory capacity. Patients were randomized to liposomal bupivacaine or saline injections in up to six intercostal spaces. The primary outcome was pain scores and breakthrough pain medications over a 96-hour period. The secondary endpoint was the effect of analgesia on pulmonary physiology. RESULTS: One hundred patients were enrolled, 50 per cohort, with similar demographics (Injury Severity Score: 17.9 bupivacaine vs 17.6 control) and comorbidities. Enrolled patients had a mean age of 60.5 years, and 47% were female. Rib fracture number, distribution, and targets for injection were similar between groups. While both groups displayed a decrease in opioid use over time, there was no change in mean daily pain scores. The bupivacaine group achieved higher incentive spirometry volumes on Days 1 and 2 (1095 mL and 1063 mL bupivacaine vs 900 mL and 866 mL control). Hospital and intensive care unit lengths of stay were similar, and there were no differences in postinjection pneumonia, use of epidural catheters, or adverse events between groups.
CONCLUSION: While intercostal liposomal bupivacaine injection is a safe method for rib fracture-related analgesia, it was not effective in reducing pain scores, opioid requirements, or hospital length of stay. Bupivacaine injection transiently improved incentive spirometry volumes, but without a reduction in the development of pneumonia. LEVEL OF EVIDENCE: Therapeutic/care management, Level II.


Subject(s)
Anesthetics, Local/administration & dosage, Bupivacaine/administration & dosage, Pain Management/methods, Rib Fractures/complications, Analgesics, Opioid/therapeutic use, Double-Blind Method, Female, Humans, Injections, Injury Severity Score, Intensive Care Units, Length of Stay/statistics & numerical data, Liposomes, Male, Middle Aged, Pain Measurement, Prospective Studies, Spirometry
20.
J Pharm Pract; 8971900221134551, 2022 Oct 25.
Article in English | MEDLINE | ID: mdl-36282867

ABSTRACT

Objective: To evaluate practitioner use of ketamine, identify potential barriers to its use in acutely and critically ill patients, and compare the characteristics, beliefs, and practices of frequent ketamine users and non-users. Methods: An online survey developed by members of the Society of Critical Care Medicine (SCCM) Clinical Pharmacy and Pharmacology Section was distributed to physician, pharmacist, nurse practitioner, physician assistant, and nurse members of SCCM. The survey queried SCCM members on self-reported practices regarding ketamine use and potential barriers to its use in acutely and critically ill patients. Results: The 341 respondents analyzed were mostly adult-care physicians practicing at academic medical centers in the United States. Clinicians were comfortable or very comfortable using ketamine to facilitate intubation (80.0%), for analgesia (77.9%), procedural sedation (79.4%), continuous ICU sedation (65.8%), dressing changes (62.4%), asthma exacerbation (58.8%), and status epilepticus (40.4%). Clinicians were least comfortable with ketamine use for alcohol withdrawal (24.7%) and opioid detoxification (23.2%). Most respondents reported "never" or "infrequently" using ketamine preferentially for continuous IV analgesia (55.6%) or sedation (61.0%). Responses were mixed across dosing ranges and durations. The most common barriers to ketamine use were adverse effects (42.6%), other practitioners not routinely using the medication (41.5%), lack of evidence (33.5%), lack of familiarity (33.1%), and hospital/institutional policy guiding the indication for use (32.3%). Conclusion: Although most critical care practitioners report feeling comfortable using ketamine, there are many inconsistencies in practice regarding dose, duration, and reasons to avoid or limit its use. Educational tools targeted at practitioners may improve appropriate ketamine use.
