Results 1 - 20 of 28
1.
Ann Pharmacother ; 57(10): 1178-1184, 2023 10.
Article in English | MEDLINE | ID: mdl-36803019

ABSTRACT

BACKGROUND: Essential to the coagulation pathway, vitamin K (phytonadione) is used to correct clotting factor deficiencies and for reversal of warfarin-induced bleeding. In practice, high-dose intravenous (IV) vitamin K is often used, despite limited evidence supporting repeated dosing. OBJECTIVE: This study sought to characterize differences in responders and nonresponders to high-dose vitamin K to guide dosing strategies. METHODS: This was a case-control study of hospitalized adults who received vitamin K 10 mg IV daily for 3 days. Cases were represented by patients who responded to the first dose of IV vitamin K, and controls were nonresponders. The primary outcome was change in international normalized ratio (INR) over time with subsequent vitamin K doses. Secondary outcomes included factors associated with response to vitamin K and the incidence of safety events. The Cleveland Clinic Institutional Review Board approved this study. RESULTS: There were 497 patients included, and 182 were responders. Most patients had underlying cirrhosis (91.5%). In responders, the INR decreased from 1.89 at baseline (95% CI = [1.74-2.04]) to 1.40 on day 3 (95% CI = [1.30-1.50]). In nonresponders, the INR decreased from 1.97 (95% CI = [1.83-2.13]) to 1.85 (95% CI = [1.72-1.99]). Factors associated with response included lower body weight, absence of cirrhosis, and lower bilirubin. A low incidence of safety events was observed. CONCLUSIONS: In this study of mainly patients with cirrhosis, the overall adjusted decrease in INR over 3 days was 0.3, which may have minimal clinical impact. Additional studies are needed to identify populations who may benefit from repeated daily doses of high-dose IV vitamin K.


Subject(s)
Vitamin K , Warfarin , Adult , Humans , Case-Control Studies , Warfarin/therapeutic use , Vitamin K 1/therapeutic use , Vitamin K 1/pharmacology , Blood Coagulation , International Normalized Ratio , Liver Cirrhosis/drug therapy , Anticoagulants/adverse effects
2.
J Intensive Care Med ; 36(2): 157-174, 2021 Feb.
Article in English | MEDLINE | ID: mdl-32844730

ABSTRACT

The rapid spread of the severe acute respiratory syndrome coronavirus-2 (SARS-CoV-2) has led to a global pandemic. The 2019 coronavirus disease (COVID-19) presents with a spectrum of symptoms ranging from mild to critical illness requiring intensive care unit (ICU) admission. Acute respiratory distress syndrome is a major complication in patients with severe COVID-19 disease. Currently, there are no recognized pharmacological therapies for COVID-19. However, a large number of COVID-19 patients require respiratory support, with a high percentage requiring invasive ventilation. The rapid spread of the infection has led to a surge in hospitalizations and ICU admissions, which has created a challenge to the public health, research, and medical communities. The high demand for several therapies often utilized in the care of COVID-19 patients requiring mechanical ventilation, including sedatives, analgesics, and paralytics, has created pressure on the supply chain, resulting in shortages of these critical medications. This has led clinicians to develop conservation strategies and explore alternative therapies for sedation, analgesia, and paralysis in COVID-19 patients. Several of these alternative approaches have demonstrated acceptable levels of sedation, analgesia, and paralysis in different settings, but they are not commonly used in the ICU. Additionally, they have unique pharmaceutical properties, limitations, and adverse effects. This narrative review summarizes the literature on alternative drug therapies for the management of sedation, analgesia, and paralysis in COVID-19 patients. This document also serves as a resource for clinicians in current and future respiratory illness pandemics in the setting of drug shortages.


Subject(s)
Analgesics, Opioid/administration & dosage , COVID-19/complications , Hypnotics and Sedatives/administration & dosage , Neuromuscular Blocking Agents/administration & dosage , Respiration, Artificial , Respiratory Distress Syndrome/therapy , Respiratory Distress Syndrome/virology , Critical Illness , Humans , Intensive Care Units , Pandemics , SARS-CoV-2
3.
Am J Ther ; 29(2): e163-e174, 2020 Jan 16.
Article in English | MEDLINE | ID: mdl-32452843

ABSTRACT

BACKGROUND: Empiric combination antimicrobial therapy is often used in patients with decompensating septic shock. However, the optimal duration of combination therapy is unknown. STUDY QUESTION: The goal of this study was to compare the clinical effects of a single dose of an aminoglycoside to an extended duration of aminoglycosides for combination therapy in patients with septic shock without renal dysfunction. STUDY DESIGN: Retrospective, single-center evaluation of patients with septic shock who received empiric combination therapy with an aminoglycoside. MEASURES AND OUTCOMES: Two patient cohorts were evaluated: those who received a single dose of an aminoglycoside and those who received more than 1 dose of an aminoglycoside. The primary outcome was shock-free days at day 14. Secondary outcomes included mortality, length of stay, clinical cure, and nephrotoxicity. A post hoc subgroup analysis including only patients who received more than 2 doses of an aminoglycoside compared with a single dose was conducted. RESULTS: One hundred fifty-one patients were included in this evaluation, 94 in the single-dose aminoglycoside group and 57 in the extended duration group. There was no difference in shock-free days at day 14 between patients who received a single dose of an aminoglycoside or those who received an extended duration (12.0 vs. 11.6 days; P = 0.56). There were no differences in mortality, length of stay, clinical cure rates, or rates of nephrotoxicity between groups (28% for single dose vs. 26% for extended duration; P = 0.86). No differences in outcomes were detected when evaluating patients who received more than 2 doses of an aminoglycoside compared with a single dose. 
CONCLUSIONS: Patients with septic shock and normal renal function who received a single dose of an aminoglycoside for combination antimicrobial therapy had no differences detected in shock duration or nephrotoxicity development compared with those who received an extended duration of aminoglycoside combination therapy.

4.
Ann Pharmacother ; 54(5): 457-463, 2020 05.
Article in English | MEDLINE | ID: mdl-31729245

ABSTRACT

Background: The Centers for Disease Control and Prevention recommends 3 months of once-weekly rifapentine/isoniazid (3HP) for latent tuberculosis infection (LTBI) treatment given by directly observed therapy (DOT) or self-administered therapy (SAT) in patients ≥2 years old. 3HP has been associated with increased incidence of hepatic, gastrointestinal, flu-like, and cutaneous adverse drug reactions (ADRs) compared with isoniazid monotherapy. Objective: This study evaluated 3HP completion rates and tolerability for LTBI treatment in a real-world setting. Methods: A single-center retrospective cohort with a nested case-control study, comparing patients who experienced ADRs with those who did not, evaluated patients ≥18 years old receiving 3HP by DOT or SAT for LTBI at Cleveland Clinic from October 2011 through July 2018. Information on baseline characteristics, 3HP administrations, and ADRs was collected. Results: Of 199 patients screened, 144 were included (111 DOT, 33 SAT). 3HP completion rates were high at 82.6% and similar between the DOT and SAT groups. During treatment, 92/144 (63.9%) patients experienced any ADR. The most common ADRs included flu-like symptoms (38.2%) and gastrointestinal (31.9%) and hepatic (2.1%) reactions. Despite the high rate of overall ADRs, the rate of significant ADRs (grade 2 or higher) was 4.2%. Overall, 9% of patients discontinued 3HP because of ADRs. After adjusting for other factors associated with ADRs at baseline, SAT was not associated with increased incidence of ADRs, but female sex was a significant predictor (odds ratio = 2.61 [95% CI, 1.23 to 5.56]). Conclusion and Relevance: This study observed high 3HP treatment completion rates, low incidence of significant ADRs, and low discontinuation rates resulting from ADRs.


Subject(s)
Antitubercular Agents/therapeutic use , Drug-Related Side Effects and Adverse Reactions/etiology , Isoniazid/therapeutic use , Latent Tuberculosis/drug therapy , Rifampin/analogs & derivatives , Adult , Antitubercular Agents/administration & dosage , Antitubercular Agents/adverse effects , Case-Control Studies , Directly Observed Therapy/methods , Drug Administration Schedule , Drug Therapy, Combination , Drug-Related Side Effects and Adverse Reactions/epidemiology , Female , Gastrointestinal Tract/drug effects , Humans , Isoniazid/administration & dosage , Isoniazid/adverse effects , Latent Tuberculosis/epidemiology , Liver/drug effects , Male , Middle Aged , Retrospective Studies , Rifampin/administration & dosage , Rifampin/adverse effects , Rifampin/therapeutic use , Self Administration
6.
Antimicrob Agents Chemother ; 60(1): 431-6, 2016 01.
Article in English | MEDLINE | ID: mdl-26525802

ABSTRACT

The increasing prevalence of multidrug-resistant (MDR) nosocomial infections accounts for increased morbidity and mortality of such infections. Infections with MDR Gram-negative isolates are frequently treated with colistin. Based on recent pharmacokinetic studies, current colistin dosing regimens may result in a prolonged time to therapeutic concentrations, leading to suboptimal and delayed effective treatment. In addition, studies have demonstrated an association between an increased colistin dose and improved clinical outcomes. However, the specific dose at which these outcomes are observed is unknown and warrants further investigation. This retrospective study utilized classification and regression tree (CART) analysis to determine the dose of colistin most predictive of global cure at day 7 of therapy. Patients were assigned to high- and low-dose cohorts based on the CART-established breakpoint. The secondary outcomes included microbiologic outcomes, clinical cure, global cure, lengths of intensive care unit (ICU) and hospital stays, and 7- and 28-day mortalities. Additionally, safety outcomes focused on the incidence of nephrotoxicity associated with high-dose colistin therapy. The CART-established breakpoint for high-dose colistin was determined to be >4.4 mg/kg of body weight/day, based on ideal body weight. This study evaluated 127 patients; 45 (35%) received high-dose colistin, and 82 (65%) received low-dose colistin. High-dose colistin was associated with day 7 global cure (40% versus 19.5%; P = 0.013) in bivariate and multivariate analyses (odds ratio [OR] = 3.40; 95% confidence interval [CI], 1.37 to 8.45; P = 0.008). High-dose colistin therapy was also associated with day 7 clinical cure, microbiologic success, and mortality but not with the development of acute kidney injury. We concluded that high-dose colistin (>4.4 mg/kg/day) is independently associated with day 7 global cure.


Subject(s)
Anti-Bacterial Agents/therapeutic use , Bacteremia/drug therapy , Colistin/therapeutic use , Gram-Negative Bacteria/drug effects , Gram-Negative Bacterial Infections/drug therapy , beta-Lactam Resistance , Acute Kidney Injury/prevention & control , Aged , Bacteremia/microbiology , Bacteremia/mortality , Bacteremia/pathology , Carbapenems/therapeutic use , Drug Administration Schedule , Drug Dosage Calculations , Female , Gram-Negative Bacteria/growth & development , Gram-Negative Bacteria/pathogenicity , Gram-Negative Bacterial Infections/microbiology , Gram-Negative Bacterial Infections/mortality , Gram-Negative Bacterial Infections/pathology , Humans , Intensive Care Units , Length of Stay , Male , Microbial Sensitivity Tests , Middle Aged , Multivariate Analysis , Retrospective Studies , Survival Analysis , Treatment Outcome
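The CART-derived breakpoint in the colistin study above is normalized to ideal body weight. A minimal sketch of how a regimen could be classified against the >4.4 mg/kg/day cutoff, assuming the Devine formula for ideal body weight (the abstract does not state which IBW formula the authors used; the function names are illustrative):

```python
def ideal_body_weight_kg(height_cm: float, male: bool) -> float:
    """Devine estimate of ideal body weight (assumed formula; not stated in the study)."""
    inches_over_5ft = max(height_cm / 2.54 - 60.0, 0.0)
    return (50.0 if male else 45.5) + 2.3 * inches_over_5ft

def colistin_dose_group(daily_dose_mg: float, height_cm: float, male: bool) -> str:
    """Classify a daily regimen against the CART-derived breakpoint of >4.4 mg/kg IBW/day."""
    mg_per_kg = daily_dose_mg / ideal_body_weight_kg(height_cm, male)
    return "high-dose" if mg_per_kg > 4.4 else "low-dose"

# Hypothetical example: 360 mg/day in a 175 cm male (IBW ~70.5 kg -> ~5.1 mg/kg/day)
print(colistin_dose_group(360, 175, True))  # prints: high-dose
```

Note that the same milligram dose can fall on either side of the breakpoint depending on the patient's height, which is why the study expressed the cutoff per kilogram of ideal body weight.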
7.
Antimicrob Agents Chemother ; 59(7): 3748-53, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25845872

ABSTRACT

There are limited treatment options for carbapenem-resistant Gram-negative infections. Currently, there are suggestions in the literature that combination therapy should be used, which frequently includes antibiotics to which the causative pathogen demonstrates in vitro resistance. This case-control study evaluated risk factors associated with all-cause mortality rates for critically ill patients with carbapenem-resistant Gram-negative bacteremia. Adult patients who were admitted to an intensive care unit with sepsis and a blood culture positive for Gram-negative bacteria resistant to a carbapenem were included. Patients with polymicrobial, recurrent, or breakthrough infections were excluded. Included patients were classified as survivors (controls) or nonsurvivors (cases) at 30 days after the positive blood culture. Of 302 patients screened, 168 patients were included, of whom 90 patients died (53.6% [cases]) and 78 survived (46.4% [controls]) at 30 days. More survivors received appropriate antibiotics (antibiotics with in vitro activity) than did nonsurvivors (93.6% versus 53.3%; P < 0.01). Combination therapy, defined as multiple appropriate agents given for 48 h or more, was more common among survivors than nonsurvivors (32.1% versus 7.8%; P < 0.01); however, there was no difference in multiple-agent use when in vitro activity was not considered (including combinations with carbapenems) (87.2% versus 80%; P = 0.21). After adjustment for baseline factors with multivariable logistic regression, combination therapy was independently associated with decreased risk of death (odds ratio, 0.19 [95% confidence interval, 0.06 to 0.56]; P < 0.01). These data suggest that combination therapy with multiple agents with in vitro activity is associated with improved survival rates for critically ill patients with carbapenem-resistant Gram-negative bacteremia. However, that association is lost if in vitro activity is not considered.


Subject(s)
Anti-Bacterial Agents/therapeutic use , Drug Therapy, Combination , Gram-Negative Bacterial Infections/drug therapy , Gram-Negative Bacterial Infections/mortality , beta-Lactam Resistance , Aged , Bacteremia/drug therapy , Carbapenems/therapeutic use , Case-Control Studies , Critical Illness/mortality , Female , Humans , Intensive Care Units , Male , Microbial Sensitivity Tests , Middle Aged , Retrospective Studies , Survival Rate , Systemic Inflammatory Response Syndrome/diagnosis , Systemic Inflammatory Response Syndrome/drug therapy , Systemic Inflammatory Response Syndrome/microbiology
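The bacteremia study above is a case-control design, so effect sizes are naturally reported as odds ratios. A minimal sketch of the unadjusted calculation with a Woolf (log-normal) confidence interval, using 2x2 counts back-calculated from the reported group sizes and percentages (roughly 7 of 90 nonsurvivors and 25 of 78 survivors received combination therapy; these counts are illustrative, and the published OR of 0.19 comes from adjusted multivariable regression, so the unadjusted figure differs slightly):

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Unadjusted odds ratio with a Woolf (log-normal) confidence interval.
    2x2 table: a = exposed cases, b = unexposed cases,
               c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    half_width = z * math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, math.exp(math.log(or_) - half_width), math.exp(math.log(or_) + half_width)

# Illustrative back-calculated counts: cases = deaths (a=7 on combination, b=83 not),
# controls = survivors (c=25 on combination, d=53 not).
or_, lo, hi = odds_ratio_ci(7, 83, 25, 53)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # prints: OR = 0.18 (95% CI 0.07-0.44)
```

An OR below 1 with a confidence interval excluding 1 points in the same direction as the study's adjusted finding: combination therapy with in vitro-active agents was associated with lower odds of death.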
8.
Ann Pharmacother ; 49(10): 1105-12, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26187741

ABSTRACT

BACKGROUND: Inhaled nitric oxide and inhaled epoprostenol have been evaluated for the management of hypoxemia in acute respiratory distress syndrome, with clinical trials demonstrating comparable improvements in oxygenation. However, these trials have several limitations, making it difficult to draw definitive conclusions regarding clinical outcomes. OBJECTIVE: The aim of this study was to evaluate the noninferiority and safety of inhaled epoprostenol compared with inhaled nitric oxide in mechanically ventilated acute respiratory distress syndrome (ARDS) patients, with a primary outcome of ventilator-free days from day 1 to day 28. METHODS: This was a retrospective, noninterventional, propensity-matched, noninferiority cohort study. A propensity score for receipt of inhaled nitric oxide was developed, and patients were matched accordingly using a prespecified algorithm. Secondary objectives included evaluating day 28 intensive care unit-free days, changes in PaO2/FiO2 ratio after inhalation therapy initiation, and hospital mortality. Safety endpoints assessed included hypotension, methemoglobinemia, renal dysfunction, rebound hypoxemia, significant bleeding, and thrombocytopenia. RESULTS: Ninety-four patients were included, with 47 patients in each group. Patients were well matched, with similar baseline characteristics except that patients in the inhaled nitric oxide group had a lower PaO2/FiO2 ratio. Management of ARDS was similar between groups. The mean difference in ventilator-free days between inhaled epoprostenol and inhaled nitric oxide was 2.16 days (95% confidence interval = -0.61 to 4.9); the lower limit of the 95% confidence interval was greater than the prespecified margin, satisfying noninferiority. There were no differences in any secondary or safety outcomes. CONCLUSIONS: Inhaled epoprostenol was noninferior to inhaled nitric oxide with regard to ventilator-free days from day 1 to day 28 in ARDS patients.


Subject(s)
Epoprostenol/administration & dosage , Nitric Oxide/administration & dosage , Respiration, Artificial , Respiratory Distress Syndrome/drug therapy , Vasodilator Agents/administration & dosage , Administration, Inhalation , Aged , Female , Humans , Hypoxia/drug therapy , Male , Middle Aged , Propensity Score , Retrospective Studies , Time Factors , Ventilator Weaning
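The noninferiority conclusion above reduces to one comparison: the result is noninferior when the lower bound of the confidence interval clears the prespecified margin. A minimal sketch of that check; the abstract does not report the margin, so the -3.0-day value below is an assumption for illustration only:

```python
def noninferior(ci_lower: float, margin: float) -> bool:
    """Noninferiority is met when the CI lower bound exceeds the prespecified margin."""
    return ci_lower > margin

# Reported: mean difference of 2.16 ventilator-free days, 95% CI (-0.61 to 4.9).
# The margin was prespecified by the authors but not given in the abstract;
# -3.0 days is an assumed value purely for illustration.
print(noninferior(-0.61, margin=-3.0))  # prints: True
```

Under this logic the point estimate alone is not decisive; a wide interval whose lower bound crosses the margin would fail the test even with a favorable mean difference.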
9.
Chest ; 165(2): 348-355, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37611862

ABSTRACT

BACKGROUND: Historically, norepinephrine has been administered through a central venous catheter (CVC) because of concerns about the risk of ischemic tissue injury if extravasation from a peripheral IV catheter (PIVC) occurs. Recently, several reports have suggested that peripheral administration of norepinephrine may be safe. RESEARCH QUESTION: Can a protocol for peripheral norepinephrine administration safely reduce the number of days a CVC is in use and frequency of CVC placement? STUDY DESIGN AND METHODS: This was a prospective observational cohort study conducted in the medical ICU at a quaternary care academic medical center. A protocol for peripheral norepinephrine administration was developed and implemented in the medical ICU at the study site. The protocol was recommended for use in patients who met prespecified criteria, but was used at the treating clinician's discretion. All adult patients admitted to the medical ICU receiving norepinephrine through a PIVC from February 2019 through June 2021 were included. RESULTS: The primary outcome was the number of days of CVC use that were avoided per patient, and the secondary safety outcomes included the incidence of extravasation events. Six hundred thirty-five patients received peripherally administered norepinephrine. The median number of CVC days avoided per patient was 1 (interquartile range, 0-2 days per patient). Of the 603 patients who received norepinephrine peripherally as the first norepinephrine exposure, 311 patients (51.6%) never required CVC insertion. Extravasation of norepinephrine occurred in 35 patients (75.8 events/1,000 d of PIVC infusion [95% CI, 52.8-105.4 events/1,000 d of PIVC infusion]). Most extravasations caused no or minimal tissue injury. No patient required surgical intervention. 
INTERPRETATION: This study suggests that a protocol for peripheral administration of norepinephrine can safely avoid 1 CVC day in the average patient, with 51.6% of patients not requiring CVC insertion. No patient experienced significant ischemic tissue injury with the protocol used. These data support performance of a randomized, prospective, multicenter study to characterize the net benefits of peripheral norepinephrine administration compared with norepinephrine administration through a CVC.


Subject(s)
Catheterization, Central Venous , Central Venous Catheters , Adult , Humans , Norepinephrine , Prospective Studies , Academic Medical Centers , Catheterization, Central Venous/adverse effects
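The extravasation rate in the study above is expressed as events per 1,000 days of PIVC infusion. A minimal sketch of that calculation; the ~462 total PIVC-days below is back-calculated from the reported rate, not stated in the abstract:

```python
def rate_per_1000_days(events: int, exposure_days: float) -> float:
    """Incidence rate expressed per 1,000 days of exposure."""
    return events / exposure_days * 1000

# 35 extravasation events over ~462 PIVC-days (exposure back-calculated, an assumption)
print(round(rate_per_1000_days(35, 462), 1))  # prints: 75.8
```

Rates per unit of exposure time, rather than per patient, are the natural denominator here because catheter dwell time varied across the 635 patients.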
10.
World J Hepatol ; 16(3): 379-392, 2024 Mar 27.
Article in English | MEDLINE | ID: mdl-38577538

ABSTRACT

BACKGROUND: Due to development of an immune-dysregulated phenotype, advanced liver disease in all forms predisposes patients to sepsis acquisition, including by opportunistic pathogens such as fungi. Little data exist on fungal infection within a medical intensive liver unit (MILU), particularly in relation to acute on chronic liver failure. AIM: To investigate the impact of fungal infections among critically ill patients with advanced liver disease, and compare outcomes to those of patients with bacterial infections. METHODS: From our prospective registry of MILU patients from 2018-2022, we included 27 patients with culture-positive fungal infections and 183 with bacterial infections. We compared outcomes between patients admitted to the MILU with fungal infections and their bacterial counterparts. Data were extracted through chart review. RESULTS: All fungal infections were due to Candida species, and were most frequently blood isolates. Mortality among patients with fungal infections was significantly worse relative to the bacterial cohort (93% vs 52%, P < 0.001). The majority of the fungal cohort developed grade 2 or 3 acute on chronic liver failure (ACLF) (90% vs 64%, P = 0.02). Patients in the fungal cohort had increased use of vasopressors (96% vs 70%, P = 0.04), mechanical ventilation (96% vs 65%, P < 0.001), and dialysis due to acute kidney injury (78% vs 52%, P = 0.014). On MILU admission, the fungal cohort had significantly higher Acute Physiology and Chronic Health Evaluation scores (108 vs 91, P = 0.003), Acute Physiology Scores (86 vs 65, P = 0.003), and Model for End-Stage Liver Disease-Sodium scores (86 vs 65, P = 0.041). There was no significant difference in the rate of central line use preceding culture (52% vs 40%, P = 0.2). Patients with fungal infections had a higher rate of transplant hold placement and a lower rate of transplant; however, these differences did not achieve statistical significance.
CONCLUSION: Mortality was worse among patients with fungal infections, likely attributable to severe ACLF development. Prospective studies examining empiric antifungals in severe ACLF and associations between fungal infections and transplant outcomes are critical.

11.
Open Forum Infect Dis ; 10(12): ofad554, 2023 Dec.
Article in English | MEDLINE | ID: mdl-38088984

ABSTRACT

A retrospective review of intensive care patients unable to take antiretrovirals by mouth showed that 56.6% of patients experienced only a transient interruption in therapy, and in a secondary analysis most patients maintained or achieved viral suppression. Additionally, our case series further supports previous literature on crushing dolutegravir and bictegravir regimens to maintain and achieve viral suppression.

12.
World J Hepatol ; 15(11): 1226-1236, 2023 Nov 27.
Article in English | MEDLINE | ID: mdl-38075005

ABSTRACT

BACKGROUND: Rifaximin is frequently administered to critically ill patients with liver disease and hepatic encephalopathy, but patients currently or recently treated with antibiotics were frequently excluded from studies of rifaximin efficacy. Due to overlapping spectrums of activity, combination therapy with broad-spectrum antibiotics and rifaximin may be unnecessary. A pharmacist-driven protocol was piloted to reduce potentially overlapping therapy in critically ill patients with liver disease. It was hypothesized that withholding rifaximin during broad-spectrum antibiotic therapy would be safe and reduce healthcare costs. AIM: To determine the clinical, safety, and financial impact of discontinuing rifaximin during broad-spectrum antibiotic therapy in critically ill liver patients. METHODS: This was a single-center, quasi-experimental, pre-post study based on a pilot pharmacist-driven protocol. Patients in the protocol group were prospectively identified via the medical intensive care unit (ICU) (MICU) protocol to have rifaximin withheld during broad-spectrum antibiotic treatment. These were compared to a historical cohort who received combination therapy with broad-spectrum antibiotics and rifaximin. All data were collected retrospectively. The primary outcome was days alive and free of delirium and coma (DAFD) to 14 d. Safety outcomes included MICU length of stay, 48-h change in vasopressor dose, and ICU mortality. Secondary outcomes characterized rifaximin cost savings and protocol adherence. Multivariable analysis was utilized to evaluate the association between group assignment and the primary outcome while controlling for potential confounding factors. RESULTS: Each group included 32 patients. The median number of delirium- and coma-free days was similar in the control and protocol groups [3 (IQR 0, 8) vs 2 (IQR 0, 9.5); P = 0.93].
In multivariable analysis, group assignment was not associated with a reduced ratio of days alive and free of delirium or coma at 14 d. The protocol resulted in a reduced median duration of rifaximin use during broad-spectrum antibiotic therapy [6 d control (IQR 3, 9.5) vs 1 d protocol (IQR 0, 1); P < 0.001]. Rates of other secondary clinical and safety outcomes were similar including ICU mortality and 48-h change in vasopressor requirements. Overall adherence to the protocol was 91.4%. The median estimated total cost of rifaximin therapy per patient was reduced from $758.40 (IQR $379.20, $1200.80) to $126.40 (IQR $0, $126.40), P < 0.01. CONCLUSION: The novel pharmacist-driven protocol for rifaximin discontinuation was associated with significant cost savings and no differences in safety outcomes including DAFD.
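The cost figures above imply a flat per-day rifaximin charge of about $126.40 (an assumption back-calculated from the reported medians: 6 d x $126.40 = $758.40). A minimal sketch of the savings arithmetic on those medians:

```python
DAILY_RIFAXIMIN_COST = 126.40  # assumed flat per-day cost implied by the reported medians

def rifaximin_course_cost(overlap_days: float) -> float:
    """Estimated rifaximin spend for a given duration of overlap with broad-spectrum antibiotics."""
    return overlap_days * DAILY_RIFAXIMIN_COST

control = rifaximin_course_cost(6)   # median 6 d of overlapping therapy pre-protocol
protocol = rifaximin_course_cost(1)  # median 1 d of overlap under the protocol
print(f"${control:.2f} vs ${protocol:.2f} -> ${control - protocol:.2f} saved per patient")
```

This reproduces the reported medians of $758.40 and $126.40; actual savings per patient would vary with each patient's overlap duration and drug pricing.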

13.
Cleve Clin J Med ; 89(7): 363-367, 2022 07 01.
Article in English | MEDLINE | ID: mdl-35777844

ABSTRACT

Critically ill patients are at an increased risk for developing stress ulcers of the mucosa of the upper gastrointestinal (GI) tract. Bleeding from stress ulcers was previously associated with a longer stay in the intensive care unit and an increased risk of death. Thus, most patients admitted to the intensive care unit receive stress ulcer prophylaxis. However, there is a growing concern that acid-suppression drugs may be associated with increased frequency of nosocomial pneumonia and Clostridioides difficile infection. In this article, the authors address controversies regarding stress ulcer prophylaxis in critically ill patients and provide guidance for its appropriate use in this setting.


Subject(s)
Peptic Ulcer , Stomach Ulcer , Acute Disease , Critical Illness/therapy , Humans , Intensive Care Units , Peptic Ulcer/complications , Peptic Ulcer/prevention & control , Stomach Ulcer/drug therapy , Stomach Ulcer/etiology , Stomach Ulcer/prevention & control , Ulcer
14.
J Pharm Pract ; 35(2): 190-196, 2022 Apr.
Article in English | MEDLINE | ID: mdl-33016183

ABSTRACT

BACKGROUND: The direct comparison of twice daily (BID) and thrice daily (TID) dosing of subcutaneous low dose unfractionated heparin (LDUH) for venous thromboembolism (VTE) prophylaxis in a mixed inpatient population is not well-studied. OBJECTIVE: This study evaluated the effectiveness and safety of BID compared to TID dosing of LDUH for prevention of VTE. METHODS: Retrospective, single-center analysis of patients who received LDUH for VTE prophylaxis between July and September 2015. Outcomes were identified by ICD-9 codes. A matched cohort was created using propensity scores and multivariate analysis was conducted to identify independent risk factors for VTE. The primary outcome was incidence of symptomatic VTE. RESULTS: In the full cohort, VTE occurred in 0.71% of patients who received LDUH BID compared to 0.77% of patients who received LDUH TID (p = 0.85). There was no difference in major (p = 0.85) and minor (p = 0.52) bleeding between the BID and TID groups. For the matched cohort, VTE occurred in 1.4% of BID patients and 2.1% of TID patients (p = 0.32). Major bleed occurred in 0.36% of BID patients and 0.52% of TID patients (p = 0.7), while a minor bleed was seen in 3.4% of BID patients and 2.1% of TID patients (p = 0.13). Personal history of VTE (p = 0.002) and weight (p = 0.035) were independently associated with increased risk of VTE. CONCLUSION: This study did not demonstrate a difference in effectiveness or safety between BID and TID dosing of LDUH for VTE prevention.


Subject(s)
Heparin , Venous Thromboembolism , Anticoagulants , Hemorrhage/chemically induced , Humans , Retrospective Studies , Venous Thromboembolism/epidemiology , Venous Thromboembolism/etiology , Venous Thromboembolism/prevention & control
15.
J Gambl Stud ; 27(4): 523-63, 2011 Dec.
Article in English | MEDLINE | ID: mdl-21191637

ABSTRACT

Problem Gambling (PG) represents a serious problem for affected individuals, their families, and society in general. Previous approaches to understanding PG have been confined to only a subset of the psychobiological factors influencing PG. We present a model that attempts to integrate potential causal factors across levels of organization, providing empirical evidence from the vast literature on PG and complementary literatures in decision-making and addiction. The model posits that components are arranged systematically to bias decisions in favor of either immediately approaching or avoiding targets affording the opportunity for immediate reward. Dopamine, Testosterone and Endogenous Opioids favor immediate approach, while Serotonin and Cortisol favor inhibition. Glutamate is involved in associative learning between stimuli and promotes the approach response through its link to the DA reward system. GABA functions to monitor performance and curb impulsive decision-making. Finally, while very high levels of Norepinephrine can induce arousal to an extent that is detrimental to sound decision-making, the reactivity of the Norepinephrine system and its effects on Cortisol levels can shift the focus towards long-term consequences, thereby inhibiting impulsive decisions. Empirical evidence is provided showing the effects of each component on PG and decision-making across behavioural, neuropsychological, functional neuroimaging, and genetic levels. Lastly, an effect size analysis of the growing pharmacotherapy literature is presented. It is hoped that this model will stimulate multi-level research to solidify our comprehension of biased decision-making in PG and suggest pharmacological and psychological approaches to treatment.


Subject(s)
Behavior, Addictive/psychology , Decision Making , Gambling/psychology , Models, Psychological , Self Concept , Self Efficacy , Behavior, Addictive/classification , Behavior, Addictive/diagnosis , Female , Gambling/classification , Gambling/diagnosis , Humans , Internal-External Control , Male , Power, Psychological , Reproducibility of Results , Risk-Taking , Social Behavior
16.
Crit Care Explor ; 3(5): e0411, 2021 May.
Article in English | MEDLINE | ID: mdl-34036270

ABSTRACT

OBJECTIVES: Studies of the use of IV N-acetylcysteine in the management of non-acetaminophen-induced acute liver failure have evaluated various dosing regimens. The only randomized trial studying this application described a 72-hour regimen. However, observational studies have reported extended duration until normalization of international normalized ratio. This study seeks to compare differences in patient outcomes based on IV N-acetylcysteine duration. DESIGN: Retrospective cohort study. SETTING: Medical ICU at a large quaternary care academic medical institution and liver transplant center. PATIENTS: Adult patients admitted to the medical ICU who received IV N-acetylcysteine for the treatment of non-acetaminophen-induced acute liver failure. INTERVENTIONS: Patients were divided into cohorts based on duration; standard duration of IV N-acetylcysteine was considered 72 hours, whereas extended duration was defined as continuation beyond 72 hours. MEASUREMENTS AND MAIN RESULTS: The primary outcome was time to normalization of international normalized ratio to less than 1.3 or less than 1.5; secondary outcomes included all-cause mortality and transplant-free survival at 3 weeks. In total, 53 patients were included: 40 in the standard duration cohort and 13 in the extended duration. There were no major differences in baseline characteristics. There was no significant difference in time to international normalized ratio normalization between cohorts. Transplant-free survival was higher with extended duration (76.9% extended vs 41.4% standard; p = 0.03). All-cause mortality at 3 weeks was numerically lower in the extended duration group (0% extended vs 24.1% standard; p = 0.08). 
CONCLUSIONS: Patients with non-acetaminophen-induced acute liver failure who received extended-duration N-acetylcysteine had significantly higher transplant-free survival than patients who received the standard duration, although there was no significant difference in time to normalization of international normalized ratio or overall survival. A prospective, randomized, multicenter study is warranted to identify subpopulations of patients with non-acetaminophen-induced acute liver failure who could benefit from an extended treatment duration as a bridge to transplant or spontaneous recovery.

17.
J Pharm Pract ; 32(3): 327-338, 2019 Jun.
Article in English | MEDLINE | ID: mdl-30808257

ABSTRACT

Treatment of suspected infections in critically ill patients requires the timely initiation of appropriate antimicrobials and rapid de-escalation of unnecessary broad-spectrum coverage. New advances in rapid diagnostic tests can now offer earlier detection of pathogens and potential resistance mechanisms within hours of initial culture growth. These technologies, combined with pharmacist antimicrobial stewardship efforts, may shorten the time to adequate coverage or allow earlier de-escalation of unnecessary broad-spectrum antimicrobials, which could improve patient outcomes and lower overall treatment costs. Furthermore, de-escalation of antimicrobials may lead to decreased emergence of resistant organisms and fewer antimicrobial-associated adverse events. Clinical pharmacists should be aware of new rapid diagnostic tests, including their applications, clinical evidence, and limitations, in order to implement the most appropriate clinical treatment strategy when patients have positive cultures. This review will focus on commercially available rapid diagnostic tests for infections routinely encountered in critically ill patients, including gram-positive and gram-negative bacterial bloodstream infections, Candida, and Clostridioides difficile.


Subject(s)
Communicable Diseases/diagnosis , Communicable Diseases/drug therapy , Diagnostic Tests, Routine/methods , Diagnostic Tests, Routine/standards , Intensive Care Units/standards , Anti-Bacterial Agents/therapeutic use , Antimicrobial Stewardship , Bacteremia/diagnosis , Critical Illness , Cross Infection/diagnosis , Humans , Microbial Sensitivity Tests
18.
Am J Crit Care ; 28(5): 377-384, 2019 09.
Article in English | MEDLINE | ID: mdl-31474608

ABSTRACT

BACKGROUND: Despite a lack of data from intensive care patients, bispectral index monitors are often used to measure the depth of sedation in critically ill patients with acute respiratory distress syndrome (ARDS) who require continuous neuromuscular blocking agents. OBJECTIVE: To evaluate differences in the effectiveness and safety of monitoring sedation by using bispectral index or traditional methods in patients with ARDS who are receiving continuous neuromuscular blocking agents. METHODS: This noninterventional, single-center, retrospective cohort study included adult patients with ARDS who received a neuromuscular blocking agent. Daily sedation and analgesia while a neuromuscular blocking agent was being administered were compared between patients with and patients without orders for titration based on bispectral index values. Clinical outcomes also were evaluated. RESULTS: Overall, sedation and analgesia did not differ between patients with and patients without titration based on bispectral index. Compared with patients without such titration, patients with bispectral index-based titration experienced more dose adjustments for the sedation agent (median [interquartile range], 7 [4-11] vs 1 [0-5], respectively; P < .001) and the analgesic (1 [0-2] vs 0 [0-1], respectively; P = .003) during the first 24 hours of neuromuscular blockade, but this was not associated with any difference in clinical outcomes. CONCLUSIONS: Titration based on bispectral index did not result in a significant difference in sedation or analgesia exposure, or in clinical outcomes, compared with traditional sedation monitoring in patients with ARDS who were receiving a neuromuscular blocking agent, despite more dose adjustments during the first 24 hours of neuromuscular blockade.


Subject(s)
Conscious Sedation/methods , Critical Care/methods , Electroencephalography/methods , Neuromuscular Blockade/methods , Pain/drug therapy , Respiratory Distress Syndrome/complications , Adult , Cohort Studies , Critical Illness , Female , Humans , Male , Middle Aged , Neuromuscular Blocking Agents/administration & dosage , Retrospective Studies
19.
World J Hepatol ; 11(4): 379-390, 2019 Apr 27.
Article in English | MEDLINE | ID: mdl-31114642

ABSTRACT

BACKGROUND: Patients with liver disease are concomitantly at increased risk of venous thromboembolism (VTE) and bleeding events due to changes in the balance of pro- and anti-hemostatic substances. As such, recommendations for the use of pharmacological VTE prophylaxis are lacking. Recent studies have found no difference in rates of VTE between those receiving and not receiving pharmacological VTE prophylaxis, though most studies have been small. Thus, our study sought to establish whether pharmacological VTE prophylaxis is effective and safe in patients with liver disease. AIM: To determine whether there is net clinical benefit to providing pharmacological VTE prophylaxis to cirrhotic patients. METHODS: In this retrospective study, 1806 patients were propensity matched to assess whether pharmacological VTE prophylaxis is effective and safe in patients with cirrhosis. Patients were divided and evaluated based on receipt of pharmacological VTE prophylaxis. RESULTS: The composite primary outcome of VTE or major bleeding was more common in the no-prophylaxis group than in the prophylaxis group (8.7% vs 5.1%, P = 0.002), though this outcome was driven by higher rates of major bleeding (6.9% vs 2.9%, P < 0.001) rather than VTE (1.9% vs 2.2%, P = 0.62). There was no difference in length of stay or in-hospital mortality between groups. Pharmacological VTE prophylaxis was independently associated with lower rates of major bleeding (OR = 0.42, 95%CI: 0.25-0.68, P = 0.0005), but was not protective against VTE on multivariable analysis. CONCLUSION: Pharmacological VTE prophylaxis was not associated with a significant reduction in the rate of VTE in patients with liver disease, though no increase in major bleeding events was observed.

20.
Behav Brain Res ; 186(1): 107-17, 2008 Jan 10.
Article in English | MEDLINE | ID: mdl-17854920

ABSTRACT

The zebrafish has been an excellent model organism for developmental biology and genetics. Studying its behavior will add to the already strong knowledge of its biology and will strengthen the use of this species in behavior genetics and neuroscience. Anxiety is one of the most problematic human psychiatric conditions. Arguably, it arises as a result of abnormally exaggerated natural fear responses. The zebrafish may be an appropriate model to investigate the biology of fear and anxiety. Fear responses are expressed by animals when exposed to predators, and these responses can be learned or innate. Here we investigated whether zebrafish respond differentially to a natural predator or other fish species upon their first exposure to these fish. Naïve zebrafish were shown four species of fish chosen based on predatory status (predatory or harmless) and geographical origin (allopatric or sympatric). Our results suggest that naïve zebrafish respond differentially to the stimulus fish. Particularly interesting is the antipredatory response elicited by the zebrafish's sympatric predator, the Indian Leaf Fish, and the fact that this latter species exhibited almost no predatory attacks. These findings open a new avenue of research into what zebrafish perceive as "dangerous" or fear inducing. They will also allow us to develop fear- and anxiety-related behavioral test methods with which the contribution of genes to these behaviors, or the effects of novel anxiolytic substances on them, may be analyzed.


Subject(s)
Avoidance Learning/physiology , Discrimination Learning/physiology , Pattern Recognition, Visual/physiology , Predatory Behavior , Quantitative Trait, Heritable , Analysis of Variance , Animal Population Groups , Animals , Recognition, Psychology , Species Specificity , Statistics, Nonparametric , Zebrafish