Results 1 - 20 of 21
1.
Br J Anaesth ; 132(4): 685-694, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38242802

ABSTRACT

BACKGROUND: The peripheral perfusion index is the ratio of pulsatile to nonpulsatile static blood flow obtained by photoplethysmography and reflects peripheral tissue perfusion. We investigated the association between intraoperative perfusion index and postoperative acute kidney injury in patients undergoing major noncardiac surgery and receiving continuous vasopressor infusions. METHODS: In this exploratory post hoc analysis of a pragmatic, cluster-randomised, multicentre trial, we obtained areas and cumulative times under various thresholds of perfusion index and investigated their association with acute kidney injury in multivariable logistic regression analyses. In secondary analyses, we investigated the association of time-weighted average perfusion index with acute kidney injury. The 30-day mortality was a secondary outcome. RESULTS: Of 2534 cases included, 8.9% developed postoperative acute kidney injury. Areas and cumulative times under a perfusion index of 3% and 2% were associated with an increased risk of acute kidney injury; the strongest association was observed for area under a perfusion index of 1% (adjusted odds ratio [aOR] 1.32, 95% confidence interval [CI] 1.00-1.74, P=0.050, per 100%∗min increase). Additionally, time-weighted average perfusion index was associated with acute kidney injury (aOR 0.82, 95% CI 0.74-0.91, P<0.001) and 30-day mortality (aOR 0.68, 95% CI 0.49-0.95, P=0.024). CONCLUSIONS: Larger areas and longer cumulative times under thresholds of perfusion index and lower time-weighted average perfusion index were associated with postoperative acute kidney injury in patients undergoing major noncardiac surgery and receiving continuous vasopressor infusions. CLINICAL TRIAL REGISTRATION: NCT04789330.


Subject(s)
Acute Kidney Injury, Hypotension, Humans, Postoperative Complications/etiology, Perfusion Index, Retrospective Studies, Acute Kidney Injury/etiology, Risk Factors, Hypotension/complications
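The exposure metrics described in this abstract (area under a perfusion-index threshold, cumulative time under the threshold, and time-weighted average) can all be computed from a sampled intraoperative perfusion-index trace. The sketch below is a minimal illustration of those three quantities, assuming a series of (time, PI) samples; the function name, the trapezoidal handling, and the 1-minute example data are hypothetical and are not taken from the study.

```python
import numpy as np

def pi_exposure_metrics(t_min: np.ndarray, pi: np.ndarray, threshold: float):
    """Area under threshold (%*min), cumulative time under threshold (min),
    and time-weighted average PI for a sampled perfusion-index trace."""
    dt = np.diff(t_min)                                 # minutes between samples
    pi_mid = (pi[:-1] + pi[1:]) / 2                     # interval mid-values
    deficit = np.clip(threshold - pi_mid, 0, None)      # depth below threshold
    area_under = float(np.sum(deficit * dt))            # %*min below threshold
    time_under = float(np.sum(dt[pi_mid < threshold]))  # minutes below threshold
    twa = float(np.sum(pi_mid * dt) / np.sum(dt))       # time-weighted average PI
    return area_under, time_under, twa

# Hypothetical 10-minute trace sampled once per minute
t = np.arange(0, 11, 1.0)
pi = np.array([2.5, 1.8, 0.9, 0.7, 1.2, 2.0, 3.1, 2.8, 1.5, 0.8, 1.0])
print(pi_exposure_metrics(t, pi, threshold=1.0))
```

In the study these per-case summaries were then entered into multivariable logistic regression models; the code above only covers the exposure calculation.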
2.
Transplantation ; 108(2): 483-490, 2024 Feb 01.
Article in English | MEDLINE | ID: mdl-38259180

ABSTRACT

BACKGROUND: Improper opioid prescription after surgery is a well-documented iatrogenic contributor to the current opioid epidemic in North America. In fact, opioids are known to be overprescribed to liver transplant patients, and liver transplant patients with high doses or prolonged postsurgical opioid use have higher risks of graft failure and death. METHODS: This is a retrospective cohort study of 552 opioid-naive patients undergoing liver transplant at an academic center between 2012 and 2019. The primary outcome was the discrepancy between the prescribed discharge opioid daily dose and each patient's own inpatient opioid consumption 24 h before discharge. Variables were analyzed with Wilcoxon and chi-square tests and logistic regression. RESULTS: Opioids were overprescribed in 65.9% of patients, and 54.3% of patients who required no opioids the day before discharge were discharged with opioid prescriptions. In contrast, opioids were underprescribed in 13.4% of patients, among whom 27.0% consumed inpatient opioids but received no discharge opioid prescription. The median prescribed opioid daily dose was 333.3% and 56.3% of the median inpatient opioid daily dose in opioid overprescribed and underprescribed patients, respectively. Importantly, opioid underprescribed patients had higher rates of opioid refill 1 to 30 and 31 to 90 d after discharge, and the rate of opioid underprescription more than doubled from 2016 to 2019. CONCLUSIONS: Opioids are both over- and underprescribed to liver transplant patients, and opioid underprescribed patients had higher rates of opioid refill. Therefore, we proposed to prescribe discharge opioid prescriptions based on liver transplant patients' inpatient opioid consumption to provide patient-centered opioid prescriptions.


Subject(s)
Liver Transplantation, Transplants, Humans, Liver Transplantation/adverse effects, Opioid Analgesics/adverse effects, Retrospective Studies, Prescriptions
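The primary outcome in this study is a per-patient comparison of the prescribed discharge opioid daily dose against inpatient consumption in the 24 h before discharge. A minimal sketch of that classification logic follows; the morphine-milligram-equivalent inputs and the zero-dose handling are illustrative assumptions, not the authors' exact definitions.

```python
def prescription_ratio(prescribed_daily_mme: float, inpatient_mme_24h: float):
    """Prescribed discharge daily dose as a percentage of inpatient consumption
    in the 24 h before discharge (both in morphine milligram equivalents).
    Returns None when the patient used no inpatient opioids."""
    if inpatient_mme_24h == 0:
        return None
    return 100.0 * prescribed_daily_mme / inpatient_mme_24h

def classify(prescribed_daily_mme: float, inpatient_mme_24h: float) -> str:
    """Label the discrepancy between discharge prescription and inpatient use."""
    if prescribed_daily_mme > inpatient_mme_24h:
        return "overprescribed"
    if prescribed_daily_mme < inpatient_mme_24h:
        return "underprescribed"
    return "matched"

# Hypothetical patients (doses in morphine milligram equivalents)
for rx, used in [(30.0, 0.0), (22.5, 40.0), (45.0, 45.0)]:
    print(classify(rx, used), prescription_ratio(rx, used))
```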
3.
BMJ Open ; 13(11): e078713, 2023 11 19.
Article in English | MEDLINE | ID: mdl-37984940

ABSTRACT

INTRODUCTION: Catecholamine vasopressors such as norepinephrine are the standard drugs used to maintain mean arterial pressure during liver transplantation. At high doses, catecholamines may impair organ perfusion. Angiotensin II is a peptide vasoconstrictor that may improve renal perfusion pressure and glomerular filtration rate, a haemodynamic profile that could reduce acute kidney injury. Angiotensin II is approved for vasodilatory shock but has not been rigorously evaluated for treatment of hypotension during liver transplantation. The objective is to assess the efficacy of angiotensin II as a second-line vasopressor infusion during liver transplantation. This trial will establish the efficacy of angiotensin II in decreasing the dose of norepinephrine to maintain adequate blood pressure. Completion of this study will allow design of a follow-up, multicentre trial powered to detect a reduction of organ injury in liver transplantation. METHODS AND ANALYSIS: This is a double-blind, randomised clinical trial. Eligible subjects are adults with a Model for End-Stage Liver Disease Sodium Score ≥25 undergoing deceased donor liver transplantation. Subjects are randomised 1:1 to receive angiotensin II or saline placebo as the second-line vasopressor infusion. The study drug infusion is initiated on reaching a norepinephrine dose of 0.05 µg kg-1 min-1 and titrated per protocol. The primary outcome is the dose of norepinephrine required to maintain a mean arterial pressure ≥65 mm Hg. Secondary outcomes include vasopressin or epinephrine requirement and duration of hypotension. Safety outcomes include incidence of thromboembolism within 48 hours of the end of surgery and severe hypertension. An intention-to-treat analysis will be performed for all randomised subjects receiving the study drug. The total dose of norepinephrine will be compared between the two arms by a one-tailed Mann-Whitney U test. ETHICS AND DISSEMINATION: The trial protocol was approved by the local Institutional Review Board (#20-30948). Results will be posted on ClinicalTrials.gov and published in a peer-reviewed journal. TRIAL REGISTRATION NUMBER: ClinicalTrials.govNCT04901169.


Subject(s)
End Stage Liver Disease, Hypotension, Liver Transplantation, Adult, Humans, Angiotensin II/therapeutic use, Severity of Illness Index, Living Donors, Vasoconstrictor Agents/therapeutic use, Hypotension/drug therapy, Norepinephrine/therapeutic use, Double-Blind Method, Catecholamines/therapeutic use, Randomized Controlled Trials as Topic, Multicenter Studies as Topic
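The planned primary analysis compares total norepinephrine dose between arms with a one-tailed Mann-Whitney U test. A sketch of that comparison with scipy is shown below; the dose values are simulated placeholders, and the choice of alternative="less" (angiotensin II arm expected to require less norepinephrine) is an assumption about the direction of the one-tailed test.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
# Simulated total intraoperative norepinephrine dose (µg/kg) per subject
angiotensin_arm = rng.gamma(shape=2.0, scale=15.0, size=25)
placebo_arm = rng.gamma(shape=2.0, scale=25.0, size=25)

# One-tailed test: is the dose in the angiotensin II arm stochastically smaller?
stat, p = mannwhitneyu(angiotensin_arm, placebo_arm, alternative="less")
print(f"U = {stat:.1f}, one-tailed p = {p:.4f}")
```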
4.
Clin Transplant ; 37(10): e15049, 2023 10.
Article in English | MEDLINE | ID: mdl-37329290

ABSTRACT

BACKGROUND: Outcome data for the great majority of liver normothermic machine perfusion (NMP) cases derive from the strict confines of clinical trials. Detailed specifics regarding the intraoperative and early postoperative impact of NMP on reperfusion injury and its sequelae during real-world use of this emerging technology remain largely unavailable. METHODS: We analyzed transplants performed in a 3-month pilot period during which surgeons invoked commercial NMP at their discretion. Living donor, multi-organ, and hypothermic machine perfusion transplants were excluded. RESULTS: Intraoperatively, NMP (n = 24) compared to static cold storage (n = 25) recipients required less peri-reperfusion bolus epinephrine (0 vs. 60 µg; p < .001) and post-reperfusion fresh frozen plasma (2.5 vs. 7.0 units; p = .0069), platelets (.0 vs. 2.0 units; p = .042), and hemostatic agents (0% vs. 24%; p = .010). Time from incision to venous reperfusion did not differ (3.6 vs. 3.1; p = .095) but time from venous reperfusion to surgery end was shorter for NMP recipients (2.3 vs. 2.8 h; p = .0045). Postoperatively, NMP recipients required fewer red blood cell (1.0 vs. 4.0 units; p = .0083) and fresh frozen plasma (4.0 vs. 7.0 units; p = .046) transfusions, had shorter intensive care unit stays (33.5 vs. 58.4 h; p = .012), and experienced less early allograft dysfunction according to both the Model for Early Allograft Function Score (3.4 vs. 5.0; p = .0047) and peak AST within 10 days of transplant (619 vs. 1,181 U/L; p = .036). Liver acceptance for the corresponding recipient was conditional on NMP use for 63% (15/24) of cases. CONCLUSION: Real-world NMP use was associated with significantly lower intensity of reperfusion injury and intraoperative and postoperative care that may translate into patient benefit.


Subject(s)
Liver Transplantation, Reperfusion Injury, Humans, Organ Preservation, Liver, Perfusion
5.
Clin Transplant ; 37(10): e15057, 2023 10.
Article in English | MEDLINE | ID: mdl-37350743

ABSTRACT

BACKGROUND: The post-operative course after Liver Transplantation (LT) can be complicated by early allograft dysfunction (EAD), primary nonfunction (PNF) and death. A lactate concentration at the end of transplant of ≥5 mmol/L was recently proposed as a predictive marker of PNF, EAD, and mortality; this study aimed to validate these previous reports in a large single center cohort. METHODS: This retrospective cohort study included adult liver transplant recipients who received grafts from deceased donors at our center between June 2012 and May 2021. Receiver operating characteristic (ROC) curves for the lactate concentration at the end of transplantation were computed to determine the AUC for PNF, EAD and mortality at 90 days. RESULTS: In our cohort of 1137 cases, the AUCs for lactate to predict EAD, PNF and mortality were respectively .56 (95% confidence interval [CI]: .53-.60), .69 (95% CI: .52-.85), and .74 (95% CI: .63-.84). CONCLUSION: The clinical value of lactate concentration at the end of transplantation to predict PNF, EAD and mortality at 90 days was, at best, modest, as shown by the relatively low AUCs. Our findings cannot validate previous reports that the lactate level alone is a good predictor of poor outcomes after liver transplantation.


Subject(s)
Liver Transplantation, Primary Graft Dysfunction, Adult, Humans, Liver Transplantation/adverse effects, Lactic Acid, Retrospective Studies, Graft Survival, Homologous Transplantation, Allografts, Primary Graft Dysfunction/etiology, Risk Factors
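The validation analysis in this study reduces to computing ROC curves and AUCs for a single continuous predictor (end-of-transplant lactate) against binary outcomes. The sketch below shows that computation with scikit-learn plus a simple percentile-bootstrap confidence interval; the data are simulated and the bootstrap approach is an assumption for illustration, not necessarily the interval method used in the paper.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 1000
ead = rng.binomial(1, 0.25, n)                            # simulated binary outcome
lactate = rng.normal(3.0 + 0.8 * ead, 1.5, n).clip(0.5)   # weakly related predictor

auc = roc_auc_score(ead, lactate)

# Percentile bootstrap CI for the AUC
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    if ead[idx].min() == ead[idx].max():   # skip degenerate resamples
        continue
    boot.append(roc_auc_score(ead[idx], lactate[idx]))
print(f"AUC {auc:.2f} (95% CI {np.percentile(boot, 2.5):.2f}-{np.percentile(boot, 97.5):.2f})")
```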
6.
Br J Anaesth ; 130(5): 519-527, 2023 05.
Article in English | MEDLINE | ID: mdl-36925330

ABSTRACT

BACKGROUND: Intraoperative hypotension is associated with postoperative complications. The use of vasopressors is often required to correct hypotension but the best vasopressor is unknown. METHODS: A multicentre, cluster-randomised, crossover, feasibility and pilot trial was conducted across five hospitals in California. Phenylephrine (PE) vs norepinephrine (NE) infusion as the first-line vasopressor in patients under general anaesthesia alternated monthly at each hospital for 6 months. The primary endpoint was first-line vasopressor administration compliance of 80% or higher. Secondary endpoints were acute kidney injury (AKI), 30-day mortality, myocardial injury after noncardiac surgery (MINS), hospital length of stay, and rehospitalisation within 30 days. RESULTS: A total of 3626 patients were enrolled over 6 months; 1809 patients were randomised in the NE group, 1817 in the PE group. Overall, 88.2% received the assigned first-line vasopressor. No drug infiltrations requiring treatment were reported in either group. Patients were median 63 yr old, 50% female, and 58% white. Randomisation in the NE group vs PE group did not reduce readmission within 30 days (adjusted odds ratio=0.92; 95% confidence interval, 0.6-1.39), 30-day mortality (1.01; 0.48-2.09), AKI (1.1; 0.92-1.31), or MINS (1.63; 0.84-3.16). CONCLUSIONS: A large and diverse population undergoing major surgery under general anaesthesia was successfully enrolled and randomised to receive NE or PE infusion. This pilot and feasibility trial was not powered for adverse postoperative outcomes and a follow-up multicentre effectiveness trial is planned. CLINICAL TRIAL REGISTRATION: NCT04789330 (ClinicalTrials.gov).


Subject(s)
Acute Kidney Injury, Hypotension, Humans, Adult, Female, Male, Phenylephrine, Norepinephrine/therapeutic use, Pilot Projects, Feasibility Studies, Treatment Outcome, Hypotension/drug therapy, Hypotension/etiology, Vasoconstrictor Agents/therapeutic use, General Anesthesia/adverse effects
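The secondary endpoints above are reported as adjusted odds ratios with 95% confidence intervals. A generic sketch of how such estimates come out of a logistic regression is shown below on simulated data; the covariates are placeholders and this does not reproduce the trial's actual adjustment set or its handling of clustering by hospital.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 3626
df = pd.DataFrame({
    "norepi": rng.binomial(1, 0.5, n),        # 1 = NE group, 0 = PE group
    "age": rng.normal(63, 12, n),
    "female": rng.binomial(1, 0.5, n),
})
logit_p = -3.0 + 0.05 * df["norepi"] + 0.02 * (df["age"] - 63)
df["aki"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Adjusted odds ratio for the randomised group, controlling for age and sex
fit = smf.logit("aki ~ norepi + age + female", data=df).fit(disp=False)
or_ci = np.exp(fit.conf_int().loc["norepi"])
print(f"adjusted OR {np.exp(fit.params['norepi']):.2f} "
      f"(95% CI {or_ci[0]:.2f}-{or_ci[1]:.2f})")
```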
7.
JMIR Perioper Med ; 5(1): e40831, 2022 Dec 08.
Article in English | MEDLINE | ID: mdl-36480254

ABSTRACT

BACKGROUND: Inhaled anesthetics in the operating room are potent greenhouse gases and are a key contributor to carbon emissions from health care facilities. Real-time clinical decision support (CDS) systems lower anesthetic gas waste by prompting anesthesia professionals to reduce fresh gas flow (FGF) when a set threshold is exceeded. However, previous CDS systems have relied on proprietary or highly customized anesthesia information management systems, significantly reducing other institutions' accessibility to the technology and thus limiting overall environmental benefit. OBJECTIVE: In 2018, a CDS system that lowers anesthetic gas waste using methods that can be easily adopted by other institutions was developed at the University of California San Francisco (UCSF). This study aims to facilitate wider uptake of our CDS system and further reduce gas waste by describing the implementation of the FGF CDS toolkit at UCSF and the subsequent implementation at other medical campuses within the University of California Health network. METHODS: We developed a noninterruptive active CDS system to alert anesthesia professionals when FGF rates exceeded 0.7 L per minute for common volatile anesthetics. The implementation process at UCSF was documented and assembled into an informational toolkit to aid in the integration of the CDS system at other health care institutions. Before implementation, presentation-based education initiatives were used to disseminate information regarding the safety of low FGF use and its relationship to environmental sustainability. Our FGF CDS toolkit consisted of 4 main components for implementation: sustainability-focused education of anesthesia professionals, hardware integration of the CDS technology, software build of the CDS system, and data reporting of measured outcomes. RESULTS: The FGF CDS system was successfully deployed at 5 University of California Health network campuses. Four of the institutions are independent from the institution that created the CDS system. The CDS system was deployed at each facility using the FGF CDS toolkit, which describes the main components of the technology and implementation. Each campus made modifications to the CDS tool to best suit their institution, emphasizing the versatility and adoptability of the technology and implementation framework. CONCLUSIONS: It has previously been shown that the FGF CDS system reduces anesthetic gas waste, leading to environmental and fiscal benefits. Here, we demonstrate that the CDS system can be transferred to other medical facilities using our toolkit for implementation, making the technology and associated benefits globally accessible to advance mitigation of health care-related emissions.
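The core of the CDS logic described above is a noninterruptive alert that fires when fresh gas flow exceeds a threshold (0.7 L/min) while a volatile anesthetic is in use. The sketch below illustrates that rule in the simplest possible form; the data structure, agent list, and sample stream are hypothetical and are not drawn from the UCSF implementation.

```python
from dataclasses import dataclass

FGF_THRESHOLD_L_MIN = 0.7
VOLATILE_AGENTS = {"sevoflurane", "isoflurane", "desflurane"}  # assumed agent list

@dataclass
class GasSample:
    agent: str          # volatile agent currently delivered ("" if none)
    fgf_l_min: float    # total fresh gas flow in L/min

def should_alert(sample: GasSample) -> bool:
    """Noninterruptive reminder: volatile agent running with FGF above threshold."""
    return sample.agent in VOLATILE_AGENTS and sample.fgf_l_min > FGF_THRESHOLD_L_MIN

# Hypothetical stream of anesthesia-machine samples
stream = [GasSample("sevoflurane", 2.0), GasSample("sevoflurane", 0.6), GasSample("", 6.0)]
for s in stream:
    if should_alert(s):
        print(f"Consider reducing fresh gas flow "
              f"({s.fgf_l_min:.1f} L/min > {FGF_THRESHOLD_L_MIN} L/min)")
```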

8.
Anaesth Crit Care Pain Med ; 41(5): 101126, 2022 10.
Article in English | MEDLINE | ID: mdl-35811037

ABSTRACT

BACKGROUND: The field of machine learning is being employed more and more in medicine. However, studies have shown that the quality of published studies frequently lacks completeness and adherence to published reporting guidelines. This assessment has not been done in the subspecialty of anesthesiology. METHODS: We appraised the quality of reporting of a convenience sample of 67 peer-reviewed publications sourced from the scoping review by Hashimoto et al. Each publication was appraised on the presence of reporting elements (reporting compliance) selected from 4 peer-reviewed guidelines for reporting on machine learning studies. Results are described in several cross sections, including by section of manuscript (e.g. abstract, introduction, etc.), year of publication, impact factor of journal, and impact of publication. RESULTS: On average, reporting compliance was 64% ± 13%. There was marked heterogeneity of reporting based on section of manuscript. There was a mild trend towards increased quality of reporting with increasing impact factor of journal of publication and increasing average number of citations per year since publication, and no trend regarding recency of publication. CONCLUSION: The quality of reporting of machine learning studies in anesthesiology is on par with other fields, but can benefit from improvement, especially in presenting methodology, results, and discussion points, including interpretation of models and pitfalls therein. Clinicians in today's learning health systems will benefit from skills in appraisal of evidence. Several reporting guidelines have been released, and updates to mainstream guidelines are under development, which we hope will usher in improvement in reporting quality.


Subject(s)
Anesthesiology, Anesthesiology/methods, Cohort Studies, Humans, Machine Learning, Research Design
9.
Br J Anaesth ; 129(3): 317-326, 2022 09.
Article in English | MEDLINE | ID: mdl-35688657

ABSTRACT

BACKGROUND: Practice patterns related to intraoperative fluid administration and vasopressor use have potentially evolved over recent years. However, the extent of such changes and their association with perioperative outcomes, such as the development of acute kidney injury (AKI), have not been studied. METHODS: We performed a retrospective analysis of major abdominal surgeries in adults across 26 US hospitals between 2015 and 2019. The primary outcome was AKI as defined by the Kidney Disease Improving Global Outcomes definition (KDIGO) using only serum creatinine criteria. Univariable linear predictive additive models were used to describe the dose-dependent risk of AKI given fluid administration or vasopressor use. RESULTS: Over the study period, we observed a decrease in the volume of crystalloid administered, a decrease in the proportion of patients receiving more than 10 ml kg-1 h-1 of crystalloid, an increase in the amount of norepinephrine equivalents administered, and a decreased duration of hypotension. The incidence of AKI increased between 2016 and 2019. An increase of crystalloid administration from 1 to 10 ml kg-1 h-1 was associated with a 58% decreased risk of AKI. CONCLUSIONS: Despite decreased duration of hypotension during the study period, decreased fluid administration and increased vasopressor use were associated with increased incidence of AKI. Crystalloid administration below 10 ml kg-1 h-1 was associated with an increased risk of AKI. Although no causality can be concluded, these data suggest that prevention and treatment of hypotension during abdominal surgery with liberal use of vasopressors at the expense of fluid administration is associated with an increased risk of postoperative AKI.


Subject(s)
Acute Kidney Injury, Hypotension, Abdomen/surgery, Acute Kidney Injury/epidemiology, Acute Kidney Injury/etiology, Acute Kidney Injury/therapy, Adult, Crystalloid Solutions, Humans, Hypotension/complications, Postoperative Complications/epidemiology, Postoperative Complications/etiology, Postoperative Complications/therapy, Retrospective Studies, Risk Factors, Vasoconstrictor Agents/therapeutic use
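The dose-dependent AKI risk curves described here come from univariable additive models of AKI on crystalloid rate or vasopressor dose. A minimal sketch of one such fit is below, using a logistic GLM with a B-spline basis (statsmodels/patsy) on simulated data; this is an illustrative stand-in, not the authors' exact model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 5000
rate = rng.uniform(0.5, 15.0, n)   # simulated crystalloid rate, ml/kg/h
# Simulated "true" risk: falls as rate rises toward 10 ml/kg/h, then flattens
risk = 1 / (1 + np.exp(-(-1.5 - 0.15 * (rate - 1) + 0.02 * (rate - 10).clip(0) ** 2)))
df = pd.DataFrame({"rate": rate, "aki": rng.binomial(1, risk)})

# Univariable additive fit: AKI probability as a smooth function of rate
fit = smf.glm("aki ~ bs(rate, df=4)", data=df, family=sm.families.Binomial()).fit()
grid = pd.DataFrame({"rate": np.linspace(1, 14, 6)})
print(grid.assign(predicted_risk=fit.predict(grid)).round(3))
```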
10.
Clin Transplant ; 36(2): e14528, 2022 02.
Article in English | MEDLINE | ID: mdl-34739731

ABSTRACT

BACKGROUND: Delayed graft function (DGF) after kidney transplantation is a common occurrence and correlates with poor graft and patient outcomes. Donor characteristics and care are known to impact DGF. We attempted to show the relationship between achievement of specific donor management goals (DMG) and DGF. METHODS: This is a retrospective case-control study using data from 14 046 adult kidney donations after brain death from hospitals in 18 organ procurement organizations (OPOs) which were transplanted to adult recipients between 2012 and 2018. Data on DMG compliance and donor, recipient, and ischemia-related factors were used to create multivariable logistic regression models. RESULTS: The overall rate of DGF was 29.4%. Meeting DMGs for urine output and vasopressor use were associated with decreased risk of DGF. Sensitivity analyses performed with different imputation methods, omitting recipient factors, and analyzing multiple time points yielded largely consistent results. CONCLUSIONS: The development of DMGs continues to show promise in improving outcomes in the kidney transplant recipient population. Studies have already shown increased kidney utilization in smaller cohorts, as well as other organs, and shown decreased rates of DGF. Additional research and analysis are required to assess interactions between meeting DMGs and correlation versus causality in DMGs and DGF.


Subject(s)
Delayed Graft Function, Kidney Transplantation, Adult, Case-Control Studies, Delayed Graft Function/epidemiology, Delayed Graft Function/etiology, Goals, Graft Survival, Humans, Kidney Transplantation/adverse effects, Retrospective Studies, Risk Factors, Tissue Donors
11.
Clin Transplant ; 36(3): e14539, 2022 03.
Article in English | MEDLINE | ID: mdl-34791697

ABSTRACT

BACKGROUND: Most patients are listed for liver transplant (LT) following extensive workup as outpatients ("conventional evaluation"). Some patients undergo urgent evaluation as inpatients after being transferred to a transplant center ("expedited evaluation"). We hypothesized that expedited patients would have inferior survival due to disease severity at the time of transplant and shorter workup time. METHODS: Patients who underwent evaluation for LT at our institution between 2012 and 2016 were retrospectively reviewed. The expedited and conventional cohorts were defined as above. Living donor LT recipients, combined liver-kidney recipients, acute liver failure patients, and re-transplant patients were excluded. We compared patient characteristics and overall survival between patients who received a transplant following expedited evaluation and those who did not, and between LT recipients based on expedited or conventional evaluation. RESULTS: Five-hundred and nine patients were included (110 expedited, 399 conventional). There was no difference in graft or patient survival at 1 year for expedited versus conventional LT recipients. In multivariable analysis of overall survival, only Donor Risk Index (HR 1.97, CI 1.04-3.73, P = .037, per unit increase) was associated with increased risk of death. CONCLUSIONS: Patients who underwent expedited evaluation for LT had significant demographic and clinical differences from patients who underwent conventional evaluation, but comparable post-transplant survival.


Subject(s)
Liver Transplantation, Graft Survival, Humans, Liver Transplantation/adverse effects, Living Donors, Retrospective Studies, Risk Factors, Transplant Recipients, Treatment Outcome
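The hazard ratio reported for overall survival implies a Cox proportional hazards model with Donor Risk Index as a continuous covariate. A sketch of such a fit with the lifelines package on simulated data is below; the covariates, event rates, and follow-up times are placeholders and this is not the authors' model.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 509
dri = rng.normal(1.4, 0.4, n)                     # simulated Donor Risk Index
expedited = rng.binomial(1, 110 / 509, n)         # simulated evaluation pathway
time = rng.exponential(scale=60 / np.exp(0.7 * (dri - 1.4)), size=n)  # months
event = rng.binomial(1, 0.25, n)                  # 1 = death observed

df = pd.DataFrame({"time": time, "event": event, "dri": dri, "expedited": expedited})
# All remaining columns (dri, expedited) enter the model as covariates
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
cph.print_summary()   # hazard ratios with 95% confidence intervals
```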
12.
Transplant Direct ; 7(10): e771, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34604507

ABSTRACT

Early prediction of whether a liver allograft will be utilized for transplantation may allow better resource deployment during donor management and improve organ allocation. The national donor management goals (DMG) registry contains critical care data collected during donor management. We developed a machine learning model to predict transplantation of a liver graft based on data from the DMG registry. METHODS: Several machine learning classifiers were trained to predict transplantation of a liver graft. We utilized 127 variables available in the DMG dataset. We included data from potential deceased organ donors between April 2012 and January 2019. The outcome was defined as liver recovery for transplantation in the operating room. The prediction was made based on data available 12-18 h after the time of authorization for transplantation. The data were randomly separated into training (60%), validation (20%), and test sets (20%). We compared the performance of our models to the Liver Discard Risk Index. RESULTS: Of 13 629 donors in the dataset, 9255 (68%) livers were recovered and transplanted, 1519 recovered but used for research or discarded, 2855 were not recovered. The optimized gradient boosting machine classifier achieved an area under the curve of the receiver operator characteristic of 0.84 on the test set, outperforming all other classifiers. CONCLUSIONS: This model predicts successful liver recovery for transplantation in the operating room, using data available early during donor management. It performs favorably when compared to existing models. It may provide real-time decision support during organ donor management and transplant logistics.
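The abstract describes training several classifiers on DMG registry variables with a 60/20/20 train/validation/test split and evaluating AUROC on the held-out test set. The sketch below reproduces that workflow pattern on synthetic data with scikit-learn's gradient boosting; the feature set, class balance, and hyperparameters are placeholders and do not reflect the registry or the study's tuned model.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the DMG donor-management features
X, y = make_classification(n_samples=13629, n_features=127, n_informative=20,
                           weights=[0.32, 0.68], random_state=0)

# 60/20/20 train/validation/test split
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

model = GradientBoostingClassifier(n_estimators=300, learning_rate=0.05, max_depth=3,
                                   random_state=0).fit(X_train, y_train)
print("validation AUROC:", round(roc_auc_score(y_val, model.predict_proba(X_val)[:, 1]), 3))
print("test AUROC:      ", round(roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]), 3))
```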

13.
Curr Top Med Chem ; 20(20): 1824-1838, 2020.
Article in English | MEDLINE | ID: mdl-32552648

ABSTRACT

Testicular cancer is an aggressive malignancy with a rising incidence rate across the globe. Testicular germ cell tumors are the most commonly diagnosed cancers, and surgical removal of the testes is often a radical necessity along with chemotherapy and radiotherapy. While seminomas are receptive to radiotherapy as well as chemotherapy, non-seminomatous germ cell tumors respond to chemotherapy only. Due to the singular nature of testicular cancers with associated orchiectomy and mortality, it is important to study the molecular basis and genetic underpinnings of this group of cancers across male populations globally. In this review, we shed light on the population pharmacogenetics of testicular cancer, pediatric and adult tumors, current clinical trials, genetic determinants of chemotherapy-induced toxicity in testicular cancer, as well as the molecular signal transduction pathways operating in this malignancy. Taken together, our discussions will help in enhancing our understanding of genetic factors in testicular carcinogenesis and chemotherapy-induced toxicity, augment our knowledge of this aggressive cancer at the cellular and molecular level, as well as improve precision medicine approaches to combat this disease.


Subject(s)
Antineoplastic Combined Chemotherapy Protocols/adverse effects, Antineoplastic Combined Chemotherapy Protocols/pharmacology, Germ Cell and Embryonal Neoplasms/drug therapy, Signal Transduction/drug effects, Testicular Neoplasms/drug therapy, Antineoplastic Combined Chemotherapy Protocols/therapeutic use, Clinical Trials as Topic, Humans, Male, Germ Cell and Embryonal Neoplasms/genetics, Signal Transduction/genetics, Testicular Neoplasms/genetics
14.
Liver Transpl ; 26(8): 1019-1029, 2020 08.
Article in English | MEDLINE | ID: mdl-32427417

ABSTRACT

More anesthesiologists are routinely using transesophageal echocardiography (TEE) during liver transplant surgery, but the effects on patient outcome are unknown. Transplant anesthesiologists are therefore uncertain if they should undergo additional training and adopt TEE. In response to these clinical questions, the Society for the Advancement of Transplant Anesthesia appointed experts in liver transplantation and who are certified in TEE to evaluate all available published evidence on the topic. The aim was to produce a summary with greater explanatory power than individual reports to guide transplant anesthesiologists in their decision to use TEE. An exhaustive search recovered 51 articles of uncontrolled clinical observations. Topics chosen for this study were effectiveness and safety because they were a major or minor topic in all articles. The pattern of clinical use was a common topic and was included to provide contextual information. Summarized observations showed effectiveness as the ability to make a new and unexpected diagnosis and to direct the choice of clinical management. These were reported in each stage of liver transplant surgery. There were observations that TEE facilitated rapid diagnosis of life-threatening conditions difficult to identify with other types of monitoring commonly used in the operating room. Real-time diagnosis by TEE images made anesthesiologists confident in their choice of interventions, especially those with a high risk of complications such as use of anticoagulants for intracardiac thrombosis. The summarized observations in this systematic review suggest that TEE is an effective form of monitoring with a safety profile similar to that in cardiac surgery patients.


Subject(s)
Anesthesia, Anesthesiology, Liver Transplantation, Anesthesia/adverse effects, Anesthesiologists, Transesophageal Echocardiography, Humans, Liver Transplantation/adverse effects
15.
Transplantation ; 104(11): e308-e316, 2020 11.
Article in English | MEDLINE | ID: mdl-32467477

ABSTRACT

BACKGROUND: Acute kidney injury (AKI) after liver transplantation is associated with increased morbidity and mortality. It remains controversial whether the choice of vena cava reconstruction technique impacts AKI. METHODS: This is a single-center retrospective cohort of 897 liver transplants performed between June 2009 and September 2018 using either the vena cava preserving piggyback technique or caval replacement technique without veno-venous bypass or shunts. The association between vena cava reconstruction technique and stage of postoperative AKI was assessed using multivariable ordinal logistic regression. Causal mediation analysis was used to evaluate warm ischemia time as a potential mediator of this association. RESULTS: The incidence of AKI (AKI stage ≥2) within 48 h after transplant was lower in the piggyback group (40.3%) compared to the caval replacement group (51.8%, P < 0.001). Piggyback technique was associated with a reduced risk of developing a higher stage of postoperative AKI (odds ratio, 0.49; 95% confidence interval, 0.37-0.65, P < 0.001). Warm ischemia time was shorter in the piggyback group and identified as potential mediator of this effect. There was no difference in renal function (estimated glomerular filtration rate and the number of patients alive without dialysis) 1 y after transplant. CONCLUSIONS: Piggyback technique, compared with caval replacement, was associated with a reduced incidence of AKI after liver transplantation. There was no difference in long-term renal outcomes between the 2 groups.


Subject(s)
Acute Kidney Injury/prevention & control, Liver Transplantation/adverse effects, Vascular Surgical Procedures/adverse effects, Inferior Vena Cava/surgery, Acute Kidney Injury/diagnosis, Acute Kidney Injury/mortality, Female, Glomerular Filtration Rate, Graft Survival, Humans, Incidence, Liver Transplantation/mortality, Male, Middle Aged, Retrospective Studies, Risk Assessment, Risk Factors, Time Factors, Treatment Outcome, Vascular Surgical Procedures/mortality, Warm Ischemia/adverse effects
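Two methods drive this analysis: ordinal (proportional-odds) logistic regression of AKI stage on surgical technique, and a causal mediation analysis with warm ischemia time as the mediator. The sketch below shows only the ordinal regression step with statsmodels' OrderedModel on simulated data; the variables and cutpoints are placeholders and the mediation step is omitted.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(5)
n = 897
piggyback = rng.binomial(1, 0.5, n)                       # 1 = piggyback technique
warm_ischemia = rng.normal(40 - 6 * piggyback, 8, n)      # minutes (simulated)
latent = -0.4 * piggyback + 0.03 * (warm_ischemia - 40) + rng.logistic(size=n)
aki_stage = np.digitize(latent, bins=[-0.5, 0.5, 1.5])    # ordinal stages 0-3

df = pd.DataFrame({"aki_stage": aki_stage, "piggyback": piggyback,
                   "warm_ischemia": warm_ischemia})
model = OrderedModel(df["aki_stage"], df[["piggyback", "warm_ischemia"]], distr="logit")
res = model.fit(method="bfgs", disp=False)
print("odds ratio, piggyback vs caval replacement:",
      round(float(np.exp(res.params["piggyback"])), 2))
```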
16.
Transplantation ; 102(11): e466-e471, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30048397

ABSTRACT

BACKGROUND: In liver transplantation, both cold and warm ischemia times are known to impact early graft function. The extraction time is a period during the initial phase of organ cooling which occurs during deceased donor procurement. During this time, the organ is at risk of suboptimal cooling. Whether donor extraction time, the time from donor aortic cross-clamp to removal of the donor organ from the body cavity has an effect on early graft function is not known. METHODS: We investigated the effect of donor extraction time on early graft function in 292 recipients of liver grafts procured locally and transplanted at our center between June 2012 and December 2016. Early graft function was assessed using the model of early allograft function score in a multivariable regression model including donor extraction time, cold ischemia time, warm ischemia time, donor risk index, and terminal donor sodium. RESULTS: Donor extraction time had an independent effect on early graft function measured by the model of early allograft function score (coefficient, 0.021; 95% confidence interval, 0.007-0.035; P < 0.01; for each minute increase of donor extraction time). Besides donor extraction time, cold ischemia time, warm ischemia time, and donor risk index had a significant effect on early graft function. CONCLUSIONS: We demonstrate an independent effect of donor extraction time on graft function after liver transplantation. Efforts to minimize donor extraction time could improve early graft function in liver transplantation.


Subject(s)
Cold Ischemia, Hepatectomy, Liver Transplantation/methods, Operative Time, Tissue Donors, Tissue and Organ Harvesting/methods, Warm Ischemia, Adult, Cold Ischemia/adverse effects, Decision Support Techniques, Female, Graft Survival, Hepatectomy/adverse effects, Humans, Liver Function Tests, Liver Transplantation/adverse effects, Male, Middle Aged, Predictive Value of Tests, Primary Graft Dysfunction/diagnosis, Primary Graft Dysfunction/etiology, Risk Assessment, Risk Factors, Time Factors, Tissue and Organ Harvesting/adverse effects, Treatment Outcome, Warm Ischemia/adverse effects
17.
J Cardiothorac Vasc Anesth ; 31(1): 32-36, 2017 Feb.
Article in English | MEDLINE | ID: mdl-28277245

ABSTRACT

OBJECTIVE: Determine if surgery start time impacts patient outcomes in elective cardiac surgery. DESIGN: This was a retrospective study. SETTING: This study was based at a single academic institution. PARTICIPANTS: Patients undergoing elective cardiac surgery over a 3-year period were included. INTERVENTIONS: There were no interventions. MEASUREMENTS AND MAIN RESULTS: The authors performed a retrospective study of patients undergoing elective cardiac surgery over a 3-year period. They divided their patient groups into those who had an anesthesia start time between 6:00 a.m. and 4:00 p.m. and those who had an anesthesia start time between 4:01 p.m. and 5:59 a.m. In the original sample and propensity-score-matched groups, the authors examined the effects of start time on morbidity, mortality, and several metrics of hospital length of stay. The start time of elective cardiac surgery did not have a statistically significant effect upon mortality, individual or composite morbidity, or hospital length of stay in either the original sample or the propensity-score-matched sample. CONCLUSIONS: The authors' results suggested that elective cardiac surgery may be performed late at night without adverse effects, although institutional support for this effort (such as 24-hour intensivist coverage to facilitate fast-track extubation) may have been integral to their findings.


Subject(s)
Aftercare/standards, Cardiac Surgical Procedures/adverse effects, Elective Surgical Procedures/adverse effects, Aftercare/statistics & numerical data, Aged, Anesthesiology/organization & administration, Cardiac Surgical Procedures/mortality, Cardiac Surgical Procedures/standards, Elective Surgical Procedures/mortality, Elective Surgical Procedures/standards, Female, Health Services Research/methods, Hospital Mortality, Humans, Length of Stay/statistics & numerical data, Male, Middle Aged, New York City/epidemiology, Personnel Staffing and Scheduling/organization & administration, Postoperative Care/methods, Retrospective Studies, Time Factors, Treatment Outcome
18.
J Crit Care ; 33: 14-8, 2016 06.
Article in English | MEDLINE | ID: mdl-26975737

ABSTRACT

PURPOSE: Prior studies report that weekend admission to an intensive care unit is associated with increased mortality, potentially attributed to the organizational structure of the unit. This study aims to determine whether treatment of hypotension, a risk factor for mortality, differs according to level of staffing. METHODS: Using the Multiparameter Intelligent Monitoring in Intensive Care database, we conducted a retrospective study of patients admitted to an intensive care unit at Beth Israel Deaconess Medical Center who experienced one or more episodes of hypotension. Episodes were categorized according to the staffing level, defined as high during weekday daytime (7 am-7 pm) and low during weekends or nighttime (7 pm-7 am). RESULTS: Patients with a hypotensive event on a weekend were less likely to be treated compared with those that occurred during the weekday daytime (P = .02). No association between weekday daytime vs weekday nighttime staffing levels and treatment of hypotension was found (risk ratio, 1.02; 95% confidence interval, 0.98-1.07). CONCLUSION: Patients with a hypotensive event on a weekend were less likely to be treated than patients with an event during high-staffing periods. No association between weekday nighttime staffing and hypotension treatment was observed. We conclude that treatment of a hypotensive episode relies on more than solely staffing levels.


Subject(s)
Aftercare/organization & administration, Critical Illness, Hypotension/therapy, Intensive Care Units/organization & administration, Personnel Staffing and Scheduling/organization & administration, Aged, Aged, 80 and over, Cohort Studies, Factual Databases, Female, Hospital Mortality, Humans, Male, Middle Aged, Odds Ratio, Retrospective Studies, Risk Factors
19.
Pain Physician ; 18(5): E827-9, 2015.
Article in English | MEDLINE | ID: mdl-26431136

ABSTRACT

BACKGROUND: Perioperative use of opioids is associated with the risk of opioid-induced respiratory depression. Naloxone is a competitive opioid antagonist typically administered to reverse opioid-induced respiratory depression. Postoperative administration of naloxone may be considered a proxy for significant postoperative opioid-induced respiratory depression and data regarding its use may be utilized as a quality measure. Few large studies have been done to characterize the population and define an incidence of naloxone recipients in the postoperative inpatient setting. OBJECTIVES: We aimed to characterize the demographics of patients receiving postoperative naloxone, as well as the incidence of administration in the first 72 post-operative hours at a large urban academic medical center in the United States. STUDY DESIGN: This is a retrospective cohort study. SETTING: Major urban tertiary teaching institution. METHODS: The robust electronic record database of The Department of Anesthesiology at The Icahn School of Medicine at Mount Sinai, as well as the institution's data warehouse were instrumental in allowing almost 450,000 surgical cases performed between 2001 and 2014 to be screened for naloxone administration within the first 72 postoperative hours. Organ harvests, outside of OR intubations, cancelled cases, and patients age less than or equal to 18 were excluded from the total case count. RESULTS: Naloxone was administered 433 times in a total of 442,699 postoperative cases. This yielded an incidence of 0.1%. Additionally, the demographics of the group receiving naloxone were described. The mean age was 60, mean body mass index (BMI) was 27, 60% were women, and the mean American Society of Anesthesiologists (ASA) status was 3. Average time to naloxone administration was 21 hours (standard deviation 7) after surgery. Thirteen percent of the cases were emergent. Breakdown of anesthetic technique revealed that 81% of the cases were performed under general anesthesia, 7% with monitored anesthesia care (MAC), and 12% under neuraxial anesthesia. This study lays the groundwork for further elucidating risk factors for postoperative administration of naloxone. LIMITATIONS: This is a retrospective study. CONCLUSION: The overall incidence of postoperative naloxone administration over a 13 year period in approximately 450,000 patients was 0.1%. Demographics of this group were older, ASA 3 women, qualifying as overweight, but not obese, undergoing elective surgery with a general anesthetic technique. Average time to administration was 21 hours postoperatively.


Subject(s)
Opioid Analgesics/adverse effects, Naloxone/administration & dosage, Narcotic Antagonists/administration & dosage, Postoperative Complications/drug therapy, Respiratory Insufficiency/drug therapy, Adult, Aged, Female, Teaching Hospitals, Humans, Incidence, Male, Middle Aged, Postoperative Complications/chemically induced, Respiratory Insufficiency/chemically induced, Retrospective Studies, Risk Factors, Tertiary Care Centers
20.
Chest ; 148(6): 1470-1476, 2015 Dec.
Article in English | MEDLINE | ID: mdl-26270005

ABSTRACT

BACKGROUND: Indwelling arterial catheters (IACs) are used extensively in the ICU for hemodynamic monitoring and for blood gas analysis. IAC use also poses potentially serious risks, including bloodstream infections and vascular complications. The purpose of this study was to assess whether IAC use was associated with mortality in patients who are mechanically ventilated and do not require vasopressor support. METHODS: This study used the Multiparameter Intelligent Monitoring in Intensive Care II database, consisting of > 24,000 patients admitted to the Beth Israel Deaconess Medical Center ICU between 2001 and 2008. Patients requiring mechanical ventilation who did not require vasopressors or have a diagnosis of sepsis were identified, and the primary outcome was 28-day mortality. A model based on patient demographics, comorbidities, vital signs, and laboratory results was developed to estimate the propensity for IAC placement. Patients were then propensity matched, and McNemar test was used to evaluate the association of IAC with 28-day mortality. RESULTS: We identified 1,776 patients who were mechanically ventilated who met inclusion criteria. There were no differences in the covariates included in the final propensity model between the IAC and non-IAC propensity-matched groups. For the matched cohort, there was no difference in 28-day mortality between the IAC group and the non-IAC group (14.7% vs 15.2%; OR, 0.96; 95% CI, 0.62-1.47). CONCLUSIONS: In hemodynamically stable patients who are mechanically ventilated, the presence of an IAC is not associated with a difference in 28-day mortality. Validation in other datasets, as well as further analyses in other subgroups, is warranted.


Subject(s)
Peripheral Catheterization, Physiologic Monitoring/methods, Respiratory Insufficiency, Adult, Aged, Blood Gas Analysis/methods, Peripheral Catheterization/adverse effects, Peripheral Catheterization/methods, Peripheral Catheterization/mortality, Indwelling Catheters/adverse effects, Female, Hemodynamics, Hospital Mortality, Humans, Intensive Care Units/statistics & numerical data, Length of Stay/statistics & numerical data, Male, Middle Aged, Outcome Assessment (Health Care), Propensity Score, Artificial Respiration/methods, Respiratory Insufficiency/blood, Respiratory Insufficiency/diagnosis, Respiratory Insufficiency/physiopathology, Respiratory Insufficiency/therapy, Risk Assessment, United States
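The analysis pipeline here is: estimate a propensity score for IAC placement from baseline covariates, match IAC to non-IAC patients 1:1 on that score, and compare 28-day mortality within matched pairs with McNemar's test. The sketch below runs that pipeline end to end on simulated data with scikit-learn and statsmodels; the covariates, matching-with-replacement shortcut, and absence of a caliper are illustrative assumptions, not the study's exact specification.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from statsmodels.stats.contingency_tables import mcnemar

rng = np.random.default_rng(6)
n = 1776
df = pd.DataFrame({
    "age": rng.normal(65, 15, n),
    "sofa": rng.integers(0, 15, n),
    "iac": rng.binomial(1, 0.5, n),
})
df["death28"] = rng.binomial(1, 0.15, n)           # simulated 28-day mortality

# 1) Propensity for IAC placement from baseline covariates
ps_model = LogisticRegression(max_iter=1000).fit(df[["age", "sofa"]], df["iac"])
df["ps"] = ps_model.predict_proba(df[["age", "sofa"]])[:, 1]

# 2) 1:1 nearest-neighbour matching on the propensity score
treated = df[df["iac"] == 1]
control = df[df["iac"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched_control = control.iloc[idx.ravel()]

# 3) McNemar test on the 2x2 table of matched-pair outcomes
t_died = treated["death28"].values == 1
c_died = matched_control["death28"].values == 1
table = [[int((t_died & c_died).sum()), int((t_died & ~c_died).sum())],
         [int((~t_died & c_died).sum()), int((~t_died & ~c_died).sum())]]
print(mcnemar(table, exact=True))
```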