Results 1 - 15 of 15
1.
Am J Transplant ; 21(12): 4003-4011, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34129720

ABSTRACT

Current risk-adjusted models for donor lung use and lung graft survival do not include donor critical care data. We sought to identify modifiable donor physiologic and mechanical ventilation parameters that predict donor lung use and lung graft survival. This is a prospective observational study of donors after brain death (DBDs) managed by 19 Organ Procurement Organizations from 2016 to 2019. Demographics, mechanical ventilation parameters, and critical care data were recorded at standardized time points during donor management. The lungs were transplanted from 1811 (30%) of 6052 DBDs. Achieving ≥7 critical care endpoints was a positive predictor of donor lung use. After controlling for recipient factors, donor blood pH positively predicted lung graft survival (OR 1.48 per 0.1 unit increase in pH) and the administration of dopamine during donor management negatively predicted lung graft survival (OR 0.19). Tidal volumes ≤8 ml/kg predicted body weight (OR 0.65), and higher positive end-expiratory pressures (OR 0.91 per cm H2O) predicted decreased donor lung use without affecting lung graft survival. A randomized clinical trial is needed to inform optimal ventilator management strategies in DBDs.
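The pH odds ratio above is reported per 0.1-unit increase. As a quick sanity check on such scaled odds ratios, the underlying logistic-regression coefficient can be rescaled to a different unit change by exponentiation; a minimal Python sketch (the 1.48 figure is taken from the abstract, and the rescaling is standard arithmetic, not part of the study):

```python
import math

# OR for lung graft survival per 0.1-unit increase in donor blood pH (from the abstract)
or_per_tenth = 1.48

# Underlying logistic regression coefficient per 0.1 pH unit
beta_per_tenth = math.log(or_per_tenth)

# Rescale to a full 1.0-unit change in pH: multiply the coefficient by 10, then exponentiate.
# This is equivalent to 1.48 ** 10.
or_per_unit = math.exp(beta_per_tenth * 10)

print(round(or_per_unit, 1))  # 50.4: small pH shifts correspond to large odds changes
```

This illustrates why odds ratios must always be read against the unit of change they were estimated for.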


Subjects
Graft Survival , Tissue and Organ Procurement , Brain Death , Critical Care , Humans , Lung , Tissue Donors
2.
J Trauma Acute Care Surg ; 88(6): 783-788, 2020 Jun.
Article in English | MEDLINE | ID: mdl-32459446

ABSTRACT

BACKGROUND: Delayed graft function (DGF), the need for dialysis in the first week following kidney transplant, affects approximately one quarter of deceased-donor kidney transplant recipients. Donor demographics, donor serum creatinine, and graft cold ischemia time are associated with DGF. However, there is no consensus on the optimal management of hemodynamic instability in organ donors after brain death (DBDs). Our objective was to determine the relationship between vasopressor selection during donor management and the development of DGF. METHODS: Prospective observational data, including demographic and critical care parameters, were collected for all DBDs managed by 17 organ procurement organizations from nine Organ Procurement and Transplantation Network Regions between 2012 and 2018. Recipient outcome data were linked with donor data through donor identification numbers. Donor critical care parameters, including type of vasopressor and doses, were recorded at three standardized time points during donor management. The analysis included only donors who received at least one vasopressor at all three time points. Vasopressor doses were converted to norepinephrine equivalent doses and analyzed as continuous variables. Univariate analyses were conducted to determine the association between donor variables and DGF. Results were adjusted for known predictors of DGF using binary logistic regression. RESULTS: Complete data were available for 5,554 kidney transplant recipients and 2,985 DBDs. On univariate analysis, donor serum creatinine, donor age, donor subtype, kidney donor profile index, graft cold ischemia time, phenylephrine dose, and dopamine dose were associated with DGF. After multivariable analysis, increased donor serum creatinine, donor age, kidney donor profile index, graft cold ischemia time, and phenylephrine dose remained independent predictors of DGF. CONCLUSION: Higher doses of phenylephrine were an independent predictor of DGF. With the exception of phenylephrine, the selection and dose of vasopressor during donor management did not predict the development of DGF. LEVEL OF EVIDENCE: Prognostic study, Level III.
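The methods above convert all vasopressor doses to norepinephrine-equivalent doses before analysis. The abstract does not state which conversion table the study used, so the sketch below uses one commonly cited weighting scheme; the factors are illustrative assumptions, not the study's definition:

```python
# Illustrative norepinephrine-equivalent conversion. Doses are in ug/kg/min
# except vasopressin (units/min). These weighting factors follow one commonly
# cited scheme and are assumptions here, not taken from the study.
FACTORS = {
    "norepinephrine": 1.0,
    "epinephrine": 1.0,
    "phenylephrine": 0.1,   # 10 ug/kg/min phenylephrine ~ 1 ug/kg/min norepinephrine
    "dopamine": 0.01,       # 100 ug/kg/min dopamine ~ 1 ug/kg/min norepinephrine
    "vasopressin": 2.5,     # per unit/min
}

def norepi_equivalent(doses: dict) -> float:
    """Collapse a mixed vasopressor regimen into one norepinephrine-equivalent dose."""
    return sum(FACTORS[drug] * dose for drug, dose in doses.items())

# Example: a donor on phenylephrine 150 ug/kg/min and dopamine 5 ug/kg/min
print(round(norepi_equivalent({"phenylephrine": 150, "dopamine": 5}), 2))  # 15.05
```

Collapsing regimens this way is what allows doses of different drugs to be analyzed as a single continuous variable.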


Subjects
Brain Death/physiopathology , Critical Care/statistics & numerical data , Delayed Graft Function/epidemiology , Kidney Transplantation/adverse effects , Kidney/drug effects , Vasoconstrictor Agents/adverse effects , Adult , Age Factors , Cold Ischemia/adverse effects , Critical Care/methods , Delayed Graft Function/etiology , Delayed Graft Function/prevention & control , Dose-Response Relationship, Drug , Female , Humans , Kidney/blood supply , Kidney/physiopathology , Kidney Transplantation/methods , Kidney Transplantation/statistics & numerical data , Male , Middle Aged , Phenylephrine/administration & dosage , Phenylephrine/adverse effects , Prospective Studies , Risk Assessment , Tissue and Organ Procurement/methods , Tissue and Organ Procurement/statistics & numerical data , Vasoconstrictor Agents/administration & dosage , Young Adult
3.
Clin Transplant ; 34(5): e13835, 2020 May.
Article in English | MEDLINE | ID: mdl-32068301

ABSTRACT

BACKGROUND: No standard exists for the use of deceased donor liver biopsy during procurement. We sought to evaluate liver biopsy and the impact of findings on outcomes and graft utilization. METHODS: A prospective observational study of donors after neurologic determination of death was conducted from 02/2012 to 08/2017 (16 OPOs). Donor data were collected through the UNOS Donor Management Goals Registry Web Portal and linked to the Scientific Registry of Transplant Recipients (SRTR) for recipient outcomes. Recipients of biopsied donor livers (BxDL) were studied and a Cox proportional hazard analysis was used to identify independent predictors of 1-year graft survival. RESULTS: Data from 5449 liver transplant recipients were analyzed, of which 1791 (33%) received a BxDL. There was no difference in graft or patient survival between the non-BxDL and BxDL recipient groups. On adjusted analysis of BxDL recipients, macrosteatosis (21%-30% [n = 148] and >30% [n = 92]) was not found to predict 1-year graft survival, whereas increasing donor age (HR 1.02), donor Hispanic ethnicity (HR 1.62), donor INR (HR 1.18), and recipient life support (HR 2.29) were. CONCLUSIONS: Excellent graft and patient survival can be achieved in recipients of BxDL grafts. Notably, as demonstrated by the lack of effect of macrosteatosis on survival, donor-to-recipient matching may contribute to these outcomes.


Subjects
Liver Transplantation , Tissue and Organ Procurement , Biopsy , Graft Survival , Humans , Liver , Living Donors , Tissue Donors , Transplant Recipients
4.
Am J Surg ; 213(1): 73-79, 2017 Jan.
Article in English | MEDLINE | ID: mdl-27381816

ABSTRACT

BACKGROUND: A rhabdomyolysis protocol (RP) with mannitol and bicarbonate to prevent acute renal dysfunction (ARD, creatinine >2.0 mg/dL) remains controversial. METHODS: Patients with creatine kinase (CK) greater than 2,000 U/L over a 10-year period were identified. Shock, Injury Severity Score, massive transfusion, intravenous contrast exposure, and RP use were evaluated. RP was initiated for a CK greater than 10,000 U/L (first half of the study) or greater than 20,000 U/L (second half). Multivariable analyses were used to identify predictors of ARD and the independent effect of the RP. RESULTS: Seventy-seven patients were identified, 24 (31%) developed ARD, and 4 (5%) required hemodialysis. After controlling for other risk factors, peak CK greater than 10,000 U/L (odds ratio 8.6, P = .016) and failure to implement RP (odds ratio 5.7, P = .030) were independent predictors of ARD. Among patients with CK greater than 10,000, ARD developed in 26% of patients with the RP versus 70% without it (P = .008). CONCLUSION: Reduced ARD was noted with RP. A prospective controlled study is still warranted.


Subjects
Acute Kidney Injury/prevention & control , Bicarbonates/therapeutic use , Diuretics, Osmotic/therapeutic use , Mannitol/therapeutic use , Rhabdomyolysis/complications , Wounds and Injuries/complications , Acute Kidney Injury/diagnosis , Acute Kidney Injury/etiology , Adult , Algorithms , Clinical Protocols , Creatine Kinase , Databases, Factual , Female , Humans , Injury Severity Score , Male , Middle Aged , Retrospective Studies , Young Adult
5.
JAAPA ; 29(5): 47-53, 2016 May.
Article in English | MEDLINE | ID: mdl-27124230

ABSTRACT

BACKGROUND: This study aimed to determine the prevalence and occupational characteristics of physician assistants (PAs) and nurse practitioners (NPs) in outpatient surgical subspecialty clinics. METHODS: The 2007 and 2008 National Ambulatory Medical Care Survey (NAMCS) databases were queried for the number and characteristics of office visits seen by different provider types (PAs or NPs, physicians, or both) in various surgical subspecialties. RESULTS: More than 250 million weighted sample visits were analyzed. PAs or NPs were involved in 5.9% of visits, though the percentage of patients seen by them alone (1.1%) was significantly lower (P<0.0001). PAs and NPs were more likely to be involved in pre- or postoperative visits, and often saw the same diagnoses alone as physicians did. The most common procedures performed by PAs and NPs varied according to subspecialty. CONCLUSIONS: PAs and NPs had a minor presence in the ambulatory surgical workforce during the period studied. Further integration of these providers into the outpatient setting may help optimize efficiency in ambulatory surgical care.


Subjects
Ambulatory Surgical Procedures , Health Care Surveys , Ambulatory Care , Humans , Nurse Practitioners , Office Visits , Physician Assistants , United States
6.
J Trauma Acute Care Surg ; 79(4 Suppl 2): S164-70, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26131787

ABSTRACT

BACKGROUND: Historically, strategies to reduce acute rejection and improve graft survival in kidney transplant recipients included blood transfusions (BTs) before transplantation. While advances in recipient immunosuppression strategies have replaced this practice, the impact of BTs in the organ donor on recipient graft outcomes has not been evaluated. We hypothesize that BTs in organ donors after neurologic determination of death (DNDDs) translate into improved recipient renal graft outcomes, as measured by a decrease in delayed graft function (DGF). METHODS: Donor demographics, critical care end points, the use of BTs, and graft outcome data were prospectively collected on DNDDs from March 2012 to October 2013 in the United Network for Organ Sharing Region 5 Donor Management Database. Propensity analysis determined each DNDD's probability of receiving packed red blood cells based on demographic and critical care data as well as provider bias. The primary outcome measure was the rate of DGF (dialysis in the first week after transplantation) in different donor BT groups as follows: no BT, any BT, 1 to 5, 6 to 10, or greater than 10 packed red blood cell units. Regression models determined the relationship between donor BTs and recipient DGF after accounting for known predictors of DGF as well as the propensity to receive a BT. RESULTS: Data were complete for 1,884 renal grafts from 1,006 DNDDs; 52% received any BT, 32% received 1 to 5 U, 11% received 6 to 10, and 9% received greater than 10 U of blood. Grafts from transfused donors had a lower rate of DGF compared with those of the nontransfused donors (26% vs. 34%, p < 0.001). After adjusting for known confounders, grafts from donors with any BT had a lower odds of DGF (odds ratio, 0.76; p = 0.030), and this effect was greatest in those with greater than 10 U transfused. CONCLUSION: Any BT in a DNDD was associated with a 23% decrease in the odds of recipients developing DGF, and this effect was more pronounced as the number of BTs increased. LEVEL OF EVIDENCE: Therapeutic study, level III; epidemiologic/prognostic study, level II.


Subjects
Blood Transfusion/statistics & numerical data , Graft Survival , Kidney Transplantation , Tissue Donors , Adult , Cadaver , Delayed Graft Function , Female , Humans , Immunosuppression Therapy , Male , Middle Aged , Prospective Studies , Treatment Outcome
7.
JAMA Surg ; 149(9): 969-75, 2014 Sep.
Article in English | MEDLINE | ID: mdl-25054379

ABSTRACT

IMPORTANCE: The shortage of organs available for transplant has led to the use of expanded criteria donors (ECDs) to extend the donor pool. These donors are older and have more comorbidities, and efforts to optimize the quality of their organs are needed. OBJECTIVE: To determine the impact of meeting a standardized set of critical care end points, or donor management goals (DMGs), on the number of organs transplanted per donor in ECDs. DESIGN, SETTING, AND PARTICIPANTS: Prospective interventional study from February 2010 to July 2013 of all ECDs managed by the 8 organ procurement organizations in the southwestern United States (United Network for Organ Sharing Region 5). INTERVENTIONS: Implementation of 9 DMGs as a checklist to guide the management of every ECD. The DMGs represented normal cardiovascular, pulmonary, renal, and endocrine end points. Meeting the DMG bundle was defined a priori as achieving any 7 of the 9 end points and was recorded at the time of referral to the organ procurement organization, at the time of authorization for donation, 12 to 18 hours later, and prior to organ recovery. MAIN OUTCOMES AND MEASURES: The primary outcome measure was 3 or more organs transplanted per donor, and binary logistic regression was used to identify independent predictors with P < .05. RESULTS: There were 671 ECDs with a mean (SD) number of 2.1 (1.3) organs transplanted per donor. Ten percent of the ECDs had met the DMG bundle at referral, 15% at the time of authorization, 33% at 12 to 18 hours, and 45% prior to recovery. Forty-three percent had 3 or more organs transplanted per donor. Independent predictors of 3 or more organs transplanted per donor were older age (odds ratio [OR] = 0.95 per year [95% CI, 0.93-0.97]), increased creatinine level (OR = 0.73 per mg/dL [95% CI, 0.63-0.85]), DMGs met prior to organ recovery (OR = 1.90 [95% CI, 1.35-2.68]), and a change in the number of DMGs achieved from referral to organ recovery (OR = 1.11 per additional DMG [95% CI, 1.00-1.23]). CONCLUSIONS AND RELEVANCE: Meeting DMGs prior to organ recovery with ECDs is associated with achieving 3 or more organs transplanted per donor. An increase in the number of critical care end points achieved throughout the care of a potential donor by both donor hospital and organ procurement organization is also associated with an increase in organ yield.


Subjects
Tissue Donors/statistics & numerical data , Tissue and Organ Procurement/organization & administration , Aged , Critical Care , Female , Humans , Male , Middle Aged , Organizational Objectives , Prospective Studies , Tissue and Organ Procurement/statistics & numerical data , United States
8.
Am J Surg ; 207(6): 817-23, 2014 Jun.
Article in English | MEDLINE | ID: mdl-24576582

ABSTRACT

BACKGROUND: Central line-associated bloodstream infections (CLABSIs) are a significant source of morbidity and mortality. This study sought to determine whether implementation of the Institute for Healthcare Improvement (IHI) Central Line Bundle would reduce the incidence of CLABSIs. METHODS: The IHI Central Line Bundle was implemented in a surgical intensive care unit. Patient demographics and the rate of CLABSIs per 1,000 catheter days were compared between the pre- and postintervention groups. Contemporaneous infection rates in an adjacent ICU were measured. RESULTS: Baseline demographics were similar between the pre- and postintervention groups. After implementation of the IHI Bundle, CLABSIs decreased from 19 in 3,784 catheter days to 3 in 1,870 catheter days (5.02 vs 1.60 CLABSIs per 1,000 catheter days; rate ratio .32 [.08 to .99, P < .05]). There was no significant change in CLABSIs in the control ICU. CONCLUSIONS: Implementation of the IHI Central Line Bundle reduced the incidence of CLABSIs in our SICU by 68%, preventing 12 CLABSIs, 2.5 deaths, and saving $198,600 annually.
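The incidence figures above can be reproduced directly: a CLABSI rate is the infection count divided by catheter days, scaled to 1,000 catheter days, and the rate ratio is the post-intervention rate over the pre-intervention rate. A short Python check using the counts reported in the abstract:

```python
# CLABSIs per 1,000 catheter days, recomputed from the counts in the abstract.
def clabsi_rate(infections: int, catheter_days: int) -> float:
    return infections / catheter_days * 1000

pre = clabsi_rate(19, 3784)   # pre-intervention: 19 CLABSIs over 3,784 catheter days
post = clabsi_rate(3, 1870)   # post-intervention: 3 CLABSIs over 1,870 catheter days

# Matches the abstract: 5.02 vs 1.60 per 1,000 catheter days, rate ratio 0.32
print(round(pre, 2), round(post, 2), round(post / pre, 2))  # 5.02 1.6 0.32
```

The rate ratio of 0.32 is the same quantity behind the abstract's reported 68% reduction (1 - 0.32).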


Subjects
Catheter-Related Infections/prevention & control , Catheterization, Central Venous/adverse effects , Catheterization, Central Venous/standards , Infection Control/organization & administration , Intensive Care Units/organization & administration , Patient Care Bundles/standards , Quality Improvement , APACHE , Adult , Case-Control Studies , Catheter-Related Infections/economics , Catheter-Related Infections/epidemiology , Checklist , Female , Health Care Costs , Humans , Incidence , Infection Control/economics , Intensive Care Units/economics , Los Angeles/epidemiology , Male , Patient Care Bundles/economics , Prospective Studies
9.
J Trauma Acute Care Surg ; 74(4): 1133-7, 2013 Apr.
Article in English | MEDLINE | ID: mdl-23511156

ABSTRACT

BACKGROUND: Cirrhosis is known to be a significant risk factor for morbidity and mortality following trauma such that its presence is a requirement for trauma center transfer. The impact of trauma center level on post-injury survival in cirrhotic patients has not been well studied. METHODS: The National Trauma Databank (version 7) was used to identify patients admitted with cirrhosis as a preexisting comorbidity. Patients who were dead on arrival, died in the emergency department, or had missing trauma center information were excluded. Our primary outcome measure was overall mortality stratified by admission trauma center level. Logistic regression analysis was used to derive adjusted mortality results. RESULTS: A total of 3,395 patients met inclusion criteria (0.16% of all National Trauma Databank patients). Patients admitted to a Level I center were more likely to be younger and minorities, experience penetrating injuries, and require immediate operative intervention despite similar Injury Severity Scores (ISS). Overall mortality was lower at Level I centers compared with other centers (10.3% vs. 14.0%, p = 0.001). After logistic regression, Level I centers were associated with significantly lower mortality compared with non-Level I centers (adjusted odds ratio, 0.70; 95% confidence interval, 0.53-0.89; p = 0.004). CONCLUSION: The mortality for cirrhotic patients admitted to a Level I trauma center was significantly less compared with those admitted to non-Level I centers. The etiology of this improved outcome needs to be identified and transmitted to non-Level I centers. LEVEL OF EVIDENCE: Epidemiologic study, level III.


Subjects
Liver Cirrhosis/complications , Trauma Centers/organization & administration , Wounds and Injuries/mortality , Female , Hospital Mortality/trends , Humans , Injury Severity Score , Liver Cirrhosis/mortality , Male , Middle Aged , Odds Ratio , Risk Factors , Survival Rate/trends , United States/epidemiology , Wounds and Injuries/complications
10.
Am J Surg ; 204(6): 939-43; discussion 943, 2012 Dec.
Article in English | MEDLINE | ID: mdl-23026384

ABSTRACT

BACKGROUND: Alcohol intoxication in pediatric trauma is underappreciated. The aim of this study was to characterize alcohol screening rates in pediatric trauma. METHODS: The Los Angeles County Trauma System Database was queried for all patients aged ≤ 18 years who required admission between 2003 and 2008. Patients were compared by age and gender. RESULTS: A total of 18,598 patients met the inclusion criteria; 4,899 (26.3%) underwent blood alcohol screening, and 2,797 (57.1%) of those screened were positive. Screening increased with age (3.3% for 0-9 years, 15.1% for 10-14 years, and 45.4% for 15-18 years; P < .01), as did alcohol intoxication (1.9% for 0-9 years, 5.8% for 10-14 years, and 27.3% for 15-18 years; P < .01). Male gender predicted higher mortality in those aged 15 to 18 years (adjusted odds ratio, 1.7; P < .01), while alcohol intoxication did not (adjusted odds ratio, .97; P = .84). CONCLUSIONS: Alcohol intoxication is common in adolescent trauma patients. Screening is encouraged for pediatric trauma patients aged ≥10 years who require admission.


Subjects
Alcoholic Intoxication/diagnosis , Ethanol/blood , Mass Screening/statistics & numerical data , Wounds and Injuries/complications , Adolescent , Age Factors , Alcoholic Intoxication/blood , Alcoholic Intoxication/complications , Alcoholic Intoxication/epidemiology , Biomarkers/blood , Child , Child, Preschool , Female , Humans , Infant , Infant, Newborn , Los Angeles , Male , Multivariate Analysis , Odds Ratio , Prevalence , Prognosis , Retrospective Studies , Sex Factors , Trauma Severity Indices , Wounds and Injuries/mortality
11.
Arch Surg ; 147(9): 856-62, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22987181

ABSTRACT

HYPOTHESIS: Discrepancies exist in complications and outcomes at teaching trauma centers (TTCs) vs nonteaching TCs (NTCs). DESIGN: Retrospective review of the National Trauma Data Bank research data sets (January 1, 2007, through December 31, 2008). SETTING: Level II TCs. PATIENTS: Patients at TTCs were compared with patients at NTCs using demographic, clinical, and outcome data. Regression modeling was used to adjust for confounding factors to determine the effect of house staff presence on failure to rescue, defined as mortality after an in-house complication. MAIN OUTCOME MEASURES: The primary outcome measures were major complications, in-hospital mortality, and failure to rescue. RESULTS: In total, 162 687 patients were available for analysis, 36 713 of whom (22.6%) were admitted to NTCs. Compared with patients admitted to TTCs, patients admitted to NTCs were older (52.8 vs 50.7 years), had more severe head injuries (8.3% vs 7.8%), and were more likely to undergo immediate operation (15.0% vs 13.2%) or ICU admission (28.1% vs 22.8%) (P < .01 for all). The mean Injury Severity Scores were similar between the groups (10.1 for patients admitted to NTCs vs 10.4 for patients admitted to TTCs, P < .01). Compared with patients admitted to TTCs, patients admitted to NTCs experienced fewer complications (adjusted odds ratio [aOR], 0.63; P < .01), had a lower adjusted mortality rate (aOR, 0.87; P = .01), and were less likely to experience failure to rescue (aOR, 0.81; P = .01). CONCLUSIONS: Admission to level II TTCs is associated with an increased risk for major complications and a higher rate of failure to rescue compared with admission to level II NTCs. Further investigation of the differences in care provided by level II TTCs vs NTCs may identify areas for improvement in residency training and processes of care.


Subjects
Internship and Residency , Wounds and Injuries/surgery , Female , Hospitals, Teaching , Humans , Male , Middle Aged , Retrospective Studies , Trauma Centers , Treatment Outcome
12.
J Surg Res ; 177(2): 306-9, 2012 Oct.
Article in English | MEDLINE | ID: mdl-22709683

ABSTRACT

BACKGROUND: Complications after inferior vena cava (IVC) injury, including venous thromboembolism (VTE), are expected, but the exact incidence is poorly defined. The purpose of this study is to examine the VTE rate following ligation versus repair of IVC injuries. MATERIALS AND METHODS: The California State Inpatient Database was queried for all adult patients (age >14 y) admitted between 2005 and 2008 with IVC injuries. Demographic data, mechanism of injury, operative technique (ligation versus repair), and outcomes were recorded. Outcomes were compared according to operative technique. RESULTS: A total of 308 patients with IVC injuries were evaluated. The study population was mostly male (81.2%), young (median age 24 y), and Hispanic (43.2%). Overall mortality was 37.3%. The mechanisms of injury included gunshot wounds (52.3%), stab wounds (14.0%), and motor vehicle collisions (14.9%). Associated injuries were present in 100% of cases, with duodenal injuries being the most common. The majority of injuries were managed by primary repair (76.6%), with ligation performed in 23.4%. Patients who underwent ligation had a longer hospital stay (median 9 versus 6 d, P = 0.04) and a trend towards a higher mortality (45.8% versus 34.8%, P = 0.10), with no difference in VTE rate (4.2% versus 1.7%, P > 0.99). CONCLUSIONS: As expected, IVC injuries carry a very high mortality rate and are always associated with other injuries. We demonstrated a surprisingly low rate of VTE after operative management for IVC injury, which was similar for patients undergoing ligation and repair.


Subjects
Vascular Surgical Procedures , Vascular System Injuries/complications , Vena Cava, Inferior/injuries , Venous Thromboembolism/epidemiology , Adolescent , Adult , Aged , Aged, 80 and over , California/epidemiology , Female , Humans , Ligation , Male , Middle Aged , Retrospective Studies , Vascular System Injuries/surgery , Venous Thromboembolism/etiology , Young Adult
13.
J Trauma ; 71(2): 316-21; discussion 321-2, 2011 Aug.
Article in English | MEDLINE | ID: mdl-21825933

ABSTRACT

BACKGROUND: The natural history and optimal treatment of upper extremity (UE) deep venous thromboses (DVTs) remain uncertain, as does the clinical significance of catheter-associated (CA) UE DVTs. We sought to analyze predictors of UE DVT resolution and hypothesized that anticoagulation would be associated with quicker UE DVT clot resolution and that CA UE DVTs whose catheters are removed would resolve more often than non-CA UE DVTs. METHODS: All patients on the surgical intensive care unit service were prospectively followed from January 2008 to May 2010. A standardized DVT prevention protocol was used, and screening bilateral UE and lower extremity duplex examinations were obtained within 48 hours of admission and then weekly. Computed tomography angiography for pulmonary embolism was obtained if clinically indicated. Patients with UE DVT were treated according to attending discretion. Data regarding patient demographics and UE DVT characteristics were recorded: DVT location, catheter association, occlusive status, treatment, and resolution. The primary outcome measure was UE DVT resolution before hospital discharge. Interval decrease in size on the subsequent duplex after UE DVT detection was also noted. UE DVTs without a follow-up duplex were excluded from the final analysis. Univariate and multivariate analyses were used to identify independent predictors of UE DVT resolution. RESULTS: There were 201 UE DVTs in 129 patients; 123 DVTs had a follow-up duplex and were included. Fifty-four percent of UE DVTs improved on the next duplex, 60% resolved before discharge, and 2% embolized. The internal jugular was the most common site (52%), and 72% were nonocclusive. Sixty-four percent were CA UE DVTs, and line removal was associated with more frequent improvement on the next duplex (55% vs. 17%, p = 0.047, mid-P exact). Sixty-eight percent of UE DVTs were treated with some form of anticoagulation, but this was not associated with improved UE DVT resolution (61% vs. 60%). Independent predictors of clot resolution were location in the arm (odds ratio = 4.1 compared with the internal jugular, p = 0.031) and time from clot detection until final duplex (odds ratio = 1.052 per day, p = 0.032). CONCLUSION: A majority of UE DVTs are CA, more than half resolve before discharge, and 2% embolize. Anticoagulation does not appear to affect outcomes, but line removal does result in a quicker decrease in clot size.


Subjects
Venous Thrombosis/epidemiology , Venous Thrombosis/prevention & control , Wounds and Injuries/epidemiology , Aged , Critical Illness , Female , Humans , Male , Middle Aged , Prospective Studies , Risk Factors , Surgical Procedures, Operative , Treatment Outcome , Ultrasonography, Doppler, Duplex , Upper Extremity , Venous Thrombosis/diagnostic imaging
14.
Arch Surg ; 146(4): 459-63, 2011 Apr.
Article in English | MEDLINE | ID: mdl-21502456

ABSTRACT

HYPOTHESIS: We sought to identify risk factors that might predict acute traumatic injury findings on thoracic computed tomography (TCT) among patients having a normal initial chest radiograph (CR). DESIGN: In this retrospective analysis, Abbreviated Injury Score cutoffs were chosen to correspond with obvious physical examination findings. Multivariate logistic regression analysis was performed to identify risk factors predicting acute traumatic injury findings. SETTING: Urban level I trauma center. PATIENTS: All patients with blunt trauma having both CR and TCT between July 1, 2005, and June 30, 2007. Patients with abnormalities on their CR were excluded. MAIN OUTCOME MEASURE: Finding of any acute traumatic abnormality on TCT, despite a normal CR. RESULTS: A total of 2435 patients with blunt trauma were identified; 1744 (71.6%) had a normal initial CR, and 394 (22.6%) of these had acute traumatic findings on TCT. Multivariate logistic regression demonstrated that an abdominal Abbreviated Injury Score of 3 or higher (P = .001; odds ratio, 2.6), a pelvic or extremity Abbreviated Injury Score of 2 or higher (P < .001; odds ratio, 2.0), age older than 30 years (P = .004; odds ratio, 1.4), and male sex (P = .04; odds ratio, 1.3) were significantly associated with traumatic findings on TCT. No aortic injuries were diagnosed in patients with a normal CR. Limiting TCT to patients with 1 or more risk factors predicting acute traumatic injury findings would have resulted in reduced radiation exposure and in a cost savings of almost $250,000 over the 2-year period. Limiting TCT to this degree would not have missed any clinically significant vertebral fractures or vascular injuries. CONCLUSION: Among patients with a normal screening CR, reserving TCT for older male patients with abdominal or extremity blunt trauma seems safe and cost-effective.


Subjects
Radiography, Thoracic , Thoracic Injuries/diagnostic imaging , Tomography, X-Ray Computed , Wounds, Nonpenetrating/diagnostic imaging , Abbreviated Injury Scale , Adult , Aged , California , Female , Humans , Injury Severity Score , Logistic Models , Male , Middle Aged , Multivariate Analysis , Odds Ratio , Predictive Value of Tests , Retrospective Studies , Risk Factors , Trauma Centers
15.
Arch Surg ; 140(11): 1122-5, 2005 Nov.
Article in English | MEDLINE | ID: mdl-16342377

ABSTRACT

HYPOTHESIS: Central venous blood gas (VBG) measurements of pH, PCO2, and base excess can be substituted for the same values obtained from an arterial blood gas (ABG) analysis in mechanically ventilated trauma patients, obviating the need for arterial puncture. DESIGN AND SETTING: Prospective comparison of 99 sets of VBGs and ABGs at a level 1 academic trauma center. PATIENTS: A consecutive sample of 25 trauma patients admitted to the intensive care unit who required mechanical ventilation and had both central venous and arterial catheters. MAIN OUTCOME MEASURES: Pearson correlations and Bland-Altman limits of agreement (LOAs) for pH, PCO2, and base excess values from each set of VBGs and ABGs. RESULTS: When VBG and ABG values were compared, pH had R = 0.92, P<.001, and 95% LOAs of -0.09 to 0.03; PCO2, R = 0.88, P<.001, and 95% LOAs of -2.2 to 10.9; and base excess, R = 0.96, P<.001, and 95% LOAs of -2.2 to 1.8. A receiver operating characteristic curve showed that a central venous PCO2 of 50 mm Hg had 100% sensitivity and 84% specificity for determining significant hypercarbia (arterial PCO2 > 50 mm Hg). CONCLUSIONS: Central venous and arterial PCO2, pH, and base excess values correlate well, but their LOAs represent clinically significant ranges that could affect management. Although VBGs cannot be substituted for ABGs in mechanically ventilated trauma patients during the initial phases of resuscitation, clinically reliable conclusions can be reached with VBG analysis.
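The limits of agreement (LOAs) reported above come from Bland-Altman analysis: the mean of the paired venous-arterial differences (the bias) plus or minus 1.96 standard deviations of those differences. A minimal Python sketch with made-up paired pH values, not the study's data:

```python
import statistics

# Illustrative paired measurements (hypothetical, not from the study):
# central venous and arterial pH drawn from the same patients at the same time.
venous_ph =   [7.31, 7.35, 7.28, 7.40, 7.33, 7.37]
arterial_ph = [7.34, 7.38, 7.30, 7.44, 7.36, 7.41]

# Bland-Altman: work with the per-pair differences, not the raw values
diffs = [v - a for v, a in zip(venous_ph, arterial_ph)]
bias = statistics.mean(diffs)            # systematic offset between the methods
sd = statistics.stdev(diffs)             # sample SD of the paired differences
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement

print(round(bias, 3), [round(x, 3) for x in loa])
```

Whether two methods are interchangeable then depends on a clinical judgment, as in the abstract's conclusion: the LOA interval must be narrow enough that a difference anywhere inside it would not change management.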


Subjects
Blood Gas Analysis/methods , Respiration, Artificial , Wounds and Injuries/blood , Carbon Dioxide/blood , Humans , Hydrogen-Ion Concentration , Monitoring, Physiologic , Prospective Studies , ROC Curve