Results 1 - 18 of 18
1.
Epileptic Disord ; 16(4): 439-48, 2014 Dec.
Article in English | MEDLINE | ID: mdl-25498516

ABSTRACT

AIM: To determine whether there is added benefit in detecting electrographic abnormalities from 16-24 hours of continuous video-EEG in adult medical/surgical ICU patients, compared to a 30-minute EEG. METHODS: This was a prospectively enrolled, non-randomized study of 130 consecutive ICU patients for whom EEG was requested. For 117 patients, a 30-minute EEG was requested for altered mental state and/or suspected seizures; 83 patients continued with continuous video-EEG for 16-24 hours and 34 patients had only the 30-minute EEG. For 13 patients with prior seizures, continuous video-EEG was requested and was carried out for 16-24 hours. We gathered EEG data prospectively, and reviewed the medical records retrospectively to assess the impact of continuous video-EEG. RESULTS: A total of 83 continuous video-EEG recordings were performed for 16-24 hours beyond 30 minutes of routine EEG. All were slow, and 34% showed epileptiform findings in the first 30 minutes, including 2% with seizures. Over 16-24 hours, 14% developed new or additional epileptiform abnormalities, including 6% with seizures. In 8%, treatment was changed based on continuous video-EEG. Among the 34 EEGs limited to 30 minutes, almost all were slow and 18% showed epileptiform activity, including 3% with seizures. Among the 13 patients with known seizures, continuous video-EEG was slow in all and 69% had epileptiform abnormalities in the first 30 minutes, including 31% with seizures. An additional 8% developed epileptiform abnormalities over 16-24 hours. In 46%, treatment was changed based on continuous video-EEG. CONCLUSION: This study indicates that if continuous video-EEG is not available, a 30-minute EEG in the ICU has a substantial diagnostic yield and will lead to the detection of the majority of epileptiform abnormalities. In a small percentage of patients, continuous video-EEG will lead to the detection of additional epileptiform abnormalities.
In a sub-population, with a history of seizures prior to the initiation of EEG recording, the benefits of continuous video-EEG in monitoring seizure activity and influencing treatment may be greater.


Subjects
Brain Diseases/diagnosis, Electroencephalography/methods, Epilepsy/diagnosis, Intensive Care Units, Monitoring, Physiologic/methods, Videotape Recording/methods, Adult, Humans, Retrospective Studies
2.
Anesth Analg ; 115(5): 1109-19, 2012 Nov.
Article in English | MEDLINE | ID: mdl-23051883

ABSTRACT

BACKGROUND: Device-related bloodstream infections are associated with a significant increase in patient morbidity and mortality in multiple health care settings. Recently, intraoperative bacterial contamination of conventional open-lumen 3-way stopcock sets has been shown to be associated with increased patient mortality. Intraoperative use of disinfectable, needleless closed catheter devices (DNCCs) may reduce the risk of bacterial injection as compared to conventional open-lumen devices due to an intrinsic barrier to bacterial entry associated with valve design and/or the capacity for surface disinfection. However, the relative benefit of DNCC valve design (intrinsic barrier capacity) as compared to surface disinfection in attenuation of bacterial injection in the clinical environment is untested and entirely unknown. The primary aim of the current study was to investigate the relative efficacy of a novel disinfectable stopcock, the Ultraport zero, with and without disinfection in attenuating intraoperative injection of potential bacterial pathogens as compared to a conventional open-lumen stopcock intravascular device. The secondary aims were to identify risk factors for bacterial injection and to estimate the quantity of bacterial organisms injected during catheter handling. METHODS: Four hundred sixty-eight operating room environments were randomized by a computer generated list to 1 of 3 device-injection schemes: (1) injection of the Ultraport zero stopcock with hub disinfection before injection, (2) injection of the Ultraport zero stopcock without prior hub disinfection, and (3) injection of the conventional open-lumen stopcock closed with sterile caps according to usual practice. After induction of general anesthesia, the primary anesthesia provider caring for patients in each operating room environment was asked to perform a series of 5 injections of sterile saline through the assigned device into an ex vivo catheter system. 
The primary outcome was the incidence of bacterial contamination of the injected fluid column (effluent). Risk factors for effluent contamination were identified in univariate analysis, and a controlled laboratory experiment was used to generate an estimate of the bacterial load injected for contaminated effluent samples. RESULTS: The incidence of effluent bacterial contamination was 0% (0/152) for the Ultraport zero stopcock with hub disinfection before injection, 4% (7/162) for the Ultraport zero stopcock without hub disinfection before injection, and 3.2% (5/154) for the conventional open-lumen stopcock. The Ultraport zero stopcock with hub disinfection before injection was associated with a significant reduction in the risk of bacterial injection as compared to the conventional open-lumen stopcock (RR = 8.15 × 10⁻⁸, 95% CI, 3.39 × 10⁻⁸ to 1.96 × 10⁻⁷, P < 0.001), with an absolute risk reduction of 3.2% (95% CI, 0.5% to 7.4%). Provider glove use was a risk factor for effluent contamination (RR = 10.48, 95% CI, 3.16 to 34.80, P < 0.001). The estimated quantity of bacteria injected reached a clinically significant threshold of 50,000 colony-forming units per each injection series. CONCLUSIONS: The Ultraport zero stopcock with hub disinfection before injection was associated with a significant reduction in the risk of inadvertent bacterial injection as compared to the conventional open-lumen stopcock. Future studies should examine strategies designed to facilitate health care provider DNCC hub disinfection and proper device handling.
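The group-level risks and the absolute risk reduction can be checked directly from the counts reported above. The sketch below is illustrative arithmetic only: plain division gives exactly 0 for the zero-numerator disinfected group, whereas the abstract's RR of 8.15 × 10⁻⁸ comes from a model that handles that zero numerator.

```python
def risk(events, n):
    """Proportion of contaminated effluent samples."""
    return events / n

# Counts reported in the abstract
disinfected = risk(0, 152)    # Ultraport zero, hub disinfected first
undisinfected = risk(7, 162)  # Ultraport zero, no prior disinfection
open_lumen = risk(5, 154)     # conventional open-lumen stopcock

# Absolute risk reduction of disinfection versus the open-lumen device
arr = open_lumen - disinfected
print(f"{undisinfected:.1%} | {open_lumen:.1%} | ARR {arr:.1%}")
```

Running this reproduces the 3.2% open-lumen contamination rate and the 3.2% absolute risk reduction quoted in the abstract.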


Subjects
Catheters/microbiology, Equipment Contamination/prevention & control, Equipment Design/standards, Hand/microbiology, Health Personnel/standards, Infectious Disease Transmission, Professional-to-Patient/prevention & control, Adult, Aged, Female, Humans, Infection Control, Injections, Intravenous, Male, Middle Aged, Single-Blind Method, Stem Cells/microbiology
3.
Anesth Analg ; 112(1): 98-105, 2011 Jan.
Article in English | MEDLINE | ID: mdl-20686007

ABSTRACT

BACKGROUND: We have recently shown that intraoperative bacterial transmission to patient IV stopcock sets is associated with increased patient mortality. In this study, we hypothesized that bacterial contamination of anesthesia provider hands before patient contact is a risk factor for direct intraoperative bacterial transmission. METHODS: Dartmouth-Hitchcock Medical Center is a tertiary care and level 1 trauma center with 400 inpatient beds and 28 operating suites. The first and second operative cases in each of 92 operating rooms were randomly selected for analysis. Ten pairs of cases were excluded because of breaks in the sampling protocol or lost samples, leaving 82 paired samples for analysis. We identified cases of intraoperative bacterial transmission to the patient IV stopcock set and the anesthesia environment (adjustable pressure-limiting valve and agent dial) in each operating room pair by using a previously validated protocol. We then used biotype analysis to compare these transmitted organisms to those organisms isolated from the hands of anesthesia providers obtained before the start of each case. Provider-origin transmission was defined as potential pathogens isolated in the patient stopcock set or environment that had an identical biotype to the same organism isolated from hands of providers. We also assessed the efficacy of the current intraoperative cleaning protocol by evaluating isolated potential pathogens identified at the start of case 2. Poor intraoperative cleaning was defined as 1 or more potential pathogens found in the anesthesia environment at the start of case 2 that were not there at the beginning of case 1. We collected clinical and epidemiological data on all the cases to identify risk factors for contamination. RESULTS: One hundred sixty-four cases (82 case pairs) were studied. We identified intraoperative bacterial transmission to the IV stopcock set in 11.5% (19/164) of cases, 47% (9/19) of which were of provider origin.
We identified intraoperative bacterial transmission to the anesthesia environment in 89% (146/164) of cases, 12% (17/146) of which were of provider origin. The number of rooms that an attending anesthesiologist supervised simultaneously, the age of the patient, and patient discharge from the operating room to an intensive care unit were independent predictors of bacterial transmission events not directly linked to providers. CONCLUSION: The contaminated hands of anesthesia providers serve as a significant source of patient environmental and stopcock set contamination in the operating room. Additional sources of intraoperative bacterial transmission, including postoperative environmental cleaning practices, should be further studied.


Subjects
Anesthesia/standards, Cross Infection/transmission, Equipment Contamination/prevention & control, Hand/microbiology, Health Personnel/standards, Operating Rooms/standards, Adult, Aged, Cross Infection/microbiology, Cross Infection/prevention & control, Female, Hand Disinfection/methods, Hand Disinfection/standards, Humans, Intraoperative Period, Male, Middle Aged, Prospective Studies, Risk Factors
4.
Anesth Analg ; 108(6): 1741-6, 2009 Jun.
Article in English | MEDLINE | ID: mdl-19448195

ABSTRACT

BACKGROUND: Exposure to red blood cell (RBC) transfusions has been associated with increased mortality after cardiac surgery. We examined long-term survival for cardiac surgical patients who received one or two RBC units during index hospitalization. METHODS: Nine thousand seventy-nine consecutive patients undergoing coronary artery bypass graft, valve, or coronary artery bypass graft/valve surgery at eight centers in northern New England during 2001-2004 were examined after exclusions. A probabilistic match between the regional registry and the Social Security Administration's Death Master File determined mortality through June 30, 2006. Cox proportional hazards and propensity methods were used to calculate adjusted hazard ratios. RESULTS: Thirty-six percent of patients (n = 3254) were exposed to one or two RBC units. Forty-three percent of RBCs were given intraoperatively, 56% in the postoperative period and 1% were preoperative. Patients transfused were more likely to be anemic, older, smaller, female and with more comorbid illness. Survival was significantly decreased for all patients exposed to 1 or 2 U of RBCs during hospitalization for cardiac surgery compared with those who received none (P < 0.001). After adjustment for patient and disease characteristics, patients exposed to 1 or 2 U of RBCs had a 16% higher long-term mortality risk (adjusted hazard ratio = 1.16, 95% CI: 1.01-1.34, P = 0.035). CONCLUSIONS: Exposure to 1 or 2 U of RBCs was associated with a 16% increased hazard of decreased survival after cardiac surgery.
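Survival comparisons of this kind, with registry deaths linked through the Death Master File, are conventionally summarized with Kaplan-Meier curves before Cox adjustment. A minimal product-limit sketch in plain Python; the follow-up times and event flags below are hypothetical, purely for illustration:

```python
from collections import Counter

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimate.

    times: follow-up time per patient; events: 1 = death, 0 = censored.
    Returns (time, survival) pairs at each death time.
    """
    deaths = Counter(t for t, e in zip(times, events) if e)
    at_risk = len(times)          # everyone is at risk at the first time point
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        d = deaths.get(t, 0)
        if d:                     # survival drops only at death times
            surv *= 1 - d / at_risk
            curve.append((t, surv))
        at_risk -= sum(1 for x in times if x == t)  # leave the risk set
    return curve

# Hypothetical follow-up times (years) and death indicators
times = [1, 2, 2, 3, 4, 5, 5, 6]
events = [1, 0, 1, 1, 0, 1, 0, 0]
curve = kaplan_meier(times, events)
print(curve)
```

Comparing two such curves (transfused vs. not) is what the log-rank test and Cox model formalize, with the latter adjusting for the comorbidity imbalances noted in the abstract.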


Subjects
Cardiac Surgical Procedures/mortality, Erythrocyte Transfusion/adverse effects, Aged, Aged, 80 and over, Anemia/therapy, Cohort Studies, Coronary Artery Bypass, Female, Humans, Kaplan-Meier Estimate, Male, Middle Aged, Perioperative Care, Proportional Hazards Models, Prospective Studies, Survival, Treatment Outcome
5.
Circulation ; 114(1 Suppl): I430-4, 2006 Jul 04.
Article in English | MEDLINE | ID: mdl-16820614

ABSTRACT

BACKGROUND: Chronic obstructive pulmonary disease (COPD) is associated with increased in-hospital mortality in patients undergoing coronary artery bypass surgery (CABG). Long-term survival is less well understood. The present study examined the effect of COPD on survival after CABG. METHODS AND RESULTS: We conducted a prospective study of 33,137 consecutive isolated CABG patients between 1992 and 2001 in northern New England. Records were linked to the National Death Index for long-term mortality data. Cox proportional hazards regression was used to calculate hazard ratios (HRs). Patients were stratified by: no comorbidities (none), COPD, COPD plus comorbidities, and other comorbidities with no COPD. There were 131,434 person years of follow-up and 5344 deaths. The overall incidence rate (deaths per 100 person years) was 4.1. By group, rates were: 2.1 (none), 4.0 (COPD alone), 5.5 (other), and 9.4 (COPD plus comorbidities); log-rank P<0.001. After adjustment, survival with COPD alone was worse compared with none (HR, 1.8; 95% CI, 1.6 to 2.1; P<0.001). Patients with other comorbidities compared with none had even worse survival (HR, 2.2; 95% CI, 2.1 to 2.4; P<0.001). Patients with COPD plus other comorbidities compared with none had the worst long-term survival (HR, 3.6; 95% CI, 3.3 to 3.9; P<0.001). CONCLUSIONS: Patients with only COPD had significantly reduced long-term survival compared with patients with no comorbidities. Patients with COPD and ≥1 other comorbidity had the worst survival rate when compared with all of the other groups.
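The overall incidence rate quoted above follows directly from the reported death count and person-years of follow-up; a one-line check:

```python
def rate_per_100py(deaths, person_years):
    """Incidence rate expressed as deaths per 100 person-years."""
    return 100 * deaths / person_years

# Figures reported in the abstract: 5344 deaths over 131,434 person-years
overall = rate_per_100py(5344, 131_434)
print(round(overall, 1))  # 4.1, matching the reported overall rate
```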


Subjects
Coronary Artery Bypass, Coronary Disease/surgery, Pulmonary Disease, Chronic Obstructive/complications, Adult, Aged, Aged, 80 and over, Comorbidity, Coronary Artery Bypass/statistics & numerical data, Coronary Disease/complications, Coronary Disease/epidemiology, Databases, Factual, Diabetes Complications/surgery, Female, Follow-Up Studies, Heart Failure/epidemiology, Humans, Kidney Diseases/epidemiology, Life Tables, Male, Middle Aged, Obesity/epidemiology, Peptic Ulcer/epidemiology, Peripheral Vascular Diseases/epidemiology, Prevalence, Proportional Hazards Models, Prospective Studies, Pulmonary Disease, Chronic Obstructive/epidemiology, Registries, Survival Analysis, Treatment Outcome
6.
Circulation ; 114(1 Suppl): I43-8, 2006 Jul 04.
Article in English | MEDLINE | ID: mdl-16820613

ABSTRACT

BACKGROUND: Hemodilutional anemia during cardiopulmonary bypass (CPB) is associated with increased mortality during coronary artery bypass graft (CABG) surgery. The impact of intraoperative red blood cell (RBC) transfusion to treat anemia during surgery is less understood. We examined the relationship between anemia during CPB, RBC transfusion, and risk of low-output heart failure (LOF). METHODS AND RESULTS: Data were collected on 8004 isolated CABG patients in northern New England between 1996 and 2004. Patients were excluded if they experienced postoperative bleeding or received ≥3 units of transfused RBCs. LOF was defined as need for intraoperative or postoperative intra-aortic balloon pump, return to CPB, or ≥2 inotropes at 48 hours. A lower nadir HCT was associated with an increased risk of developing LOF (adjusted odds ratio, 0.90; 95% CI, 0.82 to 0.92; P=0.016), and that risk was further increased when patients received RBC transfusion. When adjusted for nadir hematocrit, exposure to RBC transfusion was a significant, independent predictor of LOF (adjusted odds ratio, 1.27; 95% CI, 1.00 to 1.61; P=0.047). CONCLUSIONS: In this study, we observed that exposure to both hemodilutional anemia and RBC transfusion during surgery are associated with increased risk of LOF, defined as placement of an intraoperative or postoperative intra-aortic balloon pump, return to CPB after initial separation, or treatment with ≥2 inotropes at 48 hours postoperatively, after CABG. The risk of LOF is greater among patients exposed to intraoperative RBCs versus anemia alone.


Subjects
Anemia/therapy, Cardiac Output, Low/epidemiology, Cardiopulmonary Bypass/adverse effects, Coronary Artery Bypass, Heart Failure/epidemiology, Intraoperative Complications/therapy, Postoperative Complications/epidemiology, Transfusion Reaction, Aged, Aged, 80 and over, Anemia/etiology, Blood Loss, Surgical, Blood Transfusion/standards, Blood Transfusion/statistics & numerical data, Cardiac Output, Low/drug therapy, Cardiac Output, Low/etiology, Cardiac Output, Low/surgery, Cardiotonic Agents/therapeutic use, Cohort Studies, Female, Guideline Adherence, Heart Failure/drug therapy, Heart Failure/etiology, Heart Failure/surgery, Hematocrit, Humans, Hypoxia-Ischemia, Brain/etiology, Hypoxia-Ischemia, Brain/prevention & control, Intra-Aortic Balloon Pumping, Intraoperative Complications/etiology, Maine/epidemiology, Male, Middle Aged, New Hampshire/epidemiology, Postoperative Complications/etiology, Practice Guidelines as Topic, Practice Patterns, Physicians'/statistics & numerical data, Prospective Studies, Risk, Vermont/epidemiology
7.
Circ Cardiovasc Qual Outcomes ; 6(1): 35-41, 2013 Jan 01.
Article in English | MEDLINE | ID: mdl-23300268

ABSTRACT

BACKGROUND: The survival of patients who undergo aortic valve replacement (AVR) for severe aortic stenosis with reduced preoperative ejection fractions (EFs) is not well described in the literature. METHODS AND RESULTS: Patients undergoing AVR for severe aortic stenosis were analyzed using the Northern New England Cardiovascular Disease Study Group surgical registry. Patients were stratified by preoperative EF (≥50%, 40%-49%, and <40%) and concomitant coronary artery bypass grafting. Crude and adjusted survival across strata of EF was estimated for patients up to 8 years beyond their index admission. A total of 5277 patients underwent AVR for severe aortic stenosis between 1992 and 2008. There were 727 (14%) patients with preoperative EF <40%. Preoperative EF had minimal effect on postoperative morbidity. There was no difference in 30-day mortality across EF strata among the isolated AVR cohort. Preserved EF conferred a 30-day survival benefit among the AVR+coronary artery bypass grafting population (EF≥50%, 96%; EF<40%, 91%; P=0.003). Patients with preserved EF had significantly improved 6-month and 8-year survival compared with their reduced-EF counterparts. CONCLUSIONS: Survival after AVR or AVR+coronary artery bypass grafting was most favorable among patients with preoperative preserved EF. However, patients with mild to moderately depressed EF experienced a substantial survival benefit compared with the natural history of medically treated patients. Furthermore, minor reductions of EF carried an increased risk equivalent to that of more compromised function, suggesting that patients are best served when AVR is performed before even minor reductions in myocardial function occur.


Subjects
Aortic Valve Stenosis/mortality, Aortic Valve Stenosis/surgery, Aortic Valve/surgery, Heart Valve Prosthesis Implantation, Preoperative Period, Stroke Volume/physiology, Ventricular Function, Left/physiology, Aged, Aged, 80 and over, Aortic Valve Stenosis/epidemiology, Coronary Artery Bypass, Female, Humans, Longitudinal Studies, Male, Middle Aged, New England/epidemiology, Prospective Studies, Registries, Retrospective Studies, Risk Factors, Severity of Illness Index, Survival Rate, Treatment Outcome
9.
Circ Cardiovasc Qual Outcomes ; 5(5): 638-44, 2012 Sep 01.
Article in English | MEDLINE | ID: mdl-22828825

ABSTRACT

BACKGROUND: Postoperative low-output failure (LOF) is an important contributor to morbidity and mortality after coronary artery bypass grafting surgery. We sought to understand which pre- and intra-operative factors contribute to postoperative LOF and to what degree the surgeon may influence rates of LOF. METHODS AND RESULTS: We identified 11,838 patients undergoing nonemergent, isolated coronary artery bypass grafting surgery using cardiopulmonary bypass by 32 surgeons at 8 centers in northern New England from 2001 to 2009. Our cohort included patients with preoperative ejection fractions >40%. Patients with preoperative intraaortic balloon pumps were excluded. LOF was defined as the need for ≥2 inotropes at 48 hours, an intra- or postoperative intraaortic balloon pump, or return to cardiopulmonary bypass (for hemodynamic reasons). Case volume varied across the 32 surgeons (limits, 80-766; median, 344). The overall rate of LOF was 4.3% (return to cardiopulmonary bypass, 2.6%; intraaortic balloon pump, 1.0%; inotrope usage, 0.8%; combination, 1.0%). The predicted risk of LOF did not differ across surgeons (P=0.79), but observed rates varied from 1.1% to 10.2% (P<0.001). Patients operated on by low-rate surgeons had shorter clamp and bypass times, antegrade cardioplegia, longer maximum intervals between cardioplegia doses, lower cardioplegia volume per anastomosis or minute of ischemic time, and less hot-shot use. Patients operated on by higher-LOF surgeons had higher rates of postoperative acute kidney injury. CONCLUSIONS: Rates of LOF varied significantly across surgeons and could not be explained solely by patient case mix, suggesting that variability in perioperative practices influences risk of LOF.


Subjects
Cardiac Output, Low/epidemiology, Coronary Artery Bypass/adverse effects, Heart Failure/epidemiology, Perioperative Care/statistics & numerical data, Practice Patterns, Physicians'/statistics & numerical data, Aged, Cardiac Output, Low/therapy, Cardiopulmonary Bypass, Cardiotonic Agents/therapeutic use, Chi-Square Distribution, Clinical Competence/statistics & numerical data, Female, Heart Failure/therapy, Humans, Incidence, Intra-Aortic Balloon Pumping, Logistic Models, Male, Middle Aged, Multivariate Analysis, New England/epidemiology, Prospective Studies, Registries, Reoperation, Risk Assessment, Risk Factors, Time Factors, Treatment Outcome
10.
Ann Thorac Surg ; 94(6): 2038-45, 2012 Dec.
Article in English | MEDLINE | ID: mdl-22959580

ABSTRACT

BACKGROUND: We previously reported that transfusion of 1 to 2 units of red blood cells (RBCs) confers a 16% increased hazard of late death after cardiac surgical treatment. We explored whether a similar effect existed among octogenarians. METHODS: We enrolled 17,026 consecutive adult patients undergoing cardiac operations from 2001 to 2008 in northern New England. Patients receiving more than 2 units of RBCs or undergoing emergency operations were excluded. Early (to 6 months) and late (to 3 years, among those surviving longer than 6 months) survival was confirmed using the Social Security Death Index. We estimated the relationship between RBCs and survival, and any interaction by age (<80 years versus ≥80 years) or procedure. We calculated the adjusted hazard ratio (HR), and plotted adjusted survival curves. RESULTS: Patients receiving RBCs had more comorbidities irrespective of age. Patients 80 years of age or older underwent transfusion more often than patients younger than 80 years (51% versus 30%; p<0.001). There was no evidence of an interaction by age or procedure (p>0.05). Among patients younger than 80 years, RBCs significantly increased a patient's risk of early death (HR, 2.03; 95% confidence interval [CI], 1.47 to 2.80) but not late death (HR, 1.21; 95% CI, 0.88 to 1.67). RBCs did not increase the risk of early (HR, 1.47; 95% CI, 0.84 to 2.56) or late (HR, 0.92; 95% CI, 0.50 to 1.69) death in patients 80 years or older. CONCLUSIONS: Octogenarians receive RBCs more often than do younger patients. Although transfusion of 1 to 2 units of RBCs increases the risk of early death in patients younger than 80 years, this effect was not present among octogenarians. There was no significant effect of RBCs on late death in either age group.


Subjects
Anemia/therapy, Blood Transfusion/methods, Cardiac Surgical Procedures, Heart Diseases/surgery, Age Factors, Aged, 80 and over, Anemia/complications, Anemia/mortality, Blood Transfusion/mortality, Female, Follow-Up Studies, Heart Diseases/complications, Heart Diseases/mortality, Humans, Male, New England/epidemiology, Retrospective Studies, Risk Factors, Survival Rate/trends, Time Factors
11.
Ann Thorac Surg ; 92(4): 1260-7, 2011 Oct.
Article in English | MEDLINE | ID: mdl-21958769

ABSTRACT

BACKGROUND: We examined a recent regional experience to determine the effect of a prior cardiac operation on short-term and midterm outcomes after coronary artery bypass grafting (CABG). METHODS: We identified 20,703 patients who underwent nonemergent CABG at 8 centers in northern New England from 2000 to 2008, of whom 818 (3.8%) had undergone prior cardiac operations. Prior CABG using a minimal or full sternotomy was considered a prior sternotomy. Survival data out to 4 years were obtained from a link with the Social Security Administration Death Index. Hazard ratios were estimated using a Cox proportional hazards regression model, and adjusted survival curves were estimated using inverse probability weighting. In a separate analysis, 1,182 patients were matched 1:1 by their propensity for having undergone prior CABG. RESULTS: Patients with prior sternotomies had a greater burden of comorbid diseases and increased acuity and had a greater likelihood of returning to the operating room for bleeding and low cardiac output failure. Prior sternotomy was associated with an increased risk of death out to 4 years for patients undergoing CABG, with an unmatched hazard ratio of 1.34 (95% confidence interval, 1.10 to 1.64) and a matched hazard ratio of 1.36 (95% confidence interval, 1.01 to 1.81). CONCLUSIONS: Analyses of our recent regional experience with nonemergent CABG showed that a prior cardiac operation was associated with a significantly increased hazard of death at up to 4 years of follow-up.


Subjects
Coronary Artery Bypass/mortality, Coronary Artery Disease/surgery, Aged, Aged, 80 and over, Cardiac Surgical Procedures, Coronary Artery Disease/mortality, Female, Follow-Up Studies, Hospital Mortality/trends, Humans, Kaplan-Meier Estimate, Male, Middle Aged, New England/epidemiology, Postoperative Period, Propensity Score, Prospective Studies, Risk Assessment, Risk Factors, Survival Rate/trends, Time Factors
12.
Qual Saf Health Care ; 19(5): 392-8, 2010 Oct.
Article in English | MEDLINE | ID: mdl-20977993

ABSTRACT

BACKGROUND: Transfusion of red blood cells, while often used for treating blood loss or haemodilution, is also associated with higher infection rates and mortality. The authors implemented an initiative to reduce variation in the number of perioperative transfusions associated with cardiac surgery. METHODS: The authors examined patients undergoing non-emergent cardiac surgery at a single centre from the third quarter of 2004 to the second quarter of 2007. Phase I focused on understanding the current process of managing and treating perioperative anaemia. Phase II focused on (1) quality-improvement project dissemination to staff, (2) developing and implementing new protocols, and (3) assessing the effect of subsequent interventions. Data reports were updated monthly and posted in the clinical units. Phase III determined whether reductions in transfusion rates persisted. RESULTS: Indications for transfusions were investigated during Phase II. More than half (59%) of intraoperative transfusions were for low haematocrit (Hct), and 31% for predicted low Hct during cardiopulmonary bypass. Forty-three percent of postoperative transfusions were for low Hct, with an additional 16% for failure to diurese. The last Hct value recorded prior to transfusion fell from 25 to 23 (p=0.14), suggestive of a higher tolerance for a lower Hct by staff surgeons. Intraoperative transfusions diminished across phases: 33% in Phase I, 25.8% in Phase II and 23.4% in Phase III (p<0.001). Relative to Phase I, postoperative transfusions diminished significantly over Phases II and III. CONCLUSIONS: We report results from a focused quality-improvement initiative to rationalise treatment of perioperative anaemia. Transfusion rates declined significantly across each phase of the project.


Subjects
Anemia/therapy, Blood Transfusion/statistics & numerical data, Quality Assurance, Health Care, Thoracic Surgical Procedures, Aged, Cross Infection/prevention & control, Female, Humans, Male, Perioperative Care, Transfusion Reaction
13.
J Patient Saf ; 5(3): 153-9, 2009 Sep.
Article in English | MEDLINE | ID: mdl-19927048

ABSTRACT

AIM: To compare the quality between 2 commonly used sedation practices for upper endoscopic ultrasound (EUS) by using expert observational analysis of the sedation practice. METHODS: After institutional review board approval, 50 adults undergoing EUS had videotape observation of the procedural sedation: 25 received benzodiazepine/opiate administered by the endoscopy team as per the standard protocol at our institution, and 25 received propofol administered by a dedicated anesthesiologist. Quantitative analysis of the video was performed using the Dartmouth Operative Conditions Scale (DOCS). The DOCS is a tool previously developed to quantify the adequacy of procedural sedation through an objective measurement of the patient state during the sedation process. In this study, the DOCS was used in a novel way to compare the quality of sedation provided by different sedation protocols. Data were collected on patient demographics, patient and provider satisfaction, efficiency, side effects, and safety measures. RESULTS: Videotape analysis using the DOCS revealed that 52% (13/25) of the standard group exhibited an uncontrolled patient state (significant undersedation and/or oversedation) on 1 or more occasions during their EUS procedure compared with 28% (7/25) of the propofol group. Patients in the standard group spent 7.1% of the procedure in an uncontrolled patient state, whereas patients in the propofol group experienced an uncontrolled state for approximately 1.0% of the procedure time. Overall efficiency as measured by time in both the procedure room and in recovery was superior in the propofol group. These patients spent an average of 12 fewer minutes in the procedure room and were ready for discharge in about half the time (56 minutes versus 109 minutes). The propofol group experienced significantly less in-hospital and at-home nausea and vomiting and felt back to baseline status more quickly.
Finally, patient satisfaction was improved in the propofol group: 60% felt the procedure was better than anticipated versus 21% in the standard group. CONCLUSIONS: Expert videotape analysis of the patient state during procedural sedation allows direct comparison of sedation methodologies using small numbers of patients. In our institution, endoscopist-directed sedation using a midazolam/narcotic combination for EUS proved inferior to sedation using propofol given by an anesthesiologist. Specifically, a midazolam/narcotic combination provided less effective intraprocedural conditions, was less efficient both before and after the procedure, and was less satisfactory to patients as compared with propofol. Results of this type of analysis can be used to drive appropriate system redesign and improve patient care.


Subjects
Conscious Sedation , Endosonography , Hypnotics and Sedatives/therapeutic use , Observation , Practice Patterns, Physicians' , Propofol/therapeutic use , Aged , Data Collection , Endoscopy, Gastrointestinal , Humans , Middle Aged , Prospective Studies , Videotape Recording
15.
Anesth Analg ; 100(6): 1614-1621, 2005 Jun.
Article in English | MEDLINE | ID: mdl-15920183

ABSTRACT

Studies of pediatric sedation practice have suffered from the lack of an objective scale that would allow for a comparison of the effectiveness and safety of sedation provided by various providers and techniques. We present the Dartmouth Operative Conditions Scale (DOCS), which is designed as a research tool to codify the appropriateness of the procedural conditions provided by various sedation interventions. To begin, human factors methodology was used to develop a model of the pediatric sedation process and to define the criteria for measuring a patient's condition during a procedure (DOCS). To accomplish validation, 70 video clips (30-s duration) were then selected from more than 300 h of procedural videotape for testing/grading purposes. Inter-rater reliability was tested by comparing the score for each video clip among 10 different raters. Intra-rater reliability was evaluated by retesting all of the raters 1 yr after their initial rating. Construct validity was confirmed by analyzing the change in DOCS score relative to the time that sedation intervention was undertaken. Criterion validity was tested by comparing the DOCS to a modified COMFORT score. The DOCS was completed with excellent inter-rater (kappa = 0.84) and intra-rater (kappa = 0.91) agreement by 10 health care providers with various backgrounds during the 1-yr study period. Criterion validity was supported by the close correlation between the DOCS and the modified COMFORT scores for 20 distinct video clips (Spearman correlation coefficient = 0.98; P <0.001). The distribution of DOCS scores 20 min after the anesthetic induction was significantly lower than the scores before initiation of sedation, and scores after emergence were consistently higher than those 20 min after sedation (P <0.001), thus confirming construct validity of the scale. 
The DOCS is a validated research tool when used with video data for comparing the effectiveness and safety of pediatric sedation services, regardless of the technique used for decreasing anxiety or pain during a procedure.
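The inter- and intra-rater agreement figures above are kappa statistics. The abstract does not specify which multi-rater variant was computed; as a minimal sketch, here is the classic two-rater Cohen's kappa in pure Python, applied to made-up DOCS-style ratings (not study data):

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters over the same items:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    # Chance agreement: probability both raters pick the same category at random.
    chance = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - chance) / (1 - chance)

# Hypothetical scores from two raters (illustration only, not study data):
rater1 = [0, 0, 1, 2, 2, 0, 1, 1, 2, 0]
rater2 = [0, 0, 1, 2, 1, 0, 1, 2, 2, 0]
print(round(cohens_kappa(rater1, rater2), 2))
```

Kappa discounts the agreement two raters would reach by chance alone, which is why it is preferred over raw percent agreement for reliability studies such as this one.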


Subjects
Conscious Sedation/standards , Child , Conscious Sedation/adverse effects , Guidelines as Topic , Humans , Models, Statistical , Observer Variation , Recovery Room , Reproducibility of Results , Videotape Recording
16.
Crit Care Med ; 31(12 Suppl): S668-71, 2003 Dec.
Article in English | MEDLINE | ID: mdl-14724464

ABSTRACT

BACKGROUND: Anemia in the critically ill patient population is common. This anemia of critical illness is a distinct clinical entity characterized by blunted erythropoietin production and abnormalities in iron metabolism identical to what is commonly referred to as the anemia of chronic disease. FINDINGS: As a result of this anemia, critically ill patients receive an extraordinarily large number of blood transfusions. Between 40% and 50% of all patients admitted to intensive care units receive at least one red blood cell unit, and the average is close to five red blood cell units during their intensive care unit stay. There is little evidence that "routine" transfusion of stored allogeneic red blood cells is beneficial for critically ill patients. Most critically ill patients can tolerate hemoglobin levels as low as 7 g/dL, so a more conservative approach to red blood cell transfusion is warranted. CONCLUSION: Practice strategies should be directed toward a reduction of blood loss (notably from phlebotomy) and a decrease in the transfusion threshold in critically ill patients.


Subjects
Anemia/blood , Anemia/therapy , Critical Care/methods , Erythrocyte Transfusion/methods , Erythropoietin/blood , Hemorrhage/etiology , Humans , Iron/blood , Outcome and Process Assessment (Health Care) , Phlebotomy/adverse effects
17.
Anesth Analg ; 95(6): 1483-8, table of contents, 2002 Dec.
Article in English | MEDLINE | ID: mdl-12456405

ABSTRACT

UNLABELLED: Avoidance of tachycardia is a commonly described goal for anesthetic management during coronary artery bypass graft (CABG) surgery. However, an association between increased intraoperative heart rate and mortality has not been described. We conducted an observational study to evaluate the association between preinduction heart rate (heart rate upon arrival to the operating room) and in-hospital mortality during CABG surgery. Data were collected on 5934 CABG patients. Fifteen percent of patients had an increased preinduction heart rate ≥80 bpm. Crude mortality was significantly more frequent among patients with increased preinduction heart rate (P(trend) = 0.002). After adjustment for baseline differences among patients, preinduction heart rate ≥80 bpm remained associated with increased mortality (P(trend) < 0.001). The increased heart rate may be a cause of the observed mortality. Alternatively, faster heart rate may be either a marker of patients with irreversible myocardial damage, or a marker of patients with limited cardiac reserve at risk for further injury. Lastly, faster heart rate may be a marker for under-use of beta-adrenergic blockade. Because the use of preoperative beta-adrenergic blockade in CABG patients is associated with improved in-hospital survival, further investigation concerning the effect of intraoperative treatment of increased heart rate with beta-adrenergic blockers on mortality after CABG surgery is warranted. IMPLICATIONS: We conducted an observational study to evaluate the association between heart rate upon arrival to the operating room (preinduction heart rate) and in-hospital mortality during coronary artery bypass graft surgery. After adjustment for baseline differences among patients, preinduction heart rate ≥80 bpm was associated with an increased in-hospital mortality after coronary artery bypass graft surgery.


Subjects
Coronary Artery Bypass/mortality , Heart Rate , Hospital Mortality , Adrenergic beta-Antagonists/pharmacology , Adult , Aged , Aged, 80 and over , Female , Heart Rate/drug effects , Humans , Male , Middle Aged , Prospective Studies