Results 1 - 13 of 13
1.
Surg Infect (Larchmt); 22(6): 635-639, 2021 Aug.
Article in English | MEDLINE | ID: mdl-34270364

ABSTRACT

Background: Medical knowledge is growing exponentially. Despite this growth, it is estimated to take 17 years for medical innovation to reach the bedside and improve clinical care. Implementation science is the scientific study of methods to facilitate the uptake of evidence-based practice and research into routine use and policy. Discussion: Implementation science offers theories, models, and frameworks aimed at decreasing the time it takes to get medical innovation to the patient and at sustaining the resulting care improvements. Implementation science principles center on five fundamental concepts: information diffusion, dissemination, implementation, adoption, and sustainability. Understanding these concepts allows clinicians to prepare for an implementation by asking the right questions: Are we ready for change? What is the current process we want to change? Who needs to be involved in the implementation? How do we measure success? This article describes a successful catheter-associated urinary tract infection quality improvement program implemented using implementation science principles. Conclusion: Implementation science offers many proven tools and strategies for implementing new evidence-based medicine and medical innovations into common practice. Clinicians are often the leaders of change and should develop an understanding of implementation science fundamentals to allow successful implementation of quality improvement and research initiatives.


Subjects
Implementation Science; Quality Improvement; Surgical Wound Infection; Evidence-Based Medicine; Humans
2.
J Trauma Acute Care Surg; 89(6): 999-1017, 2020 Dec.
Article in English | MEDLINE | ID: mdl-32941349

ABSTRACT

BACKGROUND: Assessment of the immediate need for specific blood product transfusions in acutely bleeding patients is challenging. Clinical assessment and commonly used coagulation tests are inaccurate and time-consuming. The goal of this practice management guideline was to evaluate the role of the viscoelastic tests thromboelastography (TEG) and rotational thromboelastometry (ROTEM) in the management of acutely bleeding trauma, surgical, and critically ill patients. METHODS: Systematic review and meta-analyses of manuscripts comparing TEG/ROTEM-guided with non-TEG/ROTEM-guided blood product transfusion strategies were performed. The Grading of Recommendations Assessment, Development and Evaluation methodology was applied to assess the level of evidence and create recommendations for TEG/ROTEM-guided blood product transfusions in adult trauma, surgical, and critically ill patients. RESULTS: TEG/ROTEM-guided blood transfusions in acutely bleeding trauma, surgical, and critically ill patients were associated with a tendency toward fewer blood product transfusions in all populations. TEG/ROTEM-guided transfusions were associated with a reduced number of additional invasive hemostatic interventions (angioembolic, endoscopic, or surgical) in surgical patients and with a reduction in mortality in trauma patients. CONCLUSION: In patients with ongoing hemorrhage and concern for coagulopathy, we conditionally recommend TEG/ROTEM-guided transfusions, compared with traditional coagulation parameters, to guide blood component transfusions in each of the following three groups: adult trauma patients, adult surgical patients, and adult patients with critical illness. LEVEL OF EVIDENCE: Systematic Review/Meta-Analysis, level III.


Subjects
Blood Coagulation Disorders/blood; Blood Transfusion/standards; Hemorrhage/therapy; Practice Guidelines as Topic; Thromboelastography/methods; Adult; Blood Coagulation Disorders/diagnosis; Blood Coagulation Tests; Blood Transfusion/methods; Critical Illness; Hemorrhage/blood; Hemorrhage/etiology; Hemorrhage/mortality; Humans; Outcome Assessment, Health Care; Societies, Medical; Surgical Procedures, Operative/adverse effects; Thromboelastography/adverse effects; Wounds and Injuries/complications; Wounds and Injuries/mortality
3.
J Trauma Acute Care Surg; 87(1): 27-34, 2019 Jul.
Article in English | MEDLINE | ID: mdl-31260424

ABSTRACT

BACKGROUND: Rates of damage control laparotomy (DCL) vary widely, and consensus on appropriate indications does not exist. The purposes of this multicenter quality improvement (QI) project were to decrease the use of DCL and to identify indications where consensus exists. METHODS: In 2016, six US Level I trauma centers performed a yearlong QI project utilizing a single QI tool: audit and feedback. Each emergent trauma laparotomy was prospectively reviewed. Damage control laparotomy cases were adjudicated by majority vote of faculty members as either appropriate or potentially, in retrospect, safe for definitive laparotomy. The rate of DCL for the 2 years prior (2014 and 2015) was retrospectively collected and used as a control. To account for secular trends in DCL, interrupted time series analysis was used to assess the effectiveness of the QI interventions. RESULTS: Eight hundred seventy-two emergent laparotomies were performed: 73% definitive laparotomies, 24% DCLs, and 3% intraoperative deaths. Of the 209 DCLs, 162 (78%) were voted appropriate, and 47 (22%) were voted potentially safe for definitive laparotomy. Rates of DCL ranged from 16% to 34%. Common indications for DCL for which consensus existed were packing (103/115 [90%] appropriate) and hemodynamic instability (33/40 [83%] appropriate). The only common indication for which primary closure at the initial laparotomy could have been safely performed was avoiding a planned second look (16/32 [50%] appropriate). CONCLUSION: A single-faceted QI intervention failed to decrease the rate of DCL at six US Level I trauma centers. However, opportunities for safely decreasing the rate of DCL were present. Second-look laparotomy appears to lack consensus as an indication for DCL and may represent a target for decreasing the rate of DCL after injury. LEVEL OF EVIDENCE: Epidemiological study with one negative criterion, level III.
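
The interrupted time series analysis named in the Methods is commonly implemented as a segmented regression with terms for the pre-intervention trend, a level change, and a slope change after the intervention. Below is a minimal sketch of that approach in Python, assuming monthly DCL rates and a single intervention start; the data, variable names, and model form are illustrative and not taken from the study.

    # Minimal segmented-regression sketch for an interrupted time series (ITS)
    # analysis like the one described above. Assumes monthly DCL rates with a
    # single intervention start; all data and variable names are illustrative.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    months = np.arange(36)                     # 2014-2016: 24 pre, 12 post months
    post = (months >= 24).astype(int)          # QI project begins at month 24
    time_after = np.where(post == 1, months - 24, 0)
    dcl_rate = 0.30 - 0.001 * months - 0.02 * post + rng.normal(0, 0.02, 36)

    df = pd.DataFrame({"dcl_rate": dcl_rate, "time": months,
                       "post": post, "time_after": time_after})

    # Level change (post) and slope change (time_after) are estimated relative
    # to the pre-intervention secular trend (time).
    model = smf.ols("dcl_rate ~ time + post + time_after", data=df).fit()
    print(model.summary())

The coefficient on post estimates the immediate level change, and the coefficient on time_after estimates the change in slope relative to the secular trend.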


Subjects
Abdominal Injuries/surgery; Laparotomy/methods; Quality Improvement; Trauma Centers/statistics & numerical data; Abdominal Injuries/diagnosis; Abdominal Injuries/therapy; Adult; Female; Humans; Laparotomy/statistics & numerical data; Male; Retrospective Studies; Second-Look Surgery/methods; Second-Look Surgery/statistics & numerical data
4.
J Trauma Acute Care Surg; 87(2): 282-288, 2019 Aug.
Article in English | MEDLINE | ID: mdl-30939584

ABSTRACT

BACKGROUND: In patients for whom surgical equipoise exists between damage control laparotomy (DCL) and definitive laparotomy (DEF), the effect of DCL and its associated resource utilization are unknown. We hypothesized that DEF would be associated with fewer abdominal complications and less resource utilization. METHODS: In 2016, six US Level I trauma centers performed a yearlong prospective quality improvement project with the primary aim of safely decreasing the use of DCL. From this cohort of patients undergoing emergent trauma laparotomy, those who underwent DCL but were judged by majority faculty vote at each center to have been candidates for potential DEF (pDEF) were prospectively identified. These pDEF patients were matched 1:1 to DEF patients using propensity scoring. The primary outcome was the incidence of major abdominal complications (MAC). Deaths within 5 days were excluded. Outcomes were assessed using both Bayesian generalized linear modeling and negative binomial regression. RESULTS: Eight hundred seventy-two total patients were enrolled, 639 (73%) DEF and 209 (24%) DCL. Of the 209 DCLs, 44 survived 5 days and were judged to be patients who could safely have undergone definitive closure at the primary laparotomy. Thirty-nine pDEF patients were matched to 39 DEF patients. There were no differences in demographics, mechanism of injury, Injury Severity Score, prehospital/emergency department/operating room vital signs, laboratory values, resuscitation, or procedures performed during laparotomy. There was no difference in MAC between the two groups (31% DEF vs. 21% pDEF, relative risk 0.99, 95% credible interval 0.60-1.54, posterior probability 56%). Definitive laparotomy was associated with a 72%, 77%, and 72% posterior probability of more hospital-free, intensive care unit-free, and ventilator-free days, respectively. CONCLUSION: In patients for whom surgeons have equipoise between DCL and definitive surgery, definitive abdominal closure was associated with a similar probability of MAC but a high probability of more hospital-free, intensive care unit-free, and ventilator-free days. LEVEL OF EVIDENCE: Therapeutic/care management, level III.
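
The 1:1 propensity matching described in the Methods can be sketched as a two-stage procedure: fit a model for the probability of the exposure (here, pDEF vs. DEF) and then pair each exposed patient with the unexposed patient whose estimated propensity is closest. The sketch below uses synthetic data and a greedy nearest-neighbor match without replacement; the covariates and matching algorithm actually used by the authors are not specified here and are assumptions.

    # Sketch of 1:1 propensity-score matching, using a logistic-regression
    # propensity model and greedy nearest-neighbor matching without
    # replacement. Data and covariate names are synthetic placeholders.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 400
    X = rng.normal(size=(n, 3))                 # e.g., age, ISS, admission SBP
    treated = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))  # pDEF=1 vs DEF=0

    ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

    treated_idx = np.flatnonzero(treated == 1)
    control_idx = np.flatnonzero(treated == 0)
    available = set(control_idx)
    pairs = []
    for t in treated_idx:                       # greedy nearest-neighbor match
        if not available:
            break
        c = min(available, key=lambda j: abs(ps[j] - ps[t]))
        pairs.append((t, c))
        available.remove(c)

    print(f"matched {len(pairs)} treated/control pairs")

In practice a caliper on the propensity distance and a balance check on the matched covariates would follow.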


Subjects
Laparotomy/methods; Length of Stay/statistics & numerical data; Wounds and Injuries/surgery; Adult; Bayes Theorem; Female; Humans; Laparotomy/adverse effects; Laparotomy/mortality; Male; Middle Aged; Propensity Score; Prospective Studies; Wounds and Injuries/complications; Wounds and Injuries/mortality
5.
Surg Infect (Larchmt); 20(2): 107-110, 2019.
Article in English | MEDLINE | ID: mdl-30489217

ABSTRACT

BACKGROUND: Renovating or building a new intensive care unit (ICU) can be a challenging project. Planning the renovation or rebuild as a quality improvement project helps break the process into manageable pieces with clear goals. METHODS: The literature was reviewed with regard to ICU design and renovation, with specific attention to patient quality improvement, process and structural change, healthcare systems engineering, emerging technology, and infection control. RESULTS: In any quality improvement initiative, a first step is to create a multidisciplinary change team charged with leading the rebuild process. This team should include frontline providers, administration, architects, infection prevention specialists, and healthcare system engineers (HSEs). HSEs are specialized systems and human factors engineers who can assist with data analysis, create mathematical models to anticipate areas of difficulty, and perform simulations to support both the structural changes and the process changes aimed at eliminating nosocomial infections. Every aspect of creating a new ICU space should begin with infection control standards of practice, ranging from the selection of furniture and computer keyboards to identifying the best location for soiled utility rooms. Many infection control products may be considered during the building process, such as tele-tracking hand hygiene stations and heavy-metal-coated surfaces aimed at decreasing surface colonization and subsequent infections. CONCLUSIONS: This article offers suggestions for renovating or rebuilding an ICU aimed at eliminating the preventable harm associated with hospital-acquired infections.


Subjects
Hospital Design and Construction; Infection Control/methods; Intensive Care Units; Surgical Wound Infection/prevention & control; Humans; Patient Safety; Quality of Health Care
6.
J Trauma Acute Care Surg; 85(4): 697-703, 2018 Oct.
Article in English | MEDLINE | ID: mdl-30036259

ABSTRACT

BACKGROUND: We initiated a prospective interventional study using a nurse-driven bedside dysphagia screen (BDS) in patients with cervical spine injury (CI) to address three objectives: (1) determine the incidence of dysphagia, (2) determine the utility of the new BDS as a screening tool, and (3) compare patient outcomes, specifically dysphagia-related complications, in the study period with a retrospective cohort. METHODS: All patients with CI admitted to a Level I trauma center were enrolled in a prospective 12-month study (June 2016-June 2017) and compared with a previous 18-month cohort of similar patients. Our new protocol mandated that every patient undergo a BDS before oral intake. If the patient failed the BDS, a modified barium swallow (MBS) was obtained. Exclusion criteria were emergency department discharge, inability to participate in a BDS, leaving against medical advice, BDS protocol violations, or death before BDS. A failed MBS was defined as a change in diet and a need for a repeat MBS. Dysphagia was defined as a failed MBS or the presence of a dysphagia-related complication. RESULTS: Of 221 consecutive prospective patients identified, 114 met inclusion criteria. The incidence of dysphagia was 16.7% in all prospective study patients, 14.9% in patients with isolated CI, and 30.8% in patients with spinal cord injury. The BDS demonstrated 84.2% sensitivity, 95.8% specificity, 80.0% positive predictive value, and 96.8% negative predictive value. There were no dysphagia-related complications in the prospective study patients, significantly fewer (p = 0.048) than in the retrospective cohort of 276 patients. CONCLUSIONS: The introduction of the BDS resulted in increased dysphagia diagnoses, with a significant reduction in dysphagia-related complications. We recommend incorporating the BDS into care pathways for patients with CI. LEVEL OF EVIDENCE: Study type diagnostic test, level III.
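
The reported sensitivity, specificity, and predictive values follow from a 2x2 table of screen result versus confirmed dysphagia. The counts below are back-calculated from the reported percentages and the 114 included patients purely to illustrate the formulas; they are not quoted from the article.

    # Illustration of how sensitivity, specificity, PPV, and NPV are derived
    # from a 2x2 table of screen result vs. confirmed dysphagia. The counts are
    # back-calculated from the reported percentages, not taken from the study.
    tp, fp, fn, tn = 16, 4, 3, 91         # hypothetical 2x2 cells, total 114

    sensitivity = tp / (tp + fn)          # P(screen positive | dysphagia)
    specificity = tn / (tn + fp)          # P(screen negative | no dysphagia)
    ppv = tp / (tp + fp)                  # P(dysphagia | screen positive)
    npv = tn / (tn + fn)                  # P(no dysphagia | screen negative)

    print(f"sensitivity={sensitivity:.1%}  specificity={specificity:.1%}")
    print(f"PPV={ppv:.1%}  NPV={npv:.1%}")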


Subjects
Central Cord Syndrome/complications; Deglutition Disorders/diagnosis; Point-of-Care Testing; Spinal Fractures/complications; Adolescent; Adult; Aged; Aged, 80 and over; Cervical Vertebrae/injuries; Deglutition Disorders/etiology; Drinking; False Negative Reactions; False Positive Reactions; Female; Humans; Male; Middle Aged; Predictive Value of Tests; Prospective Studies; Surveys and Questionnaires; Water; Young Adult
7.
Surg Infect (Larchmt); 19(6): 582-586, 2018.
Article in English | MEDLINE | ID: mdl-29812994

ABSTRACT

BACKGROUND: Blood cultures (BCx) are the gold standard for diagnosing bloodstream infections. However, contamination remains a challenge and can increase cost, hospital days, and unnecessary antibiotic use. National goals are to keep overall BCx contamination rates ≤3%. Our healthcare system recently moved to a BCx system with better organism recovery, especially for gram-negative, fastidious, and anaerobic bacteria. The study objectives were to determine the benefits and consequences of implementing a more sensitive blood culture system, specifically its effect on contamination rates. METHODS: The electronic health record was queried for all BCx obtained within our tertiary-care health system from April 2015 to October 2016. Cultures were divided into those obtained 12 months before and six months after the new system was introduced. A positive BCx was defined as one with any growth. Contaminated BCx were defined as those growing coagulase-negative Staphylococcus, Corynebacterium, Bacillus, Micrococcus, or Propionibacterium acnes. Cultures with Staphylococcus aureus, Klebsiella pneumoniae, or Escherichia coli were considered to contain a true pathogen. Results based on the hospital location of the blood draw were also determined. RESULTS: A total of 20,978 blood cultures were included, 13,292 before and 7,686 after the new system was introduced. With the new system, positive BCx rates increased from 7.5% to 15.7% (p < 0.001). Contaminants increased from 2.3% to 5.4% (p < 0.001), and pathogens increased from 2.5% to 5.8% (p < 0.001). Contaminated BCx increased significantly in the surgical/trauma intensive care unit (STICU), emergency department (ED), and medical ICU (MICU), while pathogen BCx increased on the surgical floor and in the ED and MICU. CONCLUSIONS: The new blood culture system resulted in significant increases in the rates of positive, contaminated, and pathogen BCx. After the change, multiple hospital units had contamination rates >3%. These data suggest that a "better" BCx system may not be superior with regard to overall infection rates. More research is needed to determine the impact of identifying more contaminants and pathogens with the new system.
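
The before/after contamination comparison (2.3% vs. 5.4%, p < 0.001) is a two-proportion comparison. A minimal sketch is shown below; the counts are approximate back-calculations from the reported percentages and culture totals, not figures taken from the article.

    # Two-proportion z-test for the before/after contamination rates described
    # above. Counts are approximate back-calculations from the reported
    # percentages and denominators, not figures quoted in the article.
    from statsmodels.stats.proportion import proportions_ztest

    contaminated = [306, 415]        # ~2.3% of 13,292 and ~5.4% of 7,686
    totals = [13292, 7686]

    stat, pval = proportions_ztest(count=contaminated, nobs=totals)
    print(f"z = {stat:.2f}, p = {pval:.2g}")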


Subjects
Bacteremia/diagnosis; Blood Culture; Bacteremia/microbiology; Blood Culture/methods; False Positive Reactions; Humans; Quality Improvement; Sensitivity and Specificity; Tertiary Care Centers/statistics & numerical data
8.
Am Surg; 84(4): 557-564, 2018 Apr 01.
Article in English | MEDLINE | ID: mdl-29712606

ABSTRACT

The optimal number of level I trauma centers (L1TCs) in a region has not been elucidated. To begin addressing this, we compared mortality for patients treated in counties or regions with one L1TC with that in counties or regions with more than one L1TC across Ohio. Ohio Trauma Registry data from 2010 to 2012 were analyzed. Patients aged ≥15 from counties/regions with an L1TC were included. A region was defined as an L1TC-containing county and its neighboring counties. Two analyses were performed. In the county analysis, counties containing one L1TC were compared with counties containing multiple L1TCs; this comparison was repeated at the regional level for the regional analysis. Subgroup analyses were performed. A total of 38,661 and 55,064 patients were included in the county and regional analyses, respectively. Patients treated in counties or regions with multiple L1TCs were significantly younger (P < 0.001). Despite this, mortality was similar between the two groups in the county analysis and significantly higher for regions with multiple L1TCs (P < 0.001). Multivariate logistic regression demonstrated that having multiple L1TC coverage in a region was an independent predictor of death (odds ratio 1.17; 1.07-1.28; P = 0.001). Subgroup analyses showed that mortality in counties and regions with multiple L1TCs was not lower in any subgroup but was higher in patients aged ≥65 and patients with blunt injuries (P < 0.05). Having multiple L1TCs in a county was associated with increased mortality in certain patient subgroups. Having multiple L1TCs in a region was an independent predictor of death. These results should be considered carefully when designing future regionalized trauma networks. More L1TCs are not necessarily better.
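
The multivariate logistic regression result above (an odds ratio for death with multiple-L1TC coverage) is the exponentiated model coefficient, with confidence limits obtained by exponentiating the coefficient's interval. The sketch below shows that general recipe on synthetic data; the covariates and dataset are placeholders, not the Ohio Trauma Registry.

    # General recipe for the kind of multivariate logistic regression reported
    # above: odds ratios are exponentiated coefficients, with exponentiated
    # confidence limits. Data and covariates here are synthetic placeholders.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 5000
    df = pd.DataFrame({
        "age": rng.normal(45, 20, n).clip(15, 95),
        "iss": rng.integers(1, 42, n),
        "multi_l1tc": rng.binomial(1, 0.5, n),   # exposure of interest
    })
    logit_p = -6 + 0.03 * df["age"] + 0.08 * df["iss"] + 0.15 * df["multi_l1tc"]
    df["death"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    X = sm.add_constant(df[["age", "iss", "multi_l1tc"]])
    res = sm.Logit(df["death"], X).fit(disp=False)

    odds_ratios = np.exp(res.params)
    ci = np.exp(res.conf_int())                  # 95% CI by default
    print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))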


Subjects
Health Services Accessibility/statistics & numerical data; Quality Indicators, Health Care/statistics & numerical data; Trauma Centers/supply & distribution; Wounds and Injuries/mortality; Adolescent; Adult; Aged; Aged, 80 and over; Female; Humans; Logistic Models; Male; Middle Aged; Ohio/epidemiology; Registries; Retrospective Studies; Trauma Centers/standards; Wounds and Injuries/therapy; Young Adult
9.
Surg Infect (Larchmt); 18(5): 558-562, 2017 Jul.
Article in English | MEDLINE | ID: mdl-28561600

ABSTRACT

BACKGROUND: In 2013, the Centers for Disease Control and Prevention (CDC) developed new surveillance definitions for ventilator-associated events (VAE), leading to concerns that hospitals may be underreporting the true incidence of ventilator-associated pneumonia (VAP). We sought to compare rates of clinically diagnosed VAP with CDC-defined possible VAPs (PVAPs) in patients with a VAE in the surgical/trauma intensive care unit (STICU). HYPOTHESIS: A significant difference exists between rates of clinical VAP and PVAP in patients with at least one VAE. PATIENTS AND METHODS: All STICU patients with ≥1 VAE between 1/1/2013 and 10/31/2015 were identified. Age, length of stay (LOS), and ICU and ventilator days were collected. RESULTS: There were 134 patients with ≥1 VAE. Mean age was 54.3 (±17.1) years. Mean LOS, median ICU days, and median ventilator days were 26.3 (±14.1), 21.0 (17.0-33.0), and 17.0 (12.8-24.0) days, respectively. There were 68 cases of clinically diagnosed VAP, but only 37% met PVAP criteria. We compared the 43 cases of clinical VAP not meeting PVAP criteria with the 25 PVAPs. Both groups had similar outcomes. The PVAPs were more likely to have an abnormal temperature (48.0% vs. 14.0%, p = 0.004), an abnormal white blood cell count (84.0% vs. 18.6%, p < 0.001), or a new antibiotic agent initiated (100% vs. 18.6%, p < 0.001) as VAE triggers. Comparison of the 93 trauma and 41 surgical patients demonstrated that trauma patients were younger (51.2 vs. 61.5 y, p = 0.001) but had similar outcomes and rates of clinical VAP (48.4% and 43.9%, p = NS). Only 20.4% of trauma and 14.6% of surgical patients, however, had a PVAP reported. For patients with at least one VAE, the sensitivity and specificity of PVAP for detecting clinical VAP were 36.8% and 96.0%, respectively. CONCLUSION: The new CDC definition for PVAP grossly underestimates the clinical diagnosis of VAP and captures fewer than a third of the patients treated for VAP. Reporting differences were similar for trauma and surgical patients.


Subjects
Intensive Care Units/statistics & numerical data; Pneumonia, Ventilator-Associated/diagnosis; Pneumonia, Ventilator-Associated/epidemiology; Respiration, Artificial/adverse effects; Respiration, Artificial/statistics & numerical data; Trauma Centers/statistics & numerical data; Adult; Aged; Female; Humans; Male; Middle Aged; Pneumonia, Ventilator-Associated/prevention & control; Retrospective Studies; Surgical Procedures, Operative
10.
J Trauma Acute Care Surg; 81(1): 190-5, 2016 Jul.
Article in English | MEDLINE | ID: mdl-27032008

ABSTRACT

BACKGROUND: The Northern Ohio Trauma System (NOTS), established in 2010, is a collaborative regional trauma system composed of one level I and several lower-level trauma centers (TCs) across multiple hospital systems. Mortality in the NOTS counties was compared with that in other Ohio counties to assess NOTS performance. METHODS: The state trauma registry was analyzed for patients 15 years or older from 2006 to 2012. Mortality change over time was assessed by comparing all counties before and after NOTS establishment. Two analyses were performed for the post-NOTS period: (1) a county analysis, comparing Cuyahoga County, the county containing the NOTS level I TC (L1TC), with other counties containing L1TCs, and (2) a regional analysis, comparing Cuyahoga and its adjacent counties (i.e., the NOTS region) with other L1TC-containing regions. The following subgroups were included a priori: Injury Severity Score 15 or greater, age 65 years or older, and trauma mechanism. RESULTS: A total of 178,143 patients were analyzed. Cuyahoga was the only county with a decrease in mortality for both the overall group and all subgroups over time (all p < 0.05). Both the county and regional analyses showed that the overall NOTS patients were 1 to 4 years older (p < 0.05), had similar or higher Injury Severity Scores (p < 0.05), and were treated more often at lower-level TCs (p < 0.001). The county analysis demonstrated that Cuyahoga County had approximately 1% lower mortality in geriatric patients compared with non-NOTS counties. The regional analysis showed lower mortality in the NOTS region for the overall patient group, as well as for the geriatric and blunt injury subgroups. CONCLUSIONS: Cuyahoga was the only county in Ohio with a significant mortality reduction for all patient groups over time. Trauma system regionalization was associated with greater utilization of lower-level TCs and lower patient mortality. These findings suggest that a collaborative regional trauma system may be more important than the number of L1TCs in an area. LEVEL OF EVIDENCE: Therapeutic/care management study, level IV.


Subjects
Outcome and Process Assessment, Health Care; Regional Medical Programs/organization & administration; Trauma Centers/organization & administration; Wounds and Injuries/mortality; Wounds and Injuries/therapy; Adolescent; Adult; Aged; Female; Humans; Injury Severity Score; Male; Middle Aged; Ohio/epidemiology; Registries
12.
Surg Infect (Larchmt); 13(2): 121-4, 2012 Apr.
Article in English | MEDLINE | ID: mdl-22439782

ABSTRACT

BACKGROUND: Cytomegalovirus (CMV) enteritis presenting with perforation in the setting of acquired immunodeficiency syndrome (AIDS) is a particularly deadly combination. METHODS: Case report and review of the pertinent literature. CASE REPORT: The authors report a patient with AIDS and CMV enteritis that presented as recurrent small-bowel obstruction and led to perforation of the jejunum, with subsequent survival. CONCLUSION: This is believed to be the second case in the English-language literature of survival after CMV-induced small-intestinal perforation in a patient with AIDS.


Subjects
AIDS-Related Opportunistic Infections/complications; Cytomegalovirus Infections/drug therapy; Enteritis/microbiology; Intestinal Obstruction/virology; Intestinal Perforation/virology; Jejunal Diseases/virology; Antiviral Agents/therapeutic use; Emphysema/surgery; Emphysema/virology; Enteritis/surgery; Humans; Intestinal Obstruction/surgery; Intestinal Perforation/surgery; Jejunal Diseases/surgery; Male; Middle Aged; Recurrence
13.
J Trauma; 66(6): 1539-46; discussion 1546-7, 2009 Jun.
Article in English | MEDLINE | ID: mdl-19509612

ABSTRACT

BACKGROUND: Sepsis is the leading cause of mortality in noncoronary intensive care units. Recent evidence-based guidelines outline strategies for the management of sepsis, and studies have shown that early implementation of these guidelines improves survival. We developed an extensive logic-based sepsis management protocol; however, we found that early recognition of sepsis was a major obstacle to protocol implementation. To improve this, we developed a three-step sepsis screening tool with escalating levels of decision making. We hypothesized that aggressive screening for sepsis would improve early recognition of sepsis and decrease sepsis-related mortality by ensuring early, appropriate interventions. METHODS: Patients admitted to the surgical intensive care unit (SICU) were screened twice daily by our nursing staff. The initial screen assessed the systemic inflammatory response syndrome (SIRS) parameters (heart rate, temperature, white blood cell count, and respiratory rate) and assigned a numeric score (0-4) for each. Patients with a score of ≥4 screened positive and proceeded to the second step of the tool, in which a midlevel provider attempted to identify a source of infection. If the patient screened positive for both SIRS and an infection, the intensivist was notified to determine whether to implement our sepsis protocol. RESULTS: Over 5 months, 4,991 screens were completed on 920 patients. The prevalence of sepsis was 12.2%. The screening tool yielded a sensitivity of 96.5%, a specificity of 96.7%, a positive predictive value of 80.2%, and a negative predictive value of 99.5%. In addition, sepsis-related mortality decreased from 35.1% to 23.3%. CONCLUSIONS: The three-step sepsis screening tool is a valid tool for the early identification of sepsis. Implementation of this tool and our logic-based sepsis protocol has decreased sepsis-related mortality in our SICU by one third.
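
As a consistency check, the reported positive and negative predictive values follow from the reported sensitivity, specificity, and sepsis prevalence via Bayes' theorem. The short calculation below reproduces them from the figures in the abstract.

    # Reproducing the reported PPV and NPV from the reported sensitivity,
    # specificity, and sepsis prevalence using Bayes' theorem.
    sens, spec, prev = 0.965, 0.967, 0.122   # values reported in the abstract

    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)

    print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")   # ~80.2% and ~99.5%, as reported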


Subjects
Sepsis/diagnosis; Adolescent; Adult; Child; Child, Preschool; Critical Care; Female; Humans; Infant; Infant, Newborn; Male; Mass Screening; Middle Aged; Retrospective Studies; Sepsis/mortality; Young Adult