Results 1 - 20 of 58
1.
J Environ Qual ; 2024 May 11.
Article in English | MEDLINE | ID: mdl-38733256

ABSTRACT

As global climate change poses a challenge to crop production, it is imperative to prioritize effective adaptation of agricultural systems based on a scientific understanding of likely impacts. In this study, we applied an integrated watershed modeling framework to examine the impacts of projected climate on runoff, soil moisture, and soil erosion under different management systems in Central Oklahoma. The framework uses measured climate data and three downscaled ensembles from the Coupled Model Intercomparison Project Phase 6 (CMIP6) at the Water Resources and Erosion watershed to assess the impact of climate change and varying climate conditions under three management systems: (1) continuous winter wheat (Triticum aestivum) under conventional tillage (WW-CT; baseline system), (2) continuous winter wheat under no-till (WW-NT), and (3) cool- and warm-season forage cover crop mixes under no-till (CC-NT). The study indicates that the occurrence of agricultural drought is projected to increase while erosion rates remain unchanged under WW-CT. In contrast, climate simulations imposed on the WW-NT and CC-NT systems significantly reduce runoff and sediment loss while preserving soil moisture. In particular, implementing the CC-NT system can bolster food security and foster sustainable farming practices in Central Oklahoma in the face of a changing climate.

2.
J Environ Manage ; 321: 115933, 2022 Nov 01.
Article in English | MEDLINE | ID: mdl-35973288

ABSTRACT

One of the greatest threats to maintaining sustainable agro-ecosystems is episodic soil loss from farm operations, which is further exacerbated by meteorological extremes. The Revised Universal Soil Loss Equation (RUSLE) is a model that combines the effects of rainfall, soil erodibility, topography, land cover, and conservation practices to estimate average annual soil loss. This study aims to quantify soil water erosion across continental South America (S.A.) through RUSLE using available datasets and to characterize the average sediment delivery ratio (SDR) of the major S.A. basins. Soil erodibility was estimated from the Global Gridded Soil Information soil database. The topographic LS-factor was derived from digital elevation models based on the Shuttle Radar Topography Mission dataset. The R-factor was estimated from a previous study developed for S.A., and the C-factor from the Global Land Cover (Copernicus Global Land Service) database. For the SDR, we used a modeling study that simulated annual average sediment transport in 27 S.A. basins. The RUSLE setup performed satisfactorily compared with other continental-scale applications, with an estimated average soil loss for S.A. of 3.8 t ha-1 year-1. Chile (>20.0 t ha-1 year-1) and Colombia (8.1 t ha-1 year-1) showed the highest soil losses. Regarding the SDR, Suriname, French Guiana, and Guyana presented the lowest values (<1.0 t ha-1 year-1). The highest soil losses were found in the Andes Cordillera of Colombia and the Center-South Region of Chile. In the former, the combination of a "high" K-factor, "very high" C-factor, and "very high" LS-factor was the leading cause; in the latter, agriculture, livestock, deforestation, and an aggressive R-factor explained the high soil loss. The basins with the highest SDR were the North Argentina - South Atlantic basin (27.73%), Mar Chiquita (2.66%), the Amazon River basin (2.32%), the Magdalena (2.14%, in the Andes Cordillera), and the Orinoco (1.83%).
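
For reference, RUSLE combines the factors described above multiplicatively. A standard statement of the equation (notation as in the RUSLE literature, not taken from this abstract's text) is:

```latex
% RUSLE: average annual soil loss per unit area (t ha-1 year-1)
A = R \cdot K \cdot LS \cdot C \cdot P
```

where A is the average annual soil loss, R the rainfall-runoff erosivity factor, K the soil erodibility factor, LS the combined slope length and steepness factor, C the cover-management factor, and P the support practice factor.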


Subjects
Environmental Monitoring, Sustainable Development, Chile, Conservation of Natural Resources, Ecosystem, Geographic Information Systems, Soil
3.
Rev. estomatol. Hered ; 32(1): 42-51, Jan.-Mar. 2022. tab, graf
Article in Spanish | LILACS-Express | LILACS | ID: biblio-1389061

ABSTRACT

Objective: To evaluate the frequency and distribution of adverse events occurring during dental treatments performed by clinical operators at a Teaching Dental Clinic during 2015. Material and Methods: An observational, descriptive, longitudinal, and prospective study was carried out to evaluate the frequency of adverse events in the dental practice of dentistry students. A System for the Registration and Notification of Adverse Events in Dentistry was implemented, covering 110 clinical operators in the areas of Restorative Dentistry, Oral Surgery, Endodontics, and Oral Rehabilitation during 2015. Results: A total of 167 adverse events were reported, representing 10.18% of all treatments performed during the monitoring and evaluation period. The most frequent events, by clinical field, were "post-restorative treatment hypersensitivity" (restorative dentistry), "overfilling or underfilling with symptoms" (endodontics), "post-extraction alveolitis" (oral surgery), and "post-preparation sensitivity of fixed prosthesis abutments" (oral rehabilitation). Conclusions: The implementation of a registration and notification system is the starting point for identifying the most frequent adverse events in dental practice, and is useful for defining procedures, developing protocols, and formulating guidelines for safe, high-quality care.

4.
J Environ Manage ; 279: 111631, 2021 Feb 01.
Article in English | MEDLINE | ID: mdl-33213990

ABSTRACT

Soil erosion is significantly increased and accelerated by unsustainable agricultural activities, making it one of the major threats to soil health and water quality worldwide. Quantifying soil erosion under different conservation practices is important for watershed management, and a framework that can capture the spatio-temporal dynamics of soil erosion by water is required. This paper presents a modeling framework that couples two physically based models: the Water Erosion Prediction Project (WEPP) and MIKE SHE/MIKE 11. Daily soil loss at grid-scale resolution was determined using WEPP, and transport processes were simulated using a generic advection-dispersion equation in the MIKE SHE/MIKE 11 models. The framework facilitates the physical simulation of sediment production at the field scale and of transport processes across the watershed. The coupled model was tested on an intensively managed agricultural watershed in Illinois. The impacts of no-till practice on both sediment production and sediment yield were evaluated using scenario-based simulations with different fractions of no-till and conventional tillage. The results showed that if no-till were implemented on all fields throughout the watershed, reductions of 76% in total soil loss and 72% in sediment yield could be achieved. In addition, if no-till were implemented in the areas most vulnerable to sediment production across the watershed, a 40% no-till implementation could achieve almost the same reduction as 100% implementation. Based on the simulation results, the impacts of no-till practice are more prominent when it is implemented where it is most needed.
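
For context, the "generic advection-dispersion equation" mentioned above, in the one-dimensional form commonly solved by river transport modules such as MIKE 11, can be written as (standard notation, not taken from the paper):

```latex
% 1-D advection-dispersion of a constituent concentration C along a channel
\frac{\partial (A C)}{\partial t}
  + \frac{\partial (Q C)}{\partial x}
  - \frac{\partial}{\partial x}\!\left( A D \frac{\partial C}{\partial x} \right)
  = -A K C + C_2 q
```

where A is the flow cross-sectional area, Q the discharge, D the longitudinal dispersion coefficient, K a linear decay coefficient, C_2 the source/sink concentration, and q the lateral inflow per unit length.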


Subjects
Environmental Monitoring, Water, Illinois, Soil, Soil Erosion
5.
Sci Total Environ ; 759: 143502, 2021 Mar 10.
Article in English | MEDLINE | ID: mdl-33221001

ABSTRACT

The use of nitrogen (N) fertilizer marked the start of modern agriculture, boosting food production to help alleviate food shortages across the globe, but at the cost of severe environmental issues and critical stress on the agroecosystem. This paper aimed to determine the fate and transport of nitrite and ammonia under future climate projections while adopting the recommended land management practices intended to reduce nitrate N in surface water to the state government target. To accomplish these objectives, a fully distributed, physically based hydrologic model, MIKE SHE, and a hydrodynamic river model, MIKE 11, were coupled with MIKE ECO-Lab to simulate the fate and transport of different forms of N in the agro-ecosystem of the Upper Sangamon River Basin (USRB). Twelve combinations of land management and climate projections were simulated to evaluate N fate and transport in the USRB from 2020 to 2050. Under current land management, the nitrate concentration in surface water was expected to exceed the EPA limit of 10 ppm on up to 2.5% of the days in the simulation period. Regulating fertilizer application rates to approximately 50% of the current rate would ensure this limit is not exceeded in the future. Implementing cover cropping alone can potentially decrease nitrate N concentrations by 33% in surface water under dry climate and in the saturated zone under future projections. By combining cover cropping with regulated application rates, the nitrate N concentration in the saturated zone was expected to decrease by 67% compared with the historical baseline. The modeling framework developed and used in this study can help evaluate the effectiveness of different management schemes aimed at reducing future nutrient loads in surface water and groundwater.
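
As a minimal sketch of the exceedance statistic reported above (the fraction of simulation days with nitrate above the 10 ppm EPA limit), assuming a daily model output series; the file and column names are hypothetical:

```python
import pandas as pd

# Hypothetical daily nitrate-N output from a coupled MIKE SHE/MIKE 11 run,
# 2020-2050, with columns "date" and "nitrate_mg_L" (mg/L ~ ppm for water).
ts = pd.read_csv("usrb_nitrate_daily.csv", parse_dates=["date"])

EPA_LIMIT_MG_L = 10.0  # EPA maximum contaminant level for nitrate-N

exceed_days = int((ts["nitrate_mg_L"] > EPA_LIMIT_MG_L).sum())
exceed_pct = 100.0 * exceed_days / len(ts)
print(f"{exceed_days} days ({exceed_pct:.1f}%) above {EPA_LIMIT_MG_L} mg/L")
```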

6.
J Environ Manage ; 249: 109327, 2019 Nov 01.
Article in English | MEDLINE | ID: mdl-31400587

ABSTRACT

The use of nitrogen (N) fertilizer boosted crop production to accommodate 7 billion people on Earth in the 20th century, but with the consequence of exacerbating N losses from agricultural landscapes. Land management practices that can prevent high N loads are constantly being sought for mitigation and conservation purposes. This study aimed to evaluate the impacts of different land management practices, under projected climate scenarios, on surface-runoff-linked N load at the field scale. A framework to analyze changes in N load at high spatiotemporal resolution under high-greenhouse-gas-emission climate projections was developed using the Pesticide Root Zone Model (PRZM) for the Willow Creek Watershed within the Fort Cobb Experimental Watershed in Oklahoma. Specifically, 12 combinations of land management and climate scenarios were evaluated based on their N load via surface runoff from 2020 to 2070. Results showed that crop rotation practices lowered both the N load and the probability of high-N-load events. Spring application reduced the negative effects in summer and fall of other land management practices, but at the risk of an increased probability of generating high N loads in April and May. The fertilizer application rate was found to be the most critical factor affecting the amount and probability of high-N-load events. By adopting a target application management approach, the monthly maximum N load could be decreased by 13% and the annual mean N load by 6%. The model framework and analysis method developed in this research can be used to analyze tradeoffs between the environmental welfare and economic benefits of N fertilizer at the field scale.


Subjects
Agriculture, Nitrogen, Climate, Climate Change, Fertilizers
8.
Rev. estomatol. Hered ; 28(3): 185-194, Jul. 2018. graf, tab
Article in Spanish | LILACS-Express | LILACS | ID: biblio-1014024

ABSTRACT

Objectives: To evaluate the technical quality of the treatments performed by operators and the quality perceived by patients of the Operative Dentistry Service at a Teaching Dental Clinic during 2014. Material and Methods: 216 randomly selected patients voluntarily completed a survey measuring the perceived quality of the service; the operative treatments they had received were then evaluated against the clinical care protocols to determine technical quality. Results: Regarding perceived quality, 74.8% of participants rated the quality of care as "Fair", followed by 15.3% who rated it as "Poor" and 5% as "Very poor"; only 4.3% rated it as "Good" and 0.6% as "Very good". Regarding technical quality, 99% of the evaluated treatments complied, as they adhered to the clinical care protocols. Conclusions: The quality perceived by patients is mostly negative or neutral; however, the technical-quality results show that the treatments adhered closely to the care protocols.

9.
J Environ Manage ; 203(Pt 1): 592-602, 2017 Dec 01.
Article in English | MEDLINE | ID: mdl-28318825

ABSTRACT

Riparian erosion is one of the major causes of sediment and contaminant loads to streams, degradation of riparian wildlife habitats, and land-loss hazards. Land and soil management practices are implemented as conservation and restoration measures to mitigate the environmental problems brought about by riparian erosion; this, however, requires identifying the areas vulnerable to soil erosion. Because of the complex interactions between the different mechanisms that govern soil erosion, and the inherent uncertainties in quantifying these processes, assessing erosion vulnerability at the watershed scale is challenging. The main objective of this study was to develop a methodology to identify areas along the riparian zone that are susceptible to erosion. The methodology integrates the physically based watershed model MIKE SHE, to simulate water movement, with a habitat suitability model, MaxEnt, to quantify the probability of presence of elevation changes (i.e., erosion) across the watershed. Presences of elevation change were estimated from two LiDAR-based elevation datasets acquired in 2009 and 2012. The changes in elevation were grouped into four categories: low (0.5-0.7 m), medium (0.7-1.0 m), high (1.0-1.7 m), and very high (1.7-5.9 m), with each category treated as a studied "species". The locations in each category were then used as the "species location" map in MaxEnt. The environmental features used as constraints on the presence of erosion were land cover, soil, stream power index, overland flow, lateral inflow, and discharge. The modeling framework was evaluated in the Fort Cobb Reservoir Experimental Watershed in south-central Oklahoma. Results showed that the areas most vulnerable to erosion were located in the upper riparian zones of the Cobb and Lake sub-watersheds. The main waterways of these sub-watersheds were also found to be prone to streambank erosion. Approximately 80% of the riparian zone (streambank included) has up to a 30% probability of experiencing erosion greater than 1.0 m. By identifying the areas most vulnerable to stream and riparian sediment mobilization, conservation and management practices can be focused on the areas needing the most attention and resources.
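
As a sketch of the input preparation described above: differencing the two LiDAR DEMs, binning the elevation-change magnitudes into the four "species" categories, and writing presence records in a samples-with-data layout for MaxEnt. The category bounds come from the abstract; the array files, grid-index coordinates, and output naming are assumptions for illustration:

```python
import numpy as np

# Hypothetical co-registered LiDAR DEMs on the same grid (2009 and 2012).
dem_2009 = np.load("dem_2009.npy")
dem_2012 = np.load("dem_2012.npy")
change = np.abs(dem_2012 - dem_2009)  # magnitude of elevation change, in meters

# Elevation-change categories from the abstract, each treated as a "species".
bounds = {"low": (0.5, 0.7), "medium": (0.7, 1.0),
          "high": (1.0, 1.7), "very_high": (1.7, 5.9)}

rows, cols = np.indices(change.shape)
with open("erosion_presences.csv", "w") as f:
    f.write("species,x,y\n")  # grid indices stand in for map coordinates here
    for name, (lo, hi) in bounds.items():
        mask = (change >= lo) & (change < hi)
        for r, c in zip(rows[mask], cols[mask]):
            f.write(f"{name},{c},{r}\n")
```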


Subjects
Conservation of Natural Resources, Soil, Environmental Monitoring, Rivers, Water Movements
10.
J Crit Care ; 39: 1-5, 2017 06.
Article in English | MEDLINE | ID: mdl-28082138

ABSTRACT

OBJECTIVE: To determine whether the length of stay at a referring institution's intensive care unit (ICU) before transfer to a tertiary/quaternary care facility is a risk factor for mortality. DESIGN: We performed a retrospective chart review of patients transferred to our ICU from referring institution ICUs over a 3-year period. Logistic regression analysis was performed to determine which factors were independently associated with increased mortality. The primary outcomes were ICU and hospital mortality. MAIN RESULTS: A total of 1248 patients were included in our study. Length of stay at the referring institution was an independent risk factor for both ICU and hospital mortality (P < .0001), with increasing lengths of stay correlating with increased mortality. Each additional day at the referring institution was associated with a 1.04-fold increase in the odds of ICU mortality (95% confidence interval, 1.02-1.06; P = .001) and a 1.029-fold increase in the odds of hospital mortality (95% confidence interval, 1.01-1.05; P = .005). CONCLUSIONS: Length of stay at the referring institution before transfer is a risk factor for worse outcomes, with longer stays associated with increased likelihood of mortality. Further studies delineating which factors most affect length of stay at referring institutions, though a difficult task, should be pursued.
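
To make the per-day estimate concrete: since the 1.04 comes from a logistic regression, it reads as an odds ratio per additional day, and the odds multiply across days, e.g.

```latex
% Odds ratio compounding across k additional days at the referring ICU
\mathrm{OR}(k\ \text{days}) = 1.04^{k}, \qquad \mathrm{OR}(7\ \text{days}) = 1.04^{7} \approx 1.32
```

that is, a week of additional referring-ICU time corresponds to roughly 32% higher odds of ICU mortality, all else being equal.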


Subjects
Critical Illness/mortality, Length of Stay, Outcome Assessment (Health Care), Patient Transfer, Referral and Consultation, APACHE, Female, Hospital Mortality, Humans, Intensive Care Units, Male, Middle Aged, Ohio, Regression Analysis, Retrospective Studies, Risk Factors
11.
Pharmacotherapy ; 37(2): 177-186, 2017 02.
Article in English | MEDLINE | ID: mdl-27997675

ABSTRACT

STUDY OBJECTIVES: To describe compliance with antibiotic recommendations based on a previously published procalcitonin (PCT)-guided algorithm in clinical practice, to compare PCT algorithm compliance rates between PCT assays ordered in the antibiotic initiation setting (PCT concentration measured less than 24 hours after antibiotic initiation or before antibiotic initiation) with those in the antibiotic continuation setting (PCT concentration measured 24 hours or more after antibiotic initiation), and to evaluate patient- and PCT-related factors independently associated with algorithm compliance in patients in the medical intensive care unit (MICU). DESIGN: Single-center retrospective cohort study. SETTING: Large MICU in a tertiary care academic medical center. PATIENTS: A total of 527 adults admitted to the MICU over a 2-year period (November 1, 2011-October 31, 2013) who had a total of 957 PCT assays performed. PCT assays whose results were determined in the MICU were allocated retrospectively to either the initiation setting cohort or the continuation setting cohort based on timing of the PCT assay. MEASUREMENTS AND MAIN RESULTS: Each PCT assay was treated as a separate episode. Antibiotic regimens were compared between the 24-hour periods before and after the results of each PCT assay and evaluated against an algorithm to determine compliance. Clinical, laboratory, PCT-related, and microbiologic variables were assessed during the 24-hour period after the PCT assay results to determine their influence on PCT algorithm compliance. A larger proportion of PCT episodes occurred in the initiation setting (540 [56.4%]) than in the continuation setting (417 [43.5%]). Overall, compliance with PCT algorithm recommendations was low (48.5%) and not significantly different between the initiation setting and the continuation setting (49.1% vs 47.7%, p=0.678). No patient-related or PCT-related factors were independently associated with PCT algorithm compliance on multivariable logistic regression. CONCLUSION: Compliance with PCT algorithm antibiotic recommendations in both the initiation and continuation settings was lower than that reported in published randomized studies. No factors were independently associated with PCT algorithm compliance. Institutions using PCT assays to guide antibiotic use should assess compliance with algorithm antibiotic recommendations. Inclusion of a formalized antimicrobial stewardship program along with a PCT-guided algorithm is highly recommended.


Subjects
Algorithms, Anti-Bacterial Agents/therapeutic use, Calcitonin/blood, Guideline Adherence, Academic Medical Centers, Aged, Cohort Studies, Female, Humans, Intensive Care Units, Logistic Models, Male, Middle Aged, Multivariate Analysis, Retrospective Studies
12.
Infect Control Hosp Epidemiol ; 38(2): 186-188, 2017 02.
Article in English | MEDLINE | ID: mdl-27852357

ABSTRACT

BACKGROUND Catheter-associated urinary tract infections (CAUTIs) are among the most common hospital-acquired infections (HAIs). Reducing CAUTI rates has become a major focus of attention due to increasing public health concerns and reimbursement implications. OBJECTIVE To implement and describe a multifaceted intervention to decrease CAUTIs in our ICUs, with an emphasis on indications for obtaining a urine culture. METHODS A project team composed of all critical care disciplines was assembled to address an institutional goal of decreasing CAUTIs. Interventions implemented between year 1 and year 2 included protocols recommended by the Centers for Disease Control and Prevention for placement, maintenance, and removal of catheters. Leaders from all critical care disciplines agreed to align routine culturing practice with American College of Critical Care Medicine (ACCCM) and Infectious Diseases Society of America (IDSA) guidelines for evaluating fever in the critically ill patient. Surveillance data for CAUTI and hospital-acquired bloodstream infection (HABSI) were recorded prospectively according to National Healthcare Safety Network (NHSN) protocols. Device utilization ratios (DURs) and rates of CAUTI, HABSI, and urine culturing were calculated and compared. RESULTS The CAUTI rate decreased from 3.0 per 1,000 catheter-days in 2013 to 1.9 in 2014. The DUR was 0.70 in 2013 and 0.68 in 2014. The HABSI rate per 1,000 patient-days decreased from 2.8 in 2013 to 2.4 in 2014. CONCLUSIONS Effectively reducing ICU CAUTI rates requires a multifaceted and collaborative approach; stewardship of culturing was a key and safe component of our successful reduction efforts.
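
For reference, the metrics reported above follow the standard NHSN definitions:

```latex
% NHSN device-associated infection surveillance metrics
\text{CAUTI rate} = \frac{\text{number of CAUTIs}}{\text{urinary catheter-days}} \times 1000,
\qquad
\text{DUR} = \frac{\text{urinary catheter-days}}{\text{patient-days}}
```

so, for example, the 2014 figures correspond to 1.9 CAUTIs per 1,000 catheter-days, with catheters in place on 68% of patient-days.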


Subjects
Catheter-Related Infections/epidemiology, Cross Infection/prevention & control, Infection Control/methods, Intensive Care Units, Urinary Tract Infections/epidemiology, Antimicrobial Stewardship/statistics & numerical data, Humans, Ohio/epidemiology, Urine/microbiology
13.
Hum Vaccin Immunother ; 12(11): 2953-2958, 2016 11.
Article in English | MEDLINE | ID: mdl-27454779

ABSTRACT

There are scarce data about pneumococcal vaccination coverage among adults in recent years. We investigated current pneumococcal vaccination coverage in Catalonia, Spain, with a cross-sectional population-based study including 2,033,465 individuals aged 50 years or older assigned to the Catalan Health Institute on 01/01/2015 (date of survey). A previously validated institutional research clinical database was used to classify study subjects by their vaccination status for both the 23-valent pneumococcal polysaccharide vaccine (PPSV23) and the 13-valent pneumococcal conjugate vaccine (PCV13), to identify comorbidities and underlying conditions, and to establish each individual's risk stratum: high risk, functional or anatomic asplenia, cochlear implants, CSF leaks, or immunocompromising conditions; medium risk, immunocompetent persons with a history of chronic cardiac or respiratory disease, liver disease, diabetes mellitus, alcoholism, and/or smoking; low risk, persons without high- or medium-risk conditions. Of the 2,033,465 individuals in the study population, 789,098 (38.8%) had received PPSV23, whereas 5,031 (0.2%) had received PCV13. PPSV23 coverage increased markedly with age: 4.8% at 50-59 years vs 35.5% at 60-69 years vs 71.9% at 70-79 years vs 79.5% at 80 years or older (p < 0.001). PCV13 coverage also increased with age, although it was very low in all age groups. PPSV23 coverage was 59.2% in the high-risk stratum, 48.3% in the medium-risk stratum, and 28.1% in the low-risk stratum (p < 0.001). For PCV13, uptake was 1.2%, 0.3%, and 0.1% in the high, medium, and low strata, respectively (p < 0.001). In conclusion, pneumococcal vaccination coverage in Catalan adults is not optimal, and is especially low for PCV13 (even in high-risk subjects).


Subjects
Pneumococcal Infections/prevention & control, Pneumococcal Vaccines/administration & dosage, Vaccination/statistics & numerical data, Aged, Aged 80 and over, Cross-Sectional Studies, Female, Humans, Male, Middle Aged, Pneumococcal Infections/epidemiology, Risk Assessment, Spain/epidemiology
14.
Am J Health Syst Pharm ; 73(11 Suppl 3): S88-93, 2016 Jun 01.
Article in English | MEDLINE | ID: mdl-27208145

ABSTRACT

PURPOSE: The performance of an updated insulin infusion protocol was evaluated at a large academic medical center. METHODS: A retrospective medical record review was performed, after a one-month run-in period, for all patients at a large academic tertiary care medical center who received the insulin infusion per protocol from January 1 through February 28, 2014. Data were evaluated to determine the median blood glucose (BG) level, the time to achieve a BG in the target range, the number of BG checks per patient per day, the time elapsed between BG checks, and the frequency of hypoglycemia (BG ≤70 mg/dL). RESULTS: A total of 170 patients were included. The median preinfusion BG was 244 mg/dL (interquartile range [IQR], 204-304 mg/dL), which decreased to a median of 168 mg/dL (IQR, 147.5-199.5 mg/dL) while the protocol was in use. However, 70 patients (41%) had a median BG concentration of ≥180 mg/dL, and in 25 patients (15%) the BG remained above 180 mg/dL. The median time to achieve the goal BG was 4.2 hours (95% confidence interval, 3.2-5.1 hours). BG checks were performed a median of every 2.1 hours (IQR, 1.4-2.3 hours). Hypoglycemia was rare, occurring in only 2 patients (1.2%). CONCLUSION: The median BG with the updated insulin infusion protocol approached the upper limit of the target BG range, and 41% of patients had a median BG above the goal range. Protocol specifications for the frequency of BG monitoring were not commonly followed, but the frequency of hypoglycemia was extremely low.


Subjects
Academic Medical Centers/standards, Computer-Assisted Drug Therapy/standards, Hypoglycemic Agents/administration & dosage, Insulin Infusion Systems/standards, Insulin/administration & dosage, Academic Medical Centers/methods, Aged, Blood Glucose/drug effects, Blood Glucose/metabolism, Computer-Assisted Drug Therapy/methods, Female, Humans, Male, Computerized Medical Records Systems/standards, Middle Aged, Retrospective Studies
15.
Am J Med Qual ; 31(6): 552-558, 2016 11.
Article in English | MEDLINE | ID: mdl-26133376

ABSTRACT

Quality improvement in the health care setting is a complex process, even more so in the critical care environment. The development of intensive care unit process measures and quality improvement strategies is associated with improved outcomes, but should be individualized to each medical center, as structure and culture differ from institution to institution. The purpose of this report is to describe the structure of quality improvement processes within a large medical intensive care unit, using examples of the study institution's successes and challenges in the areas of stat antibiotic administration, reduction of blood product waste, central line-associated bloodstream infections, and medication errors.


Subjects
Intensive Care Units/organization & administration, Quality Improvement/organization & administration, Anti-Bacterial Agents/administration & dosage, Anti-Bacterial Agents/therapeutic use, Catheter-Related Infections/prevention & control, Central Venous Catheterization/adverse effects, Central Venous Catheterization/standards, Humans, Intensive Care Units/standards, Medical Waste/prevention & control, Medication Errors/prevention & control, Outcome and Process Assessment (Health Care)
16.
Cleve Clin J Med ; 82(9): 615-24, 2015 Sep.
Article in English | MEDLINE | ID: mdl-26366959

ABSTRACT

In hospitalized patients, elevated serum lactate levels are both a marker of risk and a target of therapy. The authors describe the mechanisms underlying lactate elevations, note the risks associated with lactic acidosis, and outline a strategy for its treatment.


Subjects
Lactic Acidosis, Cardiotonic Agents/therapeutic use, Fluid Therapy/methods, Oxygen Inhalation Therapy/methods, Septic Shock/complications, Vasoconstrictor Agents/therapeutic use, Lactic Acidosis/blood, Lactic Acidosis/diagnosis, Lactic Acidosis/etiology, Lactic Acidosis/physiopathology, Lactic Acidosis/therapy, Disease Management, Humans
17.
Rev Esp Quimioter ; 28(1): 29-35, 2015 Feb.
Article in Spanish | MEDLINE | ID: mdl-25690142

ABSTRACT

BACKGROUND: Pneumococcal infections remain a major health problem worldwide. This study analysed the distribution of the distinct Streptococcus pneumoniae serotypes causing invasive pneumococcal disease (IPD) in the all-age population of the region of Tarragona (Spain) throughout 2006-2009. METHODS: A total of 237 strains were evaluated, of which 203 (85.7%) were isolated from blood cultures, 14 (5.9%) from pleural fluids, 13 (5.5%) from CSF samples, and 7 (3%) from other sterile sites. Forty-seven cases (19.8%) were children ≤14 years, 94 (39.7%) were patients 15-64 years, and 96 (40.5%) were patients ≥65 years. RESULTS: Seven serotypes (1, 3, 6A, 7F, 12F, 14 and 19A) caused almost two thirds (63.3%) of cases across all ages. Serotype 1 was the most common serotype among children (44.7%) and among people 15-64 years (21.3%), whereas serotype 19A was the most common among people ≥65 years (12.5%). In the all-age population, vaccine-serotype coverage for the distinct pneumococcal polysaccharide vaccine (PPV) and conjugate vaccines (PCVs) was 17.3% for PCV7, 49.8% for PCV10, 73% for PCV13, and 80.2% for PPV23 (p < 0.001). Among children, vaccine-serotype coverage was 23.4% for PCV7, 72.3% for PCV10, and 83% for PCV13. Among people ≥65 years, vaccine-serotype coverage was 62.5% for PCV13 and 68.8% for PPV23. CONCLUSION: A considerable proportion of IPD cases in our population would not be covered by the current pneumococcal vaccines.


Subjects
Pneumococcal Infections/microbiology, Pneumococcal Vaccines/therapeutic use, Streptococcus pneumoniae/immunology, Adolescent, Adult, Age Factors, Pharmaceutical Chemistry, Female, Humans, Male, Middle Aged, Pneumococcal Infections/epidemiology, Prevalence, Public Health Surveillance, Serogroup, Spain/epidemiology, Conjugate Vaccines/immunology, Young Adult
18.
Am J Med Qual ; 30(4): 317-22, 2015.
Article in English | MEDLINE | ID: mdl-24755479

ABSTRACT

Intensive care unit (ICU) resources are scarce, yet demand is increasing at a rapid rate. Optimizing throughput efficiency while balancing patient safety and quality of care is of utmost importance during times of high ICU utilization. Continuous improvement methodology was used to develop a multidisciplinary workflow to improve throughput in the ICU while maintaining a high quality of care and minimizing adverse outcomes. The research team was able to decrease ICU occupancy, and therefore ICU length of stay, by implementing this process without increasing mortality or readmissions to the ICU. By improving throughput efficiency, more patients could be provided with care in the ICU.


Subjects
Bed Occupancy, Intensive Care Units, Workflow, Aged, Bed Occupancy/statistics & numerical data, Organizational Efficiency, Female, Humans, Interdisciplinary Communication, Length of Stay, Male, Middle Aged, Tertiary Care Centers
19.
Indian J Crit Care Med ; 19(11): 636-41, 2015 Nov.
Article in English | MEDLINE | ID: mdl-26730113

ABSTRACT

INTRODUCTION: The United States experienced a postpandemic outbreak of H1N1 influenza in 2013-2014. Unlike for the pandemic in 2009, the clinical course and outcomes associated with critical illness in this postpandemic outbreak have been only sparsely described. METHODS: We conducted a retrospective analysis of all patients admitted to the Medical Intensive Care Unit with H1N1 influenza infection in 2009-2010 (pandemic) and 2013-2014 (postpandemic). RESULTS: Patients admitted in the postpandemic period were older (55 ± 13 vs. 45 ± 12 years, P = 0.002) and had a higher incidence of underlying pulmonary (17 vs. 7, P = 0.0007) and cardiac (16 vs. 8, P = 0.005) disease. Mechanical ventilation was initiated in most patients in both groups (27 vs. 21, P = 1.00). The PaO2/FiO2 ratio was significantly higher in the pandemic group on days 1 (216 vs. 81, P = 0.0009), 3 (202 ± 99 vs. 100 ± 46, P = 0.002), and 7 (199 ± 103 vs. 113 ± 44, P = 0.019), but by day 14 no difference was seen between the groups. Rescue therapies were used in more patients in the postpandemic period (48% vs. 20%, P = 0.028), including more frequent use of prone ventilation (10 vs. 3, P = 0.015), inhaled vasodilator therapy (11 vs. 4, P = 0.015), and extracorporeal membrane oxygenation (ECMO) (4 vs. 2, P = NS). No significant differences in mortality were seen between the two cohorts. CONCLUSIONS: Compared with the 2009-2010 pandemic, the 2013-2014 H1N1 strain affected older patients with more underlying comorbid cardiopulmonary disease. These patients had worse oxygenation indices, and rescue modalities such as prone ventilation, inhaled epoprostenol, and ECMO were used more consistently than in the 2009 pandemic.

20.
Am J Crit Care ; 23(4): e46-53, 2014 Jul.
Article in English | MEDLINE | ID: mdl-24986179

ABSTRACT

BACKGROUND: Long-term acute care hospitals are an option for patients in intensive care units who require prolonged care after an acute illness. Predicting use of these facilities may help hospitals improve resource management, expenditures, and quality of care delivered in intensive care. OBJECTIVE: To develop a predictive tool for early identification of intensive care patients with increased probability of transfer to such a hospital. METHODS: Data on 1967 adults admitted to intensive care at a tertiary care hospital between January 2009 and June 2009 were retrospectively reviewed. The prediction model was developed by using multiple ordinal logistic regression. The model was internally validated via the bootstrapping technique and externally validated with a control cohort of 950 intensive care patients. RESULTS: Among the study group, 146 patients (7.4%) were discharged to long-term acute care hospitals and 1582 (80.4%) to home or other care facilities; 239 (12.2%) died in the intensive care unit. The final prediction algorithm showed good accuracy (bias-corrected concordance index, 0.825; 95% CI, 0.803-0.845), excellent calibration, and external validation (concordance index, 0.789; 95% CI, 0.754-0.824). Hypoalbuminemia was the greatest potential driver of increased likelihood of discharge to a long-term acute care hospital. Other important predictors were intensive care unit category, older age, extended hospital stay before admission to intensive care, severe pressure ulcers, admission source, and dependency on mechanical ventilation. CONCLUSIONS: This new predictive tool can help estimate on the first day of admission to intensive care the likelihood of a patient's discharge to a long-term acute care hospital.
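
For readers unfamiliar with the validation metric: the concordance index reported above measures discrimination. In its binary-outcome form it is the probability that the model ranks a randomly chosen patient who had the outcome above a randomly chosen patient who did not (the ordinal case generalizes this over ordered pairs):

```latex
% Concordance index (c-statistic), binary form; 0.5 = chance, 1.0 = perfect
c = \Pr\left( \hat{p}_i > \hat{p}_j \mid Y_i = 1,\; Y_j = 0 \right)
```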


Subjects
Hospitalization, Intensive Care Units, Long-Term Care, Nomograms, Patient Transfer, Age Factors, Aged, Female, Humans, Hypoalbuminemia/diagnosis, Length of Stay, Male, Middle Aged, Patient Discharge, Predictive Value of Tests, Pressure Ulcer/diagnosis, Artificial Respiration, Retrospective Studies, Risk Factors