ABSTRACT
AIMS: This study investigated Salmonella concentrations following combinations of horticultural practices including anaerobic soil disinfestation (ASD), soil amendment type, and irrigation regimen. METHODS AND RESULTS: Sandy-loam soil was inoculated with a five-serovar Salmonella cocktail (5.5 ± 0.2 log CFU per gram) and subjected to one of six treatments: (i) no soil amendment, ASD (ASD control), (ii) no soil amendment, no ASD (non-ASD control), and (iii-vi) soil amended with pelletized poultry litter, rye, rapeseed, or hairy vetch, with ASD. The effect of irrigation regimen was determined by collecting samples 3 and 7 days after irrigation. Twenty-five-gram soil samples were collected pre-ASD, post-soil saturation (i.e., during the ASD process), and at 14 time-points post-ASD, and Salmonella levels were enumerated. Log-linear models examined the effect of amendment type and irrigation regimen on Salmonella die-off during and post-ASD. During ASD, Salmonella concentrations significantly decreased in all treatments (range: -0.2 to -2.7 log CFU per gram), although the smallest decrease (-0.2 log CFU per gram, in the pelletized poultry litter treatment) was of negligible magnitude. Salmonella die-off rates varied by amendment, with an average post-ASD rate of -0.05 log CFU per gram per day (CI = -0.05, -0.04). Salmonella concentrations remained highest over the 42 days post-ASD in the pelletized poultry litter treatment, followed by the rapeseed and hairy vetch treatments. Findings suggested ASD was not able to eliminate Salmonella in soil, and certain soil amendments facilitated enhanced Salmonella survival. Salmonella serovar distribution differed by treatment, with pelletized poultry litter supporting S. Newport survival over other serovars. Irrigation appeared to assist Salmonella survival, with concentrations 0.14 log CFU per gram (CI = 0.05, 0.23) greater 3 days post-irrigation than 7 days post-irrigation.
CONCLUSIONS: ASD does not eliminate Salmonella in soil and may, in fact, depending on the soil amendment used, facilitate Salmonella survival. SIGNIFICANCE AND IMPACT OF THE STUDY: Synergistic and antagonistic effects of implementing horticultural practices on food safety hazards should be considered.
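The log-linear die-off modeling described above amounts to an ordinary least-squares fit of log-transformed counts against time, where the fitted slope is the die-off rate in log CFU per gram per day. The sketch below uses illustrative enumeration values, not the study's raw data; the resulting slope just happens to fall near the reported average.

```python
# Hypothetical post-ASD enumeration data (illustrative, not the study's):
# sampling day and Salmonella concentration in log CFU per gram.
days = [0, 3, 7, 14, 21, 28, 35, 42]
log_cfu = [3.8, 3.7, 3.5, 3.2, 2.9, 2.5, 2.2, 1.9]

# A log-linear die-off model fits log10 concentration as a straight
# line in time; the slope is the die-off rate (log CFU/g/day).
n = len(days)
mx = sum(days) / n
my = sum(log_cfu) / n
slope = sum((x - mx) * (y - my) for x, y in zip(days, log_cfu)) \
        / sum((x - mx) ** 2 for x in days)
intercept = my - slope * mx
print(f"die-off rate: {slope:.3f} log CFU/g/day")
```

A negative slope indicates net die-off; comparing slopes across amendment treatments is what the log-linear models in the study formalize.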
Subjects
Soil Microbiology; Soil; Agricultural Irrigation; Agriculture/methods; Anaerobiosis; Salmonella
ABSTRACT
INTRODUCTION: The emergency medicine (EM) workforce has been growing at a rapid rate, fueled by a large increase in the number of EM residency programs and growth in the number of Advanced Practice Providers (APPs). OBJECTIVES: To review current available data on patient volumes and characteristics, the overall physician workforce, and the current emergency physician (EP) workforce, and to project emergency physician staffing needs into the future. METHODS: Data were obtained through review of the current medical literature, reports from certifying organizations and professional societies, Web searches for alternative sources, and published governmental data. RESULTS: We conservatively estimate the demand for emergency clinicians to grow by ~1.8% per year. The actual demand for EPs will likely be lower, considering the higher growth rates seen by APPs, which will likely offset the need for increasing numbers of EPs. We estimate the overall supply of board-certified or board-eligible EPs to increase by at least 4% in the near term, which includes losses due to attrition. In light of this, we conservatively estimate the supply of board-certified or -eligible EPs should exceed demand by at least 2.2% per year. In the intermediate term, it is possible that the supply of board-certified or -eligible EPs could exceed demand by 3% or more per year. Using 2.2% growth, we estimate that the number of board-certified or board-eligible EPs should meet the anticipated demand for EPs as early as the start of 2021. Furthermore, extrapolating current trends, we anticipate the EP workforce could be 20-30% oversupplied by 2030. CONCLUSIONS: Historically, there has been a significant shortage of EPs. We project that this shortage may resolve quickly, and there is the potential for a significant oversupply in the future.
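The oversupply projection above can be illustrated with a simple compounding calculation. The ~4% supply and ~1.8% demand growth rates come from the abstract; indexing both curves to 100 and the 10-year horizon are assumptions for illustration only, not the paper's actual model.

```python
# Rough compounding sketch: EP supply growing ~4% per year against
# demand growing ~1.8% per year (rates from the abstract; the index
# base and horizon are illustrative assumptions).
supply = demand = 100.0              # index both to 100 in a common base year
for _ in range(10):                  # roughly a 2020 -> 2030 horizon
    supply *= 1.04
    demand *= 1.018
oversupply = supply / demand - 1
print(f"projected oversupply: {oversupply:.0%}")
```

Under these assumptions the gap compounds to roughly the low-to-mid 20s percent, within the 20-30% range the abstract anticipates.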
Subjects
Emergency Medicine; Health Workforce/statistics & numerical data; Physicians/supply & distribution; Career Choice; Emergency Medicine/education; Emergency Service, Hospital; Forecasting; Humans; Internship and Residency; Personnel Staffing and Scheduling; United States
ABSTRACT
OBJECTIVES: A prior single-center study demonstrated historical and exam features predicting intracranial injury (ICI) in geriatric patients with low-risk falls. We sought to prospectively validate these findings in a multicenter population. METHODS: This is a prospective observational study of patients ≥65 years presenting after a fall to three EDs. Patients were eligible if they were at baseline mental status and were not triaged to the trauma bay. Fall mechanism, head strike history, headache, loss of consciousness (LOC), anticoagulant/antiplatelet use, dementia, and signs of head trauma were recorded. Radiographic imaging was obtained at the discretion of treating physicians. Patients were called at 30 days to determine outcome in non-imaged patients. RESULTS: 723 patients (median age 83, interquartile range 74-88) were enrolled. Although all patients were at baseline mental status, 76 had GCS <15, and 154 had dementia. 406 patients were on anticoagulant/antiplatelet agents. Fifty-two (7.31%) patients had traumatic ICI. Two study variables were helpful in predicting ICI: LOC (odds ratio (OR) 2.02) and signs of head trauma (OR 2.6). The sensitivity of these items was 86.5% (CI 73.6-94) with a specificity of 38.8% (CI 35.1-42.7). The positive predictive value in this population was 10% (CI 7.5-13.3) with a negative predictive value of 97.3% (CI 94.4-98.8). Had these items been applied as a decision rule, 273 patients would not have undergone CT scanning, but 7 injuries would have been missed. CONCLUSION: In low-risk geriatric fall patients, the best predictors of ICI were physical findings of head trauma and history of LOC.
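The rule's performance figures follow from a standard 2x2 confusion table. The sketch below reconstructs counts approximately from the abstract (52 ICIs, 7 missed by the rule, 273 rule-negative patients, 723 enrolled); the derived specificity and NPV differ slightly from the published values, presumably because the published denominators were adjusted for incomplete imaging or follow-up.

```python
# 2x2 performance of a "LOC or signs of head trauma" screen, using
# counts reconstructed approximately from the abstract's figures.
tp, fn = 45, 7          # of the 52 ICIs: rule-positive vs rule-negative (missed)
tn = 273 - fn           # rule-negative patients without injury
fp = (723 - 52) - tn    # remaining patients without injury

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(f"sens={sensitivity:.1%} spec={specificity:.1%} "
      f"ppv={ppv:.1%} npv={npv:.1%}")
```

This makes explicit why a low-prevalence population (7.31% ICI) yields a low PPV even for a reasonably sensitive rule.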
Subjects
Accidental Falls/statistics & numerical data; Brain Injuries, Traumatic/diagnosis; Medical History Taking; Physical Examination; Unconsciousness/etiology; Aged; Aged, 80 and over; Brain Injuries, Traumatic/complications; Emergency Service, Hospital/statistics & numerical data; Female; Glasgow Coma Scale; Humans; Logistic Models; Male; Predictive Value of Tests; Prospective Studies; Risk Factors; Tomography, X-Ray Computed; United States
ABSTRACT
BACKGROUND: Pain is one of the most common reasons patients present to the emergency department (ED). Emergency physicians should be aware of the numerous opioid and nonopioid alternatives available for the treatment of pain. OBJECTIVES: To provide expert consensus guidelines for the safe and effective treatment of acute pain in the ED. METHODS: Multiple independent literature searches using PubMed were performed regarding treatment of acute pain. A multidisciplinary panel of experts in pharmacology and emergency medicine reviewed and discussed the literature to develop consensus guidelines. RECOMMENDATIONS: The guidelines provide resources for the safe use of opioids in the ED as well as pharmacological and nonpharmacological alternatives to opioid analgesia. Care should be tailored to the patient based on their specific acute painful condition and underlying risk factors and comorbidities. CONCLUSIONS: Analgesia in the ED should be provided in the safest and most judicious manner, with the goals of relieving acute pain while decreasing the risk of complications and opioid dependence.
Subjects
Acute Pain/drug therapy; Emergency Medicine/methods; Pain Management/methods; Analgesics/therapeutic use; Analgesics, Opioid/adverse effects; Analgesics, Opioid/therapeutic use; Decision Making; Emergency Medicine/standards; Emergency Medicine/trends; Emergency Service, Hospital/organization & administration; Emergency Service, Hospital/trends; Epidemics; Guidelines as Topic/standards; Humans; Pain Management/trends; Pain Measurement/methods; Risk Factors
ABSTRACT
BACKGROUND: The landscape of the emergency medicine workforce has changed dramatically over the last few decades. The growth in emergency medicine residency programs has significantly increased the number of emergency medicine specialists now staffing emergency departments (EDs) throughout the country. Despite this increase in available providers, rising patient volumes, an aging population, ED overcrowding and inefficiency, increased regulation, and other factors have resulted in the continued need for additional emergency physicians. OBJECTIVES: To review current available data on patient volumes and characteristics, the overall physician workforce, the current emergency physician workforce, and the impact of physician extenders and scribes on the practice of emergency medicine, and to project emergency physician staffing needs into the future. DISCUSSION AND PROJECTIONS: We project that within the next 5 to 10 years, there will be enough board-certified or -eligible emergency physicians to provide care to all patients in U.S. EDs. However, low-volume rural EDs will continue to have difficulty attracting emergency medicine specialists without significant incentives. CONCLUSIONS: There remains a shortage of board-certified emergency physicians, but it is decreasing every year. The use of physicians from other specialties to staff EDs has long been based on the theory that there is a long-standing shortage of available American Board of Emergency Medicine/American Osteopathic Board of Emergency Medicine physicians, both now and in the future. Our investigation shows that this is not supported by current data. Although there will always be regional and rural physician shortages, these are mirrored by all other specialties and are even more pressing in primary care.
Subjects
Emergency Medicine/education; Emergency Service, Hospital; Personnel Staffing and Scheduling; Certification; Education, Medical, Graduate; Forecasting; Humans; Internship and Residency; United States; Workforce
ABSTRACT
Denitrifying bioreactors (DNBRs) are an emerging technology used to remove nitrate-nitrogen (NO₃⁻) from enriched waters by supporting denitrifying microorganisms with organic carbon in an anaerobic environment. Field-scale investigations have established successful removal of NO₃⁻ from agricultural drainage, but the potential for DNBRs to remediate excess phosphorus (P) exported from agricultural systems has not been addressed. We hypothesized that biochar addition to traditional woodchip DNBRs would enhance NO₃⁻ and P removal and reduce nitrous oxide (N₂O) emissions, based on previous research demonstrating reduced leaching of NO₃⁻ and P and lower greenhouse gas production associated with biochar amendment of agricultural soils. Nine laboratory-scale DNBRs (a woodchip control and eight different woodchip-biochar treatments) were used to test the effect of biochar on nutrient removal. The biochar treatments constituted a full factorial design of three factors (biochar source material [feedstock], particle size, and application rate), each with two levels. Statistical analysis by repeated measures ANOVA showed a significant effect of biochar, time, and their interaction on NO₃⁻ and dissolved P removal. Average P removal of 65% was observed in the biochar treatments by 18 h, after which the concentrations remained stable, compared with an 8% increase in the control after 72 h. Biochar addition resulted in average NO₃⁻ removal of 86% after 18 h and 97% after 72 h, compared with only 13% at 18 h and 75% at 72 h in the control. Biochar addition also resulted in significantly lower N₂O production. These results suggest that biochar can reduce the design residence time by enhancing nutrient removal rates.
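The eight biochar treatments described above form a 2x2x2 full factorial over three factors. A quick enumeration makes the treatment structure explicit; the factor level names here are illustrative placeholders, not the study's actual feedstocks, sizes, or rates.

```python
from itertools import product

# Three factors, each at two levels, give 2**3 = 8 biochar treatments;
# level names are illustrative, not the study's.
factors = {
    "feedstock": ["hardwood", "softwood"],
    "particle_size": ["fine", "coarse"],
    "application_rate": ["low", "high"],
}
treatments = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(treatments), "biochar treatments")   # plus 1 woodchip control = 9 DNBRs
for t in treatments:
    print(t)
```

With repeated sampling over time, these eight cells plus a control are exactly the layout a repeated-measures ANOVA of treatment x time would analyze.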
ABSTRACT
Ammonia (NH₃) emissions from animal manures can cause air and water quality problems. Poultry litter treatment (PLT, sodium bisulfate; Jones-Hamilton Co.) is an acidic amendment that is applied to litter in poultry houses to decrease NH₃ emissions, but currently it can only be applied once before birds are placed in the houses. This project analyzed the effect of multiple PLT applications on litter properties and NH₃ release. Volatility chambers were used to compare multiple, single, and no application of PLT to poultry litter, all with and without fresh manure applications. A field component consisted of two commercial broiler houses: one had a single, preflock PLT application, while the other received PLT reapplications during the flock using an overhead application system. In the volatility chambers, single and reapplied PLT caused greater litter moisture and lower litter pH and , relative to no PLT. After 14 d, NH₃ released from litter treated with reapplied PLT was significantly less than litter with both single and no applications. Furthermore, total N in litter was greatest in litter treated with reapplied PLT, increasing its fertilizer value. In the commercial poultry houses, PLT reapplication led to a temporary decrease in litter pH and , but these effects did not last because of continued bird excretion. Although one preflock PLT application is currently used as a successful strategy to control NH₃ during early flock growth, repeat PLT application using the overhead reapplication system was not successful because of problems with the reapplication system and litter moisture concerns.
ABSTRACT
Leaching of phosphorus (P) mobilizes edaphic and applied sources of P and is a primary pathway of concern in agricultural soils of the Delmarva Peninsula, which defines the eastern boundary of the eutrophic Chesapeake Bay. We evaluated P leaching before and after poultry litter application from intact soil columns (30 cm diameter × 50 cm depth) obtained from low- and high-P members of four dominant Delmarva Peninsula soils. Surface soil textures ranged from fine sand to silt loam, and Mehlich-3 soil P ranged from 64 to 628 mg kg⁻¹. Irrigation of soil columns before litter application pointed to surface soil P controls on dissolved P in leachate (with soil P sorption saturation providing a stronger relationship than Mehlich-3 P); however, strong relationships between P in the subsoil (45-50 cm) and leachate P concentrations were also observed (r² = 0.61-0.73). After poultry litter application (4.5 Mg ha⁻¹), leachate P concentrations and loads increased significantly for the finest-textured soils, consistent with observations that well-structured soils have the greatest propensity to transmit applied P. Phosphorus derived from poultry litter appeared to contribute 41 and 76% of total P loss in leachate from the two soils with the finest textures. Results point to soil P, including P sorption saturation, as a sound metric of P loss potential in leachate when manure is not an acute source of P but highlight the need to factor in macropore transport potential to predict leaching losses from applied P sources.
ABSTRACT
Leaching of nutrients through agricultural soils is a priority water quality concern on the Atlantic Coastal Plain. This study evaluated the effect of tillage and urea application on leaching of phosphorus (P) and nitrogen (N) from soils of the Delmarva Peninsula that had previously been under no-till management. Intact soil columns (30 cm wide × 50 cm deep) were irrigated for 6 wk to establish a baseline of leaching response. After 2 wk of drying, a subset of soil columns was subjected to simulated tillage (0-20 cm) in an attempt to curtail leaching of surface nutrients, especially P. Urea (145 kg N ha⁻¹) was then broadcast on all soils (tilled and untilled), and the columns were irrigated for another 8 wk. Comparison of leachate recoveries representing rapid and slow flows confirmed the potential to manipulate flow fractions with tillage, albeit with mixed results across soils. Leachate trends in the finer-textured soil suggest that tillage impeded macropore flow and forced greater matrix flow. Despite significant vertical stratification of soil P that suggested tillage could prevent leaching of P via macropores from the surface to the subsoil, tillage had no significant impact on P leaching losses. Relatively high levels of soil P below 20 cm may have served as the source of P enrichment in leachate waters. However, tillage did lower losses of applied urea in leachate from two of the three soils, partially confirming the study's premise that tillage would destroy macropore pathways transmitting surface constituents to the subsoil.
ABSTRACT
BACKGROUND: Falls are a major cause of morbidity in the elderly. OBJECTIVES: We describe the low-acuity elderly fall population and study which historical and clinical features predict traumatic intracranial injuries (ICIs). METHODS: This is a prospective observational study of patients at least 65 years old presenting with fall to a tertiary care facility. Patients were eligible if they were at baseline mental status and were not triaged to the trauma bay. At presentation, a data form was completed by treating physicians regarding mechanism and position of fall, history of head strike, headache, loss of consciousness (LOC), and signs of head trauma. Radiographic imaging was obtained at the discretion of treating physicians. Medical records were subsequently reviewed to determine imaging results. All patients were called in follow-up at 30 days to determine outcome in those not imaged. The study was institutional review board approved. RESULTS: A total of 799 patients were enrolled; 79.5% of patients underwent imaging. Twenty-seven had ICIs (3.4%). Fourteen had subdural hematoma, 7 had subarachnoid hemorrhage, 3 had cerebral contusion, and 3 had a combination of injuries. Logistic regression demonstrated 2 study variables that were associated with ICIs: LOC (odds ratio, 2.8; confidence interval, 1.2-6.3) and signs of head trauma (odds ratio, 13.2; confidence interval, 2.7-64.1). History of head strike, mechanism and position, headache, and anticoagulant and antiplatelet use were not associated with ICIs. CONCLUSION: Elderly fall patients who are at their baseline mental status have a low incidence of ICIs. The best predictors of ICIs are physical findings of trauma to the head and history of LOC.
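The odds ratios with confidence intervals reported above can be computed from a 2x2 exposure table with the standard Wald (log-odds) interval. The counts below are hypothetical, chosen only to be of similar magnitude to the LOC result; they are not the study's raw data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Hypothetical counts: ICI vs no ICI, split by history of LOC.
or_, lo, hi = odds_ratio_ci(a=10, b=100, c=17, d=472)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A multivariable logistic regression, as used in the study, adjusts such odds ratios for the other recorded predictors rather than computing them from marginal tables.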
Subjects
Accidental Falls/statistics & numerical data; Brain Injuries/etiology; Aged; Aged, 80 and over; Brain Injuries/diagnostic imaging; Brain Injuries/epidemiology; Hematoma, Subdural/diagnostic imaging; Hematoma, Subdural/epidemiology; Hematoma, Subdural/etiology; Humans; Male; Neuroimaging; Prospective Studies; Risk Factors; Subarachnoid Hemorrhage/diagnostic imaging; Subarachnoid Hemorrhage/epidemiology; Subarachnoid Hemorrhage/etiology; Tomography, X-Ray Computed; Trauma Centers/statistics & numerical data; Unconsciousness/diagnostic imaging; Unconsciousness/epidemiology; Unconsciousness/etiology
ABSTRACT
Historical applications of manures and fertilizers at rates exceeding crop P removal in the Mid-Atlantic region (United States) have resulted in decades of increased water quality degradation from P losses in agricultural runoff. As such, many growers in this region face restrictions on future P applications. An improved understanding of the fate, transformations, and availability of P is needed to manage P-enriched soils. We paired chemical extractions (i.e., Mehlich-3, water extractable P, and chemical fractionation) with nondestructive methods (i.e., x-ray absorption near edge structure [XANES] spectroscopy and x-ray fluorescence [XRF]) to investigate P dynamics in eight P-enriched Mid-Atlantic soils with various management histories. Chemical fractionation and XRF data were used to support XANES linear combination fits, allowing for identification of various Al, Ca, and Fe phosphates and P sorbed phases in soils amended with fertilizer, poultry litter, or dairy manure. Management history and P speciation were used to make qualitative comparisons between the eight legacy P soils; we also speculate about how P speciation may affect future management of these soils with and without additional P applications. With continued P applications, we expect an increase in semicrystalline Al and Fe-P, P sorbed to Al (hydro)oxides, and insoluble Ca-P species in these soils for all P sources. Under drawdown scenarios, we expect plant P uptake first from semicrystalline Al and Fe phosphates followed by P sorbed phases. Our results can help guide management decisions on coastal plain soils with a history of P application.
Subjects
Fertilizers; Manure; Phosphorus; Soil; Fertilizers/analysis; Manure/analysis; Phosphorus/analysis; Soil/chemistry; Environmental Monitoring; Soil Pollutants/analysis; Agriculture/methods; Mid-Atlantic Region
ABSTRACT
Objectives: Falls are common in adults aged 65 years and older and are the leading cause of traumatic brain injuries in this age group. Alcohol use may increase the risk of falls as well as the severity of resultant injuries. The aim of this study was to examine the association between self-reported alcohol use and the prevalence of intracranial hemorrhage (ICH) in this patient group. Methods: This was a secondary analysis of the Geriatric Head Trauma Short Term Outcomes Project (GREAT STOP), a study of older adults with blunt head trauma from a fall. We determined the characteristics of every fall event, including patient demographics and medical history, and clinical signs and symptoms related to head trauma. Self-reported alcohol use was categorized as none, occasional, weekly, or daily. We defined ICH as any acute ICH detected by computed tomography scan. We evaluated the association between alcohol use frequency and ICH, adjusted for patient factors and head injury risk factors. Results: Of 3128 study participants, 18.2% (n = 567) reported alcohol use: 10.3% with occasional use, 1.9% with weekly use, and 6.0% with daily use. ICH was more common in patients who used alcohol (20.5%, 22.0%, and 25.1% for occasional, weekly, and daily alcohol users, respectively, vs. 12.0% for non-users, p < 0.001). The frequency of alcohol use was independently associated with ICH, adjusted for patient and head injury risk factors. The adjusted odds ratios (with 95% confidence intervals) for occasional, weekly, and daily alcohol users were 2.0 (1.5-2.8), 2.1 (1.1-4.1), and 2.5 (1.7-3.6), respectively, consistent with a dose-response effect. Conclusions: Alcohol use in older adult emergency department patients with head trauma is relatively common. Self-reported alcohol use appears to be associated with a higher risk of ICH in a dose-dependent fashion. Fall prevention strategies may need to consider alcohol mitigation as a modifiable risk factor.
ABSTRACT
Winter cover crop performance metrics (i.e., vegetative biomass quantity and quality) affect ecosystem service provision, but they vary widely due to differences in agronomic practices, soil properties, and climate. Cereal rye (Secale cereale) is the most common winter cover crop in the United States due to its winter hardiness, low seed cost, and high biomass production. We compiled data on cereal rye winter cover crop performance metrics, agronomic practices, and soil properties across the eastern half of the United States. The dataset includes a total of 5,695 cereal rye biomass observations across 208 site-years between 2001 and 2022 and encompasses a wide range of agronomic, soil, and climate conditions. Cereal rye biomass values had a mean of 3,428 kg ha⁻¹, a median of 2,458 kg ha⁻¹, and a standard deviation of 3,163 kg ha⁻¹. The data can be used for empirical analyses; to calibrate, validate, and evaluate process-based models; and to develop decision support tools for management and policy decisions.
Subjects
Edible Grain; Secale; Agriculture; Ecosystem; Edible Grain/growth & development; Seasons; Secale/growth & development; Soil; United States
ABSTRACT
Continuous application of poultry litter (PL) significantly changes many soil properties, including soil test P (STP); Al, Fe, and Ca concentrations; and pH, which can affect the potential for P transport in surface runoff water. We conducted rainfall simulations on three historically acidic silt loam soils in Arkansas, Missouri, and Virginia to establish whether long-term PL applications would affect soil inorganic P fractions and the resulting dissolved reactive P (DRP) in runoff water. Soil samples (0-5 cm depth) were taken to identify sites ranging in Mehlich-3 STP from 20 to 1154 mg P kg⁻¹. Simulated rainfall events were conducted on 3-m plots at 6.7 cm h⁻¹, and runoff was collected for 30 min. Correlation between Mehlich-3 STP and runoff DRP indicated a linear relationship up to 833 mg Mehlich-3 P kg⁻¹. As Mehlich-3 STP increased, a concomitant increase in soil pH and Ca occurred in all soils. Soil P fractionation demonstrated that, as Mehlich-3 STP generally increased above 450 mg P kg⁻¹ (from high to very high), the easily soluble and loosely bound P fractions decreased by 3 to 10%. Water-insoluble complexes of P bound to Al and Ca were the main drivers in the reduction of DRP in runoff, accounting for up to 43 and 38% of total P, respectively. Basing runoff DRP concentration projections solely on Mehlich-3 STP may overestimate runoff P losses from soils receiving long-term PL applications due to dissolution of water-insoluble Ca-P compounds.
ABSTRACT
OBJECTIVE: To evaluate productivity of mid-level providers (MLPs) compared with emergency medicine (EM) resident physicians in an emergency department (ED) low acuity area, and to compare patient satisfaction when cared for by MLPs versus EM residents. METHODS: This was a retrospective review of EM resident physicians and MLPs in an ED low acuity area. The number of patients seen and relative value units (RVUs) generated per clinical hour worked were evaluated. A t test was used to compare resident and MLP productivity. Additionally, patients were prospectively surveyed to assess satisfaction, using survey items based on the Press-Ganey survey. Non-parametric statistics were used to analyse patient satisfaction scores. RESULTS: MLPs treated 2.21 patients per hour (CI ±0.09), while resident physicians treated 1.53 patients per hour (CI ±0.08). MLPs generated 4.01 RVUs per hour (CI ±0.18) while resident physicians generated 3.14 RVUs per hour (CI ±0.18). Resident physicians generated 2.07 RVUs per patient (CI ±0.08) while MLPs generated 1.82 RVUs per patient (CI ±0.03; p<0.001). Of the 201 completed satisfaction surveys, 126 patients were seen by MLPs and 75 were seen by residents. Overall patients were highly satisfied with their ED visit. There were no differences in any survey responses based on provider type or resident level of training. CONCLUSION: In a low acuity area of the ED, MLPs treated more patients per hour and generated more RVUs per hour than EM resident physicians. However, resident physicians generated more RVUs per patient. Patient satisfaction did not differ.
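When only summary statistics are available, as in the abstract above, an approximate two-sample comparison can be run from the means and CI half-widths alone. This assumes the reported ± values are half-widths of normal-approximation 95% CIs, which is an assumption about the paper's reporting, not something the abstract states.

```python
import math

def z_from_summary(m1, half_ci1, m2, half_ci2, z95=1.96):
    """Approximate two-sample z statistic from two means and the
    half-widths of their 95% CIs (assumed normal-approximation CIs)."""
    se1, se2 = half_ci1 / z95, half_ci2 / z95   # recover standard errors
    return (m1 - m2) / math.sqrt(se1 ** 2 + se2 ** 2)

# Patients per hour: MLPs 2.21 (±0.09) vs residents 1.53 (±0.08).
z = z_from_summary(2.21, 0.09, 1.53, 0.08)
print(f"z = {z:.1f}")   # well beyond 1.96, so clearly significant
```

With the large clinical-hour samples implied here, the z approximation and the t test the study actually used lead to the same conclusion.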
Subjects
Efficiency, Organizational/standards; Emergency Service, Hospital/organization & administration; Internship and Residency; Nurse Practitioners; Physician Assistants; Humans; Patient Satisfaction; Retrospective Studies
ABSTRACT
Efficient termination of cover crops is an important component of cover crop management. Information on termination efficiency can help in devising management plans, but estimating herbicide efficacy is a tedious task, and remote sensing technologies and vegetative indices (VIs) have not been explored for this purpose. This study was designed to evaluate potential herbicide options for the termination of wheat (Triticum aestivum L.), cereal rye (Secale cereale L.), hairy vetch (Vicia villosa Roth.), and rapeseed (Brassica napus L.), and to correlate different VIs with visible termination efficiency. Nine herbicides and one roller-crimping treatment were applied to each cover crop. Among the herbicides used, glyphosate, glyphosate + glufosinate, paraquat, and paraquat + metribuzin provided more than 95% termination of both wheat and cereal rye 28 days after treatment (DAT). For hairy vetch, 2,4-D + glufosinate and glyphosate + glufosinate resulted in 99 and 98% termination efficiency, respectively, followed by 2,4-D + glyphosate and paraquat with 92% termination efficiency 28 DAT. No herbicide provided more than 90% termination of rapeseed; the highest control was provided by paraquat (86%), 2,4-D + glufosinate (85%), and 2,4-D + glyphosate (85%). Roller-crimping (without herbicide application) did not provide effective termination of any cover crop, with 41, 61, 49, and 43% termination for wheat, cereal rye, hairy vetch, and rapeseed, respectively. Among the VIs, the Green Leaf Index had the highest Pearson correlation coefficient with the visible termination efficiency rating for wheat (r = -0.786, p < 0.0001) and cereal rye (r = -0.804, p < 0.0001), whereas for rapeseed the Normalized Difference Vegetation Index (NDVI) had the highest correlation coefficient (r = -0.655, p < 0.0001).
The study highlighted the need for tank-mixing 2,4-D or glufosinate with glyphosate for termination, instead of blanket application of glyphosate alone, for all crops including rapeseed and other broadleaf cover crops.
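The VI-versus-rating correlation used above can be sketched end to end: compute NDVI per plot from near-infrared and red reflectance, then correlate against the visible termination rating. The reflectance and rating values below are illustrative, not the study's measurements; the negative correlation simply reflects that greener (higher-NDVI) plots are less terminated.

```python
import math

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative plot-level reflectance pairs (NIR, red) and visible
# termination ratings (%); not the study's data.
reflectance = [(0.60, 0.10), (0.55, 0.12), (0.45, 0.15), (0.35, 0.20), (0.25, 0.22)]
ndvi_vals = [ndvi(nir, red) for nir, red in reflectance]
termination = [20, 35, 55, 75, 90]
r = pearson_r(ndvi_vals, termination)
print(f"r = {r:.2f}")   # negative, as in the reported correlations
```

The Green Leaf Index used in the study is computed the same way from a different band combination before the identical correlation step.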
Subjects
Herbicides; Vicia; Agriculture/methods; Remote Sensing Technology; Paraquat; Herbicides/analysis; Crops, Agricultural; Edible Grain/chemistry; 2,4-Dichlorophenoxyacetic Acid
ABSTRACT
BACKGROUND: In 2006, nearly a quarter of a million patients either arrived dead or died in the Emergency Department (ED). The role of palliative care (PC) in the ED is not well defined, and education of medical students and residents in the area is sparse. OBJECTIVES: We use an illustrative case to discuss important concepts in PC for the emergency physician (EP). The reader should be able to define hospice and PC, recognize their importance in the practice of Emergency Medicine, and understand the benefits PC has for the patient, the patient's family and caregivers, and the health care system as a whole. DISCUSSION: PC excels at treating pain and addressing end-of-life issues. Families and caregivers of patients benefit from PC in terms of improved personal quality of life after the patient's death. PC is more cost-effective than traditional medical care. CONCLUSION: Research on PC in the ED is sparse, but it is a growing need, and the EP will need to become proficient in the delivery of PC in the ED.
Subjects
Emergency Medicine/methods; Palliative Care; Patient Care Planning; Advance Directives; Caregivers/psychology; Cost of Illness; Emergency Medicine/standards; Emergency Service, Hospital/standards; Health Care Costs; Humans; Needs Assessment; Palliative Care/economics
ABSTRACT
OBJECTIVES: The objective was to prospectively validate and refine previously published criteria to determine the potential utility of chest x-ray (CXR) in the evaluation and management of patients presenting to the emergency department (ED) with nontraumatic chest pain (CP). METHODS: A prospective observational study was performed of patients presenting to three EDs in the United States with a chief complaint of nontraumatic CP. Previously defined high-risk history and examination elements were combined into a refined decision rule and these elements were recorded for each patient by the ED physician. CXR results were reviewed and analyzed to determine the presence of clinically significant findings including pneumonia, pleural effusion, pneumothorax, congestive heart failure, or the presence of a new mass. Odds ratios for each history and examination element were analyzed as well as sensitivity, specificity, and negative predictive value (NPV) of the rule overall. RESULTS: A total of 1,111 patients were enrolled and 1,089 CXRs were analyzed. There were 70 (6.4%) patients with clinically relevant findings on CXR. The refined decision rule had a sensitivity of 92.9% (confidence interval [CI] = 83.4%-97.3%) and specificity of 30.4% (CI = 27.6%-33.4%) to predict clinically relevant findings on CXR, with a NPV of 98.4% (CI = 96.1%-99.4%). Five CXRs with clinically significant findings would have been missed by application of the refined rule (three pneumonias and two pleural effusions). Applying these criteria as a CXR decision rule to this population would have reduced CXR utilization by 28.9%. CONCLUSIONS: This study validates previous research suggesting a low clinical yield for CXR in the setting of nontraumatic CP in the ED. This refined clinical decision rule has a favorable sensitivity and NPV in a patient population with low incidence of disease. Further validation is needed prior to use in practice.
Subjects
Chest Pain/diagnostic imaging , Decision Support Techniques , Radiography/statistics & numerical data , Adult , Aged , Australia , Chest Pain/etiology , Emergency Service, Hospital/organization & administration , Female , Humans , Male , Middle Aged , Prospective Studies , Sensitivity and Specificity
ABSTRACT
Between 2000 and 2010, the Eastern Shore of Virginia was implicated in four Salmonella outbreaks associated with tomato. Therefore, a multi-year study (2012-2015) was performed to investigate presumptive factors associated with Salmonella contamination within tomato fields at Virginia Tech's Eastern Shore Agricultural Research and Extension Center. Factors including irrigation water source (pond or well), soil amendment type (fresh poultry litter [PL], PL ash, or a conventional fertilizer, triple superphosphate [TSP]), and production practice (staked with plastic mulch [SP], staked without plastic mulch [SW], or non-staked without plastic mulch [NW]) were evaluated in split-plot or complete-block designs. All field experiments relied on naturally occurring Salmonella contamination, except one follow-up experiment (worst-case scenario) which examined the potential for contamination of tomato fruits when Salmonella was applied through drip irrigation. Samples were collected from pond and well water; PL, PL ash, and TSP; and the rhizosphere, leaves, and fruits of tomato plants. Salmonella was quantified using a most probable number (MPN) method, and contamination ratios were calculated for each treatment. Salmonella serovars were determined by molecular serotyping. Salmonella populations varied significantly by year; however, similar trends were evident each year. Findings showed that use of untreated pond water and raw PL amendment increased the likelihood of Salmonella detection in tomato plots. Salmonella Newport and Typhimurium were the most frequently detected serovars in pond water and PL amendment samples, respectively. Interestingly, while these factors increased the likelihood of Salmonella detection in tomato plots (rhizosphere and leaves), all tomato fruits sampled (n = 4800) from these plots were Salmonella negative.
Contamination of tomato fruits was extremely low (< 1%) even when tomato plots were artificially inoculated with an attenuated Salmonella Newport strain (10^4 CFU/mL). Furthermore, Salmonella was not detected in tomato plots irrigated using well water and amended with PL ash or TSP. Production practices also influenced the likelihood of Salmonella detection in tomato plots. Salmonella detection was higher in tomato leaf samples from NW plots, compared to SP and SW plots. This study provides evidence that attention to agricultural inputs and production practices may help reduce the likelihood of Salmonella contamination in tomato fields.
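As a minimal sketch of the most probable number approach used to quantify Salmonella, the MPN maximum-likelihood estimate can be obtained by solving the score equation numerically. The tube volumes, tube counts, and the 3-1-0 positive pattern below are illustrative assumptions, not values from the study:

```python
import math

def mpn_mle(volumes, total_tubes, positive_tubes):
    """Maximum-likelihood MPN (organisms per gram) for a serial-dilution
    tube test: solve sum(g*v / (1 - exp(-lam*v))) = sum(t*v) for lam,
    where g = positive tubes, t = total tubes, v = sample mass per tube."""
    rhs = sum(t * v for t, v in zip(total_tubes, volumes))

    def score(lam):
        return sum(g * v / (1.0 - math.exp(-lam * v))
                   for g, v in zip(positive_tubes, volumes) if g > 0) - rhs

    # score() decreases monotonically from +inf to a negative limit,
    # so bisect (in log space) for its unique root.
    lo, hi = 1e-9, 1e9
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Example: 3 tubes each at 0.1, 0.01, and 0.001 g of soil; 3-1-0 positives.
mpn = mpn_mle([0.1, 0.01, 0.001], [3, 3, 3], [3, 1, 0])
```

For the 3-1-0 pattern this solver returns roughly 43 MPN per gram, in line with the standard FDA BAM three-tube table entry for that pattern.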
ABSTRACT
Over the past decade, the Eastern Shore of Virginia (ESV) has been implicated in at least four outbreaks of salmonellosis associated with tomato, all originating from the same serovar, Salmonella enterica serovar Newport. In addition to Salmonella Newport contamination, the devastating plant disease bacterial wilt, caused by the phytopathogen Ralstonia solanacearum, threatens the sustainability of ESV tomato production. Bacterial wilt is present in most ESV tomato fields and causes severe yield losses each year. Although the connection between bacterial wilt and tomato-related salmonellosis outbreaks in ESV is of interest, the relationship between the two pathogens has never been investigated. In this study, tomato plants were root-dip inoculated with one of four treatments: (i) 8 log CFU of Salmonella Newport per ml, (ii) 5 log CFU of R. solanacearum per ml, (iii) a coinoculation of 8 log CFU of Salmonella Newport per ml plus 5 log CFU of R. solanacearum per ml, and (iv) sterile water as a control. Leaf, stem, and fruit samples were collected at the early-green-fruit stage and examined for S. enterica contamination in the internal tissues. S. enterica was recovered from 1.4 and 2.9% of leaf samples from plants inoculated with Salmonella Newport only and from plants coinoculated with Salmonella Newport plus R. solanacearum, respectively. S. enterica was recovered from 1.7 and 3.5% of fruit samples from plants inoculated with Salmonella Newport only and from plants coinoculated with Salmonella Newport plus R. solanacearum, respectively. Significantly more stem samples from plants coinoculated with Salmonella Newport plus R. solanacearum were positive for S. enterica (18.6%) than stem samples collected from plants inoculated with Salmonella Newport only (5.7%). Results suggested that R. solanacearum could influence S. enterica survival and translocation throughout the internal tissues of tomato plants.
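The stem-sample comparison (18.6% vs. 5.7%) can be sanity-checked with a Fisher's exact test. The counts below (13/70 and 4/70) are back-calculated from the reported percentages under an assumed 70 stem samples per treatment; they are assumptions for illustration, not counts taken from the paper:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of every table with the same
    margins that is no more likely than the observed one."""
    n = a + b + c + d
    row1 = a + b              # size of group 1
    col1 = a + c              # total positives across both groups
    denom = comb(n, row1)

    def prob(k):              # P(top-left cell = k) under independence
        return comb(col1, k) * comb(n - col1, row1 - k) / denom

    k_min = max(0, row1 - (n - col1))
    k_max = min(row1, col1)
    p_obs = prob(a)
    return sum(p for p in (prob(k) for k in range(k_min, k_max + 1))
               if p <= p_obs * (1 + 1e-9))   # tolerance for float ties

# Assumed counts: 13/70 coinoculated stems positive vs 4/70 Salmonella-only.
p_value = fisher_exact_two_sided(13, 57, 4, 66)
```

Under these assumed counts the two-sided p-value falls below 0.05, consistent with the abstract's claim of a significant difference.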