Results 1 - 20 of 22
1.
J Appl Microbiol ; 132(3): 2342-2354, 2022 Mar.
Article in English | MEDLINE | ID: mdl-34637586

ABSTRACT

AIMS: This study investigated Salmonella concentrations following combinations of horticultural practices including anaerobic soil disinfestation (ASD), soil amendment type and irrigation regimen. METHODS AND RESULTS: Sandy-loam soil was inoculated with a five-serovar Salmonella cocktail (5.5 ± 0.2 log CFU per gram) and subjected to one of six treatments: (i) no soil amendment, ASD (ASD control), (ii) no soil amendment, no-ASD (non-ASD control) and (iii-vi) soil amended with pelletized poultry litter, rye, rapeseed or hairy vetch with ASD. The effect of irrigation regimen was determined by collecting samples 3 and 7 days after irrigation. Twenty-five-gram soil samples were collected pre-ASD, post-soil saturation (i.e. ASD-process) and at 14 time-points post-ASD, and Salmonella levels were enumerated. Log-linear models examined the effect of amendment type and irrigation regimen on Salmonella die-off during and post-ASD. During ASD, Salmonella concentrations decreased significantly in all treatments (range: -0.2 to -2.7 log CFU per gram), although the smallest decrease (-0.2 log CFU per gram, in the pelletized poultry litter treatment) was of negligible magnitude. Salmonella die-off rates varied by amendment, with an average post-ASD rate of -0.05 log CFU per gram per day (CI = -0.05, -0.04). Salmonella concentrations remained highest over the 42 days post-ASD in the pelletized poultry litter treatment, followed by the rapeseed and hairy vetch treatments. Findings suggested ASD was not able to eliminate Salmonella in soil, and certain soil amendments facilitated enhanced Salmonella survival. Salmonella serovar distribution differed by treatment, with pelletized poultry litter supporting S. Newport survival over other serovars. Irrigation appeared to assist Salmonella survival, with concentrations 0.14 log CFU per gram (CI = 0.05, 0.23) greater at 3 days than at 7 days post-irrigation. CONCLUSIONS: ASD does not eliminate Salmonella in soil and, depending on the soil amendment used, may in fact facilitate Salmonella survival. SIGNIFICANCE AND IMPACT OF THE STUDY: The synergistic and antagonistic effects of horticultural practices on food safety hazards should be considered when such practices are implemented.
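The post-ASD die-off rate reported above is the slope of a log-linear model. A minimal sketch of that calculation in Python, assuming hypothetical time-course data (the values below are placeholders, not the study's measurements):

    # A minimal sketch (not the authors' code) of estimating a log-linear
    # Salmonella die-off rate from post-ASD time-course data.
    import numpy as np

    days = np.array([0, 7, 14, 21, 28, 35, 42], dtype=float)   # days post-ASD
    log_cfu = np.array([4.8, 4.5, 4.1, 3.8, 3.5, 3.1, 2.7])    # log10 CFU/g (hypothetical)

    # Fit log10(CFU/g) = intercept + slope * day; the slope is the die-off
    # rate in log CFU per gram per day (e.g., -0.05 in the study).
    slope, intercept = np.polyfit(days, log_cfu, 1)
    print(f"die-off rate: {slope:.3f} log CFU/g/day, intercept: {intercept:.2f}")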


Subject(s)
Soil Microbiology , Soil , Agricultural Irrigation , Agriculture/methods , Anaerobiosis , Salmonella
2.
J Emerg Med ; 58(2): 198-202, 2020 Feb.
Article in English | MEDLINE | ID: mdl-32253112

ABSTRACT

INTRODUCTION: The emergency medicine (EM) workforce has been growing at a rapid rate, fueled by a large increase in the number of EM residency programs and growth in the number of Advanced Practice Providers (APPs). OBJECTIVES: To review current available data on patient volumes and characteristics, the overall physician workforce, and the current emergency physician (EP) workforce, and to project EP staffing needs into the future. METHODS: Data were obtained through review of the current medical literature, reports from certifying organizations and professional societies, Web searches for alternative sources, and published governmental data. RESULTS: We conservatively estimate the demand for emergency clinicians to grow by ∼1.8% per year. The actual demand for EPs will likely be lower, as the higher growth rate of APPs will offset some of the need for additional EPs. We estimate the overall supply of board-certified or board-eligible EPs to increase by at least 4% per year in the near term, accounting for losses due to attrition. In light of this, we conservatively estimate the supply of board-certified or board-eligible EPs should exceed demand by at least 2.2% per year; in the intermediate term, supply could exceed demand by 3% or more per year. Using 2.2% growth, we estimate that the number of board-certified or board-eligible EPs should meet the anticipated demand for EPs as early as the start of 2021. Furthermore, extrapolating current trends, we anticipate the EP workforce could be 20-30% oversupplied by 2030. CONCLUSIONS: Historically, there has been a significant shortage of EPs. We project that this shortage may resolve quickly, and there is the potential for a significant oversupply in the future.
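The oversupply figure follows directly from compounding the two growth rates. A minimal sketch, assuming parity between supply and demand at 2021 (a normalization chosen for illustration, not a figure from the study):

    # Compound ~4%/yr supply growth against ~1.8%/yr demand growth,
    # starting from assumed supply/demand parity in 2021.
    supply_growth, demand_growth = 0.04, 0.018
    supply = demand = 1.0                    # normalized to parity at 2021
    for year in range(2021, 2031):
        supply *= 1 + supply_growth
        demand *= 1 + demand_growth
    oversupply = supply / demand - 1
    print(f"projected oversupply by 2030: {oversupply:.1%}")  # ~24%, within 20-30%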


Subject(s)
Emergency Medicine , Health Workforce/statistics & numerical data , Physicians/supply & distribution , Career Choice , Emergency Medicine/education , Emergency Service, Hospital , Forecasting , Humans , Internship and Residency , Personnel Staffing and Scheduling , United States
3.
Am J Emerg Med ; 37(8): 1470-1475, 2019 08.
Article in English | MEDLINE | ID: mdl-30415981

ABSTRACT

OBJECTIVES: A prior single-center study identified historical and examination features predicting intracranial injury (ICI) in geriatric patients with low-risk falls. We sought to prospectively validate these findings in a multicenter population. METHODS: This is a prospective observational study of patients ≥65 years presenting after a fall to three EDs. Patients were eligible if they were at baseline mental status and were not triaged to the trauma bay. Fall mechanism, head strike history, headache, loss of consciousness (LOC), anticoagulant/antiplatelet use, dementia, and signs of head trauma were recorded. Radiographic imaging was obtained at the discretion of treating physicians. Patients were called at 30 days to determine outcomes in non-imaged patients. RESULTS: 723 patients (median age 83 years, interquartile range 74-88) were enrolled. Although all patients were at baseline mental status, 76 had GCS <15, and 154 had dementia. 406 patients were on anticoagulant/antiplatelet agents. Fifty-two (7.31%) patients had traumatic ICI. Two study variables were helpful in predicting ICI: LOC (odds ratio [OR] 2.02) and signs of head trauma (OR 2.6). The sensitivity of these items was 86.5% (CI 73.6-94.0) with a specificity of 38.8% (CI 35.1-42.7). The positive predictive value in this population was 10% (CI 7.5-13.3) and the negative predictive value was 97.3% (CI 94.4-98.8). Had these items been applied as a decision rule, 273 patients would not have undergone CT scanning, but 7 injuries would have been missed. CONCLUSION: In low-risk geriatric fall patients, the best predictors of ICI were physical findings of head trauma and a history of LOC.
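A minimal sketch of the 2x2 test metrics behind those figures. The counts are reconstructed approximately from the abstract (52 ICI, 7 missed by the rule, 273 rule-negative, 723 total); the small gap versus the reported specificity suggests a slightly different denominator in the original analysis:

    # Reconstructed 2x2 counts (approximate, for illustration only).
    tp, fn = 52 - 7, 7          # ICI patients flagged / missed by the rule
    tn = 273 - 7                # rule-negative patients without ICI
    fp = 723 - 52 - tn          # rule-positive patients without ICI

    sensitivity = tp / (tp + fn)   # 45/52  ~ 86.5%
    specificity = tn / (tn + fp)   # 266/671 ~ 39.6% (reported: 38.8%)
    ppv = tp / (tp + fp)           # 45/450 = 10.0%
    npv = tn / (tn + fn)           # 266/273 ~ 97.4%
    print(f"sens {sensitivity:.1%}, spec {specificity:.1%}, "
          f"PPV {ppv:.1%}, NPV {npv:.1%}")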


Subject(s)
Accidental Falls/statistics & numerical data , Brain Injuries, Traumatic/diagnosis , Medical History Taking , Physical Examination , Unconsciousness/etiology , Aged , Aged, 80 and over , Brain Injuries, Traumatic/complications , Emergency Service, Hospital/statistics & numerical data , Female , Glasgow Coma Scale , Humans , Logistic Models , Male , Predictive Value of Tests , Prospective Studies , Risk Factors , Tomography, X-Ray Computed , United States
4.
J Emerg Med ; 54(5): 731-736, 2018 05.
Article in English | MEDLINE | ID: mdl-29523420

ABSTRACT

BACKGROUND: Pain is one of the most common reasons patients present to the emergency department (ED). Emergency physicians should be aware of the numerous opioid and nonopioid alternatives available for the treatment of pain. OBJECTIVES: To provide expert consensus guidelines for the safe and effective treatment of acute pain in the ED. METHODS: Multiple independent PubMed literature searches on the treatment of acute pain were performed. A multidisciplinary panel of experts in pharmacology and emergency medicine reviewed and discussed the literature to develop consensus guidelines. RECOMMENDATIONS: The guidelines provide resources for the safe use of opioids in the ED as well as pharmacological and nonpharmacological alternatives to opioid analgesia. Care should be tailored to the patient based on their specific acute painful condition and underlying risk factors and comorbidities. CONCLUSIONS: Analgesia in the ED should be provided in the safest and most judicious manner, with the goals of relieving acute pain while decreasing the risk of complications and opioid dependence.


Subject(s)
Acute Pain/drug therapy , Emergency Medicine/methods , Pain Management/methods , Analgesics/therapeutic use , Analgesics, Opioid/adverse effects , Analgesics, Opioid/therapeutic use , Decision Making , Emergency Medicine/standards , Emergency Medicine/trends , Emergency Service, Hospital/organization & administration , Emergency Service, Hospital/trends , Epidemics , Guidelines as Topic/standards , Humans , Pain Management/trends , Pain Measurement/methods , Risk Factors
5.
J Emerg Med ; 50(4): 690-3, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26823136

ABSTRACT

BACKGROUND: The landscape of the emergency medicine workforce has changed dramatically over the last few decades. The growth in emergency medicine residency programs has significantly increased the number of emergency medicine specialists now staffing emergency departments (EDs) throughout the country. Despite this increase in available providers, rising patient volumes, an aging population, ED overcrowding and inefficiency, increased regulation, and other factors have resulted in the continued need for additional emergency physicians. OBJECTIVES: To review current available data on patient volumes and characteristics, the overall physician workforce, the current emergency physician workforce, and the impact of physician extenders and scribes on the practice of emergency medicine, and to project emergency physician staffing needs into the future. DISCUSSION AND PROJECTIONS: We project that within the next 5 to 10 years, there will be enough board-certified or board-eligible emergency physicians to provide care to all patients in U.S. EDs. However, low-volume rural EDs will continue to have difficulty attracting emergency medicine specialists without significant incentives. CONCLUSIONS: There remains a shortage of board-certified emergency physicians, but it is decreasing every year. The use of physicians from other specialties to staff EDs has long been justified by the theory that there is, and will continue to be, a shortage of available American Board of Emergency Medicine/American Osteopathic Board of Emergency Medicine physicians. Our investigation shows that this is not supported by current data. Although there will always be regional and rural physician shortages, these are mirrored by all other specialties and are even more pressing in primary care.


Subject(s)
Emergency Medicine/education , Emergency Service, Hospital , Personnel Staffing and Scheduling , Certification , Education, Medical, Graduate , Forecasting , Humans , Internship and Residency , United States , Workforce
6.
J Environ Qual ; 44(6): 1903-10, 2015 Nov.
Article in English | MEDLINE | ID: mdl-26641342

ABSTRACT

Ammonia (NH3) emissions from animal manures can cause air and water quality problems. Poultry litter treatment (PLT, sodium bisulfate; Jones-Hamilton Co.) is an acidic amendment that is applied to litter in poultry houses to decrease NH3 emissions, but currently it can only be applied once, before birds are placed in the houses. This project analyzed the effect of multiple PLT applications on litter properties and NH3 release. Volatility chambers were used to compare multiple, single, and no applications of PLT to poultry litter, all with and without fresh manure applications. A field component consisted of two commercial broiler houses: one had a single, preflock PLT application, while the other received PLT reapplications during the flock using an overhead application system. In the volatility chambers, single and reapplied PLT caused greater litter moisture and lower litter pH, relative to no PLT. After 14 d, NH3 released from litter treated with reapplied PLT was significantly less than from litter with either a single application or no application. Furthermore, total N was greatest in litter treated with reapplied PLT, increasing its fertilizer value. In the commercial poultry houses, PLT reapplication led to a temporary decrease in litter pH, but this effect did not last because of continued bird excretion. Although a single preflock PLT application is currently used as a successful strategy to control NH3 during early flock growth, repeat PLT application using the overhead reapplication system was not successful because of problems with the reapplication system and litter moisture concerns.

7.
J Environ Qual ; 44(2): 605-13, 2015 Mar.
Article in English | MEDLINE | ID: mdl-26023979

ABSTRACT

Denitrifying bioreactors (DNBRs) are an emerging technology used to remove nitrate-nitrogen (NO3-N) from enriched waters by supporting denitrifying microorganisms with organic carbon in an anaerobic environment. Field-scale investigations have established successful removal of NO3-N from agricultural drainage, but the potential for DNBRs to remediate excess phosphorus (P) exported from agricultural systems has not been addressed. We hypothesized that biochar addition to traditional woodchip DNBRs would enhance NO3-N and P removal and reduce nitrous oxide (N2O) emissions, based on previous research demonstrating reduced leaching of NO3-N and P and lower greenhouse gas production associated with biochar amendment of agricultural soils. Nine laboratory-scale DNBRs (a woodchip control and eight different woodchip-biochar treatments) were used to test the effect of biochar on nutrient removal. The biochar treatments constituted a full factorial design of three factors (biochar source material [feedstock], particle size, and application rate), each with two levels. Statistical analysis by repeated measures ANOVA showed a significant effect of biochar, time, and their interaction on NO3-N and dissolved P removal. Average P removal of 65% was observed in the biochar treatments by 18 h, after which concentrations remained stable, compared with an 8% increase in the control after 72 h. Biochar addition resulted in average NO3-N removal of 86% after 18 h and 97% after 72 h, compared with only 13% at 18 h and 75% at 72 h in the control. Biochar addition also resulted in significantly lower N2O production. These results suggest that biochar can reduce the design residence time by enhancing nutrient removal rates.
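A minimal sketch of the 2 × 2 × 2 factorial layout and the percent-removal summary used above; the factor level names and concentration values are hypothetical placeholders (chosen so the example reproduces the reported 86% figure):

    # Build the eight-treatment factorial grid and compute percent removal.
    from itertools import product

    factors = {
        "feedstock": ["hardwood", "softwood"],      # assumed level names
        "particle_size": ["fine", "coarse"],
        "application_rate": ["low", "high"],
    }
    treatments = list(product(*factors.values()))   # 8 biochar treatments
    print(len(treatments), "biochar treatments plus a woodchip-only control")

    def percent_removal(c0, ct):
        """Percent of the initial concentration removed by time t."""
        return 100 * (c0 - ct) / c0

    # e.g., nitrate falling from 20 to 2.8 mg/L by 18 h -> 86% removal
    print(f"{percent_removal(20.0, 2.8):.0f}% NO3-N removal")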

8.
J Environ Qual ; 44(2): 524-34, 2015 Mar.
Article in English | MEDLINE | ID: mdl-26023971

ABSTRACT

Leaching of phosphorus (P) mobilizes edaphic and applied sources of P and is a primary pathway of concern in agricultural soils of the Delmarva Peninsula, which defines the eastern boundary of the eutrophic Chesapeake Bay. We evaluated P leaching before and after poultry litter application from intact soil columns (30 cm diameter × 50 cm depth) obtained from low- and high-P members of four dominant Delmarva Peninsula soils. Surface soil textures ranged from fine sand to silt loam, and Mehlich-3 soil P ranged from 64 to 628 mg kg-1. Irrigation of soil columns before litter application pointed to surface soil P controls on dissolved P in leachate (with soil P sorption saturation providing a stronger relationship than Mehlich-3 P); however, strong relationships between P in the subsoil (45-50 cm) and leachate P concentrations were also observed (R2 = 0.61-0.73). After poultry litter application (4.5 Mg ha-1), leachate P concentrations and loads increased significantly for the finest-textured soils, consistent with observations that well-structured soils have the greatest propensity to transmit applied P. Phosphorus derived from poultry litter appeared to contribute 41 and 76% of total P loss in leachate from the two soils with the finest textures. Results point to soil P, including P sorption saturation, as a sound metric of P loss potential in leachate when manure is not an acute source of P but highlight the need to factor in macropore transport potential to predict leaching losses from applied P sources.
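P sorption saturation, cited above as the stronger predictor, is commonly expressed as a molar ratio of Mehlich-3 P to Mehlich-3 Al plus Fe. A minimal sketch of one common formulation (the Al and Fe concentrations below are hypothetical; the abstract does not report them):

    # P saturation ratio (PSR) from Mehlich-3 concentrations, one common
    # molar-ratio variant; thresholds and exact formulations vary by region.
    M_P, M_AL, M_FE = 30.97, 26.98, 55.85   # molar masses, g/mol

    def psr(p_mgkg, al_mgkg, fe_mgkg):
        """P saturation ratio: molar Mehlich-3 P / (Al + Fe)."""
        return (p_mgkg / M_P) / (al_mgkg / M_AL + fe_mgkg / M_FE)

    # e.g., a high-P member soil at 628 mg P/kg (hypothetical Al and Fe)
    print(f"PSR = {psr(628, 900, 250):.2f}")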

9.
J Environ Qual ; 44(2): 560-71, 2015 Mar.
Article in English | MEDLINE | ID: mdl-26023975

ABSTRACT

Leaching of nutrients through agricultural soils is a priority water quality concern on the Atlantic Coastal Plain. This study evaluated the effect of tillage and urea application on leaching of phosphorus (P) and nitrogen (N) from soils of the Delmarva Peninsula that had previously been under no-till management. Intact soil columns (30 cm wide × 50 cm deep) were irrigated for 6 wk to establish a baseline of leaching response. After 2 wk of drying, a subset of soil columns was subjected to simulated tillage (0-20 cm) in an attempt to curtail leaching of surface nutrients, especially P. Urea (145 kg N ha-1) was then broadcast on all soils (tilled and untilled), and the columns were irrigated for another 8 wk. Comparison of leachate recoveries representing rapid and slow flows confirmed the potential to manipulate flow fractions with tillage, albeit with mixed results across soils. Leachate trends in the finer-textured soil suggest that tillage impeded macropore flow and forced greater matrix flow. Despite significant vertical stratification of soil P that suggested tillage could prevent leaching of P via macropores from the surface to the subsoil, tillage had no significant impact on P leaching losses. Relatively high levels of soil P below 20 cm may have served as the source of P enrichment in leachate waters. However, tillage did lower losses of applied urea in leachate from two of the three soils, partially confirming the study's premise that tillage would destroy macropore pathways transmitting surface constituents to the subsoil.

10.
Am J Emerg Med ; 32(8): 890-4, 2014 Aug.
Article in English | MEDLINE | ID: mdl-24929771

ABSTRACT

BACKGROUND: Falls are a major cause of morbidity in the elderly. OBJECTIVES: We describe the low-acuity elderly fall population and study which historical and clinical features predict traumatic intracranial injuries (ICIs). METHODS: This is a prospective observational study of patients at least 65 years old presenting with fall to a tertiary care facility. Patients were eligible if they were at baseline mental status and were not triaged to the trauma bay. At presentation, a data form was completed by treating physicians regarding mechanism and position of fall, history of head strike, headache, loss of consciousness (LOC), and signs of head trauma. Radiographic imaging was obtained at the discretion of treating physicians. Medical records were subsequently reviewed to determine imaging results. All patients were called in follow-up at 30 days to determine outcome in those not imaged. The study was institutional review board approved. RESULTS: A total of 799 patients were enrolled; 79.5% of patients underwent imaging. Twenty-seven had ICIs (3.4%). Fourteen had subdural hematoma, 7 had subarachnoid hemorrhage, 3 had cerebral contusion, and 3 had a combination of injuries. Logistic regression demonstrated 2 study variables that were associated with ICIs: LOC (odds ratio, 2.8; confidence interval, 1.2-6.3) and signs of head trauma (odds ratio, 13.2; confidence interval, 2.7-64.1). History of head strike, mechanism and position, headache, and anticoagulant and antiplatelet use were not associated with ICIs. CONCLUSION: Elderly fall patients who are at their baseline mental status have a low incidence of ICIs. The best predictors of ICIs are physical findings of trauma to the head and history of LOC.
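The odds ratios above come from a logistic regression of ICI on the recorded features. A minimal sketch of that kind of model (not the authors' analysis): the dataset is synthetic, with coefficients chosen only to roughly echo the reported ORs of 2.8 and 13.2.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 800
    loc = rng.integers(0, 2, n)        # history of loss of consciousness
    trauma = rng.integers(0, 2, n)     # visible signs of head trauma
    # Simulate ICI with effects near the reported odds ratios.
    logit_p = -4.0 + np.log(2.8) * loc + np.log(13.2) * trauma
    ici = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

    X = sm.add_constant(np.column_stack([loc, trauma]).astype(float))
    fit = sm.Logit(ici, X).fit(disp=0)
    print(np.exp(fit.params[1:]))      # fitted odds ratios for LOC, trauma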


Subject(s)
Accidental Falls/statistics & numerical data , Brain Injuries/etiology , Aged , Aged, 80 and over , Brain Injuries/diagnostic imaging , Brain Injuries/epidemiology , Hematoma, Subdural/diagnostic imaging , Hematoma, Subdural/epidemiology , Hematoma, Subdural/etiology , Humans , Male , Neuroimaging , Prospective Studies , Risk Factors , Subarachnoid Hemorrhage/diagnostic imaging , Subarachnoid Hemorrhage/epidemiology , Subarachnoid Hemorrhage/etiology , Tomography, X-Ray Computed , Trauma Centers/statistics & numerical data , Unconsciousness/diagnostic imaging , Unconsciousness/epidemiology , Unconsciousness/etiology
11.
J Environ Qual ; 53(3): 352-364, 2024.
Article in English | MEDLINE | ID: mdl-38469617

ABSTRACT

Historical applications of manures and fertilizers at rates exceeding crop P removal in the Mid-Atlantic region (United States) have driven decades of water quality degradation from P losses in agricultural runoff. As such, many growers in this region face restrictions on future P applications. An improved understanding of the fate, transformations, and availability of P is needed to manage P-enriched soils. We paired chemical extractions (i.e., Mehlich-3, water-extractable P, and chemical fractionation) with nondestructive methods (i.e., x-ray absorption near edge structure [XANES] spectroscopy and x-ray fluorescence [XRF]) to investigate P dynamics in eight P-enriched Mid-Atlantic soils with various management histories. Chemical fractionation and XRF data were used to support XANES linear combination fits, allowing for identification of various Al, Ca, and Fe phosphates and P sorbed phases in soils amended with fertilizer, poultry litter, or dairy manure. Management history and P speciation were used to make qualitative comparisons between the eight legacy P soils; we also speculate about how P speciation may affect future management of these soils with and without additional P applications. With continued P applications, we expect an increase in semicrystalline Al- and Fe-phosphates, P sorbed to Al (hydro)oxides, and insoluble Ca-P species in these soils for all P sources. Under drawdown scenarios, we expect plant P uptake first from semicrystalline Al and Fe phosphates, followed by P sorbed phases. Our results can help guide management decisions on coastal plain soils with a history of P application.
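XANES linear combination fitting models a sample spectrum as a non-negative weighted sum of reference spectra. A minimal sketch of the idea using synthetic stand-in spectra (the peak positions and the three-standard setup are illustrative assumptions, not the study's standards library):

    import numpy as np
    from scipy.optimize import nnls

    energy = np.linspace(2140, 2200, 200)           # eV, P K-edge region
    def peak(center, width):                        # toy "spectrum"
        return np.exp(-((energy - center) / width) ** 2)

    # Three hypothetical standards (e.g., Al-, Ca-, Fe-phosphate references)
    standards = np.column_stack([peak(2152, 4), peak(2155, 6), peak(2149, 3)])
    true_w = np.array([0.5, 0.3, 0.2])
    sample = standards @ true_w                     # synthetic sample spectrum

    weights, resid = nnls(standards, sample)        # constrained fit, w >= 0
    print(weights / weights.sum())                  # species fractions, ~[0.5 0.3 0.2]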


Subject(s)
Fertilizers , Manure , Phosphorus , Soil , Fertilizers/analysis , Manure/analysis , Phosphorus/analysis , Soil/chemistry , Environmental Monitoring , Soil Pollutants/analysis , Agriculture/methods , Mid-Atlantic Region
12.
Sci Data ; 11(1): 200, 2024 Feb 13.
Article in English | MEDLINE | ID: mdl-38351049

ABSTRACT

Winter cover crop performance metrics (i.e., vegetative biomass quantity and quality) affect the provision of ecosystem services, but they vary widely due to differences in agronomic practices, soil properties, and climate. Cereal rye (Secale cereale) is the most common winter cover crop in the United States due to its winter hardiness, low seed cost, and high biomass production. We compiled data on cereal rye winter cover crop performance metrics, agronomic practices, and soil properties across the eastern half of the United States. The dataset includes a total of 5,695 cereal rye biomass observations across 208 site-years between 2001 and 2022 and encompasses a wide range of agronomic, soil, and climate conditions. Cereal rye biomass values had a mean of 3,428 kg ha-1, a median of 2,458 kg ha-1, and a standard deviation of 3,163 kg ha-1. The data can be used for empirical analyses; to calibrate, validate, and evaluate process-based models; and to develop decision support tools for management and policy decisions.
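A minimal sketch of computing the summary statistics reported above. The real dataset has 5,695 observations; the short series below is a synthetic stand-in with the same structure (biomass in kg/ha), and the column name is an assumption, not the published schema:

    import pandas as pd

    biomass = pd.Series([1200, 2458, 3100, 4800, 9500], name="biomass_kg_ha")
    summary = biomass.agg(["count", "mean", "median", "std"])
    print(summary)   # the paper reports mean 3,428, median 2,458, sd 3,163 kg/ha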


Subject(s)
Edible Grain , Secale , Agriculture , Ecosystem , Edible Grain/growth & development , Seasons , Secale/growth & development , Soil , United States
13.
J Environ Qual ; 42(6): 1829-37, 2013 Nov.
Article in English | MEDLINE | ID: mdl-25602423

ABSTRACT

Continuous application of poultry litter (PL) significantly changes many soil properties, including soil test P (STP); Al, Fe, and Ca concentrations; and pH, which can affect the potential for P transport in surface runoff water. We conducted rainfall simulations on three historically acidic silt loam soils in Arkansas, Missouri, and Virginia to establish whether long-term PL applications would affect soil inorganic P fractions and the resulting dissolved reactive P (DRP) in runoff water. Soil samples (0-5 cm depth) were taken from sites ranging in Mehlich-3 STP from 20 to 1154 mg P kg-1. Simulated rainfall events were conducted on 3-m plots at 6.7 cm h-1, and runoff was collected for 30 min. Correlation between Mehlich-3 STP and runoff DRP indicated a linear relationship up to 833 mg Mehlich-3 P kg-1. As Mehlich-3 STP increased, a concomitant increase in soil pH and Ca occurred in all soils. Soil P fractionation demonstrated that, as Mehlich-3 STP increased above 450 mg P kg-1 (from high to very high), the easily soluble and loosely bound P fractions decreased by 3 to 10%. Water-insoluble complexes of P bound to Al and Ca were the main drivers in the reduction of DRP in runoff, accounting for up to 43 and 38% of total P, respectively. Basing runoff DRP concentration projections solely on Mehlich-3 STP may overestimate runoff P losses from soils receiving long-term PL applications, because the extraction dissolves water-insoluble Ca-P compounds that contribute little to runoff P.
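A minimal sketch of fitting the linear range identified above, assuming hypothetical STP/DRP pairs (the study's plot-level data are not given in the abstract):

    import numpy as np

    m3_stp = np.array([20, 150, 300, 450, 600, 833, 1000, 1154], dtype=float)
    drp = np.array([0.1, 0.6, 1.2, 1.7, 2.4, 3.2, 3.4, 3.5])   # mg/L, hypothetical

    linear = m3_stp <= 833                     # restrict to the linear range
    slope, intercept = np.polyfit(m3_stp[linear], drp[linear], 1)
    print(f"DRP ~ {slope:.4f} * STP + {intercept:.2f} (valid to 833 mg/kg)")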

14.
Emerg Med J ; 30(1): e12, 2013 Jan.
Article in English | MEDLINE | ID: mdl-22411596

ABSTRACT

OBJECTIVE: To evaluate the productivity of mid-level providers (MLPs) compared with emergency medicine (EM) resident physicians in an emergency department (ED) low-acuity area, and to compare patient satisfaction among patients cared for by MLPs versus EM residents. METHODS: This was a retrospective review of EM resident physicians and MLPs in an ED low-acuity area. The number of patients seen and relative value units (RVUs) generated per clinical hour worked were evaluated. A t test was used to compare resident and MLP productivity. Additionally, patients were prospectively surveyed to assess satisfaction, using survey items based on the Press-Ganey survey. Non-parametric statistics were used to analyse patient satisfaction scores. RESULTS: MLPs treated 2.21 patients per hour (CI ±0.09), while resident physicians treated 1.53 patients per hour (CI ±0.08). MLPs generated 4.01 RVUs per hour (CI ±0.18) while resident physicians generated 3.14 RVUs per hour (CI ±0.18). Resident physicians generated 2.07 RVUs per patient (CI ±0.08) while MLPs generated 1.82 RVUs per patient (CI ±0.03; p<0.001). Of the 201 completed satisfaction surveys, 126 patients were seen by MLPs and 75 were seen by residents. Overall, patients were highly satisfied with their ED visit. There were no differences in any survey responses based on provider type or resident level of training. CONCLUSION: In a low-acuity area of the ED, MLPs treated more patients per hour and generated more RVUs per hour than EM resident physicians. However, resident physicians generated more RVUs per patient. Patient satisfaction did not differ.
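A minimal sketch (not the authors' analysis) of the two-sample t test used for the productivity comparison; the shift-level values below are synthetic, centered on the reported means:

    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(1)
    mlp_pph = rng.normal(2.21, 0.3, 60)       # MLP patients/hour by shift (synthetic)
    res_pph = rng.normal(1.53, 0.3, 60)       # resident patients/hour by shift

    t, p = ttest_ind(mlp_pph, res_pph)
    print(f"t = {t:.2f}, p = {p:.2g}")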


Subject(s)
Efficiency, Organizational/standards , Emergency Service, Hospital/organization & administration , Internship and Residency , Nurse Practitioners , Physician Assistants , Humans , Patient Satisfaction , Retrospective Studies
15.
PLoS One ; 18(4): e0284529, 2023.
Article in English | MEDLINE | ID: mdl-37079528

ABSTRACT

Efficient termination of cover crops is an important component of cover crop management. Information on termination efficiency can help in devising management plans, but estimating herbicide efficacy is a tedious task, and the potential of remote sensing technologies and vegetative indices (VIs) for this purpose has not been explored. This study was designed to evaluate potential herbicide options for the termination of wheat (Triticum aestivum L.), cereal rye (Secale cereale L.), hairy vetch (Vicia villosa Roth.), and rapeseed (Brassica napus L.), and to correlate different VIs with visible termination efficiency. Nine herbicides and one roller-crimping treatment were applied to each cover crop. Among the herbicides used, glyphosate, glyphosate + glufosinate, paraquat, and paraquat + metribuzin provided more than 95% termination of both wheat and cereal rye 28 days after treatment (DAT). For hairy vetch, 2,4-D + glufosinate and glyphosate + glufosinate resulted in 99 and 98% termination efficiency, respectively, followed by 2,4-D + glyphosate and paraquat with 92% termination efficiency 28 DAT. No herbicide provided more than 90% termination of rapeseed; the highest control was provided by paraquat (86%), 2,4-D + glufosinate (85%), and 2,4-D + glyphosate (85%). Roller-crimping (without herbicide application) did not provide effective termination of any cover crop, with 41, 61, 49, and 43% termination for wheat, cereal rye, hairy vetch, and rapeseed, respectively. Among the VIs, the Green Leaf Index had the strongest Pearson correlation with the visible termination efficiency rating for wheat (r = -0.786, p < 0.0001) and cereal rye (r = -0.804, p < 0.0001), whereas for rapeseed the Normalized Difference Vegetation Index (NDVI) had the strongest correlation (r = -0.655, p < 0.0001). The study highlighted the need to tank-mix 2,4-D or glufosinate with glyphosate for termination, rather than a blanket application of glyphosate alone across all crops, particularly for rapeseed and other broadleaf cover crops.
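A minimal sketch of computing the two indices named above and their Pearson correlations with visible termination ratings; the band reflectances and ratings are hypothetical plot-level values, not the study's data:

    import numpy as np
    from scipy.stats import pearsonr

    # Per-plot mean band reflectances (hypothetical)
    red   = np.array([0.10, 0.15, 0.20, 0.26, 0.30])
    green = np.array([0.30, 0.25, 0.21, 0.18, 0.15])
    blue  = np.array([0.08, 0.10, 0.12, 0.14, 0.15])
    nir   = np.array([0.60, 0.50, 0.42, 0.35, 0.30])
    termination = np.array([10, 35, 60, 85, 98])    # % visible termination

    gli  = (2 * green - red - blue) / (2 * green + red + blue)  # Green Leaf Index
    ndvi = (nir - red) / (nir + red)                            # NDVI
    print("GLI  r = %.3f" % pearsonr(gli, termination)[0])   # negative, as reported
    print("NDVI r = %.3f" % pearsonr(ndvi, termination)[0])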


Subject(s)
Herbicides , Vicia , Agriculture/methods , Remote Sensing Technology , Paraquat , Herbicides/analysis , Crops, Agricultural , Edible Grain/chemistry , 2,4-Dichlorophenoxyacetic Acid
16.
J Emerg Med ; 43(5): 803-10, 2012 Nov.
Article in English | MEDLINE | ID: mdl-21269792

ABSTRACT

BACKGROUND: In 2006, nearly a quarter of a million patients either arrived dead or died in the Emergency Department (ED). The role of palliative care (PC) in the ED is not well defined, and education of medical students and residents in the area is sparse. OBJECTIVES: We use an illustrative case to discuss important concepts in PC for the emergency physician (EP). The reader should be able to define hospice and PC, recognize its importance in the practice of Emergency Medicine, and understand the benefits PC has for the patient, the patient's family and caregivers, and the health care system as a whole. DISCUSSION: PC excels at treating pain and addressing end-of-life issues. Families and caregivers of patients benefit from PC in terms of improved personal quality of life after the patient's death. PC is more cost-effective than traditional medical care. CONCLUSION: Research on PC in the ED is sparse, but the need is growing, and the EP will need to become proficient in the delivery of PC in the ED.


Subject(s)
Emergency Medicine/methods , Palliative Care , Patient Care Planning , Advance Directives , Caregivers/psychology , Cost of Illness , Emergency Medicine/standards , Emergency Service, Hospital/standards , Health Care Costs , Humans , Needs Assessment , Palliative Care/economics
17.
Acad Emerg Med ; 25(6): 650-656, 2018 06.
Article in English | MEDLINE | ID: mdl-29427301

ABSTRACT

OBJECTIVES: The objective was to prospectively validate and refine previously published criteria to determine the potential utility of chest x-ray (CXR) in the evaluation and management of patients presenting to the emergency department (ED) with nontraumatic chest pain (CP). METHODS: A prospective observational study was conducted of patients presenting to three EDs in the United States with a chief complaint of nontraumatic CP. Previously defined high-risk history and examination elements were combined into a refined decision rule, and these elements were recorded for each patient by the ED physician. CXR results were reviewed and analyzed to determine the presence of clinically significant findings, including pneumonia, pleural effusion, pneumothorax, congestive heart failure, or a new mass. Odds ratios were analyzed for each history and examination element, as well as the sensitivity, specificity, and negative predictive value (NPV) of the rule overall. RESULTS: A total of 1,111 patients were enrolled and 1,089 CXRs were analyzed. There were 70 (6.4%) patients with clinically relevant findings on CXR. The refined decision rule had a sensitivity of 92.9% (confidence interval [CI] = 83.4%-97.3%) and specificity of 30.4% (CI = 27.6%-33.4%) for predicting clinically relevant findings on CXR, with a NPV of 98.4% (CI = 96.1%-99.4%). Five CXRs with clinically significant findings would have been missed by application of the refined rule (three pneumonias and two pleural effusions). Applying these criteria as a CXR decision rule to this population would have reduced CXR utilization by 28.9%. CONCLUSIONS: This study validates previous research suggesting a low clinical yield for CXR in the setting of nontraumatic CP in the ED. This refined clinical decision rule has a favorable sensitivity and NPV in a patient population with a low incidence of disease. Further validation is needed prior to use in practice.


Subject(s)
Chest Pain/diagnostic imaging , Decision Support Techniques , Radiography/statistics & numerical data , Adult , Aged , Australia , Chest Pain/etiology , Emergency Service, Hospital/organization & administration , Female , Humans , Male , Middle Aged , Prospective Studies , Sensitivity and Specificity
18.
Front Microbiol ; 9: 2451, 2018.
Article in English | MEDLINE | ID: mdl-30386314

ABSTRACT

Between 2000 and 2010, the Eastern Shore of Virginia was implicated in four Salmonella outbreaks associated with tomato. Therefore, a multi-year study (2012-2015) was performed to investigate presumptive factors associated with Salmonella contamination within tomato fields at Virginia Tech's Eastern Shore Agricultural Research and Extension Center. Factors including irrigation water source (pond and well); type of soil amendment: fresh poultry litter (PL), PL ash, and a conventional fertilizer (triple superphosphate, TSP); and production practice: staked with plastic mulch (SP), staked without plastic mulch (SW), and non-staked without plastic mulch (NW), were evaluated by split-plot or complete-block design. All field experiments relied on naturally occurring Salmonella contamination, except one follow-up experiment (worst-case scenario) which examined the potential for contamination of tomato fruits when Salmonella was applied through drip irrigation. Samples were collected from pond and well water; PL, PL ash, and TSP; and the rhizosphere, leaves, and fruits of tomato plants. Salmonella was quantified using a most probable number (MPN) method, and contamination ratios were calculated for each treatment. Salmonella serovar was determined by molecular serotyping. Salmonella populations varied significantly by year; however, similar trends were evident each year. Findings showed use of untreated pond water and raw PL amendment increased the likelihood of Salmonella detection in tomato plots. Salmonella Newport and Typhimurium were the most frequently detected serovars in pond water and PL amendment samples, respectively. Interestingly, while these factors increased the likelihood of Salmonella detection in tomato plots (rhizosphere and leaves), all tomato fruits sampled (n = 4800) from these plots were Salmonella negative. Contamination of tomato fruits was extremely low (<1%) even when tomato plots were artificially inoculated with an attenuated Salmonella Newport strain (10^4 CFU/mL). Furthermore, Salmonella was not detected in tomato plots irrigated using well water and amended with PL ash or TSP. Production practices also influenced the likelihood of Salmonella detection in tomato plots: Salmonella detection was higher in tomato leaf samples from NW plots than from SP and SW plots. This study provides evidence that attention to agricultural inputs and production practices may help reduce the likelihood of Salmonella contamination in tomato fields.
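MPN enumeration infers a concentration from the pattern of positive and negative tubes across dilutions. A minimal sketch using Thomas' simple approximation (an alternative to the standard lookup tables, and not necessarily the method the authors used; the tube counts are hypothetical):

    # Thomas' approximation: MPN/100 mL = 100 * P / sqrt(N * T), where
    # P = positive tubes, N = mL of sample in negative tubes,
    # T = mL of sample in all tubes.
    from math import sqrt

    def mpn_thomas(positives, ml_in_negative_tubes, ml_in_all_tubes):
        return 100 * positives / sqrt(ml_in_negative_tubes * ml_in_all_tubes)

    # e.g., 3 dilutions (10, 1, 0.1 mL) x 3 tubes each, positives = 3/1/0
    p = 3 + 1 + 0
    n_neg = 0 * 10 + 2 * 1 + 3 * 0.1            # sample volume in negative tubes
    n_all = 3 * 10 + 3 * 1 + 3 * 0.1            # sample volume in all tubes
    print(f"~{mpn_thomas(p, n_neg, n_all):.0f} MPN/100 mL")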

19.
J Food Prot ; 77(2): 320-4, 2014 Feb.
Article in English | MEDLINE | ID: mdl-24490928

ABSTRACT

Over the past decade, the Eastern Shore of Virginia (ESV) has been implicated in at least four outbreaks of salmonellosis associated with tomato, all originating from the same serovar, Salmonella enterica serovar Newport. In addition to Salmonella Newport contamination, the devastating plant disease bacterial wilt, caused by the phytopathogen Ralstonia solanacearum, threatens the sustainability of ESV tomato production. Bacterial wilt is present in most ESV tomato fields and causes devastating yield losses each year. Although the connection between bacterial wilt and tomato-related salmonellosis outbreaks in ESV is of interest, the relationship between the two pathogens has never been investigated. In this study, tomato plants were root-dip inoculated with one of four treatments: (i) 8 log CFU of Salmonella Newport per ml, (ii) 5 log CFU of R. solanacearum per ml, (iii) a coinoculation of 8 log CFU of Salmonella Newport per ml plus 5 log CFU of R. solanacearum per ml, and (iv) sterile water as control. Leaf, stem, and fruit samples were collected at the early-green-fruit stage, and S. enterica contamination in the internal tissues was detected. S. enterica was recovered from 1.4 and 2.9% of leaf samples from plants inoculated with Salmonella Newport only and from plants coinoculated with Salmonella Newport plus R. solanacearum, respectively. S. enterica was recovered from 1.7 and 3.5% of fruit samples from plants inoculated with Salmonella Newport only and from plants coinoculated with Salmonella Newport plus R. solanacearum, respectively. There were significantly more stem samples positive for S. enterica from plants coinoculated with Salmonella Newport plus R. solanacearum (18.6%) than from plants inoculated with Salmonella Newport only (5.7%). Results suggested that R. solanacearum could influence S. enterica survival and transport throughout the internal tissues of tomato plants.
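A minimal sketch of testing the stem-positivity difference reported above (18.6% coinoculated vs. 5.7% Salmonella-only). Per-group sample sizes are not given in the abstract; n = 70 per group is an assumption chosen so the counts (13/70 and 4/70) reproduce the reported percentages:

    from scipy.stats import fisher_exact

    table = [[13, 70 - 13],      # coinoculated: positive, negative (hypothetical n)
             [4, 70 - 4]]        # Salmonella Newport only
    odds_ratio, p_value = fisher_exact(table)
    print(f"OR = {odds_ratio:.1f}, p = {p_value:.3f}")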


Subject(s)
Antibiosis , Food Contamination/analysis , Ralstonia solanacearum/physiology , Salmonella enterica/growth & development , Solanum lycopersicum/microbiology , Fruit/microbiology , Plant Leaves/microbiology , Plant Roots/microbiology , Salmonella enterica/physiology
20.
West J Emerg Med ; 14(6): 598-601, 2013 Nov.
Article in English | MEDLINE | ID: mdl-24381679

ABSTRACT

INTRODUCTION: Mid-level providers (MLPs) are used extensively in staffing emergency departments (EDs). We sought to compare the productivity of MLPs staffing a low-acuity and a high-acuity area of a community ED. METHODS: This is a retrospective review of MLP productivity at a single community ED with an annual volume of 42,000 patients, from July 2009 to September 2010. MLPs staffed day shifts (8 AM-6 PM or 10 AM-10 PM) in high- and low-acuity sections of the ED. We used a two-tailed t-test to compare patients/hour, relative value units (RVUs)/hour, and RVUs/patient between the 2 MLP groups. RESULTS: We included 49 low-acuity and 55 high-acuity shifts in this study. During the study period, MLPs staffing low-acuity shifts treated a mean of 2.7 patients/hour (confidence interval [CI] +/- 0.23), while those staffing high-acuity shifts treated a mean of 1.56 patients/hour (CI +/- 0.14, p<0.0001). MLPs staffing low-acuity shifts generated a mean of 4.45 RVUs/hour (CI +/- 0.34) compared to 3.19 RVUs/hour (CI +/- 0.29) for those staffing high-acuity shifts (p<0.0001). MLPs staffing low-acuity shifts generated a mean of 1.68 RVUs/patient (CI +/- 0.06) while those staffing high-acuity shifts generated a mean of 2.05 RVUs/patient (CI +/- 0.09, p<0.0001). CONCLUSION: MLPs staffing a low-acuity area treated more patients/hour and generated more RVUs/hour than when staffing a high-acuity area.
