1.
Circulation ; 146(12): 892-906, 2022 09 20.
Article in English | MEDLINE | ID: mdl-36121907

ABSTRACT

BACKGROUND: Infection with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) induces a prothrombotic state, but long-term effects of COVID-19 on incidence of vascular diseases are unclear. METHODS: We studied vascular diseases after COVID-19 diagnosis in population-wide anonymized linked English and Welsh electronic health records from January 1 to December 7, 2020. We estimated adjusted hazard ratios comparing the incidence of arterial thromboses and venous thromboembolic events (VTEs) after diagnosis of COVID-19 with the incidence in people without a COVID-19 diagnosis. We conducted subgroup analyses by COVID-19 severity, demographic characteristics, and previous history. RESULTS: Among 48 million adults, 125 985 were hospitalized and 1 319 789 were not hospitalized within 28 days of COVID-19 diagnosis. In England, there were 260 279 first arterial thromboses and 59 421 first VTEs during 41.6 million person-years of follow-up. Adjusted hazard ratios for first arterial thrombosis after COVID-19 diagnosis compared with no COVID-19 diagnosis declined from 21.7 (95% CI, 21.0-22.4) in week 1 after COVID-19 diagnosis to 1.34 (95% CI, 1.21-1.48) during weeks 27 to 49. Adjusted hazard ratios for first VTE after COVID-19 diagnosis declined from 33.2 (95% CI, 31.3-35.2) in week 1 to 1.80 (95% CI, 1.50-2.17) during weeks 27 to 49. Adjusted hazard ratios were higher, for longer after diagnosis, after hospitalized versus nonhospitalized COVID-19, among Black or Asian versus White people, and among people without versus with a previous event. The estimated whole-population increases in risk of arterial thromboses and VTEs 49 weeks after COVID-19 diagnosis were 0.5% and 0.25%, respectively, corresponding to 7200 and 3500 additional events, respectively, after 1.4 million COVID-19 diagnoses. CONCLUSIONS: High relative incidence of vascular events soon after COVID-19 diagnosis declines more rapidly for arterial thromboses than VTEs. However, incidence remains elevated up to 49 weeks after COVID-19 diagnosis. These results support policies to prevent severe COVID-19 by means of COVID-19 vaccines, early review after discharge, risk factor control, and use of secondary preventive agents in high-risk patients.
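The whole-population excess-event figures follow from simple arithmetic on the absolute risk increases. A minimal Python sketch (the paper derives these from stratified absolute-risk differences, so its rounded figures differ slightly from this back-of-envelope version):

```python
# Back-of-envelope check of the whole-population excess-event figures.
n_diagnoses = 1_400_000        # COVID-19 diagnoses in the study window
excess_risk_arterial = 0.005   # 0.5% absolute risk increase at 49 weeks
excess_risk_vte = 0.0025       # 0.25% absolute risk increase at 49 weeks

print(f"Arterial: ~{n_diagnoses * excess_risk_arterial:,.0f} excess events")  # ~7,000
print(f"VTE:      ~{n_diagnoses * excess_risk_vte:,.0f} excess events")       # ~3,500
```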


Subject(s)
COVID-19 , Thrombosis , Vascular Diseases , Venous Thromboembolism , Venous Thrombosis , Adult , COVID-19/complications , COVID-19/epidemiology , COVID-19 Vaccines , Cohort Studies , Humans , SARS-CoV-2 , Thrombosis/complications , Thrombosis/epidemiology , Vascular Diseases/complications , Venous Thromboembolism/etiology , Venous Thrombosis/epidemiology , Wales/epidemiology
2.
PLoS Med ; 19(2): e1003926, 2022 02.
Article in English | MEDLINE | ID: mdl-35192597

ABSTRACT

BACKGROUND: Thromboses in unusual locations after the Coronavirus Disease 2019 (COVID-19) vaccine ChAdOx1-S have been reported, although their frequency with vaccines of different types is uncertain at a population level. The aim of this study was to estimate the population-level risks of hospitalised thrombocytopenia and major arterial and venous thromboses after COVID-19 vaccination. METHODS AND FINDINGS: In this whole-population cohort study, we analysed linked electronic health records from adults living in England, from 8 December 2020 to 18 March 2021. We estimated incidence rates and hazard ratios (HRs) for major arterial, venous, and thrombocytopenic outcomes 1 to 28 and >28 days after first vaccination dose for ChAdOx1-S and BNT162b2 vaccines. Analyses were performed separately for ages <70 and ≥70 years and adjusted for age, age², sex, ethnicity, and deprivation. We also prespecified adjustment for anticoagulant medication, combined oral contraceptive medication, hormone replacement therapy medication, history of pulmonary embolism or deep vein thrombosis, and history of coronavirus infection in analyses of venous thrombosis; and diabetes, hypertension, smoking, antiplatelet medication, blood pressure lowering medication, lipid lowering medication, anticoagulant medication, history of stroke, and history of myocardial infarction in analyses of arterial thromboses. We selected further covariates with backward selection. Of 46 million adults, 23 million (51%) were women; 39 million (84%) were <70; and 3.7 million (8.1%) were Asian or Asian British, 1.6 million (3.5%) Black or Black British, 36 million (79%) White, 0.7 million (1.5%) of mixed ethnicity, and 1.5 million (3.2%) of another ethnicity. Approximately 21 million (46%) adults had their first vaccination between 8 December 2020 and 18 March 2021. The crude incidence rates (per 100,000 person-years) of all venous events were as follows: prevaccination, 140 [95% confidence interval (CI): 138 to 142]; ≤28 days post-ChAdOx1-S, 294 (281 to 307); >28 days post-ChAdOx1-S, 359 (338 to 382); ≤28 days post-BNT162b2, 241 (229 to 253); >28 days post-BNT162b2, 277 (263 to 291). The crude incidence rates (per 100,000 person-years) of all arterial events were as follows: prevaccination, 546 (95% CI: 541 to 555); ≤28 days post-ChAdOx1-S, 1,211 (1,185 to 1,237); >28 days post-ChAdOx1-S, 1,678 (1,630 to 1,726); ≤28 days post-BNT162b2, 1,242 (1,214 to 1,269); >28 days post-BNT162b2, 1,539 (1,507 to 1,572). Adjusted HRs (aHRs) 1 to 28 days after ChAdOx1-S, compared with unvaccinated rates, at ages <70 and ≥70 years, respectively, were 0.97 (95% CI: 0.90 to 1.05) and 0.58 (0.53 to 0.63) for venous thromboses, and 0.90 (0.86 to 0.95) and 0.76 (0.73 to 0.79) for arterial thromboses. Corresponding aHRs for BNT162b2 were 0.81 (0.74 to 0.88) and 0.57 (0.53 to 0.62) for venous thromboses, and 0.94 (0.90 to 0.99) and 0.72 (0.70 to 0.75) for arterial thromboses. aHRs for thrombotic events were higher at younger ages for venous thromboses after ChAdOx1-S, and for arterial thromboses after both vaccines. Rates of intracranial venous thrombosis (ICVT) and of thrombocytopenia in adults aged <70 years were higher 1 to 28 days after ChAdOx1-S (aHRs 2.27, 95% CI: 1.33 to 3.88 and 1.71, 1.35 to 2.16, respectively), but not after BNT162b2 (0.59, 0.24 to 1.45 and 1.00, 0.75 to 1.34), compared with unvaccinated. The corresponding absolute excess risks of ICVT 1 to 28 days after ChAdOx1-S were 0.9 to 3 per million, varying by age and sex.
The main limitations of the study are as follows: (i) it relies on the accuracy of coded healthcare data to identify exposures, covariates, and outcomes; (ii) it uses the primary reason for hospital admission to measure outcomes, which improves positive predictive value but may lead to an underestimation of incidence; and (iii) potential unmeasured confounding remains. CONCLUSIONS: In this study, we observed increases in rates of ICVT and thrombocytopenia after ChAdOx1-S vaccination in adults aged <70 years that were small compared with the vaccine's effect in reducing COVID-19 morbidity and mortality, although more precise estimates for adults aged <40 years are needed. For people aged ≥70 years, rates of arterial or venous thrombotic events were generally lower after either vaccine compared with unvaccinated people, suggesting that either vaccine is suitable in this age group.
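The crude incidence rates quoted above are event counts divided by person-time, rescaled to 100,000 person-years. A sketch of that standard calculation, with a log-normal Poisson approximation for the 95% CI (illustrative counts only; the study's exact interval method may differ):

```python
import math

def crude_rate_per_100k(events: int, person_years: float):
    # Crude incidence rate per 100,000 person-years with an approximate
    # 95% CI (log-normal approximation for a Poisson event count).
    rate = events / person_years * 100_000
    se_log = 1 / math.sqrt(events)          # SE of log(rate)
    lo = rate * math.exp(-1.96 * se_log)
    hi = rate * math.exp(1.96 * se_log)
    return rate, lo, hi

# Hypothetical counts: 14,000 venous events over 10 million person-years
print(crude_rate_per_100k(14_000, 10_000_000))  # -> (140.0, lower, upper)
```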


Subject(s)
BNT162 Vaccine , COVID-19 Vaccines , ChAdOx1 nCoV-19/adverse effects , Thrombocytopenia/etiology , Vaccination , Adult , Aged , Cohort Studies , England/epidemiology , Female , Humans , Incidence , Male , Middle Aged , SARS-CoV-2/pathogenicity , Thrombocytopenia/epidemiology , Vaccination/adverse effects
3.
Eur Radiol ; 32(1): 602-612, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34117912

ABSTRACT

OBJECTIVES: In breast cancer screening, two readers separately examine each woman's mammograms for signs of cancer. We examined whether preventing the two readers from seeing each other's decisions (blinding) affects behaviour and outcomes. METHODS: This cohort study used data from the CO-OPS breast-screening trial (1,119,191 women from 43 screening centres in England), where all discrepant readings were arbitrated. Multilevel models were fitted using Markov chain Monte Carlo to measure whether reader 2 conformed to the decisions of reader 1 when they were not blinded, and the effect of blinding on overall rates of recall for further tests and cancer detection. Differences in positive predictive value (PPV) were assessed using Pearson's chi-squared test. RESULTS: When reader 1 recalled, the probability of reader 2 also recalling was higher when not blinded than when blinded, suggesting readers may be influenced by the other's decision. Overall, women were less likely to be recalled when reader 2 was blinded (OR 0.923; 95% credible interval 0.864, 0.986), with no clear pattern in cancer detection rate (OR 1.029; 95% credible interval 0.970, 1.089; Bayesian p value 0.832). PPV was 22.1% for blinded versus 20.6% for not blinded (p < 0.001). CONCLUSIONS: Our results suggest that when not blinded, reader 2 is influenced by reader 1's decision to recall (alliterative bias), which would bypass arbitration and negate some of the benefits of double-reading. We found a relationship between blinding the second reader and slightly higher PPV of breast cancer screening, although this analysis may be confounded by other centre characteristics. KEY POINTS: • In Europe, it is recommended that breast screening mammograms are analysed by two readers, but there is little evidence on the effect of 'blinding' the readers so they cannot see each other's decisions. • We found evidence that when the second reader is not blinded, they are more likely to agree with a recall decision from the first reader and less likely to make an independent judgement (alliterative error). This may reduce overall accuracy by bypassing arbitration. • This observational study suggests an association between blinding the second reader and higher positive predictive value of screening, but this may be confounded by centre characteristics.
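The PPV comparison (22.1% vs 20.6%, p < 0.001) is a Pearson chi-squared test on recalled women cross-tabulated by blinding arm and cancer outcome. A sketch with hypothetical counts chosen only to reproduce those proportions, not the study's actual cell sizes:

```python
from scipy.stats import chi2_contingency

# Rows: blinded / not blinded arms; columns: recalled with cancer / without.
table = [[2210, 7790],   # blinded:     PPV = 2210/10000 = 22.1%
         [2060, 7940]]   # not blinded: PPV = 2060/10000 = 20.6%

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```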


Subject(s)
Breast Neoplasms , Early Detection of Cancer , Bayes Theorem , Breast Neoplasms/diagnostic imaging , Cohort Studies , Female , Humans , Mammography , Mass Screening , Observer Variation
4.
J Environ Qual ; 44(3): 953-62, 2015 May.
Article in English | MEDLINE | ID: mdl-26024275

ABSTRACT

Shallow narrow drainfields are assumed to provide better wastewater renovation than conventional drainfields and are used for protection of surface and ground water. To test this assumption, we evaluated the water quality functions of two advanced onsite wastewater treatment system (OWTS) drainfields, shallow narrow (SND) and Geomat (GEO), and a conventional pipe and stone (P&S) drainfield over 12 mo using replicated (n = 3) intact soil mesocosms. The SND and GEO mesocosms received effluent from a single-pass sand filter, whereas the P&S received septic tank effluent. Between 97.1 and 100% of 5-d biochemical oxygen demand (BOD), fecal coliform bacteria, and total phosphorus (P) were removed in all drainfield types. Total nitrogen (N) removal averaged 12.0% for P&S, 4.8% for SND, and 5.4% for GEO. A mass balance analysis accounted for 95.1% (SND), 94.1% (GEO), and 87.6% (P&S) of N inputs. When the whole treatment train (excluding the septic tank) is considered, advanced systems, including sand filter pretreatment and SND or GEO soil-based treatment, removed 99.8 to 99.9% of BOD, 100% of fecal coliform bacteria and P, and 26.0 to 27.0% of N. In contrast, the conventional system removed 99.4% of BOD and 100% of fecal coliform bacteria and P, but only 12.0% of N. All drainfield types performed similarly for most water quality functions despite differences in placement within the soil profile. However, inclusion of the pretreatment step in advanced system treatment trains results in better N removal than in conventional treatment systems, despite higher drainfield N removal rates in the latter.
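The removal percentages are mass-balance quantities: influent mass minus effluent mass, over influent mass. A minimal sketch with assumed masses (the study's full balance also tracks gaseous and storage terms not shown here):

```python
def percent_removal(mass_in_g: float, mass_out_g: float) -> float:
    # Percent of influent mass removed across the mesocosm.
    return (mass_in_g - mass_out_g) / mass_in_g * 100

# Hypothetical N masses chosen to match the P&S drainfield figure (12.0%)
print(f"{percent_removal(100.0, 88.0):.1f}% total N removal")
```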

5.
Water Res ; 259: 121750, 2024 Aug 01.
Article in English | MEDLINE | ID: mdl-38851115

ABSTRACT

Phosphorus (P) discharge from agricultural and urban drainage is known for causing downstream eutrophication worldwide. Agricultural best management practices designed to reduce P load out of farms target different P species from various sources, such as fertilizer leaching and erosion of farm soil and canal sediment; however, few studies have assessed the impact of floating aquatic vegetation (FAV) on canal sediment and farm drainage water quality. This study evaluated the impact of FAVs on canal sediment properties and P water quality in drainage canals in the Everglades Agricultural Area in south Florida, USA. Non-parametric statistical methods, correlation analysis, trend analysis, and principal component analysis (PCA) were used to determine the relationship of FAV coverage with sediment properties and P water quality parameters. Results showed that FAV coverage was correlated with the highly recalcitrant and most stable form of P in the sediment layer (residual P pool). FAV coverage also correlated with dissolved organic P (DOP), which was the smallest P pool (7%) of total P concentration in drainage water; accordingly, FAV coverage had no correlation with farm P load. The trend analysis showed no trend in farm P loads, despite a decline in FAV coverage at farm canals over an 8-year period. Phosphorus content in the sediment surface layer was strongly associated with farm P load and had a significant correlation with particulate P (PP) and soluble reactive P (SRP), which constituted 47% and 46% of the total P concentration in the drainage water, respectively. Equilibrium P concentration assays also showed the potential for SRP release from the sediment layer. The P budget established for this study reveals that sediment stores the largest P mass (333 kg P), while FAVs store the smallest P mass (8 kg P) in a farm canal, highlighting the significant contribution of canal sediment to farm P discharges. Further research is required to evaluate the impact of sediment removal and canal maintenance practices that help reduce farm P discharges.
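The correlation step pairs FAV coverage with each sediment or water quality variable; given the non-parametric framing, Spearman's rank correlation is the usual choice. A sketch with placeholder data, not study values:

```python
from scipy.stats import spearmanr

fav_coverage_pct = [10, 25, 40, 55, 60, 75, 80, 90]              # hypothetical
dop_mg_per_l = [0.02, 0.03, 0.05, 0.06, 0.05, 0.08, 0.09, 0.10]  # hypothetical

rho, p = spearmanr(fav_coverage_pct, dop_mg_per_l)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```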


Subject(s)
Agriculture , Geologic Sediments , Phosphorus , Phosphorus/analysis , Florida , Geologic Sediments/chemistry , Water Pollutants, Chemical/analysis , Environmental Monitoring
6.
Sci Total Environ ; 861: 160644, 2023 Feb 25.
Article in English | MEDLINE | ID: mdl-36464046

ABSTRACT

Nutrient retention in biochar-amended soil has yielded variable results, with poorly understood mechanisms. Identification of changes on biochar surfaces during in situ soil aging can provide mechanistic information on the role of biochar in nutrient retention. In the current greenhouse study, we analyzed changes in biochar surface characteristics during aging in two soils with different iron levels, amended with two types of manure, under corn. On pristine biochar surfaces, we detected no iron species. In contrast, after soil aging (70 days), a self-functionalization of biochar surfaces with iron oxides was observed, which can be explained by soil redox cycles allowing reduced iron(II) to migrate to biochar surfaces followed by its re-oxidation. This self-functionalization is proposed as an underlying mechanism explaining the significantly (p < 0.01) increased nitrate retention, by 29-180%, in biochar-amended soil. Significant (p < 0.05) reductions in leachate phosphate (18-41%) and dissolved organic carbon (8.8-55%) were also observed after biochar surface functionalization. Our results indicate that redox-driven iron oxide formation on biochar surfaces in the soil can be a critical process explaining the dynamic nature of nutrient retention observed in biochar-amended soils. Identifying the soil environmental conditions most beneficial for such surface functionalization, which has the potential to increase nutrient retention, is critical for implementing efficient biochar amendment strategies and for increased resource efficiency in agroecosystems.


Subject(s)
Soil Pollutants , Soil , Nitrates , Charcoal , Manure
7.
Sci Total Environ ; 869: 161712, 2023 Apr 15.
Article in English | MEDLINE | ID: mdl-36682547

ABSTRACT

Rice is planted as a rotation crop in the sugarcane-dominant Everglades Agricultural Area (EAA) in southern Florida. The Histosols in this area are unlike the mineral soils typically used to grow rice, due to their high organic content and the land subsidence caused by rapid oxidation of organic matter upon drainage. It remains unknown whether such soils pose a risk of arsenic (As) or cadmium (Cd) mobilization and uptake into rice grain. Both As and Cd are carcinogenic trace elements of concern in rice, and it is important to understand their soil-plant transfer into rice, a staple food of global importance. Here, a mesocosm pot study was conducted using two thicknesses of local soil, deep (D, 50 cm) and shallow (S, 25 cm), under three water management regimes: conventional flooding (FL), low water table (LWT), and alternating wetting and drying (AWD). Rice was grown to maturity, and plant levels of As and Cd were determined. Regardless of treatment, rice grown in these Florida Histosols had very low Cd concentrations in polished grain (1.5-5.6 µg kg⁻¹) and relatively low total As (35-150 µg kg⁻¹) and inorganic As (35-87 µg kg⁻¹) concentrations in polished grain, all below regulatory limits. This may be due to the low soil As and Cd levels, high soil cation exchange capacity from high soil organic matter content, and slightly alkaline soil pH. Grain As was significantly affected by water management (AWD < FL = LWT) and its interaction with soil thickness (AWD-D ≤ AWD-S ≤ FL-D = LWT-S = LWT-D ≤ FL-S), resulting in as much as a 62% difference among treatments. Grain Cd was significantly affected by water management (AWD > FL > LWT) without any soil thickness effect. In conclusion, even though water management has more impact on rice As and Cd than soil thickness, the low concentrations of As and Cd in rice pose little health risk for consumers.
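As a quick check on "below regulatory limits": the commonly cited Codex maximum levels for polished rice are 200 µg kg⁻¹ for inorganic As and 400 µg kg⁻¹ for Cd (an external assumption here, not from the abstract; verify current limits before relying on them). Comparing the study's reported maxima:

```python
limits_ug_kg = {"inorganic As": 200, "Cd": 400}     # assumed Codex levels
study_max_ug_kg = {"inorganic As": 87, "Cd": 5.6}   # maxima reported above

for analyte, limit in limits_ug_kg.items():
    value = study_max_ug_kg[analyte]
    verdict = "below" if value < limit else "at/above"
    print(f"{analyte}: {value} ug/kg is {verdict} the {limit} ug/kg limit")
```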


Subject(s)
Arsenic , Oryza , Soil Pollutants , Cadmium/analysis , Arsenic/analysis , Water/analysis , Soil/chemistry , Oryza/chemistry , Florida , Water Supply , Soil Pollutants/analysis
8.
J Environ Qual ; 51(2): 272-287, 2022 Mar.
Article in English | MEDLINE | ID: mdl-35045194

ABSTRACT

Animal manure has been increasingly adopted as a more sustainable substitute for synthetic fertilizers but might result in increased dissolved organic C (DOC) and phosphate (PO₄³⁻) leaching and elevated greenhouse gas emissions from soil. Biochar may reduce nutrient loss from manure-amended soils, but large-scale application has been hindered, in part, by its high cost. Minimum-cost alternatives, such as incomplete coal combustion residue (char), may provide a more viable option for farmers, but char needs to be analyzed in comparison with high-temperature pine biochar before recommendations can be made. We evaluated losses of soil C, N, and P, as well as plant yields and changes in microbial biomass, in two contrasting soils amended with dairy slurry or swine lagoon wastewater and with biochar or coal char over 105 d. Dissolved organic C leaching decreased with addition of biochar or char (0.6-27% or 1.6-36%), independent of soil texture and manure type. Leaching of PO₄³⁻ was reduced by biochar (15-24%) and char (38-50%) in the silt loam. Soil N leaching increased after char application (likely due to our high application rate) but was unaffected by biochar. Char reduced CO₂ emissions from the sandy loam by 9.7-54%, whereas both biochar and char increased CO₂ emissions in the silt loam by 38-48% during plant root senescence. Depending on soil characteristics, char may outcompete biochar with respect to reduction of PO₄³⁻ and DOC leaching. Unlike biochar, some char-N is available, and this should be accounted for when considering application rates.


Subject(s)
Manure , Soil , Animals , Carbon , Charcoal/chemistry , Coal , Nutrients , Soil/chemistry , Swine
9.
BMJ Open ; 10(9): e041370, 2020 09 28.
Article in English | MEDLINE | ID: mdl-32988953

ABSTRACT

OBJECTIVES: To use Population Health Management (PHM) methods to identify and characterise individuals at high risk of severe COVID-19 for whom shielding is required, for the purposes of managing ongoing health needs and mitigating potential shielding-induced harm. DESIGN: Individuals at 'high risk' of COVID-19 were identified using the published national 'Shielded Patient List' criteria. Individual-level information, including current chronic conditions, historical healthcare utilisation, and demographic and socioeconomic status, was used for descriptive analyses of this group using PHM methods. Segmentation used k-prototypes cluster analysis. SETTING: A major healthcare system in the South West of England, for which linked primary, secondary, community and mental health data are available in a system-wide dataset. The study was performed at a time considered to be relatively early in the COVID-19 pandemic in the UK. PARTICIPANTS: 1 013 940 individuals from 78 contributing general practices. RESULTS: Compared with the groups considered at 'low' and 'moderate' risk (ie, eligible for the annual influenza vaccination), individuals at high risk were older (median age: 68 years (IQR: 55-77 years), cf 30 years (18-44 years) and 63 years (38-73 years), respectively), with more primary care/community contacts in the previous year (median contacts: 5 (2-10), cf 0 (0-2) and 2 (0-5)) and had a higher burden of comorbidity (median Charlson score: 4 (3-6), cf 0 (0-0) and 2 (1-4)). Geospatial analyses revealed that 3.3% of rural and semi-rural residents were in the high-risk group, compared with 2.91% of urban and inner-city residents (p<0.001). Segmentation uncovered six distinct clusters within the high-risk population, with key differentiation based on age and the presence of cancer, respiratory, and mental health conditions. CONCLUSIONS: PHM methods are useful in characterising the needs of individuals requiring shielding. Segmentation of the high-risk population identified groups with distinct characteristics that may benefit from a more tailored response from health and care providers and policy-makers.
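k-prototypes extends k-means to mixed numeric/categorical data, which suits linked health records. A toy sketch using the kmodes package (hypothetical features and two clusters; the study segmented its high-risk population into six):

```python
import numpy as np
from kmodes.kprototypes import KPrototypes  # pip install kmodes

# Hypothetical mixed-type features: age, contacts last year, cancer flag, respiratory flag
X = np.array([
    [68, 5, "yes", "no"],
    [72, 9, "no", "yes"],
    [55, 2, "no", "no"],
    [80, 12, "yes", "yes"],
    [61, 4, "no", "yes"],
    [77, 7, "yes", "no"],
], dtype=object)

kp = KPrototypes(n_clusters=2, init="Cao", random_state=0)
labels = kp.fit_predict(X, categorical=[2, 3])  # columns 2-3 are categorical
print(labels)
```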


Subject(s)
Coronavirus Infections , Health Information Systems/statistics & numerical data , Pandemics , Pneumonia, Viral , Population Health Management , Risk Assessment/methods , Risk Management , Aged , Betacoronavirus , COVID-19 , Coronavirus Infections/epidemiology , Coronavirus Infections/prevention & control , Cross-Sectional Studies , Demography , England/epidemiology , Female , General Practice/statistics & numerical data , Humans , Male , Middle Aged , Needs Assessment , Pandemics/prevention & control , Pneumonia, Viral/epidemiology , Pneumonia, Viral/prevention & control , Risk Factors , Risk Management/methods , Risk Management/organization & administration , SARS-CoV-2 , Severity of Illness Index
10.
J Vet Diagn Invest ; 21(4): 523-6, 2009 Jul.
Article in English | MEDLINE | ID: mdl-19564503

ABSTRACT

Bovine herpesvirus 1 (BoHV-1) is an infectious agent of concern in the international export of bovine products; it is endemic in the United States but has been eradicated in many countries of the European Union (EU). For export of semen to the EU, accurate assessment of the BoHV-1 status of the bull is required and is usually accomplished by measuring the level of antibody to the virus. The gold standard is virus neutralization (VN) using overnight incubation with the virus, a test approved by the World Organization for Animal Health (OIE). Enzyme-linked immunosorbent assay (ELISA) is also approved for international trade. The lone U.S. Department of Agriculture-approved commercial ELISA was compromised by specificity problems, which necessitated the development of a different ELISA. Of 4 monoclonal antibodies evaluated, 1 directed against glycoprotein C of BoHV-1 was found to be the most reliable. One hundred twenty-eight characterized positive serum samples and 334 negative serum samples were tested. The blocking ELISA showed 97.7% sensitivity and 99.4% specificity compared with OIE VN. The Wisconsin Veterinary Diagnostic Laboratory ELISA fulfills the OIE requirement for a blocking or competitive ELISA to qualify animals for export to BoHV-1-free countries.
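The headline sensitivity and specificity imply approximate confusion-matrix counts (97.7% of 128 positives is about 125 detected; 99.4% of 334 negatives is about 332 correct). A sketch recomputing them:

```python
tp, fn = 125, 3   # implied: 125 of 128 positives detected by the blocking ELISA
tn, fp = 332, 2   # implied: 332 of 334 negatives correctly negative

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"Sensitivity: {sensitivity:.1%}")  # 97.7%
print(f"Specificity: {specificity:.1%}")  # 99.4%
```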


Subject(s)
Cattle Diseases/blood , Enzyme-Linked Immunosorbent Assay/veterinary , Herpesviridae Infections/veterinary , Herpesvirus 1, Bovine/isolation & purification , Animals , Cattle , Cattle Diseases/diagnosis , Cattle Diseases/virology , Enzyme-Linked Immunosorbent Assay/methods , Herpesviridae Infections/virology
11.
Water Res ; 41(1): 3-10, 2007 Jan.
Article in English | MEDLINE | ID: mdl-17113123

ABSTRACT

Enterococci, a common fecal indicator, and Staphylococcus aureus, a common skin pathogen, can be shed by bathers, affecting the quality of recreational waters and resulting in possible human health impacts. Due to the limited information available concerning human shedding of these microbes, this study focused on estimating the amounts of enterococci and S. aureus shed by bathers directly off their skin and indirectly via sand adhered to skin. Two sets of experiments were conducted at a marine beach located in Miami-Dade County, Florida. The first study, referred to as the "large pool" study, involved 10 volunteers who immersed their bodies in 4700 L of water during four 15-min cycles, with exposure to beach sand in cycles 3 and 4. The "small pool" study involved 10 volunteers who were exposed to beach sand for 30 min before they individually entered a small tub. After each individual was rinsed with off-shore marine water, sand and rinse water were collected and analyzed for enterococci. Results from the "large pool" study showed that bathers shed enterococci and S. aureus on the order of 6 × 10⁵ and 6 × 10⁶ colony forming units (CFU) per person, respectively, in the first 15-min exposure period. Significant reductions in the bacteria shed per bather (50% reductions for S. aureus and 40% for enterococci) were observed in the subsequent bathing cycles. The "small pool" study results indicated that the enterococci contribution from sand adhered to skin was small (about 2% of the total) in comparison with the amount shed directly from the bodies of the volunteers. Results indicated that bathers transport significant amounts of enterococci and S. aureus to the water column, and thus the human microbial bathing load should be considered as a non-point source when designing recreational water quality models.
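The per-person shedding figures amount to multiplying the measured water concentration by the pool volume and dividing by the number of bathers. A sketch with a hypothetical concentration chosen to land near the S. aureus figure:

```python
pool_volume_l = 4700
n_bathers = 10
conc_cfu_per_100ml = 1280   # hypothetical measured concentration

hundred_ml_units = pool_volume_l * 1000 / 100   # 100-mL units in the pool
per_bather = conc_cfu_per_100ml * hundred_ml_units / n_bathers
print(f"~{per_bather:.1e} CFU shed per bather")  # ~6.0e+06
```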


Subject(s)
Bathing Beaches , Enterococcus/isolation & purification , Seawater/microbiology , Staphylococcus aureus/isolation & purification , Water Microbiology , Bacteria , Environmental Monitoring/methods , Humans , Silicon Dioxide , Water
12.
J Vet Diagn Invest ; 29(2): 208-211, 2017 Mar.
Article in English | MEDLINE | ID: mdl-28176615

ABSTRACT

An 11-d-old Holstein bull calf was presented to the Veterinary Medical Teaching Hospital at the University of Wisconsin-Madison because of a 4-d history of diarrhea and persistent low-grade fever. Initial diagnosis was enteritis caused by Cryptosporidium and rotavirus. During hospitalization, the calf became stuporous and was only responsive to noxious stimuli, with hypotonia of all 4 limbs, tail, head, and neck. A cerebrospinal fluid analysis revealed xanthochromia, with marked lymphocytic pleocytosis, which was suggestive of viral meningitis and/or encephalitis. Aichivirus B, which belongs to the Kobuvirus genus, was tentatively identified in spinal fluid by next-generation DNA sequencing. This virus can affect a multitude of species, including humans and cattle, and has been isolated from both healthy and diarrheic individuals. However, to date, a possible connection with neurologic disease has not been described, to our knowledge.


Subject(s)
Cattle Diseases/diagnosis , Kobuvirus/isolation & purification , Picornaviridae Infections/veterinary , Animals , Animals, Newborn , Cattle , Cattle Diseases/virology , Diagnosis, Differential , Diarrhea/veterinary , Kobuvirus/genetics , Male , Picornaviridae Infections/diagnosis , Wisconsin
13.
PLoS One ; 11(9): e0162104, 2016.
Article in English | MEDLINE | ID: mdl-27583363

ABSTRACT

Climate change may affect the ability of soil-based onsite wastewater treatment systems (OWTS) to treat wastewater in coastal regions of the Northeastern United States. Higher temperatures and water tables can affect treatment by reducing the volume of unsaturated soil and oxygen available for treatment, which may result in greater transport of pathogens, nutrients, and biochemical oxygen demand (BOD₅) to groundwater, jeopardizing public and aquatic ecosystem health. The soil treatment area (STA) of an OWTS removes contaminants as wastewater percolates through the soil. Conventional STAs receive wastewater from the septic tank, with infiltration occurring deeper in the soil profile. In contrast, shallow narrow STAs receive pre-treated wastewater that infiltrates higher in the soil profile, which may make them more resilient to climate change. We used intact soil mesocosms to quantify the water quality functions of a conventional and two types of shallow narrow STAs under present climate (PC; 20°C) and climate change (CC; 25°C, 30 cm elevation in water table). Significantly greater removal of BOD₅ was observed under CC for all STA types. Phosphorus removal decreased significantly from 75% (PC) to 66% (CC) in the conventional STA, and from 100% to 71-72% in shallow narrow STAs. No fecal coliform bacteria (FCB) were released under PC, whereas up to 17 and 20 CFU 100 mL⁻¹ were released in conventional and shallow narrow STAs, respectively, under CC. Total N removal increased from 14% (PC) to 19% (CC) in the conventional STA but decreased in shallow narrow STAs, from 6-7% to less than 3.0%. Differences in removal of FCB and total N were not significant. Leaching of N in excess of inputs was also observed in shallow narrow STAs under CC. Our results indicate that climate change can affect contaminant removal from wastewater, with effects dependent on the contaminant and STA type.


Subject(s)
Biological Oxygen Demand Analysis , Climate Change , Phosphorus/isolation & purification , Waste Disposal, Fluid/methods , New England , Nitrogen/isolation & purification
15.
West J Emerg Med ; 14(2): 158-60, 2013 Mar.
Article in English | MEDLINE | ID: mdl-23599858
16.
J Lipid Res ; 47(5): 1097-111, 2006 May.
Article in English | MEDLINE | ID: mdl-16479018

ABSTRACT

The LIPID MAPS Consortium (www.lipidmaps.org) is developing comprehensive procedures for identifying all lipids of the macrophage following activation by endotoxin. The goal is to quantify temporal and spatial changes in lipids that occur with cellular metabolism and to develop bioinformatic approaches that establish dynamic lipid networks. To achieve these aims, an endotoxin of the highest possible analytical specification is crucial. We now report a large-scale preparation of 3-deoxy-D-manno-octulosonic acid (Kdo)₂-Lipid A, a nearly homogeneous Re lipopolysaccharide (LPS) substructure with endotoxin activity equal to LPS. Kdo₂-Lipid A was extracted from 2 kg of cell paste of a heptose-deficient Escherichia coli mutant. It was purified by chromatography on silica, DEAE-cellulose, and C18 reverse-phase resin. Structure and purity were evaluated by electrospray ionization mass spectrometry, liquid chromatography/mass spectrometry, and ¹H-NMR. Its bioactivity was compared with LPS in RAW 264.7 cells and bone marrow macrophages from wild-type and toll-like receptor 4 (TLR-4)-deficient mice. Cytokine and eicosanoid production, in conjunction with gene expression profiling, were employed as readouts. Kdo₂-Lipid A is comparable to LPS by these criteria. Its activity is reduced by >10³ in cells from TLR-4-deficient mice. The purity of Kdo₂-Lipid A should facilitate structural analysis of complexes with receptors like TLR-4/MD2.


Subject(s)
Lipopolysaccharides/pharmacology , Macrophage Activation/drug effects , Toll-Like Receptor 4/physiology , Animals , Chromatography, High Pressure Liquid/methods , Escherichia coli/metabolism , Lipopolysaccharides/isolation & purification , Mice , Nuclear Magnetic Resonance, Biomolecular , Prostaglandin D2/metabolism , Spectrometry, Mass, Electrospray Ionization
17.
Br J Clin Pharmacol ; 58(5): 521-7, 2004 Nov.
Article in English | MEDLINE | ID: mdl-15521900

ABSTRACT

AIM: To compare gentamicin dose estimates from four predictive methods. METHODS: A retrospective study was conducted, comprising patients at Fremantle Hospital who received gentamicin therapy and had at least one gentamicin serum concentration reported. A manual calculation method, the Australian 'Therapeutic Guidelines: Antibiotic' (TGA) nomogram, and the SeBA-GEN and DoseCalc software packages were compared. SeBA-GEN dose estimates were regarded as the reference standard. RESULTS: There were 64 males and 30 females with a mean age of 58 ± 16 years. In patients with moderate renal impairment (CLCr = 30-60 mL min⁻¹; n = 21), mean dose estimates using DoseCalc and the manual calculation method were comparable to SeBA-GEN, but the mean TGA nomogram dose (230 mg; 95% confidence interval 179, 281) was significantly lower than SeBA-GEN (286 mg; 261, 311; P = 0.002; one-way RM ANOVA). In patients with mild renal impairment (CLCr = 60-90 mL min⁻¹; n = 48), DoseCalc (392 mg; 367, 427) was comparable to SeBA-GEN (377 mg; 362, 392). Although the manual method (341 mg; 306, 376; P = 0.007) and the TGA nomogram (335 mg; 302, 368; P < 0.001) estimates were significantly lower than SeBA-GEN, the practical difference was modest. CONCLUSIONS: SeBA-GEN and DoseCalc are generally comparable for estimation of gentamicin doses in patients with renal impairment. The 'Therapeutic Guidelines: Antibiotic' nomogram is a valid approach to dosage estimation, but only when used in patients with normal renal function. Simple manual calculations are a suitable alternative in patients with renal impairment.
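The renal-impairment bands above are defined by creatinine clearance; the abstract does not name the estimating formula, but Cockcroft-Gault is the conventional bedside calculation. A sketch under that assumption (serum creatinine in µmol/L; the 0.815 denominator is the standard SI-unit form of the equation):

```python
def cockcroft_gault_clcr(age_yr: float, weight_kg: float,
                         serum_cr_umol_l: float, female: bool) -> float:
    # Estimated creatinine clearance (mL/min), Cockcroft-Gault equation.
    clcr = (140 - age_yr) * weight_kg / (0.815 * serum_cr_umol_l)
    return clcr * 0.85 if female else clcr

# Hypothetical patient falling in the 'moderate impairment' band (30-60 mL/min)
print(f"CLCr ~ {cockcroft_gault_clcr(70, 72, 160, female=False):.0f} mL/min")  # ~39
```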


Subject(s)
Anti-Bacterial Agents/administration & dosage , Drug Therapy, Computer-Assisted/standards , Gentamicins/administration & dosage , Software/standards , Drug Administration Schedule , Female , Humans , Male , Middle Aged , Practice Guidelines as Topic , Predictive Value of Tests , Retrospective Studies , Western Australia