ABSTRACT
Enteric bacterial pathogen levels can influence the suitability of irrigation water sources for fruits and vegetables. We hypothesized that stable spatial patterns of Salmonella enterica and Listeria monocytogenes levels may exist across surface water sources in the Mid-Atlantic U.S. Water samples were collected at four stream sites and two pond sites in the Mid-Atlantic U.S. over 2 years, biweekly during the fruit and vegetable growing seasons and once a month during nongrowing seasons. Two stream sites and one pond site had significantly different mean concentrations between growing and nongrowing seasons. Stable spatial patterns were determined for the relative differences between individual site concentrations and the average concentration of both pathogens across the study area. Mean relative differences were significantly different from zero at four of the six sites for S. enterica and at three of six sites for L. monocytogenes. The distribution of mean relative differences across sites was similar for the growing season, the nongrowing season, and the entire observation period. Mean relative differences were also determined for temperature, oxidation-reduction potential, specific electrical conductance, pH, dissolved oxygen, turbidity, and cumulative rainfall. A moderate-to-strong Spearman correlation (rs > 0.657) was found between spatial patterns of S. enterica and 7-day rainfall, and between relative difference patterns of L. monocytogenes and temperature (rs = 0.885) and dissolved oxygen (rs = -0.885). Persistence in the ranking of sampling sites by the concentrations of the two pathogens was also observed. Finding spatially stable patterns in pathogen concentrations highlights the spatiotemporal dynamics of these microorganisms across the study area and can facilitate the design of an effective microbial water quality monitoring program for surface irrigation water.
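The mean-relative-difference (temporal stability) analysis described above can be sketched in a few lines of pure Python. The site names and concentrations below are hypothetical illustrations, not the study's data, and the Spearman implementation assumes untied ranks.

```python
def mean_relative_difference(series_by_site):
    """For each sampling date, express each site's concentration as a
    relative difference from that date's across-site mean, then average
    those relative differences over all dates for each site."""
    sites = list(series_by_site)
    n_dates = len(next(iter(series_by_site.values())))
    rel = {s: [] for s in sites}
    for t in range(n_dates):
        date_mean = sum(series_by_site[s][t] for s in sites) / len(sites)
        for s in sites:
            rel[s].append((series_by_site[s][t] - date_mean) / date_mean)
    return {s: sum(v) / len(v) for s, v in rel.items()}

def spearman(x, y):
    """Spearman rank correlation (no tied ranks assumed in this sketch)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, 1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical concentrations at three sites over four sampling dates
conc = {"pond_A": [2.0, 2.2, 1.8, 2.1],
        "stream_B": [1.0, 1.1, 0.9, 1.2],
        "stream_C": [3.0, 2.9, 3.2, 3.1]}
mrd = mean_relative_difference(conc)
# Correlate the site pattern with a second hypothetical spatial pattern
rs = spearman([mrd[s] for s in conc], [0.1, -0.4, 0.5])
```

A site with a mean relative difference near zero tracks the study-area average; persistently positive or negative values are what make a spatial pattern "stable" and thus useful for choosing representative monitoring sites.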
Subjects
Listeria monocytogenes, Salmonella enterica, Mid-Atlantic Region, Water Quality, Seasons
ABSTRACT
Heartland virus (HRTV) disease is an emerging tickborne illness in the midwestern and southern United States. We describe a reported fatal case of HRTV infection in the Maryland and Virginia region, states not widely recognized to have human HRTV disease cases. The range of HRTV could be expanding in the United States.
Subjects
Bunyaviridae Infections, Phlebovirus, Virus Diseases, United States/epidemiology, Humans, Bunyaviridae Infections/diagnosis, Phlebovirus/genetics, Mid-Atlantic Region
ABSTRACT
Tidal wetlands in the Mid-Atlantic, USA, are experiencing high rates of relative sea level rise, and it is unclear whether they will be resilient in the face of future flooding increases. In a previous study, we found 80% of our study areas in tidal freshwater and salt marshes in the Delaware Estuary and Barnegat Bay had elevation change rates lower than the 19-year increase in mean sea level. Here, we examine relationships between marsh elevation dynamics and abiotic and biotic parameters in order to assess their utility as indicators of vulnerability to relative sea level rise. We further apply a range of marsh vulnerability indicators including elevation change rates to evaluate their ability to corroborate marsh habitat change over the last 30 years. Of the field measurements, soil bulk density and belowground plant biomass were among the strongest predictors of elevation change and accretion dynamics across all marsh types and settings. Both tidal freshwater and salt marshes tended to have higher rates of elevation increase and surface accretion in areas where soil bulk density and live belowground biomass were higher. Nine of the ten marshes experienced a net loss of area from the 1970s to 2015 ranging from 0.05 to 14%. Although tidal freshwater marshes were low in elevation and experienced variable elevation change rates, marsh area loss was low. Conversely, salt marshes closest to the coast and perched high in the tidal frame with a higher degree of human modification tended to experience the greatest marsh loss, which incorporated anthropogenic impacts and edge erosion. Thus, our regional assessment points to the need for a comprehensive understanding of factors that influence marsh resilience including human modifications and geomorphic settings.
Subjects
Sea Level Rise, Wetlands, Ecosystem, Environmental Monitoring, Estuaries, Mid-Atlantic Region
ABSTRACT
Apple orchards with minimal or reduced fungicide inputs in the Mid-Atlantic region of the United States have experienced outbreaks of severe premature defoliation, with symptoms that matched those of apple blotch disease (ABD) caused by Diplocarpon coronariae. Fungal isolates obtained from symptomatic apple leaves and fruit produced uniform, slow-growing, dark-gray colonies bearing conidia on peptone potato dextrose agar. Internal transcribed spacer DNA sequences matched those of D. coronariae, and Koch's postulates were fulfilled when typical ABD symptoms developed after isolates were reinoculated onto apple leaves and fruit. Spore dispersal in nonfungicide-treated orchards, monitored with quantitative PCR, was low in early spring, dropped to undetectable levels in late May and early June, and then rose exponentially to highs in July and August, coinciding with symptom development. Only low spore numbers were detected in fungicide-treated orchards and nearby forests. In preliminary fungicide tests, fluxapyroxad, thiophanate-methyl, and difenoconazole effectively inhibited mycelial growth of isolates in vitro. When the apple cultivars Fuji and Honeycrisp were inoculated with D. coronariae, Honeycrisp showed delayed onset of symptoms and lower disease severity, and the transcription profile of seven host defense-related genes showed that PR-2, PR-8, LYK4, and CERK1 were highly induced in Honeycrisp at 2 and 5 days postinoculation. This is the first report of ABD in the Mid-Atlantic United States; it includes studies of seasonal D. coronariae spore dispersal patterns, preliminary fungicide efficacy, and host defense-related gene expression to assist development of best ABD management practices.
Subjects
Ascomycota, Industrial Fungicides, Malus, Fruit/microbiology, Industrial Fungicides/pharmacology, Malus/microbiology, Mid-Atlantic Region, United States
ABSTRACT
In the current review, we examine the regional history, ecology, and epidemiology of eastern equine encephalitis virus (EEEV) to investigate the major drivers of disease outbreaks in the northeastern United States. EEEV was first recognized as a public health threat during an outbreak in eastern Massachusetts in 1938, but historical evidence for equine epizootics dates back to the 1800s. Since then, sporadic disease outbreaks have recurred in the Northeast, with increasing frequency and northward expansion of human cases during the last 20 yr. Culiseta melanura (Coquillett) (Diptera: Culicidae) serves as the main enzootic vector that drives EEEV transmission among wild birds, but this mosquito species will occasionally feed on mammals. Several species have been implicated as bridge vectors to horses and humans, with Coquillettidia perturbans (Walker) as a leading suspect based on its opportunistic feeding behavior, vector competence, and high infection rates during recent disease outbreaks. A diversity of bird species are reservoir competent, exposed to EEEV, and serve as hosts for Cs. melanura, with a few species, including the wood thrush (Hylocichla mustelina) and the American robin (Turdus migratorius), contributing disproportionately to virus transmission based on available evidence. The major factors responsible for the sustained resurgence of EEEV are considered and may be linked to regional landscape and climate changes that support higher mosquito densities and more intense virus transmission.
Subjects
Birds/virology, Disease Reservoirs/virology, Eastern Equine Encephalitis Virus/physiology, Equine Encephalomyelitis, Horse Diseases, Mosquito Vectors, Animals, Equine Encephalomyelitis/epidemiology, Equine Encephalomyelitis/transmission, Equine Encephalomyelitis/veterinary, Equine Encephalomyelitis/virology, Horse Diseases/epidemiology, Horse Diseases/transmission, Horse Diseases/virology, Horses, Humans, Mid-Atlantic Region/epidemiology, New England/epidemiology
ABSTRACT
INTRODUCTION: While the role of palliative care in the emergency department is recognized, barriers against the effective integration of palliative interventions and emergency care remain. We examined the association between goals-of-care and palliative care consultations and healthcare utilization outcomes in older adult patients who presented to the emergency department (ED) with sepsis. METHODS: We performed a retrospective review of 197 patients aged 65 years and older who presented to the ED with sepsis or septic shock. Healthcare utilization outcomes were compared between patients divided into 3 groups: no palliative care consultation, palliative care consultation within 4 days of admission (i.e., early consultation), and palliative care consultation after 4 days of admission (i.e., late consultation). RESULTS: 51% of patients did not receive any palliative consultation, 39% of patients underwent an early palliative care consultation (within 4 days), and 10% of patients underwent a late palliative care consultation (after 4 days). Patients who received late palliative care consultation had a significantly increased number of procedures, total length of stay, ICU length of stay, and cost (p < .01, p < .001, p < .05, p < .001; respectively). Regarding early palliative care consultation, there were no statistically significant associations between this intervention and our outcomes of interest; however, we noted a trend towards decreased total length of stay and decreased healthcare cost. CONCLUSION: In patients aged 65 years and older who presented to the ED with sepsis, early palliative consultations were associated with reduced healthcare utilization as compared to late palliative consultations.
Subjects
Hospital Emergency Service/organization & administration, Facilities and Services Utilization/statistics & numerical data, Palliative Care/organization & administration, Referral and Consultation/organization & administration, Sepsis/therapy, Aged, Aged 80 and over, Hospital Emergency Service/economics, Facilities and Services Utilization/economics, Female, Health Care Costs/statistics & numerical data, Humans, Length of Stay/economics, Length of Stay/statistics & numerical data, Linear Models, Male, Mid-Atlantic Region, Palliative Care/economics, Palliative Care/methods, Patient Care Planning, Referral and Consultation/economics, Retrospective Studies, Time Factors
ABSTRACT
INTRODUCTION: Emergency nurses work under sometimes uncertain conditions to provide care to patients with all kinds of illnesses and afflictions from all segments of the population. Despite implications that they must work together to provide efficient and effective patient care, few studies explore reciprocal workplace relationships of emergency nurses. AIM: This research sought to illuminate the lived experience of workplace reciprocity of emergency nurses. METHODS: Using a phenomenological approach with a snowball sampling technique, unstructured, open-ended interviews were conducted with emergency nurses in the mid-Atlantic region of the United States. The original study was conducted in 2013 (n = 9) and a replication study in 2018 (n = 7). Data were collected and analyzed using Giorgi's Phenomenological Method. Results from each study were evaluated for thematic congruence. RESULTS: Six themes of workplace reciprocity of emergency nurses were identified in both studies: emergency department (ED) culture, balancing, technology, caring, bridging, and connection. An additional theme, bonding, was identified in the replication study. CONCLUSIONS: Exploring workplace reciprocity of emergency nurses provided insight into the influences on workplace relationships. Establishing and nurturing workplace reciprocity may create a culture of safety and connection, enhance work engagement, and influence nurse recruitment and retention.
Subjects
Nurses, Workplace, Hospital Emergency Service, Humans, Mid-Atlantic Region, Qualitative Research, United States
ABSTRACT
BACKGROUND: Perforated peptic ulcer is a morbid emergency general surgery condition. Best practices for postoperative care remain undefined. Surgical dogma preaches practices such as peritoneal drain placement, prolonged nil per os, and routine postoperative enteral contrast imaging despite a lack of evidence. We aimed to evaluate the role of postoperative enteral contrast imaging in postoperative perforated peptic ulcer care. Our primary objective was to assess effects of routine postoperative enteral contrast imaging on early detection of clinically significant leaks. METHODS: We conducted a multicenter retrospective cohort study of patients who underwent repair of perforated peptic ulcer between July 2016 and June 2018. We compared outcomes between those who underwent routine postoperative enteral contrast imaging and those who did not. RESULTS: Our analysis included 95 patients who underwent primary/omental patch repair. The mean age was 60 years, and 54% were male. Thirteen (14%) had a leak. Eighty percent of patients had a drain placed. Nine patients had leaks diagnosed based on bilious drain output without routine postoperative enteral contrast imaging. Use of routine postoperative enteral contrast imaging varied significantly between institutions (30%-87%). Two late leaks after initial normal postoperative enteral contrast imaging were confirmed by imaging after a clinical change triggered the second study. Two patients had contained leaks identified by routine postoperative enteral contrast imaging but remained clinically well. Duration of hospital stay was longer in those who received routine postoperative enteral contrast imaging (12 vs 6 days, median; P < .001). CONCLUSION: Routine postoperative enteral contrast imaging after perforated peptic ulcer repair likely does not improve the detection of clinically significant leaks and is associated with increased duration of hospital stay.
Subjects
Digestive System Surgical Procedures/statistics & numerical data, Peptic Ulcer Perforation/surgery, Postoperative Complications/diagnostic imaging, Aged, Colorado/epidemiology, Contrast Media, Female, Humans, Male, Mid-Atlantic Region/epidemiology, Middle Aged, Peptic Ulcer Perforation/diagnostic imaging, Postoperative Complications/epidemiology, Radiography, Retrospective Studies
ABSTRACT
Tick-borne illnesses have been on the rise in the United States, with reported cases up sharply in the past two decades. In this literature review, we synthesize the available research on the relationship between vegetation and tick abundance for four tick species in the northeastern United States that are of potential medical importance to humans. The blacklegged tick (Ixodes scapularis) (Say; Acari: Ixodidae) is found to be positively associated with closed canopy forests and dense vegetation thickets, and negatively associated with open canopy environments, such as grasslands or old agricultural fields. The American dog tick (Dermacentor variabilis) (Say; Acari: Ixodidae) has little habitat overlap with I. scapularis, with abundance highest in grasses and open-canopy fields. The lone star tick (Amblyomma americanum) (Linnaeus; Acari: Ixodidae) is a habitat generalist without consistent associations with particular types of vegetation. The habitat associations of the recently introduced Asian longhorned tick (Haemaphysalis longicornis) (Neumann; Acari: Ixodidae) in the northeastern United States, and in other regions where it has invaded, are still unknown, although based on studies in its native range, it is likely to be found in grasslands and open-canopy habitats.
Subjects
Animal Distribution, Arachnid Vectors/physiology, Ixodidae/physiology, Plants, Animals, Biota, Mid-Atlantic Region, New England, Ontario, Population Dynamics, Wisconsin
ABSTRACT
Enteric viruses (EVs) are the largest contributors to foodborne illnesses and outbreaks globally. Their ability to persist in the environment, coupled with the challenges experienced in environmental monitoring, creates a critical aperture through which agricultural crops may become contaminated. This study involved a 17-month investigation of select human EVs and viral indicators in nontraditional irrigation water sources (surface and reclaimed waters) in the Mid-Atlantic region of the United States. Real-time quantitative PCR was used for detection of Aichi virus, hepatitis A virus, and norovirus genotypes I and II (GI and GII, respectively). Pepper mild mottle virus (PMMoV), a common viral indicator of human fecal contamination, was also evaluated, along with atmospheric (air and water temperature, cloud cover, and precipitation 24 h, 7 days, and 14 days prior to sample collection) and physicochemical (dissolved oxygen, pH, salinity, and turbidity) data, to determine whether there were any associations between EVs and measured parameters. EVs were detected more frequently in reclaimed waters (32% [n = 22]) than in surface waters (4% [n = 49]), similar to PMMoV detection frequency in surface (33% [n = 42]) and reclaimed (67% [n = 21]) waters. Our data show a significant correlation between EV and PMMoV (R2 = 0.628, P < 0.05) detection levels in reclaimed water samples but not in surface water samples (R2 = 0.476, P = 0.78). Water salinity significantly affected the detection of both EVs and PMMoV (P < 0.05), as demonstrated by logistic regression analyses. These results provide relevant insights into the extent and degree of association between human (pathogenic) EVs and water quality data in Mid-Atlantic surface and reclaimed waters as potential sources for agricultural irrigation. IMPORTANCE: Microbiological analysis of agricultural waters is fundamental to ensuring microbial food safety. The highly variable nature of nontraditional sources of irrigation water makes them particularly difficult to test for the presence of viruses. Multiple characteristics influence viral persistence in a water source, as well as affecting the recovery and detection methods that are employed. Testing for a suite of viruses in water samples is often too costly and labor-intensive, making identification of suitable indicators for viral pathogen contamination necessary. The results from this study address two critical data gaps, namely, EV prevalence in surface and reclaimed waters of the Mid-Atlantic region of the United States and subsequent evaluation of physicochemical and atmospheric parameters used to inform the potential for the use of indicators of viral contamination.
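The logistic-regression step — modeling a binary detection outcome as a function of a physicochemical predictor such as salinity — can be sketched as follows. The salinity readings and detection outcomes are hypothetical illustrations, and the hand-rolled gradient-ascent fit stands in for whatever statistical package the study actually used.

```python
import math

def fit_logistic(x, y, lr=0.01, steps=20000):
    """Single-predictor logistic regression fit by gradient ascent on the
    log-likelihood (a sketch; a real analysis would use a stats package)."""
    b0 = b1 = 0.0
    n = len(x)
    for _ in range(steps):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            g0 += yi - p          # gradient w.r.t. intercept
            g1 += (yi - p) * xi   # gradient w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Hypothetical salinity readings (ppt) and viral detection outcomes
# (1 = detected) -- illustrative values, not the study's measurements.
salinity = [0.5, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 8.0]
detected = [0, 0, 0, 1, 0, 1, 1, 1]
b0, b1 = fit_logistic(salinity, detected)
```

A positive fitted slope means the modeled odds of detection rise with salinity; the abstract reports only that salinity was a significant predictor, not the direction, so the toy data here encode one possible direction for illustration.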
Subjects
Agricultural Irrigation, Enterovirus/isolation & purification, Tobamovirus/isolation & purification, Water Pollutants/analysis, Environmental Monitoring, Hydrogen-Ion Concentration, Mid-Atlantic Region, Oxygen/analysis, Salinity, Water Microbiology, Water Pollution/analysis
ABSTRACT
Certified Registered Nurse Anesthetists (CRNAs) are uniquely skilled anesthesia providers with substantial experience managing critically ill patients. During the coronavirus disease 2019 (COVID) pandemic, CRNAs at a large academic medical center in the Mid-Atlantic United States experienced a shift in their daily responsibilities. As the hospital transitioned to the management of patients who tested positive for the virus that causes COVID, the severe acute respiratory syndrome-coronavirus type 2 (SARS-CoV-2), CRNAs were redeployed into the roles of respiratory therapists and intensive care unit registered nurses. Although facing the stress of the global pandemic, this facility's CRNAs proved to be flexible, capable, and necessary members of the care team for patients with COVID-19.
Subjects
COVID-19/nursing, COVID-19/psychology, Nurse Anesthetists/psychology, Nurse's Role/psychology, Personnel Staffing and Scheduling/statistics & numerical data, Professional Role, Workload/statistics & numerical data, Adult, Female, Humans, Male, Mid-Atlantic Region, Middle Aged, Nurse Anesthetists/statistics & numerical data, Pandemics, SARS-CoV-2
ABSTRACT
The present study aimed to expand weight stigma theoretical models by accounting for central tenets of prominent eating disorder (ED) theories and increasing the generalizability of existing models for individuals across the weight spectrum. College students (Sample 1: N = 1228; Sample 2: N = 1368) completed online surveys assessing stigma and ED symptoms. In each sample, separately, multi-group path analyses tested whether body mass index (BMI) classification (underweight/average weight, overweight, obese) moderated a model wherein weight stigma experiences were sequentially associated with weight bias internalization, body dissatisfaction, and five ED symptoms: binge eating, purging, restricting, excessive exercise, muscle building behaviors. Results supported the assessed model overall and for individuals in each BMI class, separately. Although patterns of associations differed for individuals with different BMIs, these variations were limited. The present findings suggest that the adverse impact of weight stigma on distinct ED symptoms is not limited to individuals with elevated BMIs and that these associations are generally explained by the same mechanisms. Weight stigma interventions that focus on decreasing weight bias internalization and body dissatisfaction are recommended for individuals across the weight spectrum. Further examination of associations between weight stigma and multiple ED symptoms, beyond disinhibited eating, is supported.
Subjects
Body Dissatisfaction/psychology, Body Image/psychology, Feeding and Eating Disorders/psychology, Weight Prejudice/psychology, Adolescent, Adult, Body Mass Index, Female, Humans, Male, Mid-Atlantic Region, Students/psychology, Students/statistics & numerical data, Surveys and Questionnaires, Universities, Young Adult
ABSTRACT
While urban greenspace is increasingly recognized as important to mental health, its role in substance use is understudied. This exploratory study investigates the interaction of greenspace with peer network health, sex, and executive function (EF) in models of substance use among a sample of disadvantaged, urban youth. Adolescents and their parents were recruited from a hospital in the mid-Atlantic region of the U.S. Residential greenspace at the streetscape level was derived from analysis of Google Street View imagery. Logistic regression models were used to test the moderating effect of greenspace on the association between peer network health and substance use, as well as additional moderating effects of sex and EF. The significant negative association of peer network health with substance use occurred only among youth residing in high greenspace environments, a moderating effect which was stronger among youth with high EF deficit. The moderating effect of greenspace did not differ between girls and boys. Greenspace may play an important role in moderating peer influences on substance use among disadvantaged, urban adolescents, and such moderation may differ according to an individual's level of EF. This research provides evidence of differences in environmental susceptibility regarding contextual mechanisms of substance use among youth, and it informs the development of targeted substance use interventions that leverage social and environmental influences on adolescent substance use.
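A moderation analysis like the one described — testing whether greenspace changes the association between peer network health and substance use — reduces to a logistic regression with an interaction term. The sketch below uses hypothetical binary indicators and a hand-rolled fit, not the study's data or model; the constructed cell counts make peer network health protective only in the high-greenspace group, so the interaction coefficient comes out negative.

```python
import math

def fit_logistic(rows, y, lr=0.1, steps=20000):
    """Gradient-ascent fit of a logistic model; each row is a feature
    vector with a leading 1 for the intercept (a sketch, not a package)."""
    k = len(rows[0])
    b = [0.0] * k
    n = len(rows)
    for _ in range(steps):
        g = [0.0] * k
        for r, yi in zip(rows, y):
            p = 1.0 / (1.0 + math.exp(-sum(bj * xj for bj, xj in zip(b, r))))
            for j in range(k):
                g[j] += (yi - p) * r[j]
        b = [bj + lr * gj / n for bj, gj in zip(b, g)]
    return b

# Hypothetical 2x2 cell counts (peer network health x greenspace):
# substance use drops with healthy peer networks ONLY when greenspace
# is high, which shows up as a negative interaction coefficient.
rows, y = [], []
for peer, green, used, total in [(0, 0, 6, 10), (1, 0, 6, 10),
                                 (0, 1, 6, 10), (1, 1, 2, 10)]:
    for i in range(total):
        rows.append([1.0, peer, green, peer * green])
        y.append(1 if i < used else 0)
b0, b_peer, b_green, b_inter = fit_logistic(rows, y)
```

With four parameters and four cells the model is saturated, so the fitted coefficients are log-odds contrasts between cells: the main effects are near zero and the interaction carries the whole protective association, mirroring the paper's finding that the peer effect appeared only in high-greenspace environments.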
Subjects
Recreational Parks, Substance-Related Disorders, Adolescent, Executive Function, Female, Humans, Male, Mid-Atlantic Region, Peer Group, Substance-Related Disorders/epidemiology
ABSTRACT
OBJECTIVES: To describe real-time changes in medical visits (MVs), visit mode, and patient-reported visit experience associated with rapidly deployed care reorganization during the coronavirus disease 2019 (COVID-19) pandemic. STUDY DESIGN: Cross-sectional time series from September 29, 2019, through June 20, 2020. METHODS: Responding to official public health and clinical guidance, team-based systematic structural changes were implemented in a large, integrated health system to reorganize and transition delivery of care from office-based to virtual care platforms. Overall and discipline-specific weekly MVs, visit mode (office-based, telephone, or video), and associated aggregate measures of patient-reported visit experience were reported. A 38-week time-series analysis with March 8, 2020, and May 3, 2020, as the interruption dates was performed. RESULTS: After the first interruption, there was a decreased weekly visit trend for all visits (β3 = -388.94; P < .05), an immediate decrease in office-based visits (β2 = -25,175.16; P < .01), increase in telephone-based visits (β2 = 17,179.60; P < .01), and increased video-based visit trend (β3 = 282.02; P < .01). After the second interruption, there was an increased visit trend for all visits (β5 = 565.76; P < .01), immediate increase in video-based visits (β4 = 3523.79; P < .05), increased office-based visit trend (β5 = 998.13; P < .01), and decreased trend in video-based visits (β5 = -360.22; P < .01). After the second interruption, there were increased weekly long-term visit trends for the proportion of patients reporting "excellent" as to how well their visit needs were met for all visits (β5 = 0.17; P < .01), telephone-based visits (β5 = 0.34; P < .01), and video-based visits (β5 = 0.32; P < .01). Video-based visits had the highest proportion of respondents rating "excellent" as to how well their scheduling and visit needs were met. CONCLUSIONS: COVID-19 required prompt organizational transformation to optimize the patient experience.
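The β coefficients reported above come from segmented (interrupted time series) regression: an intercept, a pre-interruption slope, a level change at the interruption, and a slope change after it. The sketch below fits that model by ordinary least squares on hypothetical, noise-free weekly visit counts with one interruption; the interruption week and all numbers are illustrative, not the study's data.

```python
def ols(X, y):
    """Ordinary least squares via the normal equations, solved by Gaussian
    elimination with partial pivoting; fine for small design matrices."""
    k = len(X[0])
    A = [[sum(r[p] * r[q] for r in X) for q in range(k)] for p in range(k)]
    b = [sum(r[p] * yi for r, yi in zip(X, y)) for p in range(k)]
    for c in range(k):
        piv = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            for j in range(c, k):
                A[r][j] -= f * A[c][j]
            b[r] -= f * b[c]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][j] * beta[j]
                              for j in range(r + 1, k))) / A[r][r]
    return beta

T0 = 10  # hypothetical interruption week
weeks = list(range(20))
# Noise-free toy series: baseline level 100, slope +1; at the interruption
# the level drops by 30 and the slope changes by -3.
visits = [100 + t if t < T0 else 100 + t - 30 - 3 * (t - T0) for t in weeks]
# Design: [intercept, week, post-interruption indicator, weeks since T0]
X = [[1.0, float(t), float(t >= T0), float(max(0, t - T0))] for t in weeks]
beta = ols(X, visits)  # [level, pre-slope, level change, slope change]
```

Because the toy series is noise-free, the fit recovers the generating parameters exactly; with real data, the same design yields the estimated level and trend changes (β2, β3, ... in the abstract's notation) plus standard errors from a stats package.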
Subjects
Appointments and Schedules, Delivery of Health Care/organization & administration, Managed Care Programs/organization & administration, Office Visits/trends, Telemedicine/trends, COVID-19/epidemiology, Cross-Sectional Studies, Delivery of Health Care/economics, Humans, Interrupted Time Series Analysis, Managed Care Programs/economics, Mid-Atlantic Region
ABSTRACT
Apple growers in the Mid-Atlantic region of the United States have been reporting an increase in losses to bitter rot of apple and are requesting up-to-date management recommendations. Management is complicated by variations in apple cultivar susceptibility, temperature, rainfall, and biology of the Colletotrichum spp. that cause bitter rot. Over 500 apple fruit with bitter rot were obtained from 38 orchards across the Mid-Atlantic and the causal species were identified as Colletotrichum fioriniae and C. nymphaeae of the C. acutatum species complex and C. chrysophilum, C. noveboracense, C. siamense, C. fructicola, C. henanense, and C. gloeosporioides sensu stricto of the C. gloeosporioides species complex, the latter two being first reports. Species with faster in vitro growth rates at higher temperatures were more abundant in warmer regions of the Mid-Atlantic, while those with slower growth rates at higher temperatures were more abundant in cooler regions. Regional bloom dates are earlier and weather data show a gradual warming trend that likely influenced but was not necessarily the main cause of the recent increase in bitter rot in the region. A grower survey of apple cultivar susceptibility showed high variation, with the increase in acres planted to the highly susceptible cultivar Honeycrisp broadly corresponding to the increase in reports of bitter rot. These results form a basis for future studies on the biology and ecology of the Colletotrichum spp. responsible, and suggest that integrated bitter rot management must begin with selection of less-susceptible apple cultivars.
Subjects
Colletotrichum, Malus, Mid-Atlantic Region, Plant Diseases, United States, Weather
ABSTRACT
ABSTRACT: A total of 482 veal cutlet, 555 ground veal, and 540 ground beef samples were purchased from retail establishments in the mid-Atlantic region of the United States over a noncontiguous 2-year period between 2014 and 2017. Samples (325 g each) were individually enriched and screened via real-time PCR for all seven regulated serogroups of Shiga toxin-producing Escherichia coli (STEC). Presumptive STEC-positive samples were subjected to serogroup-specific immunomagnetic separation and plated onto selective media. Up to five isolates typical for STEC from each sample were analyzed via multiplex PCR for both the virulence genes (i.e., eae, stx1 and/or stx2, and ehxA) and serogroup-specific gene(s) for the seven regulated STEC serogroups. The recovery rates of non-O157 STEC from veal cutlets (3.94%, 19 of 482 samples) and ground veal (7.03%, 39 of 555 samples) were significantly higher (P < 0.05) than that from ground beef (0.93%, 5 of 540 samples). In contrast, only a single isolate of STEC O157:H7 was recovered; this isolate originated from 1 (0.18%) of 555 samples of ground veal. Recovery rates for STEC were not associated with state, season, packaging type, or store type (P > 0.05) but were associated with brand and fat content (P < 0.05). Pulsed-field subtyping of the 270 viable and confirmed STEC isolates from the 64 total samples testing positive revealed 78 pulsotypes (50 to 80% similarity) belonging to 39 pulsogroups, with ≥90% similarity among pulsotypes within pulsogroups. Multiple isolates from 43 (67.7%) of 64 samples testing positive had an indistinguishable pulsotype. STEC serotypes O26 and O103 were the most prevalent serogroups in beef and veal, respectively. These findings support related findings from regulatory sampling studies over the past decade and confirm that recovery rates for the regulated STEC serogroups are higher for raw veal than for raw beef samples, as was observed in the present study of meat purchased at food retailers in the mid-Atlantic region of the United States.
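As a quick check on the reported comparison, the positive counts given above (19/482 veal cutlets, 39/555 ground veal, 5/540 ground beef) can be run through a pooled two-proportion z test. This is an illustrative recomputation in pure Python, not necessarily the exact test the authors used.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z statistic for comparing x1/n1 vs x2/n2."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Recovery counts reported above: non-O157 STEC positives / samples tested
z_veal_vs_beef = two_proportion_z(39, 555, 5, 540)    # ground veal vs ground beef
z_cutlet_vs_beef = two_proportion_z(19, 482, 5, 540)  # veal cutlets vs ground beef
```

Both z statistics exceed the 1.96 two-sided critical value, consistent with the abstract's P < 0.05 finding that non-O157 STEC recovery is higher from veal than from ground beef.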
Subjects
Escherichia coli Proteins, Red Meat, Shiga-Toxigenic Escherichia coli, Animals, Cattle, Escherichia coli Proteins/genetics, Meat, Mid-Atlantic Region, Serogroup, United States
ABSTRACT
Three tick species capable of transmitting disease-causing pathogens are commonly found parasitizing people and animals in the mid-Atlantic United States: the blacklegged tick (Ixodes scapularis Say), the American dog tick (Dermacentor variabilis [Say]), and the lone star tick (Amblyomma americanum [L.]) (Acari: Ixodidae). The potential risk of pathogen transmission from tick bites acquired at schools in tick-endemic areas is a concern, as school-aged children are a high-risk group for tick-borne disease. Integrated pest management (IPM) is often required in school districts, and continued tick range expansion and population growth will likely necessitate IPM strategies to manage ticks on school grounds. However, an often-overlooked step of tick management is monitoring and assessment of local tick species assemblages to inform the selection of control methodologies. The purpose of this study was to evaluate tick species presence, abundance, and distribution and the prevalence of tick-borne pathogens in both questing ticks and ticks removed from rodent hosts on six school properties in Maryland. Overall, there was extensive heterogeneity in tick species dominance, abundance, and evenness across the field sites. A. americanum and I. scapularis were found on all sites in all years. Overall, A. americanum was the dominant tick species. D. variabilis was collected in limited numbers. Several pathogens were found in both questing ticks and ticks removed from rodent hosts, although prevalence of infection was not consistent between years. Borrelia burgdorferi, Ehrlichia chaffeensis, Ehrlichia ewingii, and "Panola Mountain" Ehrlichia were identified in questing ticks, and B. burgdorferi and Borrelia miyamotoi were detected in trapped Peromyscus spp. mice. B. burgdorferi was the dominant pathogen detected. The impact of tick diversity on IPM of ticks is discussed.
Subjects
Amblyomma, Dermacentor, Ixodes, Tick-Borne Diseases/epidemiology, Animals, Mice, Mid-Atlantic Region/epidemiology, Nymph, Mite and Tick Control
ABSTRACT
Anthracnose fruit rot (AFR) and Botrytis fruit rot (BFR) are primary diseases affecting strawberry (Fragaria × ananassa) and typically drive fungicide applications throughout the growing season. The Strawberry Advisory System (StAS), a disease forecasting tool, was originally developed in Florida to better time fungicide sprays by monitoring AFR and BFR infection risk in real time from leaf wetness and temperature inputs. Thirteen field trials were conducted in Maryland and Virginia between 2017 and 2019 to evaluate StAS performance in the Mid-Atlantic region. On average, 55, 18, and 31% fewer sprays were recorded in the model-based StAS treatment than in the grower standard treatment in 2017, 2018, and 2019, respectively. Marketable yield and AFR and BFR incidence were largely comparable between the two treatments. However, poor disease control occurred in the StAS treatment in four trials in 2017, presumably because a fungicide spray was missed during a high-risk infection event when heavy rainfall made fields impassable. Implementation of the StAS may be further challenged by the use of floating row covers, which are essential for growing strawberries in open-field plasticulture systems in the Mid-Atlantic region. Preliminary results indicated that row covers can alter canopy-level microclimatic conditions, possibly increasing the risk of disease occurrence. Overall, the StAS can be a valuable tool for Mid-Atlantic growers to control AFR and BFR, but sprays may need to be applied promptly when consecutive or heavy rainfalls are predicted, especially for highly susceptible cultivars. Complications in disease forecasting and management arising from the use of row covers need to be further addressed in this region because of its highly diverse climate.
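The reported spray reductions are percent decreases relative to the grower-standard spray count. A minimal sketch of that arithmetic, using hypothetical season totals rather than the trial data:

```python
def pct_fewer_sprays(standard: int, model_based: int) -> float:
    """Percent reduction in spray count relative to the grower standard."""
    return 100.0 * (standard - model_based) / standard

# Hypothetical season totals (not the trial data): 11 standard sprays vs 5 StAS sprays
print(round(pct_fewer_sprays(11, 5), 1))  # → 54.5
```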
Subjects
Fragaria, Industrial Fungicides, Botrytis, Mid-Atlantic Region, Plant Diseases
ABSTRACT
BACKGROUND: COVID-19 is a new pandemic, and its impact by HIV status is unknown. National reporting does not include gender identity; therefore, data are absent on the impact of COVID-19 on transgender people, including those with HIV. Baseline data from the American Cohort to Study HIV Acquisition Among Transgender Women in High Risk Areas (LITE) Study provide an opportunity to examine pre-COVID factors that may increase vulnerability to COVID-19-related harms among transgender women. SETTING: Atlanta, Baltimore, Boston, Miami, New York City, Washington, DC. METHODS: Baseline data from LITE were analyzed for demographic, psychosocial, and material factors that may affect vulnerability to COVID-19-related harms. RESULTS: The 1020 participants had high rates of poverty, unemployment, food insecurity, homelessness, and sex work. Transgender women with HIV (n = 273) were older, more likely to be Black, had lower educational attainment, and were more likely to experience material hardship. Mental and behavioral health symptoms were common and did not differ by HIV status. Barriers to health care included being mistreated, provider discomfort serving transgender women, and past negative experiences, as well as material hardships such as cost and transportation. However, most reported access to material and social support, demonstrating resilience. CONCLUSIONS: Transgender women with HIV may be particularly vulnerable to pandemic harms. Mitigating this harm would benefit everyone, given the highly infectious nature of this coronavirus. Collecting gender identity in COVID-19 data is crucial to inform an effective public health response. Transgender-led organizations' response to this crisis serves as an important model for effective community-led interventions.