Results 1 - 20 of 59
1.
Ann Surg Oncol ; 28(13): 8109-8115, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34115250

ABSTRACT

INTRODUCTION: Improving patient safety and quality are priorities in health care. The study of malpractice cases provides an opportunity to identify areas for quality improvement. While the issues surrounding malpractice cases in breast cancer are often multifactorial, there are few studies providing insight into malpractice cases specifically related to common breast cancer surgical procedures. We sought to characterize the factors in liability cases involving breast cancer surgery. METHODS: Closed cases from 2008 to 2019 involving a breast cancer diagnosis, a primary responsible service of general surgery, surgical oncology, or plastic surgery, and a breast cancer procedure were reviewed using data from the Controlled Risk Insurance Company (CRICO) Strategies Comparative Benchmarking System database, a national repository of professional liability data. RESULTS: A total of 174 malpractice cases were reviewed, of which 41 cases were closed with payment. Plastic surgeons were most commonly named (64%, 111/174), followed by general surgeons (30%, 53/174), and surgical oncologists (6%, 10/174). The most common allegation was error in surgical treatment (87%, 152/174), and infection, cosmetic injury, emotional trauma, foreign body, and nosocomial infections represented the top five injury descriptions. On average, indemnity payments were larger for high clinical severity cases. Technical skills, followed by clinical judgment, were the most commonly named contributing factors. The average payment per case was $130,422. CONCLUSION: Malpractice cases predominantly involve technical complications related to plastic surgery procedures. Better understanding of the malpractice environment involving surgical procedures performed for breast cancer may provide practical insight to guide initiatives aimed at improving patient outcomes.
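
The descriptive analysis in this entry boils down to grouping closed claims by responsible service and computing payment frequency and average indemnity. A minimal sketch of that aggregation is below; the field names and toy records are hypothetical, not the CRICO CBS schema.

```python
# Minimal sketch: aggregate closed malpractice claims by responsible service and
# report case counts, paid-claim counts, and average indemnity per case.
# Field names and the toy records are hypothetical, not CRICO CBS fields.
from collections import defaultdict

claims = [
    {"service": "plastic surgery",   "paid": True,  "indemnity": 250_000},
    {"service": "general surgery",   "paid": False, "indemnity": 0},
    {"service": "surgical oncology", "paid": True,  "indemnity": 90_000},
]

by_service = defaultdict(lambda: {"n": 0, "paid": 0, "indemnity": 0})
for c in claims:
    s = by_service[c["service"]]
    s["n"] += 1
    s["paid"] += c["paid"]
    s["indemnity"] += c["indemnity"]

for service, s in by_service.items():
    print(f"{service}: {s['n']} cases, {s['paid']} paid, "
          f"average indemnity per case ${s['indemnity'] / s['n']:,.0f}")
```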


Subject(s)
Breast Neoplasms , Malpractice , Oncologists , Surgeons , Breast Neoplasms/surgery , Female , Humans , Patient Safety , Retrospective Studies
3.
J Patient Saf ; 17(8): 576-582, 2021 12 01.
Article in English | MEDLINE | ID: mdl-32209947

ABSTRACT

OBJECTIVE: Clinicians may hesitate to advocate for autopsies out of concern for increased malpractice risk if the pathological findings at time of death differ from the clinical findings. We aimed to understand the impact of autopsy findings on malpractice claim outcomes. METHODS: Closed malpractice claims with loss dates between 1995 and 2015 involving death related to inpatient care at 3 Harvard Medical School hospitals were extracted from a captive malpractice insurer's database. These claims were linked to patients' electronic health records and their autopsy reports. Using the Goldman classification system, 2 physician reviewers blinded to claim outcome determined whether there was major, minor, or no discordance between the final clinical diagnoses and pathologic diagnoses. Claims were compared depending on whether an autopsy was performed and whether there was major versus minor/no clinical-pathologic discordance. Primary outcomes included percentage of claims paid through settlement or plaintiff verdict and the amount of indemnity paid, inflation adjusted. RESULTS: Of 293 malpractice claims related to an inpatient death that could be linked to patients' electronic health records, 89 claims (30%) had an autopsy performed by either the hospital or medical examiner. The most common claim allegation was an issue with clinician diagnosis, which was statistically less common in the autopsy group (18% versus 38%, P = 0.001). There was no difference in percentage of claims paid whether an autopsy was performed or not (42% versus 41%, P = 0.90) and no difference in median indemnity of paid claims after adjusting for number of defendants ($1,180,537 versus $906,518, P = 0.15). Thirty-one percent of claims with hospital autopsies performed demonstrated major discordance between autopsy and clinical findings. Claims with major clinical-pathologic discordance also did not have a statistically significant difference in percentage paid (44% versus 41%, P > 0.99) or amount paid ($895,954 versus $1,494,120, P = 0.10) compared with claims with minor or no discordance. CONCLUSIONS: Although multiple factors determine malpractice claim outcome, in this cohort, claims in which an autopsy was performed did not result in more paid outcomes, even when there was major discordance between clinical and pathologic diagnoses.


Subject(s)
Malpractice , Physicians , Autopsy , Databases, Factual , Hospitalization , Humans
4.
mBio ; 11(6)2020 11 17.
Article in English | MEDLINE | ID: mdl-33203758

ABSTRACT

Norovirus infections take a heavy toll on worldwide public health. While progress has been made toward understanding host responses to infection, the role of the gut microbiome in determining infection outcome is unknown. Moreover, data are lacking on the nature and duration of the microbiome response to norovirus infection, which has important implications for diagnostics and host recovery. Here, we characterized the gut microbiomes of subjects enrolled in a norovirus challenge study. We analyzed microbiome features of asymptomatic and symptomatic individuals at the genome (population) and gene levels and assessed their response over time in symptomatic individuals. We show that the preinfection microbiomes of subjects with asymptomatic infections were enriched in Bacteroidetes and depleted in Clostridia relative to the microbiomes of symptomatic subjects. These compositional differences were accompanied by differences in genes involved in the metabolism of glycans and sphingolipids that may aid in host resilience to infection. We further show that microbiomes shifted in composition following infection and that recovery times were variable among human hosts. In particular, Firmicutes increased immediately following the challenge, while Bacteroidetes and Proteobacteria decreased over the same time. Genes enriched in the microbiomes of symptomatic subjects, including the adenylyltransferase glgC, were linked to glycan metabolism and cell-cell signaling, suggesting as-yet unknown roles for these processes in determining infection outcome. These results provide important context for understanding the gut microbiome's role in host susceptibility to symptomatic norovirus infection and long-term health outcomes. IMPORTANCE: The role of the human gut microbiome in determining whether an individual infected with norovirus will be symptomatic is poorly understood. This study provides important data on microbes that distinguish asymptomatic from symptomatic microbiomes and links these features to infection responses in a human challenge study. The results have implications for understanding resistance to and treatment of norovirus infections.
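
A comparison like the Bacteroidetes enrichment reported here could be tested nonparametrically on preinfection relative abundances, as in the sketch below; the abundance values are invented, and the abstract does not specify the authors' statistical pipeline.

```python
# Nonparametric comparison of preinfection Bacteroidetes relative abundance between
# asymptomatic and symptomatic subjects. Abundance values below are hypothetical.
from scipy.stats import mannwhitneyu

asymptomatic = [0.42, 0.38, 0.51, 0.47, 0.44]   # fraction of reads assigned to Bacteroidetes
symptomatic  = [0.29, 0.33, 0.25, 0.31, 0.36]

stat, p = mannwhitneyu(asymptomatic, symptomatic, alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")
```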


Subject(s)
Bacteroidetes/growth & development , Caliciviridae Infections/prevention & control , Firmicutes/growth & development , Gastrointestinal Microbiome , Norovirus/immunology , Proteobacteria/growth & development , Asymptomatic Diseases , Bacteroidetes/genetics , Caliciviridae Infections/immunology , Caliciviridae Infections/virology , Disease Susceptibility , Firmicutes/genetics , Humans , Metagenomics , Proteobacteria/genetics
6.
Acta Psychiatr Scand ; 141(3): 206-220, 2020 03.
Article in English | MEDLINE | ID: mdl-31733146

ABSTRACT

OBJECTIVE: Individual placement and support (IPS) has shown consistently better outcomes on competitive employment for patients with severe mental illness than traditional vocational rehabilitation. The evidence for efficacy originates from only a few countries, and generalization to different countries has been questioned. This has delayed implementation of IPS and led to requests for country-specific RCTs. This meta-analysis examines whether evidence for IPS efficacy can be generalized between rather different countries. METHODS: A systematic search was conducted according to PRISMA guidelines to identify RCTs. Overall efficacy was established by meta-analysis. The generalizability of IPS efficacy between countries was analysed by random-effects meta-regression, employing country- and date-specific contextual data obtained from the OECD and the World Bank. RESULTS: The systematic review identified 27 RCTs. Employment rates are more than doubled in IPS compared with standard vocational rehabilitation (RR 2.07, 95% CI 1.82-2.35). The efficacy of IPS was marginally moderated by strong legal protection against dismissals. It was not moderated by regulation of temporary employment, generosity of disability benefits, type of integration policies, GDP, unemployment rate or employment rate for those with low education. CONCLUSIONS: The evidence for efficacy of IPS is very strong. The efficacy of IPS can be generalized between countries.
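
As a rough illustration of how a pooled risk ratio such as RR 2.07 (95% CI 1.82-2.35) is obtained, the sketch below applies DerSimonian-Laird random-effects pooling to log risk ratios from a few hypothetical trials; it does not reproduce the paper's meta-regression of country-level moderators.

```python
# DerSimonian-Laird random-effects pooling of log risk ratios (illustrative only).
# The per-study risk ratios and standard errors below are hypothetical.
import math

studies = [  # (log RR, standard error of log RR)
    (math.log(2.4), 0.20),
    (math.log(1.8), 0.15),
    (math.log(2.1), 0.25),
]

y = [lr for lr, _ in studies]
w_fixed = [1 / se**2 for _, se in studies]
y_fixed = sum(w * yi for w, yi in zip(w_fixed, y)) / sum(w_fixed)

# Between-study variance (tau^2) via the DerSimonian-Laird moment estimator.
q = sum(w * (yi - y_fixed) ** 2 for w, yi in zip(w_fixed, y))
df = len(studies) - 1
c = sum(w_fixed) - sum(w**2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (q - df) / c)

w_rand = [1 / (se**2 + tau2) for _, se in studies]
y_rand = sum(w * yi for w, yi in zip(w_rand, y)) / sum(w_rand)
se_rand = math.sqrt(1 / sum(w_rand))

rr = math.exp(y_rand)
lo, hi = math.exp(y_rand - 1.96 * se_rand), math.exp(y_rand + 1.96 * se_rand)
print(f"pooled RR = {rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```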


Subject(s)
Employment, Supported/statistics & numerical data , Employment/methods , Mental Disorders/rehabilitation , Asia , Australia , Europe , Humans , North America , Policy , Randomized Controlled Trials as Topic
7.
Diagnosis (Berl) ; 6(3): 227-240, 2019 08 27.
Article in English | MEDLINE | ID: mdl-31535832

ABSTRACT

Background: Diagnostic errors cause substantial preventable harm, but national estimates vary widely from 40,000 to 4 million annually. This cross-sectional analysis of a large medical malpractice claims database was the first phase of a three-phase project to estimate the US burden of serious misdiagnosis-related harms. Methods: We sought to identify diseases accounting for the majority of serious misdiagnosis-related harms (morbidity/mortality). Diagnostic error cases were identified from Controlled Risk Insurance Company (CRICO)'s Comparative Benchmarking System (CBS) database (2006-2015), representing 28.7% of all US malpractice claims. Diseases were grouped according to the Agency for Healthcare Research and Quality (AHRQ) Clinical Classifications Software (CCS) that aggregates the International Classification of Diseases diagnostic codes into clinically sensible groupings. We analyzed vascular events, infections, and cancers (the "Big Three"), including frequency, severity, and settings. High-severity (serious) harms were defined by scores of 6-9 (serious, permanent disability, or death) on the National Association of Insurance Commissioners (NAIC) Severity of Injury Scale. Results: From 55,377 closed claims, we analyzed 11,592 diagnostic error cases [median age 49, interquartile range (IQR) 36-60; 51.7% female]. These included 7379 with high-severity harms (53.0% death). The Big Three diseases accounted for 74.1% of high-severity cases (vascular events 22.8%, infections 13.5%, and cancers 37.8%). In aggregate, the top five from each category (n = 15 diseases) accounted for 47.1% of high-severity cases. The most frequent disease in each category, respectively, was stroke, sepsis, and lung cancer. Causes were disproportionately clinical judgment factors (85.7%) across categories (range 82.0-88.8%). Conclusions: The Big Three diseases account for about three-fourths of serious misdiagnosis-related harms. Initial efforts to improve diagnosis should focus on vascular events, infections, and cancers.
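
The central tabulation described here, restricting to NAIC severity 6-9 and computing the share of high-severity cases in each "Big Three" category, can be sketched as follows; the record fields and example counts are invented for illustration.

```python
# Sketch: filter diagnostic-error claims to high severity (NAIC 6-9) and tally the
# share falling into each "Big Three" category. Fields and example data are hypothetical.
from collections import Counter

cases = [
    {"naic_severity": 8, "ccs_category": "vascular"},
    {"naic_severity": 9, "ccs_category": "cancer"},
    {"naic_severity": 4, "ccs_category": "infection"},   # excluded: not high severity
    {"naic_severity": 7, "ccs_category": "infection"},
    {"naic_severity": 6, "ccs_category": "other"},
]

high_severity = [c for c in cases if 6 <= c["naic_severity"] <= 9]
counts = Counter(c["ccs_category"] for c in high_severity)
n = len(high_severity)

for category in ("vascular", "infection", "cancer"):
    print(f"{category}: {100 * counts[category] / n:.1f}% of high-severity cases")
```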


Subject(s)
Diagnostic Errors/adverse effects , Infections/diagnosis , Malpractice/legislation & jurisprudence , Neoplasms/diagnosis , Vascular Diseases/diagnosis , Cross-Sectional Studies , Databases, Factual , Female , Humans , Male , Middle Aged , United States
8.
Int J Radiat Oncol Biol Phys ; 103(4): 801-808, 2019 03 15.
Article in English | MEDLINE | ID: mdl-30439486

ABSTRACT

PURPOSE: Medical errors in radiation oncology (RO) practice have received significant national attention over the last decade. Medical errors can lead to malpractice cases. Better characterizing these events can educate providers with the goal of improving patient care. METHODS AND MATERIALS: The Controlled Risk Insurance Company Strategies' Comparative Benchmarking System (CBS) represents approximately 30% of all closed US malpractice cases and includes the experience of more than 30 academic hospitals. Registered nurses trained as clinical taxonomy specialists code each case, and individual case-level details are available. Practicing radiation oncologists extracted all closed RO cases from years 2005 to 2014 and subgrouped them by patient allegation category, clinical injury severity, care setting and academic affiliation, disease site and natural history, treatment modality, and contributing factor. Within categories, χ2 tests were used to test for the variables' association with an indemnity payment. RESULTS: RO was the primary service in 102 closed cases (0.2% of all cases in the CBS), accounting for $13,323,578 in indemnity payments (0.1% of all payments in the CBS). The median indemnity payment was $100,000. Head-and-neck and central nervous system tumors accounted for 23.9% and 10.9% of all RO cases, respectively, and 41.3% and 31.4% of all indemnity payments, respectively. Benign diseases and brachytherapy were involved in 12.0% and 15.2% of cases, respectively. Cases involving benign disease (P = .009), treatment of the wrong site (P = .001), or treatment using the wrong dose (P < .001) were all associated with indemnity payments. The top 5 most expensive cases accounted for nearly 80% of all indemnity payments, and all involved head-and-neck, central nervous system, benign, or brachytherapy cases. CONCLUSIONS: We found that although closed malpractice cases involving RO are rare events, certain populations may be overrepresented in closed claims. These data can help inform providers and systems with the goal of ultimately improving patient safety.
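
A within-category χ2 test of the kind the authors describe, testing whether a case attribute is associated with an indemnity payment, can be run on a 2×2 table as sketched below; the counts are hypothetical, not CBS data.

```python
# Chi-square test of association between a case attribute (e.g., wrong-site treatment)
# and whether an indemnity payment was made. The counts below are hypothetical.
from scipy.stats import chi2_contingency

#                 paid   not paid
table = [[  9,       3],   # wrong-site cases
         [ 25,      65]]   # all other cases

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```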


Subject(s)
Malpractice/statistics & numerical data , Radiation Oncology , Benchmarking , Female , Humans , Male , Middle Aged
9.
Acad Emerg Med ; 25(9): 980-986, 2018 09.
Article in English | MEDLINE | ID: mdl-29665190

ABSTRACT

BACKGROUND: Data are lacking on how emergency medicine (EM) malpractice cases with resident involvement differ from cases that do not name a resident. OBJECTIVES: The objective was to compare malpractice case characteristics in cases where a resident is involved (resident case) to cases that do not involve a resident (nonresident case) and to determine factors that contribute to malpractice cases, utilizing EM as a model for malpractice claims across other medical specialties. METHODS: We used data from the Controlled Risk Insurance Company (CRICO) Strategies' division Comparative Benchmarking System (CBS) to analyze open and closed EM cases asserted from 2009 to 2013. The CBS database is a national repository that contains professional liability data on > 400 hospitals and > 165,000 physicians, representing over 30% of all malpractice cases in the United States (>350,000 claims). We compared cases naming residents (either alone or in combination with an attending) to those that did not involve a resident (nonresident cohort). We reported the case statistics, allegation categories, severity scores, procedural data, final diagnoses, and contributing factors. Fisher's exact test or t-test was used for comparisons (alpha set at 0.05). RESULTS: A total of 845 EM cases were identified, of which 732 (87%) did not name a resident (nonresident cases), while 113 (13%) included a resident (resident cases). There were higher total incurred losses for nonresident cases. The most frequent allegation categories in both cohorts were "failure or delay in diagnosis/misdiagnosis" and "medical treatment" (nonsurgical procedures or treatment regimens, e.g., central line placement). Allegation categories of safety and security, patient monitoring, hospital policy and procedure, and breach of confidentiality were found in the nonresident cases. Resident cases incurred lower payments on average ($51,163 vs. $156,212 per case). Sixty-six percent (75) of resident versus 57% (415) of nonresident cases were high-severity claims (permanent, grave disability or death; p = 0.05). Procedures involved were identified in 32% (36) of resident and 26% (188) of nonresident cases (p = 0.17). The final diagnoses in resident cases were more often cardiac related (19% [21] vs. 10% [71], p < 0.005), whereas nonresident cases had more orthopedic-related final diagnoses (10% [72] vs. 3% [3], p < 0.01). The most common contributing factors in resident and nonresident cases were clinical judgment (71% vs. 76% [p = 0.24]), communication (27% vs. 30% [p = 0.46]), and documentation (20% vs. 21% [p = 0.95]). Technical skills contributed to 20% (22) of resident cases versus 13% (96) of nonresident cases (p = 0.07), but those procedures involving vascular access (2.7% [3] vs. 0.1% [1]) and spinal procedures (3.5% [4] vs. 1.1% [8]) were more prevalent in resident cases (p < 0.05 for each). CONCLUSIONS: There are higher total incurred losses in nonresident cases. There are higher severity scores in resident cases. The overall case profiles of resident and nonresident cases, including allegation categories, final diagnoses, and contributing factors, are similar. Cases involving residents are more likely to involve certain technical skills, specifically vascular access and spinal procedures, which may have important implications regarding supervision. Clinical judgment, communication, and documentation are the most prevalent contributing factors in all cases and should be targets for risk reduction strategies.
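
The abstract reports Fisher's exact test for cohort comparisons; as an illustration, the cardiac-related final diagnoses (21 of 113 resident vs. 71 of 732 nonresident cases, counts taken from the abstract) can be tested as below. The exact p-value depends on the test variant used, so this is a sketch rather than a reproduction of the authors' analysis.

```python
# Fisher's exact test on the reported cardiac-related final diagnoses:
# 21/113 resident cases vs. 71/732 nonresident cases (counts from the abstract).
from scipy.stats import fisher_exact

table = [[21, 113 - 21],    # resident: cardiac, non-cardiac
         [71, 732 - 71]]    # nonresident: cardiac, non-cardiac

odds_ratio, p = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p:.4f}")
```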


Subject(s)
Emergency Medicine/statistics & numerical data , Internship and Residency/statistics & numerical data , Malpractice/statistics & numerical data , Medical Staff, Hospital/statistics & numerical data , Case-Control Studies , Databases, Factual , Delayed Diagnosis , Diagnostic Errors , Humans , Retrospective Studies , United States
10.
J Bone Joint Surg Am ; 99(17): e94, 2017 Sep 06.
Article in English | MEDLINE | ID: mdl-28872536

ABSTRACT

BACKGROUND: The purpose of this investigation was to characterize the clinical efficacy and cost-effectiveness of simulation training aimed at reducing cast-saw injuries. METHODS: Third-year orthopaedic residents underwent simulation-based instruction on distal radial fracture reduction, casting, and cast removal using an oscillating saw. The analysis compared incidences of cast-saw injuries and associated costs before and after the implementation of the simulation curriculum. Actual and potential costs associated with cast-saw injuries included wound care, extra clinical visits, and potential total payment (indemnity and expense payments). Curriculum costs were calculated through time-derived, activity-based accounting methods. The researchers compared the costs of cast-saw injuries and the simulation curriculum to determine overall savings and return on investment. RESULTS: In the 2.5 years prior to simulation, cast-saw injuries occurred at a rate of approximately 4.3 per 100 casts cut by orthopaedic residents. For the 2.5-year period post-simulation, the injury rate decreased significantly to approximately 0.7 per 100 casts cut (p = 0.002). The total cost to implement the casting simulation was $2,465.31 per 6-month resident rotation. On the basis of historical data related to cast-saw burns (n = 6), total payments ranged from $2,995 to $25,000 per claim. The anticipated savings from averted cast-saw injuries and associated medicolegal payments in the 2.5 years post-simulation were $27,131, representing an 11-to-1 return on investment. CONCLUSIONS: Simulation-based training for orthopaedic surgical residents was effective in reducing cast-saw injuries and had a high theoretical return on investment. These results support further investment in simulation-based training as a cost-effective means of improving patient safety and clinical outcomes. LEVEL OF EVIDENCE: Therapeutic Level III. See Instructions for Authors for a complete description of levels of evidence.
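
One way the rough 11-to-1 figure can be reproduced is by dividing the reported savings by the per-rotation curriculum cost; the paper's exact accounting over the 2.5-year period may differ, so treat this as an illustrative calculation.

```python
# Illustrative return-on-investment arithmetic using figures quoted in the abstract.
# How the authors aggregated curriculum cost over the study period may differ.
curriculum_cost_per_rotation = 2465.31   # per 6-month resident rotation
anticipated_savings = 27131.0            # averted injuries + medicolegal payments, 2.5 years

roi = anticipated_savings / curriculum_cost_per_rotation
print(f"return on investment: approximately {roi:.1f} to 1")
```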


Subject(s)
Burns/prevention & control , Casts, Surgical , Device Removal/education , Internship and Residency , Orthopedics/education , Simulation Training/economics , Burns/economics , Burns/epidemiology , Controlled Before-After Studies , Cost Savings , Device Removal/adverse effects , Humans , Patient Safety/economics , Radius Fractures/therapy , Retrospective Studies
11.
Clin Exp Immunol ; 184(3): 347-57, 2016 06.
Article in English | MEDLINE | ID: mdl-26822517

ABSTRACT

Noroviruses (NoV) are the most common cause of epidemic gastroenteritis world-wide. NoV infections are often asymptomatic, although individuals still shed large amounts of NoV in their stool. Understanding the differences between asymptomatic and symptomatic individuals would help in elucidating mechanisms of NoV pathogenesis. Our goal was to compare the serum cytokine responses and faecal viral RNA titres of asymptomatic and symptomatic NoV-infected individuals. We tested serum samples from infected subjects (n = 26; 19 symptomatic, seven asymptomatic) from two human challenge studies of GI.1 NoV for 16 cytokines. Samples from prechallenge and days 1-4 post-challenge were tested for these cytokines. Cytokine levels were compared to stool NoV RNA titres quantified previously by reverse transcription-polymerase chain reaction (RT-qPCR). While both symptomatic and asymptomatic groups had similar patterns of cytokine responses, the symptomatic group generally exhibited a greater elevation of T helper type 1 (Th1) and Th2 cytokines and IL-8 post-challenge compared to the asymptomatic group (all P < 0·01). Daily viral RNA titre was associated positively with daily IL-6 concentration and negatively with daily IL-12p40 concentration (all P < 0·05). Symptoms were not associated significantly with daily viral RNA titre, duration of viral shedding or cumulative shedding. Symptomatic individuals, compared to asymptomatic, have greater immune system activation, as measured by serum cytokines, but they do not have greater viral burden, as measured by titre and shedding, suggesting that symptoms may be immune-mediated in NoV infection.


Subject(s)
Gastroenteritis/diagnosis , Interleukin-12 Subunit p40/blood , Interleukin-6/blood , Interleukin-8/blood , Norovirus/immunology , Virus Shedding/immunology , Adolescent , Adult , Asymptomatic Diseases , Feces/chemistry , Feces/virology , Female , Gastroenteritis/immunology , Gastroenteritis/pathology , Gastroenteritis/virology , Host-Pathogen Interactions , Humans , Immunity, Innate , Male , Norovirus/genetics , Norovirus/growth & development , RNA, Viral/genetics , RNA, Viral/immunology , Severity of Illness Index , Th1 Cells/immunology , Th1 Cells/pathology , Th1 Cells/virology , Th1-Th2 Balance , Th2 Cells/immunology , Th2 Cells/pathology , Th2 Cells/virology , Viral Load/immunology
12.
J Appl Microbiol ; 120(2): 509-21, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26535924

ABSTRACT

AIMS: This study investigated waterborne opportunistic pathogens (OPs), including potential hosts, and evaluated the use of Legionella spp. for indicating microbial water quality for OPs within a full-scale operating drinking water distribution system (DWDS). METHODS AND RESULTS: To investigate the occurrence of specific microbial pathogens within a major city DWDS, we examined large-volume (90 l of drinking water) ultrafiltration (UF) concentrates collected from six sites between February 2012 and June 2013. The detection frequency and concentration estimates by qPCR were: Legionella spp. (57%/85 cell equivalents, CE l⁻¹), Mycobacterium spp. (88%/324 CE l⁻¹), Pseudomonas aeruginosa (24%/2 CE l⁻¹), Vermamoeba vermiformis (24%/2 CE l⁻¹) and Acanthamoeba spp. (42%/5 cyst equivalents, CE l⁻¹). There was no detection of the following microorganisms: human faecal indicator Bacteroides (HF183), Salmonella enterica, Campylobacter spp., Escherichia coli O157:H7, Giardia intestinalis, Cryptosporidium spp. or Naegleria fowleri. There were significant correlations between the qPCR signals of Legionella spp. and Mycobacterium spp., and their potential hosts V. vermiformis and Acanthamoeba spp. Sequencing of Legionella spp. demonstrated limited diversity, with most sequences coming from two dominant groups, of which the larger dominant group was an unidentified species. Other known species, including Legionella pneumophila, were detected, but at low frequency. The densities of Legionella spp. and Mycobacterium spp. were generally higher (17- and 324-fold, respectively) at distal sites relative to the entry point to the DWDS. CONCLUSIONS: Legionella spp. occurred, had significant growth and were strongly associated with free-living amoebae (FLA) and Mycobacterium spp., suggesting that Legionella spp. could provide a useful DWDS monitoring role to indicate potential conditions for non-faecal OPs. SIGNIFICANCE AND IMPACT OF THE STUDY: The results provide insight into microbial pathogen detection that may aid in the monitoring of microbial water quality within DWDS prior to customer exposures.
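
Reporting qPCR results as cell equivalents per litre of sampled water requires back-calculating through the ultrafiltration concentration and extraction steps; the generic sketch below shows the idea, with all volumes and the copy number assumed rather than taken from the study.

```python
# Back-calculating a qPCR result to cell equivalents per litre of sampled drinking water.
# All volumes and the copy number below are hypothetical; only the 90 l sample volume
# matches the study description.
copies_per_reaction = 50          # cell equivalents detected in one qPCR reaction
template_per_reaction_ml = 0.005  # 5 µl of DNA extract per reaction
extract_volume_ml = 0.1           # total DNA extract volume
concentrate_volume_ml = 250       # retentate volume after ultrafiltration
extracted_concentrate_ml = 10     # concentrate volume carried into DNA extraction
sample_volume_l = 90              # drinking water volume filtered

ce_total = (copies_per_reaction
            * (extract_volume_ml / template_per_reaction_ml)       # scale to whole extract
            * (concentrate_volume_ml / extracted_concentrate_ml))  # scale to whole concentrate
print(f"{ce_total / sample_volume_l:.0f} CE per litre")
```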


Subject(s)
Amoeba/isolation & purification , Drinking Water/microbiology , Drinking Water/parasitology , Legionella/isolation & purification , Mycobacterium/isolation & purification , Pseudomonas aeruginosa/isolation & purification , Amoeba/classification , Amoeba/genetics , Drinking Water/chemistry , Humans , Legionella/classification , Legionella/genetics , Mycobacterium/classification , Mycobacterium/genetics , Pseudomonas aeruginosa/classification , Pseudomonas aeruginosa/genetics , Water Pollution/analysis , Water Quality
13.
Clin Exp Immunol ; 182(2): 195-203, 2015 Nov.
Article in English | MEDLINE | ID: mdl-26178578

ABSTRACT

Noroviruses (NoV) are the most common cause of epidemic gastroenteritis worldwide. The acute immune response to NoV in humans is poorly understood, hindering research on prevention and treatment. To elucidate the acute immune response and test for cytokine predictors of susceptibility to infection, serum samples from two human NoV challenge studies were tested for 16 cytokines. Subjects who became infected (n = 26) were age-matched with subjects who remained uninfected following NoV challenge (n = 26). Samples were tested from prechallenge and days 1-4 post-challenge. Cytokine responses were compared between infected and uninfected groups. Overall, infected individuals exhibited an elevation in T helper type 1 (Th1) and Th2 cytokines, as well as chemokines interleukin (IL)-8 and monocyte chemoattractant protein (MCP-1), compared to uninfected individuals (all P < 0.05). Most cytokines peaked on day 2 post-challenge in infected subjects, and tumour necrosis factor (TNF)-α, IL-8, and IL-10 remained elevated to day 3. The only cytokine elevated significantly among infected subjects to day 4 post-challenge was IL-10 (P = 0.021). Prechallenge cytokine concentrations were not predictive of infection status post-challenge. There were no significant changes in serum cytokines among NoV-challenged subjects who remained uninfected. These results suggest that NoV infection elicits a Th1-type response, with some Th2 activation. Persistent elevation of IL-10 among infected subjects is consistent with activation of adaptive immune responses, such as B cell expansion, as well as down-regulation of Th1 cytokines. This study presents the first comprehensive description of the acute cytokine response to GI.1 NoV in humans.


Subject(s)
Caliciviridae Infections/immunology , Cytokines/immunology , Gastroenteritis/immunology , Norovirus/immunology , Adult , Caliciviridae Infections/blood , Caliciviridae Infections/virology , Chemokine CCL2/blood , Chemokine CCL2/immunology , Cytokines/blood , Feces/virology , Female , Gastroenteritis/blood , Gastroenteritis/virology , Host-Pathogen Interactions/immunology , Humans , Interleukin-10/blood , Interleukin-10/immunology , Interleukin-8/blood , Interleukin-8/immunology , Male , Norovirus/genetics , Norovirus/physiology , RNA, Viral/genetics , Reverse Transcriptase Polymerase Chain Reaction , Th1 Cells/immunology , Th1 Cells/metabolism , Th2 Cells/immunology , Th2 Cells/metabolism , Time Factors , Tumor Necrosis Factor-alpha/blood , Tumor Necrosis Factor-alpha/immunology , Young Adult
14.
J Med Virol ; 86(12): 2055-64, 2014 Dec.
Article in English | MEDLINE | ID: mdl-24531909

ABSTRACT

Norovirus is the most common cause of acute infectious gastroenteritis, causing approximately 21 million cases annually in the USA. The virus is highly contagious and resistant to decontamination, making outbreaks difficult to control. To facilitate the development of better control methods, this study characterized the viral shedding patterns in stools from subjects experimentally infected with genogroup I or II norovirus. Viral stool titers were determined by quantitative real-time RT-PCR for all stools produced in the first 7 days post-challenge and representative stools through day 35 post-challenge. The shedding titers and disease course were analyzed with respect to virus type, illness, and subject demographics. Infection with GII.2 Snow Mountain (SMV) resulted in more symptoms and a higher frequency of painful symptoms compared to GI.1 Norwalk (NV) infection. However, NV infection produced stool viral titers approximately 2 logs higher than those seen in SMV infections. Both NV and SMV were shed in stools for up to 3 weeks after the resolution of symptoms, but long shedding durations were more common in NV infections. For each challenge virus, shedding titers and patterns were not correlated with subject demographics or clinical course. This is the first study to report shedding dynamics in experimental GII norovirus infection.


Subject(s)
Caliciviridae Infections/pathology , Caliciviridae Infections/virology , Gastroenteritis/pathology , Gastroenteritis/virology , Norwalk virus/isolation & purification , Virus Shedding , Adult , Animals , Feces/virology , Female , Human Experimentation , Humans , Male , Middle Aged , Real-Time Polymerase Chain Reaction , Time Factors , United States , Viral Load , Young Adult
15.
J Appl Microbiol ; 115(1): 310-8, 2013 Jul.
Article in English | MEDLINE | ID: mdl-23617931

ABSTRACT

AIMS: To assess the removal of human adenoviruses (HAdVs) in an advanced wastewater treatment facility and compare two parallel tertiary treatment methods for the removal of HAdVs. METHODS AND RESULTS: Tangential flow ultrafiltration was used to concentrate the water samples, and HAdVs were precipitated by polyethylene glycol. HAdVs were detected by TaqMan real-time PCR only, and HAdV genotype was determined by DNA sequencing. HAdVs were detected in 100% of primary clarification influent, secondary clarification effluent and granular media (GM) filtration effluent samples, but only in 31·2% of membrane filtration (MF) effluent and 41·7% of final effluent (FE) samples, respectively. The average HAdV loads were significantly reduced across the treatment stages, but HAdVs were still present in FE. Comparison of the two parallel treatments (GM vs. MF) showed that MF was technically superior to GM for the removal of HAdVs. CONCLUSIONS: These findings indicate that adenoviruses are not completely removed by the treatment processes. MF is a better treatment for removal of adenoviruses than GM filtration. Because only qPCR was used, the results indicate only the removal of adenovirus DNA and not the infectivity of viruses. SIGNIFICANCE AND IMPACT OF THE STUDY: The presence of HAdVs in FE by qPCR suggests a potential public health risk from exposure to the treated wastewater, and caution should be exercised when using the FE for recreational or water reuse purposes.
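
Removal across treatment stages is conventionally expressed as a log10 reduction value (LRV) relative to the influent; a minimal sketch with hypothetical HAdV concentrations is shown below.

```python
# Log10 reduction values (LRV) for hypothetical HAdV concentrations (genome copies/l)
# at successive treatment stages; the numbers are illustrative, not the study's data.
import math

stages = {
    "primary influent":        1.0e5,
    "secondary effluent":      5.0e3,
    "granular media effluent": 8.0e2,
    "membrane effluent":       2.0e1,
}

influent = stages["primary influent"]
for name, conc in stages.items():
    lrv = math.log10(influent / conc)
    print(f"{name:<24} {conc:9.1e} copies/l   LRV = {lrv:.2f}")
```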


Subject(s)
Adenoviruses, Human/isolation & purification , Real-Time Polymerase Chain Reaction , Waste Disposal, Fluid/methods , Wastewater/virology , Adenoviruses, Human/genetics , Filtration , Georgia
16.
Epidemiol Infect ; 141(8): 1572-84, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23507473

ABSTRACT

Norovirus is a common cause of gastroenteritis in all ages. Typical infections cause viral shedding periods of days to weeks, but some individuals can shed for months or years. Most norovirus risk models do not include these long-shedding individuals, and may therefore underestimate risk. We reviewed the literature for norovirus-shedding duration data and stratified these data into two distributions: regular shedding (mean 14-16 days) and long shedding (mean 105-136 days). These distributions were used to inform a norovirus transmission model that predicts the impact of long shedders. Our transmission model predicts that this subpopulation increases the outbreak potential (measured by the reproductive number) by 50-80%, the probability of an outbreak by 33%, the severity of transmission (measured by the attack rate) by 20%, and transmission duration by 100%. Characterizing and understanding shedding duration heterogeneity can provide insights into community transmission that can be useful in mitigating norovirus risk.
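
In a simple mixture model, the contribution of long shedders to the reproductive number can be written as R0 = beta x (p_long x D_long + (1 - p_long) x D_regular). The sketch below uses the mean shedding durations from the review but an assumed transmission rate and long-shedder fraction, so it illustrates the mechanism rather than the authors' model.

```python
# How a small fraction of long shedders inflates R0 in a simple mixture model:
# R0 = beta * (p_long * D_long + (1 - p_long) * D_regular).
# beta and p_long are assumed; the mean durations come from the review's two strata.
beta = 0.08          # transmissions per shedding-day (assumed)
d_regular = 15.0     # mean regular shedding duration, days (14-16 in the review)
d_long = 120.0       # mean long shedding duration, days (105-136 in the review)

for p_long in (0.0, 0.05, 0.10):
    r0 = beta * (p_long * d_long + (1 - p_long) * d_regular)
    print(f"p_long = {p_long:.2f}  ->  R0 = {r0:.2f}")
```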


Subject(s)
Caliciviridae Infections/epidemiology , Caliciviridae Infections/transmission , Disease Outbreaks , Gastroenteritis/epidemiology , Gastroenteritis/virology , Norovirus/physiology , Caliciviridae Infections/virology , Humans , Models, Biological , Risk Factors , Virus Shedding
17.
Epidemiol Infect ; 140(7): 1161-72, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22444943

ABSTRACT

The purpose of this study was to examine global epidemiological trends in human norovirus (NoV) outbreaks by transmission route and setting, and describe relationships between these characteristics, viral attack rates, and the occurrence of genogroup I (GI) or genogroup II (GII) strains in outbreaks. We analysed data from 902 reverse transcriptase-polymerase chain reaction-confirmed, human NoV outbreaks abstracted from a systematic review of articles published from 1993 to 2011 and indexed under the terms 'norovirus' and 'outbreak'. Multivariate regression analyses demonstrated that foodservice and winter outbreaks were significantly associated with higher attack rates. Foodborne and waterborne outbreaks were associated with multiple strains (GI+GII). Waterborne outbreaks were significantly associated with GI strains, while healthcare-related and winter outbreaks were associated with GII strains. These results identify important trends for epidemic NoV detection, prevention, and control.


Subject(s)
Caliciviridae Infections/epidemiology , Cross Infection/epidemiology , Disease Outbreaks , Norovirus/classification , Basic Reproduction Number , Caliciviridae Infections/virology , Cross Infection/virology , Food/virology , Gastroenteritis/epidemiology , Gastroenteritis/virology , Genotype , Global Health , Humans , Norovirus/genetics , Norovirus/isolation & purification , Risk Factors , Seasons , Water Microbiology
18.
Environ Sci Technol ; 44(22): 8561-6, 2010 Nov 15.
Article in English | MEDLINE | ID: mdl-20968297

ABSTRACT

Contaminants from the soil surrounding drinking water distribution systems are thought to not enter the drinking water when sufficient internal pressure is maintained. Pressure transients may cause short intervals of negative pressure, and the soil near drinking water pipes often contains fecal material due to the proximity of sewage lines, so that a pressure event may cause intrusion of pathogens. This paper presents a risk model for predicting intrusion and dilution of viruses and their transport to consumers. Random entry and dilution of virus was simulated by embedding the hydraulic model into a Monte Carlo simulation. Special attention was given to adjusting for the coincidence of virus presence and use of tap water, as independently occurring short-term events within the longer interval that the virus is predicted to travel in any branch of the distribution system. The probability that a consumer drinks water contaminated with virus is small, but when this happens the virus concentration tends to be high and the risk of infection may be considerable. The spatial distribution of infection risk is highly heterogeneous. The presence of a chlorine residual reduces the infection risk.
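
The coincidence adjustment described here, where a short virus pulse and a short tap-water draw must overlap within the longer window that the contaminated parcel travels the network, can be sketched as a Monte Carlo estimate combined with an exponential dose-response; every parameter value below is an assumption, not a value from the paper.

```python
# Monte Carlo sketch: probability that a short virus pulse at a node overlaps a short
# tap-water draw, combined with an exponential dose-response for infection.
# All parameter values are assumptions for illustration.
import math
import random

random.seed(1)

window_min = 240.0      # travel window of the contaminated parcel (minutes)
pulse_min = 10.0        # duration of the virus pulse at the tap
n_draws = 5             # tap-water draws by the consumer during the window
draw_min = 2.0          # duration of each draw
dose_if_exposed = 3.0   # ingested virus particles when pulse and draw coincide
r = 0.5                 # exponential dose-response parameter

def infected_once() -> bool:
    pulse_start = random.uniform(0.0, window_min - pulse_min)
    for _ in range(n_draws):
        draw_start = random.uniform(0.0, window_min - draw_min)
        overlaps = (draw_start < pulse_start + pulse_min
                    and pulse_start < draw_start + draw_min)
        if overlaps:
            return random.random() < 1.0 - math.exp(-r * dose_if_exposed)
    return False

trials = 100_000
p_inf = sum(infected_once() for _ in range(trials)) / trials
print(f"estimated infection probability per contamination event: {p_inf:.4f}")
```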


Subject(s)
Fresh Water/virology , Sewage/virology , Virus Diseases/epidemiology , Water Pollutants/analysis , Water Supply/analysis , Drainage, Sanitary , Drinking , Environmental Exposure/statistics & numerical data , Humans , Monte Carlo Method , Pressure , Risk Assessment , Risk Factors , Soil Microbiology
19.
J Med Virol ; 79(1): 84-91, 2007 Jan.
Article in English | MEDLINE | ID: mdl-17133557

ABSTRACT

Noroviruses (NoVs) are the most common cause of acute non-bacterial gastroenteritis outbreaks in the US. We investigated 16 gastroenteritis outbreaks in North Carolina (NC), from 1995 to 2000, to further characterize the epidemiology of NoV using RT-PCR on stool and ELISA on sera. NoV were identified in 14 outbreaks by RT-PCR. Sequence analyses of the amplicons indicated the outbreak strains belonged to the following clusters: five GII/4, three GI/3, one GI/4, one GII/2, one GII/5, one GII/7, and one GII/13 (prototype strain). We detected NoV in stool samples from one outbreak but could not determine its specific cluster within the GII genogroup based on polymerase sequence analysis. The five GII/4 strains were classified as the "95/96 US common strain" and occurred throughout the 5-year period. In contrast to national trends, the majority (86%) of NoV outbreaks identified in North Carolina were foodborne. Of the 12 food-related NoV outbreaks, we were able to document transmission by food handlers in two outbreaks. Person-to-person transmission from primary cases was suggested in three outbreaks. Our results indicate that NoVs are important agents of viral gastroenteritis outbreaks in NC.


Subject(s)
Caliciviridae Infections/epidemiology , Disease Outbreaks , Gastroenteritis/epidemiology , Norovirus/classification , Norovirus/genetics , Caliciviridae Infections/virology , Feces/virology , Gastroenteritis/virology , Humans , Sequence Analysis, DNA , United States/epidemiology
20.
J Am Coll Health ; 50(2): 57-66, 2001 Sep.
Article in English | MEDLINE | ID: mdl-11590984

ABSTRACT

Norwalk-like viruses (NLVs) are transmitted by fecally contaminated food, water, fomites, and person-to-person contact. They are a leading cause of acute gastroenteritis epidemics in industrialized countries. NLV outbreaks are characterized by a 12- to 48-hour incubation period; nausea, vomiting, and diarrhea for 24 to 72 hours; and high secondary attack rates. NLV infections spread rapidly on college and university campuses because of close living quarters, shared bathrooms and common rooms, many food handlers, popular self-service salad bars in dining halls, and person-to-person contact through sports and recreational activities. The illness is generally mild and self-limited but an outbreak can strain the resources of campus health services and cause high absenteeism among both students and staff. Treatment is primarily through antiemetic medication and oral rehydration. Prevention and control of NLV outbreaks rests on promoting hand washing; enforcement of strict hygiene in all food preparation areas; and prompt, rigorous cleaning of potentially contaminated areas where someone has been ill.


Subject(s)
Caliciviridae Infections/epidemiology , Disease Outbreaks , Gastroenteritis/epidemiology , Norwalk virus/isolation & purification , Acute Disease , Adolescent , Adult , Antibodies, Viral/blood , Caliciviridae Infections/etiology , Enzyme-Linked Immunosorbent Assay , Feces/virology , Female , Gastroenteritis/etiology , Humans , Male , Norwalk virus/immunology , Reverse Transcriptase Polymerase Chain Reaction , United States/epidemiology , Universities/statistics & numerical data