Results 1 - 19 of 19
1.
Int J Epidemiol ; 53(1), 2024 Feb 01.
Article in English | MEDLINE | ID: mdl-37820050

ABSTRACT

BACKGROUND: Culture-independent diagnostic testing (CIDT) provides rapid results to clinicians and is quickly displacing traditional detection methods. Increased CIDT use and sensitivity likely result in higher case detection but might also obscure infection trends. Severe illness outcomes, such as hospitalization and death, are likely less affected by changes in testing practices and can be used as indicators of the expected case incidence trend had testing methods not changed.

METHODS: Using US Foodborne Diseases Active Surveillance Network data from 1996-2019 and mixed effects quasi-Poisson regression, we estimated the expected yearly incidence for nine enteric pathogens.

RESULTS: After removing the effects of CIDT use, CIDT panel testing, and culture confirmation of CIDT results, the modelled incidence for all but three pathogens (Salmonella, Shigella, STEC O157) was significantly lower than the observed incidence, and the upward trend in Campylobacter was reversed, from an observed 2.8% yearly increase to a modelled yearly change of -2.8% (95% credible interval: -4.0, -1.4).

CONCLUSIONS: Severe outcomes may be useful indicators for evaluating trends in surveillance systems that have undergone a marked change.


Subject(s)
Campylobacter , Foodborne Diseases , Humans , Incidence , Foodborne Diseases/epidemiology , Diagnostic Techniques and Procedures , Hospitalization
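For illustration, a minimal Python sketch of the quasi-Poisson modelling approach this abstract describes. It is a simplification: the study used a mixed effects quasi-Poisson model on FoodNet data, whereas this sketch fits a fixed-effects GLM (dispersion estimated via Pearson chi-square, the usual quasi-Poisson adjustment) on invented data; all names and numbers below are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical yearly surveillance data (not from the study):
# case counts, catchment population, and the share of cases diagnosed
# by culture-independent diagnostic tests (CIDT).
df = pd.DataFrame({
    "year":       np.arange(1996, 2020),
    "cases":      np.random.default_rng(0).poisson(800, 24),
    "population": np.full(24, 5_000_000),
    "cidt_share": np.clip((np.arange(24) - 10) * 0.08, 0, 1),
})

X = sm.add_constant(df[["year", "cidt_share"]])
model = sm.GLM(
    df["cases"], X,
    family=sm.families.Poisson(),
    exposure=df["population"],  # log(population) offset -> models incidence
)
# scale="X2" estimates dispersion from the Pearson chi-square statistic,
# widening the standard errors as a quasi-Poisson model would.
result = model.fit(scale="X2")
print(np.exp(result.params["year"]))  # ~ multiplicative yearly trend
```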
2.
MMWR Morb Mortal Wkly Rep ; 72(26): 701-706, 2023 Jun 30.
Article in English | MEDLINE | ID: mdl-37384552

ABSTRACT

Each year, infections from major foodborne pathogens are responsible for an estimated 9.4 million illnesses, 56,000 hospitalizations, and 1,350 deaths in the United States (1). To evaluate progress toward prevention of enteric infections in the United States, the Foodborne Diseases Active Surveillance Network (FoodNet) conducts surveillance for laboratory-diagnosed infections caused by eight pathogens transmitted commonly through food at 10 U.S. sites. During 2020-2021, FoodNet detected decreases in many infections, attributable to behavioral modifications, public health interventions, and changes in health care-seeking and testing practices during the COVID-19 pandemic. This report presents preliminary estimates of pathogen-specific annual incidences during 2022, compared with average annual incidences during 2016-2018, the reference period for the U.S. Department of Health and Human Services' Healthy People 2030 targets (2). Many pandemic interventions had ended by 2022, leading to a resumption of outbreaks, international travel, and other exposures associated with enteric infections. During 2022, annual incidences of illnesses caused by the pathogens Campylobacter, Salmonella, Shigella, and Listeria were similar to average annual incidences during 2016-2018; however, incidences of Shiga toxin-producing Escherichia coli (STEC), Yersinia, Vibrio, and Cyclospora illnesses were higher. Increasing culture-independent diagnostic test (CIDT) usage likely contributed to increased detection by identifying infections that would have remained undetected before widespread CIDT usage. Reducing pathogen contamination during poultry slaughter and processing of leafy greens requires collaboration among food growers and processors, retail stores, restaurants, and regulators.


Subject(s)
COVID-19 , Foodborne Diseases , Humans , Animals , Incidence , Pandemics , Watchful Waiting , COVID-19/epidemiology , Foodborne Diseases/epidemiology
3.
Emerg Infect Dis ; 29(6): 1183-1190, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37209671

ABSTRACT

Shiga toxin-producing Escherichia coli (STEC) causes acute diarrheal illness. To determine risk factors for non-O157 STEC infection, we enrolled 939 patients and 2,464 healthy controls in a case-control study conducted in 10 US sites. The highest population-attributable fractions for domestically acquired infections were for eating lettuce (39%), eating tomatoes (21%), and eating at a fast-food restaurant (23%). Exposures with population-attributable fractions of 10%-19% included eating at a table service restaurant; eating watermelon; eating chicken, pork, beef, or iceberg lettuce prepared in a restaurant; eating exotic fruit; taking acid-reducing medication; and living or working on, or visiting, a farm. Significant exposures with high individual-level risk (odds ratio >10) among those >1 year of age who did not travel internationally were all related to farm animal environments. To markedly decrease the number of STEC-related illnesses, prevention measures should focus on decreasing contamination of produce and improving the safety of foods prepared in restaurants.


Subject(s)
Escherichia coli Infections , Shiga-Toxigenic Escherichia coli , Animals , Cattle , United States/epidemiology , Escherichia coli Infections/epidemiology , Case-Control Studies , Risk Factors , Diarrhea/epidemiology
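The abstract's population-attributable fractions can be illustrated with the standard case-control formula (Miettinen/Bruzzi: PAF = p_c × (OR − 1)/OR, where p_c is the exposure prevalence among cases). A minimal sketch; the inputs below are invented, chosen only so the output lands near the reported 39% for lettuce.

```python
def paf_case_control(p_exposed_cases: float, odds_ratio: float) -> float:
    """Population-attributable fraction from case-control data:
    exposure prevalence among cases and the (adjusted) odds ratio."""
    return p_exposed_cases * (odds_ratio - 1.0) / odds_ratio

# Hypothetical: 60% of cases ate lettuce, adjusted OR of 2.8.
print(f"{paf_case_control(0.60, 2.8):.2f}")  # 0.39
```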
4.
MMWR Morb Mortal Wkly Rep ; 71(40): 1260-1264, 2022 Oct 07.
Article in English | MEDLINE | ID: mdl-36201372

ABSTRACT

To evaluate progress toward prevention of enteric infections in the United States, the Foodborne Diseases Active Surveillance Network (FoodNet) conducts active population-based surveillance for laboratory-diagnosed infections caused by Campylobacter, Cyclospora, Listeria, Salmonella, Shiga toxin-producing Escherichia coli (STEC), Shigella, Vibrio, and Yersinia at 10 U.S. sites. This report summarizes preliminary 2021 data and describes changes in annual incidence compared with the average annual incidence for 2016-2018, the reference period for the U.S. Department of Health and Human Services' (HHS) Healthy People 2030 goals for some pathogens (1). During 2021, the incidence of infections caused by Salmonella decreased; the incidence of infections caused by Cyclospora, Yersinia, and Vibrio increased; and the incidence of infections caused by other pathogens did not change. As in 2020, behavioral modifications and public health interventions implemented to control the COVID-19 pandemic might have decreased transmission of enteric infections (2). Other factors (e.g., increased use of telemedicine and continued increase in use of culture-independent diagnostic tests [CIDTs]) might have altered their detection or reporting (2). Much work remains to achieve HHS Healthy People 2030 goals, particularly for Salmonella infections, which are frequently attributed to poultry products and produce, and Campylobacter infections, which are frequently attributed to chicken products (3).


Subject(s)
COVID-19 , Foodborne Diseases , Vibrio , Foodborne Diseases/epidemiology , Humans , Incidence , Pandemics , Population Surveillance , Salmonella , United States/epidemiology , Watchful Waiting
5.
MMWR Morb Mortal Wkly Rep ; 70(38): 1332-1336, 2021 Sep 24.
Article in English | MEDLINE | ID: mdl-34555002

ABSTRACT

Foodborne illnesses are a substantial and largely preventable public health problem; before 2020, the incidence of most infections transmitted commonly through food had not declined for many years. To evaluate progress toward prevention of foodborne illnesses in the United States, the Foodborne Diseases Active Surveillance Network (FoodNet) of CDC's Emerging Infections Program monitors the incidence of laboratory-diagnosed infections caused by eight pathogens transmitted commonly through food reported by 10 U.S. sites.* FoodNet is a collaboration among CDC, 10 state health departments, the U.S. Department of Agriculture's Food Safety and Inspection Service (USDA-FSIS), and the Food and Drug Administration. This report summarizes preliminary 2020 data and compares incidence with that during 2017-2019. During 2020, observed incidences of infections caused by enteric pathogens decreased 26% compared with 2017-2019; infections associated with international travel decreased markedly. The extent to which these reductions reflect actual decreases in illness or decreases in case detection is unknown. On March 13, 2020, the United States declared a national emergency in response to the COVID-19 pandemic. After the declaration, state and local officials implemented stay-at-home orders, restaurant closures, school and child care center closures, and other public health interventions to slow the spread of SARS-CoV-2, the virus that causes COVID-19 (1). Federal travel restrictions were declared (1). These widespread interventions, as well as other changes to daily life and hygiene behaviors, including increased handwashing, have likely changed exposures to foodborne pathogens. Other factors, such as changes in health care delivery, health care-seeking behaviors, and laboratory testing practices, might have decreased the detection of enteric infections. As the pandemic continues, surveillance of illness combined with data from other sources might help to elucidate the factors that led to the large changes in 2020; this understanding could lead to improved strategies to prevent illness. To reduce the incidence of these infections, concerted efforts are needed, from farm to processing plant to restaurants and homes. Consumers can reduce their risk of foodborne illness by following safe food-handling and preparation recommendations.


Subject(s)
COVID-19/epidemiology , Food Microbiology/statistics & numerical data , Food Parasitology/statistics & numerical data , Foodborne Diseases/epidemiology , Pandemics , Watchful Waiting , Adolescent , Child , Child, Preschool , Foodborne Diseases/microbiology , Foodborne Diseases/parasitology , Humans , Incidence , Infant , United States/epidemiology
6.
Pediatrics ; 145(3), 2020 Mar.
Article in English | MEDLINE | ID: mdl-32054822

ABSTRACT

BACKGROUND: Most countries use 3-dose pneumococcal conjugate vaccine (PCV) schedules; a 4-dose (3 primary and 1 booster) schedule is licensed for US infants. We evaluated the invasive pneumococcal disease (IPD) breakthrough infection incidence in children receiving 2 vs 3 primary PCV doses with and without booster doses (2 + 1 vs 3 + 1; 2 + 0 vs 3 + 0).

METHODS: We used 2001-2016 Active Bacterial Core surveillance data to identify breakthrough infections (vaccine-type IPD in children receiving ≥1 7-valent pneumococcal conjugate vaccine [PCV7] or 13-valent pneumococcal conjugate vaccine [PCV13] dose) among children aged <5 years. We estimated schedule-specific IPD incidence rates (IRs) per 100 000 person-years and compared incidence by schedule (2 + 1 vs 3 + 1; 2 + 0 vs 3 + 0) using rate differences (RDs) and incidence rate ratios.

RESULTS: We identified 71 PCV7 and 49 PCV13 breakthrough infections among children receiving a schedule of interest. PCV13 breakthrough infection rates were higher in children aged <1 year receiving the 2 + 0 (IR: 7.8) vs 3 + 0 (IR: 0.6) schedule (incidence rate ratio: 12.9; 95% confidence interval: 4.1-40.4); PCV7 results were similar. Differences in PCV13 breakthrough infection rates by schedule in children aged <1 year were larger in 2010-2011 (2 + 0 IR: 18.6; 3 + 0 IR: 1.4; RD: 16.6) vs 2012-2016 (2 + 0 IR: 3.6; 3 + 0 IR: 0.2; RD: 3.4). No differences between schedules were detected in children aged ≥1 year for PCV13 breakthrough infections.

CONCLUSIONS: Fewer PCV breakthrough infections occurred in the first year of life with 3 primary doses. Differences in breakthrough infection rates by schedule decreased as vaccine serotypes decreased in circulation.


Subject(s)
Heptavalent Pneumococcal Conjugate Vaccine , Pneumococcal Infections/epidemiology , Pneumococcal Infections/prevention & control , Pneumococcal Vaccines , Child, Preschool , Female , Humans , Incidence , Infant , Male , Treatment Failure , United States/epidemiology
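The rate comparisons reported here reduce to a standard incidence rate ratio with a large-sample log-scale confidence interval. A minimal sketch; the counts and person-time are invented, chosen only to echo the reported 2 + 0 vs 3 + 0 rates (7.8 vs 0.6 per 100 000 person-years).

```python
import math

def irr_ci(cases1, pt1, cases0, pt0, z=1.96):
    """Incidence rate ratio of group 1 vs group 0 with a Wald 95% CI."""
    irr = (cases1 / pt1) / (cases0 / pt0)
    se_log = math.sqrt(1 / cases1 + 1 / cases0)  # log-scale standard error
    lo, hi = (irr * math.exp(s * z * se_log) for s in (-1, 1))
    return irr, lo, hi

# Hypothetical: 26 breakthrough infections vs 2, each over 333,000
# person-years, i.e. rates of ~7.8 and ~0.6 per 100,000.
print(irr_ci(26, 333_000, 2, 333_000))
```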
7.
Open Forum Infect Dis ; 7(2): ofaa030, 2020 Feb.
Article in English | MEDLINE | ID: mdl-32099844

ABSTRACT

BACKGROUND: Shigella causes an estimated 500 000 enteric illnesses in the United States annually, but the association with socioeconomic factors is unclear.

METHODS: We examined possible epidemiologic associations between shigellosis and poverty using 2004-2014 Foodborne Diseases Active Surveillance Network (FoodNet) data. Shigella cases (n = 21 246) were geocoded, linked to Census tract data from the American Community Survey, and categorized into 4 poverty and 4 crowding strata. For each stratum, we calculated incidence by sex, age, race/ethnicity, and FoodNet site. Using negative binomial regression, we estimated incidence rate ratios (IRRs) comparing the highest to lowest stratum.

RESULTS: Annual FoodNet Shigella incidence per 100 000 population was higher among children <5 years old (19.0), blacks (7.2), and Hispanics (5.6) and was associated with Census tract poverty (incidence rate ratio [IRR], 3.6; 95% confidence interval [CI], 3.5-3.8) and household crowding (IRR, 1.8; 95% CI, 1.7-1.9). The association with poverty was strongest among children and persisted regardless of sex, race/ethnicity, or geographic location. After controlling for demographic variables, the association between shigellosis and poverty remained significant (IRR, 2.3; 95% CI, 2.0-2.6).

CONCLUSIONS: In the United States, Shigella infections are epidemiologically associated with poverty, and increased incidence rates are observed among young children, blacks, and Hispanics.
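A minimal sketch of the negative binomial rate-ratio model the methods describe: case counts by census-tract poverty stratum, with population as exposure via a log offset. The strata, rates, and populations below are invented; the real analysis used geocoded FoodNet cases linked to American Community Survey data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
strata = ["<5%", "5-9.9%", "10-19.9%", ">=20%"]
df = pd.DataFrame({
    "poverty":    np.repeat(strata, 11),  # 11 surveillance years, 2004-2014
    "year":       np.tile(np.arange(2004, 2015), 4),
    "population": np.repeat([12.0e6, 11.0e6, 13.0e6, 11.9e6], 11),
})
rate = df["poverty"].map({"<5%": 1.8, "5-9.9%": 2.4,
                          "10-19.9%": 3.5, ">=20%": 6.5})  # per 100,000
df["cases"] = rng.poisson(rate / 1e5 * df["population"])

# Exponentiated coefficients are incidence rate ratios vs the "<5%" stratum.
model = smf.glm(
    "cases ~ C(poverty, Treatment('<5%'))",
    data=df,
    family=sm.families.NegativeBinomial(),
    offset=np.log(df["population"]),
)
print(np.exp(model.fit().params))
```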

8.
J Infect Dis ; 222(8): 1405-1412, 2020 Sep 14.
Article in English | MEDLINE | ID: mdl-31758182

ABSTRACT

BACKGROUND: The relationships between socioeconomic status and domestically acquired salmonellosis and leading Salmonella serotypes are poorly understood.

METHODS: We analyzed surveillance data from laboratory-confirmed cases of salmonellosis from 2010-2016 for all 10 Foodborne Disease Active Surveillance Network (FoodNet) sites, having a catchment population of 47.9 million. Case residential data were geocoded, linked to census tract poverty level, and then categorized into 4 groups according to census tract poverty level. After excluding those reporting international travel before illness onset, age-specific and age-adjusted salmonellosis incidence rates were calculated for each census tract poverty level, overall and for each of the 10 leading serotypes.

RESULTS: Of 52 821 geocodable Salmonella infections (>96%), 48 111 (91.1%) were domestically acquired. Higher age-adjusted incidence occurred with higher census tract poverty level (P < .001; relative risk for highest [≥20%] vs lowest [<5%] census tract poverty level, 1.37). Children <5 years old had the highest relative risk (2.07). Although this relationship was consistent by race/ethnicity and by serotype, it was not present in 5 FoodNet sites or among those aged 18-49 years.

CONCLUSION: Children and older adults living in higher-poverty census tracts have had a higher incidence of domestically acquired salmonellosis. There is a need to understand socioeconomic status differences in risk factors for domestically acquired salmonellosis by age group and FoodNet site to help focus prevention efforts.


Subject(s)
Community Networks/statistics & numerical data , Foodborne Diseases/epidemiology , Poverty/statistics & numerical data , Salmonella Infections/epidemiology , Censuses , Community Networks/organization & administration , Foodborne Diseases/microbiology , Humans , Incidence , Population Surveillance , Risk Factors , Salmonella/classification , Salmonella/isolation & purification , Salmonella Infections/microbiology , Serogroup , United States/epidemiology
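The "age-adjusted" rates here come from direct standardization: a weighted sum of age-specific rates using fixed standard-population weights. A minimal sketch with invented counts, person-years, and weights (in practice, US 2000 standard-population weights are the conventional choice).

```python
age_groups  = ["<5", "5-17", "18-49", "50+"]
std_weights = [0.069, 0.189, 0.465, 0.277]  # illustrative; must sum to 1

def age_adjusted_rate(cases, person_years, weights, per=100_000):
    """Directly standardized rate: weighted sum of age-specific rates."""
    return per * sum(w * c / py
                     for w, c, py in zip(weights, cases, person_years))

high_poverty = age_adjusted_rate([310, 240, 520, 410],
                                 [800_000, 2_100_000, 5_000_000, 3_200_000],
                                 std_weights)
low_poverty  = age_adjusted_rate([150, 200, 640, 380],
                                 [900_000, 2_600_000, 6_400_000, 3_900_000],
                                 std_weights)
print(round(high_poverty / low_poverty, 2))  # age-adjusted relative risk
```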
9.
Open Forum Infect Dis ; 5(7): ofy148, 2018 Jul.
Article in English | MEDLINE | ID: mdl-30568988

ABSTRACT

BACKGROUND: The relationship between socioeconomic status and Shiga toxin-producing Escherichia coli (STEC) is not well understood. However, recent studies in Connecticut and New York City found that as census tract poverty (CTP) decreased, rates of STEC increased. To explore this nationally, we analyzed surveillance data from laboratory-confirmed cases of STEC from 2010-2014 for all Foodborne Disease Active Surveillance Network (FoodNet) sites, population 47.9 million.

METHODS: Case residential data were geocoded and linked to CTP level (2010-2014 American Community Survey). Relative rates were calculated comparing incidence in census tracts with <20% of residents below poverty with those with ≥20%. Relative rates of age-adjusted 5-year incidence per 100 000 population were determined for all STEC cases, for hospitalized cases only, and for hemolytic-uremic syndrome (HUS) cases, overall and by demographic features, FoodNet site, and surveillance year.

RESULTS: There were 5234 cases of STEC; 26.3% were hospitalized, and 5.9% had HUS. Five-year incidence was 10.9/100 000 population. Relative STEC rates for the <20% compared with the ≥20% CTP group were >1.0 for each age group, FoodNet site, surveillance year, and race/ethnic group except Asian. Relative hospitalization and HUS rates tended to be higher than their respective STEC relative rates.

CONCLUSIONS: Persons living in lower-poverty census tracts were at higher risk of STEC than those in the highest-poverty census tracts. This is unlikely to be due to health care-seeking or diagnostic bias, as it also applies to the analyses limited to hospitalized and HUS cases. Research is needed to better understand exposure differences between people living in the lower- vs highest-poverty census tracts to help direct prevention efforts.

10.
J Innov Health Inform ; 24(2): 940, 2017 Jun 23.
Article in English | MEDLINE | ID: mdl-28749317

ABSTRACT

BACKGROUND: The purpose of this literature review is to understand geographical information systems (GIS) and how they can be applied to public health informatics, medical informatics, and epidemiology.

METHOD: Relevant papers reflecting the use of geographical information systems (GIS) in health research were identified from four academic databases: Academic Search Complete, BioMed Central, PubMed Central, and Scholars Portal, as well as Google Scholar. The search strategy was to identify articles with "geographic information systems", "GIS", "public health", "medical informatics", "epidemiology", and "health geography" as main subject headings or as text words in titles and abstracts. Papers published between 1997 and 2014 were considered, and a total of 39 articles were included to inform the authors on the use of GIS technologies in health informatics research.

RESULTS: The main applications of GIS in health informatics and epidemiology include disease surveillance, health risk analysis, health access and planning, and community health profiling. GIS technologies can significantly improve quality and efficiency in health research, as substantial connections can be made between a population's health and their geographical location.

CONCLUSIONS: Gains in health informatics can be made when GIS are applied through research; however, improvements are needed in the quantity and quality of data input for these systems so that better geographical health maps can be produced and sound conclusions drawn about the relationships between public health and environmental factors.


Subject(s)
Geographic Information Systems/statistics & numerical data , Medical Informatics , Public Health Informatics , Research , Humans , Population Health
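One core GIS operation behind the applications this review lists (disease surveillance, community health profiling) is assigning case locations to administrative areas. A minimal geopandas sketch; the file names and the GEOID column are placeholders, not from the review.

```python
import geopandas as gpd

tracts = gpd.read_file("census_tracts.shp")    # polygon layer (placeholder)
cases  = gpd.read_file("case_points.geojson")  # point layer (placeholder)
cases  = cases.to_crs(tracts.crs)              # align coordinate systems

# Point-in-polygon spatial join: each case inherits its tract's attributes.
joined = gpd.sjoin(cases, tracts, how="left", predicate="within")

# Cases per tract, ready to merge with population for incidence mapping.
print(joined.groupby("GEOID").size().rename("n_cases").head())
```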
11.
Transplantation ; 101(9): 2115-2119, 2017 Sep.
Article in English | MEDLINE | ID: mdl-28333861

ABSTRACT

BACKGROUND: The waiting list for kidney transplantation is long. The creation of "vouchers" for future kidney transplants enables living donation to occur when optimal for the donor and transplantation to occur later, when and if needed by the recipient.

METHODS: The donation of a kidney at a time that is optimal for the donor generates a "voucher" that only a specified recipient may redeem later when needed. The voucher provides the recipient with priority in being matched with a living donor from the end of a future transplantation chain. Besides its use in persons of advancing age with a limited window for donation, vouchers remove a disincentive to kidney donation, namely, a reluctance to donate now lest one's family member should need a transplant in the future.

RESULTS: We describe the first three voucher cases, in which advancing age might otherwise have deprived the donors of the opportunity to provide a kidney to a family member. These 3 voucher donations functioned in a nondirected fashion and triggered 25 transplants through kidney paired donation across the United States.

CONCLUSIONS: The provision of a voucher to potential recipients whose need for a transplant makes them "chronologically incompatible" with their donors may increase the number of living donor transplants.


Subject(s)
Delivery of Health Care, Integrated , Directed Tissue Donation , Donor Selection , Kidney Diseases/surgery , Kidney Transplantation/methods , Living Donors/supply & distribution , Time-to-Treatment , Transplant Recipients , Waiting Lists , Age Factors , Child , Child, Preschool , Delivery of Health Care, Integrated/organization & administration , Disease Progression , Donor Selection/organization & administration , Female , Humans , Kidney Diseases/diagnosis , Male , Middle Aged , Time Factors , Treatment Outcome
12.
Clin Infect Dis ; 63(suppl 4): S221-S226, 2016 Dec 01.
Article in English | MEDLINE | ID: mdl-27838676

ABSTRACT

BACKGROUND: Infants are at greatest risk for severe pertussis. In 2006, the Advisory Committee on Immunization Practices recommended that adolescents and adults, especially those with infant contact, receive a single dose of Tdap (tetanus toxoid, reduced diphtheria toxoid, and acellular pertussis vaccine). To assess the effectiveness of cocooning, we conducted a case-control evaluation of infant close contacts.

METHODS: Pertussis cases aged <2 months with onset between 1 January 2011 and 31 December 2011 were identified in Emerging Infections Program Network sites. For each case, we recruited 3 controls from birth certificates and interviewed identified adult close contacts (CCs) or parents of CCs aged <18 years. Pertussis vaccination was verified through medical providers and/or immunization registries.

RESULTS: Forty-two cases were enrolled, with 154 matched controls. Around enrolled infants, 859 CCs were identified (600 adult and 259 non-adult). An average of 5.4 CCs was identified per case and 4.1 CCs per control. Five hundred fifty-four (64.5%) CCs were enrolled (371 adult and 183 non-adult CCs); 119 (32.1% of enrolled) adult CCs had received Tdap. The proportion of Tdap-vaccinated adult CCs was similar between cases and controls (P = .89). The 600 identified adult CCs comprised 172 potential cocoons; 71 (41.3%) potential cocoons had all identified adult CCs enrolled. Of these, 9 were fully vaccinated and 43.7% contained no Tdap-vaccinated adults. The proportion of fully vaccinated case (4.8%) and control (10.0%) cocoons was similar (P = .43).

CONCLUSIONS: Low Tdap coverage among adult CCs reinforces the difficulty of implementing the cocooning strategy and the importance of vaccination during pregnancy to prevent infant pertussis.


Subject(s)
Diphtheria-Tetanus-Pertussis Vaccine/immunology , Vaccination , Whooping Cough/prevention & control , Adult , Case-Control Studies , Child , Child, Preschool , Diphtheria-Tetanus-Pertussis Vaccine/administration & dosage , Female , Humans , Infant , Infant, Newborn , Male , Outcome Assessment, Health Care , Population Surveillance , United States/epidemiology
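Comparisons like "the proportion of Tdap-vaccinated adult CCs was similar between cases and controls (P = .89)" are 2 × 2 table tests. A minimal sketch using Fisher's exact test; the counts below are invented, not the study's.

```python
from scipy.stats import fisher_exact

#            Tdap+  Tdap-
table = [[28,  62],   # adult close contacts of case infants (hypothetical)
         [91, 190]]   # adult close contacts of control infants (hypothetical)
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.2f}")
```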
13.
BMC Infect Dis ; 16: 354, 2016 Jul 22.
Article in English | MEDLINE | ID: mdl-27450432

ABSTRACT

BACKGROUND: Campylobacter is a leading cause of foodborne illness in the United States. Campylobacter infections have been associated with individual risk factors, such as the consumption of poultry and raw milk. Recently, a Maryland-based study identified community socioeconomic and environmental factors that are also associated with campylobacteriosis rates. However, no previous studies have evaluated the association between community risk factors and campylobacteriosis rates across multiple U.S. states.

METHODS: We obtained Campylobacter case data (2004-2010; n = 40,768) from the Foodborne Diseases Active Surveillance Network (FoodNet) and socioeconomic and environmental data from the 2010 Census of Population and Housing, the 2011 American Community Survey, and the 2007 U.S. Census of Agriculture. We linked data by zip code and derived incidence rate ratios using negative binomial regression models.

RESULTS: Community socioeconomic and environmental factors were associated with both lower and higher campylobacteriosis rates. Zip codes with higher percentages of African Americans had lower rates of campylobacteriosis (incidence rate ratio [IRR] = 0.972; 95% confidence interval [CI] = 0.970, 0.974). In Georgia, Maryland, and Tennessee, three leading broiler chicken producing states, zip codes with broiler operations had incidence rates that were 22% (IRR = 1.22; 95% CI = 1.03, 1.43), 16% (IRR = 1.16; 95% CI = 0.99, 1.37), and 35% (IRR = 1.35; 95% CI = 1.18, 1.53) higher, respectively, than those of zip codes without broiler operations. In Minnesota and New York FoodNet counties, two top dairy producing areas, zip codes with dairy operations had significantly higher campylobacteriosis incidence rates (IRR = 1.37; 95% CI = 1.22, 1.55; IRR = 1.19; 95% CI = 1.04, 1.36).

CONCLUSIONS: Community socioeconomic and environmental factors are important to consider when evaluating the relationship between possible risk factors and Campylobacter infection.


Subject(s)
Campylobacter Infections/epidemiology , Foodborne Diseases/epidemiology , Poultry Products/poisoning , Adolescent , Adult , Aged , Aged, 80 and over , Animal Husbandry , Animals , Campylobacter Infections/etiology , Chickens , Child , Child, Preschool , Environment , Female , Foodborne Diseases/etiology , Health Surveys , Humans , Incidence , Infant , Infant, Newborn , Male , Middle Aged , Models, Statistical , Public Health Surveillance , Residence Characteristics , Risk Factors , Socioeconomic Factors , United States/epidemiology , Young Adult
14.
Pediatr Nephrol ; 30(5): 855-8, 2015 May.
Article in English | MEDLINE | ID: mdl-25750074

ABSTRACT

BACKGROUND: Kidney transplantation is the treatment of choice for end-stage renal disease. However, since pediatric patients have long projected life-years, it is also optimal for them to receive well-matched transplants to minimize long-term sensitization. In North America, pediatric kidney transplantation is largely dependent upon the use of deceased donor organs, making it challenging to identify timely, well-matched transplants. Pediatric recipients may have willing living donors who are either HLA- or ABO-incompatible (ABOi); therefore, one solution is to utilize ABOi transplants and paired exchange programs to enhance HLA matching and living donation.

CASE-DIAGNOSIS/TREATMENT: We adopted this approach for a highly sensitized patient with cPRA 90%, who received a successful ABOi paired exchange transplant. The recipient received pre-transplant immunomodulation until an acceptable isohemagglutinin titer <1:8 was reached before transplantation. The patient was induced with anti-thymocyte globulin and maintained on steroid-based triple immunosuppression. Eighteen-month allograft function is excellent, with an estimated glomerular filtration rate (eGFR) of 83.53 ml/min/1.73 m². The patient did not develop de novo donor-specific HLA antibodies or have any episodes of acute rejection.

CONCLUSIONS: This case highlights the safety and efficacy of using paired exchange in combination with ABOi transplants in pediatric kidney transplantation to optimize HLA matching, minimize wait times, and enhance allograft survival.


Subject(s)
Blood Group Incompatibility/immunology , Histocompatibility Testing/methods , Kidney Transplantation/methods , ABO Blood-Group System , Child , Humans , Living Donors , Male
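An aside on the reported eGFR of 83.53 ml/min/1.73 m²: the case report does not state which equation was used, but pediatric eGFR is commonly estimated with the bedside Schwartz equation. A sketch under that assumption, with invented inputs chosen only to land near the reported value.

```python
def bedside_schwartz_egfr(height_cm: float, scr_mg_dl: float) -> float:
    """Bedside Schwartz (2009): eGFR in mL/min/1.73 m^2
    = 0.413 * height (cm) / serum creatinine (mg/dL)."""
    return 0.413 * height_cm / scr_mg_dl

print(round(bedside_schwartz_egfr(140, 0.69), 1))  # ~83.8; hypothetical inputs
```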
15.
Transpl Int ; 27(11): 1175-82, 2014 Nov.
Article in English | MEDLINE | ID: mdl-25052215

ABSTRACT

The disparity between kidney transplant candidates and donors necessitates innovations to increase organ availability. Transporting kidneys allows living donors and recipients to undergo surgery with a familiar transplant team, city, friends, and family. The effect of shipping kidneys, and of the resulting prolonged cold ischemia time (CIT), on living donor transplantation outcomes is not clearly known. This retrospective matched (age, gender, race, and year of procedure) cohort study compared allograft outcomes for shipped and nonshipped living donor kidney transplants. Fifty-seven shipped live donor kidneys were transplanted from 31 institutions in 26 cities. The mean shipping distance was 1634 miles (range 123-2811), with mean CIT of 12.1 ± 2.8 h. The incidence of delayed graft function in the shipped cohort was 1.8% (1/57) compared to 0% (0/57) in the nonshipped cohort. The 1-year allograft survival was 98% in both cohorts. There were no significant differences between the mean serum creatinine values or the rates of serum creatinine decline in the immediate postoperative period, even after adjusting for gender and differences in recipient and donor BMI. Despite prolonged CITs, outcomes for shipped live donor kidney transplants were similar to those of matched nonshipped living donor kidney transplants.


Subject(s)
Kidney Transplantation , Living Donors , Tissue and Organ Procurement , Adult , Cohort Studies , Cold Ischemia , Creatinine/blood , Delayed Graft Function , Female , Graft Survival , Humans , Male , Middle Aged , Retrospective Studies , Transportation , Unrelated Donors
16.
Kidney Int ; 84(5): 1009-16, 2013 Nov.
Article in English | MEDLINE | ID: mdl-23715120

ABSTRACT

Incompatible donor/recipient pairs with broadly sensitized recipients have difficulty finding a crossmatch-compatible match, despite a large kidney paired donation pool. One approach to this problem is to combine kidney paired donation with lower-risk crossmatch-incompatible transplantation with intravenous immunoglobulin. Whether this strategy is non-inferior compared with transplantation of sensitized patients without donor-specific antibody (DSA) is unknown. Here we used a protocol including a virtual crossmatch to identify acceptable crossmatch-incompatible donors and the administration of intravenous immunoglobulin to transplant 12 HLA-sensitized patients (median calculated panel reactive antibody 98%) with allografts from our kidney paired donation program. This group constituted the DSA(+) kidney paired donation group. We compared rates of rejection and survival between the DSA(+) kidney paired donation group with a similar group of 10 highly sensitized patients (median calculated panel reactive antibody 85%) that underwent DSA(-) kidney paired donation transplantation without intravenous immunoglobulin. At median follow-up of 22 months, the DSA(+) kidney paired donation group had patient and graft survival of 100%. Three patients in the DSA(+) kidney paired donation group experienced antibody-mediated rejection. Patient and graft survival in the DSA(-) kidney paired donation recipients was 100% at median follow-up of 18 months. No rejection occurred in the DSA(-) kidney paired donation group. Thus, our study provides a clinical framework through which kidney paired donation can be performed with acceptable outcomes across a crossmatch-incompatible transplant.


Subject(s)
Graft Rejection/immunology , Graft Rejection/prevention & control , Graft Survival/drug effects , HLA Antigens/immunology , Histocompatibility , Immunoglobulins, Intravenous/therapeutic use , Isoantibodies/blood , Kidney Transplantation/adverse effects , Living Donors , Adult , Aged , Female , Graft Rejection/mortality , Histocompatibility Testing , Humans , Kidney Transplantation/mortality , Male , Middle Aged , Retrospective Studies , Risk Factors , Time Factors , Treatment Outcome , Waiting Lists
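On the calculated panel reactive antibody (cPRA) values quoted here: cPRA approximates the probability that a random donor carries at least one antigen unacceptable to the recipient. Real cPRA calculators use HLA haplotype frequencies; the independence-based version and antigen frequencies below are a deliberate simplification for illustration.

```python
import numpy as np

# Hypothetical population frequencies of the recipient's unacceptable antigens.
unacceptable_freqs = np.array([0.45, 0.30, 0.25, 0.20, 0.15])

# P(at least one unacceptable antigen) = 1 - P(none), assuming independence.
cpra = 1 - np.prod(1 - unacceptable_freqs)
print(f"cPRA ~ {cpra:.0%}")  # ~80% for these made-up frequencies
```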
17.
Emerg Infect Dis ; 18(12): 1929-36, 2012 Dec.
Article in English | MEDLINE | ID: mdl-23171627

ABSTRACT

Salmonellosis is usually associated with foodborne transmission. To identify risk from animal contact, we compared animal exposures of case-patients infected with bovine-associated Salmonella subtypes with those of control-patients infected with non-bovine-associated subtypes. We used data collected in New York and Washington, USA, from March 1, 2008, through March 1, 2010. Contact with farm animals during the 5 days before illness onset was significantly associated with being a case-patient (odds ratio 3.2, p = 0.0008), after consumption of undercooked ground beef and unpasteurized milk were controlled for. Contact with cattle specifically was also significantly associated with being a case-patient (odds ratio 7.4, p = 0.0002), after food exposures were controlled for. More cases of bovine-associated salmonellosis in humans might result from direct contact with cattle, as opposed to ingestion of foods of bovine origin, than previously recognized. Efforts to control salmonellosis should include a focus on transmission routes other than foodborne.


Subject(s)
Salmonella Infections/transmission , Salmonella/isolation & purification , Adult , Animals , Animals, Domestic , Case-Control Studies , Cattle , Female , Humans , Male , Meat/microbiology , Middle Aged , Milk/microbiology , New York/epidemiology , Odds Ratio , Risk Factors , Salmonella/classification , Salmonella Infections/epidemiology , Serotyping , Washington/epidemiology , Young Adult
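Adjusted odds ratios of the kind reported here ("after food exposures were controlled for") typically come from logistic regression. A minimal sketch; the data frame is random noise, so the fitted ORs will hover near 1 and serve only to show the mechanics.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 600
df = pd.DataFrame({
    "case":         rng.integers(0, 2, n),  # 1 = bovine-associated subtype
    "farm_contact": rng.integers(0, 2, n),
    "ground_beef":  rng.integers(0, 2, n),
    "raw_milk":     rng.integers(0, 2, n),
})

result = smf.logit("case ~ farm_contact + ground_beef + raw_milk",
                   data=df).fit(disp=False)
print(np.exp(result.params))  # exponentiated coefficients = adjusted ORs
```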
18.
Foodborne Pathog Dis ; 9(9): 796-802, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22870888

ABSTRACT

The objective of this study was to identify patient symptoms and case outcomes that were more likely to occur as a result of Salmonella infections caused by bovine-associated subtypes (isolates that matched contemporary bovine isolates from New York by serovar and pulsed-field gel electrophoresis pattern), as compared to salmonellosis caused by non-bovine-associated subtypes. Data were collected in 34 counties of New York that comprise the Foodborne Diseases Active Surveillance Network (FoodNet) catchment area of the Centers for Disease Control and Prevention Emerging Infections Program. Patients with specimen collection dates between March 1, 2008 and March 1, 2010 were included. Symptoms and outcomes of 40 cases infected with bovine-associated Salmonella subtypes were compared to those of 379 control-cases infected with Salmonella isolates that were not bovine-associated. Cases were significantly more likely to have invasive salmonellosis (odds ratio, 3.8; p-value=0.02), after adjusting for age group, gender, and race. In addition, there was a marginal association between case status and the presence of blood in the stool (p-value=0.1) while ill. These findings might have implications for patient management, as a history of consuming undercooked foods of bovine origin or having direct contact with cattle in the few days prior to illness could be useful for suggesting a more proactive diagnostic approach as well as close monitoring for the need to implement more aggressive therapy.


Subject(s)
Cattle/microbiology , Salmonella Infections/microbiology , Salmonella Infections/physiopathology , Salmonella/classification , Animals , Bacteremia/microbiology , Bacteremia/physiopathology , Bacteremia/therapy , Case-Control Studies , Electrophoresis, Gel, Pulsed-Field , Female , Humans , Male , Melena/etiology , New York , Public Health Surveillance , Salmonella/isolation & purification , Salmonella Food Poisoning/microbiology , Salmonella Food Poisoning/physiopathology , Salmonella Food Poisoning/therapy , Salmonella Infections/therapy , Salmonella Infections, Animal/microbiology , Serotyping , Severity of Illness Index , Surveys and Questionnaires , Treatment Outcome , Zoonoses/microbiology
19.
Arch Surg ; 146(4): 453-8, 2011 Apr.
Article in English | MEDLINE | ID: mdl-21502455

ABSTRACT

OBJECTIVE: To determine whether ABO-incompatible (ABOi) kidney transplantation can be performed safely and result in acceptable posttransplantation outcomes.

DESIGN: Prospective study.

SETTING: Transplantation center.

PATIENTS: In the 1½ years of a new program, 18 patients with renal failure and an ABOi living kidney donor were included in the study. All donors and recipients were of incompatible blood types and underwent transplantation beginning in June 2008.

INTERVENTIONS: Patients received immunomodulation (anti-CD20 antibody, intravenous immunoglobulin, and plasmapheresis) until an acceptable isoagglutinin titer was obtained on the date of transplantation. All the kidneys were transplanted heterotopically, and all the patients received induction immunosuppression followed by a combination of prednisone, mycophenolate mofetil, and tacrolimus. Isoagglutinin titers were monitored, and postoperative plasmapheresis was initiated if titers increased.

MAIN OUTCOME MEASURES: Patient and allograft survival; length of stay; 1-, 3-, and 6-month and 1-year renal function; and incidence of rejection.

RESULTS: Patient survival was 100%, with allograft survival of 94.4%. Mean (SD) length of stay was 6.9 (1.9) days. Donor to recipient transplantation was A to O in 11 cases, A2 to B in 1, B to A in 3, B to O in 1, and AB to B in 2. Mean (SD) creatinine levels, a measure of graft function, were 1.2 (0.5) mg/dL at discharge, 1.4 (0.4) mg/dL at 1 month, 1.3 (0.45) mg/dL at 3 months, 1.1 (0.3) mg/dL at 6 months, and 1.2 (0.2) mg/dL at 1 year. One episode of cellular rejection occurred.

CONCLUSIONS: These short-term results suggest that with a straightforward regimen, ABOi kidney transplantation is possible, acceptable results and graft function are obtainable, and access to kidney transplantation for those with a blood type-incompatible donor can be expanded.


Subject(s)
ABO Blood-Group System/immunology , Immunosuppressive Agents/administration & dosage , Kidney Transplantation/immunology , Transplantation Conditioning/methods , Adult , Aged , Antigens, CD20/immunology , Female , Graft Rejection/diagnosis , Graft Rejection/therapy , Graft Survival , Humans , Immunoglobulins, Intravenous/administration & dosage , Incidence , Kidney Function Tests , Living Donors , Male , Middle Aged , Plasmapheresis , Prospective Studies , Time Factors , Transplantation, Homologous , Treatment Outcome