Results 1 - 13 of 13
1.
Int J Vitam Nutr Res ; 2023 Jul 20.
Article in English | MEDLINE | ID: mdl-37469107

ABSTRACT

Vitamin B12 deficiency can lead to neurological deficits. We assessed whether the mean corpuscular volume (MCV) could be a sufficiently sensitive measurement for abnormal serum methylmalonic acid (MMA) and total plasma homocysteine (tHCY) (biomarkers of vitamin B12 or folate deficiency) and, if so, at what cutoff value. A total of 26,397 participants (12,730 males and 13,667 females) were included in the analysis. Weighted analysis was performed using NHANES data to calculate crude/adjusted associations between MCV and MMA/tHCY, using linear regression. Unadjusted odds ratios (OR) and 95% CIs were estimated from logistic regression models. Receiver operating characteristic (ROC) curves and the Youden index were used to identify the MCV level that most accurately distinguished those with abnormal MMA and tHCY (dependent variables) from those without. A positive and significant correlation between MCV and MMA/tHCY was found in the general population between ages 18-85: 0.95 (95% C.I. 0.75-1.17) and 2.61 (95% C.I. 2.15-3.08), respectively. In pregnant women, for every unit increase in MCV there was a 19% increase in the odds of abnormal MMA, OR 1.19 (95% C.I. 1.08-1.31), p=0.001, and the area under the curve (AUC) for MCV as a test for abnormal MMA was 78%. An MCV cutoff of 93.1 correctly identified abnormal MMA in pregnant women with 81% sensitivity and 77% specificity. In the general population the MCV test performed poorly in identifying abnormal MMA/tHCY. MCV is an inexpensive measurement that may be useful to screen asymptomatic pregnant women for vitamin B12 abnormalities. This may have a significant impact on reducing adverse neurological outcomes in their children.
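As a rough illustration of the cutoff-selection approach described in this abstract (ROC analysis plus the Youden index), the sketch below uses simulated MCV values rather than NHANES data; the variable names and distributions are assumptions for demonstration only.

```python
# Illustrative sketch of ROC-based cutoff selection with the Youden index.
# Hypothetical data; the actual study used weighted NHANES analyses.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
# Simulated MCV values for subjects without / with abnormal MMA.
mcv = np.concatenate([rng.normal(89, 4, 500), rng.normal(95, 4, 60)])
abnormal_mma = np.concatenate([np.zeros(500), np.ones(60)])

fpr, tpr, thresholds = roc_curve(abnormal_mma, mcv)
youden_j = tpr - fpr                      # Youden index at each candidate threshold
best = np.argmax(youden_j)
print(f"AUC = {roc_auc_score(abnormal_mma, mcv):.2f}")
print(f"Optimal MCV cutoff ~ {thresholds[best]:.1f} "
      f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
```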

2.
Congenit Anom (Kyoto) ; 63(4): 100-108, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37073427

ABSTRACT

Folate and vitamin B12 deficiencies have been strongly associated with neural tube defects, and preliminary research suggests that folate and B12 deficiency may also be associated with autism spectrum disorder (ASD). We examined the association between neural tube defects and ASD as a further avenue to test the hypothesis that ASD is related to maternal folate and B12 deficiency during pregnancy. A retrospective case-control study was performed using the Military Health System Data Repository. Cases and matched controls were followed from birth until at least 6 months after their first autism diagnosis. International Classification of Diseases, 9th Revision, codes were used to identify neural tube defects in the health records. A total of 8760 cases between the ages of 2 and 18 years were identified. The prevalence of any neural tube defect was 0.11% in children without ASD and 0.64% in children with ASD. Children with autism were over 6 times as likely to have a neural tube defect. The increased odds of neural tube defects in children diagnosed with ASD found through our methodology support prior studies. Although additional studies are needed to elucidate the relationship between ASD and maternal folate and vitamin B12 deficiency during pregnancy, this study supports folate and vitamin B12 supplementation during pregnancy.
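The "over 6 times as likely" figure is consistent with a crude odds ratio computed from the two reported prevalences; the quick check below is only a back-of-the-envelope calculation and ignores the study's matching and covariates.

```python
# Rough odds-ratio check from the reported prevalences (ignores matching).
p_asd, p_no_asd = 0.0064, 0.0011          # NTD prevalence with / without ASD
odds_asd = p_asd / (1 - p_asd)
odds_no_asd = p_no_asd / (1 - p_no_asd)
print(f"Crude OR ~ {odds_asd / odds_no_asd:.1f}")   # ~5.8, i.e. roughly 6-fold
```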


Subject(s)
Autism Spectrum Disorder , Neural Tube Defects , Pregnancy , Female , Child , Humans , Child, Preschool , Adolescent , Case-Control Studies , Retrospective Studies , Autism Spectrum Disorder/diagnosis , Autism Spectrum Disorder/epidemiology , Autism Spectrum Disorder/etiology , Neural Tube Defects/diagnosis , Neural Tube Defects/epidemiology , Neural Tube Defects/etiology , Folic Acid , Vitamin B 12 , Vitamins
4.
Circulation ; 143(12): 1184-1197, 2021 03 23.
Article in English | MEDLINE | ID: mdl-33435695

ABSTRACT

BACKGROUND: After heart transplantation, endomyocardial biopsy (EMBx) is used to monitor for acute rejection (AR). Unfortunately, EMBx is invasive, and its conventional histological interpretation has limitations. This is a validation study to assess the performance of a sensitive blood biomarker, percent donor-derived cell-free DNA (%ddcfDNA), for detection of AR in cardiac transplant recipients. METHODS: This multicenter, prospective cohort study recruited heart transplant subjects and collected plasma samples contemporaneously with EMBx for %ddcfDNA measurement by shotgun sequencing. Histopathology data were collected to define AR, its 2 phenotypes (acute cellular rejection [ACR] and antibody-mediated rejection [AMR]), and controls without rejection. The primary analysis was to compare %ddcfDNA levels (median and interquartile range [IQR]) for AR, AMR, and ACR with controls and to determine %ddcfDNA test characteristics using receiver operating characteristic (ROC) analysis. RESULTS: The study included 171 subjects with a median posttransplant follow-up of 17.7 months (IQR, 12.1-23.6), with 1392 EMBx and 1834 %ddcfDNA measures available for analysis. Median %ddcfDNA levels decayed after surgery to 0.13% (IQR, 0.03%-0.21%) by 28 days. %ddcfDNA then increased again with AR compared with control values (0.38% [IQR, 0.31%-0.83%] versus 0.03% [IQR, 0.01%-0.14%]; P<0.001). The rise was detected 0.5 and 3.2 months before histopathologic diagnosis of ACR and AMR, respectively. The area under the ROC curve (AUROC) for AR was 0.92. A 0.25% ddcfDNA threshold had a negative predictive value for AR of 99% and would have safely eliminated 81% of EMBx. In addition, %ddcfDNA showed distinctive characteristics when comparing AMR with ACR, including 5-fold higher levels (AMR grade ≥2, 1.68% [IQR, 0.49%-2.79%] versus ACR grade ≥2R, 0.34% [IQR, 0.28%-0.72%]), a higher AUROC (0.95 versus 0.85), higher guanosine-cytosine content, and a higher percentage of short ddcfDNA fragments. CONCLUSIONS: We found that %ddcfDNA detected AR with a high AUROC and negative predictive value. Monitoring with %ddcfDNA demonstrated excellent performance characteristics for both ACR and AMR and led to earlier detection than EMBx-based monitoring. This study supports the use of %ddcfDNA to monitor for AR in patients with heart transplants and paves the way for a clinical utility study. Registration: URL: https://www.clinicaltrials.gov; Unique identifier: NCT02423070.
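As a hedged illustration of how a %ddcfDNA threshold translates into a negative predictive value and a proportion of biopsies avoided, the sketch below uses a generic confusion-matrix calculation on hypothetical values; it is not the study's analysis code.

```python
# Sketch: classify samples as AR-positive when %ddcfDNA exceeds a threshold,
# then compute NPV and the fraction of biopsies that could be avoided.
import numpy as np

def npv_at_threshold(ddcfdna, rejection, threshold=0.25):
    ddcfdna, rejection = np.asarray(ddcfdna), np.asarray(rejection, bool)
    negative = ddcfdna < threshold                 # below threshold -> test negative
    true_neg = np.sum(negative & ~rejection)
    false_neg = np.sum(negative & rejection)
    npv = true_neg / (true_neg + false_neg)
    avoided = negative.mean()                      # share of biopsies skipped
    return npv, avoided

# Hypothetical example values (percent ddcfDNA, rejection labels):
npv, avoided = npv_at_threshold([0.02, 0.10, 0.40, 0.90, 0.05], [0, 0, 1, 1, 0])
print(f"NPV = {npv:.2f}, biopsies avoided = {avoided:.0%}")
```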


Subject(s)
Allografts/transplantation , Cell-Free Nucleic Acids/genetics , Graft Rejection/physiopathology , Adult , Aged , Cohort Studies , Female , Humans , Male , Middle Aged , Prospective Studies , Young Adult
5.
EBioMedicine ; 40: 541-553, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30692045

ABSTRACT

BACKGROUND: Allograft failure is common in lung-transplant recipients and leads to poor outcomes, including early death. No reliable clinical tools exist to identify patients at high risk for allograft failure. This study tested the use of donor-derived cell-free DNA (%ddcfDNA) as a sensitive marker of early graft injury to predict impending allograft failure. METHODS: This multicenter, prospective cohort study enrolled 106 subjects who underwent lung transplantation and monitored them after transplantation for the development of allograft failure (defined as severe chronic lung allograft dysfunction [CLAD], retransplantation, and/or death from respiratory failure). Plasma samples were collected serially in the first three months following transplantation and assayed for %ddcfDNA by shotgun sequencing. We computed the average level of ddcfDNA over three months for each patient (avddDNA) and determined its relationship to allograft failure using Cox regression analysis. FINDINGS: avddDNA was highly variable among subjects: median values were 3·6%, 1·6%, and 0·7% for the upper, middle, and low tertiles, respectively (range 0·1%-9·9%). Compared to subjects in the low and middle tertiles, those with avddDNA in the upper tertile had a 6·6-fold higher risk of developing allograft failure (95% confidence interval 1·6-19·9, p = 0·007), lower peak FEV1 values, and more frequent %ddcfDNA elevations that were not clinically detectable. INTERPRETATION: Lung transplant patients with early, unresolving allograft injury measured via %ddcfDNA are at risk of subsequent allograft injury that is often clinically silent and progresses to allograft failure. FUNDING: National Institutes of Health.
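A minimal sketch of the tertile-based Cox regression described in this abstract, using the lifelines package on simulated data; the column names, follow-up times, and event labels are assumptions, not the study's variables.

```python
# Sketch: Cox proportional-hazards model of allograft failure vs avddDNA tertile.
# Hypothetical data frame; the study's actual covariates are not reproduced here.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "avdddna": rng.uniform(0.1, 9.9, 106),          # average %ddcfDNA, months 0-3
    "months_followed": rng.uniform(6, 36, 106),     # time to failure or censoring
    "allograft_failure": rng.integers(0, 2, 106),   # 1 = failure event observed
})
# Upper tertile vs the rest, mirroring the comparison in the abstract.
df["upper_tertile"] = (df["avdddna"] > df["avdddna"].quantile(2 / 3)).astype(int)

cph = CoxPHFitter()
cph.fit(df[["months_followed", "allograft_failure", "upper_tertile"]],
        duration_col="months_followed", event_col="allograft_failure")
cph.print_summary()   # hazard ratio for the upper tertile with 95% CI
```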


Subject(s)
Biomarkers , Cell-Free Nucleic Acids , Graft Rejection , Lung Transplantation/adverse effects , Lung Transplantation/mortality , Tissue Donors , Aged , Allografts , Comorbidity , Female , Graft Rejection/immunology , Humans , Kaplan-Meier Estimate , Male , Middle Aged , Prognosis , Proportional Hazards Models , Prospective Studies , Risk Factors , Sequence Analysis, DNA , Time Factors , Transplantation, Homologous
6.
J Heart Lung Transplant ; 37(7): 925-932, 2018 07.
Article in English | MEDLINE | ID: mdl-29500138

ABSTRACT

BACKGROUND: Antibody-mediated rejection (AMR) often progresses to poor health outcomes in lung transplant recipients (LTRs). This, combined with the relatively insensitive clinical tools used for its diagnosis (spirometry, histopathology), led us to ask whether clinical AMR is diagnosed significantly later than its pathologic onset. In this study, we leveraged the high sensitivity of donor-derived cell-free DNA (ddcfDNA), a novel genomic tool, to detect early graft injury after lung transplantation. METHODS: We adjudicated AMR and acute cellular rejection (ACR) in 157 LTRs using the consensus criteria of the International Society for Heart and Lung Transplantation (ISHLT). We assessed the kinetics of allograft injury in relation to ACR or AMR using both clinical criteria (decline in spirometry from baseline) and molecular criteria (ddcfDNA); percent ddcfDNA was quantitated via shotgun sequencing. We used a mixed linear model to assess the relationship between ddcfDNA levels and donor-specific antibodies (DSA) in AMR+ LTRs. RESULTS: Compared with ACR, AMR episodes (n = 42) were associated with significantly greater allograft injury when assessed by both spirometric (0.1 liter vs -0.6 liter, p < 0.01) and molecular (ddcfDNA) analysis (1.1% vs 5.4%, p < 0.001). Allograft injury detected by ddcfDNA preceded clinical AMR diagnosis by a median of 2.8 months. Within the same interval, spirometry and histopathology did not reveal findings of allograft injury or dysfunction. Elevated levels of ddcfDNA before clinical diagnosis of AMR were associated with a concurrent rise in DSA levels. CONCLUSION: Diagnosis of clinical AMR in LTRs lags behind DSA-associated molecular allograft injury as assessed by ddcfDNA.
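The mixed linear model relating ddcfDNA to donor-specific antibodies can be sketched with statsmodels as below; the data frame and field names are hypothetical stand-ins for repeated measures per recipient, not the study's dataset.

```python
# Sketch: mixed linear model of %ddcfDNA vs donor-specific antibody (DSA) level,
# with a random intercept per lung-transplant recipient (repeated measures).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_patients, n_visits = 40, 6
df = pd.DataFrame({
    "patient": np.repeat(np.arange(n_patients), n_visits),
    "dsa_mfi": rng.lognormal(7, 1, n_patients * n_visits),   # hypothetical DSA level
})
df["ddcfdna_pct"] = 0.5 + 0.0002 * df["dsa_mfi"] + rng.normal(0, 0.3, len(df))

model = smf.mixedlm("ddcfdna_pct ~ dsa_mfi", df, groups=df["patient"])
result = model.fit()
print(result.summary())   # fixed-effect slope estimates the ddcfDNA-DSA association
```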


Subject(s)
Cell-Free Nucleic Acids/analysis , Delayed Diagnosis , Graft Rejection/diagnosis , Graft Rejection/immunology , Isoantibodies/physiology , Lung Transplantation , Graft Rejection/genetics , Humans , Prospective Studies
7.
J Heart Lung Transplant ; 36(9): 1004-1012, 2017 Sep.
Article in English | MEDLINE | ID: mdl-28624139

ABSTRACT

BACKGROUND: Use of new genomic techniques in clinical settings requires that such methods are rigorous and reproducible. Previous studies have shown that quantitation of donor-derived cell-free DNA (%ddcfDNA) by unbiased shotgun sequencing is a sensitive, non-invasive marker of acute rejection after heart transplantation. The primary goal of this study was to assess the reproducibility of %ddcfDNA measurements across technical replicates, manual vs automated platforms, and rejection phenotypes in distinct patient cohorts. METHODS: After developing and validating the %ddcfDNA assay, we subjected the method to a rigorous test of its reproducibility. We measured %ddcfDNA in technical replicates performed by 2 independent laboratories and verified the reproducibility of %ddcfDNA patterns of 2 rejection phenotypes: acute cellular rejection and antibody-mediated rejection in distinct patient cohorts. RESULTS: We observed strong concordance of technical-replicate %ddcfDNA measurements across 2 independent laboratories (slope = 1.02, R² > 0.99, p < 10⁻⁶), as well as across manual and automated platforms (slope = 0.80, R² = 0.92, p < 0.001). The %ddcfDNA measurements in distinct heart transplant cohorts had similar baselines and error rates. The %ddcfDNA temporal patterns associated with rejection phenotypes were similar in both patient cohorts; however, the quantity of ddcfDNA was significantly higher in samples with severe vs mild histologic rejection grade (2.73% vs 0.14%, respectively; p < 0.001). CONCLUSIONS: The %ddcfDNA assay is precise and reproducible across laboratories and in samples from 2 distinct types of heart transplant rejection. These findings pave the way for larger studies to assess the clinical utility of %ddcfDNA as a marker of acute rejection after heart transplantation.
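The laboratory-concordance summary (slope and R² from regressing one laboratory's replicate measurements on the other's) can be reproduced in principle with an ordinary least-squares fit, as in this sketch on simulated replicate values.

```python
# Sketch: concordance of %ddcfDNA technical replicates from two laboratories,
# summarized by the regression slope and R^2 as in the abstract.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(3)
lab_a = rng.uniform(0.05, 3.0, 50)                    # hypothetical %ddcfDNA values
lab_b = lab_a * 1.02 + rng.normal(0, 0.02, 50)        # near-identical replicates

fit = linregress(lab_a, lab_b)
print(f"slope = {fit.slope:.2f}, R^2 = {fit.rvalue ** 2:.3f}, p = {fit.pvalue:.1e}")
```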


Subject(s)
Cell-Free Nucleic Acids/analysis , Graft Rejection/blood , Heart Transplantation/adverse effects , Primary Graft Dysfunction/blood , Tissue Donors , Acute Disease , Adult , Biomarkers/blood , Female , Heart Transplantation/methods , Humans , Linear Models , Male , Primary Graft Dysfunction/physiopathology , Reproducibility of Results , Statistics, Nonparametric
8.
AIDS ; 26(2): 175-84, 2012 Jan 14.
Article in English | MEDLINE | ID: mdl-22089380

ABSTRACT

OBJECTIVE: To describe symptoms, physical examination findings, and set-point viral load associated with acute HIV seroconversion in a heterosexual cohort of HIV-discordant couples in Zambia. DESIGN: We followed HIV-serodiscordant couples in Lusaka, Zambia from 1995 to 2009, with HIV testing of negative partners and symptom inventories every 3 months and physical examinations annually. METHODS: We compared the prevalence of self-reported or treated symptoms (malaria syndrome, chronic diarrhea, asthenia, night sweats, and oral candidiasis) and annual physical examination findings (unilateral or bilateral neck, axillary, or inguinal adenopathy; and dermatosis) in seroconverting vs. HIV-negative or HIV-positive intervals, controlling for repeated observations, age, and sex. A composite score comprising the symptoms and physical examination findings significantly predictive of seroconversion vs. HIV-negative intervals was constructed. We modeled the relationship between the number of symptoms and physical examination findings at seroconversion and log set-point viral load using linear regression. RESULTS: A total of 2388 HIV-negative partners were followed for a median of 18 months; 429 seroconversions occurred. Neither symptoms nor physical examination findings were reported for most seroconverters. Seroconversion was significantly associated with malaria syndrome among nondiarrheic patients [adjusted odds ratio (aOR) = 4.0], night sweats (aOR = 1.4), and bilateral axillary (aOR = 1.6), inguinal (aOR = 2.2), and neck (aOR = 2.2) adenopathy relative to HIV-negative intervals. The median number of symptoms and findings was positively associated with set-point viral load (P < 0.001). CONCLUSION: Although most acute and early infections were asymptomatic, malaria syndrome was more common and more severe during seroconversion. When present, symptoms and physical examination findings were nonspecific and associated with higher set-point viremia.
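As a hedged sketch of how adjusted odds ratios of this kind are typically obtained, the example below fits a logistic regression on simulated data and exponentiates the coefficients; the covariates are placeholders, and the model ignores the repeated-observation structure the study accounted for.

```python
# Sketch: adjusted odds ratios for seroconversion vs symptom indicators,
# controlling for age and sex (hypothetical data, illustrative names only).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 2000
df = pd.DataFrame({
    "seroconversion": rng.integers(0, 2, n),
    "malaria_syndrome": rng.integers(0, 2, n),
    "night_sweats": rng.integers(0, 2, n),
    "age": rng.integers(18, 60, n),
    "male": rng.integers(0, 2, n),
})
fit = smf.logit("seroconversion ~ malaria_syndrome + night_sweats + age + male",
                df).fit(disp=False)
print(np.exp(fit.params).rename("adjusted OR"))   # exponentiated coefficients
```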


Subject(s)
Asthenia/epidemiology , Candidiasis, Oral/epidemiology , Diarrhea/epidemiology , HIV Seropositivity/epidemiology , HIV-1/isolation & purification , Malaria/epidemiology , Sexual Partners , Adolescent , Adult , Asthenia/virology , Candidiasis, Oral/virology , Cohort Studies , Diarrhea/virology , Epidemics , Female , Follow-Up Studies , Genotype , HIV Seropositivity/virology , HIV-1/immunology , Heterosexuality , Humans , Linear Models , Malaria/virology , Male , Middle Aged , Prevalence , Viral Load , Viremia , Young Adult , Zambia/epidemiology
9.
J Infect Dis ; 203(2): 258-62, 2011 Jan 15.
Article in English | MEDLINE | ID: mdl-21288826

ABSTRACT

Human immunodeficiency virus type 1 (HIV-1)-specific T cell responses were characterized in a blinded study involving infected individuals and their seronegative exposed uninfected (EU) partners from Lusaka, Zambia. HIV-1-specific T cell responses were detected ex vivo in all infected individuals and amplified, on average, 27-fold following in vitro expansion. In contrast, no HIV-1-specific T cell responses were detected in any of the EU partners ex vivo or following in vitro expansion. These data demonstrate that the detection of HIV-1-specific T cell immunity in EU individuals is not universal and that alternative mechanisms may account for protection in these individuals.


Subject(s)
CD8-Positive T-Lymphocytes/immunology , HIV Infections/immunology , HIV-1/immunology , Sexual Partners , Female , Humans , Immunity, Innate , Male , Zambia
10.
Headache ; 50(5): 790-4, 2010 May.
Article in English | MEDLINE | ID: mdl-19925623

ABSTRACT

BACKGROUND: Headaches can be triggered by a variety of factors. Military service members have a high prevalence of headache, but the factors triggering headaches in military troops have not been identified. OBJECTIVE: The objective of this study is to determine headache triggers in soldiers and military beneficiaries seeking specialty care for headaches. METHODS: A total of 172 consecutive US Army soldiers and military dependents (civilians) evaluated at the headache clinics of 2 US Army Medical Centers completed a standardized questionnaire about their headache triggers. RESULTS: A total of 150 (87%) patients were active-duty military members and 22 (13%) were civilians. In total, 77% of subjects had migraine, and 89% of patients reported at least one headache trigger, with a mean of 8.3 triggers per patient. A wide variety of headache triggers was seen, with the most common categories being environmental factors (74%), stress (67%), consumption-related factors (60%), and fatigue-related factors (57%). The types of headache triggers identified in active-duty service members were similar to those seen in civilians, although stress-related triggers were significantly more common in soldiers. There were no significant differences in trigger types between soldiers with and without a history of head trauma. CONCLUSION: Headaches in military service members are triggered largely by the same factors as in civilians, with stress being the most common trigger. Knowledge of headache triggers may be useful for developing strategies that reduce headache occurrence in the military.


Subject(s)
Environmental Exposure/adverse effects , Headache/epidemiology , Headache/etiology , Military Personnel , Occupational Exposure/adverse effects , Stress, Psychological/epidemiology , Adult , Cohort Studies , Comorbidity , Female , Headache/classification , Humans , Male , Military Personnel/psychology , United States/epidemiology , Young Adult
11.
AIDS ; 17(5): 733-40, 2003 Mar 28.
Article in English | MEDLINE | ID: mdl-12646797

ABSTRACT

BACKGROUND AND OBJECTIVES: Sexual behavior following voluntary HIV counseling and testing (VCT) is described in 963 cohabiting heterosexual couples with one HIV-positive and one HIV-negative partner ('discordant couples'). Biological markers were used to assess the validity of self-report. METHODS: Couples were recruited from a same-day VCT center in Lusaka, Zambia. Sexual exposures with and without condoms were recorded at 3-monthly intervals. Sperm detected on vaginal smears, pregnancy, and sexually transmitted diseases (STD), including HIV, gonorrhea, syphilis, and Trichomonas vaginalis, were assessed. RESULTS: Fewer than 3% of couples reported current condom use prior to VCT. In the year after VCT, >80% of reported acts of intercourse in discordant couples included condom use. Reporting 100% condom use was associated with 39-70% reductions in biological markers; however, most intervals with reported unprotected sex were negative for all biological markers. Under-reporting was common: 50% of sperm detections and 32% of pregnancies and HIV transmissions occurred when couples had reported always using condoms. Positive laboratory tests for STD and reported extramarital sex were relatively infrequent. DNA sequencing confirmed that 87% of new HIV infections were acquired from the spouse. CONCLUSIONS: Joint VCT prompted sustained but imperfect condom use in HIV-discordant couples. Biological markers were insensitive but provided evidence of significant under-reporting of unprotected sex. Strategies that encourage truthful reporting of sexual behavior, and sensitive biological markers of exposure, are urgently needed. The impact of prevention programs should be assessed with both behavioral and biological measures.


Subject(s)
Condoms/statistics & numerical data , Counseling , HIV Infections/prevention & control , Sexual Behavior/statistics & numerical data , Sexual Partners/psychology , Adult , Developing Countries , Female , Follow-Up Studies , HIV Infections/diagnosis , HIV Infections/transmission , Humans , Male , Middle Aged , Patient Compliance/statistics & numerical data , Prospective Studies , Risk Factors , Sexually Transmitted Diseases/prevention & control , Truth Disclosure , Zambia
12.
J Virol ; 76(16): 8276-84, 2002 Aug.
Article in English | MEDLINE | ID: mdl-12134033

ABSTRACT

The set point of viral RNA concentration (viral load [VL]) during chronic human immunodeficiency virus type 1 (HIV-1) infection reflects a virus-host equilibration closely related to CD8+ cytotoxic T-lymphocyte (CTL) responses, which rely heavily on antigen presentation by human major histocompatibility complex (MHC) (i.e., HLA) class I molecules. Differences in HIV-1 VL among 259 mostly clade C virus-infected individuals (137 females and 122 males) in the Zambia-UAB HIV Research Project (ZUHRP) were associated with several HLA class I alleles and haplotypes. In particular, general linear model analyses revealed lower log10 VL among those with HLA allele B*57 (P = 0.002 [without correction]), previously implicated in favorable response, and among those with HLA B*39 and A*30-Cw*03 (P = 0.002 to 0.016); the same analyses also demonstrated higher log10 VL among individuals with A*02-Cw*16, A*23-B*14, and A*23-Cw*07 (P = 0.010 to 0.033). These HLA effects remained strong (P = 0.0002 to 0.075) after adjustment for age, gender, and duration of infection and persisted across three ordered VL categories (P = 0.001 to 0.084). In contrast, neither B*35 (n = 15) nor B*53 (n = 53) showed a clear disadvantage such as that reported elsewhere for these closely related alleles. Other HLA associations with unusually high (A*68, B*41, B*45, and Cw*16) or low (B*13, Cw*12, and Cw*18) VL were either unstable or reflected their tight linkage disequilibrium with other class I variants. The three consistently favorable HLA class I variants retained in multivariable models and in alternative analyses were present in 30.9% of subjects with the lowest (<10,000 copies per ml) and 3.1% of those with the highest (>100,000 copies per ml) VL. The clear differential distribution of HLA profiles according to level of viremia suggests an important host genetic contribution to the pattern of immune control and escape during HIV-1 infection.
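A minimal sketch of a general linear model of log10 VL on carriage of a single HLA allele, adjusted for age, sex, and duration of infection; all variable names and values below are simulated assumptions, not ZUHRP data.

```python
# Sketch: general linear model of log10 viral load vs carriage of an HLA allele,
# adjusted for age, sex, and duration of infection (hypothetical data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 259
df = pd.DataFrame({
    "log10_vl": rng.normal(4.5, 0.8, n),
    "hla_b57": rng.integers(0, 2, n),                 # carries B*57 (0/1)
    "age": rng.integers(18, 60, n),
    "female": rng.integers(0, 2, n),
    "years_infected": rng.uniform(0, 10, n),
})
fit = smf.ols("log10_vl ~ hla_b57 + age + female + years_infected", df).fit()
print(fit.params["hla_b57"], fit.pvalues["hla_b57"])  # allele effect on log10 VL
```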


Subject(s)
Genes, MHC Class I , HIV Infections/genetics , HIV Infections/immunology , HIV-1 , HLA Antigens/genetics , Adult , Alleles , Female , Genetic Variation , HIV Infections/virology , HIV-1/classification , Haplotypes , Humans , Male , Multivariate Analysis , RNA, Viral/blood , Viremia/genetics , Viremia/immunology , Viremia/virology , Zambia
13.
J Virol ; 76(1): 397-405, 2002 Jan.
Article in English | MEDLINE | ID: mdl-11739704

ABSTRACT

Most human immunodeficiency virus type 1 (HIV-1) transmissions in sub-Saharan Africa are believed to occur between married adults who are discordant for their HIV-1 infection status; however, no studies to date have investigated the molecular epidemiology of such transmission events. Here we report the genetic characterization of HIV-1 strains from 149 transmission pairs that were identified prospectively in a cohort of discordant couples in Lusaka, Zambia. Subgenomic gag, gp120, gp41, and/or long terminal repeat regions were amplified by PCR analysis of uncultured blood samples from both partners and sequenced without interim cloning. Pairwise genetic distances were calculated for the regions analyzed and compared to those of subtype-specific reference sequences as well as local controls. Sequence relationships were also examined by phylogenetic tree analysis. By these approaches, epidemiological linkage was established for the majority of transmission pairs. Viruses from 129 of the 149 couples (87%) were very closely related and clustered together in phylogenetic trees in a statistically highly significant manner. In contrast, viruses from 20 of the 149 couples (13%) were only distantly related in two independent genomic regions, thus ruling out transmission between the two partners. The great majority (95%) of transmitted viruses were of subtype C origin, although representatives of subtypes A, D, G, and J were also identified. There was no evidence for extensive transmission networks within the cohort, although two phylogenetic subclusters of viruses infecting two couples each were identified. Taken together, these data indicate that molecular epidemiological analyses of presumed transmission pairs are both feasible and required to determine behavioral, virological, and immunological correlates of heterosexual transmission in sub-Saharan Africa with a high level of accuracy.
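The pairwise genetic distances used to judge epidemiological linkage can be illustrated with a simple uncorrected p-distance on aligned sequences; the toy fragments below are invented and far shorter than the subgenomic regions analyzed in the study.

```python
# Sketch: uncorrected pairwise p-distance between two aligned sequences,
# the kind of measure used to judge whether partners' viruses cluster together.
def p_distance(seq_a: str, seq_b: str) -> float:
    """Fraction of aligned, ungapped positions at which the sequences differ."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to the same length")
    compared = differences = 0
    for a, b in zip(seq_a.upper(), seq_b.upper()):
        if a == "-" or b == "-":
            continue                                  # skip alignment gaps
        compared += 1
        differences += a != b
    return differences / compared if compared else float("nan")

# Toy fragments: partners' viruses (nearly identical) vs an unlinked local control.
partner_1 = "ATGGGTGCGAGAGCGTCAGTATTAAGCGGG"
partner_2 = "ATGGGTGCGAGAGCGTCAGTATTAAGAGGG"
control   = "ATGGGTGCGAGAGCGTCGGTGTTAAGCGGA"
print(p_distance(partner_1, partner_2), p_distance(partner_1, control))
```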


Subject(s)
HIV Infections/epidemiology , HIV-1/genetics , Cohort Studies , Family Relations , HIV Infections/transmission , HIV Infections/virology , HIV-1/classification , Heterosexuality , Humans , Phylogeny , RNA, Viral/genetics , Zambia/epidemiology