1.
Sci Rep ; 13(1): 21927, 2023 12 11.
Article in English | MEDLINE | ID: mdl-38081834

ABSTRACT

The continued emergence of vaccine-resistant SARS-CoV-2 variants of concern (VOC) requires specific identification of each VOC as it arises. Here, we report an expanded version of our previously described sloppy molecular beacon (SMB) melting temperature (Tm) signature-based assay for VOCs, now modified to include detection of Delta (B.1.617.2) and Omicron (B.1.1.529) sub-variants. The SMB-VOC assay targets the signature codons 501, 484 and 452 in the SARS-CoV-2 spike protein, which we show can specifically detect and differentiate all known VOCs, including the Omicron subvariants (BA.1, BA.2, BA.2.12.1, BA.4/BA.5). The limit of detection (LOD) of the assay was 20, 22 and 36 genomic equivalents (GE) per reaction for Delta, Omicron BA.1 and Omicron BA.2, respectively. Clinical validation of the 3-codon assay on the LC480 instrument showed that the assay identified 94% (81/86) of specimens as wild type (WT) or VOCs, with 6% (5/86) of tests producing indeterminate results, compared to sequencing. Sanger sequencing also failed for four samples. None of the specimens were incorrectly identified as WT or as a different VOC by our assay. Thus, excluding specimens with indeterminate results, the assay was 100% sensitive and 100% specific compared to Sanger sequencing for variant identification. This assay concept can easily be expanded to cover newer variants and can serve as a robust diagnostic tool for selecting appropriate monoclonal antibody therapy and for rapid VOC surveillance.
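As an illustration of how a Tm-signature assay of this kind can be read out, the sketch below matches a measured three-codon melting-temperature profile against a table of variant signatures. The Tm values, tolerance, and variant set are hypothetical placeholders, not the assay's published parameters.

```python
# Minimal sketch of Tm-signature matching for variant calling.
# All Tm values below are hypothetical, keyed as (codon 501, 484, 452).
from math import dist

SIGNATURES = {  # hypothetical (Tm501, Tm484, Tm452) in degrees C
    "WT":           (62.0, 60.5, 58.0),
    "Delta":        (62.0, 60.5, 53.5),
    "Omicron BA.1": (56.5, 55.0, 58.0),
    "Omicron BA.2": (56.5, 55.0, 57.0),
}

def call_variant(tms, tolerance=1.0):
    """Return the closest signature, or 'indeterminate' if none is close."""
    best, best_d = min(
        ((name, dist(sig, tms)) for name, sig in SIGNATURES.items()),
        key=lambda pair: pair[1],
    )
    return best if best_d <= tolerance else "indeterminate"

print(call_variant((62.1, 60.4, 53.6)))  # -> Delta
print(call_variant((50.0, 50.0, 50.0)))  # -> indeterminate
```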


Subject(s)
COVID-19 , Magnoliopsida , Humans , COVID-19/diagnosis , Reverse Transcriptase Polymerase Chain Reaction , SARS-CoV-2/genetics , Temperature , COVID-19 Testing
2.
Antimicrob Agents Chemother ; 67(11): e0093223, 2023 11 15.
Article in English | MEDLINE | ID: mdl-37877727

ABSTRACT

Variable pharmacokinetics of rifampin in tuberculosis (TB) treatment can lead to poor outcomes. Urine spectrophotometry is simpler and more accessible than the recommended serum-based drug monitoring, but how well it predicts serum rifampin underexposure in adults with TB remains uncertain. Adult TB patients in New Jersey and Virginia receiving rifampin-containing regimens were enrolled. Serum and urine samples were collected over 24 h. Rifampin serum concentrations were measured using validated liquid chromatography-tandem mass spectrometry, and total exposure (area under the concentration-time curve) over 24 h (AUC0-24) was determined through noncompartmental analysis. The Sunahara method was used to extract total rifamycins, and rifampin urine excretion was measured by spectrophotometry. An analysis of 58 eligible participants, including 15 (26%) with type 2 diabetes mellitus, demonstrated that urine spectrophotometry accurately identified subtarget rifampin AUC0-24 using urine collected over 0-4, 0-8, and 0-24 h. The corresponding areas under the receiver operating characteristic curve (AUC ROC) were 0.80 (95% CI 0.67-0.90), 0.84 (95% CI 0.72-0.94), and 0.83 (95% CI 0.72-0.93). These values were comparable to the AUC ROC of the 2 h serum concentration commonly used for therapeutic monitoring (0.82 [95% CI 0.71-0.92], P = 0.6). Diabetes status did not significantly affect the AUC ROCs for urine in predicting subtarget rifampin serum exposure (P = 0.67-0.92). Spectrophotometric measurement of urine rifampin excretion within the first 4 or 8 h after dosing is a simple and cost-effective test that accurately predicts rifampin underexposure, providing critical information for optimizing TB treatment outcomes by facilitating appropriate dose adjustments.
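The ROC analysis described can be reproduced in outline as follows. This is a minimal sketch on simulated data; the excretion distributions and the percentile-bootstrap CI are assumptions for illustration, not the study's exact procedure.

```python
# Sketch: how well does fractional urine rifampin excretion discriminate
# subtarget serum AUC0-24? All data are simulated and illustrative.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 58
subtarget = rng.integers(0, 2, n)            # 1 = below serum AUC0-24 target
# simulate lower urine excretion among subtarget participants (hypothetical)
urine_pct = np.where(subtarget == 1,
                     rng.normal(30, 10, n),  # % of dose excreted
                     rng.normal(50, 10, n))

# orient the score so that lower excretion predicts subtarget exposure
auc = roc_auc_score(subtarget, -urine_pct)

# simple percentile bootstrap for a 95% CI
boots = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    if len(set(subtarget[idx])) == 2:        # need both classes present
        boots.append(roc_auc_score(subtarget[idx], -urine_pct[idx]))
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"AUC ROC = {auc:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```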


Subject(s)
Diabetes Mellitus, Type 2 , Tuberculosis , Adult , Humans , Rifampin/pharmacokinetics , Antitubercular Agents/pharmacokinetics , Prospective Studies , Diabetes Mellitus, Type 2/drug therapy , Tuberculosis/diagnosis , Tuberculosis/drug therapy
3.
Arch Dis Child ; 108(8): 616-621, 2023 08.
Article in English | MEDLINE | ID: mdl-37171408

ABSTRACT

OBJECTIVE: Pharmacokinetic variability drives tuberculosis (TB) treatment outcomes, but measurement of serum drug concentrations for personalised dosing is inaccessible for children in TB-endemic settings. We evaluated rifampin urine excretion as a predictor of a serum target associated with treatment outcome. DESIGN: Prospective diagnostic accuracy study. SETTING: Inpatient wards and outpatient clinics, northern Tanzania. PATIENTS: Children aged 4-17 years were consecutively recruited on initiation of WHO-approved treatment regimens. INTERVENTIONS: Samples were collected after directly observed therapy at least 2 weeks after treatment initiation in the intensive phase: serum at pre-dose and 1, 2 and 6 hours post-dose, later analysed by liquid chromatography-tandem mass spectrometry for calculation of rifampin total exposure, or area under the concentration-time curve (AUC0-24); urine at post-dose intervals of 0-4, 4-8 and 8-24 hours, with the amount of rifampin excreted measured onsite by spectrophotometry. MAIN OUTCOME MEASURES: Receiver operating characteristic (ROC) curve for the percentage of the rifampin dose excreted in urine, measured by spectrophotometry, to predict the serum rifampin AUC0-24 target of 31.7 mg*hour/L. RESULTS: 89 children, 52 (58%) female, with a median age of 9.1 years, had both serum and urine collected. Only 59 (66%) reached the serum AUC0-24 target, reflected by a range of urine excretion patterns. The area under the ROC curve for the percentage of the rifampin dose excreted in urine over 24 hours predicting the serum AUC0-24 target was 69.3% (95% CI 56.7% to 81.8%), p=0.007. CONCLUSIONS: Urine spectrophotometry correlated with a clinically relevant serum target for rifampin, representing a step toward personalised dosing for children in TB-endemic settings.
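A minimal sketch of the noncompartmental AUC0-24 calculation behind the serum target, using the linear trapezoidal rule over the sparse sampling schedule described (pre-dose, 1, 2 and 6 hours) plus a log-linear extrapolation to 24 hours. The concentrations are hypothetical.

```python
# Sketch of noncompartmental AUC0-24 from sparse serum sampling.
import numpy as np

t = np.array([0.0, 1.0, 2.0, 6.0])   # sampling times (h): pre-dose, 1, 2, 6
c = np.array([0.1, 8.0, 10.5, 3.2])  # rifampin serum conc., mg/L (hypothetical)

# linear trapezoidal rule over the observed window
auc_0_6 = float(np.sum((c[1:] + c[:-1]) / 2 * np.diff(t)))

# terminal elimination rate from the last two points (log-linear decline)
kel = (np.log(c[-2]) - np.log(c[-1])) / (t[-1] - t[-2])

# extrapolate from 6 h to 24 h assuming mono-exponential decay
auc_6_24 = (c[-1] / kel) * (1 - np.exp(-kel * (24.0 - t[-1])))

auc_0_24 = auc_0_6 + auc_6_24
print(f"AUC0-24 = {auc_0_24:.1f} mg*h/L (serum target: 31.7 mg*h/L)")
```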


Subject(s)
Rifampin , Tuberculosis , Humans , Child , Female , Male , Rifampin/therapeutic use , Rifampin/pharmacokinetics , Antitubercular Agents/therapeutic use , Antitubercular Agents/pharmacokinetics , Prospective Studies , Tuberculosis/diagnosis , Tuberculosis/drug therapy , Treatment Outcome
4.
PLoS One ; 18(3): e0282708, 2023.
Article in English | MEDLINE | ID: mdl-36928472

ABSTRACT

Saliva has been a COVID-19 diagnostic specimen of interest due to its simple collection, scalability, and yield. Yet COVID-19 testing and estimates of the infectious period remain largely based on nasopharyngeal and nasal swabs. We sought to evaluate whether saliva testing captured prolonged presence of SARS-CoV-2 and potential infectiousness later in the disease course. We conducted an observational study of symptomatic COVID-19 patients at University Hospital in Newark, NJ. Paired saliva and nasal specimens from 96 patients were analyzed, including longitudinal analysis of paired observations from 28 of these patients who had multiple time-points. Saliva detected significantly more cases of COVID-19 beyond 5 days (86.1% [99/115] saliva vs 48.7% [56/115] nasal, p < 0.001), 9 days (79.4% [50/63] saliva vs 36.5% [23/63] nasal, p < 0.001) and 14 days (71.4% [20/28] saliva vs 32.1% [9/28] nasal, p = 0.010) of symptoms. Additionally, saliva yielded lower cycle thresholds across all time periods, indicative of higher viral loads in saliva. In the longitudinal analysis, a log-rank test indicated that the survival curve for saliva differed significantly from that for nasal swabs (p < 0.001), with a median survival time of 18 days for saliva compared to 13 days for nasal swabs. We additionally performed saliva viral cultures in a similar COVID-19 patient cohort and noted patients with positive saliva viral cultures between 7 and 28 days of symptoms. These findings suggest that SARS-CoV-2 RNA persists longer and in higher abundance in saliva than in nasal swabs, with the potential for prolonged shedding of propagating virus. Testing saliva may thus increase the yield for detecting potentially infectious virus even beyond the first five days of symptomatic COVID-19.
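A sketch of the survival comparison described, using the lifelines library to fit Kaplan-Meier curves and run a log-rank test on simulated durations. The data, and the simplifying assumption of no censoring, are illustrative only; the paper's medians were 18 (saliva) vs 13 (nasal) days.

```python
# Sketch: time-to-first-negative-test, saliva vs nasal, with a log-rank test.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
n = 28  # patients with multiple time-points, as in the longitudinal analysis
saliva_days = rng.exponential(18, n)   # hypothetical days test-positive
nasal_days = rng.exponential(13, n)
observed = np.ones(n)                  # assume no censoring in this sketch

kmf = KaplanMeierFitter()
kmf.fit(saliva_days, observed, label="saliva")
print("saliva median:", kmf.median_survival_time_)
kmf.fit(nasal_days, observed, label="nasal")
print("nasal median:", kmf.median_survival_time_)

result = logrank_test(saliva_days, nasal_days,
                      event_observed_A=observed, event_observed_B=observed)
print("log-rank p =", result.p_value)
```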


Subject(s)
COVID-19 , Communicable Diseases , Humans , SARS-CoV-2/genetics , COVID-19/diagnosis , COVID-19 Testing , Saliva , RNA, Viral/genetics , Specimen Handling , Nasopharynx
5.
Environ Health ; 17(1): 43, 2018 05 02.
Article in English | MEDLINE | ID: mdl-29720177

ABSTRACT

BACKGROUND: Chronic lymphocytic leukemia (CLL) was the predominant leukemia in a recent study of Chornobyl cleanup workers from Ukraine exposed to radiation (UR-CLL). Radiation risks of CLL significantly increased with increasing bone marrow radiation doses. The current analysis aimed to clarify whether the increased risks were due to radiation or to genetic mutations in the Ukrainian population. METHODS: A detailed characterization of the genomic landscape was performed in a unique sample of 16 UR-CLL patients and age- and sex-matched unexposed general-population Ukrainian CLL (UN-CLL) and Western CLL (W-CLL) patients (n = 28 and 100, respectively). RESULTS: Mutations in the telomere-maintenance pathway genes POT1 and ATM were more frequent in UR-CLL compared to UN-CLL and W-CLL (both p < 0.05). No significant enrichment in copy-number abnormalities at del13q14, del11q, del17p or trisomy 12 was identified in UR-CLL compared to the other groups. Type of work performed in the Chornobyl zone, age at exposure and at diagnosis, calendar time, and Rai stage were significant predictors of total genetic lesions (all p < 0.05). Tumor telomere length was significantly longer in UR-CLL than in UN-CLL (p = 0.009) and was associated with POT1 mutation and survival. CONCLUSIONS: No significant enrichment in copy-number abnormalities at CLL-associated genes was identified in UR-CLL compared to the other groups. The novel associations between radiation exposure, telomere maintenance and CLL prognosis identified in this unique case series provide suggestive, though limited, data and merit further investigation.
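A sketch of the kind of mutation-frequency comparison reported (e.g., POT1 in UR-CLL vs W-CLL), here run as Fisher's exact test. The counts are hypothetical, and the abstract does not specify which test yielded its p < 0.05 results.

```python
# Sketch: compare POT1 mutation frequency between two CLL groups.
from scipy.stats import fisher_exact

ur_mutated, ur_total = 5, 16   # hypothetical POT1-mutated / total, UR-CLL
w_mutated, w_total = 6, 100    # hypothetical POT1-mutated / total, W-CLL

table = [[ur_mutated, ur_total - ur_mutated],
         [w_mutated, w_total - w_mutated]]
odds_ratio, p = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, p = {p:.4f}")
```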


Subject(s)
Chernobyl Nuclear Accident , Genome, Human/radiation effects , Leukemia, Lymphocytic, Chronic, B-Cell/epidemiology , Neoplasms, Radiation-Induced/epidemiology , Occupational Exposure , Radiation Exposure , Adult , Case-Control Studies , Female , Follow-Up Studies , Genomics , Humans , Incidence , Leukemia, Lymphocytic, Chronic, B-Cell/etiology , Male , Middle Aged , Neoplasms, Radiation-Induced/etiology , Prevalence , Radiation Dosage , Ukraine/epidemiology , Young Adult
6.
Hematol Oncol ; 35(2): 215-224, 2017 Jun.
Article in English | MEDLINE | ID: mdl-26806761

ABSTRACT

The recently demonstrated radiation induction of chronic lymphocytic leukemia (CLL) raises the question of whether the amount of radiation exposure influences any of the clinical characteristics of the disease. We evaluated the relationship between bone marrow radiation doses and the clinical characteristics and survival of 79 CLL cases diagnosed during 1986-2006 in a cohort of 110,645 male workers who participated in the cleanup work of the Chornobyl nuclear accident in Ukraine in 1986. All diagnoses were confirmed by an independent International Hematology Panel. Patients were followed up to the date of death or the end of follow-up on 31 October 2010. The median age at diagnosis was 57 years. The median bone marrow dose was 22.6 milligray (mGy) and was not associated with the time between exposure and clinical diagnosis of CLL (latent period), age, peripheral blood lymphocyte count or clinical stage of disease in univariate and multivariate analyses. The latent period was significantly shorter among those who were older at first exposure, smokers, and those with a higher frequency of visits to the doctor prior to diagnosis. A significant increase in the risk of death with increasing radiation dose was observed (p = 0.03; hazard ratio = 2.38, 95% confidence interval: 1.11-5.08, comparing doses ≥22 mGy to doses <22 mGy). After adjustment for radiation dose, survival of CLL cases was significantly shorter among those with younger age at first exposure, higher peripheral blood lymphocyte count, more advanced clinical stage of disease and older age at diagnosis (all p < 0.05). This is the first study to examine the association between bone marrow radiation doses from the Chornobyl accident and the clinical manifestations of CLL in Chornobyl cleanup workers. The current study provides new evidence on the association of radiation dose and younger age at first radiation exposure at Chornobyl with shorter survival after diagnosis. Future studies with more cases are necessary to improve the statistical power of these analyses and to determine their significance. Copyright © 2016 John Wiley & Sons, Ltd.
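A sketch of the dichotomized dose-survival analysis as a Cox proportional hazards model fit with lifelines. The simulated data, censoring fraction, and the choice of a Cox model are assumptions for illustration, not the study's exact method.

```python
# Sketch: survival vs bone marrow dose dichotomized at the 22 mGy median.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 79
high_dose = rng.integers(0, 2, n)                  # 1 if dose >= 22 mGy
# simulate shorter survival in the high-dose group (hypothetical scales)
surv_years = rng.exponential(np.where(high_dose == 1, 5.0, 10.0))
died = rng.random(n) < 0.8                         # some censoring

df = pd.DataFrame({"years": surv_years, "died": died, "high_dose": high_dose})
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```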


Subject(s)
Chernobyl Nuclear Accident , Leukemia, Lymphocytic, Chronic, B-Cell/etiology , Leukemia, Radiation-Induced/etiology , Occupational Exposure/adverse effects , Adult , Aged , Humans , Leukemia, Lymphocytic, Chronic, B-Cell/mortality , Leukemia, Radiation-Induced/mortality , Male , Middle Aged , Radiation Dosage
8.
Environ Health Perspect ; 121(1): 59-65, 2013 Jan.
Article in English | MEDLINE | ID: mdl-23149165

ABSTRACT

BACKGROUND: Risks of most types of leukemia from exposure to acute high doses of ionizing radiation are well known, but risks associated with protracted exposures, as well as associations between radiation and chronic lymphocytic leukemia (CLL), are not clear. OBJECTIVES: We estimated relative risks of CLL and non-CLL from protracted exposures to low-dose ionizing radiation. METHODS: A nested case-control study was conducted in a cohort of 110,645 Ukrainian cleanup workers of the 1986 Chornobyl nuclear power plant accident. Cases of incident leukemia diagnosed in 1986-2006 were confirmed by a panel of expert hematologists/hematopathologists. Controls were matched to cases on place of residence and year of birth. We estimated individual bone marrow radiation doses by the Realistic Analytical Dose Reconstruction with Uncertainty Estimation (RADRUE) method. We then used a conditional logistic regression model to estimate excess relative risk of leukemia per gray (ERR/Gy) of radiation dose. RESULTS: We found a significant linear dose response for all leukemia [137 cases, ERR/Gy = 1.26 (95% CI: 0.03, 3.58)]. There were nonsignificant positive dose responses for both CLL and non-CLL (ERR/Gy = 0.76 and 1.87, respectively). Our primary analysis excluded 20 cases with direct in-person interviews conducted < 2 years from the start of chemotherapy, a subgroup with an anomalous finding of ERR/Gy = -0.47 (95% CI: < -0.47, 1.02); for the remaining 117 cases, the ERR/Gy was 2.38 (95% CI: 0.49, 5.87). For CLL, the ERR/Gy was 2.58 (95% CI: 0.02, 8.43), and for non-CLL, the ERR/Gy was 2.21 (95% CI: 0.05, 7.61). Altogether, 16% of leukemia cases (18% of CLL, 15% of non-CLL) were attributed to radiation exposure. CONCLUSIONS: Exposure to low doses and low dose rates of radiation from post-Chornobyl cleanup work was associated with a significant increase in the risk of leukemia, which was statistically consistent with estimates for the Japanese atomic bomb survivors. Based on the primary analysis, we conclude that both CLL and non-CLL are radiosensitive.
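The attributable fraction follows arithmetically from the linear excess relative risk model, RR(d) = 1 + ERR/Gy × d: a case with dose d has probability ERR·d / (1 + ERR·d) of being radiation-induced. The sketch below averages this quantity over a simulated case dose distribution; the doses are hypothetical, so the output only approximates the reported 16%.

```python
# Sketch: attributable fraction under a linear ERR model, RR(d) = 1 + beta*d.
import numpy as np

rng = np.random.default_rng(3)
err_per_gy = 2.38                       # primary-analysis estimate from paper
# hypothetical case doses (Gy), roughly tens-of-mGy scale with a long tail
case_doses_gy = rng.lognormal(mean=np.log(0.05), sigma=1.2, size=117)

excess = err_per_gy * case_doses_gy
attributable = excess / (1 + excess)    # per-case probability of causation
print(f"attributable fraction = {attributable.mean():.0%}")
```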


Subject(s)
Chernobyl Nuclear Accident , Leukemia, Radiation-Induced/epidemiology , Occupational Exposure/adverse effects , Case-Control Studies , Humans , Lymphocytes/radiation effects , Radiation, Ionizing
9.
Ann Clin Lab Sci ; 41(1): 3-7, 2011.
Article in English | MEDLINE | ID: mdl-21325247

ABSTRACT

Current FDA regulations and AABB standards do not adequately protect the well-being of blood donors. Several practices have adverse consequences for donors, including an elevated incidence of donation-related reactions and injuries, iron deficiency anemia in premenopausal women, and inadequate counseling of donors to obtain medical follow-up for health risks identified during pre-donation health screening. These practices can be improved without negatively impacting the national blood supply. In addition to revising current blood collection operations, blood centers should explore the feasibility of establishing expanded donor health screening programs and determining their effectiveness in improving donor health, donor recruitment, and donor retention.


Subject(s)
Blood Donors , Blood Specimen Collection/standards , Health , Health Policy , Humans , Practice Guidelines as Topic
10.
South Med J ; 103(4): 343-6, 2010 Apr.
Article in English | MEDLINE | ID: mdl-20224480

ABSTRACT

In addition to assuring an adequate and safe blood supply, blood collection agencies are responsible for the well-being of donors. Several aspects of the current blood donor experience may negatively impact donor health and require modification. Physicians need to be aware of the health-related issues associated with blood donation in order to more effectively counsel patients participating in this critical community service and to manage patients referred to them by blood collection agencies.


Subject(s)
Blood Donors , Donor Selection/methods , Health Status , Phlebotomy/adverse effects , Referral and Consultation , Adolescent , Adult , Anemia/complications , Anemia/diagnosis , Blood Banks , Blood Donors/supply & distribution , Donor Selection/standards , Female , Humans , Hypertension/complications , Hypertension/diagnosis , Male , Middle Aged , Risk Factors , Syncope/complications , Syncope/diagnosis , Young Adult
11.
Ann Clin Lab Sci ; 39(2): 138-43, 2009.
Article in English | MEDLINE | ID: mdl-19429799

ABSTRACT

Given the paucity of published data regarding reaction rates in younger teenaged donors, we evaluated the reaction rates in all of our first-time teenaged donors after New York Blood Center lowered the minimum permissible age for blood donation from 17 to 16 yr in 2005. We calculated the overall rates of vasovagal reactions, and of those resulting in syncope, among donors aged 16 to 19 yr in 72,769 consecutive first-time whole blood donations, 3,822 double red cell donations, and 777 platelet apheresis donations. These rates were correlated with age and compared to those found in donors aged 20-29 yr. Separate rates were calculated by gender, age in yr, and donation type, and then compared to each other. The overall reaction rate among first-time teenaged whole blood donors was 8.2%, significantly greater than among plateletpheresis donors (4.0%; p < 0.0002). The rate in female whole blood donors (10.0%) was significantly higher than in males (6.4%; p < 0.0002). In male double red cell donors, the overall reaction rate of 3.5% was significantly lower than that found in male whole blood donors (p < 0.002). Among both male and female whole blood donors, a significant correlation with decreasing donor age between 19 and 16 yr was found (r² = 0.981, p = 0.01 and r² = 0.988, p = 0.006, respectively). We conclude that teenaged donors have higher reaction rates than adults, and that reaction rates increase with decreasing age. In addition, females have higher reaction rates than males. Finally, reaction rates associated with apheresis donations are significantly lower than those associated with whole blood donations.
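A sketch of the two kinds of statistics reported: a two-proportion z-test for the female vs male reaction rates, and a linear regression of reaction rate on age for the r² trend. All counts and per-age rates below are illustrative reconstructions from the percentages, not the study's data.

```python
# Sketch: two-proportion z-test and linear age trend in reaction rates.
from math import erf, sqrt
from scipy.stats import linregress

# two-proportion z-test, female 10.0% vs male 6.4% (hypothetical denominators)
x1, n1 = 3600, 36000            # female reactions / donations (illustrative)
x2, n2 = 2350, 36769            # male reactions / donations (illustrative)
p1, p2 = x1 / n1, x2 / n2
p_pool = (x1 + x2) / (n1 + n2)
z = (p1 - p2) / sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
p_two_sided = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
print(f"z = {z:.1f}, p = {p_two_sided:.4g}")

# linear trend of reaction rate on age, 16-19 yr (hypothetical rates)
ages = [16, 17, 18, 19]
rates = [11.5, 10.4, 9.3, 8.6]  # % reactions per age, illustrative
fit = linregress(ages, rates)
print(f"r^2 = {fit.rvalue**2:.3f}, p = {fit.pvalue:.3f}")
```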


Subject(s)
Blood Donors/psychology , Plateletpheresis/psychology , Syncope, Vasovagal/epidemiology , Adolescent , Age Factors , Blood Component Removal/adverse effects , Blood Component Removal/psychology , Blood Donors/classification , Erythrocyte Transfusion/adverse effects , Female , Humans , Male , Sex Characteristics , Young Adult
12.
Radiat Res ; 170(6): 711-20, 2008 Dec.
Article in English | MEDLINE | ID: mdl-19138038

ABSTRACT

Leukemia is one of the cancers most susceptible to induction by ionizing radiation, but the effects of lower doses delivered over time have not been quantified adequately. After the Chornobyl (Chernobyl) accident in Ukraine in April 1986, several hundred thousand workers who were involved in cleaning up the site and its surroundings received fractionated exposure, primarily from external gamma radiation. To increase our understanding of the role of protracted low-dose radiation exposure in the etiology of leukemia, we conducted a nested case-control study of leukemia in a cohort of cleanup workers identified from the Chornobyl State Registry of Ukraine. The analysis is based on 71 cases of histologically confirmed leukemia diagnosed in 1986-2000 and 501 age- and residence-matched controls selected from the same cohort. Study subjects or their proxies were interviewed about their cleanup activities and other relevant factors. Individual bone marrow radiation doses were estimated by the RADRUE dose reconstruction method (mean dose = 76.4 mGy, SD = 213.4 mGy). We used conditional logistic regression to estimate leukemia risks. The excess relative risk (ERR) of total leukemia was 3.44 per Gy [95% confidence interval (CI) 0.47-9.78, P < 0.01]. The dose response was linear and did not differ significantly by calendar period of first work in the 30-km Chornobyl zone or by duration or type of work. We found similar dose-response relationships for chronic and non-chronic lymphocytic leukemia [ERR = 4.09 per Gy (95% CI < 0-14.41) and 2.73 per Gy (95% CI < 0-13.50), respectively]. To further clarify these issues, we are extending the case-control study to ascertain cases for another 6 years (2001-2006).
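A sketch of a matched case-control analysis with conditional logistic regression, using statsmodels' ConditionalLogit on simulated matched sets. Note this fits the standard log-linear odds model (OR per Gy), not the linear ERR parameterization the study used; the data and set structure are hypothetical.

```python
# Sketch: conditional logistic regression on matched case-control sets.
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(4)
n_sets = 71                          # one case + ~7 controls per matched set
rows = []
for s in range(n_sets):
    for case in [1] + [0] * 7:
        # simulate somewhat higher doses among cases (hypothetical)
        dose_gy = rng.gamma(2.0, 0.04 * (1.5 if case else 1.0))
        rows.append({"set": s, "case": case, "dose_gy": dose_gy})
df = pd.DataFrame(rows)

model = ConditionalLogit(df["case"], df[["dose_gy"]], groups=df["set"])
result = model.fit()
print(result.summary())
print("OR per Gy:", float(np.exp(result.params["dose_gy"])))
```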


Subject(s)
Chernobyl Nuclear Accident , Environmental Restoration and Remediation , Leukemia/epidemiology , Leukemia/etiology , Neoplasms, Radiation-Induced/epidemiology , Occupational Exposure , Adult , Age Distribution , Aged , Case-Control Studies , Cohort Studies , Humans , Leukemia/pathology , Logistic Models , Male , Membrane Proteins , Middle Aged , Radiation Dosage , Risk Assessment , Time Factors , Tumor Suppressor Proteins , Ukraine/epidemiology , United States
13.
Int J Hematol ; 76(1): 55-60, 2002 Jul.
Article in English | MEDLINE | ID: mdl-12138896

ABSTRACT

In preparation for a possible large epidemiological study of radiation-related leukemia in Chernobyl clean-up workers of Ukraine, a histologic evaluation of 62 cases of leukemia and related disorders was conducted by a panel of expert hematologists and hematopathologists from the United States, France, and Ukraine. All cases were randomly selected from a surrogate population of men in the general population of 6 regions of Ukraine who were between the ages of 20 and 60 years in 1986 and were reported to have developed leukemia, myelodysplasia, or multiple myeloma between the years 1987 and 1998. The hematologists and hematopathologists on the panel were in agreement with one another, and with the previously reported diagnoses and classifications, for about 90% of the cases of acute and chronic leukemia in the study. These results suggest that strong reliance can be placed on the clinical diagnoses of acute and chronic forms of leukemia and multiple myeloma that have occurred in Ukrainian Chernobyl clean-up workers, provided that the diagnoses are supported by records of the patients having had adequate histologic bone marrow studies. The number of cases in this study with a diagnosis of myelodysplasia, however, was too small to draw firm conclusions.
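A sketch of how diagnostic-panel agreement of this kind can be quantified: raw percent agreement (the ~90% figure the paper reports) alongside Cohen's kappa, a chance-corrected alternative the paper does not report. Labels are simulated.

```python
# Sketch: percent agreement and Cohen's kappa for paired diagnoses.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(5)
categories = ["acute leukemia", "chronic leukemia", "MDS", "myeloma"]
original = rng.choice(categories, 62)       # previously reported diagnoses
panel = original.copy()
flip = rng.random(62) < 0.10                # ~10% simulated disagreement
panel[flip] = rng.choice(categories, flip.sum())

agreement = np.mean(original == panel)
kappa = cohen_kappa_score(original, panel)
print(f"percent agreement = {agreement:.0%}, kappa = {kappa:.2f}")
```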


Subject(s)
Leukemia/pathology , Multiple Myeloma/pathology , Myelodysplastic Syndromes/pathology , Adult , Epidemiologic Studies , Humans , Leukemia/epidemiology , Male , Middle Aged , Multiple Myeloma/epidemiology , Myelodysplastic Syndromes/epidemiology , Ukraine/epidemiology
14.
Pharmacotherapy ; 22(6): 677-85, 2002 Jun.
Article in English | MEDLINE | ID: mdl-12066958

ABSTRACT

STUDY OBJECTIVE: To compare point-of-care and standard hospital laboratory assays for monitoring patients receiving single or combination anticoagulant regimens. DESIGN: Prospective analysis. SETTING: Nursing units and clinics at a large community hospital. PATIENTS: One hundred fifty patients receiving anticoagulants for cardiac, vascular, orthopedic, or cancer indications. Thirty patients were enrolled in each treatment group: warfarin, enoxaparin, heparin, warfarin plus enoxaparin, and warfarin plus heparin. INTERVENTION: Capillary and venous blood samples were collected once from each patient for simultaneous measurement of the international normalized ratio (INR) and activated partial thromboplastin time (aPTT) by both assays. MEASUREMENTS AND MAIN RESULTS: Mean differences in paired INR and paired aPTT values by point-of-care and standard assays were small, but the 95% confidence intervals were wide. The INR differences were greater for the warfarin plus heparin group than for the warfarin or warfarin plus enoxaparin groups; clinical decision agreement was 47% for warfarin plus heparin, 73% for warfarin, and 93% for warfarin plus enoxaparin. The aPTT difference was greater for the warfarin plus heparin group than for the heparin group; however, clinical decision agreement (67% and 70%, respectively) was similar. CONCLUSIONS: Point-of-care methods showed limited agreement with standard hospital laboratory assays of coagulation for all treatment groups. For INR values, significantly greater disagreement was noted between the assay methods for the warfarin plus heparin group compared with the warfarin group, but agreement was similar for the warfarin and warfarin plus enoxaparin groups. Our data indicate that point-of-care assays should not be considered interchangeable with standard laboratory assays.
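A sketch of a standard method-comparison (Bland-Altman) analysis for paired point-of-care vs laboratory INR values: mean difference and 95% limits of agreement. The paired measurements are simulated, and note the paper reports mean differences with confidence intervals rather than limits of agreement per se.

```python
# Sketch: Bland-Altman bias and limits of agreement for paired INR values.
import numpy as np

rng = np.random.default_rng(6)
lab_inr = rng.normal(2.5, 0.8, 30)               # hypothetical lab INRs
poc_inr = lab_inr + rng.normal(0.05, 0.35, 30)   # small bias, wide scatter

diff = poc_inr - lab_inr
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"mean difference = {bias:.2f} INR units")
print(f"95% limits of agreement: {bias - loa:.2f} to {bias + loa:.2f}")
```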


Subject(s)
Anticoagulants/therapeutic use , Drug Monitoring/methods , Laboratories, Hospital , Point-of-Care Systems , Aged , Hospitals, Community/organization & administration , Humans , International Normalized Ratio , Partial Thromboplastin Time , Prospective Studies