Results 1 - 20 of 36
1.
Ann Behav Med ; 55(1): 82-88, 2021 02 12.
Article in English | MEDLINE | ID: mdl-33301024

ABSTRACT

BACKGROUND: Investigating antecedents of behaviors, such as wearing face coverings, is critical for developing strategies to prevent SARS-CoV-2 transmission. PURPOSE: The purpose of this study was to determine associations between theory-based behavioral predictors of intention to wear a face covering and actual wearing of a face covering in public. METHODS: Data from a cross-sectional panel survey of U.S. adults conducted in May and June 2020 (N = 1,004) were used to test a theory-based behavioral path model. We examined predictors of (a) intention to wear a face covering, (b) reported use of cloth face coverings, and (c) reported use of other face masks (e.g., a surgical mask or N95 respirator) in public. RESULTS: We found that being female, perceived importance of others wanting the respondent to wear a face covering, confidence to wear a face covering, and perceived importance of personal face covering use were positively associated with intention to wear a face covering in public. Intention to wear a face covering was positively associated with self-reported wearing of a cloth face covering if other people were observed wearing cloth face coverings in public at least "rarely" (aOR = 1.43), with stronger associations if they reported "sometimes" (aOR = 1.83), "often" (aOR = 2.32), or "always" (aOR = 2.96). For other types of face masks, a positive association between intention and behavior was only present when observing others wearing face masks "often" (aOR = 1.25) or "always" (aOR = 1.48). CONCLUSIONS: Intention to wear face coverings and observing other people wearing them are important behavioral predictors of adherence to the CDC recommendation to wear face coverings in public.
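A minimal sketch of the kind of analysis behind stratified adjusted odds ratios like those reported above: a logistic regression of self-reported wearing on intention, observed use by others, and their interaction. This is not the authors' path model; all variable names, codings, and the synthetic data are assumptions for illustration.

```python
# Hypothetical sketch: estimating adjusted odds ratios (aORs) for wearing a
# cloth face covering from intention and observed use by others, using a
# plain logistic regression rather than the authors' full path model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1004
df = pd.DataFrame({
    "intention": rng.integers(1, 6, n),  # assumed 1-5 Likert coding
    "observed_others": rng.choice(["never", "rarely", "sometimes", "often", "always"], n),
    "female": rng.integers(0, 2, n),
})
# Synthetic outcome so the example runs end to end.
logit_p = -3 + 0.4 * df["intention"] + 0.5 * df["female"]
df["wore_cloth"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# The interaction lets the intention-behavior association vary by how often
# others were observed wearing coverings, mirroring the stratified aORs.
fit = smf.logit("wore_cloth ~ intention * C(observed_others) + female", data=df).fit(disp=0)
print(np.exp(fit.params))  # exponentiated coefficients ~ adjusted odds ratios
```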


Subject(s)
COVID-19/prevention & control , Communicable Disease Control , Masks , Psychological Theory , Adult , Female , Humans , Male , Pandemics/prevention & control , Sex Factors , Social Norms , United States
2.
MMWR Morb Mortal Wkly Rep ; 69(28): 933-937, 2020 Jul 17.
Article in English | MEDLINE | ID: mdl-32673303

ABSTRACT

On April 3, 2020, the White House Coronavirus Task Force and CDC announced a new behavioral recommendation to help slow the spread of coronavirus disease 2019 (COVID-19) by encouraging the use of a cloth face covering when out in public (1). Widespread use of cloth face coverings has not been studied among the U.S. population, and therefore, little is known about encouraging the public to adopt this behavior. Immediately following the recommendation, an Internet survey sampled 503 adults during April 7-9 to assess their use of cloth face coverings and the behavioral and sociodemographic factors that might influence adherence to this recommendation. The same survey was administered 1 month later, during May 11-13, to another sample of 502 adults to assess changes in the prevalence estimates of use of cloth face coverings from April to May. Within days of the release of the first national recommendation for use of cloth face coverings, a majority of persons who reported leaving their home in the previous week reported using a cloth face covering (61.9%). Prevalence of use increased to 76.4% 1 month later, primarily associated with increases in use among non-Hispanic white persons (54.3% to 75.1%), persons aged ≥65 years (36.6% to 79.2%), and persons residing in the Midwest (43.7% to 73.8%). Rates were already high in April and remained high or increased further by May among non-Hispanic black persons (74.4% to 82.3%), Hispanic or Latino persons (77.3% to 76.2%), non-Hispanic persons of other races (70.8% to 77.3%), persons aged 18-29 years (70.1% to 74.9%) and 30-39 years (73.9% to 84.4%), and persons residing in the Northeast (76.9% to 87.0%). The use of a cloth face covering was associated with theory-derived constructs that indicate a favorable attitude toward them, intention to use them, ability to use them, social support for using them, and beliefs that they offered protection for self, others, and the community. Research is needed to understand possible barriers to using cloth face coverings and ways to promote their consistent and correct use among those who have yet to adopt this behavior.


Subject(s)
Coronavirus Infections/epidemiology , Coronavirus Infections/prevention & control , Masks/statistics & numerical data , Pandemics/prevention & control , Pneumonia, Viral/epidemiology , Pneumonia, Viral/prevention & control , Adolescent , Adult , Aged , COVID-19 , Ethnicity/statistics & numerical data , Female , Health Knowledge, Attitudes, Practice , Humans , Male , Middle Aged , Racial Groups/statistics & numerical data , Residence Characteristics/statistics & numerical data , Socioeconomic Factors , Surveys and Questionnaires , United States/epidemiology , Young Adult
3.
Am J Transplant ; 19(9): 2583-2593, 2019 09.
Article in English | MEDLINE | ID: mdl-30980600

ABSTRACT

To reduce the risk of HIV, hepatitis B virus (HBV), and hepatitis C virus (HCV) transmission through organ transplantation, donors are universally screened for these infections by nucleic acid tests (NAT). Deceased organ donors are classified as "increased risk" if they engaged in specific behaviors during the 12 months before death. We developed a model to estimate the risk of undetected infection for HIV, HBV, and HCV among NAT-negative donors specific to the type and timing of donors' potential risk behavior to guide revisions to the 12-month timeline. Model parameters were estimated, including risk of disease acquisition for increased risk groups, number of virions that multiply to establish infection, virus doubling time, and limit of detection by NAT. Monte Carlo simulation was performed. The risk of undetected infection was <1/1 000 000 for HIV after 14 days, for HBV after 35 days, and for HCV after 7 days from the time of most recent potential exposure to the day of a negative NAT. The period during which reported donor risk behaviors result in an "increased risk" designation can be safely shortened.
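A rough sketch of the window-period logic described above: an infection becomes NAT-detectable once viral load, growing exponentially from a small founder population, crosses the assay's limit of detection, and the residual risk is the chance a donor was infected too recently. All parameter values below are placeholders, not the study's estimates.

```python
# Illustrative sketch only; placeholder parameters, not the paper's estimates.
import numpy as np

def days_to_detectability(founder_virions=1.0, doubling_time_days=0.85,
                          detection_limit_copies_per_ml=50.0, plasma_volume_ml=3000.0):
    """Days from infection until total viral copies exceed the NAT detection threshold."""
    doublings = np.log2(detection_limit_copies_per_ml * plasma_volume_ml / founder_virions)
    return doublings * doubling_time_days

rng = np.random.default_rng(1)
n_sim = 1_000_000
per_act_risk = 0.0063                 # assumed acquisition risk for the reported behavior
days_from_exposure_to_test = 14

# Uncertainty in viral growth: sample a doubling time per simulated donor.
doubling = rng.normal(0.85, 0.10, n_sim).clip(0.5, 1.5)
detectable_after = days_to_detectability(doubling_time_days=doubling)
infected = rng.random(n_sim) < per_act_risk
undetected = infected & (days_from_exposure_to_test < detectable_after)
print("risk of infected but NAT-undetectable donor:", undetected.mean())
```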


Subject(s)
HIV Infections/transmission , Hepatitis B/transmission , Hepatitis C/transmission , Organ Transplantation/adverse effects , Organ Transplantation/standards , Risk Assessment/methods , Tissue Donors , DNA, Viral , Female , Humans , Male , Monte Carlo Method , Practice Guidelines as Topic , Probability , Reproducibility of Results , Risk-Taking , Substance Abuse, Intravenous , United States , United States Public Health Service
4.
Am J Public Health ; 114(2): 252-253, 2024 02.
Article in English | MEDLINE | ID: mdl-38335493
5.
Transpl Infect Dis ; 21(4): e13115, 2019 Aug.
Article in English | MEDLINE | ID: mdl-31102550

ABSTRACT

BACKGROUND: Between 2002 and 2013, the organs of 13 deceased donors with infectious encephalitis were transplanted, causing infections in 23 recipients. As a consequence, organs from donors showing symptoms of encephalitis (termed increased probability of infectious encephalitis, or IPIE, organs) might be declined. We had previously characterized the risk of IPIE organs using data available to most transplant teams and not requiring special diagnostic tests. If the probability of infection is low, the benefits of a transplant from a donor with suspected infectious encephalitis might outweigh the risk and could be lifesaving for some transplant candidates. METHODS: Using organ transplant data and Cox proportional hazards models, we determined liver donor and recipient characteristics predictive of post-transplant or waitlist survival and generated 5-year survival probability curves. We also calculated expected waiting times for an organ offer based on transplant candidate characteristics. Using a limited set of actual cases of infectious encephalitis transmission via transplant, we estimated post-transplant survival curves given an organ from an IPIE donor. RESULTS: 54% (1256) of patients registered from 2002-2006 who died or were removed from the waiting list because of deteriorated condition within 1 year could have had at least a marginal estimated benefit by accepting an IPIE liver with some probability of infection, a proportion that increased to 86% of patients if the probability of infection was low (5% or less). Additionally, 54% (1252) were removed from the waiting list prior to their estimated waiting time for a non-IPIE liver and could have benefited from an IPIE liver. CONCLUSION: Improved allocation and utilization of IPIE livers could be achieved by evaluating the patient-specific trade-offs between (a) accepting an IPIE liver and (b) remaining on the waitlist and accepting a non-IPIE liver after the estimated waiting time.
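A minimal sketch of the survival-modeling step described: fit a Cox proportional hazards model to post-transplant follow-up and produce a 5-year survival curve for a candidate profile, the kind of curve used to weigh accepting an IPIE liver now against waiting for a non-IPIE offer. The column names and synthetic data are hypothetical, not the registry variables used in the study.

```python
# Hypothetical sketch with synthetic data; not the study's model or variables.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "years": rng.exponential(4.0, n).clip(0.05, 10.0),  # follow-up time
    "died": rng.integers(0, 2, n),                       # event indicator
    "recipient_age": rng.normal(55, 10, n),
    "meld": rng.normal(22, 6, n),
    "ipie_donor": rng.integers(0, 2, n),                 # assumed flag for IPIE liver
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")

candidate = pd.DataFrame({"recipient_age": [60], "meld": [28], "ipie_donor": [1]})
surv = cph.predict_survival_function(candidate, times=np.linspace(0, 5, 61))
print(surv.tail())  # estimated survival probabilities out to 5 years for this profile
```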


Subject(s)
Infectious Encephalitis , Liver Transplantation/adverse effects , Models, Theoretical , Tissue Donors/statistics & numerical data , Tissue and Organ Procurement/standards , Humans , Liver Transplantation/mortality , Proportional Hazards Models , Risk Assessment , Risk Factors , Survival Rate
6.
Am J Public Health ; 113(10): 1074-1078, 2023 10.
Article in English | MEDLINE | ID: mdl-37672741

Subject(s)
COVID-19 , Masks , Humans
7.
Transpl Infect Dis ; 20(5): e12933, 2018 Oct.
Article in English | MEDLINE | ID: mdl-29809311

ABSTRACT

BACKGROUND: There were 13 documented clusters of infectious encephalitis transmission via organ transplant from deceased donors to recipients during 2002-2013. Hence, organs from donors diagnosed with encephalitis are often declined because of concerns about the possibility of infection, given that there is no quick and simple test to detect causes of infectious encephalitis. METHODS: We constructed a database containing cases of infectious and non-infectious encephalitis. Using statistical imputation, cross-validation, and regression techniques, we determined deceased organ donor characteristics, including demographics, signs, symptoms, physical exam, and laboratory findings, predictive of infectious vs non-infectious encephalitis, and developed a calculator which assesses the risk of infection. RESULTS: Using up to 12 predictive patient characteristics (with a minimum of 3, depending on what information is available), the calculator provides the probability that a donor may have infectious vs non-infectious encephalitis, improving the prediction accuracy over current practices. These characteristics include gender, fever, immunocompromised state (other than HIV), cerebrospinal fluid elevation, altered mental status, psychiatric features, cranial nerve abnormality, meningeal signs, focal motor weakness, Babinski's sign, movement disorder, and sensory abnormalities. CONCLUSION: In the absence of definitive diagnostic testing in a potential organ donor, infectious encephalitis can be predicted with a risk score. The risk calculator presented in this paper represents a prototype, establishing a framework that can be expanded to other infectious diseases transmissible through solid organ transplantation.
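A minimal sketch of how a logistic risk calculator of the sort described can turn a handful of bedside findings into a probability of infectious versus non-infectious encephalitis. The coefficients below are invented for illustration; the published calculator's weights, variable set, and handling of missing predictors are not reproduced here.

```python
# Hypothetical risk calculator; invented weights, for illustration only.
import math

COEFS = {  # positive = more suggestive of infectious encephalitis (assumed)
    "intercept": -1.2,
    "fever": 1.1,
    "csf_pleocytosis": 0.9,
    "immunocompromised": 0.7,
    "psychiatric_features": -0.4,
    "focal_motor_weakness": -0.3,
}

def infectious_encephalitis_probability(findings: dict) -> float:
    """findings maps feature name -> 0/1; missing features are treated as absent."""
    z = COEFS["intercept"] + sum(COEFS[k] * findings.get(k, 0)
                                 for k in COEFS if k != "intercept")
    return 1.0 / (1.0 + math.exp(-z))

print(infectious_encephalitis_probability({"fever": 1, "csf_pleocytosis": 1}))
```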


Subject(s)
Disease Transmission, Infectious/prevention & control , Donor Selection/standards , Infectious Encephalitis/epidemiology , Organ Transplantation/adverse effects , Tissue Donors/statistics & numerical data , Adult , Clinical Decision-Making/methods , Decision Support Techniques , Disease Transmission, Infectious/statistics & numerical data , Female , Humans , Infectious Encephalitis/etiology , Infectious Encephalitis/prevention & control , Male , Middle Aged , Models, Biological , Organ Transplantation/methods , Risk Assessment/methods , Young Adult
8.
Transpl Infect Dis ; 19(2)2017 Apr.
Article in English | MEDLINE | ID: mdl-28178393

ABSTRACT

BACKGROUND: In 2013, guidelines were released for reducing the risk of viral bloodborne pathogen transmission through organ transplantation. Eleven criteria were described that result in a donor being designated at increased infectious risk. Human immunodeficiency virus (HIV) and hepatitis C virus (HCV) transmission risk from an increased-risk donor (IRD), despite negative nucleic acid testing (NAT), likely varies based on behavior type and timing. METHODS: We developed a Monte Carlo risk model to quantify the probability of HIV among IRDs. The model included NAT performance, viral load dynamics, and per-act risk of acquiring HIV by each behavior. The model also quantified the probability of HCV among IRDs with non-medical intravenous drug use (IVDU). RESULTS: The highest risk was among donors with a history of unprotected, receptive anal male-to-male intercourse with a partner of unknown HIV status (MSM), followed by sex with an HIV-infected partner, IVDU, and sex with a commercial sex worker. CONCLUSION: With NAT screening, the estimated risk of undetected HIV remains small even at 1 day following a risk behavior. The estimated risk for HCV transmission through IVDU is likewise small and decreases more quickly with time owing to the faster viral growth dynamics of HCV compared with HIV. These findings may allow for improved organ allocation, utilization, and recipient informed consent.
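A sketch of the Monte Carlo structure this kind of model uses: for each behavior, sample whether infection was acquired from a per-act risk, sample viral growth, and count donors who would be infected yet still below the NAT detection threshold on the day of screening. The per-act risks and growth parameters below are placeholders, not the paper's inputs.

```python
# Structural sketch only; placeholder per-act risks and growth parameters.
import numpy as np

rng = np.random.default_rng(3)
PER_ACT_RISK = {  # assumed per-act HIV acquisition probabilities, for illustration
    "receptive_anal_msm": 0.0138,
    "sex_with_hiv_pos_partner": 0.0008,
    "ivdu": 0.0063,
    "sex_with_csw": 0.0004,
}

def undetected_risk(per_act_risk, days_since_behavior, n_sim=2_000_000,
                    doubling_time=(0.85, 0.10), doublings_to_detect=17.0):
    doubling = rng.normal(*doubling_time, n_sim).clip(0.5, 1.5)
    days_to_detect = doublings_to_detect * doubling
    infected = rng.random(n_sim) < per_act_risk
    return float(np.mean(infected & (days_since_behavior < days_to_detect)))

for behavior, risk in PER_ACT_RISK.items():
    print(behavior, undetected_risk(risk, days_since_behavior=1))
```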


Subject(s)
Allografts/virology , Disease Transmission, Infectious/statistics & numerical data , HIV Infections/epidemiology , Hepatitis C/epidemiology , Models, Theoretical , Tissue Donors/statistics & numerical data , Blood-Borne Pathogens , HIV/isolation & purification , Hepacivirus/isolation & purification , Humans , Nucleic Acid Amplification Techniques , Practice Guidelines as Topic , RNA, Viral/isolation & purification , Risk , Risk-Taking , Serologic Tests , Sex Work , Time Factors , Tissue Donors/psychology , Tissue and Organ Harvesting/standards , Viral Load
9.
Am J Trop Med Hyg ; 111(3): 490-497, 2024 Sep 04.
Article in English | MEDLINE | ID: mdl-38981503

ABSTRACT

Malaria continues to be a major source of morbidity and mortality in sub-Saharan Africa. Timely, accurate, and effective case management is critical to malaria control. Proactive community case management (ProCCM) is a new strategy in which a community health worker "sweeps" a village, visiting households at defined intervals to proactively provide diagnostic testing and treatment if indicated. Pilot experiments have shown the potential of ProCCM for controlling malaria transmission; identifying the best strategy for administering ProCCM in terms of interval timings and number of sweeps could lead to further reductions in malaria infections. We developed an agent-based simulation to model malaria transmission and the impact of various ProCCM strategies. The model was validated using symptomatic prevalence data from a ProCCM pilot study in Senegal. Various ProCCM strategies were tested to evaluate the potential for reducing parasitologically confirmed symptomatic malaria cases in the Senegal setting. We found that weekly ProCCM sweeps during a 21-week transmission season could reduce cases by 36.3% per year compared with no sweeps. Alternatively, two initial fortnightly sweeps, seven weekly sweeps, and finally four fortnightly sweeps (13 sweeps total) could reduce confirmed malaria cases by 30.5% per year while reducing the number of diagnostic tests and corresponding costs by about 33%. Under a highly seasonal transmission setting, starting the sweeps early with longer duration and higher frequency would increase the impact of ProCCM, though with diminishing returns. The model is flexible and allows decision-makers to evaluate implementation strategies incorporating sweep frequency, time of year, and available budget.
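A toy agent-based sketch of the sweep idea described above: individuals face a daily infection risk proportional to current prevalence, and a sweep on scheduled days tests everyone and clears detected infections, shortening the infectious period. Parameters, population size, and schedule are illustrative and not calibrated to the Senegal setting.

```python
# Toy agent-based sketch of ProCCM sweeps; illustrative parameters only.
import numpy as np

def simulate(sweep_days, n_people=1000, days=147, beta=0.12, recover=1/20,
             test_sensitivity=0.9, seed=4):
    rng = np.random.default_rng(seed)
    infected = np.zeros(n_people, dtype=bool)
    infected[rng.choice(n_people, 20, replace=False)] = True  # seed infections
    new_cases = 0
    for day in range(days):                                   # ~21-week season
        prevalence = infected.mean()
        newly = (~infected) & (rng.random(n_people) < beta * prevalence)
        new_cases += newly.sum()
        infected |= newly
        infected &= rng.random(n_people) >= recover           # natural clearance
        if day in sweep_days:                                 # proactive sweep
            detected = infected & (rng.random(n_people) < test_sensitivity)
            infected &= ~detected                             # treated and cleared
    return new_cases

no_sweeps = simulate(sweep_days=set())
weekly = simulate(sweep_days=set(range(0, 147, 7)))
print("cases without sweeps:", no_sweeps, "| with weekly sweeps:", weekly)
```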


Subject(s)
Case Management , Malaria , Humans , Malaria/prevention & control , Malaria/epidemiology , Malaria/transmission , Malaria/diagnosis , Malaria/drug therapy , Africa South of the Sahara/epidemiology , Senegal/epidemiology , Child , Child, Preschool , Prevalence , Community Health Workers , Adolescent , Adult , Infant , Pilot Projects , Community Health Services , Female , Male , Models, Theoretical
10.
Sci Rep ; 13(1): 6164, 2023 04 15.
Article in English | MEDLINE | ID: mdl-37061525

ABSTRACT

With over 100,000 patients on the kidney transplant waitlist in 2019, it is important to understand if and how the functional status of a patient may change while on the waitlist. Recorded both at registration and just prior to transplantation, the Karnofsky Performance Score measures a patient's functional status and takes on values ranging from 0 to 100 in increments of 10. Using machine learning techniques, we built a gradient boosting regression model to predict a patient's pre-transplant functional status based on information known at the time of waitlist registration. The model's predictions result in an average root mean squared error of 12.99 based on five rolling-origin cross-validations and 12.94 in a separate out-of-time test. In comparison, predicting that the pre-transplant functional status remains the same as the status at registration results in average root mean squared errors of 14.50 and 14.11, respectively. The analysis is based on 118,401 transplant records from 2007 to 2019. To the best of our knowledge, there has been no previously published research on building a model to predict kidney pre-transplant functional status. We also find that functional status at registration and total serum albumin have the most impact in predicting the pre-transplant functional status.
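A minimal sketch of the modeling setup described: gradient boosting regression of pre-transplant functional status on registration-time features, compared against the naive "carry the registration score forward" baseline using RMSE. Feature names and the synthetic data are assumptions; the split shown is a simple stand-in for a rolling-origin evaluation.

```python
# Hypothetical sketch with synthetic data; not the study's features or splits.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(5)
n = 5000
X = pd.DataFrame({
    "kps_at_registration": rng.choice(range(10, 101, 10), n),
    "albumin": rng.normal(3.8, 0.6, n),
    "age": rng.normal(52, 13, n),
    "years_on_dialysis": rng.exponential(2.0, n),
})
y = (X["kps_at_registration"] - 3 * X["years_on_dialysis"]
     + 4 * (X["albumin"] - 3.8) + rng.normal(0, 12, n)).clip(0, 100)

X_train, X_test = X.iloc[:4000], X.iloc[4000:]   # stand-in for a rolling-origin split
y_train, y_test = y.iloc[:4000], y.iloc[4000:]

model = GradientBoostingRegressor(n_estimators=300, max_depth=3, learning_rate=0.05)
model.fit(X_train, y_train)

rmse_model = np.sqrt(mean_squared_error(y_test, model.predict(X_test)))
rmse_carry = np.sqrt(mean_squared_error(y_test, X_test["kps_at_registration"]))
print(f"model RMSE {rmse_model:.2f} vs carry-forward baseline {rmse_carry:.2f}")
```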


Subject(s)
Kidney Transplantation , Humans , Functional Status , Karnofsky Performance Status , Waiting Lists
11.
Public Health Rep ; 138(2): 241-247, 2023.
Article in English | MEDLINE | ID: mdl-36416100

ABSTRACT

OBJECTIVE: High-quality scientific evidence underpins public health decision making. The Centers for Disease Control and Prevention (CDC) provides scientific data, including during public health emergencies. To understand CDC's contributions to COVID-19 science, we conducted a bibliometric evaluation of publications authored by CDC scientists from January 20, 2020, through January 20, 2022, by using a quality improvement approach (SQUIRE 2.0). METHODS: We catalogued COVID-19 articles with ≥1 CDC-affiliated author published in a scientific journal and indexed in the World Health Organization's COVID-19 database. We identified priority topic areas from the agency's COVID-19 Public Health Science Agenda by using keyword scripts in EndNote and then assessed the impact of the published articles by using Scopus and Altmetric. RESULTS: During the first 2 years of the agency's pandemic response, CDC authors contributed to 1044 unique COVID-19 scientific publications in 208 journals. Publication topics included testing (n = 853, 82%); prevention strategies (n = 658, 63%); natural history, transmission, breakthrough infections, and reinfections (n = 587, 56%); vaccines (n = 567, 54%); health equity (n = 308, 30%); variants (n = 232, 22%); and post-COVID-19 conditions (n = 44, 4%). Publications were cited 40,427 times and received 81,921 news reports and 1,058,893 social media impressions. As the pandemic evolved, CDC adapted to address new scientific questions, including vaccine effectiveness, safety, and access; viral variants, including Delta and Omicron; and health equity. CONCLUSION: The agency's COVID-19 Public Health Science Agenda helped guide impactful scientific activities. CDC continues to evaluate COVID-19 priority topic areas and contribute to development of new scientific work. CDC is committed to monitoring emerging issues and addressing gaps in evidence needed to improve health.
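An illustrative sketch of the kind of keyword-based topic tagging the agenda-area counts imply. The study used keyword scripts in EndNote against a WHO COVID-19 database; the keyword lists and records below are invented stand-ins.

```python
# Illustrative keyword tagging of citation records; keywords are invented.
import re

TOPIC_KEYWORDS = {
    "vaccines": [r"\bvaccin", r"\bimmuniz"],
    "variants": [r"\bvariant", r"\bdelta\b", r"\bomicron\b"],
    "health_equity": [r"\bequity", r"\bdisparit"],
}

def tag_topics(text):
    text = text.lower()
    return {topic for topic, patterns in TOPIC_KEYWORDS.items()
            if any(re.search(p, text) for p in patterns)}

records = [
    "Effectiveness of mRNA vaccines against the Omicron variant",
    "Racial disparities in SARS-CoV-2 testing access",
]
for r in records:
    print(tag_topics(r), "-", r)
```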


Subject(s)
COVID-19 , United States/epidemiology , Humans , COVID-19/epidemiology , COVID-19/prevention & control , Public Health , Bibliometrics , Pandemics/prevention & control , Centers for Disease Control and Prevention, U.S.
12.
Sci Rep ; 12(1): 8630, 2022 05 23.
Article in English | MEDLINE | ID: mdl-35606393

ABSTRACT

We expanded a published mathematical model of SARS-CoV-2 transmission with complex, age-structured transmission and with laboratory-derived source and wearer protection efficacy estimates for a variety of face masks to estimate their impact on COVID-19 incidence and related mortality in the United States. The model was also improved to allow realistic age-structured transmission with a pre-specified R0 of transmission, and to include more compartments and parameters, e.g., for groups such as detected and undetected asymptomatic infectious cases who mask up at different rates. When masks are used at typically observed population rates of 80% for those ≥ 65 years and 60% for those < 65 years, face masks are associated with 69% (cloth) to 78% (medical procedure mask) reductions in cumulative COVID-19 infections and 82% (cloth) to 87% (medical procedure mask) reductions in related deaths over a 6-month timeline in the model, assuming a basic reproductive number of 2.5. If cloth or medical procedure masks' source control and wearer protection efficacies are boosted about 30% each to 84% and 60% by cloth over medical procedure masking, fitters, or braces, the COVID-19 basic reproductive number of 2.5 could be reduced to an effective reproductive number ≤ 1.0, and from 6.0 to 2.3 for a variant of concern similar to delta (B.1.617.2). For variants of concern similar to omicron (B.1.1.529) or the sub-lineage BA.2, modeled reductions in the effective reproduction number due to similarly high-quality, high-prevalence mask wearing are more modest (to 3.9 and 5.0 from an R0 = 10.0 and 13.0, respectively). Nonetheless, the ratio of incident risk for masked vs. non-masked populations still shows a benefit of wearing masks even with the higher R0 variants.
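A back-of-the-envelope version of the mask effect on the reproduction number: scale R0 by the average reduction in onward transmission (source control) for masked infectors and the average reduction in acquisition (wearer protection) for masked susceptibles, weighted by coverage. The efficacy and coverage values echo the abstract, but the simple mass-action formula and the assumed population age split are simplifications and will not reproduce the full age-structured model's exact numbers.

```python
# Crude approximation of the mask effect on R; not the paper's compartmental model.
def effective_r(r0, coverage, source_efficacy, wearer_efficacy):
    onward = 1 - coverage * source_efficacy    # average infectiousness multiplier
    acquire = 1 - coverage * wearer_efficacy   # average susceptibility multiplier
    return r0 * onward * acquire

# Population-average coverage: assume ~17% of people are >=65 (80% use) and
# the rest are <65 (60% use) -- the age split is an assumption.
coverage = 0.17 * 0.80 + 0.83 * 0.60
for r0, label in [(2.5, "ancestral-like"), (6.0, "delta-like"), (10.0, "omicron-like")]:
    r_eff = effective_r(r0, coverage, source_efficacy=0.84, wearer_efficacy=0.60)
    print(f"{label}: R0={r0} -> approx effective R {r_eff:.2f}")
```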


Subject(s)
COVID-19 , SARS-CoV-2 , COVID-19/epidemiology , COVID-19/prevention & control , Humans , Masks , Textiles , United States/epidemiology
13.
BMC Bioinformatics ; 11: 512, 2010 Oct 13.
Article in English | MEDLINE | ID: mdl-20942945

ABSTRACT

BACKGROUND: Surface enhanced laser desorption/ionization time-of-flight mass spectrometry (SELDI) is a proteomics tool for biomarker discovery and other high throughput applications. Previous studies have identified various areas for improvement in preprocessing algorithms used for protein peak detection. Bottom-up approaches to preprocessing that emphasize modeling SELDI data acquisition are promising avenues of research to find the needed improvements in reproducibility. RESULTS: We studied the properties of the SELDI detector intensity response to matrix-only runs. The intensity fluctuations and noise observed can be characterized by a natural exponential family with quadratic variance function (NEF-QVF) class of distributions. These include as special cases many common distributions arising in practice (e.g., normal and Poisson). Taking this model into account, we present a modified Antoniadis-Sapatinas wavelet denoising algorithm as the core of our preprocessing program, implemented in MATLAB. The proposed preprocessing approach shows superior peak detection sensitivity compared to MassSpecWavelet for false discovery rate (FDR) values less than 25%. CONCLUSIONS: The NEF-QVF detector model requires that certain parameters be measured from matrix-only spectra, which has implications for experiment design and comes at the trade-off of slightly increased cost. These additional measurements allow our preprocessing program to adapt to changing noise characteristics arising from intralaboratory and across-laboratory factors. With further development, this approach may lead to improved peak prediction reproducibility and nearly automated, high throughput preprocessing of SELDI data.
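A generic wavelet-denoising sketch illustrating the decompose, threshold, reconstruct pattern such preprocessing relies on. This uses PyWavelets with simple soft thresholding and a universal threshold; it is not the modified Antoniadis-Sapatinas procedure, the NEF-QVF variance model, or the MATLAB implementation described in the abstract.

```python
# Generic wavelet denoising of a synthetic spectrum-like signal; not the paper's method.
import numpy as np
import pywt

rng = np.random.default_rng(6)
t = np.linspace(0, 1, 4096)
clean = np.exp(-((t - 0.3) / 0.01) ** 2) + 0.6 * np.exp(-((t - 0.7) / 0.02) ** 2)  # two "peaks"
noisy = clean + rng.normal(0, 0.05, t.size)

coeffs = pywt.wavedec(noisy, "sym8", level=6)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise scale from finest detail level
thresh = sigma * np.sqrt(2 * np.log(noisy.size))      # universal threshold
denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(denoised_coeffs, "sym8")[: t.size]
print("residual std after denoising:", np.std(denoised - clean))
```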


Subject(s)
Algorithms , Proteomics/methods , Spectrometry, Mass, Matrix-Assisted Laser Desorption-Ionization/methods , Data Interpretation, Statistical , Proteins/analysis , Proteins/chemistry , Proteome/analysis , Proteome/chemistry
14.
Proteomics ; 9(7): 1754-62, 2009 Apr.
Article in English | MEDLINE | ID: mdl-19294696

ABSTRACT

SELDI protein profiling experiments can be used as a first step in studying the pathogenesis of various diseases such as cancer. A plethora of software packages is available for preprocessing SELDI data, each with many options and written from different signal processing perspectives, offering many researchers choices they may not have the background or desire to make. Moreover, several studies have shown that mistakes in the preprocessing of the data can bias the biological interpretation of the study. For this reason, we conduct a large-scale evaluation of available signal processing techniques to establish which are most effective. We use data generated from a standard, published simulation engine so that "truth" is known. We select the top algorithms by considering two logical performance metrics, and give our recommendations for research directions that are likely to be most promising. There is considerable opportunity for future contributions improving the signal processing of SELDI spectra.
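A minimal sketch of the scoring step when "truth" is known from simulation: match each pipeline's detected peak locations to the true peak locations within an m/z tolerance and report sensitivity and false discovery rate. The peak lists and tolerance below are invented; the study used a published simulation engine and its own metrics.

```python
# Illustrative scoring of detected peaks against simulated truth.
def score_peaks(detected, true_peaks, tol=3.0):
    matched_true = {t for t in true_peaks if any(abs(d - t) <= tol for d in detected)}
    true_positives = sum(any(abs(d - t) <= tol for t in true_peaks) for d in detected)
    sensitivity = len(matched_true) / len(true_peaks)
    fdr = 1 - true_positives / len(detected) if detected else 0.0
    return sensitivity, fdr

truth = [1020.0, 2455.0, 5120.0, 7800.0]          # simulated true peak m/z values
pipelines = {
    "pipeline_A": [1021.5, 2454.0, 5200.0],
    "pipeline_B": [1019.0, 2456.5, 5121.0, 7801.0, 9000.0],
}
for name, detected in pipelines.items():
    sens, fdr = score_peaks(detected, truth)
    print(f"{name}: sensitivity={sens:.2f}, FDR={fdr:.2f}")
```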


Subject(s)
Proteins/chemistry , Proteomics/methods , Signal Processing, Computer-Assisted , Software , Spectrometry, Mass, Matrix-Assisted Laser Desorption-Ionization , Algorithms , Databases, Protein , Models, Biological , Sensitivity and Specificity
15.
PLoS One ; 14(1): e0209068, 2019.
Article in English | MEDLINE | ID: mdl-30625130

ABSTRACT

We used an ensemble of statistical methods to build a model that predicts kidney transplant survival and identifies important predictive variables. The proposed model achieved better performance, measured by Harrell's concordance index, than the Estimated Post Transplant Survival model used in the kidney allocation system in the U.S., and other models published recently in the literature. The model has a five-year concordance index of 0.724 (in comparison, the concordance index is 0.697 for the Estimated Post Transplant Survival model, the state of the art currently in use). It combines predictions from random survival forests with a Cox proportional hazards model. The rankings of importance for the model's variables differ by transplant recipient age. Better survival predictions could eventually lead to more efficient allocation of kidneys and improve patient outcomes.
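A sketch of the ensembling and evaluation idea: standardize risk scores from two survival models (for example, a random survival forest and a Cox model, assumed to have been fit already), average them, and compare Harrell's concordance index. The risk scores below are synthetic placeholders standing in for model predictions.

```python
# Sketch of combining two survival models' risk scores and scoring with Harrell's C.
import numpy as np
from lifelines.utils import concordance_index

rng = np.random.default_rng(7)
n = 2000
true_risk = rng.normal(0, 1, n)
event_times = rng.exponential(np.exp(-true_risk) * 5)   # higher risk -> earlier failure
observed = rng.random(n) < 0.7                          # ~30% censoring

rsf_risk = true_risk + rng.normal(0, 0.8, n)            # stand-in for forest predictions
cox_risk = true_risk + rng.normal(0, 0.8, n)            # stand-in for Cox predictions

def zscore(x):
    return (x - x.mean()) / x.std()

ensemble_risk = 0.5 * zscore(rsf_risk) + 0.5 * zscore(cox_risk)
for name, risk in [("forest", rsf_risk), ("Cox", cox_risk), ("ensemble", ensemble_risk)]:
    # concordance_index expects scores where higher = longer survival, so negate risk
    print(name, round(concordance_index(event_times, -risk, observed), 3))
```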


Subject(s)
Kidney Transplantation , Machine Learning , Graft Survival , Humans , Models, Statistical , Proportional Hazards Models , Transplant Recipients
16.
BMC Res Notes ; 12(1): 767, 2019 Nov 25.
Article in English | MEDLINE | ID: mdl-31767032

ABSTRACT

OBJECTIVE: To advance public health support for the U.S. Department of Housing and Urban Development's smoke-free rule, the Centers for Disease Control and Prevention collaborated with the Georgia Institute of Technology to develop a geospatial mapping tool. The objective was to create a tool state and local public health agencies could use to tailor smoke-free educational materials and cessation interventions for specific public housing development resident populations. RESULTS: The resulting "Extinguish Tool" includes an interactive map of U.S. public housing developments (PHDs) and healthcare facilities that provides detailed information on individual PHDs, their proximity to existing healthcare facilities, and the demographic characteristics of residents. The tool also estimates the number of PHD residents who smoke cigarettes and calculates crude estimates of the potential economic benefits of providing cessation interventions to these residents. The geospatial mapping tool project serves as an example of a collaborative and innovative public health approach to protecting the health and well-being of the nation's two million public housing residents, including 760,000 children, from the harms of tobacco smoking and secondhand smoke exposure in the places where they live, play, and gather.
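A sketch of the crude arithmetic such a tool performs: estimated smokers as residents times an assumed smoking prevalence, and a rough economic benefit as expected quits times an assumed per-quit saving. Every number below is an illustrative placeholder, not a value from the Extinguish Tool.

```python
# Illustrative arithmetic only; all inputs are placeholders.
residents = 1200
adult_share = 0.75
smoking_prevalence = 0.23          # assumed adult smoking prevalence for this population
quit_rate_with_intervention = 0.07
annual_savings_per_quit = 2000     # assumed cost avoided per person who quits

smokers = residents * adult_share * smoking_prevalence
expected_quits = smokers * quit_rate_with_intervention
print(f"estimated adult smokers: {smokers:.0f}")
print(f"expected quits: {expected_quits:.0f}, "
      f"crude annual benefit: ${expected_quits * annual_savings_per_quit:,.0f}")
```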


Subject(s)
Public Health/education , Public Housing/standards , Smoke-Free Policy , Tobacco Smoke Pollution/prevention & control , Adolescent , Adult , Aged , Aged, 80 and over , Biobehavioral Sciences , Demography , Female , Geographic Information Systems , Humans , Male , Middle Aged , Surveys and Questionnaires , Tobacco Smoke Pollution/adverse effects , United States
17.
J Clin Endocrinol Metab ; 93(3): 703-9, 2008 Mar.
Article in English | MEDLINE | ID: mdl-18160468

ABSTRACT

CONTEXT: A substantial body of research on the pathophysiology of chronic fatigue syndrome (CFS) has focused on hypothalamic-pituitary-adrenal axis dysregulation. The cortisol awakening response has received particular attention as a marker of hypothalamic-pituitary-adrenal axis dysregulation. OBJECTIVE: The objective of the current study was to evaluate morning salivary cortisol profiles in persons with CFS and well controls identified from the general population. DESIGN AND SETTING: We conducted a case-control study at an outpatient research clinic. CASES AND OTHER PARTICIPANTS: We screened a sample of 19,381 residents of Georgia and identified those with CFS and a matched sample of well controls. Seventy-five medication-free CFS cases and 110 medication-free well controls provided complete sets of saliva samples. MAIN OUTCOME MEASURES: We assessed free cortisol concentrations in saliva collected on a regular workday immediately upon awakening and 30 and 60 min after awakening. RESULTS: There was a significant interaction effect, indicating different profiles of cortisol concentrations over time between groups, with the CFS group showing an attenuated morning cortisol profile. Notably, we observed a sex difference in this effect. Women with CFS exhibited significantly attenuated morning cortisol profiles compared with well women. In contrast, cortisol profiles were similar in men with CFS and male controls. CONCLUSIONS: CFS was associated with an attenuated morning cortisol response, but the effect was limited to women. Our results suggest that a sex difference in hypocortisolism may contribute to increased risk of CFS in women.


Subject(s)
Fatigue Syndrome, Chronic/metabolism , Hydrocortisone/analysis , Saliva/chemistry , Adult , Case-Control Studies , Female , Humans , Male , Middle Aged , Sex Characteristics
18.
Mol Immunol ; 44(13): 3445-52, 2007 Jul.
Article in English | MEDLINE | ID: mdl-17467056

ABSTRACT

Myeloma and Chinese hamster ovary (CHO) cells are frequently used for the production of recombinant antibodies. With increasing interest in producing recombinant IgA for protection against infectious agents, it is essential to characterize the IgA produced in these cells. Here we show that while myeloma cells secrete IgA2m(2) predominantly as H(2)L(2), CHO cells secrete H(2)L and H(2) in addition to fully assembled H(2)L(2). When the CHO cells also synthesize J chain and secretory component (SC), polymeric IgA and secretory IgA in which SC is disulfide bonded to the polymeric IgA are produced. Blocking cysteines on purified IgA2m(2) protein by alkylating with iodoacetamide stabilizes the disulfide bonds between the H and L chains, suggesting that these bonds are unstable. Taken together, our results suggest that the covalent assembly of IgA2m(2) is different in myeloma and CHO cells.


Subject(s)
Immunoglobulin A/classification , Immunoglobulin A/metabolism , Immunoglobulin Allotypes/metabolism , Protein Processing, Post-Translational , Animals , CHO Cells , Cell Line, Tumor , Cricetinae , Cricetulus , Immunoglobulin A/genetics , Immunoglobulin Allotypes/genetics , Immunoglobulin Heavy Chains/genetics , Immunoglobulin Heavy Chains/metabolism , Immunoglobulin Light Chains/genetics , Immunoglobulin Light Chains/metabolism , Multiple Myeloma/genetics , Multiple Myeloma/immunology , Multiple Myeloma/metabolism , Recombinant Proteins/classification , Recombinant Proteins/genetics , Recombinant Proteins/metabolism , Transfection
19.
Theor Biol Med Model ; 4: 8, 2007 Feb 14.
Article in English | MEDLINE | ID: mdl-17300722

ABSTRACT

BACKGROUND: The body's primary stress management system is the hypothalamic pituitary adrenal (HPA) axis. The HPA axis responds to physical and mental challenge to maintain homeostasis in part by controlling the body's cortisol level. Dysregulation of the HPA axis is implicated in numerous stress-related diseases. RESULTS: We developed a structured model of the HPA axis that includes the glucocorticoid receptor (GR). This model incorporates nonlinear kinetics of pituitary GR synthesis. The nonlinear effect arises from the fact that GR homodimerizes after cortisol activation and induces its own synthesis in the pituitary. This homodimerization makes possible two stable steady states (low and high) and one unstable state of cortisol production resulting in bistability of the HPA axis. In this model, low GR concentration represents the normal steady state, and high GR concentration represents a dysregulated steady state. A short stress in the normal steady state produces a small perturbation in the GR concentration that quickly returns to normal levels. Long, repeated stress produces persistent and high GR concentration that does not return to baseline forcing the HPA axis to an alternate steady state. One consequence of increased steady state GR is reduced steady state cortisol, which has been observed in some stress related disorders such as Chronic Fatigue Syndrome (CFS). CONCLUSION: Inclusion of pituitary GR expression resulted in a biologically plausible model of HPA axis bistability and hypocortisolism. High GR concentration enhanced cortisol negative feedback on the hypothalamus and forced the HPA axis into an alternative, low cortisol state. This model can be used to explore mechanisms underlying disorders of the HPA axis.
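A toy sketch of the feedback structure described: cortisol is driven by stress and suppressed by glucocorticoid receptor (GR) feedback, while GR is slowly induced by the cortisol-bound receptor through a steep, switch-like term standing in for homodimerization. The equations and parameters are invented simplifications, not the paper's model; they are tuned only so that a brief stress returns to the normal low-GR state while a prolonged stress can leave the system in an alternative high-GR, low-cortisol state.

```python
# Invented toy model of HPA-axis bistability; not the published equations.
import numpy as np
from scipy.integrate import solve_ivp

def hpa(t, y, stress):
    C, R = y
    activated = C * R                                   # stand-in for cortisol-bound GR dimer
    dC = 5.0 * (stress(t) / (1.0 + 4.0 * R) - C)        # fast cortisol dynamics, GR feedback
    dR = 0.005 * (0.05 + activated**4 / (0.15**4 + activated**4) - R)  # slow GR self-induction
    return [dC, dR]

brief_stress = lambda t: 3.0 if 10 < t < 15 else 1.0
prolonged_stress = lambda t: 3.0 if 10 < t < 50 else 1.0

for label, stress in [("brief stress", brief_stress), ("prolonged stress", prolonged_stress)]:
    sol = solve_ivp(hpa, (0, 1500), [0.79, 0.065], args=(stress,), max_step=0.5)
    C_end, R_end = sol.y[:, -1]
    print(f"{label}: steady-state cortisol ~ {C_end:.2f}, GR ~ {R_end:.2f}")
```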


Subject(s)
Hydrocortisone/biosynthesis , Hypothalamo-Hypophyseal System/physiology , Models, Biological , Pituitary-Adrenal System/physiology , Receptors, Glucocorticoid/physiology , Adaptation, Physiological , Adrenocorticotropic Hormone/biosynthesis , Animals , Humans , Receptors, Glucocorticoid/chemistry , Stress, Psychological/physiopathology
20.
BMC Neurol ; 7: 40, 2007 Dec 05.
Article in English | MEDLINE | ID: mdl-18053240

ABSTRACT

BACKGROUND: Complaints of unrefreshing sleep are a prominent component of chronic fatigue syndrome (CFS); yet, polysomnographic studies have not consistently documented sleep abnormalities in CFS patients. We conducted this study to determine whether alterations in objective sleep characteristics are associated with subjective measures of poor sleep quality in persons with CFS. METHODS: We examined the relationship between perceived sleep quality and polysomnographic measures of nighttime and daytime sleep in 35 people with CFS and 40 non-fatigued control subjects, identified from the general population of Wichita, Kansas and defined by empiric criteria. Perceived sleep quality and daytime sleepiness were assessed using clinical sleep questionnaires. Objective sleep characteristics were assessed by nocturnal polysomnography (PSG) and daytime multiple sleep latency testing (MSLT). RESULTS: Participants with CFS reported unrefreshing sleep and problems sleeping during the preceding month significantly more often than did non-fatigued controls. Participants with CFS also rated their quality of sleep during the overnight sleep study as significantly worse than did control subjects. Control subjects reported significantly longer sleep onset latency than latency to fall asleep as measured by PSG and MSLT. There were no significant differences in sleep pathology or architecture between subjects with CFS and control subjects. CONCLUSION: People with CFS reported sleep problems significantly more often than control subjects. Yet, when measured, these parameters and sleep architecture did not differ between the two subject groups. A unique finding requiring further study is that control, but not CFS, subjects significantly over-reported sleep latency, suggesting CFS subjects may have an increased appreciation of sleep behaviour that may contribute to their perception of sleep problems.


Subject(s)
Fatigue Syndrome, Chronic/physiopathology , Fatigue Syndrome, Chronic/psychology , Perception , Polysomnography , Sleep Wake Disorders/physiopathology , Sleep Wake Disorders/psychology , Adult , Aged , Female , Humans , Male , Middle Aged , Patients/psychology , Sleep , Sleep Stages , Sleep Wake Disorders/diagnosis , Surveys and Questionnaires