Results 1 - 20 of 104
1.
Law Hum Behav ; 48(1): 67-82, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38252101

ABSTRACT

OBJECTIVE: In 2007, New York enacted the Sex Offender Management and Treatment Act, empowering the state to civilly manage individuals who have committed sexual offenses (respondents) and are deemed to have a mental abnormality (MA) that predisposes them to sexually recidivate after serving their criminal sentences. We sought to replicate and extend a previous study (Lu et al., 2015) to identify factors predicting legal decisions. HYPOTHESES: We predicted, on the basis of previous research, that clinical information (e.g., diagnosis) as well as empirically supported risk factors (e.g., sexual deviance) would predict trial outcomes. METHOD: We analyzed multiple pieces of demographic, criminogenic, and clinical data on three nested subsamples of respondents on the basis of the legal process: MA consent (n = 713), MA trial (n = 316), and disposition hearing (n = 643). The binary outcomes of interest were as follows: For the MA consent subsample, it was whether the respondent waived their MA trial; for the MA trial subsample, it was whether the respondent was found at trial to have an MA; and for the disposition hearing, it was whether the respondent was ordered to inpatient or outpatient civil management. RESULTS: The strongest predictor of waiving the trial was geographic location; respondents outside New York City and Long Island were more likely to waive their trials (ORs = 2.38-3.37). The strongest predictors of MA trial and disposition hearing outcomes were Diagnostic and Statistical Manual of Mental Disorders diagnoses; pedophilia (ORs = 4.05-7.22) and sexual sadism (ORs = 2.68-7.03) diagnoses increased the likelihood of an MA finding and confinement order. CONCLUSIONS: Judges and juries give significant weight to clinical information, particularly pedophilia diagnoses, when making civil management legal decisions. (PsycInfo Database Record (c) 2024 APA, all rights reserved).


Subject(s)
Criminals, Paraphilic Disorders, Sex Offenses, Humans, Sexual Behavior, Paraphilic Disorders/diagnosis, New York City
2.
BMC Med ; 20(1): 348, 2022 10 12.
Article in English | MEDLINE | ID: mdl-36221132

ABSTRACT

BACKGROUND: Cervical insufficiency is one of the underlying causes of late miscarriage and preterm birth. Although many risk factors have been identified, the relative magnitude of their association with risk in nulliparous versus parous women has not been well demonstrated, especially for incident cervical insufficiency (ICI). The aim of this study was to investigate and compare the magnitude of the association of ICI with predictive factors in nulliparous and parous women, and to further investigate various aspects of obstetric history for parous women. METHODS: Pregnant women with a first diagnosis of cervical insufficiency were compared to a random sample of control pregnancies from women with no diagnosis by using Swedish national health registers. Demographic, reproductive, and pregnancy-specific factors were compared in case and control pregnancies, and relative risks presented as odds ratios (OR), stratified by nulliparous/parous. Independent associations with ICI were estimated from multivariable logistic regression. Associations with obstetric history were further estimated for multiparous women. RESULTS: A total of 759 nulliparous ICI cases and 1498 parous cases were identified during the study period. Multifetal gestation had a strong positive association with ICI in both groups, but of much larger magnitude for nulliparous women. The number of previous miscarriages was also a much stronger predictor of risk in nulliparous women, especially for multifetal pregnancies. History of preterm delivery (<37 weeks' gestation) was an independent predictor for parous women, and for those whose most recent delivery was preterm, the association with ICI increased with each additional week of prematurity. A previous delivery with prolonged second stage of labor or delivery of a very large infant were both inversely associated with risk of ICI in the current pregnancy. CONCLUSIONS: The differences in importance of predictive risk factors for incident cervical insufficiency in nulliparous and parous women can help resolve some of the inconsistencies in the literature to date regarding factors that are useful for risk prediction. Stratifying on parity can inform more targeted surveillance of at-risk pregnancies, enable the two groups of women to be better informed of their risks, and eventually inform screening and intervention efforts.


Subject(s)
Spontaneous Abortion, Newborn Diseases, Premature Birth, Spontaneous Abortion/epidemiology, Case-Control Studies, Female, Gestational Age, Humans, Newborn Infant, Parity, Pregnancy, Premature Birth/epidemiology, Risk Factors
3.
BMC Med Res Methodol ; 22(1): 157, 2022 05 30.
Article in English | MEDLINE | ID: mdl-35637431

ABSTRACT

BACKGROUND: Despite the ease of interpretation and communication of a risk ratio (RR), and several other advantages in specific settings, the odds ratio (OR) is more commonly reported in epidemiological and clinical research. This is due to the familiarity of the logistic regression model for estimating adjusted ORs from data gathered in a cross-sectional, cohort or case-control design. The preservation of the OR (but not RR) in case-control samples has contributed to the perception that it is the only valid measure of relative risk from case-control samples. For cohort or cross-sectional data, a method known as 'doubling-the-cases' provides valid estimates of RR and an expression for a robust standard error has been derived, but is not available in statistical software packages. METHODS: In this paper, we first describe the doubling-of-cases approach in the cohort setting and then extend its application to case-control studies by incorporating sampling weights and deriving an expression for a robust standard error. The performance of the estimator is evaluated using simulated data, and its application illustrated in a study of neonatal jaundice. We provide an R package that implements the method for any standard design. RESULTS: Our work illustrates that the doubling-of-cases approach for estimating an adjusted RR from cross-sectional or cohort data can also yield valid RR estimates from case-control data. The approach is straightforward to apply, involving simple modification of the data followed by logistic regression analysis. The method performed well for case-control data from simulated cohorts with a range of prevalence rates. In the application to neonatal jaundice, the RR estimates were similar to those from relative risk regression, whereas the OR from naive logistic regression overestimated the RR despite the low prevalence of the outcome. CONCLUSIONS: By providing an R package that estimates an adjusted RR from cohort, cross-sectional or case-control studies, we have enabled the method to be easily implemented with familiar software, so that investigators are not limited to reporting an OR and can examine the RR when it is of interest.
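As a rough illustration of the doubling-of-cases idea described above (not the authors' R package), the sketch below duplicates each case record with the outcome set to 0, fits logistic regression to the expanded data, and uses a cluster-robust variance clustered on the original subject as one simple way to get a robust standard error (the paper derives its own expression). Variable names (`y`, `x`, `id`) and the simulated data are illustrative only.

```python
# Illustrative sketch of the "doubling-of-cases" estimator of a risk ratio
# from cohort/cross-sectional data (not the authors' R package).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000
x = rng.binomial(1, 0.4, n)                      # binary exposure
p = 0.05 * np.exp(np.log(2.0) * x)               # true risk ratio = 2
y = rng.binomial(1, p)
df = pd.DataFrame({"id": np.arange(n), "y": y, "x": x})

# Double the cases: each case appears once as y=1 and once more as y=0,
# so the odds in the expanded data equal the risk in the original data.
dup = df[df["y"] == 1].assign(y=0)
expanded = pd.concat([df, dup], ignore_index=True)

# Logistic regression on the expanded data; a cluster-robust (sandwich)
# variance accounts for duplicated records sharing the same subject.
X = sm.add_constant(expanded["x"])
fit = sm.GEE(expanded["y"], X, groups=expanded["id"],
             family=sm.families.Binomial()).fit()
print("estimated RR:", np.exp(fit.params["x"]))
print("robust SE (log RR):", fit.bse["x"])
```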


Subject(s)
Neonatal Jaundice, Cohort Studies, Cross-Sectional Studies, Humans, Newborn Infant, Logistic Models, Odds Ratio
4.
Bull World Health Organ ; 99(10): 708-714, 2021 Oct 01.
Article in English | MEDLINE | ID: mdl-34621088

ABSTRACT

Widescale testing for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection is recognized as a key element of surveillance and outbreak control in the coronavirus disease 2019 (COVID-19) pandemic. The practical challenges, however, have often led to testing only symptomatic individuals and their close contacts. As many countries plan for a cautious relaxation of social restrictions, more effective approaches for widescale testing are increasingly important. Early in the COVID-19 pandemic, laboratories in several countries demonstrated the feasibility of detecting SARS-CoV-2 infection by pooled testing, which combines the specimens from several individuals. Since no further testing is needed for individuals in a negative pool, there is potential for greater efficiency of testing. Despite validations of the accuracy of the results and the efficiency in testing specific groups, the benefits of pooling are less acknowledged as a population surveillance strategy that can detect new disease outbreaks without posing restrictions on entire societies. Pooling specimens from natural clusters, such as school classes, sports teams, workplace colleagues and other social networks, would enable timely and cost-effective widescale testing for SARS-CoV-2. The initial result would be readily translatable into action in terms of quarantine and isolation policies. Clusters of uninfected individuals would be quickly identified and immediate local lockdown of positive clusters would be the appropriate and sufficient action while retesting those individuals. By adapting to the social networks of a population, pooled testing offers a cost-efficient surveillance system that is synchronized with quarantine policies that are rational, risk-based and equitable.
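The efficiency argument above can be made concrete with the classic Dorfman two-stage calculation: if a pool of size k tests negative, its k members need no further tests; if it tests positive, all k are retested individually. The snippet below is an illustrative back-of-the-envelope calculation (not part of the article), assuming a perfect test and independent infection status within pools.

```python
# Expected tests per person under Dorfman two-stage pooling,
# assuming a perfect test and independent infection status within pools.
def expected_tests_per_person(prevalence: float, pool_size: int) -> float:
    # One pooled test per k people, plus k individual retests if the pool is positive.
    p_pool_positive = 1.0 - (1.0 - prevalence) ** pool_size
    return 1.0 / pool_size + p_pool_positive

for k in (5, 10, 20):
    for p in (0.001, 0.01, 0.05):
        e = expected_tests_per_person(p, k)
        print(f"prevalence={p:>5}, pool size={k:>2}: "
              f"{e:.3f} tests/person ({1 - e:.0%} fewer than individual testing)")
```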




Subject(s)
COVID-19, SARS-CoV-2, Communicable Disease Control, Humans, Pandemics, Quarantine
5.
Acta Obstet Gynecol Scand ; 100(12): 2216-2225, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34476807

ABSTRACT

INTRODUCTION: Anti-D alloimmunization is the most common cause of severe hemolytic disease of the fetus and newborn (HDFN). The management of pregnancies affected by less frequent red blood cell (RBC) antibodies poses a challenge to clinicians, and perinatal outcomes are less well described. This study aimed to describe the frequency of clinically significant RBC antibodies in our pregnant population and analyze the risk of prenatal and postnatal treatment for HDFN in relation to our national risk classification system and management guidelines. MATERIAL AND METHODS: A retrospective cohort study of all alloimmunized singleton pregnancies in the Stockholm region 1990-2016. Descriptive summaries of different RBC antibodies and pregnancy outcomes were presented, and the risks of intrauterine blood transfusion (IUT) and neonatal treatment for HDFN were estimated by type of antibody. RESULTS: Of the 1724 alloimmunized pregnancies, 1079 (63%) were at risk of HDFN and constituted our study cohort. Anti-D was detected in 492 (46%) pregnancies, followed by anti-E in 161 (15%) and anti-c in 128 (12%). Eighty-seven (8%) pregnancies had IUT, with the highest risk in pregnancies affected by anti-D combined with other antibodies. The maximum titer recorded before IUT was 64 or above, except for two pregnancies affected by anti-c, for which the maximum titers were 8 and 16. For the 942 (95%) live-born neonates from the 992 alloimmunized pregnancies without IUT, the median gestational age at birth was 38+5 weeks, compared with 35+5 weeks for those who had IUT. Neonatal treatment was most common in the anti-D alone and anti-D combined groups, with 136 (57%) and 21 (44%), respectively, treated with phototherapy and 35 (15%) and 9 (20%), respectively, receiving exchange transfusions. For pregnancies complicated by moderate- and low-risk antibodies, phototherapy was less frequent (32 [36%] and 21 [19%]) and exchange transfusion was rare (5 [6%] and 3 [3%]). CONCLUSIONS: Anti-D, especially in combination with other antibodies, presents the highest risk of severe HDFN. The classification of less frequent and less well-known RBC antibodies into risk groups can help clinicians assess the risk of HDFN and counsel alloimmunized pregnant women regarding the risk of prenatal and postnatal treatments.


Subject(s)
Fetal Erythroblastosis/diagnosis, Erythrocytes/immunology, Prenatal Care, Intrauterine Blood Transfusion, Cohort Studies, Fetal Erythroblastosis/therapy, Female, Gestational Age, Humans, Newborn Infant, Isoantibodies, Pregnancy, Retrospective Studies
6.
Matern Child Nutr ; 17(2): e13110, 2021 04.
Article in English | MEDLINE | ID: mdl-33269548

ABSTRACT

With expanded HIV treatment and prevention programmes, most infants born to HIV-positive women are uninfected, but the patterns and determinants of their growth are not well described. This study aimed to assess growth patterns in a cohort of HIV-exposed uninfected (HEU) infants who participated in an experimental HIV vaccine trial and to test for associations with maternal and infant factors, including in-utero exposure to antiretroviral therapy (ART), mode of delivery, exclusive breastfeeding, mother's education and receipt of the vaccine. Infants in the trial were seen at regular clinic visits from birth to 48 weeks of age. From the anthropometric measurements at these visits, weight-for-age z-scores (WAZ), weight-for-length z-scores (WLZ) and length-for-age z-scores (LAZ) were computed using World Health Organization (WHO) software and reference tables. Growth patterns were investigated with respect to maternal and infant factors, using linear mixed regression models. From 94 infants included at birth, growth data were available for 75.5% at 48 weeks. The determinants of infant growth in this population are multifactorial: infant LAZ during the first year was significantly lower among infants delivered by caesarean section (p = 0.043); both WAZ and LAZ were depressed among infants with longer exposure to maternal ART (WAZ: p = 0.015; LAZ: p < 0.0001) and among infants of mothers with lower educational level (WAZ: p = 0.038; LAZ: p < 0.0001); the effect of maternal education was modified by breastfeeding practice, with no differences seen in exclusively breastfed infants. These findings inform intervention strategies to preserve growth in this vulnerable infant population.
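As a sketch of the kind of analysis described above (not the trial's actual code), the following fits a linear mixed model to longitudinal weight-for-age z-scores with a random intercept per infant. Column names such as `waz`, `visit_week`, `art_weeks`, and `infant_id` are assumed for illustration, the data are simulated, and in practice the z-scores would first be computed with the WHO reference software.

```python
# Sketch: linear mixed model for longitudinal WAZ, random intercept per infant.
# Assumed columns: infant_id, visit_week, waz, art_weeks (weeks of in-utero ART
# exposure), csection (0/1), mother_edu (0/1, low vs. higher education).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_infants, n_visits = 90, 8
ids = np.repeat(np.arange(n_infants), n_visits)
week = np.tile(np.linspace(0, 48, n_visits), n_infants)
art = np.repeat(rng.uniform(4, 40, n_infants), n_visits)
csec = np.repeat(rng.binomial(1, 0.3, n_infants), n_visits)
edu = np.repeat(rng.binomial(1, 0.4, n_infants), n_visits)
waz = (-0.2 - 0.01 * art - 0.2 * csec - 0.3 * edu
       + np.repeat(rng.normal(0, 0.5, n_infants), n_visits)   # infant-level intercept
       + rng.normal(0, 0.4, n_infants * n_visits))            # visit-level noise
df = pd.DataFrame({"infant_id": ids, "visit_week": week, "waz": waz,
                   "art_weeks": art, "csection": csec, "mother_edu": edu})

model = smf.mixedlm("waz ~ visit_week + art_weeks + csection + mother_edu",
                    data=df, groups=df["infant_id"])
print(model.fit().summary())
```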


Subject(s)
HIV Infections, Infectious Pregnancy Complications, Breast Feeding, Cesarean Section, Cohort Studies, Female, HIV Infections/drug therapy, Humans, Infant, Newborn Infant, Pregnancy, Infectious Pregnancy Complications/prevention & control
7.
BMC Med Res Methodol ; 20(1): 145, 2020 06 06.
Article in English | MEDLINE | ID: mdl-32505178

ABSTRACT

BACKGROUND: The change in two measurements of a continuous outcome can be modelled directly with a linear regression model, or indirectly with a random effects model (REM) of the individual measurements. These methods are susceptible to model misspecifications, which are commonly addressed by applying monotonic transformations (e.g., Box-Cox transformation) to the outcomes. However, transforming the outcomes complicates the data analysis, especially when variable selection is involved. We propose a robust alternative through a novel application of the conditional probit (cprobit) model. METHODS: The cprobit model analyzes the ordered outcomes within each subject, making the estimate invariant to monotonic transformation on the outcome. By scaling the estimate from the cprobit model, we obtain the exposure effect on the change in the observed or Box-Cox transformed outcome, pending the adequacy of the normality assumption on the raw or transformed scale. RESULTS: Using simulated data, we demonstrated a similar good performance of the cprobit model and REM with and without transformation, except for some bias from both methods when the Box-Cox transformation was applied to scenarios with small sample size and strong effects. Only the cprobit model was robust to skewed subject-specific intercept terms when a Box-Cox transformation was used. Using two real datasets from the breast cancer and inpatient glycemic variability studies which utilize electronic medical records, we illustrated the application of our proposed robust approach as a seamless three-step workflow that facilitates the use of Box-Cox transformation to address non-normality with a common underlying model. CONCLUSIONS: The cprobit model provides a seamless and robust inference on the change in continuous outcomes, and its three-step workflow is implemented in an R package for easy accessibility.
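A heavily simplified sketch of the first step of such a workflow, under the assumption of exactly two measurements per subject: the within-subject ordering of the two outcomes is a binary event (did the outcome increase?), which can be analysed with a probit regression on the exposure. This is illustrative and simulated; the rescaling that converts the estimate into an effect on the (possibly Box-Cox transformed) change belongs to the authors' three-step workflow and R package and is not reproduced here.

```python
# Sketch: analysing the within-subject ordering of two measurements with a
# probit model. Only the ordering step is shown; the rescaling to an effect
# on the change (the later steps of the workflow described above) is omitted.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 400
exposure = rng.binomial(1, 0.5, n)
baseline = rng.normal(10, 2, n)                       # first measurement
follow_up = baseline + 0.8 * exposure + rng.normal(0, 1.5, n)

increased = (follow_up > baseline).astype(int)        # within-subject ordering
X = sm.add_constant(exposure)
probit_fit = sm.Probit(increased, X).fit(disp=False)
print(probit_fit.summary())
# Because only the ordering of the two measurements enters the model, the
# coefficient is unchanged by any monotonic transformation of the outcome.
```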


Subject(s)
Linear Models, Bias, Humans, Sample Size
8.
Epidemiology ; 29(3): 453-457, 2018 05.
Article in English | MEDLINE | ID: mdl-29337843

ABSTRACT

BACKGROUND: Hemolytic disease of the fetus and newborn due to maternal red blood cell alloimmunization can have serious consequences. Because early detection enables careful monitoring of affected pregnancies, programs to routinely screen all pregnant women have been widely adopted. Due to the low prevalence of alloimmunization, these require large investments of resources to detect a small number of cases. METHODS: We conducted a validation study of a decision tree developed in the Netherlands for determining whether to screen for alloimmunization. In a Swedish cohort, we compared the performance of that decision tree to two alternative models that used maternal characteristics, obstetric history, and transfusion history to identify high-risk women for screening or low-risk women who might be exempt from screening. The models were compared for predictive ability and potential reduction in the volume of screening. RESULTS: The decision tree applied to our study population identified 89% of alloimmunized women with a negative predictive value (NPV) of 99.7% by screening 62% of the population. To achieve the same NPV, our model exempting low-risk women captured 90% of alloimmunizations by screening 63% of the population. In contrast, the model identifying high-risk women for screening while maintaining a similar NPV captured 63% of alloimmunized women by screening 20% of the population. CONCLUSIONS: We validated that an existing decision tree for selecting women for maternal screening performed well in our population, identifying a large proportion of women who became alloimmunized, with a predictive performance almost identical to that of a more elaborate model.
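To make the reported quantities concrete, the snippet below (illustrative only, with made-up counts; not the study data) shows how a screening rule's negative predictive value, the proportion of alloimmunized women captured, and the proportion of the population screened can be computed from a 2x2 cross-tabulation of rule decisions against observed alloimmunization.

```python
# Illustrative calculation of screening-rule performance measures
# (NPV, capture rate, proportion screened) from counts in a 2x2 table.
# The counts below are made up and are NOT the study's data.
def screening_metrics(tp, fp, fn, tn):
    """tp/fp: screened (rule-positive) with/without alloimmunization;
       fn/tn: not screened (rule-negative) with/without alloimmunization."""
    total = tp + fp + fn + tn
    npv = tn / (tn + fn)                 # rule-negative women truly unaffected
    capture = tp / (tp + fn)             # alloimmunized women the rule screens
    screened = (tp + fp) / total         # share of population sent to screening
    return npv, capture, screened

npv, capture, screened = screening_metrics(tp=89, fp=6100, fn=11, tn=3800)
print(f"NPV = {npv:.3f}, captured = {capture:.0%}, screened = {screened:.0%}")
```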


Subject(s)
Antibodies/blood, Decision Trees, Erythrocytes/immunology, Prenatal Diagnosis/standards, Adult, Cohort Studies, Female, Humans, Mass Screening, Netherlands, Predictive Value of Tests, Pregnancy
9.
Popul Health Metr ; 16(1): 18, 2018 12 18.
Article in English | MEDLINE | ID: mdl-30563536

ABSTRACT

BACKGROUND: To quantify temporal trends in age-standardized rates of disease, the convention is to fit a linear regression model to log-transformed rates because the slope term provides the estimated annual percentage change. However, such log-transformation is not always appropriate. METHODS: We propose an alternative method using the rank-ordered logit (ROL) model that is indifferent to log-transformation. This method quantifies the temporal trend using odds, a quantity commonly used in epidemiology, and the log-odds corresponds to the scaled slope parameter estimate from linear regression. The ROL method can be implemented by using the commands for proportional hazards regression in any standard statistical package. We apply the ROL method to estimate temporal trends in age-standardized cancer rates worldwide using the cancer incidence data from the Cancer Incidence in Five Continents plus (CI5plus) database for the period 1953 to 2007 and compare the estimates to their scaled counterparts obtained from linear regression with and without log-transformation. RESULTS: We found a strong concordance in the direction and significance of the temporal trends in cancer incidence estimated by all three approaches, and illustrated how the estimate from the ROL model provides a measure that is comparable to a scaled slope parameter estimated from linear regression. CONCLUSIONS: Our method offers an alternative approach for quantifying temporal trends in incidence or mortality rates in a population that is invariant to transformation, and whose estimate of trend agrees with the scaled slope from a linear regression model.
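As the abstract notes, the ROL model can be fitted with proportional-hazards routines. The sketch below (illustrative, with simulated rates rather than the CI5plus data, and not the paper's code) treats the annual age-standardized rates for one population as the "survival times" and calendar year as the covariate in statsmodels' PHReg; the sign convention (whether a larger rate corresponds to an earlier or later "failure") should be checked against the paper before interpreting the direction of the trend.

```python
# Sketch: fitting a rank-ordered logit (ROL) trend model with a
# proportional-hazards routine, as the abstract describes.
# The rates below are simulated, not the CI5plus data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
years = np.arange(1953, 2008)
# Simulated age-standardized rates with an upward temporal trend.
rates = 30 * np.exp(0.01 * (years - years[0])) * rng.lognormal(0, 0.03, years.size)

# Cox partial likelihood on the ranking of the rates, with calendar year as
# the covariate; every observation is treated as an "event".
exog = (years - years.mean()).reshape(-1, 1)
fit = sm.PHReg(rates, exog, ties="breslow").fit()
print("estimated coefficient (log-odds scale) per year:", fit.params[0])
# Check the sign convention (larger rate = later "failure") before
# interpreting whether the trend is increasing or decreasing.
```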


Subject(s)
Statistical Data Interpretation, Epidemiologic Methods, Statistical Models, Neoplasms/epidemiology, Global Health, Humans, Incidence, Linear Models, Logistic Models, Odds Ratio, Reference Standards
10.
J Immunol ; 197(6): 2261-8, 2016 09 15.
Article in English | MEDLINE | ID: mdl-27503210

ABSTRACT

Conditional gene targeting using the bacteriophage-derived Cre recombinase is widely applied for functional gene studies in mice. Mice transgenic for Cre under the control of the lck gene promoter are used to study the role of loxP-targeted genes in T cell development and function. In this article, we show a striking 65% reduction in cellularity, preferential development of γδ versus αß T cells, and increased expression of IL-7R in the thymus of mice expressing Cre under the proximal lck promoter (lck-cre(+) mice). The transition from CD4/CD8 double-negative to double-positive cells was blocked, and lck-cre(+) double-positive cells were more prone to apoptosis and showed higher levels of Cre expression. Importantly, numbers of naive T cells were reduced in spleens and lymph nodes of lck-cre(+) mice. In contrast, frequencies of γδ T cells, CD44(+)CD62L(-) effector T cells, and Foxp3(+) regulatory T cells were elevated, as was the frequency of IFN-γ-secreting CD4(+) and CD8(+) T cells. A literature survey of 332 articles that used lck-cre(+) mice for deletion of floxed genes indicated that results are statistically influenced by the control used (lck-cre(+) or lck-cre(-)), more frequently resembling the lck-cre(+) phenotype described in this article if lck-cre(-) controls were used. Altogether, care should be taken when interpreting published results and to properly control targeted gene deletions using the lck-cre(+) strain.


Subject(s)
CD8-Positive T-Lymphocytes/metabolism, Integrases/metabolism, Genetic Promoter Regions, Protein Serine-Threonine Kinases/genetics, T-Lymphocyte Subsets/metabolism, Thymus Gland/cytology, Animals, Apoptosis, CD8-Positive T-Lymphocytes/immunology, Cell Differentiation/genetics, Gene Deletion, Integrases/genetics, Interferon-gamma/immunology, Interferon-gamma/metabolism, Mice, Transgenic Mice, Protein Serine-Threonine Kinases/metabolism, Interleukin-7 Receptors/genetics, T-Lymphocyte Subsets/immunology, Regulatory T-Lymphocytes/immunology, Regulatory T-Lymphocytes/metabolism, Thymus Gland/immunology
11.
Int J Cancer ; 140(3): 581-590, 2017 Feb 01.
Article in English | MEDLINE | ID: mdl-27759937

ABSTRACT

Family history of cancer is a well-known risk factor but the role of family history in survival is less clear. The aim of this study was to investigate the association between family history and cancer survival for the common cancers in Sweden. Using the Swedish population-based registers, patients diagnosed with the most common cancers were followed for cancer-specific death during 1991-2010. We used multivariate proportional hazards (Cox) regression models to contrast the survival of patients with a family history of cancer (individuals whose parent or sibling had a concordant cancer) to the survival of patients without a family history. Family history of cancer had a modest protective effect on survival for breast cancer (hazard ratio (HR) = 0.88, 95% confidence interval (95% CI) = 0.81 to 0.96) and prostate cancer (HR = 0.82, 95% CI = 0.75 to 0.90). In contrast, family history of cancer was associated with worse survival for nervous system cancers (HR = 1.24, 95% CI = 1.05 to 1.47) and ovarian cancer (HR = 1.20, 95% CI = 1.01 to 1.43). Furthermore, the poorer survival for ovarian cancer was consistent with a higher FIGO stage and a greater proportion of more aggressive tumors of the serous type. The better survival for patients with a family history of breast and prostate cancer may be due to medical surveillance of family members. The poor survival for ovarian cancer patients with an affected mother or sister is multifactorial, suggesting that these cancers are more aggressive than their sporadic counterparts.


Subject(s)
Neoplasms/mortality, Female, Humans, Male, Middle Aged, Mothers, Proportional Hazards Models, Registries, Risk Factors, Siblings, Sweden
12.
Development ; 141(12): 2429-40, 2014 Jun.
Article in English | MEDLINE | ID: mdl-24917499

ABSTRACT

A common feature of development in most vertebrate models is the early segregation of the germ line from the soma. For example, in Xenopus and zebrafish embryos primordial germ cells (PGCs) are specified by germ plasm that is inherited from the egg; in mice, Blimp1 expression in the epiblast mediates the commitment of cells to the germ line. How these disparate mechanisms of PGC specification evolved is unknown. Here, in order to identify the ancestral mechanism of PGC specification in vertebrates, we studied PGC specification in embryos from the axolotl (Mexican salamander), a model for the tetrapod ancestor. In the axolotl, PGCs develop within mesoderm, and classic studies have reported their induction from primitive ectoderm (animal cap). We used an axolotl animal cap system to demonstrate that signalling through FGF and BMP4 induces PGCs. The role of FGF was then confirmed in vivo. We also showed PGC induction by Brachyury, in the presence of BMP4. These conditions induced pluripotent mesodermal precursors that give rise to a variety of somatic cell types, in addition to PGCs. Irreversible restriction of the germ line did not occur until the mid-tailbud stage, days after the somatic germ layers are established. Before this, germline potential was maintained by MAP kinase signalling. We propose that this stochastic mechanism of PGC specification, from mesodermal precursors, is conserved in vertebrates.


Subject(s)
Ambystoma mexicanum/embryology, Developmental Gene Expression Regulation, Germ Cells/cytology, Mesoderm/cytology, Animals, Bone Morphogenetic Protein 4/metabolism, Cell Differentiation, Fetal Proteins/metabolism, Fibroblast Growth Factors/metabolism, In Situ Hybridization, MAP Kinase Signaling System, Pluripotent Stem Cells/cytology, Signal Transduction, Stochastic Processes, T-Box Domain Proteins/metabolism, Xenopus
13.
Stat Med ; 36(3): 455-465, 2017 02 10.
Article in English | MEDLINE | ID: mdl-27734520

ABSTRACT

Using both simulated and real datasets, we compared two approaches for estimating absolute risk from nested case-control (NCC) data and demonstrated the feasibility of using the NCC design for estimating absolute risk. In contrast to previously published results, we successfully demonstrated not only that data from a matched NCC study can be used to unbiasedly estimate absolute risk but also that matched studies give better statistical efficiency and classify subjects into more appropriate risk categories. Our result has implications for studies that aim to develop or validate risk prediction models. In addition to the traditional full cohort study and case-cohort study, researchers designing these studies now have the option of performing a NCC study with huge potential savings in cost and resources. Detailed explanations on how to obtain the absolute risk estimates under the proposed approach are given. Copyright © 2016 John Wiley & Sons, Ltd.


Subject(s)
Case-Control Studies, Risk Assessment/methods, Statistics as Topic/methods, Age Factors, Cardiovascular Diseases/etiology, Epidemiologic Confounding Factors, Statistical Data Interpretation, Female, Humans, Male, Statistical Models, Proportional Hazards Models, Risk Factors, Sex Factors
14.
Acta Paediatr ; 105(12): 1444-1450, 2016 Dec.
Article in English | MEDLINE | ID: mdl-27173507

ABSTRACT

AIM: This study examined maternal and pregnancy risk factors for haemolytic and nonhaemolytic neonatal jaundice in a large population-based cohort study. METHODS: We conducted a cohort study of 1 019 220 singleton live births from the Swedish medical birth register from 1987 to 2002, using information on neonatal jaundice and maternal and pregnancy characteristics. Diagnoses of gestational hypertensive disorders were obtained by linkage to the national inpatient register. Multivariate logistic regression analysis provided odds ratios for the risk factors of both forms of jaundice. RESULTS: A total of 6057 (0.6%) births were affected by haemolytic jaundice and 36 869 (3.6%) by nonhaemolytic jaundice. The strongest risk factors for haemolytic jaundice were maternal alloimmunisation, blood group O and neonatal jaundice in older siblings. For nonhaemolytic jaundice, the strongest risk factors were preterm birth, neonatal jaundice in older siblings, maternal origin from East or South-East Asia and maternal obesity. We estimated that 13% of haemolytic jaundice was attributable to alloimmunisation and 39% of nonhaemolytic jaundice was attributable to preterm birth. CONCLUSION: Haemolytic and nonhaemolytic neonatal jaundice had different risk factor profiles. Interventions to reduce maternal alloimmunisation, preterm birth and maternal obesity may lower the prevalence of neonatal jaundice and the risk of consequent neurological complications.
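One common way to obtain attributable fractions like those quoted above is Miettinen's case-based formula, AF = p_c x (OR - 1) / OR, where p_c is the exposure prevalence among cases and the adjusted OR (approximating the RR for a rare outcome) comes from the logistic model. The snippet below shows the arithmetic with placeholder numbers; these are not the study's estimates, nor necessarily its exact method.

```python
# Illustrative population attributable fraction (Miettinen's formula):
# PAF = p_c * (OR - 1) / OR, where p_c is the exposure prevalence among cases.
# The numbers below are placeholders, not the study's estimates.
def attributable_fraction(p_exposed_among_cases: float, odds_ratio: float) -> float:
    return p_exposed_among_cases * (odds_ratio - 1.0) / odds_ratio

paf = attributable_fraction(p_exposed_among_cases=0.15, odds_ratio=12.0)
print(f"attributable fraction: {paf:.0%}")   # ~14% with these placeholder inputs
```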


Subject(s)
Hemolysis, Neonatal Jaundice/epidemiology, Adult, Cohort Studies, Female, Humans, Neonatal Jaundice/etiology, Male, Pregnancy, Risk Factors, Sweden/epidemiology, Young Adult
15.
Int J Cancer ; 136(8): 1948-56, 2015 Apr 15.
Article in English | MEDLINE | ID: mdl-25267314

ABSTRACT

Family history is a well-known risk factor for many cancers. However, it is important to know if/how the familial risk of cancer changes over time. For each of four major cancers (colorectal, breast, prostate and melanoma), we identified siblings of cancer patients (case siblings) and siblings of matched cancer-free controls sampled from Swedish population-based registers. Effects of age and time since diagnosis on sibling risks were examined using Poisson regression and presented graphically as smoothed hazard ratios (HRs). Screening effects were investigated by comparing hazards before/after the introduction of mammography for breast cancer and prostate-specific antigen (PSA) testing for prostate cancer. Case siblings had higher cancer incidence than control siblings for all cancers at all ages, with overall incidence rate ratios (IRRs) of 2.41 (95% confidence interval 2.14-2.71) for colorectal cancer, 2.37 (2.24-2.52) for breast cancer, 3.69 (3.46-3.93) for prostate cancer and 3.20 (2.72-3.76) for melanoma. Risks were highest in siblings who were young when the first cancer was diagnosed in the family, with siblings aged 30-40 having IRR 9.05 (3.03-27.00) for colorectal cancer and 4.30 (2.87-6.45) for breast cancer. Smoothed HRs remained fairly constant for up to 20 years except for prostate cancer, where the HR decreased steeply during the first few years. After introduction of PSA testing, men had higher incidence of prostate cancer shortly after diagnosis in a brother, but no such screening effect was found for breast cancer. Our findings can help inform the screening and counseling of family members of cancer patients.


Subject(s)
Neoplasms/etiology, Adult, Age Factors, Aged, Case-Control Studies, Confidence Intervals, Female, Humans, Incidence, Male, Mass Screening/methods, Middle Aged, Registries, Risk, Risk Factors, Siblings
16.
Transfusion ; 55(10): 2479-85, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26098293

ABSTRACT

BACKGROUND: Studies have repeatedly demonstrated that blood donors experience lower mortality than the general population. While this may suggest a beneficial effect of blood donation, it may also reflect the selection of healthy persons into the donor population. To overcome this bias, we investigated the relation between blood donation frequency and mortality within a large cohort of blood donors. In addition, our analyses also took into consideration the effects of presumed health differences linked to donation behavior. STUDY DESIGN AND METHODS: Using the Scandinavian Donation and Transfusion database (SCANDAT), we assessed the association between annual number of donations in 5-year windows and donor mortality by means of Poisson regression analysis. The analyses included adjustment for demographic characteristics and for an internal healthy donor effect, estimated among elderly donors exempted from continued donation because of age criteria. RESULTS: Statistical analyses included 1,182,495 donors, of whom 15,401 died during 9,526,627 person-years of follow-up. Analyses adjusted only for demographic characteristics showed an 18.6% reduction in mortality per additional annual donation (95% confidence interval [CI], 16.8%-20.4%). After additional adjustment for the internal healthy donor effect, each additional annual donation was associated with a 7.5% decrease in mortality risk (95% CI, 5.7%-9.4%). CONCLUSION: We observed an inverse relationship between donation frequency and mortality. The magnitude of the association was reduced after adjustment for an estimate of self-selection in the donor population. Our observations indicate that repeated blood donation is not associated with premature death, but cannot be interpreted as conclusive evidence of a beneficial health effect.
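A minimal sketch of the type of Poisson regression described above (not the SCANDAT analysis itself): deaths are modelled with a log link and the log of person-years as an offset, so the exponentiated coefficient for annual donations is the mortality rate ratio per additional annual donation. The data and variable names are invented for illustration.

```python
# Sketch: Poisson regression of deaths on annual donation frequency with a
# log person-years offset. Data and variable names are simulated/illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 20000
donations = rng.poisson(1.5, n)                        # annual donations in window
age = rng.uniform(20, 70, n)
person_years = rng.uniform(1, 5, n)
# Simulated mortality rate that decreases with donation frequency.
rate = np.exp(-7 + 0.07 * (age - 45) - 0.08 * donations)
deaths = rng.poisson(rate * person_years)
df = pd.DataFrame({"deaths": deaths, "donations": donations,
                   "age": age, "person_years": person_years})

fit = smf.glm("deaths ~ donations + age", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["person_years"])).fit()
rr = np.exp(fit.params["donations"])
print(f"mortality rate ratio per additional annual donation: {rr:.3f}")
```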


Subject(s)
Blood Donors, Factual Databases, Mortality, Female, Follow-Up Studies, Humans, Male, Retrospective Studies, Risk Factors
17.
Stat Med ; 33(11): 1842-52, 2014 May 20.
Article in English | MEDLINE | ID: mdl-24753004

ABSTRACT

Many epidemiological studies use a nested case-control (NCC) design to reduce cost while maintaining study power. Because NCC sampling is conditional on the primary outcome, routine application of logistic regression to analyze a secondary outcome will generally be biased. Several methods have recently been proposed to obtain unbiased estimates of risk for a secondary outcome from NCC data. All current methods share two requirements: the times of onset of the secondary outcome must be known for cohort members not selected into the NCC study, and the hazards of the two outcomes must be conditionally independent given the available covariates. This last assumption is not plausible when the individual frailty of study subjects is not captured by the measured covariates. We provide a maximum-likelihood method that explicitly models the individual frailties and also avoids the need for access to the full cohort data. We derive the likelihood contribution by respecting the original sampling procedure with respect to the primary outcome. We use proportional hazards models for the individual hazards, and Clayton's copula is used to model additional dependence between the primary and secondary outcomes beyond that explained by the measured risk factors. We show that the proposed method is more efficient than weighted likelihood and is unbiased in the presence of shared frailty for the primary and secondary outcomes. We illustrate the method with an application to a study of risk factors for diabetes in a Swedish cohort.
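For readers unfamiliar with Clayton's copula mentioned above, the short sketch below (illustrative only; not the authors' likelihood code, which also handles the NCC sampling) writes down the joint survival function it induces for two outcomes, S(t1, t2) = (S1(t1)^(-theta) + S2(t2)^(-theta) - 1)^(-1/theta), where a larger theta means stronger residual dependence (Kendall's tau = theta / (theta + 2)).

```python
# Clayton (survival) copula linking two marginal survival functions.
# Illustrative only; the study's full likelihood with NCC sampling is not shown.
def clayton_joint_survival(s1: float, s2: float, theta: float) -> float:
    """Joint survival S(t1,t2) given marginal survivals s1=S1(t1), s2=S2(t2)."""
    if theta <= 0:
        raise ValueError("theta must be positive (theta -> 0 gives independence)")
    return (s1 ** (-theta) + s2 ** (-theta) - 1.0) ** (-1.0 / theta)

def kendalls_tau(theta: float) -> float:
    """Rank dependence implied by the Clayton copula."""
    return theta / (theta + 2.0)

s1, s2 = 0.8, 0.7
for theta in (0.01, 1.0, 4.0):
    print(f"theta={theta:>4}: joint survival={clayton_joint_survival(s1, s2, theta):.3f}, "
          f"Kendall's tau={kendalls_tau(theta):.2f}, independence={s1 * s2:.3f}")
```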


Subject(s)
Case-Control Studies, Likelihood Functions, Proportional Hazards Models, Risk Factors, Cardiovascular Diseases/etiology, Cohort Studies, Computer Simulation, Type 2 Diabetes Mellitus/complications, Female, Humans, Male
18.
Stat Med ; 33(30): 5388-98, 2014 Dec 30.
Article in English | MEDLINE | ID: mdl-24980445

ABSTRACT

The significant investment in measuring biomarkers has prompted investigators to improve cost-efficiency by sub-sampling in non-standard study designs. For example, investigators studying prognosis may assume that any differences in biomarkers are likely to be most apparent in an extreme sample of the earliest deaths and the longest-surviving controls. Simple logistic regression analysis of such data does not exploit the information available in the survival time, and statistical methods that model the sampling scheme may be more efficient. We derive likelihood equations that reflect the complex sampling scheme in unmatched and matched 'extreme' case-control designs. We investigated the performance and power of the method in simulation experiments, with a range of underlying hazard ratios and study sizes. Our proposed method resulted in hazard ratio estimates close to those obtained from the full cohort. The standard error estimates also performed well when compared with the empirical variance. In an application to a study investigating markers for lethal prostate cancer, an extreme case-control sample of lethal cases and the longest-surviving controls provided estimates of the effect of Gleason score in close agreement with analysis of all the data. By using the information in the sampling design, our method enables efficient and valid estimation of the underlying hazard ratio from a study design that is intuitive and easily implemented.


Subject(s)
Case-Control Studies, Proportional Hazards Models, Research Design, Cohort Studies, Computer Simulation, Humans, Incidence, Kaplan-Meier Estimate, Likelihood Functions, Logistic Models, Male, Prognosis, Prostatic Neoplasms/mortality
19.
Arterioscler Thromb Vasc Biol ; 33(9): 2267-72, 2013 Sep.
Article in English | MEDLINE | ID: mdl-23685553

ABSTRACT

OBJECTIVE: Current guidelines do not support the use of genetic profiles in risk assessment of coronary heart disease (CHD). However, new single nucleotide polymorphisms associated with CHD and intermediate cardiovascular traits have recently been discovered. We aimed to compare several multilocus genetic risk scores (MGRSs) in terms of their association with CHD and to evaluate their clinical utility. APPROACH AND RESULTS: We investigated 6 Swedish prospective cohort studies with 10 612 participants free of CHD at baseline. We developed 1 overall MGRS based on 395 single nucleotide polymorphisms reported as being associated with cardiovascular traits, 1 CHD-specific MGRS including 46 single nucleotide polymorphisms, and 6 trait-specific MGRSs, one for each established CHD risk factor. Both the overall and the CHD-specific MGRS were significantly associated with CHD risk (781 incident events; hazard ratios for fourth versus first quartile, 1.54 and 1.52; P<0.001) and improved risk classification beyond established risk factors (net reclassification improvement, 4.2% and 4.9%; P=0.006 and 0.017). Discrimination improvement was modest (C-index improvement, 0.004). A polygene MGRS performed worse than the CHD-specific MGRS. We estimate that 1 additional CHD event could be prevented for every 318 people at intermediate risk screened with the CHD-specific genetic score in addition to the established risk factors. CONCLUSIONS: Our results indicate that genetic information could be of some clinical value for prediction of CHD, although further studies are needed to address aspects such as feasibility, ethics, and cost efficiency of genetic profiling in the primary prevention setting.
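The construction of a multilocus genetic risk score is, at its core, a weighted sum of risk-allele counts. The sketch below is illustrative: the SNP names, weights, and genotypes are invented, not the study's 46-SNP score. It computes such a score and splits it into quartiles, the grouping used for the hazard ratios reported above.

```python
# Sketch: building a multilocus genetic risk score (MGRS) as a weighted sum of
# risk-allele counts and grouping it into quartiles. SNP names, weights, and
# genotypes are invented for illustration; they are not the study's 46 SNPs.
import numpy as np
import pandas as pd

rng = np.random.default_rng(11)
n = 1000
snps = {"rs_hypothetical_1": 0.12, "rs_hypothetical_2": 0.08,
        "rs_hypothetical_3": 0.20}                     # per-allele weights (assumed)
genotypes = pd.DataFrame(
    {snp: rng.binomial(2, 0.3, n) for snp in snps})    # 0/1/2 risk alleles

weights = pd.Series(snps)
mgrs = genotypes.mul(weights, axis=1).sum(axis=1)      # weighted allele count
quartile = pd.qcut(mgrs, 4, labels=["Q1", "Q2", "Q3", "Q4"])
print(pd.concat([mgrs.rename("mgrs"), quartile.rename("quartile")], axis=1).head())
# In the study, Cox models compared CHD incidence across such quartiles
# (e.g., fourth vs. first) after adjusting for established risk factors.
```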


Subject(s)
Coronary Disease/diagnosis, Coronary Disease/genetics, Genetic Testing, Mass Screening/methods, Single Nucleotide Polymorphism, Quantitative Trait Loci, Heritable Quantitative Trait, Aged, Coronary Disease/epidemiology, Discriminant Analysis, Disease-Free Survival, Female, Genetic Predisposition to Disease, Heredity, Humans, Incidence, Linear Models, Logistic Models, Longitudinal Studies, Male, Middle Aged, Molecular Epidemiology, Phenotype, Predictive Value of Tests, Proportional Hazards Models, Prospective Studies, Registries, Risk Assessment, Risk Factors, Sweden/epidemiology, Time Factors
20.
J Virol ; 86(20): 11373-9, 2012 Oct.
Article in English | MEDLINE | ID: mdl-22875969

ABSTRACT

Cytomegalovirus (CMV) coinfection is associated with infant HIV-1 disease progression and mortality. In a cohort of Kenyan HIV-infected infants, the frequencies of activated (CD38(+) HLA-DR(+)) and apoptosis-vulnerable (CD95(+) Bcl-2(-)) CD4(+) and CD8(+) T cells increased substantially during acute CMV infection. The frequency of activated CD4(+) T cells was strongly associated with both concurrent CMV coinfection (P = 0.001) and HIV-1 viral load (P = 0.05). The frequency of apoptosis-vulnerable cells was also associated with CMV coinfection in the CD4 (P = 0.02) and CD8 (P < 0.001) T cell subsets. Similar observations were made in HIV-exposed uninfected infants. CMV-induced increases in T cell activation and apoptosis may contribute to the rapid disease progression in coinfected infants.


Subject(s)
CD4-Positive T-Lymphocytes/immunology, CD8-Positive T-Lymphocytes/immunology, Cytomegalovirus Infections/complications, Cytomegalovirus Infections/immunology, HIV Infections/complications, HIV Infections/immunology, HIV-1, Lymphocyte Activation, ADP-Ribosyl Cyclase 1/analysis, Apoptosis, CD4-Positive T-Lymphocytes/metabolism, CD4-Positive T-Lymphocytes/virology, CD8-Positive T-Lymphocytes/metabolism, CD8-Positive T-Lymphocytes/virology, Coinfection, Cytomegalovirus/immunology, Cytomegalovirus Infections/virology, Disease Progression, HIV Infections/virology, HIV-1/immunology, HLA-DR Antigens/analysis, Humans, Infant, Kenya, Proto-Oncogene Proteins c-bcl-2/analysis, Viral Load, fas Receptor/biosynthesis