1.
Liver Int ; 44(8): 1952-1960, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38619000

ABSTRACT

BACKGROUND AND AIMS: Genetic variants influence primary biliary cholangitis (PBC) risk. We established and tested an accurate polygenic risk score (PRS) using these variants. METHODS: Data from two Italian cohorts (OldIT 444 cases, 901 controls; NewIT 255 cases, 579 controls) were analysed. The latest international genome-wide meta-analysis provided effect size estimates. The PRS, together with human leukocyte antigen (HLA) status and sex, was included in an integrated risk model. RESULTS: Starting from 46 non-HLA genes, 22 variants were selected. PBC patients in the OldIT cohort showed a higher risk score than controls: -0.014 (interquartile range, IQR, -0.023 to 0.005) versus -0.022 (IQR -0.030 to -0.013) (p < 2.2 × 10⁻¹⁶). For genetic-based prediction, the area under the curve (AUC) was 0.72; adding sex increased the AUC to 0.82. Validation in the NewIT cohort confirmed the model's accuracy (0.71 without sex, 0.81 with sex). Individuals in the top group, representing the highest 25%, had a PBC risk approximately 14 times higher than that of the reference group (lowest 25%; p < 10⁻⁶). CONCLUSION: The combination of sex and a novel PRS accurately discriminated between PBC cases and controls. The model identified a subset of individuals at increased risk of PBC who might benefit from tailored monitoring.
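
The abstract above combines a weighted polygenic risk score with sex in a logistic risk model judged by AUC. A minimal sketch of that workflow in Python, with simulated dosages and invented effect sizes standing in for the published GWAS estimates (not the authors' pipeline):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical data: allele dosages (0/1/2) for 22 selected non-HLA variants
n, m = 1000, 22
dosages = rng.integers(0, 3, size=(n, m))
betas = rng.normal(0, 0.1, size=m)          # log-odds effect sizes from a GWAS meta-analysis (invented here)
sex = rng.integers(0, 2, size=n)            # 1 = female
prs = dosages @ betas                        # PRS = weighted sum of risk-allele dosages
# Simulated case/control status loosely driven by the score and sex
y = rng.binomial(1, 1 / (1 + np.exp(-(prs + 1.5 * sex - 1.0))))

# Integrated risk model: PRS + sex -> case/control, discrimination measured by AUC
X = np.column_stack([prs, sex])
model = LogisticRegression().fit(X, y)
auc_prs_only = roc_auc_score(y, prs)
auc_full = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"AUC (PRS only): {auc_prs_only:.2f}  AUC (PRS + sex): {auc_full:.2f}")
```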


Subject(s)
Genetic Predisposition to Disease , Liver Cirrhosis, Biliary , Humans , Male , Liver Cirrhosis, Biliary/genetics , Liver Cirrhosis, Biliary/diagnosis , Female , Middle Aged , Case-Control Studies , Italy , Aged , Risk Factors , Risk Assessment , Genome-Wide Association Study , Multifactorial Inheritance , HLA Antigens/genetics , Polymorphism, Single Nucleotide , Area Under Curve , Adult , Sex Factors , Genetic Risk Score
2.
BMC Med Res Methodol ; 24(1): 3, 2024 01 03.
Article in English | MEDLINE | ID: mdl-38172810

ABSTRACT

BACKGROUND: In any single-arm trial on novel treatments, assessment of toxicity plays an important role, as the occurrence of adverse events (AEs) is relevant for application in clinical practice. In the presence of a non-fatal time-to-event(s) efficacy endpoint, the analysis should be broadened to consider AE occurrence over time. The AE analysis can be tackled with two approaches, depending on the clinical question of interest. Approach 1 focuses on the occurrence of the AE as first event. The ability of treatment to protect from the efficacy endpoint event(s) has an impact on the chance of observing AEs through the action of competing risks. Approach 2 considers how treatment affects the occurrence of AEs in the hypothetical framework where the efficacy endpoint event(s) could not occur. METHODS: In the first part of the work we review the strategy of analysis for these two approaches. We identify theoretical quantities and estimators consistent with the following features: (a) estimators should account for the presence of right censoring; (b) theoretical quantities and estimators should be functions of time. In the second part of the work we propose the use of alternative methods (regression models, stratified Kaplan-Meier curves, inverse probability of censoring weighting) to relax the assumption of independence between the potential times to AE and to the efficacy endpoint event(s) when addressing Approach 2. RESULTS: We show through simulations that the proposed methods overcome the bias, due to the dependence between the two potential times, that arises with standard estimators. CONCLUSIONS: We demonstrated through simulations that one can handle patient selection in the risk sets due to the competing event, and thus obtain conditional independence between the two potential times, by adjusting for all the observed covariates that induce dependence.
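
Approach 1 above treats the AE as the first event in the presence of the competing efficacy event, so its probability over time is a cumulative incidence function rather than 1 minus a Kaplan-Meier curve. A hedged sketch with simulated data, using lifelines' Aalen-Johansen estimator (an illustration, not the paper's code):

```python
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(1)
n = 300
t_ae = rng.exponential(12, n)       # latent time to adverse event
t_eff = rng.exponential(8, n)       # latent time to efficacy endpoint event
t_cens = rng.uniform(5, 30, n)      # administrative censoring

time = np.minimum.reduce([t_ae, t_eff, t_cens])
# event type: 0 = censored, 1 = AE first, 2 = efficacy event first (competing risk)
event = np.select([t_ae == time, t_eff == time], [1, 2], default=0)

ajf = AalenJohansenFitter()
ajf.fit(durations=time, event_observed=event, event_of_interest=1)
print(ajf.cumulative_density_.tail())   # P(AE as first event by time t), accounting for competing risks
```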


Subject(s)
Models, Statistical , Research Design , Humans , Bias , Probability , Clinical Trials as Topic
3.
Support Care Cancer ; 32(1): 48, 2023 Dec 22.
Article in English | MEDLINE | ID: mdl-38129602

ABSTRACT

PURPOSE: Clinical practice guidelines recommend altering neurotoxic chemotherapy treatment in patients experiencing intolerable chemotherapy-induced peripheral neuropathy (CIPN). The primary objective of this survey was to understand patients' perspectives on altering neurotoxic chemotherapy treatment, including their perceptions of the benefits of preventing irreversible CIPN and the risks of reducing treatment efficacy. METHODS: A cross-sectional online survey was distributed via social networks to patients who were currently receiving or had previously received neurotoxic chemotherapy for cancer. Survey results were analyzed using descriptive statistics and qualitative analysis. RESULTS: Following data cleaning, 447 participants were included in the analysis. The median age was 57 years, 93% were white, and most were from the UK (53%) or USA (38%). Most participants who were currently or recently treated expected some CIPN symptom resolution (86%), but 45% of those who had completed treatment more than a year ago reported experiencing no symptom resolution. Participants reported that they would discontinue chemotherapy at a lower CIPN severity if they knew their symptoms would be permanent than if they knew symptoms would disappear after treatment. Most patients stated that the decision to alter chemotherapy or not was usually made collaboratively between the patient and their treating clinician (61%). The most common reason participants were reluctant to talk with their clinician about CIPN was fear that treatment would be altered. Participants noted a need for improved understanding of CIPN symptoms and their permanence, better patient education relating to CIPN prior to and after treatment, and greater clinician understanding and empathy around CIPN. CONCLUSIONS: This survey highlights the importance of shared decision-making, including a consideration of both the long-term benefits and risks of altering neurotoxic chemotherapy treatment due to CIPN. Additional work is needed to develop decision aids and other communication tools that can be used to improve shared decision-making and help patients with cancer achieve their treatment goals.


Subject(s)
Antineoplastic Agents , Neoplasms , Peripheral Nervous System Diseases , Humans , Middle Aged , Antineoplastic Agents/therapeutic use , Cross-Sectional Studies , Peripheral Nervous System Diseases/diagnosis , Neoplasms/drug therapy , Treatment Outcome , Quality of Life
4.
Surg Endosc ; 37(10): 8133-8143, 2023 10.
Article in English | MEDLINE | ID: mdl-37684403

ABSTRACT

BACKGROUND: Laparoscopic cholecystectomy (LapC) is one of the most frequently performed surgical procedures worldwide. Reaching technical competency in performing LapC is considered an essential task for young surgeons. Investigating the learning curve for LapC (LC-LapC) may provide important information regarding the learning process and guide the training pathway of residents, improving educational outcomes. The present study aimed to investigate the LC-LapC among general surgery residents (GSRs). METHODS: Operative surgical reports of consecutive patients undergoing LapC performed by GSRs attending the General Surgery Residency Program at the University of Milan were analysed. Data on patient- and surgery-related variables were obtained from the ICD-9-CM diagnosis codes and gathered. A multidimensional assessment of the LC was performed through cumulative sum (CUSUM) and risk-adjusted (RA-)CUSUM analysis. RESULTS: Data from 340 patients operated on by 6 GSRs were collected. The CUSUM and RA-CUSUM graphs based on surgical failures allowed two distinct phases to be identified for all GSRs: an initial phase ending at the peak, the so-called learning phase, followed by a phase with a significant decrease in failure incidence, the so-called proficiency phase. The learning phase was completed by all GSRs within at most 25 procedures, but the trend of the curves and the number of procedures needed to achieve technical competency varied among operators, ranging between 7 and 25. CONCLUSIONS: The present study suggested that at most 25 procedures might be sufficient to acquire technical competency in LapC. The variability in the number of procedures needed to complete the LC, ranging between 7 and 25, could be due to the heterogeneous scenarios in which LapC was performed, and deserves to be investigated through a prospective study involving a larger number of GSRs and institutions.
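
A CUSUM learning-curve analysis of surgical failures can be read as a running sum of (observed failure - acceptable failure rate): the curve rises during the learning phase and its peak marks the transition to proficiency. A toy illustration with an assumed acceptable rate and simulated outcomes (not the study's data):

```python
import numpy as np

rng = np.random.default_rng(2)
# 1 = surgical failure, 0 = success, for consecutive cases of one resident;
# failure probability drops after an assumed learning phase of ~20 cases
failures = np.concatenate([rng.binomial(1, 0.30, 20), rng.binomial(1, 0.08, 40)])

p0 = 0.15                                   # assumed acceptable failure rate
cusum = np.cumsum(failures - p0)            # rises while observed failures exceed p0
learning_end = int(np.argmax(cusum)) + 1    # case at which the curve peaks
print(f"CUSUM peak (end of learning phase) at case {learning_end}")
```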


Subject(s)
Cholecystectomy, Laparoscopic , Internship and Residency , Laparoscopy , Humans , Learning Curve , Prospective Studies , Clinical Competence , Laparoscopy/methods , Retrospective Studies
5.
Liver Int ; 42(3): 615-627, 2022 03.
Article in English | MEDLINE | ID: mdl-34951722

ABSTRACT

BACKGROUND & AIMS: Machine learning (ML) provides new approaches for prognostication through the identification of novel subgroups of patients. We explored whether ML could support disease sub-phenotyping and risk stratification in primary biliary cholangitis (PBC). METHODS: ML was applied to an international dataset of PBC patients. The dataset was split into a derivation cohort (training set) and a validation cohort (validation set), and key clinical features were analysed. The outcome was a composite of liver-related death or liver transplantation. ML and standard survival analysis were performed. RESULTS: The training set comprised 11,819 subjects and the validation set 1,069 subjects. ML identified four clusters of patients characterized by different phenotypes and long-term prognosis. Cluster 1 (n = 3566) included patients with an excellent prognosis, whereas Cluster 2 (n = 3966) consisted of individuals with a worse prognosis, differing from Cluster 1 only in albumin levels around the limit of normal. Cluster 3 (n = 2379) included young patients with florid cholestasis, and Cluster 4 (n = 1908) comprised advanced cases. Further sub-analyses of the dynamics of albumin within the normal range revealed that an ursodeoxycholic acid-induced increase of albumin to >1.2 x the lower limit of normal (LLN) is associated with improved transplant-free survival. CONCLUSIONS: Unsupervised ML identified four novel groups of PBC patients with different phenotypes and prognosis and highlighted subtle variations of albumin within the normal range. A therapy-induced increase of albumin to >1.2 x LLN should be considered a treatment goal.
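
The unsupervised step described above (cluster patients on key clinical features, then compare transplant-free survival across clusters) might be sketched as follows; the feature names and data are placeholders, and k-means stands in for whichever algorithm the authors used:

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "bilirubin": rng.lognormal(0, 0.5, n),           # hypothetical clinical features
    "albumin": rng.normal(40, 4, n),
    "alp": rng.lognormal(5, 0.4, n),
    "time": rng.exponential(10, n),                  # years to transplant/death or censoring
    "event": rng.binomial(1, 0.3, n),                # 1 = liver-related death or transplantation
})

X = StandardScaler().fit_transform(df[["bilirubin", "albumin", "alp"]])
df["cluster"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

kmf = KaplanMeierFitter()
for c, grp in df.groupby("cluster"):
    kmf.fit(grp["time"], grp["event"], label=f"cluster {c}")
    print(f"cluster {c}: 10-year transplant-free survival ~ {kmf.predict(10):.2f}")
```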


Subject(s)
Cholangitis , Liver Cirrhosis, Biliary , Cholagogues and Choleretics/therapeutic use , Cholangitis/complications , Humans , Liver Cirrhosis, Biliary/drug therapy , Machine Learning , Prognosis , Risk Assessment , Ursodeoxycholic Acid/therapeutic use
6.
Colorectal Dis ; 24(5): 577-586, 2022 05.
Article in English | MEDLINE | ID: mdl-35108445

ABSTRACT

AIM: Despite the suggested potential benefit of complete mesocolic excision (CME) for right-sided colon cancer (RCC) in terms of patient survival, concerns about its safety and feasibility have contributed to delayed acceptance of the procedure, especially when performed by a minimally invasive approach. Thus, the aim of this work was to evaluate the actual learning curve (LC) of laparoscopic CME for experienced colorectal surgeons. METHOD: Prospectively collected data for consecutive patients undergoing laparoscopic CME for RCC between October 2015 and January 2021 at our institution, operated on by experienced surgeons, were analysed. A multidimensional assessment of the LC was performed through cumulative sum (CUSUM) and risk-adjusted (RA) CUSUM analysis. RESULTS: Two hundred and two patients operated on by three surgeons were considered. The CUSUM graphs based on operating time showed one peak of the curve between 17 and 27 cases. The CUSUM graphs based on surgical failure showed one peak of the curve between 20 and 24 cases. The RA-CUSUM curve also showed one preeminent peak at 24-33 cases. Based on the CUSUM and RA-CUSUM analyses, all the surgeons reached proficiency in 24-33 cases. CONCLUSIONS: Our study showed that an experienced minimally invasive colorectal surgeon acquires proficiency in laparoscopic CME for RCC after performing 24-33 cases.
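
Risk-adjusted CUSUM differs from a plain failure-based CUSUM in that each case contributes (observed failure - predicted failure risk), the predicted risk coming from a case-mix model. A hypothetical sketch (invented covariates and rates, not the study's model):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 200
# Hypothetical case-mix variables and outcomes for consecutive laparoscopic CME cases
age = rng.normal(68, 10, n)
bmi = rng.normal(26, 4, n)
failure = rng.binomial(1, 0.15, n)           # 1 = surgical failure (e.g. conversion or complication)

# Risk model for the expected failure probability of each case
X = np.column_stack([age, bmi])
expected = LogisticRegression().fit(X, failure).predict_proba(X)[:, 1]

ra_cusum = np.cumsum(failure - expected)     # risk-adjusted running sum
peak_case = int(np.argmax(ra_cusum)) + 1
print(f"RA-CUSUM peak at case {peak_case}")
```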


Subject(s)
Carcinoma, Renal Cell , Colonic Neoplasms , Kidney Neoplasms , Laparoscopy , Carcinoma, Renal Cell/surgery , Colectomy/methods , Colonic Neoplasms/surgery , Humans , Laparoscopy/methods , Learning Curve , Retrospective Studies
7.
Clin Gastroenterol Hepatol ; 19(8): 1688-1697.e14, 2021 08.
Article in English | MEDLINE | ID: mdl-32777554

ABSTRACT

BACKGROUND & AIMS: Gamma-glutamyltransferase (GGT) is a serum marker of cholestasis. We investigated whether the serum level of GGT is a prognostic marker for patients with primary biliary cholangitis (PBC). METHODS: We analyzed data from patients with PBC from the Global PBC Study Group, comprising 14 centers in Europe and North America. We obtained measurements of serum GGT at baseline and at time points after treatment. We used Cox model hazard ratios to evaluate the association between GGT and clinical outcomes, including liver transplantation and liver-related death. RESULTS: Of the 2129 patients included in our analysis, 281 (13%) had a liver-related clinical endpoint. Mean age at diagnosis was 53 years and 91% of patients were female. We found a correlation between serum levels of GGT and alkaline phosphatase (ALP) (r = 0.71). Based on data collected at baseline and yearly for up to 5 years, higher serum levels of GGT were associated with lower transplant-free survival. A serum level of GGT at 12 months after treatment higher than 3.2-fold the upper limit of normal (ULN) identified patients who required liver transplantation or had liver-related death at 10 years, with an area under the receiver operating characteristic curve of 0.70. The risk of liver transplantation or liver-related death in patients with a serum level of GGT above 3.2-fold the ULN, despite a level of ALP lower than 1.5-fold the ULN, was higher compared to patients with a level of GGT lower than 3.2-fold the ULN and a level of ALP lower than 1.5-fold the ULN (P < .05). Including information on the level of GGT increased the prognostic value of the GLOBE score. CONCLUSIONS: The serum level of GGT can be used to identify patients with PBC at risk for liver transplantation or death, and increases the prognostic value of ALP measurement. Our findings support the use of GGT as a primary clinical endpoint in clinical trials. In patients with a low serum level of ALP, a high level of GGT identifies those who might require treatment of metabolic disorders or PBC treatment escalation.
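
The prognostic analysis (Cox hazard ratios for GGT relative to a threshold, with transplantation or liver-related death as the endpoint) can be sketched with lifelines; the 3.2-fold ULN flag mirrors the abstract, but the data and the age adjustment are invented:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 400
ggt_xuln = rng.lognormal(0.5, 0.8, n)                  # GGT expressed as multiples of ULN
high_ggt = (ggt_xuln > 3.2).astype(int)                # threshold used in the abstract
time = rng.exponential(12 / (1 + high_ggt), n)         # shorter times for high-GGT patients (simulated)
event = rng.binomial(1, 0.4, n)                        # 1 = transplantation or liver-related death

df = pd.DataFrame({"time": time, "event": event, "high_ggt": high_ggt,
                   "age": rng.normal(53, 10, n)})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.hazard_ratios_)                              # HR for GGT > 3.2 x ULN, adjusted for age
```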


Subject(s)
Cholestasis , Liver Cirrhosis, Biliary , Liver Transplantation , Female , Humans , Prognosis , gamma-Glutamyltransferase
8.
HIV Med ; 22(9): 860-866, 2021 10.
Article in English | MEDLINE | ID: mdl-34293254

ABSTRACT

OBJECTIVES: The aim of the present study was to investigate the prevalence and persistence of human papillomavirus (HPV) and cytological abnormalities (CAs) in the anal swabs of people living with HIV (PLWH): men who have sex with men (MSM), men who have sex with women (MSW) and women (W). METHODS: Between March 2010 and January 2019, an anal swab for cytological and HPV genotyping tests was offered to all PLWH attending our clinic. Logistic regression analysis was conducted to identify predictors of infection. RESULTS: In all, 354 PLWH were screened: 174 MSM, 90 MSW and 61 W. The prevalence of at least one high-risk (HR) HPV type was higher in MSM (91%) and W (85%) than in MSW (77%) (P < 0.05). Cytological abnormalities were found in 21.1% of the entire population. In multivariable regression analysis, a lower risk of HPV infection was found for W than for MSM [odds ratio = 0.24 (95% confidence interval: 0.115-0.513)] and for MSW than for MSM [0.37 (0.180-0.773)], and there was a significantly higher risk of CAs in PLWH with HPV 16 and 18 [3.3 (1.04-10.49)]. A total of 175 PLWH (103 MSM, 33 MSW and 26 W) had at least one follow-up visit (T1) after a median (interquartile range) follow-up of 3.6 (2.1-5.7) years. The acquisition rate of HR-HPV was high, with 66.7% of PLWH negative for HR-HPV at T0 becoming positive at T1 (P < 0.001). The prevalence of CAs was stable (20.6%). A significant association between CAs at T1 and persistence of HPV-16 and/or 18 was found (P < 0.05). CONCLUSIONS: HPV 16 and 18 are associated with the presence and development of CAs irrespective of sexual orientation.
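
The predictor analysis (multivariable logistic regression reporting odds ratios with 95% CIs for HR-HPV positivity by group, with MSM as the reference) might look like the sketch below; the group coding and data are hypothetical:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 354
group = rng.choice(["MSM", "MSW", "W"], size=n, p=[0.5, 0.25, 0.25])
age = rng.normal(45, 10, n)
hr_hpv = rng.binomial(1, np.where(group == "MSM", 0.9, 0.8))   # 1 = at least one HR-HPV type (simulated)

# Dummy-code the group; dropping the first level makes MSM the reference category
X = pd.get_dummies(pd.DataFrame({"group": group, "age": age}),
                   columns=["group"], drop_first=True).astype(float)
X = sm.add_constant(X)
fit = sm.Logit(hr_hpv, X).fit(disp=0)

odds_ratios = np.exp(fit.params)
ci = np.exp(fit.conf_int())
print(pd.concat([odds_ratios.rename("OR"),
                 ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```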


Subject(s)
HIV Infections , Papillomavirus Infections , Sexual and Gender Minorities , Anal Canal , Female , Genotype , HIV Infections/epidemiology , Homosexuality, Male , Human papillomavirus 16/genetics , Humans , Male , Papillomaviridae , Papillomavirus Infections/complications , Papillomavirus Infections/epidemiology , Prevalence , Risk Factors , Sexual Behavior
9.
BMC Health Serv Res ; 20(1): 181, 2020 Mar 06.
Article in English | MEDLINE | ID: mdl-32143625

ABSTRACT

BACKGROUND: The Informative System of Nursing Performance was developed to measure the complexity of nursing care based on the actual interventions performed by nurses at the point of care. The association of this score with in-hospital mortality had not been investigated before. This information is relevant to define evidence-based criteria that hospital administrators can use to allocate the nursing workforce according to patients' real and current need for nursing care. The aim of this study is to assess the association between complexity of nursing care and in-hospital mortality. METHODS: Register-based cohort study of all patients admitted to acute medical wards of a middle-large hospital in the North of Italy between January 1, 2014 and December 31, 2015 and followed up to discharge. Of the 7247 eligible records identified in the Hospital Discharge Register, 6872 records from 5129 patients were included. A multivariable frailty Cox model was adopted to estimate the association between the Informative System of Nursing Performance score, both as a continuous variable and dichotomized as low (score < 50) or high (score ≥ 50), and in-hospital mortality, adjusting for several factors recorded at admission (age, gender, type of admission unit, type of access and Charlson Comorbidity Index). RESULTS: The median age of the 5129 included patients was 76 years [first-third quartiles 64-84] and 2657 (52%) patients were male. Over the 6872 admissions, there were 395 in-hospital deaths among the 2922 admissions at high complexity of nursing care (13.5%) and 74/3950 (1.9%) among those at low complexity, a difference of 11.6% (95% CI: 10.3-13.0%). Adjusting for relevant confounders, the hazard rate of mortality in the first 10 days from admission was six times higher in patients at high complexity of nursing care than in patients at low complexity (hazard ratio, HR 6.58, 95% CI: 4.50-9.62, p < 0.001). The HR was lower after 10 days from admission but still significantly higher than 1. When the continuous score was considered, the association was confirmed. CONCLUSION: Complexity of nursing care is strongly associated with in-hospital mortality of acute patients admitted to medical departments. It predicts in-hospital mortality better than widely used indicators, such as comorbidity.
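
A shared-frailty Cox model accounts for repeated admissions of the same patient. lifelines has no frailty term, so the sketch below fits a Cox model with cluster-robust standard errors per patient as a rough stand-in for the frailty model named in the Methods (simulated data, hypothetical column names):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n_adm = 800
df = pd.DataFrame({
    "patient_id": rng.integers(0, 600, n_adm),            # some patients have repeated admissions
    "age": rng.normal(76, 12, n_adm),
    "high_complexity": rng.binomial(1, 0.43, n_adm),       # hypothetical flag for score >= 50
    "charlson": rng.poisson(2, n_adm),
})
df["time"] = rng.exponential(20 / (1 + 2 * df["high_complexity"]), n_adm)  # days to death or discharge
df["death"] = rng.binomial(1, 0.07 + 0.1 * df["high_complexity"])

cph = CoxPHFitter()
# cluster_col gives robust (sandwich) standard errors per patient; it is not a true frailty term
cph.fit(df, duration_col="time", event_col="death", cluster_col="patient_id")
print(cph.hazard_ratios_["high_complexity"])               # HR for high vs low complexity of nursing care
```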


Subject(s)
Hospital Mortality/trends , Hospital Units , Nursing Care/organization & administration , Patient Admission/statistics & numerical data , Aged , Aged, 80 and over , Cohort Studies , Female , Humans , Italy/epidemiology , Male , Middle Aged , Time Factors
10.
Biom J ; 62(3): 836-851, 2020 05.
Article in English | MEDLINE | ID: mdl-31515830

ABSTRACT

The illness-death model is the simplest multistate model where the transition from the initial state 0 to the absorbing state 2 may involve an intermediate state 1 (e.g., disease relapse). The impact of the transition into state 1 on the subsequent transition hazard to state 2 enables insight to be gained into the disease evolution. The standard approach of analysis is modeling the transition hazards from 0 to 2 and from 1 to 2, including time to illness as a time-varying covariate and measuring time from origin even after the transition into state 1. The hazard from 1 to 2 can also be modeled separately using only patients in state 1, measuring time from illness and including time to illness as a fixed covariate. A recently proposed approach is a model where time after the transition into state 1 is measured on both scales and time to illness is included as a time-varying covariate. Another possibility is a model where time after the transition into state 1 is measured only from illness and time to illness is included as a fixed covariate. Through theoretical reasoning and simulation protocols, we discuss the use of these models and we develop a practical strategy aiming to (a) validate the properties of the illness-death process, (b) estimate the impact of time to illness on the hazard from state 1 to 2, and (c) quantify the impact that the transition into state 1 has on the hazard of the absorbing state. The strategy is also applied to a literature dataset on diabetes.
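
The standard approach described above (hazard to the absorbing state modeled on the time-from-origin scale, with entry into state 1 as a time-varying covariate) can be sketched with lifelines' CoxTimeVaryingFitter on a long-format dataset; the toy rows below are invented:

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Long format: one row per (subject, interval); 'illness' switches from 0 to 1 at relapse,
# and 'event' flags entry into the absorbing state at the end of the last interval.
df = pd.DataFrame([
    # id, start, stop, illness, event
    (1, 0.0,  4.0, 0, 0), (1, 4.0,  9.0, 1, 1),   # relapse at t=4, death at t=9
    (2, 0.0, 12.0, 0, 0),                         # censored, never relapsed
    (3, 0.0,  2.5, 0, 0), (3, 2.5,  6.0, 1, 0),   # relapse at t=2.5, censored at t=6
    (4, 0.0,  7.0, 0, 1),                         # death without relapse
    (5, 0.0,  3.0, 0, 0), (5, 3.0, 10.0, 1, 1),
    (6, 0.0,  8.0, 0, 1),
    (7, 0.0,  5.0, 0, 0), (7, 5.0, 11.0, 1, 0),
    (8, 0.0,  9.5, 0, 0),
], columns=["id", "start", "stop", "illness", "event"])

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratio of the absorbing state associated with having entered state 1
```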


Subject(s)
Biometry/methods , Disease , Models, Statistical , Mortality , Diabetes Mellitus/mortality , Humans , Multivariate Analysis
11.
Dig Surg ; 36(6): 530-538, 2019.
Article in English | MEDLINE | ID: mdl-30636244

ABSTRACT

AIM OF THE STUDY: The diagnosis of choledocholithiasis is challenging. Previously published scoring systems designed to calculate the risk of choledocholithiasis were evaluated to appraise their diagnostic performance. PATIENTS AND METHODS: Data were retrieved for patients admitted between 2013 and 2015 with bile stone-related symptoms and signs and an indication for laparoscopic cholecystectomy. To validate and appraise the performance of the 6 scoring systems, the acknowledged domains of each metric were applied to the present cohort. Sensitivity, specificity, positive and negative predictive values, the Youden index, and the receiver operating characteristic curve with the area under the curve (AUC) were calculated for each score. RESULTS: Two hundred patients were analyzed. The highest sensitivity and specificity were obtained from the Menezes (96.6%) and Telem (99.3%) metrics, respectively. The Telem and Menezes scores had the best positive (75.0%) and negative (96.4%) predictive values, respectively. The best accuracy, as computed by the Youden index and AUC, was found for the Soltan scoring system (0.628 and 0.88, respectively). CONCLUSION: The available scoring systems are precise only in identifying patients with a negligible risk of common bile duct stones, but are overall insufficiently accurate to support their routine use in clinical practice.
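
Appraising a scoring system against the reference diagnosis reduces to standard 2 x 2 metrics plus the AUC of the continuous score. A generic sketch (hypothetical score, cut-off and data, not any of the six published systems):

```python
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(8)
n = 200
stone = rng.binomial(1, 0.25, n)                       # reference: common bile duct stone present
score = rng.normal(2 + 2 * stone, 1.5, n)              # hypothetical continuous risk score
positive = (score > 3).astype(int)                     # hypothetical "high risk" cut-off

tn, fp, fn, tp = confusion_matrix(stone, positive).ravel()
sens, spec = tp / (tp + fn), tn / (tn + fp)
ppv, npv = tp / (tp + fp), tn / (tn + fn)
youden = sens + spec - 1
auc = roc_auc_score(stone, score)
print(f"Se={sens:.2f} Sp={spec:.2f} PPV={ppv:.2f} NPV={npv:.2f} Youden={youden:.2f} AUC={auc:.2f}")
```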


Subject(s)
Choledocholithiasis/blood , Choledocholithiasis/diagnostic imaging , Aged , Alanine Transaminase/blood , Alkaline Phosphatase/blood , Amylases/blood , Aspartate Aminotransferases/blood , Bilirubin/blood , C-Reactive Protein/metabolism , Cholangiopancreatography, Magnetic Resonance , Choledocholithiasis/complications , Female , Gallstones , Humans , Male , Middle Aged , Predictive Value of Tests , ROC Curve , Retrospective Studies , Ultrasonography , gamma-Glutamyltransferase/blood
12.
Biom J ; 61(6): 1417-1429, 2019 11.
Article in English | MEDLINE | ID: mdl-30290002

ABSTRACT

The availability of novel biomarkers in several branches of medicine opens room for refining prognosis by adding factors on top of those with an established role. It is accepted that the impact of novel factors should not rely solely on regression coefficients and their significance but also on predictive power measures, such as the Brier score and ROC-based quantities. However, novel factors that are promising at the exploratory stage often turn out to have disappointingly little impact on predictive power. This motivated the proposal of the net reclassification improvement and the integrated discrimination improvement as direct measures of the gain in predictive power due to additional factors, based on the concept of reclassification tables. These measures became extremely popular in cardiovascular disease and cancer applications, given their apparently easy interpretation. However, recent contributions in the biostatistical literature have highlighted their tendency to indicate as advantageous models obtained by adding unrelated factors. These measures should not be used in practice. A further measure proposed a decade ago, the net benefit, is becoming a standard for assessing the consequences, in terms of costs and benefits, of using a risk predictor in practice for classification. This work reviews the conceptual formulations and interpretations of the available graphical methods and summary measures for evaluating risk predictor models. The aim is to provide guidance in the evaluation process that, from model development, brings the risk predictor to be used in clinical practice for binary decision rules.
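
The net benefit weighs true positives against false positives at a chosen threshold probability t: NB(t) = TP/n - (FP/n) * t/(1 - t). A small decision-curve sketch on simulated predictions (illustrative only, not the review's examples):

```python
import numpy as np

def net_benefit(y, p, t):
    """Net benefit of classifying p >= t as positive, at threshold probability t."""
    n = len(y)
    pred = p >= t
    tp = np.sum(pred & (y == 1))
    fp = np.sum(pred & (y == 0))
    return tp / n - (fp / n) * t / (1 - t)

rng = np.random.default_rng(9)
y = rng.binomial(1, 0.2, 1000)                                               # observed binary outcome
p = np.clip(0.2 + 0.3 * (y - 0.2) + rng.normal(0, 0.1, 1000), 0.01, 0.99)    # risk predictions

for t in (0.1, 0.2, 0.3):
    model_nb = net_benefit(y, p, t)
    treat_all_nb = net_benefit(y, np.ones_like(p), t)
    print(f"t={t:.1f}  model NB={model_nb:.3f}  treat-all NB={treat_all_nb:.3f}")
```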


Subject(s)
Biometry/methods , Computer Graphics , Calibration , False Positive Reactions , ROC Curve , Risk Assessment
13.
BMC Infect Dis ; 18(1): 690, 2018 Dec 20.
Article in English | MEDLINE | ID: mdl-30572830

ABSTRACT

BACKGROUND: Prompt diagnosis of active tuberculosis (TB) is of paramount importance to reduce TB morbidity and mortality and to prevent the spread of Mycobacterium tuberculosis. Few studies so far have assessed the diagnostic delay of TB and its risk factors in low-incidence countries. METHODS: We present a cross-sectional multicentre observational study enrolling all consecutive patients diagnosed with TB in seven referral centres in Italy. Information on demographic and clinical characteristics, health-seeking trajectories and patients' knowledge and awareness of TB was collected. Diagnostic delay was assessed as patient-related (time between symptom onset and presentation to care) and healthcare-related (time between presentation to care and TB diagnosis). Factors associated with patient-related and healthcare-related delays in the highest tertile were explored using uni- and multivariate logistic regression analyses. RESULTS: We enrolled 137 patients between June 2011 and May 2012. The median diagnostic delay was 66 days (interquartile range [IQR] 31-146). Patient-related and healthcare-related delays were 14.5 days (IQR 0-54) and 31 days (IQR 7.25-85), respectively. In multivariable analysis, patients living in Italy for < 5 years were more likely to have a longer patient-related delay (> 3 weeks) than those living in Italy for > 5 years (odds ratio [OR] 3.47; 95% confidence interval [CI] 1.09-11.01). The most common self-reported reasons for delaying presentation to care were the mild nature of symptoms (82%) and good self-perceived health (76%). About a quarter (26%) of patients had wrong beliefs and little knowledge of TB, although this was not associated with a longer diagnostic delay. Regarding healthcare-related delay, multivariate analysis showed that extra-pulmonary TB (OR 4.3; 95% CI 1.4-13.8) and first contact with a general practitioner (OR 5.1; 95% CI 1.8-14.5) were both independently associated with a higher risk of healthcare-related delay > 10 weeks. CONCLUSIONS: In this study, TB was diagnosed with a remarkable delay, mainly attributable to the healthcare services. Delay was higher in patients with extra-pulmonary disease and in those first assessed by general practitioners. We suggest the need to improve knowledge and raise awareness about TB not only in the general population but also among medical providers. Furthermore, specific programs to improve access to care should be designed for recent immigrants, who are at significantly higher risk of patient-related delay. TRIAL REGISTRATION: The study protocol was registered under the US National Institute of Health ClinicalTrials.gov register, reference number: NCT01390987. Study start date: June 2011.


Subject(s)
Delayed Diagnosis/statistics & numerical data , Health Knowledge, Attitudes, Practice , Health Services Accessibility/statistics & numerical data , Time-to-Treatment/statistics & numerical data , Tuberculosis , Adult , Awareness , Cross-Sectional Studies , Delivery of Health Care/standards , Delivery of Health Care/statistics & numerical data , Female , Health Services Accessibility/standards , Humans , Incidence , Italy/epidemiology , Male , Middle Aged , Mycobacterium tuberculosis/isolation & purification , Risk Factors , Time-to-Treatment/standards , Tuberculosis/diagnosis , Tuberculosis/epidemiology , Tuberculosis/therapy
14.
BMC Pregnancy Childbirth ; 18(1): 6, 2018 01 03.
Article in English | MEDLINE | ID: mdl-29298662

ABSTRACT

BACKGROUND: Maternal total weight gain during pregnancy influences adverse obstetric outcomes in singleton pregnancies. However, its impact in twin gestation is less understood. Our objective was to estimate the influence of total maternal weight gain on preterm delivery in twin pregnancies. METHODS: We conducted a retrospective cohort study including diamniotic twin pregnancies with spontaneous labor delivered at 28 + 0 weeks or later. We analyzed the influence of total weight gain, classified according to Institute of Medicine (IOM) cut-offs, on preterm delivery (both less than 34 and less than 37 weeks). Outcomes were compared between under and normal weight gain and between over and normal weight gain separately using Fisher's exact test with Holm-Bonferroni correction. RESULTS: One hundred seventy-five women were included in the study and divided into three groups: under (52.0%), normal (41.7%) and over (6.3%) weight gain. Normal weight gain was associated with a reduction in the rate of preterm delivery compared to under and over weight gain [less than 34 weeks: under vs. normal OR 4.97 (1.76-14.02), over vs. normal OR 4.53 (0.89-23.08); less than 37 weeks: OR 3.16 (1.66-6.04) and 6.51 (1.30-32.49), respectively]. CONCLUSIONS: Normal weight gain reduces spontaneous preterm delivery compared to over and under weight gain.
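
The comparison in the Methods (preterm delivery rates, under vs. normal and over vs. normal weight gain, Fisher's exact test with Holm-Bonferroni correction) can be sketched as follows with illustrative counts rather than the study's data:

```python
from scipy.stats import fisher_exact
from statsmodels.stats.multitest import multipletests

# Illustrative 2x2 tables: rows = (exposed group, normal gain), columns = (preterm < 34 wk, term)
under_vs_normal = [[18, 73], [5, 68]]    # hypothetical counts
over_vs_normal = [[3, 8], [5, 68]]       # hypothetical counts

results = [fisher_exact(t) for t in (under_vs_normal, over_vs_normal)]
pvals = [p for _, p in results]
reject, p_adj, _, _ = multipletests(pvals, method="holm")

for name, (odds, p), padj in zip(["under vs normal", "over vs normal"], results, p_adj):
    print(f"{name}: OR={odds:.2f}, p={p:.3f}, Holm-adjusted p={padj:.3f}")
```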


Subject(s)
Birth Weight , Pregnancy, Twin , Premature Birth/epidemiology , Weight Gain/physiology , Adult , Female , Gestational Age , Guidelines as Topic , Humans , Infant, Newborn , Infant, Small for Gestational Age , National Academies of Science, Engineering, and Medicine, U.S., Health and Medicine Division , Pre-Eclampsia/epidemiology , Pregnancy , Retrospective Studies , United States/epidemiology
16.
Stat Med ; 35(7): 1032-48, 2016 Mar 30.
Article in English | MEDLINE | ID: mdl-26503800

ABSTRACT

The 'landmark' and 'Simon and Makuch' non-parametric estimators of the survival function are commonly used to contrast the survival experience of time-dependent treatment groups in applications such as stem cell transplant versus chemotherapy in leukemia. However, the theoretical survival functions corresponding to the second approach were not clearly defined in the literature, and the use of the 'Simon and Makuch' estimator was criticized in the biostatistical community. Here, we review the 'landmark' approach, showing that it focuses on the average survival of patients conditional on being failure free and on the treatment status assessed at the landmark time. We argue that the 'Simon and Makuch' approach represents counterfactual survival probabilities where treatment status is forced to be fixed: the patient is thought of as being under chemotherapy without the possibility of switching treatment, or as being under transplant since the beginning of the follow-up. We argue that the 'Simon and Makuch' estimator leads to valid estimates only under the Markov assumption, which is, however, less likely to hold in practical applications. This motivates the development of a novel approach based on time rescaling, which leads to suitable estimates of the counterfactual probabilities in a semi-Markov process. The method is also extended to deal with a fixed landmark time of interest.
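
The 'landmark' approach conditions on being failure-free at a chosen landmark time and on the treatment status held at that time, with survival measured from the landmark onward. A minimal sketch on simulated transplant-versus-chemotherapy data (hypothetical values, not the paper's estimator):

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(10)
n = 500
df = pd.DataFrame({
    "t_transplant": rng.exponential(12, n),        # time at which transplant occurs, if ever (simulated)
    "time": rng.exponential(24, n),                # time to death or censoring
    "event": rng.binomial(1, 0.6, n),
})

landmark = 6.0                                      # landmark time (e.g. months)
at_risk = df[df["time"] > landmark].copy()          # failure-free at the landmark
at_risk["transplanted"] = at_risk["t_transplant"] <= landmark   # treatment status AT the landmark

kmf = KaplanMeierFitter()
for status, grp in at_risk.groupby("transplanted"):
    label = "transplant by landmark" if status else "chemo only at landmark"
    # survival measured from the landmark time onward
    kmf.fit(grp["time"] - landmark, grp["event"], label=label)
    print(f"{label}: median residual survival = {kmf.median_survival_time_:.1f}")
```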


Subject(s)
Models, Statistical , Survival Analysis , Biostatistics , Computer Simulation , Humans , Kaplan-Meier Estimate , Markov Chains , Precursor Cell Lymphoblastic Leukemia-Lymphoma/drug therapy , Precursor Cell Lymphoblastic Leukemia-Lymphoma/mortality , Precursor Cell Lymphoblastic Leukemia-Lymphoma/therapy , Probability , Statistics, Nonparametric , Stem Cell Transplantation , Time Factors
17.
Scand J Gastroenterol ; 50(4): 429-38, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25633726

ABSTRACT

OBJECTIVE: Hyperferritinemia is frequent in chronic liver diseases of any cause, but the extent to which ferritin truly reflects iron stores is variable. In these patients, both liver iron and fat are found in variable amounts and associations. Liver biopsy is often required to quantify liver fat and iron, but sampling variability and invasiveness limit its use. We aimed to assess single breath-hold multiecho magnetic resonance imaging (MRI) for the simultaneous quantification of lipid and iron in patients with hyperferritinemia. MATERIAL AND METHODS: We compared MRI results for both iron and fat with their respective gold standards: liver iron concentration and computer-assisted image analysis for steatosis on biopsy. We prospectively studied 67 patients with hyperferritinemia, and a further 10 consecutive patients were used for validation. We estimated two linear calibration equations for the prediction of iron and fat based on MRI. The agreement between MRI and biopsy was evaluated. RESULTS: MRI showed good performance in both the training and validation samples. MRI information was almost completely in line with that obtained from liver biopsy. CONCLUSION: Single breath-hold multiecho MRI is an accurate method for obtaining a valuable measure of both liver iron and steatosis in patients with hyperferritinemia.
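
The calibration step in the Methods (a linear equation predicting the biopsy measure from the MRI signal, then summarizing agreement) might be sketched as below; the variable names, units and numbers are placeholders, not the study's calibration:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(11)
n = 67
r2_star = rng.uniform(30, 300, n)                     # hypothetical MRI relaxation rate (1/s)
lic_biopsy = 0.02 * r2_star + rng.normal(0, 0.5, n)   # liver iron concentration from biopsy (mg/g, simulated)

# Linear calibration equation: LIC_hat = a + b * R2*
calib = LinearRegression().fit(r2_star.reshape(-1, 1), lic_biopsy)
pred = calib.predict(r2_star.reshape(-1, 1))

bias = np.mean(pred - lic_biopsy)                     # Bland-Altman style agreement summary
loa = 1.96 * np.std(pred - lic_biopsy)
print(f"LIC = {calib.intercept_:.2f} + {calib.coef_[0]:.3f} * R2*;  bias={bias:.2f}, 95% LoA = +/-{loa:.2f}")
```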


Subject(s)
Adipose Tissue , Fatty Liver/pathology , Iron/analysis , Liver/chemistry , Liver/pathology , Magnetic Resonance Imaging , Biopsy , Female , Humans , Iron/blood , Magnetic Resonance Imaging/methods , Male , Middle Aged , Prospective Studies
18.
World J Surg Oncol ; 13: 260, 2015 Aug 28.
Article in English | MEDLINE | ID: mdl-26311420

ABSTRACT

BACKGROUND: Although several meta-analyses have shown the positive effects of follow-up on the prognosis of colon cancer (CC), international guidelines do not agree on the appropriate tests and their frequency to optimize surveillance. Furthermore, stratified strategies based upon risk grading have not been implemented. This approach may be useful to rationalize resources. METHODS: From 2006, all patients operated on for an early stage CC (I, IIA, IIB) according to the 7th edition of the AJCC-2010 classification entered a prospective surveillance program in accordance with our local guidelines. Patients who underwent surgical resection after 2009 were excluded to guarantee at least a 5-year follow-up. Classic histopathologic prognostic factors such as grade, T and N status, and lymphatic and vascular invasion were assessed. Moreover, tumor budding and tumor-to-stroma proportion were evaluated. RESULTS: We had complete records for 196 patients. Distribution was as follows: 65 (33.2%) in stage I, 122 (62.2%) in stage IIA, and 9 (4.6%) in stage IIB. Eleven patients (5.6%) had a disease recurrence (local or distant). The median recurrence time was 20 months (range 6-48). Nine patients (82%) had recurrence within 24 months, and 91% were asymptomatic and detected by ultrasound or CT scan. According to the log-rank test, the risk factors with a significant effect on disease-free survival (DFS) were a number of lymph nodes <12 (p = 0.027) and vascular invasion (p = 0.021), while for overall survival (OS) only vascular invasion was significant (p = 0.043). In the univariate and multivariate analyses, DFS was significantly lower in patients with fewer than 12 nodes removed, with vascular invasion, and with left-sided or double cancer. OS was negatively affected only by vascular invasion, although the hazard ratios were similar to those for DFS. Stage IIB was associated with a threefold increased risk of reduced OS and DFS. CONCLUSIONS: Stages I and IIA appear to behave similarly and should be considered as true early stages. The detection of fibrosis and budding does not seem to add valuable information for prognosis. In early CC stages, the surveillance program should be maximized within the first two years.
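
The log-rank comparison of disease-free survival by vascular invasion reported above can be sketched with lifelines; the data are simulated, not the study cohort:

```python
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(12)
# Simulated DFS times (months) and recurrence/death indicators by vascular invasion status
t_no_vi = rng.exponential(80, 160); e_no_vi = rng.binomial(1, 0.05, 160)
t_vi = rng.exponential(45, 36);     e_vi = rng.binomial(1, 0.20, 36)

res = logrank_test(t_no_vi, t_vi, event_observed_A=e_no_vi, event_observed_B=e_vi)
print(f"log-rank chi2 = {res.test_statistic:.2f}, p = {res.p_value:.3f}")
```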


Subject(s)
Colonic Neoplasms/mortality , Colonic Neoplasms/pathology , Neoplasm Recurrence, Local/mortality , Neoplasm Recurrence, Local/pathology , Neoplasm Staging/standards , Adult , Aged , Aged, 80 and over , Colonic Neoplasms/therapy , Female , Follow-Up Studies , Humans , Lymphatic Metastasis , Male , Middle Aged , Multivariate Analysis , Neoplasm Grading , Neoplasm Invasiveness , Neoplasm Recurrence, Local/therapy , Prognosis , Prospective Studies , Retrospective Studies , Risk Factors , Survival Rate
19.
Cell Tissue Bank ; 16(1): 151-7, 2015 Mar.
Article in English | MEDLINE | ID: mdl-24820865

ABSTRACT

The aim of this study was to analyze factors contributing to the bacteriological contamination of bone and tendon allografts. Between 2008 and 2011, 2,778 bone and tendon allografts obtained from 196 organ and tissue donors or tissue-only donors were retrospectively analysed. Several variables were taken into account: donor type (organ and tissue donors vs. tissue-only donors), cause of death, time interval between death and tissue procurement, duration of the procurement procedure, type of allograft, number of team members, number of trainee members, associated surgical procedures, haemoculture positivity, and type of procurement. The overall incidence of graft contamination was 23%. The cause of death, the procurement time, the duration of procurement and the associated surgical procedures were not associated with an increased risk of contamination. A significant effect on contamination incidence was observed for the number of staff members performing the procurement. In addition, our study substantiated a significantly higher contamination rate among bone allografts than among tendon grafts. Based on these observations, in order to minimize the contamination rate of procured musculoskeletal allografts, we recommend appropriate donor selection, the use of standard sterile techniques, and immediate packaging of each allograft to reduce graft exposure. Allograft procurement should be performed by a small surgical team.


Subject(s)
Allografts , Bacteria/isolation & purification , Bone and Bones/microbiology , Tendons/microbiology , Tissue Donors , Adolescent , Adult , Aged , Female , Humans , Male , Middle Aged , Young Adult
20.
Respir Care ; 2024 Jun 12.
Article in English | MEDLINE | ID: mdl-38594036

ABSTRACT

BACKGROUND: The use of the prone position (PP) became widespread during the COVID-19 pandemic. Whereas it has demonstrated benefits, including improved oxygenation and lung aeration, the factors influencing the gas exchange response to PP remain unclear. In particular, the association between baseline quantitative computed tomography (CT) scan results and the gas exchange response to PP in invasively ventilated subjects with COVID-19 ARDS is unknown. The present study aimed to compare baseline quantitative CT results between subjects responding to PP in terms of oxygenation or CO2 clearance and those who did not. METHODS: This was a single-center, retrospective observational study including critically ill, invasively ventilated subjects with COVID-19-related ARDS admitted to the ICUs of Niguarda Hospital between March 2020 and November 2021. Blood gas samples were collected before and after PP. Subjects in whom the PaO2/FIO2 increase was ≥ 20 mm Hg after PP were defined as oxygen responders. CO2 responders were defined by a decrease in the ventilatory ratio (VR) during PP. Automated quantitative CT analyses were performed to obtain the tissue mass and density of the lungs. RESULTS: One hundred twenty-five subjects were enrolled, of whom 116 (93%) were O2 responders and 51 (41%) CO2 responders. No difference in quantitative CT characteristics was observed between oxygen responders and non-responders (tissue mass 1,532 ± 396 g vs 1,654 ± 304 g, P = .28; density -544 ± 109 HU vs -562 ± 58 HU, P = .42). Similar findings were observed when dividing the population according to CO2 response (tissue mass 1,551 ± 412 g vs 1,534 ± 377 g, P = .89; density -545 ± 123 HU vs -546 ± 94 HU, P = .99). CONCLUSIONS: Most subjects with COVID-19-related ARDS improved their oxygenation during the first pronation cycle. The study suggests that baseline quantitative CT scan data were not associated with the oxygenation or CO2 response to PP in mechanically ventilated subjects with COVID-19-related ARDS.
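
The responder definition and the between-group comparison of baseline CT tissue mass can be sketched as follows; the column names and values are hypothetical, and a Welch t-test stands in for whatever test the authors used:

```python
import numpy as np
import pandas as pd
from scipy.stats import ttest_ind

rng = np.random.default_rng(13)
n = 125
df = pd.DataFrame({
    "pf_supine": rng.normal(110, 30, n),             # PaO2/FIO2 before prone positioning
    "pf_prone": rng.normal(150, 40, n),              # PaO2/FIO2 during prone positioning
    "tissue_mass_g": rng.normal(1550, 380, n),       # baseline quantitative CT lung tissue mass
})
df["o2_responder"] = (df["pf_prone"] - df["pf_supine"]) >= 20   # definition used in the abstract

resp = df.loc[df["o2_responder"], "tissue_mass_g"]
nonresp = df.loc[~df["o2_responder"], "tissue_mass_g"]
stat, p = ttest_ind(resp, nonresp, equal_var=False)
print(f"responders n={len(resp)}, non-responders n={len(nonresp)}, Welch t-test p={p:.2f}")
```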
