Results 1 - 20 of 136
1.
Surg Innov ; 29(3): 378-384, 2022 Jun.
Article in English | MEDLINE | ID: mdl-34637364

ABSTRACT

BACKGROUND: During cancer operations, the cancer itself is often hard to delineate, buried beneath healthy tissue and lacking discernible differences from the surrounding healthy organ. Long-wave infrared, or thermal, imaging poses a unique solution to this problem, allowing for the real-time, label-free visualization of temperature deviations within the depth of tissues. The current study evaluated this technology for intraoperative cancer detection. METHODS: In this diagnostic study, patients with gastrointestinal, hepatobiliary, and renal cancers underwent long-wave infrared imaging of the malignancy during routine operations. RESULTS: It was found that 74% of cancers were clearly identifiable as hypothermic anomalies. The average temperature difference was 2.4°C (range 0.7 to 5.0) relative to the surrounding tissue. Cancers as deep as 3.3 cm from the surgical surface were visualized. Yet 79% of the images had clinically relevant false-positive signals [median 3 per image (range 0 to 10)], establishing an accuracy of 47%. Analysis suggests that the degree of temperature difference was primarily determined by features within the cancer and not peritumoral changes in the surrounding tissue. CONCLUSION: These findings provide important information on the unexpected hypothermic properties of intra-abdominal cancers, directions for future use of intraoperative long-wave infrared imaging, and new knowledge about the in vivo thermal energy expenditure of cancers and peritumoral tissue.


Subject(s)
Neoplasms, Humans, Temperature
2.
Transpl Infect Dis ; 23(4): e13634, 2021 Aug.
Article in English | MEDLINE | ID: mdl-33982834

ABSTRACT

BACKGROUND: Neutropenia is a serious complication following heart transplantation (OHT); however, risk factors for its development and its association with outcomes are not well described. We sought to study the prevalence of neutropenia, risk factors associated with its development, and its impact on infection, rejection, and survival. METHODS: A retrospective single-center analysis of adult OHT recipients from July 2004 to December 2017 was performed. Demographic, laboratory, medication, infection, rejection, and survival data were collected for 1 year post-OHT. Baseline laboratory measurements were collected within the 24 hours before OHT. Neutropenia was defined as absolute neutrophil count ≤1000 cells/mm3. Cox proportional hazards models explored associations with time to first neutropenia. Associations between neutropenia, analyzed as a time-dependent covariate, with secondary outcomes of time to infection, rejection, or death were also examined. RESULTS: Of 278 OHT recipients, 84 (30%) developed neutropenia at a median of 142 days (range 81-228) after transplant. Factors independently associated with increased risk of neutropenia included lower baseline WBC (HR 1.12; 95% CI 1.11-1.24), pre-OHT ventricular assist device (1.63; 1.00-2.66), high-risk CMV serostatus [donor positive, recipient negative] (1.86; 1.19-2.88), and having a previous CMV infection (4.07; 3.92-13.7). CONCLUSIONS: Neutropenia is a fairly common occurrence after adult OHT. CMV infection was associated with subsequent neutropenia; however, no statistically significant differences in outcomes were found between neutropenic and non-neutropenic patients in this small study. It remains to be determined in future studies whether medication changes in response to neutropenia would impact patient outcomes.
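As a rough illustration of the time-to-event modeling described above, the sketch below fits a Cox proportional hazards model for time to first neutropenia with the lifelines library. All data are simulated and the variable names (baseline_wbc, pre_oht_vad, high_risk_cmv) are hypothetical stand-ins for the study's covariates, so this is a sketch of the general approach rather than the authors' analysis.

```python
# Sketch: Cox proportional hazards model for time to first neutropenia (simulated data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 278  # cohort size reported above

df = pd.DataFrame({
    "baseline_wbc": rng.normal(7.0, 2.0, n),   # 10^3 cells/uL at transplant (hypothetical)
    "pre_oht_vad": rng.integers(0, 2, n),      # ventricular assist device before OHT
    "high_risk_cmv": rng.integers(0, 2, n),    # donor CMV+ / recipient CMV-
})

# Simulated hazard of neutropenia: higher with lower WBC, a VAD, and high-risk CMV status
hazard = 0.002 * np.exp(-0.10 * (df["baseline_wbc"] - 7)
                        + 0.5 * df["pre_oht_vad"]
                        + 0.6 * df["high_risk_cmv"])
event_time = rng.exponential(1.0 / hazard)

df["time_days"] = np.minimum(event_time, 365)         # administrative censoring at 1 year
df["neutropenia"] = (event_time <= 365).astype(int)   # 1 = ANC <= 1000 cells/mm3 observed

cph = CoxPHFitter()
cph.fit(df, duration_col="time_days", event_col="neutropenia")
cph.print_summary()  # hazard ratios play the role of the adjusted HRs quoted above
```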


Subject(s)
Cytomegalovirus Infections, Heart Transplantation, Heart-Assist Devices, Neutropenia, Heart Transplantation/adverse effects, Heart-Assist Devices/adverse effects, Humans, Neutropenia/epidemiology, Retrospective Studies
3.
J Card Fail ; 27(5): 552-559, 2021 05.
Article in English | MEDLINE | ID: mdl-33450411

ABSTRACT

BACKGROUND: Elevated pulmonary vascular resistance (PVR) is common in patients with advanced heart failure. PVR generally improves after left ventricular assist device (LVAD) implantation, but the rate of decrease has not been quantified and the patient characteristics most strongly associated with this improvement are unknown. METHODS AND RESULTS: We analyzed 1581 patients from the Interagency Registry for Mechanically Assisted Circulatory Support registry who received a primary continuous-flow LVAD, had a baseline PVR of ≥3 Wood units (WU), and had PVR measured at least once postoperatively. Multivariable linear mixed effects modeling was used to evaluate independent associations between postoperative PVR and patient characteristics. PVR decreased by 1.53 WU (95% confidence interval [CI] 1.27-1.79 WU) per month in the first 3 months postoperatively, and by 0.066 WU (95% CI 0.060-0.070 WU) per month thereafter. Severe mitral regurgitation at any time during follow-up was associated with a 1.29 WU (95% CI 1.05-1.52 WU) higher PVR relative to absence of mitral regurgitation at that time. In a cross-sectional analysis, 15%-25% of patients had persistently elevated PVR of ≥3 WU at any given time within 36 months after LVAD implantation. CONCLUSION: The PVR tends to decrease rapidly early after implantation, and only more gradually thereafter. Residual mitral regurgitation may be an important contributor to elevated postoperative PVR. Future research is needed to understand the implications of elevated PVR after LVAD implantation and the optimal strategies for prevention and treatment.
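The postoperative PVR trajectory described above lends itself to a linear mixed-effects model with a piecewise-linear time trend (knot at 3 months) and a time-varying indicator for severe mitral regurgitation. The sketch below, using statsmodels on simulated data, is only an approximation of that setup; all variable names and coefficients are hypothetical.

```python
# Sketch: mixed-effects model for postoperative PVR with a piecewise time trend (simulated).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for pid in range(200):
    base_pvr = rng.normal(4.5, 1.0)                      # baseline PVR >= 3 WU
    for month in (1, 3, 6, 12, 24, 36):
        early = min(month, 3)                            # months within the first 3
        late = max(month - 3, 0)                         # months beyond month 3
        severe_mr = int(rng.random() < 0.10)             # severe MR at this visit
        pvr = (base_pvr - 1.5 * early - 0.07 * late
               + 1.3 * severe_mr + rng.normal(0, 0.8))
        rows.append(dict(patient=pid, early=early, late=late,
                         severe_mr=severe_mr, pvr=pvr))
df = pd.DataFrame(rows)

# Random intercept per patient; fixed effects give the WU change per month, early vs late
fit = smf.mixedlm("pvr ~ early + late + severe_mr", df, groups=df["patient"]).fit()
print(fit.summary())
```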


Subject(s)
Heart Failure, Heart Transplantation, Heart-Assist Devices, Pulmonary Hypertension, Cross-Sectional Studies, Heart Failure/therapy, Humans, Retrospective Studies, Treatment Outcome, Vascular Resistance
4.
Acta Diabetol ; 58(6): 707-722, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33517494

ABSTRACT

OBJECTIVE: Approximately 84 million people in the USA have pre-diabetes, but only a fraction of them receive proven effective therapies to prevent type 2 diabetes. We estimated the value of prioritizing individuals at highest risk of progression to diabetes for treatment, compared to non-targeted treatment of individuals meeting inclusion criteria for the Diabetes Prevention Program (DPP). METHODS: Using microsimulation to project outcomes in the DPP trial population, we compared two interventions to usual care: (1) lifestyle modification and (2) metformin administration. For each intervention, we compared targeted and non-targeted strategies, assuming either limited or unlimited program capacity. We modeled the individualized risk of developing diabetes and projected diabetic outcomes to yield lifetime costs and quality-adjusted life expectancy, from which we estimated net monetary benefits (NMB) for both lifestyle and metformin versus usual care. RESULTS: Compared to usual care, lifestyle modification conferred positive benefits and reduced lifetime costs for all eligible individuals. Metformin's NMB was negative for the lowest population risk quintile. By avoiding use when costs outweighed benefits, targeted administration of metformin conferred a benefit of $500 per person. If only 20% of the population could receive treatment, when prioritizing individuals based on diabetes risk, rather than treating a 20% random sample, the difference in NMB ranged from $14,000 to $20,000 per person. CONCLUSIONS: Targeting active diabetes prevention to patients at highest risk could improve health outcomes and reduce costs compared to providing the same intervention to a similar number of patients with pre-diabetes without targeted selection.
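The targeting logic above rests on a simple piece of arithmetic: net monetary benefit (NMB) = willingness-to-pay × ΔQALY − Δcost, computed per person and then compared between a risk-targeted and a random allocation of limited program capacity. A toy sketch follows; all numbers are illustrative assumptions, not the study's microsimulation.

```python
# Sketch: net monetary benefit and risk-targeted vs random allocation of limited capacity.
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
risk = rng.beta(2, 8, n)                  # individual risk of progressing to diabetes
wtp = 100_000                             # willingness to pay per QALY ($)

# Hypothetical per-person treatment effects that scale with baseline risk
delta_qaly = 0.02 * risk                  # QALYs gained vs usual care
delta_cost = 1_500 - 4_000 * risk         # program cost minus averted diabetes costs
nmb = wtp * delta_qaly - delta_cost       # NMB > 0: benefits exceed costs

capacity = int(0.20 * n)                  # only 20% of eligible people can be treated
targeted = np.argsort(risk)[-capacity:]   # treat the highest-risk individuals
random_pick = rng.choice(n, capacity, replace=False)

print("mean NMB per treated person, targeted:", round(float(nmb[targeted].mean())))
print("mean NMB per treated person, random:  ", round(float(nmb[random_pick].mean())))
```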


Subject(s)
Diabetes Mellitus, Type 2/prevention & control, Patient Selection, Prediabetic State/therapy, Primary Prevention, Adult, Cohort Studies, Cost-Benefit Analysis, Diabetes Mellitus, Type 2/economics, Diabetes Mellitus, Type 2/epidemiology, Female, Health Services Accessibility/economics, Health Services Accessibility/organization & administration, Health Services Accessibility/statistics & numerical data, Humans, Hypoglycemic Agents/economics, Hypoglycemic Agents/therapeutic use, Life Expectancy, Life Style, Male, Metformin/economics, Metformin/therapeutic use, Middle Aged, Prediabetic State/economics, Prediabetic State/epidemiology, Primary Prevention/economics, Primary Prevention/methods, Primary Prevention/organization & administration, Primary Prevention/statistics & numerical data, Quality of Life, Risk Factors, Standard of Care/economics, Standard of Care/organization & administration, Standard of Care/standards, United States/epidemiology
5.
Public Health Nutr ; 24(9): 2577-2591, 2021 06.
Article in English | MEDLINE | ID: mdl-32489172

ABSTRACT

OBJECTIVE: To quantify diet-related burdens of cardiometabolic diseases (CMD) by country, age and sex in Latin America and the Caribbean (LAC). DESIGN: Intakes of eleven key dietary factors were obtained from the Global Dietary Database Consortium. Aetiologic effects of dietary factors on CMD outcomes were obtained from meta-analyses. We combined these inputs with cause-specific mortality data to compute country-, age- and sex-specific absolute and proportional CMD mortality of eleven dietary factors in 1990 and 2010. SETTING: Thirty-two countries in LAC. PARTICIPANTS: Adults aged 25 years and older. RESULTS: In 2010, an estimated 513 371 (95 % uncertainty interval (UI) 423 286-547 841; 53·8 %) cardiometabolic deaths were related to suboptimal diet. Largest diet-related CMD burdens were related to low intake of nuts/seeds (109 831 deaths (95 % UI 71 920-121 079); 11·5 %), low fruit intake (106 285 deaths (95 % UI 94 904-112 320); 11·1 %) and high processed meat consumption (89 381 deaths (95 % UI 82 984-97 196); 9·4 %). Among countries, highest CMD burdens (deaths per million adults) attributable to diet were in Trinidad and Tobago (1779) and Guyana (1700) and the lowest were in Peru (492) and The Bahamas (504). Between 1990 and 2010, greatest decline (35 %) in diet-attributable CMD mortality was related to greater consumption of fruit, while greatest increase (7·2 %) was related to increased intakes of sugar-sweetened beverages. CONCLUSIONS: Suboptimal intakes of commonly consumed foods were associated with substantial CMD mortality in LAC with significant heterogeneity across countries. Improved access to healthful foods, such as nuts and fruits, and limits in availability of unhealthful factors, such as processed foods, would reduce diet-related burdens of CMD in LAC.
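One building block of the comparative risk assessment above is the population attributable fraction (PAF), which converts a risk-factor prevalence and an aetiologic relative risk into a share of deaths. A minimal sketch with illustrative inputs (not the study's estimates):

```python
# Sketch: population attributable fraction (Levin's formula) for one diet-CMD pairing.
def paf(prevalence_suboptimal: float, relative_risk: float) -> float:
    """Fraction of deaths attributable to the exposure in the population."""
    excess = prevalence_suboptimal * (relative_risk - 1.0)
    return excess / (excess + 1.0)

cmd_deaths = 100_000                       # deaths in one country-age-sex stratum (illustrative)
fraction = paf(prevalence_suboptimal=0.6,  # share with below-optimal fruit intake
               relative_risk=1.3)          # aetiologic effect from meta-analysis
print(f"diet-attributable deaths: {fraction * cmd_deaths:,.0f} ({fraction:.1%})")
```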


Subject(s)
Cardiovascular Diseases, Diabetes Mellitus, Adult, Cardiovascular Diseases/etiology, Diet, Feeding Behavior, Humans, Latin America/epidemiology, Nutrition Surveys, Nuts, Risk Assessment, Risk Factors
6.
Anesth Analg ; 132(3): 698-706, 2021 03 01.
Article in English | MEDLINE | ID: mdl-32332290

ABSTRACT

BACKGROUND: The proportion of live births by cesarean delivery (CD) in China is significant, with some, particularly rural, provinces reporting up to 62.5%. The No Pain Labor & Delivery-Global Health Initiative (NPLD-GHI) was established to improve obstetric and neonatal outcomes in China, including a reduction of CD through educational efforts. The purpose of this study was to determine whether a reduction in CD at a rural Chinese hospital occurred after NPLD-GHI. We hypothesized that a reduction in the CD trend would be observed. METHODS: The NPLD-GHI program visited the Weixian Renmin Hospital, Hebei Province, China, from June 15 to 21, 2014. The educational intervention included problem-based learning, bedside teaching, simulation drill training, and multidisciplinary debriefings. An interrupted time-series analysis using segmented logistic regression models was performed on data collected between June 1, 2013 and May 31, 2015 to assess whether the level and/or trend over time in the proportion of CD births would decline after the program intervention. The primary outcome was the monthly proportion of CD births. Secondary outcomes included neonatal intensive care unit (NICU) admissions and extended NICU length of stay, neonatal antibiotic and intubation use, and labor epidural analgesia use. RESULTS: Following NPLD-GHI, there was a level decrease in CD, with an estimated odds ratio (95% confidence interval [CI]) of 0.87 (0.78-0.98), P = .017; the odds of monthly CD reduction were an estimated 3% (95% CI, 1-5; P < .001) greater in the post- versus preintervention period. For labor epidural analgesia, there was a level increase (estimated odds ratio [95% CI] of 1.76 [1.48-2.09]; P < .001) and a slope decrease (estimated odds ratio [95% CI] of 0.94 [0.92-0.97]; P < .001). NICU admissions did not have a level change (estimated odds ratio [95% CI] of 0.99 [0.87-1.12]; P = .835), but the odds of monthly reduction in NICU admissions were an estimated 9% (95% CI, 7-11; P < .001) greater in the post- versus preintervention period. Neonatal intubation level and slope changes were not statistically significant. For neonatal antibiotic administration, the level change was not statistically significant, but there was a decrease in the slope, with the odds of monthly reduction an estimated 6% (95% CI, 3-9; P < .001) greater post- versus preintervention. CONCLUSIONS: In a large, rural Chinese hospital, live births by CD were lower following NPLD-GHI and associated with increased use of labor epidural analgesia. We also found decreasing NICU admissions. International-based educational programs can significantly alter practices associated with maternal and neonatal outcomes.
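The segmented (interrupted time-series) logistic regression described above can be expressed as a binomial GLM with a baseline time trend plus level-change and slope-change terms at the intervention month. A simulated sketch with statsmodels follows; the variable names and effect sizes are hypothetical, not the study's data.

```python
# Sketch: segmented binomial regression (interrupted time series) for monthly CD proportion.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
months = np.arange(24)                        # 24 study months
post = (months >= 12).astype(int)             # 1 after the educational intervention
months_post = np.where(post == 1, months - 12, 0)

# Simulated monthly cesarean counts with a level drop and a steeper post-intervention slope
true_logit = -0.2 - 0.005 * months - 0.14 * post - 0.03 * months_post
births = rng.integers(250, 400, size=24)
cd = rng.binomial(births, 1 / (1 + np.exp(-true_logit)))

endog = np.column_stack([cd, births - cd])    # (cesarean, non-cesarean) counts per month
exog = sm.add_constant(np.column_stack([months, post, months_post]))
fit = sm.GLM(endog, exog, family=sm.families.Binomial()).fit()
print(np.exp(fit.params))  # ORs: [intercept, baseline slope, level change, slope change]
```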


Subject(s)
Analgesia, Epidural/trends, Analgesia, Obstetrical/trends, Cesarean Section/trends, Inservice Training, Labor Pain/drug therapy, Pain Management/trends, Adult, Analgesia, Epidural/adverse effects, Analgesia, Obstetrical/adverse effects, Cesarean Section/adverse effects, China, Female, Health Knowledge, Attitudes, Practice, Hospitals, Rural/trends, Humans, Infant, Newborn, Intensive Care, Neonatal/trends, Interrupted Time Series Analysis, Labor Pain/etiology, Live Birth, Pain Management/adverse effects, Patient Care Team, Pregnancy, Program Evaluation, Treatment Outcome, Young Adult
7.
Transplant Proc ; 53(1): 119-123, 2021.
Article in English | MEDLINE | ID: mdl-32690312

ABSTRACT

PURPOSE: We examined the role of obesity and intraoperative red blood cell (RBC) and platelet transfusion in early allograft dysfunction (EAD) following liver transplantation (LT). METHODS: This is a retrospective analysis of 239 adult deceased-donor LT recipients over a 10-year period. EAD was defined by Olthoff's criteria. Data collection included donor (D) and recipient (R) age, body mass index (BMI) ≥ 35 kg/m2, diabetes mellitus, allograft macrosteatosis, and intraoperative (RBC) and platelet administration. We employed logistic regression to evaluate associations of these factors with EAD. Results are presented as odds ratios (OR) and 95% confidence intervals (CI) with corresponding P values. A P ≤ .05 was considered statistically significant. RESULTS: EAD occurred in 85 recipients (36%). Macrosteatosis data were available for 199 donors. In the multivariate analyses, BMI-D ≥ 35 kg/m2 increased the odds of developing EAD by 156% in the entire cohort (OR 2.56, 95% CI 1.09-6.01) and by 187% in recipients with macrosteatosis data (n = 199, OR 2.87, 95% CI 1.15-7.15). Each unit of RBCs increased the odds for EAD by 8% (OR 1.08, 95% CI 1.02-1.14) and, for the subgroup of 238 recipients with macrosteatosis data, by 9% (OR 1.09, 95% CI 1.02-1.16). CONCLUSION: We found a significant independent association of donor obesity and intraoperative RBC transfusion with EAD but no such association for platelet administration, MELD score, age, recipient obesity, and diabetes.


Subject(s)
Diabetes Mellitus, Erythrocyte Transfusion/adverse effects, Liver Transplantation/adverse effects, Obesity/complications, Primary Graft Dysfunction/etiology, Adult, Cohort Studies, Female, Humans, Male, Middle Aged, Retrospective Studies, Risk Factors
8.
Stroke ; 51(10): 3119-3123, 2020 10.
Article in English | MEDLINE | ID: mdl-32921262

ABSTRACT

BACKGROUND AND PURPOSE: In patients with cryptogenic stroke and patent foramen ovale (PFO), the Risk of Paradoxical Embolism (RoPE) Score has been proposed as a method to estimate a patient-specific "PFO-attributable fraction": the probability that a documented PFO is causally related to the stroke, rather than an incidental finding. The objective of this research was to examine the relationship between this RoPE-estimated PFO-attributable fraction and the effect of closure in 3 randomized trials. METHODS: We pooled data from the CLOSURE-I (Evaluation of the STARFlex Septal Closure System in Patients With a Stroke and/or Transient Ischemic Attack due to Presumed Paradoxical Embolism through a Patent Foramen Ovale), RESPECT (Randomized Evaluation of Recurrent Stroke Comparing PFO Closure to Established Current Standard of Care Treatment), and PC (Clinical Trial Comparing Percutaneous Closure of Patent Foramen Ovale [PFO] Using the Amplatzer PFO Occluder With Medical Treatment in Patients With Cryptogenic Embolism) trials. We examined the treatment effect of closure in high RoPE score (≥7) versus low RoPE score (<7) patients. We also estimated the relative risk reduction associated with PFO closure across each level of the RoPE score using Cox proportional hazard analysis. We estimated a patient-specific attributable fraction using a PC trial-compatible (9-point) RoPE equation (omitting the neuroradiology variable), as well as a 2-trial analysis using the original (10-point) RoPE equation. We examined the Pearson correlation between the estimated attributable fraction and the relative risk reduction across RoPE strata. RESULTS: In the low RoPE score group (<7, n=912), the rate of recurrent strokes per 100 person-years was 1.37 in the device arm versus 1.68 in the medical arm (hazard ratio, 0.82 [0.42-1.59]; P=0.56) compared with 0.30 versus 1.03 (hazard ratio, 0.31 [0.11-0.85]; P=0.02) in the high RoPE score group (≥7, n=1221); treatment-by-RoPE score group interaction, P=0.12. The RoPE score estimated attributable fraction anticipated the relative risk reduction across all levels of the RoPE score, in both the 3-trial (r=0.95, P<0.001) and 2-trial (r=0.92, P<0.001) analyses. CONCLUSIONS: The RoPE score estimated attributable fraction is highly correlated to the relative risk reduction of device versus medical therapy. This observation suggests the RoPE score identifies patients with cryptogenic stroke who are likely to have a PFO that is pathogenic rather than incidental.
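The final step reported above is a Pearson correlation between the RoPE-estimated attributable fraction and the observed relative risk reduction across score strata. A minimal sketch with scipy; the attributable fractions and hazard ratios below are made-up placeholders, not the pooled-trial values.

```python
# Sketch: correlate a RoPE-based attributable fraction with the observed RRR across strata.
import numpy as np
from scipy.stats import pearsonr

rope_score = np.array([3, 4, 5, 6, 7, 8, 9])
attributable_fraction = np.array([0.00, 0.12, 0.34, 0.52, 0.66, 0.78, 0.87])  # placeholders
hazard_ratio = np.array([1.20, 1.05, 0.90, 0.70, 0.45, 0.35, 0.25])           # placeholders
relative_risk_reduction = 1.0 - hazard_ratio

r, p = pearsonr(attributable_fraction, relative_risk_reduction)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
```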


Subject(s)
Embolism, Paradoxical/etiology, Foramen Ovale, Patent/complications, Stroke/complications, Cardiac Catheterization, Foramen Ovale, Patent/surgery, Humans, Risk Factors, Secondary Prevention, Treatment Outcome
9.
J Clin Transl Sci ; 4(2): 133-140, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32313703

ABSTRACT

INTRODUCTION: Shared patient-clinician decision-making is central to choosing between medical treatments. Decision support tools can have an important role to play in these decisions. We developed a decision support tool for deciding between nonsurgical treatment and surgical total knee replacement for patients with severe knee osteoarthritis. The tool aims to provide likely outcomes of alternative treatments based on predictive models using patient-specific characteristics. To make those models relevant to patients with knee osteoarthritis and their clinicians, we involved patients, family members, patient advocates, clinicians, and researchers as stakeholders in creating the models. METHODS: Stakeholders were recruited through local arthritis research, advocacy, and clinical organizations. After stakeholders were provided with brief methodological education sessions, their views were solicited through quarterly patient or clinician stakeholder panel meetings and incorporated into all aspects of the project. RESULTS: Participating in each aspect of the research, from determining the outcomes of interest to providing input on the design of the user interface displaying outcome predictions, 86% (12/14) of stakeholders remained engaged throughout the project. Stakeholder engagement ensured that the prediction models that form the basis of the Knee Osteoarthritis Mathematical Equipoise Tool and its user interface were relevant for patient-clinician shared decision-making. CONCLUSIONS: Methodological research has the opportunity to benefit from stakeholder engagement by ensuring that the perspectives of those most impacted by the results are involved in study design and conduct. While additional planning and investments in maintaining stakeholder knowledge and trust may be needed, they are offset by the valuable insights gained.

10.
J Pediatr ; 214: 60-65.e2, 2019 11.
Article in English | MEDLINE | ID: mdl-31474426

ABSTRACT

OBJECTIVES: To evaluate salivary biomarkers that elucidate the molecular mechanisms by which in utero opioid exposure exerts sex-specific effects on select hypothalamic and reward genes driving hyperphagia, a hallmark symptom of infants suffering from neonatal opioid withdrawal syndrome (NOWS). STUDY DESIGN: We prospectively collected saliva from 50 newborns born at ≥34 weeks of gestational age with prenatal opioid exposure and 50 sex- and gestational age-matched infants without exposure. Saliva underwent transcriptomic analysis for 4 select genes involved in homeostatic and hedonic feeding regulation (neuropeptide Y2 receptor [NPY2R], proopiomelanocortin [POMC], leptin receptor [LEPR], dopamine type 2 receptor [DRD2]). Normalized gene expression data were stratified based on sex and correlated with feeding volume on day of life 7 and length of stay in infants with NOWS requiring pharmacotherapy. RESULTS: Expression of DRD2, a hedonistic/reward regulator, was significantly higher in male newborns compared with female newborns with NOWS (Δ threshold cycle 10.8 ± 3.8 vs 13.9 ± 3.7, P = .01). In infants with NOWS requiring pharmacotherapy, expression of the leptin receptor, an appetite suppressor, was higher in male subjects than female subjects (Δ threshold cycle 8.4 ± 2.5 vs 12.4 ± 5.1, P = .05), DRD2 expression significantly correlated with intake volume on day of life 7 (r = 0.58, P = .02), and expression of NPY2R, an appetite regulator, negatively correlated with length of stay (r = -0.24, P = .05). CONCLUSIONS: Prenatal opioid exposure exerts sex-dependent effects on hypothalamic feeding regulatory genes with clinical correlations. Neonatal salivary gene expression analyses may predict hyperphagia, severity of withdrawal state, and length of stay in infants with NOWS.
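A hedged sketch of the two analysis steps reported above: comparing normalized expression (ΔCt) between sexes and correlating expression with day-of-life-7 intake volume. Note that a higher ΔCt corresponds to lower expression. The data are simulated; only the reported means and SDs are borrowed for flavor.

```python
# Sketch: sex comparison of delta-Ct and correlation with day-7 intake (simulated data).
import numpy as np
from scipy.stats import ttest_ind, pearsonr

rng = np.random.default_rng(4)
drd2_male = rng.normal(10.8, 3.8, 20)      # delta threshold cycle, males with NOWS
drd2_female = rng.normal(13.9, 3.7, 20)    # delta threshold cycle, females with NOWS
t_stat, p_sex = ttest_ind(drd2_male, drd2_female)
print(f"DRD2 delta-Ct, male vs female: p = {p_sex:.3f}")

# Hypothetical relationship between DRD2 delta-Ct and day-of-life-7 feeding volume
dct = rng.normal(12, 3, 16)
volume_ml = 400 - 15 * dct + rng.normal(0, 30, 16)
r, p_corr = pearsonr(dct, volume_ml)
print(f"correlation with day-7 volume: r = {r:.2f}, p = {p_corr:.3f}")
```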


Subject(s)
Analgesics, Opioid/adverse effects, Gene Expression, Hyperphagia/etiology, Neonatal Abstinence Syndrome/genetics, Saliva/chemistry, Case-Control Studies, Female, Gene Expression Profiling, Genetic Markers, Humans, Infant, Newborn, Male, Neonatal Abstinence Syndrome/complications, Pilot Projects, Pro-Opiomelanocortin/genetics, Prospective Studies, Receptors, Dopamine D2/genetics, Receptors, Leptin/genetics, Receptors, Neuropeptide Y/genetics, Severity of Illness Index, Sex Factors
11.
J Clin Transl Sci ; 3(1): 27-36, 2019 Feb.
Article in English | MEDLINE | ID: mdl-31404154

ABSTRACT

BACKGROUND: To enhance enrollment into randomized clinical trials (RCTs), we proposed electronic health record-based clinical decision support for patient-clinician shared decision-making about care and RCT enrollment, based on "mathematical equipoise." OBJECTIVES: As an example, we created the Knee Osteoarthritis Mathematical Equipoise Tool (KOMET) to determine the presence of patient-specific equipoise between treatments for the choice between total knee replacement (TKR) and nonsurgical treatment of advanced knee osteoarthritis. METHODS: With input from patients and clinicians about important pain and physical function treatment outcomes, we created a database from non-RCT sources of knee osteoarthritis outcomes. We then developed multivariable linear regression models that predict 1-year individual-patient knee pain and physical function outcomes for TKR and for nonsurgical treatment. These predictions allowed detecting mathematical equipoise between these two options for patients eligible for TKR. Decision support software was developed to graphically illustrate, for a given patient, the degree of overlap of pain and functional outcomes between the treatments and was pilot tested for usability, responsiveness, and as support for shared decision-making. RESULTS: The KOMET predictive regression model for knee pain had four patient-specific variables and an r² value of 0.32, and the model for physical functioning included six patient-specific variables and an r² of 0.34. These models were incorporated into prototype KOMET decision support software and pilot tested in clinics, and were generally well received. CONCLUSIONS: Use of predictive models and mathematical equipoise may help discern patient-specific equipoise to support shared decision-making for selecting between alternative treatments and considering enrollment into an RCT.
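A minimal sketch of the "mathematical equipoise" idea above: fit one outcome model per treatment, predict the 1-year outcome for a specific patient, and call the options in equipoise when the predictions (with some uncertainty band) overlap. The features, coefficients, and the crude ±1-SD band below are all assumptions for illustration, not KOMET's actual models.

```python
# Sketch: per-treatment outcome models and a crude patient-specific equipoise check.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
n = 500
X = rng.normal(size=(n, 4))         # e.g., age, BMI, baseline pain, baseline function
pain_tkr = 20 + X @ np.array([1.5, 2.0, 4.0, -1.0]) + rng.normal(0, 10, n)
pain_nonsurgical = 35 + X @ np.array([1.0, 2.5, 5.0, -0.5]) + rng.normal(0, 10, n)

model_tkr = LinearRegression().fit(X, pain_tkr)
model_ns = LinearRegression().fit(X, pain_nonsurgical)

patient = rng.normal(size=(1, 4))   # one new patient's characteristics
pred_tkr = model_tkr.predict(patient)[0]
pred_ns = model_ns.predict(patient)[0]

sd = 10.0                                       # assumed residual SD of the outcome models
equipoise = abs(pred_tkr - pred_ns) < 2 * sd    # do the +/- 1 SD bands overlap?
print(f"TKR: {pred_tkr:.1f}, nonsurgical: {pred_ns:.1f}, equipoise: {equipoise}")
```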

12.
Appetite ; 142: 104348, 2019 11 01.
Article in English | MEDLINE | ID: mdl-31299192

ABSTRACT

Eating behaviors such as eating fast and ignoring internal satiety cues are associated with overweight/obesity, and may be influenced by environmental factors. This study examined changes in those behaviors, and associations between those behaviors and BMI, cardiometabolic biomarkers, and diet quality in military recruits before and during initial military training (IMT), an environment wherein access to food is restricted. Eating rate and reliance on internal satiety cues were self-reported, and BMI, body fat, cardiometabolic biomarkers, and diet quality were measured in 1389 Army, Air Force and Marine recruits (45% female, mean ± SEM BMI = 24.1 ± 0.1 kg/m2) before and after IMT. Pre-IMT, habitually eating fast relative to slowly was associated with a 1.1 ± 0.3 kg/m2 higher BMI (P < 0.001), but not with other outcomes; whereas, habitually eating until no food is left (i.e., ignoring internal satiety cues) was associated with lower diet quality (P < 0.001) and, in men, 1.6 ± 0.6% lower body fat (P = 0.03) relative to those that habitually stopped eating before feeling full. More recruits reported eating fast (82% vs 39%) and a reduced reliance on internal satiety cues (55% vs 16%) during IMT relative to pre-IMT (P < 0.001). Findings suggest that eating behaviors correlate with body composition and/or diet quality in young, predominantly normal-weight recruits entering the military, and that IMT is associated with potentially unfavorable changes in these eating behaviors.


Subject(s)
Body Mass Index, Feeding Behavior, Military Personnel, Self Report, Adolescent, Adult, Biomarkers/blood, Body Composition, Body Weight, Diet, Female, Humans, Male, Obesity/epidemiology, Overweight/epidemiology, Physical Fitness, Satiation, Surveys and Questionnaires, United States, Young Adult
13.
Am J Kidney Dis ; 74(5): 620-628, 2019 11.
Article in English | MEDLINE | ID: mdl-31301926

ABSTRACT

RATIONALE & OBJECTIVE: Identifying patients who are likely to transfer from peritoneal dialysis (PD) to hemodialysis (HD) before transition could improve their subsequent care. This study developed a prediction tool for transition from PD to HD. STUDY DESIGN: Retrospective cohort study. SETTING & PARTICIPANTS: Adults initiating PD between January 2008 and December 2011, followed up through June 2015, for whom data were available in the US Renal Data System (USRDS). PREDICTORS: Clinical characteristics at PD initiation and peritonitis claims. OUTCOMES: Transfer to HD, with the competing outcomes of death and kidney transplantation. ANALYTICAL APPROACH: Outcomes were ascertained from USRDS treatment history files. Subdistribution hazards (competing-risk) models were fit using clinical characteristics at PD initiation. A nomogram was developed to classify patient risk at 1, 2, 3, and 4 years. These data were used to generate quartiles of HD transfer risk; this quartile score was incorporated into a cause-specific hazards model that additionally included a time-dependent variable for peritonitis. RESULTS: 29,573 incident PD patients were followed up for a median of 21.6 (interquartile range, 9.0-42.3) months, during which 41.2% transferred to HD, 25.9% died, 17.1% underwent kidney transplantation, and the rest were followed up to the study end in June 2015. Claims for peritonitis were present in 11,733 (40.2%) patients. The proportion of patients still receiving PD decreased to <50% at 22.6 months and 14.2% at 5 years. Peritonitis was associated with a higher rate of HD transfer (HR, 1.82; 95% CI, 1.76-1.89; P < 0.001), as were higher quartile scores of HD transfer risk (HRs of 1.31 [95% CI, 1.25-1.37], 1.51 [95% CI, 1.45-1.58], and 1.78 [95% CI, 1.71-1.86] for quartiles 2, 3, and 4 compared to quartile 1 [P < 0.001 for all]). LIMITATIONS: Observational data, reliant on the Medical Evidence Report and Medicare claims. CONCLUSIONS: A large majority of the patients who initiated renal replacement therapy with PD discontinued this modality within 5 years. Transfer to HD was the most common outcome. Patient characteristics and comorbid diseases influenced the probability of HD transfer, death, and transplantation, as did episodes of peritonitis.
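Because death and transplantation compete with HD transfer, the cumulative incidence of transfer should be estimated with competing-risks methods. The study fit Fine-Gray subdistribution hazards models; the simpler sketch below uses the Aalen-Johansen estimator from lifelines on simulated data, with a hypothetical event coding (0 = censored, 1 = HD transfer, 2 = death, 3 = transplant).

```python
# Sketch: cumulative incidence of HD transfer in the presence of competing risks.
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(6)
n = 2000
time_months = rng.exponential(30, n)
# 0 = censored, 1 = HD transfer, 2 = death, 3 = kidney transplant (hypothetical coding)
event = rng.choice([0, 1, 2, 3], size=n, p=[0.16, 0.41, 0.26, 0.17])

ajf = AalenJohansenFitter()
ajf.fit(time_months, event, event_of_interest=1)
print(ajf.cumulative_density_.tail())   # estimated probability of HD transfer over time
```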


Subject(s)
Kidney Failure, Chronic/therapy, Patient Transfer/statistics & numerical data, Peritoneal Dialysis/methods, Renal Replacement Therapy/methods, Transitional Care/organization & administration, Aged, Female, Follow-Up Studies, Humans, Male, Middle Aged, Retrospective Studies
14.
BMC Pulm Med ; 19(1): 118, 2019 Jul 01.
Article in English | MEDLINE | ID: mdl-31262278

ABSTRACT

BACKGROUND: Despite well-defined criteria for use of antibiotics in patients presenting with mild to moderate Acute Exacerbation of Chronic Obstructive Pulmonary Disease (AECOPD), their overuse is widespread. We hypothesized that following implementation of a molecular multiplex respiratory viral panel (RVP), AECOPD patients with viral infections would be more easily identified, limiting antibiotic use in this population. The primary objective of our study was to investigate whether availability of the RVP decreased antibiotic prescription at discharge among patients with AECOPD. METHODS: This is a single-center, retrospective, before (pre-RVP)-after (post-RVP) study of patients admitted to a tertiary medical center from January 2013 to March 2016. The primary outcome was antibiotic prescription at discharge. Groups were compared using univariable and multivariable logistic regression. RESULTS: A total of 232 patient-episodes were identified, 133 following RVP introduction. Mean age was 68.1 (pre-RVP) and 68.3 (post-RVP) years respectively (p = 0.88). Patients in the pre-RVP group were similar to the post-RVP group with respect to gender (p = 0.54), proportion of patients with BMI < 21 (p = 0.23), positive smoking status (p = 0.19) and diagnoses of obstructive sleep apnea (OSA, p = 0.16). We found a significant reduction in antibiotic prescription rate at discharge in patients admitted with AECOPD after introduction of the respiratory viral assay (pre-RVP 77.8% vs. post-RVP 63.2%, p = 0.01). In adjusted analyses, patients in the pre-RVP group [OR 2.11 (CI: 1.13-3.96), p = 0.019] with positive gram stain in sputum [OR 4.02 (CI: 1.61-10.06), p = 0.003] had the highest odds of antibiotic prescription at discharge. CONCLUSIONS: In patients presenting with mild to moderate Acute Exacerbation of Chronic Obstructive Pulmonary Disease (AECOPD), utilization of a comprehensive respiratory viral panel can significantly decrease the rate of antibiotic prescription at discharge.


Subject(s)
Anti-Bacterial Agents/administration & dosage, Drug Prescriptions/statistics & numerical data, Patient Discharge/statistics & numerical data, Pulmonary Disease, Chronic Obstructive/drug therapy, Respiratory Tract Infections/drug therapy, Aged, Controlled Before-After Studies, Disease Progression, Female, Humans, Logistic Models, Male, Middle Aged, Multivariate Analysis, Pulmonary Disease, Chronic Obstructive/diagnosis, Pulmonary Disease, Chronic Obstructive/virology, Respiratory Tract Infections/diagnosis, Respiratory Tract Infections/virology, Retrospective Studies, Sputum/microbiology
15.
Clin Breast Cancer ; 19(4): 259-267.e1, 2019 08.
Article in English | MEDLINE | ID: mdl-31175052

ABSTRACT

BACKGROUND: Anthracycline agents can cause cardiotoxicity. We used multivariable risk prediction models to identify a subset of patients with breast cancer at high risk of cardiotoxicity, for whom the harms of anthracycline chemotherapy may balance or exceed the benefits. PATIENTS AND METHODS: A clinical prediction model for anthracycline cardiotoxicity was created in 967 patients with human epidermal growth factor receptor 2-negative breast cancer treated with doxorubicin in the ECOG-ACRIN study E5103. Cardiotoxicity was defined as left ventricular ejection fraction (LVEF) decline of ≥ 10% to < 50% and/or a centrally adjudicated clinical heart failure diagnosis. Patient-specific incremental absolute benefit of anthracyclines (compared with non-anthracycline taxane chemotherapy) was estimated using the PREDICT model to assess breast cancer mortality risk. RESULTS: Of the 967 women who initiated therapy, 51 (5.3%) developed cardiotoxicity (12 with clinical heart failure). In a multivariate model, increasing age (odds ratio [OR], 1.04; 95% confidence interval [CI], 1.01-1.08), higher body mass index (OR, 1.06; 95% CI, 1.02-1.10), and lower LVEF (OR, 0.93; 95% CI, 0.89-0.98) at baseline were significantly associated with cardiotoxicity. The concordance statistic of the risk model was 0.70 (95% CI, 0.63-0.77). In patients with low anticipated treatment benefit (n = 176) from the addition of anthracycline (< 2% absolute risk difference of breast cancer mortality at 10 years), 16 (9%) of 176 had a > 10% risk of cardiotoxicity and 61 (35%) of 176 had a 5% to 10% risk of cardiotoxicity at 1 year. CONCLUSION: Older age, higher body mass index, and lower baseline LVEF were associated with increased risk of cardiotoxicity. We identified a subgroup with low predicted absolute benefit of anthracyclines but with high predicted risk of cardiotoxicity. Additional studies are needed incorporating long-term cardiac outcomes and external validation of the cardiotoxicity model prior to implementation in routine clinical practice.
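A hedged sketch of the kind of multivariable logistic risk model and concordance statistic (c-statistic, i.e., area under the ROC curve) described above, on simulated data. The covariates mirror those named in the abstract, but the coefficients and event rate are invented, so the output is not the E5103 model.

```python
# Sketch: logistic cardiotoxicity risk model and its c-statistic (simulated data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 967
age = rng.normal(52, 10, n)
bmi = rng.normal(28, 6, n)
lvef = rng.normal(62, 5, n)

# Invented coefficients: risk rises with age and BMI, falls with higher baseline LVEF
true_logit = -7 + 0.04 * age + 0.06 * bmi - 0.07 * (lvef - 62)
cardiotox = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

X = np.column_stack([age, bmi, lvef])
model = LogisticRegression(max_iter=1000).fit(X, cardiotox)
auc = roc_auc_score(cardiotox, model.predict_proba(X)[:, 1])
print(f"apparent c-statistic: {auc:.2f}")
```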


Subject(s)
Antineoplastic Combined Chemotherapy Protocols/adverse effects, Breast Neoplasms/mortality, Cardiotoxicity/diagnosis, Decision Making, Heart Failure/diagnosis, Models, Statistical, Precision Medicine, Breast Neoplasms/drug therapy, Breast Neoplasms/pathology, Cardiotoxicity/epidemiology, Cardiotoxicity/etiology, Cyclophosphamide/administration & dosage, Doxorubicin/administration & dosage, Female, Follow-Up Studies, Heart Failure/chemically induced, Heart Failure/epidemiology, Humans, Incidence, Middle Aged, Predictive Value of Tests, Risk Assessment, Survival Rate, United States/epidemiology
16.
Resuscitation ; 139: 308-313, 2019 06.
Article in English | MEDLINE | ID: mdl-30836171

ABSTRACT

AIM: "Early" withdrawal of life support therapies (eWLST) within the first 3 calendar days after resuscitation from cardiac arrest (CA) is discouraged. We evaluated a prospective multicenter registry of patients admitted to hospitals after resuscitation from CA to determine predictors of eWLST and estimate its impact on outcomes. METHODS: CA survivors enrolled from 2012-2017 in the International Cardiac Arrest Registry (INTCAR) were included. We developed a propensity score for eWLST and matched a cohort with similar probabilities of eWLST who received ongoing care. The incidence of good outcome (Cerebral Performance Category of 1 or 2) was measured across deciles of eWLST in the matched cohort. RESULTS: 2688 patients from 24 hospitals were included. Median ischemic time was 20 (IQR 11, 30) minutes, and 1148 (43%) had an initial shockable rhythm. Withdrawal of life support occurred in 1162 (43%) cases, with 459 (17%) classified as eWLST. Older age, initial non-shockable rhythm, increased ischemic time, shock on admission, out-of-hospital arrest, and admission in the United States were each independently associated with eWLST. All patients with eWLST died, while in the matched cohort a good outcome occurred in 21% of patients. 19% of patients within the eWLST group were predicted to have a good outcome had eWLST not occurred. CONCLUSIONS: Early withdrawal of life support occurs frequently after cardiac arrest. Although the mortality of patients matched to those with eWLST was high, these data showed excess mortality with eWLST.
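A rough sketch of the propensity-matching step described above: model the probability of eWLST from admission covariates, then match each eWLST patient to a non-eWLST patient with a similar propensity score. The covariates, data, and 1:1 matching-with-replacement scheme are simplifying assumptions, not the registry analysis.

```python
# Sketch: propensity-score estimation and 1:1 nearest-neighbor matching (simulated data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(8)
n = 2688
X = np.column_stack([
    rng.normal(62, 14, n),      # age
    rng.integers(0, 2, n),      # initial shockable rhythm
    rng.normal(20, 10, n),      # ischemic time, minutes
])
ewlst = rng.binomial(1, 0.17, n)   # early withdrawal of life support (17% as reported)

ps = LogisticRegression(max_iter=1000).fit(X, ewlst).predict_proba(X)[:, 1]
treated = np.where(ewlst == 1)[0]
controls = np.where(ewlst == 0)[0]

# Match each eWLST patient to the control with the closest propensity score
nn = NearestNeighbors(n_neighbors=1).fit(ps[controls].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
matched_controls = controls[idx.ravel()]
print(f"{len(treated)} eWLST patients matched to {len(set(matched_controls))} unique controls")
```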


Subject(s)
Heart Arrest/mortality, Heart Arrest/therapy, Life Support Care, Withholding Treatment, Aged, Cardiopulmonary Resuscitation, Female, Humans, Male, Middle Aged, Prospective Studies, Time Factors
17.
Breastfeed Med ; 14(4): 230-235, 2019 05.
Article in English | MEDLINE | ID: mdl-30882237

ABSTRACT

Background: Olfactory maturation is essential for successful oral feeding. Previous studies have suggested that olfactory stimulation with maternal breast milk may expedite oral feeding skills in the premature infant; however, the optimal developmental window to utilize this intervention and sex-specific responses to stimuli are largely unknown. Objectives: To determine individual responses to olfactory stimulation with mother's own milk (MOM) on feeding outcomes in premature newborns. Materials and Methods: Infants born between 28 0/7 and 33 6/7 weeks' gestation (n = 36) were randomized to receive either MOM or water (sham) stimulus during the learning process of oral feeding. Clinical and feeding outcomes were recorded. Statistical analyses examined the effect of stimulation with MOM on feeding outcomes stratified for age and sex. Results: Overall, there was no significant difference between sham infants compared with MOM infants in mean postmenstrual age of full oral feeds (sham: 35 5/7 versus MOM 36 0/7; p = 0.37). However, when stratified by gestational age (GA), infants born <31 weeks' gestation who received MOM stimulation learned to feed sooner than controls (p = 0.06), whereas infants born ≥31 weeks' gestation learned to feed later than controls (p = 0.20) with a significant interaction (p = 0.02) between the stimulus (MOM versus sham) and dichotomized GA (<31 versus ≥31 weeks). There were no sex differences in response to olfactory stimulus. Conclusions: Infants born <31 weeks' GA who received MOM stimulation learned to feed sooner than control infants and the impact of MOM is significantly different between infants born before or after 31 weeks GA. These data suggest there may be an optimal time in development to utilize maternal breast milk to expedite oral feeding maturation in the premature newborn.
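The stratified result above amounts to a stimulus-by-gestational-age interaction test. A minimal sketch with an ordinary least squares model on simulated data follows; the outcome coding (postmenstrual age at full oral feeds, in weeks) and effect sizes are hypothetical, and the study's actual modeling details may differ.

```python
# Sketch: stimulus-by-gestational-age interaction on age at full oral feeds (simulated).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(10)
n = 36
df = pd.DataFrame({
    "mom": rng.integers(0, 2, n),        # 1 = mother's-own-milk stimulus, 0 = sham
    "ga_lt31": rng.integers(0, 2, n),    # 1 = born < 31 weeks' gestation
})
# Simulated postmenstrual age (weeks) at full oral feeds; MOM helps only if < 31 weeks
df["pma_full_feeds"] = (36.0 + 0.4 * df["ga_lt31"]
                        - 0.8 * df["mom"] * df["ga_lt31"]
                        + rng.normal(0, 0.7, n))

fit = smf.ols("pma_full_feeds ~ mom * ga_lt31", data=df).fit()
print(fit.summary().tables[1])   # the mom:ga_lt31 row is the interaction test
```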


Subject(s)
Infant, Premature, Milk, Human/chemistry, Odorants, Smell, Breast Feeding, Female, Gestational Age, Humans, Infant Nutritional Physiological Phenomena, Infant, Newborn, Kaplan-Meier Estimate, Male, Mothers, Prospective Studies
18.
Ann Surg Oncol ; 26(6): 1795-1804, 2019 Jun.
Article in English | MEDLINE | ID: mdl-30911945

ABSTRACT

BACKGROUND: Peritoneal lesions are common findings during operative abdominal cancer staging. The decision to perform biopsy is made subjectively by the surgeon, a practice the authors hypothesized to be imprecise. This study aimed to describe optical characteristics differentiating benign peritoneal lesions from peritoneal metastases. METHODS: The study evaluated laparoscopic images of 87 consecutive peritoneal lesions biopsied during staging laparoscopies for gastrointestinal malignancies from 2014 to 2017. A blinded survey assessing these lesions was completed by 10 oncologic surgeons. Three senior investigators categorized optical features of the lesions. Computer-aided digital image processing and machine learning were used to classify the lesions. RESULTS: Of the 87 lesions, 28 (32%) were metastases. On expert survey, surgeons on average misidentified 36 ± 19% of metastases. Multivariate analysis identified degree of nodularity, border transition, and degree of transparency as independent predictors of metastases (each p < 0.03), with an area under the receiver operating characteristics curve (AUC) of 0.82 (95% confidence interval [CI], 0.72-0.91). Image processing demonstrated no difference using image color segmentation, but showed a difference in gradient magnitude between benign and metastatic lesions (AUC, 0.66; 95% CI 0.54-0.78; p = 0.02). Machine learning using a neural network with a tenfold cross-validation obtained an AUC of only 0.47. CONCLUSIONS: To date, neither experienced oncologic surgeons nor computerized image analysis can differentiate peritoneal metastases from benign peritoneal lesions with an accuracy that is clinically acceptable. Although certain features correlate with the presence of metastases, a substantial overlap in optical appearance exists between benign and metastatic peritoneal lesions. Therefore, this study suggested the need to perform biopsy for all peritoneal lesions during operative staging, or at least to lower the threshold significantly.
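A hedged sketch of the gradient-magnitude feature evaluation mentioned above: compute a mean image-gradient magnitude per lesion patch and score its discrimination with ROC AUC. The patches below are synthetic noise stand-ins, so the resulting AUC is not expected to match the reported 0.66.

```python
# Sketch: mean gradient-magnitude feature per lesion patch, scored with ROC AUC.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(9)

def mean_gradient_magnitude(img: np.ndarray) -> float:
    gy, gx = np.gradient(img.astype(float))
    return float(np.mean(np.hypot(gx, gy)))

# 87 synthetic grayscale patches: 28 "metastatic" (label 1) given slightly sharper texture
labels = np.array([1] * 28 + [0] * 59)
patches = [rng.normal(0.5, 0.12 if y else 0.10, size=(64, 64)) for y in labels]

feature = np.array([mean_gradient_magnitude(p) for p in patches])
print(f"AUC of gradient-magnitude feature: {roc_auc_score(labels, feature):.2f}")
```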


Subject(s)
Adenocarcinoma/pathology, Gastrointestinal Neoplasms/pathology, Image Processing, Computer-Assisted/methods, Intraoperative Care, Machine Learning, Peritoneal Neoplasms/secondary, Practice Patterns, Physicians'/trends, Adenocarcinoma/surgery, Adult, Aged, Aged, 80 and over, Cohort Studies, Female, Follow-Up Studies, Gastrointestinal Neoplasms/surgery, Humans, Laparoscopy, Male, Middle Aged, Neoplasm Staging, Peritoneal Neoplasms/surgery, Prognosis
20.
Muscle Nerve ; 58(6): 852-854, 2018 12.
Article in English | MEDLINE | ID: mdl-30028521

ABSTRACT

INTRODUCTION: Benign fasciculations are common. Despite the favorable prognosis of benign fasciculation syndrome (BFS), patients are often anxious about their symptoms. In this study, we prospectively followed 35 patients with BFS over a 24-month period. METHODS: We conducted serial questionnaires to assess anxiety, associated symptoms, and duration. RESULTS: 71.4% of patients were men, and 34.4% were employed in the medical field. Most reported anxiety, but only 14% were anxious as measured by the Zung self-rating anxiety scale. Fasciculations were most common in the calves and persisted in 93% of patients. Anxiety levels did not change over time. Associated symptoms (subjective weakness, sensory symptoms, and cramps) were common and resolved to varying degrees. No patients developed motor neuron disease. DISCUSSION: BFS is a benign disorder that usually persists over time. Commonly associated symptoms include subjective weakness, sensory symptoms, and cramps. BFS is usually not associated with pathologic anxiety. Muscle Nerve 58:852-854, 2018.


Subject(s)
Anxiety/diagnosis, Anxiety/etiology, Neuromuscular Diseases/complications, Neuromuscular Diseases/psychology, Adult, Electromyography, Female, Humans, Longitudinal Studies, Male, Middle Aged, Prospective Studies, Psychiatric Status Rating Scales, Surveys and Questionnaires, Young Adult