Results 1 - 20 of 136
1.
Surg Innov ; 29(3): 378-384, 2022 Jun.
Article in English | MEDLINE | ID: mdl-34637364

ABSTRACT

BACKGROUND: During cancer operations, the cancer itself is often hard to delineate, buried beneath healthy tissue and lacking discernible differences from the surrounding healthy organ. Long-wave infrared, or thermal, imaging offers a unique solution to this problem, allowing for the real-time, label-free visualization of temperature deviations within the depth of tissues. The current study evaluated this technology for intraoperative cancer detection. METHODS: In this diagnostic study, patients with gastrointestinal, hepatobiliary, and renal cancers underwent long-wave infrared imaging of the malignancy during routine operations. RESULTS: Of the imaged cancers, 74% were clearly identifiable as hypothermic anomalies. The average temperature difference was 2.4°C (range 0.7 to 5.0) relative to the surrounding tissue. Cancers as deep as 3.3 cm from the surgical surface were visualized. Yet, 79% of the images had clinically relevant false positive signals [median 3 per image (range 0 to 10)], establishing an accuracy of 47%. Analysis suggests that the degree of temperature difference was primarily determined by features within the cancer and not by peritumoral changes in the surrounding tissue. CONCLUSION: These findings provide important information on the unexpected hypothermic properties of intra-abdominal cancers, directions for future use of intraoperative long-wave infrared imaging, and new knowledge about the in vivo thermal energy expenditure of cancers and peritumoral tissue.


Subject(s)
Neoplasms , Humans , Temperature
2.
J Card Fail ; 27(5): 552-559, 2021 05.
Article in English | MEDLINE | ID: mdl-33450411

ABSTRACT

BACKGROUND: Elevated pulmonary vascular resistance (PVR) is common in patients with advanced heart failure. PVR generally improves after left ventricular assist device (LVAD) implantation, but the rate of decrease has not been quantified and the patient characteristics most strongly associated with this improvement are unknown. METHODS AND RESULTS: We analyzed 1581 patients from the Interagency Registry for Mechanically Assisted Circulatory Support registry who received a primary continuous-flow LVAD, had a baseline PVR of ≥3 Wood units (WU), and had PVR measured at least once postoperatively. Multivariable linear mixed effects modeling was used to evaluate independent associations between postoperative PVR and patient characteristics. PVR decreased by 1.53 WU (95% confidence interval [CI] 1.27-1.79 WU) per month in the first 3 months postoperatively, and by 0.066 WU (95% CI 0.060-0.070 WU) per month thereafter. Severe mitral regurgitation at any time during follow-up was associated with a 1.29 WU (95% CI 1.05-1.52 WU) higher PVR relative to absence of mitral regurgitation at that time. In a cross-sectional analysis, 15%-25% of patients had persistently elevated PVR of ≥3 WU at any given time within 36 months after LVAD implantation. CONCLUSION: The PVR tends to decrease rapidly early after implantation, and only more gradually thereafter. Residual mitral regurgitation may be an important contributor to elevated postoperative PVR. Future research is needed to understand the implications of elevated PVR after LVAD implantation and the optimal strategies for prevention and treatment.
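A minimal sketch of how the piecewise decline in PVR could be modeled with a linear mixed-effects regression. The long-format table and its column names (patient_id, months_post_lvad, pvr_wu, severe_mr) are assumptions for illustration, not the actual INTERMACS dataset:

```python
# Piecewise linear mixed-effects model for postoperative PVR (sketch).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

pvr = pd.read_csv("pvr_long.csv")  # hypothetical long-format file

# Split follow-up time into an early slope (first 3 months) and a late slope,
# mirroring the two rates of decline reported in the abstract.
pvr["t_early"] = np.minimum(pvr["months_post_lvad"], 3.0)
pvr["t_late"] = np.maximum(pvr["months_post_lvad"] - 3.0, 0.0)

model = smf.mixedlm(
    "pvr_wu ~ t_early + t_late + severe_mr",
    data=pvr,
    groups=pvr["patient_id"],   # random intercept per patient
    re_formula="~t_early",      # optional random early slope
)
fit = model.fit()
print(fit.summary())  # coefficients on t_early/t_late ~ WU change per month
```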


Subject(s)
Heart Failure , Heart Transplantation , Heart-Assist Devices , Hypertension, Pulmonary , Cross-Sectional Studies , Heart Failure/therapy , Humans , Retrospective Studies , Treatment Outcome , Vascular Resistance
3.
Transpl Infect Dis ; 23(4): e13634, 2021 Aug.
Article in English | MEDLINE | ID: mdl-33982834

ABSTRACT

BACKGROUND: Neutropenia is a serious complication following heart transplantation (OHT); however, risk factors for its development and its association with outcomes are not well described. We sought to study the prevalence of neutropenia, risk factors associated with its development, and its impact on infection, rejection, and survival. METHODS: A retrospective single-center analysis of adult OHT recipients from July 2004 to December 2017 was performed. Demographic, laboratory, medication, infection, rejection, and survival data were collected for 1 year post-OHT. Baseline laboratory measurements were collected within the 24 hours before OHT. Neutropenia was defined as absolute neutrophil count ≤1000 cells/mm3. Cox proportional hazards models explored associations with time to first neutropenia. Associations between neutropenia, analyzed as a time-dependent covariate, and the secondary outcomes of time to infection, rejection, or death were also examined. RESULTS: Of 278 OHT recipients, 84 (30%) developed neutropenia at a median of 142 days (range 81-228) after transplant. Factors independently associated with increased risk of neutropenia included lower baseline WBC (HR 1.12; 95% CI 1.11-1.24), pre-OHT ventricular assist device (1.63; 1.00-2.66), high-risk CMV serostatus [donor positive, recipient negative] (1.86; 1.19-2.88), and previous CMV infection (4.07; 3.92-13.7). CONCLUSIONS: Neutropenia is a fairly common occurrence after adult OHT. CMV infection was associated with subsequent neutropenia; however, no statistically significant differences in outcomes were found between neutropenic and non-neutropenic patients in this small study. It remains to be determined in future studies whether medication changes in response to neutropenia would impact patient outcomes.
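As an illustration of a Cox model with a time-dependent covariate such as prior CMV infection, here is a minimal sketch using lifelines on a hypothetical counting-process (start/stop) table; the file and column names are assumptions, not the study's actual variables:

```python
# Time-varying Cox regression for time to first neutropenia (sketch).
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Hypothetical intervals with columns: id, start, stop, neutropenia (event at
# `stop`), prior_cmv (time-dependent 0/1), baseline_wbc.
intervals = pd.read_csv("neutropenia_intervals.csv")

ctv = CoxTimeVaryingFitter()
ctv.fit(
    intervals,
    id_col="id",
    start_col="start",
    stop_col="stop",
    event_col="neutropenia",
)
ctv.print_summary()  # exp(coef) column gives hazard ratios, e.g. for prior_cmv
```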


Subject(s)
Cytomegalovirus Infections , Heart Transplantation , Heart-Assist Devices , Neutropenia , Heart Transplantation/adverse effects , Heart-Assist Devices/adverse effects , Humans , Neutropenia/epidemiology , Retrospective Studies
4.
Anesth Analg ; 132(3): 698-706, 2021 03 01.
Article in English | MEDLINE | ID: mdl-32332290

ABSTRACT

BACKGROUND: The proportion of live births by cesarean delivery (CD) in China is significant, with some, particularly rural, provinces reporting up to 62.5%. The No Pain Labor & Delivery-Global Health Initiative (NPLD-GHI) was established to improve obstetric and neonatal outcomes in China, including through a reduction of CD through educational efforts. The purpose of this study was to determine whether a reduction in CD at a rural Chinese hospital occurred after NPLD-GHI. We hypothesized that a reduction in CD trend would be observed. METHODS: The NPLD-GHI program visited the Weixian Renmin Hospital, Hebei Province, China, from June 15 to 21, 2014. The educational intervention included problem-based learning, bedside teaching, simulation drill training, and multidisciplinary debriefings. An interrupted time-series analysis using segmented logistic regression models was performed on data collected between June 1, 2013 and May 31, 2015 to assess whether the level and/or trend over time in the proportion of CD births would decline after the program intervention. The primary outcome was monthly proportion of CD births. Secondary outcomes included neonatal intensive care unit (NICU) admissions and extended NICU length of stay, neonatal antibiotic and intubation use, and labor epidural analgesia use. RESULTS: Following NPLD-GHI, there was a level decrease in CD with an estimated odds ratio (95% confidence interval [CI]) of 0.87 (0.78-0.98), P = .017, with odds (95% CI) of monthly CD reduction an estimated 3% (1-5; P < .001), more in the post- versus preintervention periods. For labor epidural analgesia, there was a level increase (estimated odds ratio [95% CI] of 1.76 [1.48-2.09]; P < .001) and a slope decrease (estimated odds ratio [95% CI] of 0.94 [0.92-0.97]; P < .001). NICU admissions did not have a level change (estimated odds ratio [95% CI] of 0.99 [0.87-1.12]; P = .835), but the odds (95% CI) of monthly reduction in NICU admission was estimated 9% (7-11; P < .001), greater in post- versus preintervention. Neonatal intubation level and slope changes were not statistically significant. For neonatal antibiotic administration, while the level change was not statistically significant, there was a decrease in the slope with an odds (95% CI) of monthly reduction estimated 6% (3-9; P < .001), greater post- versus preintervention. CONCLUSIONS: In a large, rural Chinese hospital, live births by CD were lower following NPLD-GHI and associated with increased use of labor epidural analgesia. We also found decreasing NICU admissions. International-based educational programs can significantly alter practices associated with maternal and neonatal outcomes.
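A sketch of the interrupted time-series idea using segmented logistic regression, assuming one row per delivery and a hypothetical month index with the intervention at month 12; the column names are illustrative, not the study's data dictionary:

```python
# Segmented (interrupted time series) logistic regression for cesarean delivery (sketch).
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical per-delivery rows: month (0..23), cd (1 = cesarean),
# post (1 = after the June 2014 program).
births = pd.read_csv("deliveries.csv")
births["months_since_intervention"] = (births["month"] - 12).clip(lower=0)

# `post` captures the level change at the intervention; the counter of months
# since the intervention captures the change in monthly trend (slope).
its = smf.glm(
    "cd ~ month + post + months_since_intervention",
    data=births,
    family=sm.families.Binomial(),
).fit()
print(its.summary())  # exponentiated coefficients give odds ratios
```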


Subject(s)
Analgesia, Epidural/trends , Analgesia, Obstetrical/trends , Cesarean Section/trends , Inservice Training , Labor Pain/drug therapy , Pain Management/trends , Adult , Analgesia, Epidural/adverse effects , Analgesia, Obstetrical/adverse effects , Cesarean Section/adverse effects , China , Female , Health Knowledge, Attitudes, Practice , Hospitals, Rural/trends , Humans , Infant, Newborn , Intensive Care, Neonatal/trends , Interrupted Time Series Analysis , Labor Pain/etiology , Live Birth , Pain Management/adverse effects , Patient Care Team , Pregnancy , Program Evaluation , Treatment Outcome , Young Adult
5.
Public Health Nutr ; 24(9): 2577-2591, 2021 06.
Article in English | MEDLINE | ID: mdl-32489172

ABSTRACT

OBJECTIVE: To quantify diet-related burdens of cardiometabolic diseases (CMD) by country, age and sex in Latin America and the Caribbean (LAC). DESIGN: Intakes of eleven key dietary factors were obtained from the Global Dietary Database Consortium. Aetiologic effects of dietary factors on CMD outcomes were obtained from meta-analyses. We combined these inputs with cause-specific mortality data to compute country-, age- and sex-specific absolute and proportional CMD mortality of eleven dietary factors in 1990 and 2010. SETTING: Thirty-two countries in LAC. PARTICIPANTS: Adults aged 25 years and older. RESULTS: In 2010, an estimated 513 371 (95 % uncertainty interval (UI) 423 286-547 841; 53·8 %) cardiometabolic deaths were related to suboptimal diet. Largest diet-related CMD burdens were related to low intake of nuts/seeds (109 831 deaths (95 % UI 71 920-121 079); 11·5 %), low fruit intake (106 285 deaths (95 % UI 94 904-112 320); 11·1 %) and high processed meat consumption (89 381 deaths (95 % UI 82 984-97 196); 9·4 %). Among countries, highest CMD burdens (deaths per million adults) attributable to diet were in Trinidad and Tobago (1779) and Guyana (1700) and the lowest were in Peru (492) and The Bahamas (504). Between 1990 and 2010, greatest decline (35 %) in diet-attributable CMD mortality was related to greater consumption of fruit, while greatest increase (7·2 %) was related to increased intakes of sugar-sweetened beverages. CONCLUSIONS: Suboptimal intakes of commonly consumed foods were associated with substantial CMD mortality in LAC with significant heterogeneity across countries. Improved access to healthful foods, such as nuts and fruits, and limits in availability of unhealthful factors, such as processed foods, would reduce diet-related burdens of CMD in LAC.
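The comparative risk assessment combines intake distributions, aetiologic relative risks, and cause-specific mortality. The toy sketch below shows only the core attributable-fraction arithmetic for a single binary exposure with placeholder numbers; the study itself used continuous intake distributions and optimal-intake counterfactuals:

```python
# Toy population attributable fraction (PAF) calculation; all values are
# placeholders, not estimates from the study.
exposed_fraction = 0.60   # hypothetical share of adults with low fruit intake
relative_risk = 1.25      # hypothetical RR of CMD death given low intake
cmd_deaths = 100_000      # hypothetical cause-specific deaths in a stratum

paf = exposed_fraction * (relative_risk - 1) / (
    exposed_fraction * (relative_risk - 1) + 1
)
attributable_deaths = paf * cmd_deaths
print(f"PAF = {paf:.3f}, attributable deaths = {attributable_deaths:,.0f}")
```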


Subject(s)
Cardiovascular Diseases , Diabetes Mellitus , Adult , Cardiovascular Diseases/etiology , Diet , Feeding Behavior , Humans , Latin America/epidemiology , Nutrition Surveys , Nuts , Risk Assessment , Risk Factors
6.
Stroke ; 51(10): 3119-3123, 2020 10.
Article in English | MEDLINE | ID: mdl-32921262

ABSTRACT

BACKGROUND AND PURPOSE: In patients with cryptogenic stroke and patent foramen ovale (PFO), the Risk of Paradoxical Embolism (RoPE) Score has been proposed as a method to estimate a patient-specific "PFO-attributable fraction"-the probability that a documented PFO is causally-related to the stroke, rather than an incidental finding. The objective of this research is to examine the relationship between this RoPE-estimated PFO-attributable fraction and the effect of closure in 3 randomized trials. METHODS: We pooled data from the CLOSURE-I (Evaluation of the STARFlex Septal Closure System in Patients With a Stroke and/or Transient Ischemic Attack due to Presumed Paradoxical Embolism through a Patent Foramen Ovale), RESPECT (Randomized Evaluation of Recurrent Stroke Comparing PFO Closure to Established Current Standard of Care Treatment), and PC (Clinical Trial Comparing Percutaneous Closure of Patent Foramen Ovale [PFO] Using the Amplatzer PFO Occluder With Medical Treatment in Patients With Cryptogenic Embolism) trials. We examine the treatment effect of closure in high RoPE score (≥7) versus low RoPE score (<7) patients. We also estimated the relative risk reduction associated with PFO closure across each level of the RoPE score using Cox proportional hazard analysis. We estimated a patient-specific attributable fraction using a PC trial-compatible (9-point) RoPE equation (omitting the neuroradiology variable), as well as a 2-trial analysis using the original (10-point) RoPE equation. We examined the Pearson correlation between the estimated attributable fraction and the relative risk reduction across RoPE strata. RESULTS: In the low RoPE score group (<7, n=912), the rate of recurrent strokes per 100 person-years was 1.37 in the device arm versus 1.68 in the medical arm (hazard ratio, 0.82 [0.42-1.59] P=0.56) compared with 0.30 versus 1.03 (hazard ratio, 0.31 [0.11-0.85] P=0.02) in the high RoPE score group (≥7, n=1221); treatment-by-RoPE score group interaction, P=0.12. The RoPE score estimated attributable fraction anticipated the relative risk reduction across all levels of the RoPE score, in both the 3-trial (r=0.95, P<0.001) and 2-trial (r=0.92, P<0.001) analyses. CONCLUSIONS: The RoPE score estimated attributable fraction is highly correlated to the relative risk reduction of device versus medical therapy. This observation suggests the RoPE score identifies patients with cryptogenic stroke who are likely to have a PFO that is pathogenic rather than incidental.
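To illustrate the stratified analysis, the sketch below fits a Cox model within each RoPE score level, converts the hazard ratio for device closure into a relative risk reduction, and correlates it with a score-specific attributable fraction. The pooled data frame and its columns (rope_score, years, stroke, device, attr_frac) are assumptions for illustration only:

```python
# Per-stratum Cox models and correlation with attributable fraction (sketch).
import numpy as np
import pandas as pd
from scipy.stats import pearsonr
from lifelines import CoxPHFitter

pooled = pd.read_csv("pooled_pfo_trials.csv")  # hypothetical pooled trial data

rows = []
for score, grp in pooled.groupby("rope_score"):
    cph = CoxPHFitter().fit(
        grp[["years", "stroke", "device"]],
        duration_col="years",
        event_col="stroke",
    )
    hr = float(np.exp(cph.params_["device"]))
    rows.append({
        "rope_score": score,
        "rrr": 1.0 - hr,                       # relative risk reduction
        "attr_frac": grp["attr_frac"].iloc[0],  # score-specific attributable fraction
    })

rows = pd.DataFrame(rows)
r, p = pearsonr(rows["attr_frac"], rows["rrr"])
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```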


Subject(s)
Embolism, Paradoxical/etiology , Foramen Ovale, Patent/complications , Stroke/complications , Cardiac Catheterization , Foramen Ovale, Patent/surgery , Humans , Risk Factors , Secondary Prevention , Treatment Outcome
7.
J Pediatr ; 214: 60-65.e2, 2019 11.
Article in English | MEDLINE | ID: mdl-31474426

ABSTRACT

OBJECTIVES: To evaluate salivary biomarkers that elucidate the molecular mechanisms by which in utero opioid exposure exerts sex-specific effects on select hypothalamic and reward genes driving hyperphagia, a hallmark symptom of infants suffering from neonatal opioid withdrawal syndrome (NOWS). STUDY DESIGN: We prospectively collected saliva from 50 newborns born at ≥34 weeks of gestational age with prenatal opioid exposure and 50 sex- and gestational age-matched infants without exposure. Saliva underwent transcriptomic analysis for 4 select genes involved in homeostatic and hedonic feeding regulation (neuropeptide Y2 receptor [NPY2R], proopiomelanocortin [POMC], leptin receptor [LEPR], dopamine type 2 receptor [DRD2]). Normalized gene expression data were stratified based on sex and correlated with feeding volume on day of life 7 and length of stay in infants with NOWS requiring pharmacotherapy. RESULTS: Expression of DRD2, a hedonistic/reward regulator, was significantly higher in male newborns compared with female newborns with NOWS (Δ threshold cycle 10.8 ± 3.8 vs 13.9 ± 3.7, P = .01). In NOWS requiring pharmacotherapy expression of leptin receptor, an appetite suppressor, was higher in male subjects than female subjects (Δ threshold cycle 8.4 ± 2.5 vs 12.4 ± 5.1, P = .05), DRD2 expression significantly correlated with intake volume on day of life 7 (r = 0.58, P = .02), and expression of NPY2R, an appetite regulator, negatively correlated with length of stay (r = -0.24, P = .05). CONCLUSIONS: Prenatal opioid exposure exerts sex-dependent effects on hypothalamic feeding regulatory genes with clinical correlations. Neonatal salivary gene expression analyses may predict hyperphagia, severity of withdrawal state, and length of stay in infants with NOWS.


Subject(s)
Analgesics, Opioid/adverse effects , Gene Expression , Hyperphagia/etiology , Neonatal Abstinence Syndrome/genetics , Saliva/chemistry , Case-Control Studies , Female , Gene Expression Profiling , Genetic Markers , Humans , Infant, Newborn , Male , Neonatal Abstinence Syndrome/complications , Pilot Projects , Pro-Opiomelanocortin/genetics , Prospective Studies , Receptors, Dopamine D2/genetics , Receptors, Leptin/genetics , Receptors, Neuropeptide Y/genetics , Severity of Illness Index , Sex Factors
8.
Am J Kidney Dis ; 74(5): 620-628, 2019 11.
Article in English | MEDLINE | ID: mdl-31301926

ABSTRACT

RATIONALE & OBJECTIVE: Identifying patients who are likely to transfer from peritoneal dialysis (PD) to hemodialysis (HD) before transition could improve their subsequent care. This study developed a prediction tool for transition from PD to HD. STUDY DESIGN: Retrospective cohort study. SETTING & PARTICIPANTS: Adults initiating PD between January 2008 and December 2011, followed up through June 2015, for whom data were available in the US Renal Data System (USRDS). PREDICTORS: Clinical characteristics at PD initiation and peritonitis claims. OUTCOMES: Transfer to HD, with the competing outcomes of death and kidney transplantation. ANALYTICAL APPROACH: Outcomes were ascertained from USRDS treatment history files. Subdistribution hazards (competing-risk) models were fit using clinical characteristics at PD initiation. A nomogram was developed to classify patient risk at 1, 2, 3, and 4 years. These data were used to generate quartiles of HD transfer risk; this quartile score was incorporated into a cause-specific hazards model that additionally included a time-dependent variable for peritonitis. RESULTS: 29,573 incident PD patients were followed up for a median of 21.6 (interquartile range, 9.0-42.3) months, during which 41.2% transferred to HD, 25.9% died, 17.1% underwent kidney transplantation, and the rest were followed up to the study end in June 2015. Claims for peritonitis were present in 11,733 (40.2%) patients. The proportion of patients still receiving PD decreased to <50% at 22.6 months and 14.2% at 5 years. Peritonitis was associated with a higher rate of HD transfer (HR, 1.82; 95% CI, 1.76-1.89; P < 0.001), as were higher quartile scores of HD transfer risk (HRs of 1.31 [95% CI, 1.25-1.37), 1.51 [95% CI, 1.45-1.58], and 1.78 [95% CI, 1.71-1.86] for quartiles 2, 3, and 4 compared to quartile 1 [P < 0.001 for all]). LIMITATIONS: Observational data, reliant on the Medical Evidence Report and Medicare claims. CONCLUSIONS: A large majority of the patients who initiated renal replacement therapy with PD discontinued this modality within 5 years. Transfer to HD was the most common outcome. Patient characteristics and comorbid diseases influenced the probability of HD transfer, death, and transplantation, as did episodes of peritonitis.


Subject(s)
Kidney Failure, Chronic/therapy , Patient Transfer/statistics & numerical data , Peritoneal Dialysis/methods , Renal Replacement Therapy/methods , Transitional Care/organization & administration , Aged , Female , Follow-Up Studies , Humans , Male , Middle Aged , Retrospective Studies
9.
Ann Surg Oncol ; 26(6): 1795-1804, 2019 Jun.
Article in English | MEDLINE | ID: mdl-30911945

ABSTRACT

BACKGROUND: Peritoneal lesions are common findings during operative abdominal cancer staging. The decision to perform biopsy is made subjectively by the surgeon, a practice the authors hypothesized to be imprecise. This study aimed to describe optical characteristics differentiating benign peritoneal lesions from peritoneal metastases. METHODS: The study evaluated laparoscopic images of 87 consecutive peritoneal lesions biopsied during staging laparoscopies for gastrointestinal malignancies from 2014 to 2017. A blinded survey assessing these lesions was completed by 10 oncologic surgeons. Three senior investigators categorized optical features of the lesions. Computer-aided digital image processing and machine learning was used to classify the lesions. RESULTS: Of the 87 lesions, 28 (32%) were metastases. On expert survey, surgeons on the average misidentified 36 ± 19% of metastases. Multivariate analysis identified degree of nodularity, border transition, and degree of transparency as independent predictors of metastases (each p < 0.03), with an area under the receiver operating characteristics curve (AUC) of 0.82 (95% confidence interval [CI], 0.72-0.91). Image processing demonstrated no difference using image color segmentation, but showed a difference in gradient magnitude between benign and metastatic lesions (AUC, 0.66; 95% CI 0.54-0.78; p = 0.02). Machine learning using a neural network with a tenfold cross-validation obtained an AUC of only 0.47. CONCLUSIONS: To date, neither experienced oncologic surgeons nor computerized image analysis can differentiate peritoneal metastases from benign peritoneal lesions with an accuracy that is clinically acceptable. Although certain features correlate with the presence of metastases, a substantial overlap in optical appearance exists between benign and metastatic peritoneal lesions. Therefore, this study suggested the need to perform biopsy for all peritoneal lesions during operative staging, or at least to lower the threshold significantly.
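A sketch of the machine-learning evaluation described above: a small neural-network classifier scored by 10-fold cross-validated AUC. The feature matrix (e.g., gradient-magnitude statistics per lesion) and file names are assumptions, not the study's actual pipeline:

```python
# Neural-network lesion classifier with 10-fold cross-validated AUC (sketch).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

X = np.load("lesion_features.npy")  # hypothetical per-lesion feature matrix
y = np.load("lesion_labels.npy")    # hypothetical labels (1 = metastasis)

clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
auc_scores = cross_val_score(clf, X, y, cv=10, scoring="roc_auc")
print(f"10-fold CV AUC: {auc_scores.mean():.2f} (SD {auc_scores.std():.2f})")
```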


Subject(s)
Adenocarcinoma/pathology , Gastrointestinal Neoplasms/pathology , Image Processing, Computer-Assisted/methods , Intraoperative Care , Machine Learning , Peritoneal Neoplasms/secondary , Practice Patterns, Physicians'/trends , Adenocarcinoma/surgery , Adult , Aged , Aged, 80 and over , Cohort Studies , Female , Follow-Up Studies , Gastrointestinal Neoplasms/surgery , Humans , Laparoscopy , Male , Middle Aged , Neoplasm Staging , Peritoneal Neoplasms/surgery , Prognosis
10.
BMC Pulm Med ; 19(1): 118, 2019 Jul 01.
Article in English | MEDLINE | ID: mdl-31262278

ABSTRACT

BACKGROUND: Despite well-defined criteria for use of antibiotics in patients presenting with mild to moderate Acute Exacerbation of Chronic Obstructive Pulmonary Disease (AECOPD), their overuse is widespread. We hypothesized that following implementation of a molecular multiplex respiratory viral panel (RVP), AECOPD patients with viral infections would be more easily identified, limiting antibiotic use in this population. The primary objective of our study was to investigate if availability of the RVP decreased antibiotic prescription at discharge among patients with AECOPD. METHODS: This is a single center, retrospective, before (pre-RVP) - after (post-RVP) study of patients admitted to a tertiary medical center from January 2013 to March 2016. The primary outcome was antibiotic prescription at discharge. Groups were compared using univariable and multivariable logistic-regression. RESULTS: A total of 232 patient-episodes were identified, 133 following RVP introduction. Mean age was 68.1 (pre-RVP) and 68.3 (post-RVP) years respectively (p = 0.88). Patients in pre-RVP group were similar to the post-RVP group with respect to gender (p = 0.54), proportion of patients with BMI < 21(p = 0.23), positive smoking status (p = 0.19) and diagnoses of obstructive sleep apnea (OSA, p = 0.16). We found a significant reduction in antibiotic prescription rate at discharge in patients admitted with AECOPD after introduction of the respiratory viral assay (pre-RVP 77.8% vs. post-RVP 63.2%, p = 0.01). In adjusted analyses, patients in the pre-RVP group [OR 2.11 (CI: 1.13-3.96), p = 0.019] with positive gram stain in sputum [OR 4.02 (CI: 1.61-10.06), p = 0.003] had the highest odds of antibiotic prescription at discharge. CONCLUSIONS: In patients presenting with mild to moderate Acute Exacerbation of Chronic Obstructive Pulmonary Disease (AECOPD), utilization of a comprehensive respiratory viral panel can significantly decrease the rate of antibiotic prescription at discharge.
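A minimal sketch of the adjusted comparison of antibiotic prescribing before versus after RVP introduction, using multivariable logistic regression; the variable names are illustrative assumptions:

```python
# Adjusted odds of antibiotic prescription at discharge, pre- vs post-RVP (sketch).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

episodes = pd.read_csv("aecopd_episodes.csv")  # hypothetical per-episode data

fit = smf.logit(
    "abx_at_discharge ~ pre_rvp + sputum_gram_stain_pos + age + current_smoker",
    data=episodes,
).fit()
odds_ratios = np.exp(fit.params)
print(odds_ratios)  # e.g., OR for the pre-RVP period and for a positive gram stain
```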


Subject(s)
Anti-Bacterial Agents/administration & dosage , Drug Prescriptions/statistics & numerical data , Patient Discharge/statistics & numerical data , Pulmonary Disease, Chronic Obstructive/drug therapy , Respiratory Tract Infections/drug therapy , Aged , Controlled Before-After Studies , Disease Progression , Female , Humans , Logistic Models , Male , Middle Aged , Multivariate Analysis , Pulmonary Disease, Chronic Obstructive/diagnosis , Pulmonary Disease, Chronic Obstructive/virology , Respiratory Tract Infections/diagnosis , Respiratory Tract Infections/virology , Retrospective Studies , Sputum/microbiology
11.
Appetite ; 142: 104348, 2019 11 01.
Article in English | MEDLINE | ID: mdl-31299192

ABSTRACT

Eating behaviors such as eating fast and ignoring internal satiety cues are associated with overweight/obesity, and may be influenced by environmental factors. This study examined changes in those behaviors, and associations between those behaviors and BMI, cardiometabolic biomarkers, and diet quality in military recruits before and during initial military training (IMT), an environment wherein access to food is restricted. Eating rate and reliance on internal satiety cues were self-reported, and BMI, body fat, cardiometabolic biomarkers, and diet quality were measured in 1389 Army, Air Force and Marine recruits (45% female, mean ± SEM BMI = 24.1 ± 0.1 kg/m2) before and after IMT. Pre-IMT, habitually eating fast relative to slowly was associated with a 1.1 ± 0.3 kg/m2 higher BMI (P < 0.001), but not with other outcomes; whereas, habitually eating until no food is left (i.e., ignoring internal satiety cues) was associated with lower diet quality (P < 0.001) and, in men, 1.6 ± 0.6% lower body fat (P = 0.03) relative to those that habitually stopped eating before feeling full. More recruits reported eating fast (82% vs 39%) and a reduced reliance on internal satiety cues (55% vs 16%) during IMT relative to pre-IMT (P < 0.001). Findings suggest that eating behaviors correlate with body composition and/or diet quality in young, predominantly normal-weight recruits entering the military, and that IMT is associated with potentially unfavorable changes in these eating behaviors.


Subject(s)
Body Mass Index , Feeding Behavior , Military Personnel , Self Report , Adolescent , Adult , Biomarkers/blood , Body Composition , Body Weight , Diet , Female , Humans , Male , Obesity/epidemiology , Overweight/epidemiology , Physical Fitness , Satiation , Surveys and Questionnaires , United States , Young Adult
12.
Clin Infect Dis ; 67(9): 1395-1402, 2018 10 15.
Article in English | MEDLINE | ID: mdl-29635432

ABSTRACT

Background: Recurrent cytomegalovirus (CMV) disease in solid organ transplant recipients frequently occurs despite effective antiviral therapy. We previously demonstrated that patients with lymphopenia before liver transplantation are more likely to develop posttransplant infectious complications including CMV. The aim of this study was to explore absolute lymphocyte count (ALC) as a predictor of relapse following treatment for CMV disease. Methods: We performed a retrospective cohort study of heart, liver, and kidney transplant recipients treated for an episode of CMV disease. Our primary outcome was time to relapse of CMV within 6 months. Data on potential predictors of relapse including ALC were collected at the time of CMV treatment completion. Univariate and multivariate hazard ratios (HRs) were calculated with a Cox model. Multiple imputation was used to complete the data. Results: Relapse occurred in 33 of 170 participants (19.4%). Mean ALC in relapse-free patients was 1.08 ± 0.69 vs 0.73 ± 0.42 × 103 cells/µL in those who relapsed, corresponding to an unadjusted hazard ratio of 1.11 (95% confidence interval, 1.03-1.21; P = .009, n = 133) for every decrease of 100 cells/µL. After adjusting for potential confounders, the association between ALC and relapse remained significant (HR, 1.11 [1.03-1.20]; P = .009). Conclusions: Low ALC at the time of CMV treatment completion was a strong independent predictor for recurrent CMV disease. This finding is biologically plausible given the known importance of T-cell immunity in maintaining CMV latency. Future studies should consider this inexpensive, readily available marker of host immunity.


Subject(s)
Cytomegalovirus Infections/diagnosis , Cytomegalovirus Infections/immunology , Lymphocyte Count , Organ Transplantation/adverse effects , Transplant Recipients , Adolescent , Adult , Aged , Aged, 80 and over , Antiviral Agents/therapeutic use , Cytomegalovirus , Cytomegalovirus Infections/drug therapy , Electronic Health Records , Female , Humans , Lymphopenia , Male , Middle Aged , Proportional Hazards Models , Recurrence , Retrospective Studies , Risk Assessment , Risk Factors , T-Lymphocytes/immunology , Young Adult
13.
Muscle Nerve ; 58(6): 852-854, 2018 12.
Article in English | MEDLINE | ID: mdl-30028521

ABSTRACT

INTRODUCTION: Benign fasciculations are common. Despite the favorable prognosis of benign fasciculation syndrome (BFS), patients are often anxious about their symptoms. In this study, we prospectively followed 35 patients with BFS over a 24-month period. METHODS: We conducted serial questionnaires to assess anxiety, associated symptoms, and duration. RESULTS: 71.4% of patients were men, and 34.4% were employed in the medical field. Most reported anxiety, but only 14% were anxious as measured by the Zung self-rating anxiety scale. Fasciculations were most common in the calves and persisted in 93% of patients. Anxiety levels did not change over time. Associated symptoms (subjective weakness, sensory symptoms, and cramps) were common and resolved to varying degrees. No patients developed motor neuron disease. DISCUSSION: BFS is a benign disorder that usually persists over time. Commonly associated symptoms include subjective weakness, sensory symptoms, and cramps. BFS is usually not associated with pathologic anxiety. Muscle Nerve 58:852-854, 2018.


Subject(s)
Anxiety/diagnosis , Anxiety/etiology , Neuromuscular Diseases/complications , Neuromuscular Diseases/psychology , Adult , Electromyography , Female , Humans , Longitudinal Studies , Male , Middle Aged , Prospective Studies , Psychiatric Status Rating Scales , Surveys and Questionnaires , Young Adult
14.
Am Heart J ; 188: 18-25, 2017 Jun.
Article in English | MEDLINE | ID: mdl-28577674

ABSTRACT

BACKGROUND: While infarct size in patients with ST-segment elevation myocardial infarction (STEMI) has been generally associated with long-term prognosis, whether a therapeutic effect on infarct size has a corresponding therapeutic effect on long-term outcomes is unknown. METHODS: Using combined patient-level data from 10 randomized trials of primary percutaneous coronary intervention (PCI) for STEMI, we created multivariable Cox proportional hazard models for one-year heart failure hospitalization and all-cause mortality, which included clinical features and a variable representing treatment effect on infarct size. The trials included 2679 participants; infarct size was measured at a median 4 days post infarction. RESULTS: Mean infarct size among the control groups ranged from 16% to 35% of the left ventricle, and from 12% to 36% among treatment groups. There was a significant relationship between treatment effect on infarct size and treatment effect on 1-year heart failure hospitalization (HR 0.85, 95% CI 0.77-0.93, P=.0006), but not on one-year mortality (HR 0.97, 95% CI 0.89-1.06). The treatment effect between infarct size and heart failure hospitalization was stable in sensitivity analyses adjusting for time from STEMI onset to infarct size assessment, and when considering heart failure as the main outcome and death as a competing risk. CONCLUSIONS: We conclude that early treatment-induced effects on infarct size are related in direction and magnitude to treatment effects on heart failure hospitalizations. This finding enables consideration of using infarct size as a valid surrogate outcome measure in assessing new STEMI treatments.


Subject(s)
Percutaneous Coronary Intervention , Randomized Controlled Trials as Topic , ST Elevation Myocardial Infarction , Cause of Death/trends , Global Health , Humans , Magnetic Resonance Imaging, Cine , ST Elevation Myocardial Infarction/diagnosis , ST Elevation Myocardial Infarction/mortality , ST Elevation Myocardial Infarction/surgery , Survival Rate/trends , Tomography, Emission-Computed, Single-Photon , Treatment Outcome
15.
Liver Transpl ; 23(12): 1541-1552, 2017 12.
Article in English | MEDLINE | ID: mdl-28703464

ABSTRACT

Though serum iron has been known to be associated with an increased risk of infection, hepcidin, the major regulator of iron metabolism, has never been systematically explored in this setting. Finding early biomarkers of infection, such as hepcidin, could help identify patients in whom early empiric antimicrobial therapy would be beneficial. We prospectively enrolled consecutive patients (n = 128) undergoing first-time, single-organ orthotopic liver transplantation (OLT) without known iron overload disorders at 2 academic hospitals in Boston from August 2009 to November 2012. Cox regression compared the associations between different iron markers and the development of first infection at least 1 week after OLT; 47 (37%) patients developed a primary outcome of infection at least 1 week after OLT and 1 patient died. After adjusting for perioperative bleeding complications, number of hospital days, and hepatic artery thrombosis, changes in iron markers were associated with the development of infection post-OLT including increasing ferritin (hazard ratio [HR], 1.51; 95% confidence interval [CI], 1.12-2.05), rising ferritin slope (HR, 1.10; 95% CI, 1.03-1.17), and increasing hepcidin (HR, 1.43; 95% CI, 1.05-1.93). A decreasing iron (HR, 1.76; 95% CI, 1.20-2.57) and a decreasing iron slope (HR, 4.21; 95% CI, 2.51-7.06) were also associated with subsequent infections. In conclusion, hepcidin and other serum iron markers and their slope patterns or their combination are associated with infection in vulnerable patient populations. Liver Transplantation 23 1541-1552 2017 AASLD.


Subject(s)
Communicable Diseases/blood , End Stage Liver Disease/surgery , Iron/blood , Liver Transplantation/adverse effects , Postoperative Complications/blood , Biomarkers/blood , Boston/epidemiology , Communicable Diseases/epidemiology , Communicable Diseases/immunology , Communicable Diseases/microbiology , Female , Ferritins/blood , Hepcidins/blood , Host-Pathogen Interactions/immunology , Humans , Immunocompromised Host , Iron/metabolism , Male , Middle Aged , Postoperative Complications/epidemiology , Postoperative Complications/immunology , Postoperative Complications/microbiology , Prospective Studies , Reoperation/statistics & numerical data , Risk Assessment/methods , Treatment Outcome
16.
Clin Transplant ; 31(2)2017 02.
Article in English | MEDLINE | ID: mdl-28004856

ABSTRACT

Early allograft dysfunction (EAD) following liver transplantation (LT) remains a challenge for patients and clinicians. We retrospectively analyzed the effect of pre-defined EAD on outcomes in a 10-year cohort of deceased-donor LT recipients with clearly defined exclusion criteria. EAD was defined by at least one of the following: AST or ALT >2000 IU/L within first-week post-LT, total bilirubin ≥10 mg/dL, and/or INR ≥1.6 on post-operative day 7. Ten patients developed primary graft failure and were analyzed separately. EAD occurred in 86 (36%) recipients in a final cohort of 239 patients. In univariate and multivariate analyses, EAD was significantly associated with mechanical ventilation ≥2 days or death on days 0, 1, PACU/SICU stay >2 days or death on days 0-2 and renal failure (RF) at time of hospital discharge (all P<.05). EAD was also significantly associated with higher one-year graft loss in both uni- and multivariate Cox hazard analyses (P=.0203 and .0248, respectively). There was no difference in patient mortality between groups in either of the Cox proportional hazard models. In conclusion, we observed significant effects of EAD on short-term post-LT outcomes and lower graft survival.


Subject(s)
Graft Rejection/epidemiology , Liver Transplantation/adverse effects , Postoperative Complications , Primary Graft Dysfunction/epidemiology , Adult , Allografts , Boston/epidemiology , Female , Follow-Up Studies , Graft Rejection/etiology , Graft Survival , Humans , Incidence , Male , Middle Aged , Primary Graft Dysfunction/etiology , Prognosis , Retrospective Studies , Risk Factors
17.
Am Heart J ; 178: 168-75, 2016 Aug.
Article in English | MEDLINE | ID: mdl-27502865

ABSTRACT

AIMS: In the IMMEDIATE Trial, intravenous glucose-insulin-potassium (GIK) was started as early as possible for patients with suspected acute coronary syndrome by ambulance paramedics in communities. In the IMMEDIATE Biological Mechanism Cohort substudy, reported here, we investigated potential modes of GIK action on specific circulating metabolic components. Specific attention was given to suppression of circulating oxygen-wasting free fatty acids (FFAs) that had been posed as part of the early GIK action related to averting cardiac arrest. METHODS: We analyzed the changes in plasma levels of FFA, glucose, C-peptide, and the homeostasis model assessment (HOMA) index. RESULTS: With GIK, there was rapid suppression of FFA levels with estimated levels for GIK and placebo groups after 2 hours of treatment of 480 and 781 µmol/L (P<.0001), even while patterns of FFA saturation remained unchanged. There were no significant changes in the HOMA index in the GIK or placebo groups (HOMA index: placebo 10.93, GIK 12.99; P = .07), suggesting that GIK infusions were not countered by insulin resistance. Also, neither placebo nor GIK altered endogenous insulin secretion as reflected by unchanging C-peptide levels. CONCLUSION: These mechanistic observations support the potential role of FFA suppression in very early cardioprotection by GIK. They also suggest that the IMMEDIATE Trial GIK formula is balanced with respect to its insulin and glucose composition, as it induced no endogenous insulin secretion.


Subject(s)
Acute Coronary Syndrome/drug therapy , Emergency Medical Services/methods , Glucose/therapeutic use , Heart Arrest/prevention & control , Hypoglycemic Agents/therapeutic use , Insulin/therapeutic use , Potassium/therapeutic use , Acute Coronary Syndrome/blood , Aged , Angina Pectoris/blood , Angina Pectoris/drug therapy , Blood Glucose/metabolism , C-Peptide/blood , Early Medical Intervention , Electrocardiography , Fatty Acids, Nonesterified/blood , Female , Heart Arrest/blood , Humans , Infusions, Intravenous , Insulin Resistance , Male , Middle Aged , Non-ST Elevated Myocardial Infarction/blood , Non-ST Elevated Myocardial Infarction/drug therapy , ST Elevation Myocardial Infarction/blood , ST Elevation Myocardial Infarction/drug therapy
18.
Crit Care Med ; 44(3): 583-91, 2016 Mar.
Article in English | MEDLINE | ID: mdl-26540397

ABSTRACT

OBJECTIVE: To compare the efficacy and safety of scheduled low-dose haloperidol versus placebo for the prevention of delirium (Intensive Care Delirium Screening Checklist ≥ 4) administered to critically ill adults with subsyndromal delirium (Intensive Care Delirium Screening Checklist = 1-3). DESIGN: Randomized, double-blind, placebo-controlled trial. SETTING: Three 10-bed ICUs (two medical and one surgical) at an academic medical center in the United States. PATIENTS: Sixty-eight mechanically ventilated patients with subsyndromal delirium without complicating neurologic conditions, cardiac surgery, or requiring deep sedation. INTERVENTIONS: Patients were randomly assigned to receive IV haloperidol 1 mg or placebo every 6 hours until delirium occurred (Intensive Care Delirium Screening Checklist ≥ 4 with psychiatric confirmation), 10 days of therapy had elapsed, or ICU discharge. MEASUREMENTS AND MAIN RESULTS: Baseline characteristics were similar between the haloperidol (n = 34) and placebo (n = 34) groups. A similar number of patients given haloperidol (12/34 [35%]) and placebo (8/34 [23%]) developed delirium (p = 0.29). Haloperidol use reduced the hours per study day spent agitated (Sedation Agitation Scale ≥ 5) (p = 0.008), but it did not influence the proportion of 12-hour ICU shifts patients spent alive without coma (Sedation Agitation Scale ≤ 2) or delirium (p = 0.36), the time to first delirium occurrence (p = 0.22), nor delirium duration (p = 0.26). Days of mechanical ventilation (p = 0.80), ICU mortality (p = 0.55), and ICU patient disposition (p = 0.22) were similar in the two groups. The proportion of patients who developed corrected QT-interval prolongation (p = 0.16), extrapyramidal symptoms (p = 0.31), excessive sedation (p = 0.31), or new-onset hypotension (p = 1.0) that resulted in study drug discontinuation was comparable between the two groups. CONCLUSIONS: Low-dose scheduled haloperidol, initiated early in the ICU stay, does not prevent delirium and has little therapeutic advantage in mechanically ventilated, critically ill adults with subsyndromal delirium.


Subject(s)
Antipsychotic Agents/administration & dosage , Critical Illness/therapy , Delirium/prevention & control , Haloperidol/administration & dosage , Administration, Intravenous , Adult , Aged , Antipsychotic Agents/adverse effects , Coma , Double-Blind Method , Female , Haloperidol/adverse effects , Humans , Intensive Care Units , Length of Stay , Male , Middle Aged , Pilot Projects , Psychomotor Agitation/drug therapy , Respiration, Artificial , United States
19.
Am J Hematol ; 91(6): 560-5, 2016 06.
Article in English | MEDLINE | ID: mdl-26928381

ABSTRACT

Hodgkin lymphoma post-transplant lymphoproliferative disorder (HL-PTLD) is an uncommon PTLD with unclear prognosis and differences between HL-PTLD and immunocompetent HL are not well defined. Patient characteristics were compared among 192 patients with HL-PTLD from the Scientific Registry of Transplant Recipients and 13,847 HL patients in SEER (HL-SEER). Overall survival (OS) and disease-specific survival (DSS) were compared after exact matching. Additionally, multivariable analyses were used to identify prognostic markers of survival and associations between treatment and survival. Median time from transplant to HL-PTLD diagnosis was 88 months. When compared with HL-SEER, patients with HL-PTLD were older (median age, 52 vs. 36 years, P = 0.001), more likely male (73% vs. 54%, P < 0.001), Caucasian (81% vs. 70%, P = 0.02), and had extranodal disease (42% vs. 3%, P < 0.001). Five-year OS for patients with HL-PTLD was 57% versus 80% for HL-SEER (P < 0.001); DSS was also inferior (P < 0.001). For patients with HL-PTLD, the use of any chemotherapy was associated with decreased hazard of death (HR = 0.36, P < 0.001). Furthermore, patients who received no chemotherapy or nontraditional HL regimens had increased hazard of death (aHR = 2.94, P = 0.001 and 2.01, P = 0.04) versus HL-specific chemotherapy regimens. In multivariable analysis, advanced age and elevated creatinine were associated with inferior OS (aHR = 1.26/decade P < 0.001 and 1.64/0.1 mg/dL increase P = 0.02). A prognostic score based on the number of these adverse factors (0, 1, 2) was associated with 10-year OS rates of 79%, 53%, and 11%, respectively (P < 0.001). Altogether, HL-PTLD patients have inferior survival when compared with HL-SEER. Furthermore, treatment with HL-specific chemotherapy was associated with improved OS, whereas age and creatinine identified patients with markedly divergent survival. Am. J. Hematol. 91:560-565, 2016. © 2016 Wiley Periodicals, Inc.
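A sketch of how such a simple prognostic score can be constructed and evaluated: count the adverse factors per patient and estimate overall survival within each score group. The cutoffs and column names below are assumptions for illustration, not the registry's definitions:

```python
# Additive prognostic score and per-group Kaplan-Meier survival (sketch).
import pandas as pd
from lifelines import KaplanMeierFitter

ptld = pd.read_csv("hl_ptld.csv")  # hypothetical registry extract

ptld["score"] = (
    (ptld["age_years"] >= 60).astype(int)           # assumed age cutoff
    + (ptld["creatinine_mg_dl"] > 1.5).astype(int)  # assumed creatinine cutoff
)

km = KaplanMeierFitter()
for score, grp in ptld.groupby("score"):
    km.fit(grp["years_followup"], grp["died"], label=f"score={score}")
    print(score, float(km.predict(10)))  # estimated 10-year overall survival
```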


Subject(s)
Hodgkin Disease/mortality , Hodgkin Disease/therapy , Lymphoproliferative Disorders/mortality , Lymphoproliferative Disorders/therapy , Adult , Age Factors , Antineoplastic Agents/therapeutic use , Creatine/blood , Female , Humans , Male , Middle Aged , Prognosis , Registries , Survival Analysis , Survival Rate , Treatment Outcome , Young Adult
20.
Cardiovasc Ultrasound ; 14(1): 29, 2016 Aug 03.
Article in English | MEDLINE | ID: mdl-27488569

ABSTRACT

BACKGROUND: In patients with acute coronary syndrome (ACS), reduced left ventricular ejection fraction (LVEF) is a known marker for increased mortality. However, the relationship between LVEF measured during index ACS hospitalization and mortality and heart failure (HF) within 1 year is less well defined. METHODS: We performed a retrospective analysis of 445 participants in the IMMEDIATE Trial who had LVEF measured by left ventriculography or echocardiogram during hospitalization. RESULTS: Adjusting for age and coronary artery disease (CAD) history, lower LVEF was significantly associated with 1-year mortality or hospitalization for HF. For every 5 % LVEF reduction, the hazard ratio [HR] was 1.26 (95 % CI 1.15, 1.38, P < 0.001). Participants with LVEF < 40 % had a higher hazard of 1-year mortality or HF hospitalization than those with LVEF > 40 % (HR 3.59; 95 % CI 2.05, 6.27, P < 0.001). The HRs for the association of LVEF with the study outcomes were similar whether measured by left ventriculography or by echocardiography (respectively, HR 1.32; 95 % CI 1.15, 1.51 and 1.21; 95 % CI 1.106, 1.35, interaction P = 0.32) and whether done within 24 h or not within 24 h (respectively, HR 1.28; 95 % CI 1.10, 1.50 and 1.23; 95 % CI 1.10, 1.38, interaction P = 0.67). CONCLUSIONS: Among patients with ACS, lower in-hospital LVEF is associated with increased 1-year mortality or hospitalization for HF, regardless of the method or timing of the LVEF assessment. This has prognostic implications for clinical practice and suggests the possibility of using various methods of LVEF determination in clinical research.


Subject(s)
Acute Coronary Syndrome/physiopathology , Echocardiography/methods , Heart Failure/diagnosis , Inpatients , Stroke Volume/physiology , Ventricular Function, Left/physiology , Acute Coronary Syndrome/complications , Acute Coronary Syndrome/diagnosis , Aged , Double-Blind Method , Female , Follow-Up Studies , Heart Failure/etiology , Heart Failure/physiopathology , Humans , Male , Middle Aged , Prognosis , Retrospective Studies , Time Factors