Results 1 - 18 of 18
1.
Int J Womens Dermatol ; 8(3): e032, 2022 Oct.
Article in English | MEDLINE | ID: mdl-35923587

ABSTRACT

Skin cancer education targeted to patients' needs is a goal of practicing dermatologists. Data on dermatology patients' baseline knowledge of skin cancer could aid clinicians in tailoring education efforts. Objective: To quantify patients' existing visual recognition of skin cancer and common benign lesions, with the goal of providing more targeted and meaningful patient education. Methods: Two hundred forty-four adult patients from the dermatology clinics at the University of Oklahoma and Loyola University Chicago were surveyed using digital images and questions regarding personal and family history of skin cancer, sun protection practices, and sun protection knowledge. Results: Of the 244 subjects, 43% had a positive personal history of skin cancer and 40% had a positive family history. Scores differed minimally by personal history of skin cancer (p = .37) but differed more markedly by family history of skin cancer (p = .02). Limitations: Lack of generalizability to the general public; age range of subjects. Conclusions: There are knowledge gaps within the dermatology patient population regarding common benign and malignant skin lesions.

2.
Surgery ; 169(3): 671-677, 2021 03.
Article in English | MEDLINE | ID: mdl-32951903

ABSTRACT

BACKGROUND: We applied various machine learning algorithms to a large national dataset to model the risk of postoperative sepsis after appendectomy, to evaluate the utility of such methods, and to identify factors associated with postoperative sepsis in these patients. METHODS: The National Surgery Quality Improvement Program database was used to identify patients undergoing appendectomy between 2005 and 2017. Logistic regression, support vector machines, random forest decision trees, and extreme gradient boosting machines were used to model the occurrence of postoperative sepsis. RESULTS: In the study, 223,214 appendectomies were identified; 2,143 (0.96%) had postoperative sepsis. Logistic regression (area under the curve 0.70; 95% confidence interval, 0.68-0.73), random forest decision trees (area under the curve 0.70; 95% confidence interval, 0.68-0.73), and extreme gradient boosting (area under the curve 0.70; 95% confidence interval, 0.68-0.73) afforded similar performance, while support vector machines (area under the curve 0.51; 95% confidence interval, 0.50-0.52) had worse performance. Variable importance analyses identified preoperative congestive heart failure, transfusion, and acute renal failure as predictors of postoperative sepsis. CONCLUSION: Machine learning methods can be used to predict the development of sepsis after appendectomy with moderate accuracy. Such predictive modeling has the potential to ultimately allow for preoperative recognition of patients at risk for developing postoperative sepsis after appendectomy, thus facilitating early intervention and reducing morbidity.
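The modeling approach this record describes (fit a classifier to a binary sepsis outcome, then score it by area under the ROC curve) can be illustrated with a minimal, self-contained sketch. Everything here is invented for illustration: the synthetic cohort, feature names, and coefficients are not from the study, and the study itself used the NSQIP database with several model families, not this toy logistic regression.

```python
# Hypothetical sketch of the study's evaluation pattern: fit a logistic model
# on synthetic "appendectomy" records, then compute AUC. All data invented.
import math, random

random.seed(0)

def make_patient():
    # Binary flags loosely named after the reported predictors (CHF,
    # preoperative transfusion, acute renal failure); outcome = sepsis.
    chf = random.random() < 0.05
    transfusion = random.random() < 0.03
    arf = random.random() < 0.02
    logit = -4.5 + 1.2 * chf + 1.0 * transfusion + 1.5 * arf
    sepsis = random.random() < 1 / (1 + math.exp(-logit))
    return [1.0, float(chf), float(transfusion), float(arf)], float(sepsis)

data = [make_patient() for _ in range(2000)]

# Plain gradient-descent logistic regression (no external libraries).
w = [0.0] * 4
for _ in range(200):
    grad = [0.0] * 4
    for x, y in data:
        p = 1 / (1 + math.exp(-sum(wi * xi for wi, xi in zip(w, x))))
        for j in range(4):
            grad[j] += (p - y) * x[j]
    w = [wi - 0.1 * g / len(data) for wi, g in zip(w, grad)]

# AUC = probability a random positive case scores above a random negative
# (ties count one half), i.e. the Mann-Whitney formulation of the ROC area.
scores = [(sum(wi * xi for wi, xi in zip(w, x)), y) for x, y in data]
pos = [s for s, y in scores if y == 1.0]
neg = [s for s, y in scores if y == 0.0]
auc = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg) / (len(pos) * len(neg))
```

In practice a library implementation (e.g. scikit-learn's `roc_auc_score` with cross-validation) would replace the hand-rolled loop; the sketch only shows why the abstract can compare four very different model families on a single scalar.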


Subject(s)
Appendectomy/adverse effects , Machine Learning , Postoperative Complications/diagnosis , Postoperative Complications/etiology , Sepsis/diagnosis , Sepsis/etiology , Adult , Appendectomy/methods , Area Under Curve , Disease Susceptibility , Factor Analysis, Statistical , Female , Humans , Male , Middle Aged , Models, Theoretical , Prognosis , Public Health Surveillance , ROC Curve
3.
Eur J Gastroenterol Hepatol ; 32(2): 216-221, 2020 02.
Article in English | MEDLINE | ID: mdl-31584463

ABSTRACT

OBJECTIVES: To use Hounsfield units (HU) to retrospectively compare the various computed tomography (CT) criteria for diagnosing hepatic steatosis with laboratory liver function parameters and clinical risk factors when hepatic steatosis was incidentally detected. METHODS: Institutional review board-approved, Health Insurance Portability and Accountability Act-compliant retrospective study of 200 randomly selected patients who had either nonenhanced CT (NECT) or contrast-enhanced CT (CECT) studies with reported hepatic steatosis. The participants were matched for age, gender, and ethnicity with 200 patients without hepatic steatosis. For NECT, four different criteria have been proposed in the literature to diagnose fatty liver: (1) liver attenuation less than 48 HU; (2) ratio of liver to spleen HU less than 0.8; (3) HU difference between liver and spleen less than -10; and (4) hepatic vessel HU ≥ liver HU. For CECT, the criterion was a difference between liver and spleen HU, in the portal venous phase, of ≤ -20 to -25 HU. Serum glucose, aspartate aminotransferase (AST), alanine aminotransferase (ALT), and total bilirubin were documented. Clinical history and clinical risk factors were documented from the electronic health records. Matched analyses and Wilcoxon signed rank sum test analyses were performed for matched variables. RESULTS: Fatty liver by NECT criteria 1 and 3 had a statistically significant correlation with elevated glucose levels (P = 0.02). Similarly, fatty liver by NECT criteria 1, 3, and 4 showed statistically significant associations with higher levels of ALT and AST. There was a statistically significantly higher prevalence of diabetes mellitus (P = 0.003) and alcohol consumption (P ≤ 0.0001) in cases compared with controls. There was marginal significance in CT dose index between cases and controls (95% confidence interval: 0.98, 1.00; odds ratio 0.99), reflecting that cases had slightly higher BMI than their matched controls, thereby requiring slightly higher mA/mAs for imaging. CONCLUSION: Particular NECT criteria for fatty liver are best at identifying abnormal liver function and certain comorbidities in the setting of incidental fatty liver detection. This creates the potential for benefits of early detection in clinical management.
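The four noncontrast-CT thresholds listed in this record reduce to simple attenuation comparisons, which can be written out directly. This is an illustrative sketch only, not code from the study; the function name and the example HU values are invented.

```python
# Illustrative sketch: the four NECT fatty-liver criteria from the abstract,
# applied to measured Hounsfield-unit (HU) values. Names/values are invented.
def nect_steatosis_flags(liver_hu, spleen_hu, vessel_hu):
    """Return which of the four NECT criteria flag hepatic steatosis."""
    return {
        "criterion_1_liver_lt_48": liver_hu < 48,            # liver < 48 HU
        "criterion_2_ratio_lt_0p8": liver_hu / spleen_hu < 0.8,
        "criterion_3_diff_lt_neg10": liver_hu - spleen_hu < -10,
        "criterion_4_vessel_ge_liver": vessel_hu >= liver_hu,
    }

# A hypothetical fatty liver (35 HU) with normal spleen (50 HU) trips all four;
# a normal liver (60 HU) trips none.
fatty = nect_steatosis_flags(liver_hu=35, spleen_hu=50, vessel_hu=40)
normal = nect_steatosis_flags(liver_hu=60, spleen_hu=50, vessel_hu=45)
```

Because the abstract finds that criteria 1, 3, and 4 track laboratory abnormalities best, a real pipeline would likely report each flag separately rather than collapsing them into a single yes/no call.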


Subject(s)
Fatty Liver , Fatty Liver/diagnostic imaging , Fatty Liver/epidemiology , Humans , Retrospective Studies , Risk Factors , Tomography, X-Ray Computed
4.
J Neurol Surg B Skull Base ; 80(5): 458-468, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31534886

ABSTRACT

Objective To establish predictors of facial paralysis and auditory morbidity secondary to facial schwannomas by assimilating individualized patient data from the literature. Design A systematic review of the literature was conducted for studies regarding facial schwannomas. Studies were only included if they presented patient-level data, House-Brackmann grades, and tumor location by facial nerve segment. Odds ratios (OR) were estimated using generalized linear mixed models. Main Outcome Measures Facial weakness and hearing loss. Results Data from 504 patients were collected from 32 studies. The geniculate ganglion was the most common facial nerve segment involved (39.3%). A greater number of facial nerve segments involved was positively associated with both facial weakness and hearing loss, whereas tumor diameter did not correlate with either morbidity. Intratemporal involvement was associated with higher odds of facial weakness (OR = 4.78, p < 0.001), intradural involvement was negatively associated with facial weakness (OR = 0.56, p = 0.004), and extratemporal involvement was not a predictor of facial weakness (OR = 0.68, p = 0.27). The odds of hearing loss increased with more proximal location of the tumor (intradural: OR = 3.26, p < 0.001; intratemporal: OR = 0.60, p = 0.14; extratemporal: OR = 0.27, p = 0.01). Conclusion The most important factors associated with facial weakness and hearing loss are tumor location and the number of facial nerve segments involved. An understanding of the factors that contribute most heavily to the natural morbidity can help guide the appropriate timing and type of intervention in future cases of facial schwannoma.

5.
Am J Cardiol ; 124(1): 39-43, 2019 07 01.
Article in English | MEDLINE | ID: mdl-31056110

ABSTRACT

The incremental benefit of emergency medical services (EMS) activation of the cardiac catheterization laboratory (CCL) for ST-elevation myocardial infarction (STEMI) in the setting of an established in-house interventional team (IHIT) is uncertain. We evaluated the impact of EMS activation on door-to-balloon (D2B) time and first medical contact-to-balloon (FMC2B) time for STEMI when coupled with a 24-hour/day IHIT. All patients presenting with STEMI to Loyola University Medical Center had demographic, procedural, and outcome data consecutively entered in a STEMI Data Registry. From 223 consecutive patients presenting between April 2009 and December 2015, a retrospective analysis was performed on 190 patients. Patients were divided into 2 groups depending on CCL activation mode (EMS activation or emergency department activation) and STEMI treatment process times were compared. The primary end point was D2B process times. The secondary end point was FMC2B process times in a subgroup analysis of EMS-transported patients. D2B times were shorter (37 ± 14 minutes vs 57 ± 27 minutes, p < 0.001) with EMS activation. Subgroup analysis of EMS-transported patients demonstrated shorter FMC2B times with EMS activation (52 ± 17 minutes vs 67 ± 32 minutes, p = 0.002). EMS activation was the only predictor of D2B ≤60 minutes in multivariable analysis of EMS-transported patients (odds ratio 9.4; 95% confidence interval 2.1 to 43.0; p = 0.04). In conclusion, EMS activation of the CCL in STEMI was associated with significant improvements in already excellent D2B and FMC2B times even in the setting of a 24-hour/day IHIT.


Subject(s)
Angioplasty, Balloon, Coronary , Cardiac Catheterization , Emergency Medical Services , ST Elevation Myocardial Infarction/therapy , Time-to-Treatment , Aged , Female , Humans , Male , Middle Aged , Retrospective Studies , ST Elevation Myocardial Infarction/diagnosis , Time Factors , Treatment Outcome
6.
Cornea ; 38(2): 177-182, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30615600

ABSTRACT

PURPOSE: Descemet stripping endothelial keratoplasty (DSEK), currently the most common procedure for managing corneal endothelial dysfunction, may be repeated following DSEK failure from a variety of causes. This multicenter study reports the risk factors and outcomes of repeat DSEK. METHODS: This was an institutional review board-approved multicenter retrospective chart review of patients who underwent repeat DSEK. Twelve surgeons from 5 Midwest academic centers and 3 private practice groups participated. The Eversight Eye Bank provided clinical indication and donor graft data. We also assessed the role of the learning curve by comparing cohorts from the first and second 5-year periods. RESULTS: A total of 121 eyes from 121 patients who underwent repeat DSEK were identified. The average age of the patients was 70 ± 12 years. The most common indication for repeat DSEK was late endothelial graft failure without rejection (58%, N = 63). Average preoperative and 12-month postoperative repeat DSEK corrected distance visual acuities were 20/694 and 20/89, respectively. Visual acuity outcomes, endothelial cell density, and cell loss did not significantly vary between the 2 cohorts. Initial graft rebubble rates for the first and second cohorts were 51% and 25%. The presence of glaucoma, prior glaucoma surgery, or a history of penetrating (full thickness) keratoplasty did not significantly affect visual outcomes. The median, mean, and range of intraocular pressures before repeat DSEK were 15.0, 15.7, and 6 to 37 mm Hg, respectively. Patients with higher intraocular pressures before repeat DSEK had improved postoperative corrected distance visual acuities. CONCLUSIONS: Repeating DSEK improves vision following failed or decompensated DSEK surgery. Higher preoperative repeat DSEK IOPs were associated with improved visual outcomes, and initial graft rebubble rates, which decreased over time, were likely due to surgeon experience.


Subject(s)
Corneal Diseases/surgery , Descemet Stripping Endothelial Keratoplasty/methods , Aged , Aged, 80 and over , Corneal Diseases/physiopathology , Corneal Endothelial Cell Loss/etiology , Female , Graft Rejection/pathology , Humans , Intraocular Pressure/physiology , Male , Middle Aged , Reoperation , Retrospective Studies , Risk Factors , Visual Acuity/physiology
7.
World Neurosurg ; 112: e465-e472, 2018 Apr.
Article in English | MEDLINE | ID: mdl-29355794

ABSTRACT

BACKGROUND: Common peroneal nerve (CPN) compressive neuropathy is the most common lower-extremity entrapment neuropathy. MATERIALS AND METHODS: A retrospective review of a prospectively maintained single-institution database of all patients with CPN palsy who underwent decompression and neuroplasty over a 5-year period was performed. RESULTS: Thirty patients underwent a neuroplasty of the CPN over a 5-year period (2010-2015) at our institution. The median age was 45 years, and there was a male preponderance. The average time between first onset of symptoms to surgery was 122.9 weeks and between first clinic visit and surgery was 21 weeks. The etiology of the CPN neuropathy was as follows: in 12 patients, it followed a surgical procedure and in 14 patients, it occurred after a trauma to the lower extremity. In 2 patients, it occurred as a result of a mass lesion compromising the nerve and in 1 patient, a local infection predisposed to CPN palsy. Right and left lower extremities were equally involved. The median body mass index was 28.6. The most common presentation was weakness of the tibialis anterior (TA) and extensor hallucis longus (EHL) and loss of sensation in the distribution of the CPN or one of its major branches. Pain was a presenting symptom in 16 patients. Only 12 of the 30 patients had a positive Tinel's sign at the site of compression over the lateral fibular neck. Preoperative electrophysiologic confirmation of CPN neuropathy was available in all patients. Mean follow-up was 52 weeks. Prone positioning and selective use of the operating microscope provided excellent visualization and surgical exposure of the CPN from the lower popliteal region to the peroneal tunnel. Average operating room time was 170 minutes and average skin-to-skin time 91 minutes. Clinical improvement after surgery in terms of motor function was noted in 24 of the 26 patients who presented with a motor deficit. 
The most consistent improvement was noted in the TA and EHL; a trend toward greater improvement with shorter time to surgery was noted. No complications related to the surgical site or CPN were encountered, and no patient had a decline in their neurologic examination as a consequence of the surgery. One patient developed a positioning-related right upper-extremity brachial plexus neuropraxic injury after surgery that recovered completely. CONCLUSIONS: Common peroneal neuropathy usually presents with weakness of the TA and EHL and decreased sensation or pain in the distribution of the CPN. Microscope-assisted surgical neuroplasty of the CPN at the lateral fibular neck with the patient in a prone position allows decompression of the nerve from the lower popliteal region to the peroneal tunnel. Significant improvement in motor strength after surgery, particularly of the TA and EHL, was observed in this series.


Subject(s)
Nerve Compression Syndromes/surgery , Neurosurgical Procedures , Peroneal Nerve/surgery , Peroneal Neuropathies/surgery , Plastic Surgery Procedures , Adolescent , Adult , Aged , Aged, 80 and over , Child , Female , Humans , Male , Middle Aged , Retrospective Studies , Treatment Outcome , Young Adult
8.
Otolaryngol Head Neck Surg ; 158(1): 62-75, 2018 01.
Article in English | MEDLINE | ID: mdl-28895459

ABSTRACT

Objectives (1) Determine the prevalence of hearing loss following microvascular decompression (MVD) for trigeminal neuralgia (TN) and hemifacial spasm (HFS). (2) Demonstrate factors that affect postoperative hearing outcomes after MVD. Data Sources PubMed-NCBI, Scopus, CINAHL, and PsycINFO databases from 1981 to 2016. Review Methods Systematic review of prospective cohort studies and retrospective reviews in which any type of hearing loss was recorded after MVD for TN or HFS. Three researchers extracted data regarding operative indications, procedures performed, and diagnostic tests employed. Discrepancies were resolved by mutual consensus. Results Sixty-nine references with 18,233 operations met inclusion criteria. There were 7093 patients treated for TN and 11,140 for HFS. The overall reported prevalence of hearing loss after MVD for TN and HFS was 5.58% and 8.25%, respectively. However, many of these studies relied on subjective measures of reporting hearing loss. In 23 studies with consistent perioperative audiograms, prevalence of hearing loss was 13.47% for TN and 13.39% for HFS, with no significant difference between indications (P = .95). Studies using intraoperative brainstem auditory evoked potential monitoring were more likely to report hearing loss for TN (relative risk [RR], 2.28; P < .001) but not with HFS (RR, 0.88; P = .056). Conclusion Conductive and sensorineural hearing loss are important complications following posterior fossa MVD. Many studies have reported on hearing loss using either subjective measures and/or inconsistent audiometric testing. Routine perioperative audiogram protocols improve the detection of hearing loss and may more accurately represent the true risk of hearing loss after MVD for TN and HFS.


Subject(s)
Decompression, Surgical/adverse effects , Hearing Loss/etiology , Hemifacial Spasm/surgery , Trigeminal Neuralgia/surgery , Humans , Microcirculation , Risk Factors
9.
Am J Clin Oncol ; 41(6): 576-580, 2018 Jun.
Article in English | MEDLINE | ID: mdl-27560156

ABSTRACT

OBJECTIVES: Angiotensin-converting enzyme inhibitors (ACEi) have demonstrated decreased rates of radiation-induced lung injury in animal models and clinical reports have demonstrated decreased pneumonitis in the setting of conventionally fractionated radiation to the lung. We tested the role of ACEi in diminishing rates of symptomatic (grade ≥2) pneumonitis in the setting of lung stereotactic body radiation therapy (SBRT). METHODS: We analyzed patients treated with thoracic SBRT to 48 to 60 Gy in 4 to 5 fractions from 2006 to 2014. We reviewed pretreatment and posttreatment medication profiles to document use of ACEi, angiotensin receptor blockers, bronchodilators, aspirin, PDE-5 inhibitors, nitrates, and endothelin receptor antagonists. Pneumonitis was graded posttreatment based on Common Terminology Criteria for Adverse Events Version 4.0. Univariate and multivariate analysis was performed and time to development of pneumonitis was evaluated by the Kaplan-Meier method. RESULTS: A total of 189 patients were evaluated with a median follow-up of 24.8 months. The overall 1-year rate of symptomatic pneumonitis was 13.2%. The 1-year rate of symptomatic pneumonitis was 4.2% for ACEi users versus 16.3% in nonusers (P=0.03). On univariate analysis, the odds of developing grade 2 or greater pneumonitis were significantly lower for patients on ACEi (P=0.03). On multivariate analysis, after controlling for clinicopathologic characteristics and dosimetric endpoints, there was a significant association between ACEi use and decreased risk of clinical pneumonitis (P=0.04). Angiotensin receptor blockers or other bronchoactive medications did not show significant associations with development of pneumonitis. CONCLUSIONS: Incidental concurrent use of ACEi demonstrated efficacy in diminishing rates of symptomatic pneumonitis in the setting of lung SBRT.


Subject(s)
Angiotensin-Converting Enzyme Inhibitors/therapeutic use , Chemoradiotherapy , Lung Neoplasms/surgery , Radiation Injuries/drug therapy , Radiation Pneumonitis/drug therapy , Radiosurgery/adverse effects , Adult , Aged , Aged, 80 and over , Cohort Studies , Female , Follow-Up Studies , Humans , Lung Neoplasms/pathology , Male , Middle Aged , Prognosis , Radiation Injuries/diagnosis , Radiation Injuries/etiology , Radiation Pneumonitis/diagnosis , Radiation Pneumonitis/etiology , Risk Factors , Survival Rate
10.
Endocr Pract ; 24(2): 163-169, 2018 Feb.
Article in English | MEDLINE | ID: mdl-29144808

ABSTRACT

OBJECTIVE: Bisphosphonate (BP) drug holidays are recommended to lower the risk of rare adverse events, such as atypical femoral fractures and osteonecrosis of the jaw. However, there are minimal data on the optimal duration of these holidays. Our aim was to determine the clinical and laboratory parameters associated with increased fracture risk in patients on BP drug holiday. METHODS: A retrospective chart review was conducted of 401 patients with osteopenia or osteoporosis who began a BP drug holiday from 2004 to 2013. Collected parameters included demographics, prior therapy, bone mineral density (BMD), bone turnover markers, parathyroid hormone, calcium and vitamin D status, and clinical reports of fractures. RESULTS: Sixty-two (15.4%) patients developed a fracture during follow-up. The yearly incidence of fractures ranged from 3.7 to 9.9%, peaking at 9.9% and 9.8% during years 4 and 5, respectively. The mean age of the fracture group was higher than that of the nonfracture group, though not significantly different (69.24 ± 12.26 years vs. 66.42 ± 10.18 years; P = .09). Compared to the nonfracture group, the fracture group had lower femoral neck BMD (0.75 ± 0.12 g/cm2 vs. 0.79 ± 0.10 g/cm2; P = .03) and T-scores (-2.13 ± 0.99 vs. -1.78 ± 0.79; P = .01) at baseline. CONCLUSION: Patients who begin BP drug holidays at high risk of fracture based on BMD, age, or other clinical risk factors warrant close follow-up, especially as the holiday's duration lengthens. Fracture risk needs to be regularly assessed during the drug holiday and treatment resumed accordingly. ABBREVIATIONS: 25-OHD = 25-hydroxyvitamin D AACE = American Association of Clinical Endocrinologists ACE = American College of Endocrinology BMD = bone mineral density BP = bisphosphonate BSAP = bone-specific alkaline phosphatase BTM = bone turnover marker FN = femoral neck LS = lumbar spine PTH = parathyroid hormone.


Subject(s)
Bone Density Conservation Agents/therapeutic use , Diphosphonates/therapeutic use , Osteoporosis/drug therapy , Osteoporosis/epidemiology , Osteoporotic Fractures/epidemiology , Withholding Treatment , Aged , Aged, 80 and over , Bone Density/drug effects , Bone Density Conservation Agents/adverse effects , Diphosphonates/adverse effects , Female , Humans , Male , Middle Aged , Osteoporotic Fractures/chemically induced , Retrospective Studies , Time Factors
11.
Eur J Gastroenterol Hepatol ; 29(12): 1389-1396, 2017 Dec.
Article in English | MEDLINE | ID: mdl-28957871

ABSTRACT

PURPOSE: The aim of this study was to compare the correlations between computed tomography (CT) criteria for hepatic steatosis and lipid profile values when hepatic steatosis is incidentally detected. PARTICIPANTS AND METHODS: This is an institutional review board-approved, HIPAA-compliant, retrospective study of abdominal CT scans in 200 randomly selected patients who had either nonenhanced CT (NECT) or contrast-enhanced CT (CECT) studies with reported fatty liver. The participants were matched for age, sex, and ethnicity with 200 patients with nonfatty liver. For NECT, four different criteria have been proposed in the literature to diagnose fatty liver: (i) liver Hounsfield units (HU) less than 48 HU, (ii) ratio of liver to spleen HU less than 0.8, (iii) HU difference between liver and spleen less than -10, and (iv) hepatic vessel HU greater than or equal to liver HU. For CECT, the criterion was the attenuation difference between liver and spleen HU, in the portal venous phase, of up to -20 to -25 HU. Laboratory results (low-density lipoprotein, high-density lipoprotein, triglycerides) were documented. Matched analyses and conditional logistic regression analysis were carried out for matched variables. RESULTS: There were statistically significant differences in triglyceride values between the cases and controls (P=0.02) when all criteria were considered. Also, statistically significant differences were found between cases and controls on the basis of NECT criterion 2 and high-density lipoprotein (P=0.04), as well as CECT criteria and triglyceride levels (P=0.02). In addition, the data indicate that criteria for steatosis on CECT may be broader than traditionally utilized. CONCLUSION: Incidental reporting of fatty liver on NECT/CECT should prompt consideration of clinical follow-up and lipid profile testing in an otherwise asymptomatic patient. Additional metrics for the diagnosis of steatosis in CECT exams should also be considered.


Subject(s)
Fatty Liver/blood , Fatty Liver/diagnostic imaging , Tomography, X-Ray Computed , Adolescent , Adult , Case-Control Studies , Contrast Media , Humans , Incidental Findings , Lipoproteins, HDL/blood , Lipoproteins, LDL/blood , Middle Aged , Retrospective Studies , Spleen/diagnostic imaging , Triglycerides/blood , Young Adult
12.
JACC Basic Transl Sci ; 2(2): 122-131, 2017 Apr.
Article in English | MEDLINE | ID: mdl-28596995

ABSTRACT

The degradation and release of cardiac myosin binding protein-C (cMyBP-C) upon cardiac damage may stimulate an inflammatory response and autoantibody (AAb) production. We determined whether the presence of cMyBP-C-AAbs was associated with adverse cardiac function in patients with cardiovascular disease. Importantly, cMyBP-C-AAbs were significantly detected in acute coronary syndrome (ACS) patient sera upon arrival to the emergency department, particularly in STEMI patients. Patients positive for cMyBP-C-AAbs had a reduced LVEF and elevated levels of clinical biomarkers of MI. We conclude that cMyBP-C-AAbs may serve as early predictive indicators of deteriorating cardiac function and patient outcome in ACS patients prior to the infarction.

13.
J Burn Care Res ; 38(6): 379-389, 2017.
Article in English | MEDLINE | ID: mdl-28338517

ABSTRACT

The authors sought to increase the number of days when burn service patients receive 100% of prescribed enteral nutrition. The authors first performed a retrospective review of 37 patients (group 1) receiving enteral nutrition. The authors then created and implemented a nurse-directed feeding algorithm, placing patients into three age groups and addressing maximum hourly infusion rates, high residual limits, initiating feeding, refeeding residuals, and replacing formula. The authors then performed a prospective review of 37 patients (group 2) fed utilizing the new algorithm. The amounts of prescribed, infused, discarded, and missed feeds were recorded, as well as admitting diagnosis, age, gender, length of stay, ventilator days, infections, and mortality. All patients in group 1 (n = 37) received 100% of feeds on 59.9% of prescribed days vs 76.5% in group 2 (n = 37; P = .003). Burn patients in group 1 (n = 26) received 100% of feeds on 61.6% of prescribed days vs 85.4% in group 2 (n = 21; P < .001). The mean number of hours that tube feeds were held for surgery, procedures, or clogged or dislodged tubes was the same in the historical control group and the group using the new algorithm. While there was a significant difference in burn size between groups (6.24% vs 18.39%, P = .01), there were no statistically significant differences in length of stay, ventilator days, or mortality. Implementation of a nurse-directed feeding algorithm improved delivery of enteral nutrition for all burn service patients, increasing the number of days when 100% of prescribed enteral nutrition is given.


Subject(s)
Burns/therapy , Enteral Nutrition , Adolescent , Adult , Age Factors , Algorithms , Child , Child, Preschool , Energy Intake , Humans , Infant , Infant, Newborn , Practice Patterns, Nurses' , Retrospective Studies , Time Factors , Young Adult
14.
Pharmacotherapy ; 37(4): 420-428, 2017 Apr.
Article in English | MEDLINE | ID: mdl-28226419

ABSTRACT

STUDY OBJECTIVES: The primary objective was to determine the impact of hematologic malignancies and/or conditioning regimens on the risk of developing Clostridium difficile infection (CDI) in patients undergoing hematopoietic stem cell transplantation (HSCT). Secondary objectives were to determine if traditional CDI risk factors applied to patients undergoing HSCT and to determine the presence of CDI markers of severity of illness among this patient population. DESIGN: Single-center retrospective case-control study. SETTING: Quaternary care academic medical center. PATIENTS: A total of 105 patients who underwent HSCT between December 2009 and December 2014; of these patients, 35 developed an initial episode of CDI (HSCT/CDI group [cases]), and 70 did not (controls). Controls were matched in a 2:1 ratio to cases based on age (± 10 yrs) and date of HSCT (± 6 mo). MEASUREMENTS AND MAIN RESULTS: Baseline characteristics of the two groups were well balanced regarding age, sex, race, ethnicity, and type of HSCT. No significant differences in conditioning regimen, hematologic malignancy, total body irradiation received for HSCT, use of antibiotics within 60 days of HSCT, or use of prophylactic antibiotics after HSCT were noted between the two groups. Patients in the control group were 10.57 times (95% confidence interval 1.24-492.75) more likely to have received corticosteroids prior to HSCT than patients in the HSCT/CDI group (p=0.01). Use of proton pump inhibitors at the time of HSCT was greater among the control group than among patients in the HSCT/CDI group (97% vs 86%, p=0.048). No significant difference in mortality was noted between the groups at 3, 6, and 12 months after HSCT. Metronidazole was frequently prescribed for patients in the HSCT/CDI group (34 patients [97%]). Severe CDI was not common among patients within the HSCT/CDI group (13 patients [37%]); vancomycin was infrequently prescribed for these patients (4/13 patients [31%]).
CONCLUSION: Hematologic malignancies and a conditioning regimen administered for HSCT were not significant risk factors for the development of CDI after HSCT. Use of corticosteroids prior to HSCT and use of proton pump inhibitors at the time of HSCT were associated with a significantly decreased risk of CDI.


Subject(s)
Clostridium Infections/epidemiology , Hematologic Neoplasms/pathology , Hematopoietic Stem Cell Transplantation/methods , Transplantation Conditioning/methods , Academic Medical Centers , Adrenal Cortex Hormones/administration & dosage , Adult , Aged , Anti-Bacterial Agents/therapeutic use , Case-Control Studies , Clostridioides difficile/isolation & purification , Clostridium Infections/drug therapy , Clostridium Infections/etiology , Female , Hematologic Neoplasms/therapy , Humans , Male , Middle Aged , Proton Pump Inhibitors/administration & dosage , Retrospective Studies , Risk Factors , Time Factors , Transplantation Conditioning/adverse effects , Vancomycin/therapeutic use
15.
J Clin Aesthet Dermatol ; 10(12): 44-48, 2017 Dec.
Article in English | MEDLINE | ID: mdl-29399266

ABSTRACT

BACKGROUND: Melanoma surveillance serves to identify new primary melanomas and curable locoregional or early distant recurrences. Although an optimal melanoma surveillance strategy has not been determined, several clinical guidelines exist. OBJECTIVE: The aim of this study was to identify demographic and clinicopathologic variables associated with poor adherence to National Comprehensive Cancer Network (NCCN) melanoma surveillance guidelines. DESIGN: We retrospectively reviewed the initial five-year dermatology follow-up visit frequencies of melanoma patients and extracted basic demographic and clinical data from their medical records. PARTICIPANTS: Of 186 patients included, the mean age was 55 (standard deviation=15); 47.5 percent (n=85) were female, 93.0 percent (n=173) were white, and 76.2 percent (n=141) were married. Sixty percent of patients lived more than 10 miles from the clinic, and 58.6 percent had private insurance. MEASUREMENTS: "Aggressive" and "conservative" surveillance schedules were adapted from National Comprehensive Cancer Network visit frequency guidelines. RESULTS: Between 58.4 and 74.5 percent of patients adhered to "aggressive" surveillance, with decreasing rates over the five-year period. Annual rates of poor surveillance adherence (7.3-23.6%) increased over time. Based on adjusted odds ratios, patients younger than 50 years of age (odds ratio 2.11 [95% CI 1.13-3.93], p<0.05), those lacking health insurance (odds ratio 3.08 [95% CI 1.09-8.68], p<0.05), and those with at least Stage IIB disease (odds ratio 3.21 [95% CI 1.36-7.58], p<0.01) are more likely to be poorly adherent to melanoma surveillance. CONCLUSION: This study's findings highlight some variables associated with poor surveillance adherence among melanoma survivors that could help to guide efforts in counseling this at-risk population.

16.
Radiother Oncol ; 121(1): 9-14, 2016 10.
Article in English | MEDLINE | ID: mdl-27543255

ABSTRACT

BACKGROUND: Recent reports demonstrate impaired tumor re-oxygenation 24-48 hours after stereotactic body radiation therapy (SBRT), suggesting that non-consecutive treatment delivery may be advantageous. To test this hypothesis clinically, we compared local control in patients treated in consecutive daily fractions vs. non-consecutive fractions. METHODS: We retrospectively reviewed 107 lung SBRT patients (117 tumors) treated for T1-T2N0 NSCLC with LINAC-based SBRT (50 or 60 Gy in 5 fractions). Patients were characterized as having been treated in consecutive daily fractions vs. non-consecutive fractions. Local control, survival, and toxicity endpoints (CTCAE v4.0) were compared. Propensity score matching and Cox regression analyses were performed to determine the effect of fractionation on local control. RESULTS: With a median follow-up of 23.7 months, 3-year local control was superior in the non-consecutive group: 93.3% vs. 63.6% in the consecutive group (p=0.001). Multivariate analysis and propensity score matching showed that consecutive fractionation was an independent predictor of local failure. Overall survival trended toward improvement in the non-consecutive group, but the difference was not statistically significant (p=0.188). Development of any grade 2 toxicity did not differ significantly between the two groups (p=0.75). CONCLUSION: Five-fraction SBRT delivered over non-consecutive days imparts superior local control and similar toxicity compared to consecutive fractionation. These results should be validated in independent datasets and in a prospective fashion.
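Propensity score matching, as used in the retrospective comparison above, pairs each treated patient with a control whose estimated probability of treatment is similar. A minimal sketch of one common variant, greedy 1:1 nearest-neighbor matching with a caliper, is below; the scores and caliper value are illustrative assumptions, not taken from the study, and the propensity scores themselves are assumed to have been estimated beforehand (e.g. by logistic regression).

```python
def greedy_match(treated_ps, control_ps, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on propensity scores.

    treated_ps, control_ps: lists of estimated propensity scores.
    Returns a list of (treated_index, control_index) pairs; controls
    are matched without replacement, and pairs whose score difference
    exceeds the caliper are discarded.
    """
    available = dict(enumerate(control_ps))  # unmatched controls
    pairs = []
    for ti, tp in enumerate(treated_ps):
        if not available:
            break
        # Nearest remaining control by absolute score difference
        ci, cp = min(available.items(), key=lambda kv: abs(kv[1] - tp))
        if abs(cp - tp) <= caliper:
            pairs.append((ti, ci))
            del available[ci]
    return pairs

# Illustrative scores: two treated patients, three candidate controls
pairs = greedy_match([0.30, 0.70], [0.28, 0.69, 0.50])
```

Greedy matching is order-dependent; optimal (full) matching and matching with replacement are common alternatives.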


Subject(s)
Carcinoma, Non-Small-Cell Lung/radiotherapy , Dose Fractionation, Radiation , Lung Neoplasms/radiotherapy , Radiosurgery/methods , Aged , Aged, 80 and over , Female , Follow-Up Studies , Humans , Male , Middle Aged , Propensity Score , Retrospective Studies , Treatment Outcome
17.
BMC Infect Dis ; 16: 283, 2016 06 13.
Article in English | MEDLINE | ID: mdl-27296465

ABSTRACT

BACKGROUND: Hepatitis C virus (HCV) infection can be cured with new, highly effective antiviral treatments, yet more than 185 million individuals worldwide remain HCV positive (the vast majority undiagnosed or untreated). Importantly, HCV is a leading cause of chronic liver disease and liver cancer, especially in Sub-Saharan Africa (SSA), where prevalence remains high but uncertain due to limited population-based evidence of the epidemic. We aimed to synthesize available data to estimate and highlight the HCV disease burden in SSA. METHODS: Weighted random-effects generalized linear mixed models were used to estimate prevalence by risk cohort, African region (Southern, Eastern, Western, and Central Africa), type of assay used, publication year, and whether the estimate included children. A pooled prevalence estimate was also calculated. Multivariable analyses were limited to cohort- and region-specific prevalence estimates in the adult population because few studies included children. Prevalence estimates were additionally weighted using the known adult population size within each region. RESULTS: We included more than 10 years of data. Almost half of the studies on HCV prevalence in SSA were from the Western region (49 %), and over half of all studies were from either blood donor (25 %) or general population cohorts (31 %). In univariable analyses, prevalence was lowest in Southern Africa (0.72 %), followed by Eastern Africa (3.00 %), Western Africa (4.14 %), and Central Africa (7.82 %). Blood donors consistently had the lowest prevalence (1.78 %), followed by pregnant women (2.51 %), individuals with comorbid HIV (3.57 %), individuals from the general population (5.41 %), those with a chronic illness (7.99 %), and those at high risk for infection (10.18 %). After adjusting for the population size in each region, the overall adult prevalence of HCV in SSA rose from 3.82 % to 3.94 %. CONCLUSION: This meta-analysis offers a timely update on the HCV disease burden in SSA and additional evidence of a burgeoning epidemic. The findings highlight the need to account for cohort type and regional variation when describing the HCV epidemic in SSA, the need for more studies that include children, and the need to factor in such variation when planning public health interventions.
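The population-size weighting of regional prevalence estimates described above amounts to a population-weighted average of the regional rates. A minimal sketch is below; it is not the paper's random-effects model, and the population weights in the example are placeholders, not the actual regional adult population sizes.

```python
def population_weighted_prevalence(regions):
    """Population-weighted average prevalence.

    regions: list of (prevalence_percent, adult_population) tuples,
    one per region. Returns the weighted prevalence in percent.
    """
    total = sum(pop for _, pop in regions)
    return sum(p * pop for p, pop in regions) / total

# Regional prevalences from the abstract (Southern, Eastern, Western,
# Central); the weights here are arbitrary placeholders for illustration.
est = population_weighted_prevalence(
    [(0.72, 70), (3.00, 400), (4.14, 380), (7.82, 150)]
)
```

Weighting by population size shifts the pooled estimate toward the rates of the most populous regions, which is why the abstract's adjusted estimate (3.94 %) differs from the unweighted pooled value (3.82 %).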


Subject(s)
Blood Donors/statistics & numerical data , Coinfection/epidemiology , HIV Infections/epidemiology , Hepatitis C/epidemiology , Pregnancy Complications, Infectious/epidemiology , Adult , Africa South of the Sahara/epidemiology , Africa, Central/epidemiology , Africa, Eastern/epidemiology , Africa, Southern/epidemiology , Africa, Western/epidemiology , Chronic Disease , Female , Hepacivirus , Humans , Multivariate Analysis , Pregnancy , Prevalence
18.
Int Urol Nephrol ; 48(8): 1321-1326, 2016 Aug.
Article in English | MEDLINE | ID: mdl-27209426

ABSTRACT

PURPOSE: Diuretics remain an important medication class for hypertension management among adults with chronic kidney disease (CKD), but they may also worsen urinary symptoms, especially urinary incontinence (UI). This single-center pilot study examined the prevalence of UI among adults aged ≥60 years with CKD using diuretics and assessed diuretic avoidance due to urinary symptoms. METHODS: Patients with non-dialysis-dependent CKD (estimated glomerular filtration rate <60 ml/min/1.73 m(2)) and diuretic use were recruited from outpatient nephrology clinics. Urinary symptoms and diuretic avoidance were assessed using standardized questionnaires. RESULTS: The cohort of 44 women and 54 men had a mean age of 71.8 (8.4) years; urgency-UI, stress-UI and mixed-UI (the presence of both urgency-UI and stress-UI) were reported by 44.9 % (n = 44), 36.7 % (n = 36) and 26.5 % (n = 26), respectively. Nocturia was noted in 68 % (n = 67). Overall, 15.3 % (6 men and 9 women) reported diuretic avoidance. Diuretic avoidance was reported by 27.3 % (n = 12), 25.5 % (n = 9) and 34.6 % (n = 9) of participants with urgency-UI, stress-UI and mixed-UI, respectively, while only 6.8 % (n = 3) of participants without any UI reported diuretic avoidance. After adjusting for age, sex and diuretic type (loop vs. others), both urgency-UI (odds ratio 5.9; 95 % CI 1.5-22.8) and mixed-UI (odds ratio 5.7; 95 % CI 1.6-19.9) were significantly associated with diuretic avoidance compared to participants without urgency-UI or mixed-UI, respectively. Stress-UI and nocturia were not significantly associated with diuretic avoidance. CONCLUSIONS: UI is common among older adults with CKD receiving diuretics. Patients with urgency-UI are more likely to avoid diuretics.


Subject(s)
Diuretics/adverse effects , Patient Compliance/statistics & numerical data , Renal Insufficiency, Chronic/drug therapy , Surveys and Questionnaires , Urinary Incontinence/chemically induced , Urinary Incontinence/epidemiology , Age Distribution , Aged , Cohort Studies , Diuretics/therapeutic use , Female , Humans , Hypertension/diagnosis , Hypertension/drug therapy , Hypertension/epidemiology , Incidence , Logistic Models , Male , Middle Aged , Odds Ratio , Pilot Projects , Prognosis , Quality of Life , Renal Insufficiency, Chronic/diagnosis , Renal Insufficiency, Chronic/epidemiology , Retrospective Studies , Risk Assessment , Sex Distribution , Statistics, Nonparametric , Urinary Incontinence/physiopathology