Results 1 - 20 of 38
1.
J Obstet Gynaecol Can ; 46(6): 102458, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38615915

ABSTRACT

Our objective was to determine if placental lake presence or size is associated with adverse pregnancy outcomes. This was a retrospective cohort study of patients who had fetal anatomy ultrasounds at 18-22 weeks and delivered between 2018 and 2022. Placental lakes were classified as small (>2.0 to 3.9 cm) or large (≥4 cm). Multiple gestations, placenta previas, and placenta accretas were excluded. Outcomes included low birthweight, cesarean delivery, primary cesarean for non-reassuring fetal heart tracing, fetal growth restriction, preterm birth, and severe preeclampsia. A total of 1052 patients were included; 294 had placental lakes (204 small, 90 large). No differences in pregnancy outcomes were observed.


Subject(s)
Pregnancy Outcome , Ultrasonography, Prenatal , Humans , Female , Pregnancy , Retrospective Studies , Adult , Placenta/diagnostic imaging , Placenta/anatomy & histology , Pregnancy Trimester, Second , Cesarean Section
2.
AIDS Behav ; 27(7): 2370-2375, 2023 Jul.
Article in English | MEDLINE | ID: mdl-36576664

ABSTRACT

In January 2021, cabotegravir/rilpivirine, the first extended-release injectable regimen for the treatment of human immunodeficiency virus (HIV), was approved. Long-acting injections have the potential to improve adherence and viral suppression. We analyzed the acceptance rate of, and reasons for declining to switch to, the new regimen. During routine appointments, 102 people living with HIV (PLWH) were presented with information on the new medication and asked if they would like to switch from their current regimen. If they declined to switch, they were asked why. Sixty-nine percent of respondents declined to switch, with frequency of injections as the primary reason. Patients indicated they would be willing to switch if the interval between injections was longer. Forty percent of the patients accepting the injectable anti-retrovirals (ARVs) were not on any other medications. Barriers to switching to long-acting injectable ARVs include the need for more frequent provider visits, aversion to needles, and a perceived lack of evidence supporting the new medication.


Subject(s)
Anti-HIV Agents , HIV Infections , Humans , Rilpivirine/therapeutic use , Anti-HIV Agents/therapeutic use , HIV Infections/drug therapy , Anti-Retroviral Agents/therapeutic use , Injections
3.
BMC Infect Dis ; 22(1): 620, 2022 Jul 15.
Article in English | MEDLINE | ID: mdl-35840929

ABSTRACT

BACKGROUND: Clostridioides difficile infection (CDI) has been characterized by the Centers for Disease Control and Prevention (CDC) as an urgent public health threat and a major concern in hospital, outpatient, and extended-care facilities worldwide. METHODS: A retrospective cohort study of patients aged ≥ 18 years hospitalized with CDI in New York State (NYS) between January 1, 2014 and December 31, 2016. Data were extracted from the New York Statewide Planning and Research Cooperative System (SPARCS), and propensity score matching was performed to achieve comparability of the CDI (exposure) and non-CDI (non-exposure) groups. Of the 3,714,486 hospitalizations, 28,874 incident CDI cases were successfully matched to 28,874 non-exposures. RESULTS: The matched pairs comparison demonstrated that CDI cases were more likely to be readmitted to the hospital at 30 (28.26% vs. 19.46%), 60 (37.65% vs. 26.02%), 90 (42.93% vs. 30.43%), and 120 days (46.47% vs. 33.74%), had greater mortality rates at 7 (3.68% vs. 2.0%) and 180 days (20.54% vs. 11.96%), with significant increases in length of stay and total hospital charges (p < .001, respectively). CONCLUSIONS: CDI is associated with a large burden on patients and health care systems, significantly increasing hospital utilization, costs, and mortality.


Subject(s)
Clostridioides difficile , Clostridium Infections , Cross Infection , Health Care Costs , Hospitalization , Humans , Length of Stay , Propensity Score , Retrospective Studies
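The matching step described in entry 3 pairs each exposed case with an unexposed control of similar propensity score. A minimal greedy 1:1 nearest-neighbor sketch is shown below; this is illustrative only, not the SPARCS analysis code, and the function name, caliper value, and toy data are hypothetical:

```python
def greedy_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on propensity scores.

    treated/controls: lists of (unit_id, propensity_score) tuples.
    Each control is used at most once; pairs farther apart than the
    caliper are discarded. Returns (treated_id, control_id) pairs.
    """
    available = dict(controls)
    pairs = []
    # Match treated units in descending score order, a common convention.
    for tid, t_score in sorted(treated, key=lambda x: -x[1]):
        best, best_dist = None, caliper
        for cid, c_score in available.items():
            dist = abs(t_score - c_score)
            if dist <= best_dist:
                best, best_dist = cid, dist
        if best is not None:
            pairs.append((tid, best))
            del available[best]  # control consumed
    return pairs
```

Production analyses typically match on the logit of the score with a caliper of 0.2 standard deviations and then verify covariate balance between the matched groups.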
4.
J Neurooncol ; 155(2): 117-124, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34601657

ABSTRACT

PURPOSE: Pre-clinical evidence suggests bevacizumab (BV) depletes the GBM peri-vascular cancer-stem cell niche. This phase I/II study assesses the safety and efficacy of repeated doses of superselective intra-arterial cerebral infusion (SIACI) of BV after blood-brain barrier disruption (BBBD). METHODS: Date of surgery was day 0. Evaluated patients received repeated SIACI bevacizumab (15 mg/kg) with BBBD at days 30 ± 7, 120 ± 7, and 210 ± 7, along with 6 weeks of standard chemoradiation. Response Assessment in Neuro-Oncology (RANO) criteria and the Kaplan-Meier product-limit method were used to evaluate progression-free and overall survival (PFS and OS, respectively). RESULTS: Twenty-three patients with a median age of 60.5 years (SD = 12.6; 24.7-78.3) were included. Isocitrate dehydrogenase mutation was found in 1/23 (4%) patients. MGMT status was available for 11/23 patients (7 unmethylated; 3 methylated; 1 inconclusive). Median tumor volume was 24.0 cm3 (SD = 31.1, 1.7-48.3 cm3). Median PFS was 11.5 months (95% CI 7.7-25.9), with 6-, 12-, 24-, and 60-month PFS estimated to be 91.3% (95% CI 69.5-97.8), 47.4% (26.3-65.9), 32.5% (14.4-52.2), and 5.4% (0.4-21.8), respectively. Median OS was 23.1 months (95% CI 12.2-36.9), with 12-, 24-, and 36-month OS of 77.3% (95% CI 53.6-89.9), 45.0% (22.3-65.3), and 32.1% (12.5-53.8), respectively. CONCLUSIONS: Repeated dosing of IA BV after BBBD offers an encouraging outcome in terms of PFS and OS. Phase III trials are warranted to determine whether repeated IA BV combined with the Stupp protocol is superior to the Stupp protocol alone for newly diagnosed GBM.


Subject(s)
Bevacizumab , Blood-Brain Barrier , Brain Neoplasms , Glioblastoma , Adult , Aged , Bevacizumab/administration & dosage , Bevacizumab/adverse effects , Blood-Brain Barrier/pathology , Brain Neoplasms/drug therapy , Drug Administration Schedule , Glioblastoma/drug therapy , Humans , Infusions, Intra-Arterial , Middle Aged , Treatment Outcome
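The Kaplan-Meier product-limit method cited in entry 4 for the PFS and OS estimates can be written in a few lines. A minimal sketch (not the authors' analysis code; the times and event flags in the test are toy data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimates.

    times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns a list of (time, survival_probability) at each event time.
    Subjects censored at time t are counted as at risk at t (standard).
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        n_events = n_censored = 0
        # Group all observations sharing the same time.
        while i < len(data) and data[i][0] == t:
            if data[i][1]:
                n_events += 1
            else:
                n_censored += 1
            i += 1
        if n_events:
            survival *= 1 - n_events / n_at_risk
            curve.append((t, survival))
        n_at_risk -= n_events + n_censored
    return curve
```

Median survival is then the earliest event time at which the curve drops to 0.5 or below.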
5.
Childs Nerv Syst ; 37(7): 2251-2259, 2021 07.
Article in English | MEDLINE | ID: mdl-33738542

ABSTRACT

PURPOSE: We describe a detailed evaluation of predictors associated with individual lead placement efficiency and accuracy for 261 stereoelectroencephalography (sEEG) electrodes placed for epilepsy monitoring in twenty-three children at our institution. METHODS: Intra- and post-operative data were used to generate a linear mixed model to investigate predictors associated with three outcomes (lead placement time, lead entry error, lead target error) while accounting for correlated observations from the same patients. Lead placement time was measured using electronic time-stamp records stored by the ROSA software for each individual electrode; entry and target site accuracy was measured using postoperative stereotactic CT images fused with preoperative electrode trajectory planning images on the ROSA computer software. Predictors were selected from a list of variables that included patient demographics, laterality of leads, anatomic location of lead, skull thickness, bolt cap device used, and lead sequence number. RESULTS: Twenty-three patients (11 female, 48%) of mean age 11.7 (± 6.1) years underwent placement of intracranial sEEG electrodes (median 11 electrodes) at our institution over a period of 1 year. There were no associated infections, hemorrhages, or other adverse events, and successful seizure capture was obtained in all monitored patients. The mean placement time for individual electrodes across all patients was 6.56 (± 3.5) min; mean target accuracy was 4.5 (± 3.5) mm. Lesional electrodes were associated with 25.7% (95% CI: 6.7-40.9%, p = 0.02) smaller target point errors. Larger skull thickness was associated with larger error: for every 1-mm increase in skull thickness, there was a 4.3% (95% CI: 1.2-7.5%, p = 0.007) increase in target error. Bilateral lead placement was associated with 26.0% (95% CI: 9.9-44.5%, p = 0.002) longer lead placement time.
The relationship between placement time and lead sequence number was nonlinear: placement time decreased consistently over the first 4 electrodes, with the decrease becoming less pronounced thereafter. CONCLUSIONS: Variation in sEEG electrode placement efficiency and accuracy can be explained by phenomena both within and outside of operator control. It is important to keep in mind the factors that can lead to better or worse lead placement efficiency and/or accuracy in order to maximize patient safety while maintaining the standard of care.


Subject(s)
Robotics , Child , Electrodes, Implanted , Electroencephalography , Female , Humans , Seizures , Stereotaxic Techniques
6.
Endocr Pract ; 25(1): 6-15, 2019 Jan.
Article in English | MEDLINE | ID: mdl-30383486

ABSTRACT

OBJECTIVE: To determine which vitamin D dose, formulation, and schedule most effectively and safely achieves a 25-hydroxyvitamin D (25[OH]D) level of >30 ng/mL (75 nmol/L). METHODS: In this prospective study, 100 subjects from the NY Harbor HCS Brooklyn Campus, ages 25 to 85 years, with 25(OH)D <30 ng/mL (<75 nmol/L), were randomized into four groups: cholecalciferol (D3) 2,000 international units (IU) daily; D3 3,000 IU daily; ergocalciferol (D2) 50,000 IU weekly; and D2 50,000 IU twice weekly. All were supplemented with 500 mg calcium carbonate daily. 25(OH)D, parathyroid hormone (PTH), urinary calcium, urinary creatinine, and other variables were measured during 7 visits over 12 months. RESULTS: All groups achieved a mean vitamin D level >30 ng/mL (>75 nmol/L) by visit 4 (5 months). Those receiving 50,000 IU D2 twice weekly displayed the most rapid and robust response, with 25(OH)D reaching >30 ng/mL (>75 nmol/L) after only 1 month and plateauing at 60 ng/mL (150 nmol/L) by 7 months. Although no statistically significant difference was seen in mean 25(OH)D levels between groups 1 through 3, subjects on 50,000 IU D2 weekly more consistently showed higher mean levels than either group 1 or 2. No episodes of significant hypercalcemia occurred. There was a negative correlation between mean PTH levels and mean vitamin D levels in group 4 and in all groups combined. CONCLUSION: All four schedules of vitamin D replacement were effective in safely achieving and maintaining 25(OH)D >30 ng/mL (>75 nmol/L). D2 50,000 IU twice weekly provided the most rapid attainment and highest mean levels of vitamin D. ABBREVIATIONS: 25(OH)D = 25-hydroxyvitamin D; BMI = body mass index; BUN = blood urea nitrogen; Ca/Cr = calcium/creatinine; D2 = ergocalciferol; D3 = cholecalciferol; IU = international units; PTH = parathyroid hormone.


Subject(s)
Vitamin D Deficiency , Vitamin D/pharmacology , Adult , Aged , Aged, 80 and over , Cholecalciferol , Dietary Supplements , Humans , Middle Aged , Parathyroid Hormone , Prospective Studies , Vitamins
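Entry 6 reports 25(OH)D in both ng/mL and nmol/L. The conversion factor for 25-hydroxyvitamin D is approximately 2.496 nmol/L per ng/mL (often rounded to 2.5, matching the abstract's 30 ng/mL ≈ 75 nmol/L); a small helper for reference:

```python
# Conversion between the two 25(OH)D units used in the abstract.
# The factor follows from the molar mass of 25-hydroxyvitamin D
# (calcifediol, ~400.6 g/mol): 1 ng/mL = 1 ug/L -> ~2.496 nmol/L.
NMOL_PER_NG_ML = 2.496

def ng_ml_to_nmol_l(ng_ml):
    """Convert a 25(OH)D concentration from ng/mL to nmol/L."""
    return ng_ml * NMOL_PER_NG_ML

def nmol_l_to_ng_ml(nmol_l):
    """Convert a 25(OH)D concentration from nmol/L to ng/mL."""
    return nmol_l / NMOL_PER_NG_ML
```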
7.
J Stroke Cerebrovasc Dis ; 28(5): 1243-1251, 2019 May.
Article in English | MEDLINE | ID: mdl-30745230

ABSTRACT

OBJECTIVE: To explore a 5-year comparison of disparities in intravenous t-PA (IV t-PA) use among acute ischemic stroke (AIS) patients based on race, gender, age, ethnic origin, hospital status, and geographic location. METHODS: We extracted patients' demographic information and hospital characteristics for 2010 and 2014 from the New York Statewide Planning and Research Cooperative System (SPARCS). We compared disparities in IV t-PA use among AIS patients in 2010 to those in 2014 to estimate temporal trends. Multiple logistic regression was performed to compare disparities based on demographic variables, hospital designation, and geographic location. RESULTS: Overall, there was approximately a 2% increase in IV t-PA use from 2010 to 2014. Blacks were 15% less likely to receive IV t-PA compared to Whites in 2014, but in 2010, there was no difference. Patients aged 62-73 had lower odds of receiving IV t-PA than the ≤61 age group in both 2010 and 2014. Designated stroke centers in the Lower New York State region were associated with reduced odds of IV t-PA use in 2010, while those located in the Upper New York State region were associated with increased odds of IV t-PA use in both 2010 and 2014, compared to their respective nondesignated counterparts. Gender, ethnic origin, and insurance status were not associated with IV t-PA utilization in either 2010 or 2014. CONCLUSION: Overall IV t-PA utilization among AIS patients increased between 2010 and 2014. However, there are evident disparities in IV t-PA use based on patient's race, age, hospital geography, and stroke designation status.


Subject(s)
Brain Ischemia/drug therapy , Fibrinolytic Agents/administration & dosage , Health Services Accessibility/trends , Healthcare Disparities/trends , Process Assessment, Health Care/trends , Stroke/drug therapy , Thrombolytic Therapy/trends , Administration, Intravenous , Age Factors , Aged , Aged, 80 and over , Brain Ischemia/diagnosis , Brain Ischemia/ethnology , Databases, Factual , Female , Healthcare Disparities/ethnology , Humans , Male , Middle Aged , New York/epidemiology , Racial Groups , Sex Factors , Stroke/diagnosis , Stroke/ethnology , Time Factors , Treatment Outcome
8.
Am J Occup Ther ; 73(2): 7302205120p1-7302205120p9, 2019.
Article in English | MEDLINE | ID: mdl-30915973

ABSTRACT

OBJECTIVE: Our objective was to evaluate the effectiveness of four adapted feeding utensils in participants with essential tremor (ET) or tremor related to Parkinson's disease (PD). METHOD: Participants performed a simulated feeding task under five conditions: (1) standard spoon (control condition), (2) weighted spoon with standard handle, (3) weighted spoon with built-up handle, (4) swivel spoon, and (5) Liftware Steady™ spoon, a product using active tremor cancellation technology. Participants rated each adapted utensil in comparison with the standard spoon regarding performance, ease of use, speed, neatness, and aesthetics. RESULTS: Participants preferred the Liftware Steady spoon and the weighted spoon with standard handle. Friedman's test did not reveal statistically significant differences in ratings between the two preferred utensils. CONCLUSION: Participants had varied reactions to the different adaptive utensils and gave different reasons for their preferences. These findings support the need for people with tremor related to ET or PD to have access to trial use of all four devices assessed in this study.


Subject(s)
Essential Tremor , Household Articles , Parkinson Disease , Humans
9.
Stroke ; 49(8): 1933-1938, 2018 08.
Article in English | MEDLINE | ID: mdl-29976582

ABSTRACT

Background and Purpose- The 2015 updated US Food and Drug Administration alteplase package insert altered several contraindications. We thus explored clinical factors influencing alteplase treatment decisions for patients with minor stroke. Methods- An expert panel selected 7 factors to build a series of survey vignettes: National Institutes of Health Stroke Scale (NIHSS), NIHSS area of primary deficit, baseline functional status, previous ischemic stroke, previous intracerebral hemorrhage, recent anticoagulation, and temporal pattern of symptoms in first hour of care. We used a fractional factorial design (150 vignettes) to provide unconfounded estimates of the effect of all 7 main factors, plus first-order interactions for NIHSS. Surveys were emailed to national organizations of neurologists, emergency physicians, and colleagues. Physicians were randomized to 1 of 10 sets of 15 vignettes, presented randomly. Physicians reported the subjective likelihood of giving alteplase on a 0 to 5 scale; scale categories were anchored to 6 probabilities from 0% to 100%. A conjoint statistical analysis was applied. Results- Responses from 194 US physicians yielded 156 with complete vignette data: 74% male, mean age 46, 80% neurologists. Treatment mean probabilities for individual vignettes ranged from 6% to 95%. Treatment probability increased from 24% for NIHSS score =1 to 41% for NIHSS score =5. The conjoint model accounted for 25% of total observed response variance. In contrast, a model accounting for all possible interactions accounted for 30% variance. Four of the 7 factors accounted jointly for 58% of total relative importance within the conjoint model: previous intracerebral hemorrhage (18%), recent anticoagulation (17%), NIHSS (13%), and previous ischemic stroke (10%). 
Conclusions- Four main variables jointly account for only a small fraction (<15%) of the total variance related to deciding to treat with intravenous alteplase, reflecting high variability and complexity. Future studies should consider other variables, including physician characteristics.


Subject(s)
Clinical Decision-Making , Physicians/trends , Stroke/drug therapy , Surveys and Questionnaires , Thrombolytic Therapy/trends , Tissue Plasminogen Activator/administration & dosage , Administration, Intravenous , Clinical Decision-Making/methods , Female , Humans , Male , Stroke/diagnostic imaging , Treatment Outcome
10.
Learn Mem ; 24(7): 267-277, 2017 07.
Article in English | MEDLINE | ID: mdl-28620074

ABSTRACT

Dendritic regulatory BC1 RNA is a non-protein-coding (npc) RNA that operates in the translational control of gene expression. The absence of BC1 RNA in BC1 knockout (KO) animals causes translational dysregulation that entails neuronal phenotypic alterations including prolonged epileptiform discharges, audiogenic seizure activity in vivo, and excessive cortical oscillations in the γ frequency band. Here we asked whether BC1 RNA control is also required for higher brain functions such as learning, memory, or cognition. To address this question, we used odor/object attentional set shifting tasks in which prefrontal cortical performance was assessed in a series of discrimination and conflict learning sessions. Results obtained in these behavioral trials indicate that BC1 KO animals were significantly impaired in their cognitive flexibility. When faced with conflicting information sources, BC1 KO animals committed regressive errors as they were compromised in their ability to disengage from recently acquired memories even though recall of such memories was in conflict with new situational context. The observed cognitive deficits are reminiscent of those previously described in subtypes of human autism spectrum disorders.


Subject(s)
Attention/physiology , Cognition Disorders/genetics , Cognition Disorders/physiopathology , Odorants , RNA, Small Cytoplasmic/metabolism , Animals , Conflict, Psychological , Discrimination Learning/physiology , Grooming/physiology , Learning Curve , Maze Learning , Mental Recall/physiology , Mice , Mice, Inbred C57BL , Mice, Knockout , RNA, Small Cytoplasmic/genetics
11.
Am J Nephrol ; 46(2): 114-119, 2017.
Article in English | MEDLINE | ID: mdl-28704826

ABSTRACT

BACKGROUND: We hypothesized that among very elderly dialysis patients in the United States, institutionalization in nursing homes would increase mortality beyond the effect of age alone. METHODS: Incident dialysis patients from 2001 to 2008 above the age of 70 were included. Patients were categorized into 4 groups according to age as 70-75, 76-80, 81-85, and >85 years and further divided into institutionalized and noninstitutionalized. Kaplan-Meier survival curves were plotted to assess patient survival. RESULTS: A total of 349,440 patients were identified above the age of 70 at the time of initiation of dialysis. For institutionalized patients, the mean survival was significantly lower: 1.71 ± 0.03 years for those in the age range 70-75, 1.44 ± 0.02 years for those in the age range 76-80, 1.25 ± 0.02 years for those in the age range 81-85, and 1.04 ± 0.02 years for those in the >85 years age group (p = 0.0001). The hazard ratio for mortality in institutionalized elderly patients on dialysis was 1.80 ([95% CI 1.77-1.83]; p = 0.0001). After adjustment for other variables (multivariate Cox regression), institutionalization remained an independent risk factor for mortality (adjusted hazard ratio = 1.57 [95% CI 1.54-1.60]; p = 0.0001). CONCLUSION: There was increased mortality in institutionalized elderly patients as compared to noninstitutionalized elderly patients in the same age group. In accordance with the increased frailty and decreased benefits of therapies in the very elderly, especially in those with additional co-morbidities besides age, palliative and end-of-life care should be considered.


Subject(s)
Frailty/mortality , Institutionalization/statistics & numerical data , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/therapy , Renal Dialysis/adverse effects , Age Factors , Aged , Aged, 80 and over , Female , Frail Elderly/statistics & numerical data , Health Information Systems/statistics & numerical data , Humans , Kaplan-Meier Estimate , Male , Proportional Hazards Models , Renal Dialysis/statistics & numerical data , Risk Factors , Treatment Outcome , United States/epidemiology
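Entry 11 reports hazard ratios with 95% confidence intervals from Cox models. Given a log-hazard coefficient and its standard error, the ratio and its Wald interval are recovered by exponentiation. A sketch with made-up inputs (not the study's estimates):

```python
import math

def hazard_ratio_ci(beta, se, z=1.96):
    """Hazard ratio and Wald 95% CI from a Cox log-hazard coefficient.

    beta: estimated coefficient on the log-hazard scale.
    se:   its standard error.
    z:    normal quantile (1.96 for a two-sided 95% interval).
    """
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))
```

Because the interval is symmetric on the log scale, the reported CI bounds are asymmetric around the hazard ratio itself, as in the abstract's 1.80 (1.77-1.83).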
12.
Am J Nephrol ; 45(2): 180-186, 2017.
Article in English | MEDLINE | ID: mdl-28110327

ABSTRACT

INTRODUCTION: The outcomes of patients who fail their kidney transplant and return to dialysis (RTD) have not been investigated in a nationally representative sample. We hypothesized that variations in management of transplant chronic kidney disease stage 5 leading to kidney allograft failure (KAF) and RTD, such as access, nutrition, timing of dialysis, and anemia management, predict long-term survival. METHODS: We used an incident cohort of patients from the United States Renal Data System who initiated hemodialysis between January 1, 2003 and December 31, 2008, after KAF. We used Cox regression analysis for statistical associations, with mortality as the primary outcome. RESULTS: We identified 5,077 RTD patients and followed them for a mean of 30.9 ± 22.6 months. Adjusting for all possible confounders at the time of RTD, the adjusted hazards ratio (AHR) for death was increased with lack of arteriovenous fistula at initiation of dialysis (AHR 1.22, 95% CI 1.02-1.46, p = 0.03), albumin <3.5 g/dL (AHR 1.33, 95% CI 1.18-1.49, p = 0.0001), and being underweight (AHR 1.30, 95% CI 1.07-1.58, p = 0.006). Hemoglobin <10 g/dL (AHR 0.96, 95% CI 0.86-1.06, p = 0.46), type of insurance, and zip code-based median household income were not associated with higher mortality. Glomerular filtration rate <10 mL/min/1.73 m2 at time of dialysis initiation (AHR 0.83, 95% CI 0.75-0.93, p = 0.001) was associated with reduction in mortality. CONCLUSIONS: Excess mortality risk observed in patients starting dialysis after KAF is multifactorial, including nutritional issues and vascular access. Adequate preparation of patients with failing kidney transplants prior to resuming dialysis may improve outcomes.


Subject(s)
Graft Rejection , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/therapy , Kidney Transplantation/adverse effects , Renal Dialysis , Adult , Aged , Allografts/pathology , Anemia/drug therapy , Anemia/mortality , Cohort Studies , Female , Glomerular Filtration Rate , Heart Failure/epidemiology , Hematinics/therapeutic use , Hemoglobins/analysis , Humans , Incidence , Kidney Failure, Chronic/blood , Male , Middle Aged , Patient Transfer , Proportional Hazards Models , Risk Factors , Survival Rate , Time Factors , Transplantation, Homologous/adverse effects , United States/epidemiology
13.
J Clin Rheumatol ; 23(1): 1-5, 2017 Jan.
Article in English | MEDLINE | ID: mdl-28002149

ABSTRACT

BACKGROUND: Hyperuricemia is associated with development of gout, hypertension, and renal disease. The impact of allopurinol, a urate-lowering therapy, on renal function is unclear, especially in patients with chronic kidney disease who are at higher risk of hypersensitivity reaction. OBJECTIVES: The aim of this study was to determine the effect of allopurinol on kidney function in hyperuricemic male veterans. METHODS: This is a retrospective cohort study using pharmacy, medical, and laboratory records of veterans enrolled at the Veterans Administration New York Harbor Healthcare System, Brooklyn campus. Fifty patients with hyperuricemia, defined as a serum uric acid greater than 7 mg/dL (average of ~9 mg/dL), newly started on allopurinol for any reason, with evidence of treatment compliance, were matched by age, race, sex, and estimated glomerular filtration rate (EGFR) to 50 hyperuricemic control subjects. The retrospective cases were observed from October 2000 until November 2006, at which time there was a change in the laboratory analyzer, making further comparisons inappropriate. RESULTS: On average, patients treated with a mean 221 (SD, 96) mg/d dose of allopurinol achieved 11.9 mL/min higher GFR (95% confidence interval, 4.8-11.9 mL/min; P = 0.01) than did the control group. Treatment effect was found to depend on the initial EGFR, as indicated by the significant treatment by initial EGFR interaction (P = 0.004), and increased with a higher initial EGFR. The allopurinol-treated group had a 0.10 mg/dL lower final creatinine level (95% confidence interval, 0.003-0.20 mg/dL; P = 0.04) than did the control subjects, adjusted for initial creatinine and age. The average length of follow-up was 3.4 years. There were 5 mild adverse events in the treated cases.
CONCLUSIONS: Treatment of hyperuricemic patients with allopurinol over an average of 3.4 years resulted in a significant improvement of kidney function in this male cohort from the Veterans Administration Healthcare System. Clinicians should consider this potential benefit of allopurinol in the treatment of patients with hyperuricemia, especially those with overall maintained renal function.


Subject(s)
Allopurinol , Glomerular Filtration Rate/drug effects , Hyperuricemia , Renal Insufficiency, Chronic , Aged , Allopurinol/administration & dosage , Allopurinol/adverse effects , Antimetabolites/administration & dosage , Antimetabolites/adverse effects , Creatinine/blood , Humans , Hyperuricemia/blood , Hyperuricemia/complications , Hyperuricemia/drug therapy , Male , Middle Aged , New York City , Protective Agents/administration & dosage , Protective Agents/adverse effects , Renal Insufficiency, Chronic/complications , Renal Insufficiency, Chronic/diagnosis , Renal Insufficiency, Chronic/physiopathology , Retrospective Studies , Treatment Outcome , Uric Acid/blood , Veterans Health/statistics & numerical data
15.
Helicobacter ; 20(1): 64-8, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25308209

ABSTRACT

BACKGROUND: Recently, publications in adults and children have documented a potential role of Helicobacter pylori (H. pylori) in decreasing the likelihood of obesity. The present study compares the prevalence of H. pylori colonization between obese (body mass index [BMI] ≥ 95th percentile) and healthy weight (BMI ≥ 5th to <85th percentiles) children seen at an inner city medical center in the United States. METHODS: This retrospective study reviewed clinical features, BMI, and gastric histology of consecutive children aged 1-18 years undergoing an esophagogastroduodenoscopy. BMI percentile was calculated for age and gender. Helicobacter pylori colonization was determined by histopathologic identification of the organism. Multiple logistic regression was employed to measure the association between BMI and H. pylori colonization, controlling for baseline age, gender, and presenting symptoms. RESULTS: Among 340 patients (51.5% female, mean age of 10.5 ± 4.7 years), 98 (29%) were obese and 173 (51%) were healthy weight. The H. pylori colonization rate of the entire cohort was 18.5% (95% CI = 14.7-23.0%). Among obese children, 10% had H. pylori colonization compared to 21% of the healthy weight children (RR = 2.1, 95% CI = 1.1-4.0). Conversely, 39% of noncolonized children, but only 21% of the infected children, were obese (RR = 1.8, 95% CI = 1.1-3.3). Multivariate analysis revealed that being colonized with H. pylori is associated with a 50% reduction in the odds of being obese (adjusted OR = 0.5, 95% CI = 0.2-1.0). CONCLUSIONS: Our findings in a North American cohort are in agreement with studies from Asia and Europe suggesting that H. pylori infection decreases the prevalence of obesity in children. Further work to characterize the extent and nature of this relationship is warranted.


Subject(s)
Helicobacter Infections/complications , Helicobacter pylori/isolation & purification , Obesity/epidemiology , Adolescent , Biopsy , Child , Child, Preschool , Cohort Studies , Endoscopy, Digestive System , Female , Gastric Mucosa/microbiology , Gastric Mucosa/pathology , Humans , Infant , Male , Prevalence , Retrospective Studies , United States/epidemiology , Urban Population
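Entry 15 reports both relative risks and an adjusted odds ratio. From an unadjusted 2×2 table of counts, the two measures are computed as below (toy counts, not the study's data; the abstract's adjusted OR additionally required multiple logistic regression):

```python
def risk_measures(exposed_events, exposed_total, unexposed_events, unexposed_total):
    """Relative risk and odds ratio from a 2x2 table of counts.

    Rows: exposed vs. unexposed; columns: event vs. no event.
    """
    a = exposed_events                      # exposed, event
    b = exposed_total - exposed_events      # exposed, no event
    c = unexposed_events                    # unexposed, event
    d = unexposed_total - unexposed_events  # unexposed, no event
    relative_risk = (a / exposed_total) / (c / unexposed_total)
    odds_ratio = (a * d) / (b * c)
    return relative_risk, odds_ratio
```

For rare outcomes the two measures are close; for common outcomes (such as obesity here) the odds ratio exaggerates the relative risk, which is why the abstract reports both.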
16.
Teach Learn Med ; 26(4): 350-6, 2014.
Article in English | MEDLINE | ID: mdl-25318029

ABSTRACT

BACKGROUND: Medical students experience a high burden of stress and suffer elevated rates of depression, burnout, and suicide compared to the general population, yet there is no consensus on how to address student wellness. PURPOSES: The purpose of this study was to determine whether an abridged mindfulness-based stress reduction (MBSR) intervention can improve measures of wellness in a randomized sample of 1st-year medical students. METHODS: Fifty-eight participants were randomized to control or 8-week MBSR intervention and then invited to participate in the study. All participants were assessed using the Perceived Stress Scale (PSS), the Resilience Scale (RS), and the Self-Compassion Scale (SCS) at 3 separate time points: baseline, at the conclusion of the study intervention (8 weeks), and at 6 months after the conclusion of the intervention. The intervention consisted of 75 minutes of weekly class time, suggested meditation at home, and a half-day retreat in the last week. RESULTS: The intervention group achieved a significant increase in SCS scores both at the conclusion of the study (0.58, p = .002; 95% confidence interval (CI) [0.23, 0.92]) and at 6 months (0.56, p = .001; 95% CI [0.25, 0.87]). PSS scores showed a significant reduction at the conclusion of the study (3.63, p = .03; 95% CI [0.37, 6.89]), but not at 6 months poststudy (2.91, p = .08; 95% CI [-0.37, 6.19]). The study did not demonstrate a difference in RS after the intervention, though RS was significantly correlated with both SCS and PSS. CONCLUSIONS: An abridged MBSR intervention improves perceived stress and self-compassion in 1st-year medical students and may be a valuable curricular tool to enhance wellness and professional development.


Subject(s)
Mindfulness , Stress, Psychological/prevention & control , Students, Medical/psychology , Depression/prevention & control , Depression/psychology , Female , Humans , Male , Prospective Studies , Stress, Psychological/psychology , Suicidal Ideation , Surveys and Questionnaires , Treatment Outcome , Young Adult
17.
J Clin Rheumatol ; 20(2): 91-3, 2014 Mar.
Article in English | MEDLINE | ID: mdl-24561412

ABSTRACT

OBJECTIVE: The objective of this study was to survey members of the American College of Rheumatology (ACR) regarding intra-articular and soft tissue (musculoskeletal [MSK]) injections and to determine if injection techniques vary depending on type of practice and years of experience. METHODS: A survey was e-mailed to the members of the ACR to obtain demographics of the respondents, MSK injection practices, and adverse events seen. RESULTS: The most common indications for MSK injections were rheumatoid arthritis, osteoarthritis, and bursitis. Written consent and time-out procedures were more common in academic/government practices when compared with private practice. There was variation in the type of corticosteroid used. The most common preparations were methylprednisolone acetate (45.0%), triamcinolone acetonide (26.1%), and triamcinolone hexacetonide (22.1%). This survey showed good agreement on the dosage of corticosteroid for MSK injections; however, as years of experience increased, clinicians were more likely to prescribe lower doses for shoulder and knee injections. CONCLUSIONS: In this survey of ACR members, we found self-reported differences in the type of corticosteroid used for MSK injections. There was general agreement on frequency of injections, but more experienced practitioners reported using lower doses of corticosteroid.


Subject(s)
Arthritis, Rheumatoid/drug therapy , Bursitis/drug therapy , Glucocorticoids/administration & dosage , Osteoarthritis/drug therapy , Practice Patterns, Physicians' , Adult , Bursa, Synovial , Data Collection , Dose-Response Relationship, Drug , Female , Glucocorticoids/therapeutic use , Humans , Injections/methods , Injections, Intra-Articular , Male , Middle Aged , Self Report , Societies, Medical , Tendons , United States
18.
Adv Radiat Oncol ; 9(4): 101436, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38435966

ABSTRACT

Purpose: Disparities have been reported in women treated for breast cancer (BrCa). This study examines potential disparities in BrCa treatment offered based on race and age in a multicenter radiation department. Methods and Materials: We identified 901 patients with early stage BrCa who received curative intent radiation therapy (RT) between 2004 and 2018. Data extracted included age, race, disease stage, treatment technique, treatment dates, and fractionation. Patient race was recorded as Asian, Black, Hispanic, or White. RT technique delivered was classified as a type of external beam radiation therapy or brachytherapy/intraoperative radiation therapy. Fractionation schemes were defined as 1) standard fractionation, 1.8-2 Gy; 2) hypofractionation, 2.5-2.67 Gy; 3) accelerated partial breast irradiation (APBI), 3.4-4.25 Gy; and 4) intraoperative radiation therapy, a single dose of 20 Gy. Stage was recorded using TNM staging. The χ2 test and a multivariable multinomial logistic regression model were used to assess whether patient characteristics such as age, race, or stage influenced fractionation schemes. Results with 2-sided P values < .05 were considered statistically significant. Results: The racial composition of the cohort was 13.8% Asian, 22% Black, 29% White, and 35.1% Hispanic. Mean age was 61 years, and patients were divided into 4 age groups: 30 to 49 (n = 160), 50 to 59 (n = 231), 60 to 69 (n = 294), and ≥70 years (n = 216). In addition, 501 patients (56%) received hypofractionation, 342 (38.8%) received standard fractionation, and 58 (7.1%) received APBI. For all groups, hypofractionation became more common over time. Age ≥70 years was associated with 9 times higher odds of APBI and 14 times higher odds of hypofractionation compared with age 30 to 49 years. After adjusting for the other predictors in a multivariable multinomial logistic regression model, the race distribution differed among the 3 groups (P = .03), with a smaller percentage of Hispanic patients and a higher percentage of Black patients in the standard fractionation group. Conclusions: This study of a diverse cohort of patients with breast cancer failed to identify treatment differences associated with race. The study did find an association between age and hypofractionation.
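The χ2 test used above to compare fractionation schemes across groups reduces to comparing observed cell counts against the counts expected under independence. A minimal stdlib sketch on a hypothetical 2 × 3 contingency table (the counts are illustrative, not the study's data):

```python
def chi_square_statistic(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of rows and columns.
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts: rows = two race groups, columns = standard
# fractionation, hypofractionation, APBI.
table = [[40, 70, 10],
         [30, 90, 12]]
stat = chi_square_statistic(table)
df = (len(table) - 1) * (len(table[0]) - 1)  # degrees of freedom
print(round(stat, 3), df)
```

In practice the statistic would be compared against the χ2 distribution with `df` degrees of freedom (e.g., via `scipy.stats.chi2_contingency`) to obtain the 2-sided P value.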

19.
Clin Exp Emerg Med ; 2024 May 23.
Article in English | MEDLINE | ID: mdl-38778494

ABSTRACT

Introduction: Emergency department observation units (EDOUs) transition patients from the emergency department (ED) to dedicated areas where they can receive continuous monitoring. Understanding patient return visits after EDOU discharge is important for optimizing healthcare. The objective of this study was to investigate the association between demographic and clinical features and the likelihood of returning to the ED within 30 days following an initial assessment in the EDOU. Methods: This retrospective, observational cohort study of adult EDOU subjects was conducted between February 1, 2018, and January 31, 2023. Adult patients who were evaluated in the EDOU and returned to the ED within 30 days were identified and compared with patients assessed in the EDOU who did not return to the ED within 30 days. The analysis accounted for multiple visits by the same subject and adjusted for variables including gender, ethnicity, insurance status, primary diagnosis, and disposition, using a generalized linear mixed model. Results: A total of 14,910 EDOU encounters were analyzed, and 2,252 (15%) patients returned to the ED within 30 days. Several variables demonstrated a significant association with the likelihood of returning to the ED within 30 days: gender (p=0.0002), ethnicity (p=0.005), race (p=0.0004), insurance status (p<0.0001), primary diagnosis (p<0.0001), and disposition (p<0.001). Emergency severity index and length of stay were not associated with returning. Conclusions: Understanding these factors may guide interventions, enhance EDOU care, and reduce resource strain. Further research should explore these associations and the long-term impact of interventions for improved outcomes.

20.
Urogynecology (Phila) ; 30(3): 251-255, 2024 03 01.
Article in English | MEDLINE | ID: mdl-38484239

ABSTRACT

IMPORTANCE: This study aimed to assess an intervention to decrease patient discomfort after robotic sacral colpopexy. OBJECTIVE: Our primary objective was to determine whether preoperative use of polyethylene glycol decreases time to first postoperative bowel movement. Secondary outcomes included degree of pain with the first bowel movement and stool consistency. STUDY DESIGN: This was a randomized controlled trial. The experimental group was assigned polyethylene glycol daily for 7 days before surgery; the control group was not. All patients received polyethylene glycol postoperatively. RESULTS: There was no statistically significant reduction in time to first postoperative bowel movement with preoperative polyethylene glycol (mean [SD] in days for the control and experimental groups of 2.32 [0.99] and 1.96 [1.00], P = 0.21). There was a statistically significant reduction in pain with the first postoperative bowel movement in the experimental group (median [IQR] of 4 [2-5] vs 1 [0-2], P = 0.0007). Postoperative day 1 pain levels were also significantly lower in the experimental group (median [IQR] of 4 [3-6] vs 2 [0-4], P = 0.0484). In addition, patients in the experimental group had lower average postoperative pain levels over 7 days, with an estimated difference in median pain levels of 1.88 units (95% confidence interval, 0.64-3.12; P = 0.0038). CONCLUSIONS: Preoperative administration of polyethylene glycol did not decrease time to first postoperative bowel movement. Patients in the experimental group exhibited less pain with their first postoperative bowel movement and had improved pain levels on postoperative day 1.
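Median [IQR] comparisons like the pain scores above are typically tested with a rank-based statistic such as the Mann-Whitney U. A minimal sketch on hypothetical 0-10 pain scores (illustrative values, not the trial's data):

```python
from statistics import median

def mann_whitney_u(xs, ys):
    """Mann-Whitney U: number of (x, y) pairs with x < y; ties count 0.5."""
    u = 0.0
    for x in xs:
        for y in ys:
            if x < y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

# Hypothetical pain scores with first postoperative bowel movement.
control = [4, 5, 2, 4, 3]   # no preoperative polyethylene glycol
peg     = [1, 0, 2, 1, 2]   # preoperative polyethylene glycol
print(median(control), median(peg))
print(mann_whitney_u(peg, control))  # near len(peg)*len(control) when peg scores are lower
```

A U close to its maximum (here 5 × 5 = 25) indicates nearly all experimental-group scores rank below the control group's; the P value would come from the U distribution (e.g., `scipy.stats.mannwhitneyu`).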


Subject(s)
Defecation , Polyethylene Glycols , Humans , Polyethylene Glycols/therapeutic use , Pain, Postoperative