ABSTRACT
OBJECTIVE: An expert panel made recommendations to optimize surgical education and training in light of contemporary challenges. BACKGROUND: The inaugural Blue Ribbon Committee (BRC I) proposed sweeping recommendations for surgical education and training in 2004. In light of those findings, a second BRC (BRC II) was convened to make recommendations to optimize surgical training given the current landscape in medical education. METHODS: BRC II was a panel of 67 experts selected on the basis of experience and leadership in surgical education and training. It was organized into subcommittees that met virtually over the course of a year. The subcommittees, together with the Steering Committee, developed recommendations within their areas of focus and then presented them to the entire BRC II. The Delphi method was chosen to obtain consensus, defined as ≥80% agreement among the panel. Cronbach α was computed to assess the internal consistency of the 3 Delphi rounds. RESULTS: Of the 50 recommendations, 31 obtained consensus in the following aspects of surgical training (number of consensus recommendations/number proposed): Workforce (1/5); Medical Student Education (3/8); Work Life Integration (4/6); Resident Education (5/7); Goals, Structure, and Financing of Training (5/8); Education Support and Faculty Development (5/6); Research Training (7/9); and Educational Technology and Assessment (1/1). Internal consistency was good in Rounds 1 and 2 and acceptable in Round 3. CONCLUSIONS: BRC II used the Delphi approach to identify and recommend 31 priorities for surgical education in 2024. We advise establishing a multidisciplinary surgical education group to oversee, monitor, and facilitate implementation of these recommendations.
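For readers wanting to see how the consensus rule and the reliability metric in the METHODS fit together, the Python sketch below tallies ≥80% agreement per recommendation and computes Cronbach's α for a single Delphi round. The vote matrix, panel size, and binary agree/disagree coding are illustrative assumptions, not BRC II's actual data or analysis code.

```python
import numpy as np

def consensus_items(votes: np.ndarray, threshold: float = 0.80) -> np.ndarray:
    """votes: panelists x recommendations matrix of 1 (agree) / 0 (disagree).
    Returns a boolean mask of recommendations reaching >= threshold agreement."""
    agreement = votes.mean(axis=0)          # proportion agreeing per recommendation
    return agreement >= threshold

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: panelists x items matrix of ratings for one Delphi round."""
    k = scores.shape[1]                                  # number of items
    item_var = scores.var(axis=0, ddof=1).sum()          # sum of item variances
    total_var = scores.sum(axis=1).var(ddof=1)           # variance of panelists' total scores
    return (k / (k - 1)) * (1 - item_var / total_var)

if __name__ == "__main__":
    # Simulated round: 67 panelists, 50 recommendations (values are illustrative only).
    rng = np.random.default_rng(0)
    round_votes = (rng.random((67, 50)) < 0.85).astype(float)
    print("Recommendations reaching consensus:", int(consensus_items(round_votes).sum()))
    print("Cronbach's alpha:", round(cronbach_alpha(round_votes), 3))
```

A commonly used rule of thumb reads α of roughly 0.8-0.9 as good and 0.7-0.8 as acceptable, which matches the qualitative labels used in the RESULTS; with simulated independent votes the value printed here will be low.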
Subject(s)
Delphi Technique , General Surgery , United States , Humans , General Surgery/education , Education, Medical, Graduate/methods
ABSTRACT
OBJECTIVE: Among patients ≥45 years, the birth rate in the United States continues to increase. As fertility declines with age, this cohort often utilizes assisted reproductive technology, specifically in vitro fertilization (IVF). While both advancing maternal age and IVF are independently associated with adverse maternal outcomes, data regarding their additive effect are scant. This article aims to determine whether patients who conceive via IVF are at increased risk for preterm birth (PTB) compared to patients with non-IVF pregnancies in a very advanced maternal age (vAMA) cohort (≥45 years). STUDY DESIGN: Retrospective cohort study of all pregnant patients ≥45 years old who delivered at a single institution (2014-2021). Those with incomplete delivery/neonatal records or multiples beyond twins were excluded. We compared individuals who conceived via IVF to those who conceived without IVF. The primary outcome was preterm delivery <37 weeks gestation. Secondary outcomes included other adverse perinatal outcomes. Using multivariable logistic regression, we adjusted for multiple gestation as well as confounders found to be significantly different in the univariable analysis and other known risk factors for PTB. RESULTS: In our study cohort of 420 vAMA patients, individuals who underwent IVF were more likely to be older, privately insured, nulliparous, and carrying a twin gestation. The PTB rate in vAMA patients who underwent IVF was 24.4% compared to 8.4% in patients who did not use IVF (p < 0.001). After adjusting for confounders, IVF was an independent risk factor for PTB <37 weeks in vAMA patients (adjusted odds ratio [aOR] = 4.3, 95% confidence interval [CI]: 1.7-10.4, p = 0.001). In vitro fertilization was also associated with a composite of adverse maternal outcomes (hypertensive disorder of pregnancy, postpartum hemorrhage, blood transfusion, and unplanned hysterectomy) (aOR 1.7, 95% CI 1.1-2.9, p = 0.03). CONCLUSION: In the vAMA population, conception via IVF is associated with an increased risk of PTB <37 weeks. KEY POINTS: · This study examines IVF as an independent risk factor for PTB in patients ≥45 years at delivery, which has not been specifically addressed in prior studies. · In vAMA patients, use of IVF is associated with an increased risk of PTB <37 weeks. These patients also have higher rates of cesarean delivery, and neonates from IVF pregnancies are more likely to be very low birth weight or low birth weight. · Although bodies of research exist for both advanced maternal age and assisted reproductive technology, there is a paucity of data specifically on vAMA parturients who conceive via IVF.
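As a companion to the multivariable analysis described in the RESULTS, the sketch below shows one conventional way to obtain an adjusted odds ratio and 95% CI for IVF using statsmodels. The data frame, column names, and simulated values are hypothetical placeholders, not the study's dataset or its exact model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Illustrative data; variable names are hypothetical stand-ins for the study's covariates.
rng = np.random.default_rng(1)
n = 420
df = pd.DataFrame({
    "ptb": rng.integers(0, 2, n),              # preterm birth <37 weeks (outcome)
    "ivf": rng.integers(0, 2, n),              # conception via IVF (exposure)
    "twin_gestation": rng.integers(0, 2, n),   # confounders adjusted for (illustrative set)
    "nulliparous": rng.integers(0, 2, n),
    "private_insurance": rng.integers(0, 2, n),
    "maternal_age": rng.normal(46, 1.5, n),
})

X = sm.add_constant(df[["ivf", "twin_gestation", "nulliparous",
                        "private_insurance", "maternal_age"]])
model = sm.Logit(df["ptb"], X).fit(disp=0)

aor = np.exp(model.params["ivf"])                      # adjusted odds ratio for IVF
ci_low, ci_high = np.exp(model.conf_int().loc["ivf"])  # 95% confidence interval
print(f"aOR = {aor:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```

With the study's actual covariates and data, the same exponentiated coefficient and bounds would correspond to the reported aOR of 4.3 (95% CI 1.7-10.4).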
ABSTRACT
Lemon balm (Melissa officinalis L.) is commonly consumed as an herbal tea for its antioxidant health benefits. Young seedlings known as microgreens are popular for their distinct flavors and can contain higher mineral content on a dry weight basis compared to their adult counterparts. However, the use of microgreens for herbal teas has not been previously investigated. In this study, lemon balm was grown to adult and microgreen harvest stages and prepared as herbal teas by brewing with boiled (100 °C) water for 5 minutes and room temperature water (22 °C) for 2 hours. The effects of harvest time and brewing method on the mineral content, phenolic compounds, and antioxidant capacity of lemon balm herbal teas were assessed. Results showed that adult lemon balm tea contained higher total phenolics, total flavonoids, rosmarinic acid, and antioxidant capacity than microgreen teas, with hot preparations containing the highest amounts (p ≤ 0.05). In contrast, microgreen lemon balm teas contained higher amounts of minerals (p ≤ 0.05), including calcium, potassium, magnesium, sodium, phosphorus, copper, and zinc. In general, brewing conditions did not impact the content of most minerals. Overall, the results support the potential of using dried microgreens as herbal teas. Microgreen lemon balm teas prepared hot and cold offer antioxidant compounds and are richer sources of minerals than adult teas. The ease of growth for microgreens offers consumers the opportunity for home preparation of a novel herbal tea beverage.
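Because the design crosses harvest stage with brewing method, a two-way ANOVA at α = 0.05 is a natural way to frame the comparisons reported here; the sketch below shows that framing with statsmodels. The simulated rosmarinic acid values and effect sizes are purely illustrative and do not reproduce the study's measurements.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical factorial data: 2 harvest stages x 2 brewing methods, 6 replicates each.
rng = np.random.default_rng(2)
stage = np.repeat(["adult", "microgreen"], 12)
brew = np.tile(np.repeat(["hot_5min", "cold_2h"], 6), 2)
rosmarinic = rng.normal(10, 1, 24) + (stage == "adult") * 4 + (brew == "hot_5min") * 2
df = pd.DataFrame({"stage": stage, "brew": brew, "rosmarinic": rosmarinic})

# Two-way ANOVA with interaction; main effects tested at alpha = 0.05 as in the abstract.
model = ols("rosmarinic ~ C(stage) * C(brew)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```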
Subject(s)
Melissa , Teas, Herbal , Antioxidants/analysis , Plant Extracts/pharmacology , Phenols/analysis , Minerals
ABSTRACT
OBJECTIVES: The aim of the study is to evaluate how current management of Category II fetal heart rate tracings compares with that suggested by a published algorithm and whether these differences lead to disparate neonatal outcomes. STUDY DESIGN: This is a retrospective observational study from the resident service at an academic-community tertiary care center from 2013 to 2018. We reviewed archived fetal heart rate tracings from patients with cesarean delivery performed for nonreassuring fetal heart rate tracing and interpreted tracings against the algorithm. We assigned tracings to one of three categories: Group A, consistent; Group B, inconsistent too early (algorithm permits the patient to labor longer); Group C, inconsistent too late (algorithm suggests performing the cesarean delivery sooner). Maternal demographics, features of labor, and neonatal outcomes were compared. RESULTS: Of the 110 cases, 27 (24.5%) had a cesarean delivery performed in group A, 49 (44.5%) in group B, and 34 (30.9%) in group C. Baseline characteristics were similar. Of the 49 in group B, 46 (93.9%) violated the algorithm at the same branchpoint. In group C, cesarean deliveries would have been performed on average 244 minutes earlier had the algorithm been used. Neonatal outcomes were not significantly different among the groups, including 5-minute Apgar <7, pH <7.1, and NICU admission. CONCLUSION: Our retrospective application of the algorithm showed that 44.5% of patients who have cesarean delivery for nonreassuring fetal heart rate tracing may be able to labor longer and that violation at a common decision point on the algorithm (moderate variability or accelerations, but a lack of recurrent decelerations) is responsible for nearly all such cesarean deliveries. More studies are needed to evaluate if cesarean delivery rates for nonreassuring fetal heart rate tracing can be reduced without impacting neonatal outcomes using the algorithm. KEY POINTS: · There is a potential to further standardize management of Category II fetal heart rate tracings. · In our practice, 25% of cesareans performed for fetal distress were consistent with the algorithm. · A subset of patients (45%) with cesarean for fetal distress may have been able to labor longer.
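The CONCLUSION attributes most "too early" cesareans to a single decision point, so a minimal sketch of just that branchpoint may help readers see the logic. This is a heavily simplified stand-in, not the full published Category II management algorithm, and the feature names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TracingFeatures:
    # Hypothetical summary features abstracted from a Category II tracing
    moderate_variability: bool
    accelerations: bool
    recurrent_decelerations: bool

def branchpoint_allows_continued_labor(t: TracingFeatures) -> bool:
    """Simplified version of the single decision point highlighted in the abstract:
    with moderate variability or accelerations and no recurrent decelerations,
    the algorithm permits labor to continue rather than proceeding to cesarean."""
    return (t.moderate_variability or t.accelerations) and not t.recurrent_decelerations

# Example: accelerations present, no recurrent decelerations -> labor may continue,
# so a cesarean performed at this point would fall into a group B-type case.
print(branchpoint_allows_continued_labor(
    TracingFeatures(moderate_variability=False, accelerations=True,
                    recurrent_decelerations=False)))
```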
Subject(s)
Fetal Distress , Heart Rate, Fetal , Algorithms , Cesarean Section , Female , Humans , Infant, Newborn , Pregnancy , Retrospective Studies
ABSTRACT
OBJECTIVE: We compare maternal morbidity and clinical care metrics before and after the electronic implementation of a maternal early warning trigger (MEWT) tool. STUDY DESIGN: This is a study of maternal morbidity and clinical care within three linked hospitals comparing 1 year before and after electronic MEWT implementation. We compare severe maternal morbidity overall as well as within the subcategories of hemorrhage, hypertension, cardiopulmonary, and sepsis, in addition to relevant process metrics in each category. We describe the MEWT trigger rate in addition to MEWT sensitivity and specificity for morbidity overall and by morbidity type. RESULTS: The morbidity rate increased from 1.6 per 100 deliveries in the pre-MEWT period to 2.06 per 100 deliveries in the post-MEWT period (incidence rate ratio = 1.28, p = 0.018); however, in cases of septic morbidity, time to appropriate antibiotics decreased (pre-MEWT: 1.87 hours [1.11-2.63] vs. post-MEWT: 0.75 hours [0.31-1.19], p = 0.036), and in cases of hypertensive morbidity, the proportion of cases treated with appropriate antihypertensive medication within 60 minutes improved (pre-MEWT: 62% vs. post-MEWT: 83%, p = 0.040). The MEWT trigger rate was 2.3%, ranging from 0.8% in the less acute centers to 2.9% in our tertiary center. The MEWT sensitivity for morbidity overall was 50%; detection of hemorrhage morbidity was lowest (30%), whereas sensitivity was 69% for septic morbidity, 74% for cardiopulmonary morbidity, and 82% for hypertensive morbidity. CONCLUSION: Overall, maternal morbidity did not decrease after implementation of the MEWT system; however, important clinical metrics such as time to antibiotics and antihypertensive care improved. We suspect increased morbidity was related to annual variation and unexpectedly low morbidity in the pre-MEWT comparison year. Because MEWT sensitivity for hemorrhage was low, and because hemorrhage dominates administrative metrics of morbidity, process metrics around sepsis, hypertension, and cardiopulmonary morbidity are important to track as markers of MEWT efficacy. KEY POINTS: · MEWT was not associated with a decrease in maternal morbidity. · MEWT was associated with improvements in some clinical care metrics. · MEWT is more sensitive in detecting septic, hypertensive, and cardiopulmonary morbidities than hemorrhage morbidity.
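To make the RESULTS metrics concrete, the sketch below computes sensitivity and specificity from per-delivery trigger and morbidity flags, and an incidence rate ratio from pre/post counts. The 0/1 encoding and all counts are assumptions for illustration, not the study's records; the example IRR differs slightly from the reported 1.28 because the published rates are rounded.

```python
import numpy as np

def sensitivity_specificity(trigger: np.ndarray, morbidity: np.ndarray):
    """trigger, morbidity: 0/1 arrays per delivery (hypothetical encoding)."""
    tp = np.sum((trigger == 1) & (morbidity == 1))
    fn = np.sum((trigger == 0) & (morbidity == 1))
    tn = np.sum((trigger == 0) & (morbidity == 0))
    fp = np.sum((trigger == 1) & (morbidity == 0))
    return tp / (tp + fn), tn / (tn + fp)

def incidence_rate_ratio(events_post, deliveries_post, events_pre, deliveries_pre):
    """Ratio of post- to pre-MEWT morbidity rates per delivery."""
    return (events_post / deliveries_post) / (events_pre / deliveries_pre)

# Tiny made-up example of trigger vs. morbidity flags
trigger = np.array([1, 1, 0, 0, 1, 0])
morbidity = np.array([1, 0, 1, 0, 0, 0])
sens, spec = sensitivity_specificity(trigger, morbidity)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")

# Rates of 2.06 vs. 1.6 per 100 deliveries give ~1.29; the published 1.28 uses unrounded counts.
print(round(incidence_rate_ratio(206, 10000, 160, 10000), 2))
```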
Subject(s)
Early Diagnosis , Medical Records Systems, Computerized , Pregnancy Complications/diagnosis , California/epidemiology , Critical Pathways , Female , Hemorrhage/diagnosis , Humans , Hypertension, Pregnancy-Induced/diagnosis , Maternal Mortality/trends , Morbidity , Pregnancy , Pregnancy Complications/epidemiology , Pregnancy Complications/prevention & control , ROC Curve , Time-to-Treatment
ABSTRACT
Selenium supplementation in humans has been suggested for the prevention of chronic diseases including cardiovascular disease, cancer, and neurodegenerative diseases. Selenium biofortification of plants has been explored as a method for increasing the selenium content of food and dietary selenium intake in humans. However, the effects of selenium biofortification on other dietary nutrients are often only a secondary discussion. These effects are especially important to explore considering that selenium-biofortified foods contain many other nutrients important to human health, such as other minerals and antioxidant compounds, which can make these foods superior to selenium supplementation alone. Investigation of selenium biofortification's effect on these nutrients is necessary for a comprehensive human nutrition perspective on biofortification strategies. This review considers the effects of selenium biofortification on selenium content, other minerals, and antioxidant compounds as they pertain to human health in order to suggest optimal strategies for biofortification. Pre-clinical and clinical studies assessing the effects of consumption of selenium-biofortified foods are also discussed.
Subject(s)
Biofortification , Selenium , Antioxidants , Crops, Agricultural , Food, Fortified , Humans , Nutrients
ABSTRACT
The objective of the study was to determine whether ADORA2A or PER3 polymorphisms contribute to individual responsivity to sleep restriction. Nineteen healthy adults (ages 18-39, 11 males, 8 females) underwent sleep restriction (SR) consisting of seven nights of 3 h time in bed (TIB) (04:00-07:00). SR was preceded by seven in-laboratory nights of 10 h TIB (21:00-07:00) and followed by three nights of 8 h TIB (23:00-07:00). Volunteers underwent psychomotor vigilance, objective alertness, and subjective sleepiness assessments throughout. Volunteers were genotyped for the PER3 VNTR polymorphism (PER3(4/4) n = 7; PER3(4/5) n = 10; PER3(5/5) n = 2) and the ADORA2A c.1083T>C polymorphism (ADORA2A(C/T) n = 9; ADORA2A(T/T) n = 9; ADORA2A(C/C) n = 1) using polymerase chain reaction (PCR). Separate mixed-model ANOVAs were used to assess the contributions of the ADORA2A and PER3 polymorphisms. Results showed that PER3(4/4) and ADORA2A(C/T) individuals expressed greater behavioral resiliency to SR compared to PER3(4/5) and ADORA2A(T/T) individuals. Our findings contrast with previously reported non-significant effects for the PER3 polymorphism under a less challenging sleep restriction regimen (4 h TIB per night for five nights). We conclude that PER3 and ADORA2A polymorphisms become more behaviorally salient with increasing severity and/or duration of sleep restriction (based on psychomotor vigilance). Given the small sample size, these results are preliminary and require replication.
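For readers unfamiliar with how a genotype-by-repeated-measures question like this is typically set up, here is a sketch of a mixed-effects model (genotype and study day as fixed effects, a random intercept per subject) using statsmodels. The long-format data, genotype frequencies, and simulated lapse counts are illustrative assumptions, not the study's data or its exact mixed-model ANOVA specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: repeated PVT lapse counts per subject per restriction day.
rng = np.random.default_rng(3)
subjects = [f"s{i}" for i in range(19)]
genotype = dict(zip(subjects, rng.choice(["PER3_4/4", "PER3_4/5", "PER3_5/5"],
                                          size=19, p=[0.37, 0.53, 0.10])))
rows = []
for s in subjects:
    for day in range(1, 8):                          # seven nights of sleep restriction
        lapses = rng.poisson(3 + day + (genotype[s] != "PER3_4/4") * 2)
        rows.append({"subject": s, "day": day, "genotype": genotype[s], "lapses": lapses})
df = pd.DataFrame(rows)

# Mixed-effects model: genotype and day as fixed effects, random intercept per subject.
model = smf.mixedlm("lapses ~ C(genotype) + day", df, groups=df["subject"]).fit()
print(model.summary())
```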
Subject(s)
Period Circadian Proteins/genetics , Polymorphism, Single Nucleotide/genetics , Psychomotor Performance/physiology , Receptor, Adenosine A2A/genetics , Sleep Deprivation/genetics , Adolescent , Adult , Arousal/physiology , Female , Genotype , Humans , Male , Period Circadian Proteins/physiology , Sleep Deprivation/physiopathology , Wakefulness/genetics , Wakefulness/physiology , Young Adult
ABSTRACT
Naps are an effective strategy for maintaining alertness and cognitive performance; however, upon abrupt wakening from naps, sleep inertia (temporary performance degradation) may ensue. In the present study, attenuation of post-nap sleep inertia was attempted by administration of caffeine gum. Using a double-blind, placebo-controlled crossover design, 15 healthy, non-smoking adults were awakened at 1 hr. and again at 6 hr. after lights out (0100 and 0600, respectively) and were immediately administered a gum pellet containing 100 mg of caffeine or placebo. A 5-min. psychomotor vigilance task was administered at 0 min., 6 min., 12 min., and 18 min. post-awakening. At 0100, response speed with caffeine was significantly better at 12 min. and 18 min. post-awakening compared to placebo; at 0600, caffeine's effects were evident at 18 min. post-awakening. Caffeinated gum is a viable means of rapidly attenuating sleep inertia, suggesting that the adenosine receptor system is involved in sleep maintenance.
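Because the design is a within-subject crossover (each volunteer receives caffeine gum and placebo), the key comparisons are paired; the sketch below illustrates a paired t-test on response speed at one post-awakening time point. The simulated values and the choice of a simple paired t-test are assumptions for illustration, not the study's actual data or analysis.

```python
import numpy as np
from scipy import stats

# Hypothetical paired response speeds (1/RT) at 18 min post-awakening for the same
# 15 volunteers under placebo and caffeine conditions (crossover design).
rng = np.random.default_rng(4)
placebo = rng.normal(3.2, 0.4, 15)
caffeine = placebo + rng.normal(0.3, 0.2, 15)   # assumed caffeine benefit, for illustration

t_stat, p_value = stats.ttest_rel(caffeine, placebo)
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
```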
Subject(s)
Caffeine/pharmacology , Wakefulness/drug effects , Administration, Oral , Adult , Caffeine/administration & dosage , Cross-Over Studies , Double-Blind Method , Female , Humans , Male , Neuropsychological Tests , Sleep , Sleep Stages/drug effects , Sleep Stages/physiology , Time Factors , Treatment Outcome , Young Adult
ABSTRACT
There is a higher prevalence of epilepsy and SUDEP in people with intellectual disability (PwID) compared to the general population. Accurate seizure recording, particularly at night, can be challenging in PwID. Neuro Event Labs seizure monitoring (Nelli) uses high-quality video-based artificial intelligence to detect and record possible nocturnal seizures. This study evaluated the clinical utility and acceptability of Nelli in PwID and epilepsy. Families/carers of PwID and drug-resistant epilepsy with suspected nocturnal seizures who had not tolerated routine or ambulatory EEG were invited to evaluate Nelli. Relevant demographics and clinical characteristics were collected. Nelli's impact, its facilitators, barriers, and feedback quality were captured from patient and professional stakeholders. Quantitative and thematic analyses were undertaken. Fifteen PwID and epilepsy and four health professionals were involved. Nelli recorded 707 possible seizure events across the study cohort, of which 247 were not heard or recognised by carers. Carers recorded 165 episodes of 'restless' or 'seizure' behaviour which Nelli did not deem to be seizures. Acceptability was 93%. Thematic analysis revealed three broad themes: device acceptability, result implementation, and possible seizure recognition ability. Nelli allowed for improved communication and care planning in a hitherto difficult-to-investigate population.
ABSTRACT
Uterine necrosis is an infrequent event and is most commonly reported as a complication of interventions for postpartum haemorrhage management. Cases of uterine necrosis in pregnancy are rare. The mainstay of treatment for uterine necrosis is hysterectomy, and the data regarding conservative management are limited. A gravida 3, para 2 presented at 33 weeks gestation with ovarian torsion and underwent an exploratory laparotomy with ovarian cystectomy. The surgery was complicated by excess bleeding, which was controlled with the placement of sutures along the uterine body. She had multiple subsequent presentations for severe abdominal pain without clear aetiology. Four weeks after the initial surgery, she underwent caesarean delivery, at which time uterine necrosis was diagnosed. Her uterus was preserved. She received postoperative intravenous antibiotics and was closely observed. She continued to do well 10 months postpartum. In patients with uterine necrosis during pregnancy who are haemodynamically stable, conservative management may be an option.
Subject(s)
Laparotomy , Postpartum Hemorrhage , Pregnancy , Female , Humans , Uterus/surgery , Postpartum Hemorrhage/surgery , Sutures , Necrosis/surgery
ABSTRACT
Objective: To identify maternal and/or fetal characteristics associated with delivery within seven days for patients who present with vaginal bleeding in the antepartum period. Methods: This is a retrospective chart review performed at a community-academic tertiary care center. Three hundred and twenty-two consecutive charts associated with admission for vaginal bleeding during pregnancy between January 2015 and May 2020 were reviewed. One hundred and twenty-six women were included based on singleton gestation, gestational age 24 0/7 - 36 6/7 weeks, self-limited vaginal bleeding, vital sign stability (blood pressure >100/60 mmHg, heart rate >60 beats per minute, respiratory rate <20 breaths per minute), absence of signs of labor, and no known placenta previa/accreta, recent vaginal intercourse, or trauma. Patient demographic and clinical characteristics were compared using Fisher's exact and two-sample t-tests, as appropriate. Univariate and multivariate logistic regression models were fitted to predict delivery within 7 days. Results: Thirty-four percent of women who presented with light vaginal bleeding delivered within seven days, with a mean of 2.6 days (n = 44/126). Patients without evidence of labor but with a sterile vaginal exam (SVE) >2 cm on admission were 14 times more likely to deliver within 7 days than those with SVE ≤ 2 cm (AOR 14.49, 95% CI 3.33-63.03); however, 35.2% of women with SVE ≤ 2 cm still delivered in this timeframe (n = 12/34). Of the 59 patients who had cervical lengths (CL) performed, those with CL ≤2.5 cm were 4.22 times more likely to deliver within 7 days (OR 4.22, 95% CI 1.10-16.20). Seventy-eight percent of the patients who had CL >2.5 cm and SVE 0-1 cm went on to deliver >14 days from their initial bleeds (n = 18/23). Conclusion: Patients who present with self-limited vaginal bleeding and SVE >2 cm should be admitted for antenatal steroids. Prolonged inpatient observation beyond the typical steroid window of 48-72 h should be dependent on the individual patient. Given that CL ≤2.5 cm and regular contractions are known risk factors for preterm delivery, these characteristics alone may also warrant extended inpatient observation, though even in conjunction with vaginal bleeding, neither was a significant predictor for delivery in our study. In contrast, the majority of patients with vaginal bleeding and SVE <2 cm delivered >14 days after their initial bleeds and are likely eligible for shorter periods of observation.
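The Methods mention Fisher's exact tests for categorical comparisons; the sketch below applies that test to a 2x2 table of admission cervical dilation versus delivery within 7 days. The cell counts are invented for illustration and do not correspond to the study's table or its adjusted regression estimates.

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 table: rows = SVE on admission (>2 cm vs. <=2 cm),
# columns = delivered within 7 days (yes, no); counts are illustrative only.
table = np.array([[20,  8],     # SVE > 2 cm
                  [24, 74]])    # SVE <= 2 cm

odds_ratio, p_value = stats.fisher_exact(table)
print(f"unadjusted OR = {odds_ratio:.1f}, Fisher's exact p = {p_value:.4f}")
```

The study's reported AOR of 14.49 comes from a multivariable logistic regression rather than this unadjusted 2x2 comparison.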
Subject(s)
Placenta Accreta , Placenta Previa , Premature Birth , Infant, Newborn , Female , Humans , Pregnancy , Retrospective Studies , Uterine Hemorrhage/etiology , Uterine Hemorrhage/complications , Placenta Accreta/diagnosis , Premature Birth/etiology
ABSTRACT
PROBLEM: We evaluated eculizumab, a complement protein C5 inhibitor, for treatment of severe COVID-19 in pregnant and postpartum individuals. METHOD OF STUDY: Protocol ECU-COV-401 (clinicaltrials.gov NCT04355494) is an open-label, multicenter, Expanded Access Program (EAP) evaluating eculizumab for treatment of severe COVID-19. Participants enrolled at our center from August 2020 to February 2021. Hospitalized patients were eligible if they had severe COVID-19 with bilateral pulmonary infiltrates and an oxygen requirement. Eculizumab was administered on Day 1 (1200 mg IV) with additional doses if still hospitalized (1200 mg IV on Days 4 and 8; 900 mg IV on Days 15 and 22; optional doses on Days 12 and 18). The primary outcome was survival at Day 15. Secondary outcomes included survival at Day 29, need for mechanical ventilation, and duration of hospital stay. We evaluated pharmacokinetic and pharmacodynamic data, safety, and adverse outcomes. RESULTS: Eight participants were enrolled at the Cedars-Sinai Medical Center, six during pregnancy (mean 30 ± 4.0 weeks) and two in the postpartum period. Baseline oxygen requirement ranged from 2 L/min by nasal cannula to 12 L/min by non-rebreather mask. The median number of doses of eculizumab was 2 (range 1-3); the median time to hospital discharge was 5.5 days (range 3-12). All participants met the primary outcome of survival at Day 15, and all were alive and free of mechanical ventilation at Day 29. In three participants we demonstrated that free C5 and soluble C5b-9 levels decreased following treatment. There were no serious adverse maternal or neonatal events attributed to eculizumab at 3 months. CONCLUSION: We describe the use of eculizumab to treat severe COVID-19 in a small series of pregnant and postpartum adults. A larger, controlled study in pregnancy is indicated.
Subject(s)
Antibodies, Monoclonal, Humanized , COVID-19 Drug Treatment , Adult , Antibodies, Monoclonal, Humanized/therapeutic use , Complement System Proteins , Female , Humans , Infant, Newborn , Oxygen , Pregnancy , SARS-CoV-2 , Treatment Outcome
ABSTRACT
Selenium biofortification of plants has been suggested as a method of enhancing dietary selenium intake to prevent deficiency and chronic disease in humans, while avoiding toxic levels of intake. Popular herbs such as basil (Ocimum basilicum L.), cilantro (Coriandrum sativum L.), and scallions (Allium fistulosum L.) present an opportunity for biofortification, as these plants are used to add flavor to meals and are available as microgreens, young plants that are increasingly popular in the consumer marketplace. In this study, basil, cilantro, and scallion microgreens were biofortified with sodium selenate under hydroponic conditions at various selenium concentrations to investigate the effects on yield, selenium content, other mineral contents (i.e., sodium, potassium, calcium, magnesium, phosphorus, copper, zinc, iron, manganese, sulfur, and boron), total phenol content, and antioxidant capacity [oxygen radical absorbance capacity (ORAC)]. The results showed that selenium content increased significantly at all concentrations, with scallions demonstrating the largest increase. The effects on other minerals varied among herb species. Antioxidant capacity and total phenol content increased in all herbs at the highest selenium treatments, but basil and scallions demonstrated a decreased crop yield. Overall, these biofortified culinary herb microgreens are an ideal functional food for enhancing selenium, other dietary minerals, and antioxidants to benefit human health.
ABSTRACT
BACKGROUND: High-quality workplace-based assessments are essential for competency-based surgical education. We explored education leaders' perceptions regarding faculty competence in assessment. METHODS: Surgical education leaders were surveyed regarding the areas in which faculty needed improvement and their knowledge of assessment tools. Respondents were queried on specific skills regarding (a) importance in resident/medical student education and (b) competence of faculty in assessment and feedback. RESULTS: Surveys (n = 636) were emailed and 103 were returned. Respondents indicated that most faculty needed improvement in verbal (86%) and written (83%) feedback, assessing operative skill (49%), and assessing preparation for procedures (50%). Cholecystectomy, trauma laparotomy, and inguinal herniorrhaphy were rated "very-extremely important" in resident education (99%), but 21-24% thought faculty "moderately to not-at-all" competent in assessment. This gap was larger for non-technical skills. Regarding assessment tools, 56% used OSATS and 49% used Zwisch; most were unfamiliar with all non-technical tools. SUMMARY: These data demonstrate a significant perceived gap in faculty competence in assessment and feedback, as well as unfamiliarity with assessment tools. This can inform faculty development to support competency-based surgical education.
Subject(s)
Competency-Based Education , Educational Measurement/methods , Faculty, Medical , General Surgery/education , Professional Competence , Staff Development , Education, Medical, Graduate , Feedback , Humans , Internship and Residency , Surveys and Questionnaires
ABSTRACT
The insulin-like growth factor I (IGF-I) signaling pathway has been shown to play an important role in several aspects of cancer biology, including metastasis. The aim of this study was to define the contribution of serum (endocrine) and local (tumour microenvironment) IGF-I to osteosarcoma tumour growth and metastasis, a cancer that is known to be dependent on the IGF-I axis. To address this question, we evaluated the primary tumour growth and metastatic progression of K7M2 murine osteosarcoma cells injected into a genetically engineered mouse [liver-specific IGF-I deficient (LID)] in which serum IGF-I levels are reduced by 75% while expression of IGF-I in normal tissues is maintained. We first demonstrated that IGF-I in the tumour and the tumour microenvironment was maintained in the LID mice. Within this model, there was no difference in primary tumour growth or in pulmonary metastasis in LID mice compared to control mice. Furthermore, there was no difference in the number or localization of single metastatic cells immediately after their arrival in the lungs of LID mice and control mice, as analysed by single-cell video microscopy. Collectively, these data suggest that a marked reduction in serum IGF-I is not sufficient to slow progression in either primary or metastatic models of osteosarcoma.
Subject(s)
Bone Neoplasms/pathology , Insulin-Like Growth Factor I/physiology , Lung Neoplasms/secondary , Osteosarcoma/secondary , Animals , Bone Neoplasms/blood , Bone Neoplasms/genetics , Cell Proliferation , Disease Progression , Female , Humans , Integrases/metabolism , Liver/metabolism , Liver/pathology , Lung/metabolism , Lung/pathology , Lung Neoplasms/blood , Lung Neoplasms/metabolism , Male , Mice , Mice, Inbred BALB C , Mice, Knockout , Osteosarcoma/blood , Osteosarcoma/genetics , Phosphorylation , Proto-Oncogene Proteins c-akt/metabolism , Receptor, IGF Type 1/metabolism , Tumor Burden , Tumor Cells, Cultured
ABSTRACT
The authors' goal is to review the current recommendations for optimizing cardiovascular health beginning in the adolescent years and continuing into adulthood, and to expand on the implications that pregnancy complications may have for future cardiovascular health. Attention to cardiac health begins in adolescence; however, most young patients are not screened. Pregnancy, with its increased cardiovascular demands and host of antepartum cardiopulmonary complications, may provide a window into future cardiac health. The distinct shift in cardiac risk that occurs once a woman enters menopause is largely ignored in routine screening guidelines.
Subject(s)
Cardiovascular Diseases/prevention & control , Women's Health , Adolescent , Adult , Aged , Female , Health Behavior , Humans , Mass Screening , Menopause , Middle Aged , Postmenopause , Pre-Eclampsia , Pregnancy , Pregnancy Complications , Risk Factors , Young Adult
ABSTRACT
BACKGROUND AND PURPOSE: Insufficient sleep can adversely affect a variety of cognitive abilities, ranging from simple alertness to higher-order executive functions. Although the effects of sleep loss on mood and cognition are well documented, there have been no controlled studies examining its effects on perceived emotional intelligence (EQ) and constructive thinking, abilities that require the integration of affect and cognition and are central to adaptive functioning. PATIENTS AND METHODS: Twenty-six healthy volunteers completed the Bar-On Emotional Quotient Inventory (EQi) and the Constructive Thinking Inventory (CTI) at rested baseline and again after 55.5 and 58 h of continuous wakefulness, respectively. RESULTS: Relative to baseline, sleep deprivation was associated with lower scores on Total EQ (decreased global emotional intelligence), Intrapersonal functioning (reduced self-regard, assertiveness, sense of independence, and self-actualization), Interpersonal functioning (reduced empathy toward others and quality of interpersonal relationships), Stress Management skills (reduced impulse control and difficulty with delay of gratification), and Behavioral Coping (reduced positive thinking and action orientation). Esoteric Thinking (greater reliance on formal superstitions and magical thinking processes) was increased. CONCLUSIONS: These findings are consistent with the neurobehavioral model suggesting that sleep loss produces temporary changes in cerebral metabolism, cognition, emotion, and behavior consistent with mild prefrontal lobe dysfunction.
Subject(s)
Adaptation, Psychological , Awareness , Emotions , Problem Solving , Sleep Deprivation/psychology , Thinking , Adaptation, Psychological/drug effects , Adolescent , Adult , Assertiveness , Caffeine/administration & dosage , Culture , Defense Mechanisms , Double-Blind Method , Empathy , Female , Humans , Internal-External Control , Interpersonal Relations , Male , Personality Inventory , Problem Solving/drug effects , Self Concept , Sleep Deprivation/drug therapy , Superstitions/psychology , Young Adult
ABSTRACT
To develop a model incorporating relevant prognostic biomarkers for untreated chronic lymphocytic leukemia patients, we re-analyzed the raw data from four published gene expression profiling studies. We selected 88 candidate biomarkers linked to immunoglobulin heavy-chain variable region gene (IgV(H)) mutation status and produced a reliable and reproducible microfluidics quantitative real-time polymerase chain reaction array. We applied this array to a training set of 29 purified samples from previously untreated patients. In an unsupervised analysis, the samples clustered into two groups. Using a cutoff point of 2% homology to the germline IgV(H) sequence, one group contained all 14 IgV(H)-unmutated samples; the other contained all 15 mutated samples. We confirmed the differential expression of 37 of the candidate biomarkers using two-sample t-tests. Next, we constructed 16 different models to predict IgV(H) mutation status and evaluated their performance on an independent test set of 20 new samples. Nine models correctly classified 11 of 11 IgV(H)-mutated cases and eight of nine IgV(H)-unmutated cases, with some models using three to seven genes. Thus, we can classify cases with 95% accuracy based on the expression of as few as three genes.
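The classification step described here (predicting IgV(H) mutation status from as few as three genes, then checking accuracy on an independent test set) can be sketched generically with scikit-learn. The random expression matrices, the univariate gene selection, and the logistic-regression classifier below are stand-ins; the paper evaluated 16 different models, and none of its specific genes or model choices are reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline

# Hypothetical expression matrices: rows = patients, columns = the 88 candidate genes.
rng = np.random.default_rng(5)
X_train, y_train = rng.normal(size=(29, 88)), rng.integers(0, 2, 29)   # training set
X_test,  y_test  = rng.normal(size=(20, 88)), rng.integers(0, 2, 20)   # independent test set

# Select a handful of genes (here 3) and fit a simple classifier, mirroring the idea of
# predicting IgV(H) mutation status from as few as three genes.
clf = make_pipeline(SelectKBest(f_classif, k=3), LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)

# With random data this accuracy will be near chance; the paper reports ~95% on real
# expression data from an independent test set.
print("test-set accuracy:", clf.score(X_test, y_test))
```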
Subject(s)
Biomarkers, Tumor/genetics , Immunoglobulin Heavy Chains/genetics , Immunoglobulin Variable Region/genetics , Leukemia, Lymphocytic, Chronic, B-Cell/genetics , Microfluidics/methods , Mutation/genetics , Polymerase Chain Reaction/methods , Cluster Analysis , Gene Expression Profiling , Genetic Markers , Humans , Models, Genetic , Reproducibility of Results
ABSTRACT
OBJECTIVE: The American College of Surgeons (ACS) appointed a committee of leaders from the ACS, Association of Program Directors in Surgery, Accreditation Council for Graduate Medical Education, and American Board of Surgery to define key challenges facing surgery resident training programs and to explore solutions. The committee wanted to solicit the perspectives of surgery resident program directors (PDs) given their pivotal role in residency training. DESIGN: Two surveys were developed, pilot tested, and administered to PDs following Institutional Review Board approval. PDs from 247 Accreditation Council for Graduate Medical Education-accredited general surgery programs were randomized to receive 1 of the 2 surveys. Bias analyses were conducted, and adjusted Pearson χ2 tests were used to test for differences in response patterns by program type and size. SETTING: All accredited general surgery programs in the United States were included in the sampling frame of the survey; 10 programs with initial or withdrawn accreditation were excluded from the sampling frame. PARTICIPANTS: A total of 135 PDs responded, resulting in a 54.7% response rate (Survey A: n = 67 and Survey B: n = 68). The respondent sample was determined to be representative of program type and size. RESULTS: Nearly 52% of PD responses were from university-based programs, and 41% had over 6 residents per graduating cohort. More than 61% of PDs reported that, compared to 10 years ago, both entering and graduating residents are less prepared in technical skills. PDs expressed significant concerns regarding the effect of duty-hour restrictions on the overall preparation of graduating residents (61%) and quality of patient care (57%). The current 5-year training structure was viewed as needing a significant or extensive increase in opportunities for resident autonomy (63%), and the greatest barriers to resident autonomy were viewed to be patient preferences not to be cared for by residents (68%), liability concerns (68%), and Centers for Medicare and Medicaid Services regulations (65%). Although 64% of PDs believe that moderate or significant changes are needed in the current structure of residency training, 35% believe that no changes in the structure are needed. When asked for their 1 best recommendation regarding the structure of surgical residency, only 22% of PDs selected retaining the current 5-year structure. The greatest percentage of PDs (28%) selected the "4 + 2" model as their 1 best recommendation for the structure to be used. In the area of faculty development, 56% of PDs supported a significant or extensive increase in Train the Teacher programs, and 41% supported a significant or extensive increase in faculty certification in education. CONCLUSIONS: Information regarding the valuable perspectives of PDs gathered through these surveys should help in implementing important changes in residency training and faculty development. These efforts will need to be pursued collaboratively with involvement of key stakeholders, including the organizations represented on this ACS committee.
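A plain Pearson chi-square on a cross-tabulation of program characteristics versus responses illustrates the kind of comparison the METHODS describe. Note that the study used adjusted Pearson χ2 tests appropriate to its survey design, and the table below is an invented example rather than survey data.

```python
import numpy as np
from scipy import stats

# Hypothetical cross-tabulation: rows = program type (university-based vs. other),
# columns = whether the PD endorsed significant change to the training structure (yes, no).
contingency = np.array([[40, 30],
                        [35, 30]])

chi2, p, dof, expected = stats.chi2_contingency(contingency)
print(f"Pearson chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")
```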
Subject(s)
Accreditation , Education, Medical, Graduate/organization & administration , General Surgery/education , Internship and Residency/organization & administration , Quality Improvement , Adult , Cross-Sectional Studies , Faculty, Medical/organization & administration , Feedback , Female , Humans , Male , Program Development , Program Evaluation , Societies, Medical , United States
ABSTRACT
An online survey was distributed via snowball sampling and resulted in responses from 61 gay fathers raising children in 2 states. Fathers reported on the barriers they experienced and the pathways they took to becoming parents. They also reported on experiences of stigma directed at them and their children, especially from family members, friends, and people in religious institutions. Despite these difficulties, they reported that they engaged actively in parenting activities and that their child(ren)'s well-being was consistent with national samples.