ABSTRACT
BACKGROUND: An increasing number of studies in recent years have investigated various dietary and lifestyle patterns and associated breast cancer (BC) risk. OBJECTIVES: This study aimed to comprehensively synthesize and grade the evidence on dietary and lifestyle patterns and BC risk. METHODS: Databases were systematically searched up to 31 March, 2022, for evidence from RCTs and prospective cohort studies on adherence to a dietary pattern alone or in combination with lifestyle behaviors and incidence of or mortality from primary BC in adult females. Findings in all, premenopausal, and postmenopausal females were descriptively synthesized rather than meta-analyzed because of the heterogeneity of the patterns. An independent Global Cancer Update Programme Expert Panel graded the strength of the evidence. RESULTS: A total of 84 publications were included. Results for patterns reflecting both a healthy diet and lifestyle were more consistent than for patterns that included diet only. There was strong-probable evidence that a priori World Cancer Research Fund/American Institute for Cancer Research (WCRF/AICR) and American Cancer Society (ACS) dietary and lifestyle scores may reduce BC risk in all and postmenopausal females, whereas in premenopausal females, less evidence was found, contributing to a limited-suggestive grade. There was also limited-suggestive evidence that adherence to the Healthy Lifestyle Index and other diet and lifestyle scores may reduce BC risk in postmenopausal females; that a posteriori Western/meat/alcohol dietary patterns may increase BC risk in postmenopausal females; and that prudent/vegetarian/Mediterranean dietary patterns may reduce BC risk in all females. For the remaining patterns, evidence was graded as limited-no conclusions.
CONCLUSIONS: Advice to adopt combined aspects of a healthy diet and lifestyle according to WCRF/AICR and ACS scores, encouraging a healthy weight, physical activity, alcohol and smoking avoidance, and a healthy diet rich in fruits, vegetables, (whole)grains and cereals and discouraging red and processed meat, can be proposed to females to lower BC risk. This review was registered at PROSPERO as ID CRD42021270129 (https://www.crd.york.ac.uk/prospero/display_record.php?ID=CRD42021270129) on 28 August, 2021, and further updated on 4 May, 2022, in order to extend the search period.
ABSTRACT
PURPOSE: To examine eating frequency, timing of meals, and sleep duration before and after a weight loss intervention for breast cancer survivors. METHODS: Female breast cancer survivors (n = 159; 55 ± 9 years; 31.4 ± 5.0 kg/m²; stage I-III; median [IQR] 9.5 [5.5] months post-diagnosis) participated in a randomized controlled trial of a 12-month weight loss intervention versus usual care. Eating frequency, proportion of daily calories consumed after 5 PM, eating after 8 PM, nightly fasting duration, and sleep duration were estimated and categorized based on existing associations with factors influencing breast cancer prognosis and outcomes. These behaviors at baseline were compared with those of women from an Australian national survey with a similar age and BMI range. Mixed-effects linear regression models were used to examine changes in health behaviors from baseline to 18 months between the intervention and usual care groups. RESULTS: Before the trial, eating after 8 PM (67%) was higher, and short nightly fasting duration (< 13 h, 83%) and long sleep duration (> 9 h/day, 26%) were marginally higher, in breast cancer survivors than in women in the national survey (52%, 75%, and 17%, respectively). "Less optimal" eating behaviors and sleep duration tended to co-occur. Behaviors remained unchanged over the 18-month follow-up, irrespective of study group (p > 0.05; Cohen's effect sizes < 0.3). CONCLUSIONS: Later timing of eating and long sleep duration were prevalent in breast cancer survivors and continued following a weight loss intervention. IMPLICATIONS FOR CANCER SURVIVORS: Future multi-behavior interventions in breast cancer survivors should consider specific messages to target eating timing behaviors and sleep.
ABSTRACT
BACKGROUND: Premature aging is a significant concern in adult survivors of childhood cancer, as they develop aging-related conditions at a younger age than their peers with no history of childhood cancer. Although modifiable lifestyle factors, such as diet, are postulated to affect the aging process, supporting evidence is sparse. METHODS: We examined whether the consumption of sugar and sugar-sweetened beverages (SSBs) was related to premature aging in 3322 adult survivors of childhood cancer in the St. Jude Lifetime Cohort. Premature aging was assessed using the Deficit Accumulation Index (DAI), the ratio of the number of age-related chronic health conditions each survivor had to the 44 conditions assessed in total. Multinomial logistic regressions adjusting for confounders were used to estimate odds ratios (ORs) and 95% confidence intervals (CIs). RESULTS: Forty-six percent of childhood cancer survivors consumed SSBs once or more per day. High intake of sugar, especially sugars added to foods during preparation or processing, and habitual consumption of SSBs were associated with an increased risk of premature aging. DISCUSSION: Our findings support the need to include strategies to reduce sugar and SSB consumption in lifestyle interventions to promote healthy aging in adult survivors of childhood cancer.
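The Deficit Accumulation Index described above is a simple ratio of conditions present to conditions assessed. A minimal sketch (the condition count used in the example is illustrative, not a real St. Jude record):

```python
# Illustrative sketch of a Deficit Accumulation Index (DAI):
# the fraction of the 44 age-related chronic health conditions
# a survivor has. The example input below is hypothetical.

TOTAL_CONDITIONS = 44  # number of conditions assessed in the cohort

def deficit_accumulation_index(conditions_present: int) -> float:
    """DAI = number of conditions present / total conditions assessed."""
    if not 0 <= conditions_present <= TOTAL_CONDITIONS:
        raise ValueError("condition count out of range")
    return conditions_present / TOTAL_CONDITIONS

# Example: a hypothetical survivor with 11 of the 44 conditions
dai = deficit_accumulation_index(11)
print(round(dai, 3))  # 0.25
```

In the studies above, the continuous DAI is then categorized (e.g., low/medium/high risk) before modeling; the cut points are study-specific and not reproduced here.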
Subjects
Premature Aging, Cancer Survivors, Neoplasms, Sugar-Sweetened Beverages, Humans, Cancer Survivors/statistics & numerical data, Male, Female, Adult, Sugar-Sweetened Beverages/adverse effects, Neoplasms/epidemiology, Premature Aging/etiology, Young Adult, Child, Adolescent, Middle Aged, Sugars/adverse effects
ABSTRACT
PURPOSE: To identify dietary factors that are related to premature aging in adult survivors of childhood cancer, we examined the associations between plant food intakes and age-related deficit accumulation. METHODS: A total of 3,322 childhood cancer survivors (age 18-65 years, mean = 31, standard deviation = 8.4) in the St Jude Lifetime Cohort had intakes of total fruit, total vegetables and subgroups, whole grains, refined grains, nuts/seeds, and nutrients assessed using a food frequency questionnaire. Premature aging at baseline was assessed by the deficit accumulation index (DAI) and categorized as low, medium, and high risk. Multinomial logistic regressions (reference: low risk) adjusting for confounders estimated odds ratios (ORs) and 95% CIs. Multivariable linear regression of continuous intake against continuous DAI was also performed. RESULTS: Dark green vegetable (OR, high vs low = 0.47 [95% CI, 0.28 to 0.78] per 1/2 cup/1,000 kcal increment) and nuts/seeds intakes (OR, high vs low = 0.71 [95% CI, 0.47 to 1.08] per 1 oz/1,000 kcal increment; linear coefficient = -0.0115, P = .02) were associated with a lower risk of premature aging. Conversely, refined grain intake was related to an increased risk of premature aging (OR, high vs low = 1.33 [95% CI, 0.99 to 1.78] per 1 oz/1,000 kcal increment; linear coefficient = 0.0093, P = .005). Fruit and whole grain intakes were not associated with premature aging risk. Among nutrients abundant in plant foods, dietary folate intake was associated with a lower risk of premature aging (OR, high vs low = 0.89 [95% CI, 0.80 to 0.99] per 50 mcg/1,000 kcal increase). Beta-carotene, lutein/zeaxanthin, and vitamin E intakes from foods were also related to a modestly lower, but not statistically significant, risk of premature aging. CONCLUSION: Specific plant foods are associated with a lower risk of premature aging, providing targets for interventions to promote healthy aging in childhood cancer survivors.
Subjects
Premature Aging, Cancer Survivors, Humans, Male, Female, Adult, Cancer Survivors/statistics & numerical data, Adolescent, Middle Aged, Young Adult, Premature Aging/etiology, Premature Aging/epidemiology, Aged, Vegetables, Neoplasms/epidemiology, Cohort Studies, Fruit, Risk Factors, Diet/adverse effects, Nuts
ABSTRACT
BACKGROUND: Little is known about the specific dietary patterns of adult survivors of childhood cancer. OBJECTIVES: We aimed to identify dietary patterns specific to childhood cancer survivors and examine their associations with sociodemographic and lifestyle factors. METHODS: Adult survivors of childhood cancer (mean age: 31 ± 8 y; n = 3022) and noncancer controls (n = 497) in the St. Jude Lifetime Cohort self-reported diet over the past 12 mo using a validated food frequency questionnaire. Factor analysis with 48 predefined food groups was performed to identify foods consumed together. Subsequently, cluster analysis with energy-adjusted factor scores was used to categorize survivors into mutually exclusive dietary patterns. Dietary patterns were the primary outcomes. Multivariable multinomial logistic regressions were used to cross-sectionally examine associations between sociodemographic and lifestyle factors and dietary patterns in cancer survivors. RESULTS: Among the 4 dietary patterns identified, the fast-food pattern (36%) was the most common, followed by the Western contemporary (30%), the plant-based (20%), and the animal-based (14%) patterns in childhood cancer survivors. By contrast, the plant-based (38%) and fast-food (29%) patterns were prevalent in controls. In survivors, male sex, younger age, lower educational attainment, and physical inactivity were associated with the fast-food, Western contemporary, or animal-based pattern. Compared with non-Hispanic White survivors consuming the plant-based diet, non-Hispanic Black survivors were 2-5 times more likely to consume the fast-food [odds ratio (OR): 2.76; 95% CI: 1.82, 4.18] or the animal-based diet (OR: 5.61; 95% CI: 3.58, 8.78). Moreover, survivors residing in the most deprived areas were 2-3 times more likely to consume the fast-food, Western contemporary, or animal-based diet.
CONCLUSIONS: Unhealthy dietary patterns are prevalent in adult survivors of childhood cancer, especially those with lower socioeconomic status and those from racial minorities. Interventions to improve diet and health in childhood cancer survivors need to concurrently address the disparities that hinder adherence to healthy dietary practices. This trial was registered at clinicaltrials.gov as NCT00760656 (https://classic.clinicaltrials.gov/ct2/show/NCT00760656).
Subjects
Cancer Survivors, Neoplasms, Adult, Humans, Child, Cross-Sectional Studies, Dietary Patterns, Diet, Life Style
ABSTRACT
Cervical spinal cord injury (SCI) causes devastating loss of upper limb function and independence. Restoration of upper limb function can have a profound impact on independence and quality of life. In low-cervical SCI (level C5-C8), upper limb function can be restored via reinnervation strategies such as nerve transfer surgery. How recovered upper limb motor function translates into functional independence in activities of daily living (ADLs), however, remains unknown in low cervical SCI (i.e., tetraplegia). The objective of this study was to evaluate the association of patterns of upper limb motor recovery with functional independence in ADLs. This in turn can inform the prioritization of reinnervation strategies focused on maximizing function in patients with tetraplegia. This retrospective study performed a secondary analysis of patients with low cervical SCI (C5-C8) enrolled in the SCI Model Systems (SCIMS) database. Baseline neurological examinations and their association with functional independence in major ADLs, i.e., eating, bladder management, and transfers (bed/wheelchair/chair), were evaluated. Motor functional recovery was defined as achieving motor strength, in Medical Research Council (MRC) grade, of ≥ 3/5 at one year from ≤ 2/5 at baseline. The association of motor function recovery with functional independence at one-year follow-up was compared in patients with recovered elbow flexion (C5), wrist extension (C6), elbow extension (C7), and finger flexion (C8). A multivariable logistic regression analysis, adjusting for known factors influencing recovery after SCI, was performed to evaluate the impact of motor function at one year on a composite outcome of functional independence in major ADLs. The composite outcome was defined as a Functional Independence Measure (FIM) score of 6 or higher (complete independence) in at least two domains among eating, bladder management, and transfers.
Between 1992 and 2016, 1090 patients with low cervical SCI and complete neurological/functional measures were included. At baseline, 67% of patients had complete SCI and 33% had incomplete SCI. The majority of patients were dependent in eating, bladder management, and transfers. At one-year follow-up, the largest proportion of patients who recovered motor function in finger flexion (C8) and elbow extension (C7) gained independence in eating, bladder management, and transfers. In multivariable analysis, patients who recovered finger flexion (C8) or elbow extension (C7) had higher odds of gaining independence in a composite of major ADLs (odds ratio [OR] = 3.13 and OR = 2.87, respectively, p < 0.001). Age ≥ 60 years (OR = 0.44, p = 0.01) and complete SCI (OR = 0.43, p = 0.002) were associated with reduced odds of gaining independence in ADLs. After cervical SCI, finger flexion (C8) and elbow extension (C7) recovery translate into greater independence in eating, bladder management, and transfers. These results can be used to design individualized reinnervation plans to reanimate upper limb function and maximize independence in patients with low cervical SCI.
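The composite outcome used in this study (a FIM score of 6 or higher, i.e., complete independence, in at least two of the three major ADL domains) is a simple counting rule. A sketch, with the three domain scores as illustrative inputs:

```python
# Sketch of the composite functional-independence outcome described above:
# FIM score of 6 or higher ("complete independence") in at least two of
# the three major ADL domains (eating, bladder management, transfers).

FIM_INDEPENDENT = 6  # threshold for complete independence on the FIM scale

def composite_independence(eating: int, bladder: int, transfers: int) -> bool:
    """True if the patient scores >= 6 in at least two of the three domains."""
    scores = (eating, bladder, transfers)
    return sum(s >= FIM_INDEPENDENT for s in scores) >= 2

# Hypothetical patients: domain scores are made-up examples
print(composite_independence(6, 7, 3))  # True: independent in 2 of 3 domains
print(composite_independence(6, 4, 3))  # False: independent in only 1 domain
```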
ABSTRACT
BACKGROUND: Despite insufficient evidence on the benefits and harms of multivitamin use, cancer survivors use multivitamins as a self-care strategy to improve or maintain health. We examined whether multivitamin use was associated with mortality in cancer survivors. METHODS: 15,936 male and 7026 female cancer survivors in the NIH-AARP Diet and Health Study were included in the analysis. Types and frequency of multivitamin use, assessed on average 4.6 years after cancer diagnosis, were examined. Multivariable-adjusted relative risks (RR) and 95% confidence intervals (CI) were estimated using Cox proportional hazards regression models. RESULTS: Multivitamin use was not associated with lower all-cause mortality risk in all female (RR = 0.94, 95% CI: 0.87-1.01, daily vs. no use) or male cancer survivors (RR = 0.96, 95% CI: 0.91-1.00); however, a modest inverse association for CVD mortality was observed in female survivors of reproductive cancers (RR = 0.75, 95% CI: 0.61-0.92) and male survivors of non-reproductive cancers (RR = 0.81, 95% CI: 0.70-0.94). Multivitamin use was also associated with a lower risk of cancer-specific mortality in survivors of skin (RR = 0.65, 95% CI: 0.48-0.88) and breast (RR = 0.79, 95% CI: 0.65-0.95) cancer. DISCUSSION: Multivitamin use may provide a modest survival benefit to some cancer survivors. Cancer care providers should talk with cancer survivors about the potential benefits and harms of multivitamin use.
Subjects
Cancer Survivors, Neoplasms, Humans, Male, Female, Cause of Death, Vitamins, Diet, Risk, Neoplasms/therapy, Risk Factors
ABSTRACT
BACKGROUND: Whether diet has beneficial effects on cardiovascular disease (CVD) in childhood cancer survivors, as it does in the general population, is unknown. Therefore, we examined associations between dietary patterns and risk of CVD in adult survivors of childhood cancer. METHODS: Childhood cancer survivors 18-65 years old in the St Jude Lifetime Cohort (1882 men and 1634 women) were included in the analysis. Dietary patterns were defined by adherence to the Healthy Eating Index (HEI)-2015, Dietary Approaches to Stop Hypertension (DASH), and alternate Mediterranean diet (aMED) scores, based on a food frequency questionnaire at study entry. CVD cases (323 in men and 213 in women) were defined as participants with at least one grade 2 or higher CVD-related diagnosis at baseline. Multivariable logistic regression adjusted for confounders was used to estimate odds ratios (ORs) and 95% confidence intervals (CIs) of CVD. RESULTS: Greater adherence to HEI-2015 (OR = 0.88, 95% CI: 0.75-1.03, per 10-score increment), DASH (OR = 0.85, 95% CI: 0.71-1.01, per 10-score increment), and aMED (OR = 0.92, 95% CI: 0.84-1.00, per each score increment) was associated, albeit with trends toward significance, with a lower risk of CVD in women. HEI-2015 was associated with a non-significantly lower risk of CVD in men (OR Q5 vs. Q1 = 0.80, 95% CI: 0.50-1.28). These dietary patterns were also associated with a lower risk of CVD in survivors with high underlying CVD risk. CONCLUSIONS: As recommended for the general population, a diet rich in plant foods and moderate in animal foods should be part of CVD management and prevention in childhood cancer survivors.
Subjects
Cancer Survivors, Cardiovascular Diseases, Mediterranean Diet, Neoplasms, Humans, Female, Child, Healthy Diet, Cardiovascular Diseases/epidemiology, Cardiovascular Diseases/prevention & control, Cross-Sectional Studies, Neoplasms/epidemiology, Neoplasms/prevention & control, Prospective Studies, Diet/adverse effects, Risk Factors
ABSTRACT
OBJECTIVE: High cervical spinal cord injury (SCI) results in complete loss of upper-limb function, causing debilitating tetraplegia and permanent disability. Spontaneous motor recovery occurs to varying degrees in some patients, particularly in the first year postinjury. However, the impact of this upper-limb motor recovery on long-term functional outcomes remains unknown. The objective of this study was to characterize the impact of upper-limb motor recovery on the degree of long-term functional outcomes in order to inform priorities for research interventions that restore upper-limb function in patients with high cervical SCI. METHODS: A prospective cohort of high cervical SCI (C1-4) patients with American Spinal Injury Association Impairment Scale (AIS) grade A-D injury enrolled in the Spinal Cord Injury Model Systems database was included. Baseline neurological examinations and functional independence measures (FIMs) in feeding, bladder management, and transfers (bed/wheelchair/chair) were evaluated. Independence was defined as a score ≥ 4 in each of the FIM domains at 1-year follow-up. At 1-year follow-up, functional independence was compared among patients who gained recovery (motor grade ≥ 3) in elbow flexors (C5), wrist extensors (C6), elbow extensors (C7), and finger flexors (C8). Multivariable logistic regression evaluated the impact of motor recovery on functional independence in feeding, bladder management, and transfers. RESULTS: Between 1992 and 2016, 405 high cervical SCI patients were included. At baseline, 97% of patients had impaired upper-limb function with total dependence in eating, bladder management, and transfers. At 1 year of follow-up, the largest proportion of patients who gained independence in eating, bladder management, and transfers had recovery in finger flexion (C8) and wrist extension (C6). Elbow flexion (C5) recovery had the lowest translation to functional independence.
Patients who achieved elbow extension (C7) were able to transfer independently. On multivariable analysis, patients who gained elbow extension (C7) and finger flexion (C8) were 11 times more likely to gain functional independence (OR 11, 95% CI 2.8-47, p < 0.001) and patients who gained wrist extension (C6) were 7 times more likely to gain functional independence (OR 7.1, 95% CI 1.2-56, p = 0.04). Older age (≥ 60 years) and motor complete SCI (AIS grade A-B) reduced the likelihood of gaining independence. CONCLUSIONS: After high cervical SCI, patients who gained elbow extension (C7) and finger flexion (C8) had significantly greater independence in feeding, bladder management, and transfers than those with recovery in elbow flexion (C5) and wrist extension (C6). Recovery of elbow extension (C7) also increased the capability for independent transfers. This information can be used to set patient expectations and prioritize interventions that restore these upper-limb functions in patients with high cervical SCI.
Subjects
Cervical Cord, Spinal Cord Injuries, Humans, Prospective Studies, Upper Extremity, Spinal Cord Injuries/complications, Quadriplegia/complications, Recovery of Function
Subjects
Liver Transplantation, Liver, Humans, Liver/surgery, Abdomen, Perfusion, Organ Preservation
ABSTRACT
BACKGROUND: Previous studies on calcium intake and lung cancer risk reported inconsistent associations, possibly due to differences in intake amounts and contributing sources of calcium and in smoking prevalence. OBJECTIVES: We investigated the associations of lung cancer risk with intake of calcium from foods and/or supplements and with major calcium-rich foods in 12 studies. METHODS: Data from 12 prospective cohort studies conducted in the United States, Europe, and Asia were pooled and harmonized. We applied the Dietary Reference Intakes (DRI) to categorize calcium intake based on the recommendations, and the quintile distribution to categorize calcium-rich food intake. We ran multivariable Cox regression in each cohort and pooled risk estimates to compute overall HRs (95% CIs). RESULTS: Among 1,624,244 adult men and women, 21,513 incident lung cancer cases were ascertained during a mean follow-up of 9.9 y. Overall, dietary calcium intake was not significantly associated with lung cancer risk; the HRs (95% CIs) were 1.08 (0.98-1.18) for higher (>1.5 RDA) and 1.01 (0.95-1.07) for lower intake (<0.5 RDA), compared with recommended intake (EAR to RDA). Milk intake was positively, and soy food intake inversely, associated with lung cancer risk [HR (95% CI): 1.07 (1.02-1.12) and 0.92 (0.84-1.00), respectively]. The positive association with milk intake was significant only in European and North American studies (P-interaction for region = 0.04). No significant association was observed for calcium supplements. CONCLUSIONS: In this, the largest prospective investigation to date, overall calcium intake was not associated with risk of lung cancer, but milk intake was associated with a higher risk. Our findings underscore the importance of considering food sources of calcium in studies of calcium intake.
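The abstract pools cohort-specific Cox estimates into overall HRs. A common way to do this is inverse-variance weighting of the log hazard ratios; a sketch under that assumption, with made-up cohort estimates (not the study's data):

```python
import math

# Hypothetical cohort-specific hazard ratios and 95% CIs (made-up numbers),
# pooled by inverse-variance (fixed-effect) weighting of log(HR) -- one
# common way to combine per-cohort Cox estimates as described above.

cohorts = [  # (HR, lower 95% CI bound, upper 95% CI bound)
    (1.10, 0.95, 1.27),
    (1.05, 0.90, 1.22),
    (1.12, 0.92, 1.36),
]

num = den = 0.0
for hr, lo, hi in cohorts:
    log_hr = math.log(hr)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from CI width
    w = 1.0 / se**2                                  # inverse-variance weight
    num += w * log_hr
    den += w

pooled_log_hr = num / den
pooled_se = math.sqrt(1.0 / den)
pooled_hr = math.exp(pooled_log_hr)
ci = (math.exp(pooled_log_hr - 1.96 * pooled_se),
      math.exp(pooled_log_hr + 1.96 * pooled_se))
print(round(pooled_hr, 2), tuple(round(c, 2) for c in ci))
```

The pooled HR always lies between the smallest and largest cohort HRs, with a narrower CI than any single cohort's.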
Subjects
Calcium, Lung Neoplasms, Male, Adult, Humans, Female, United States/epidemiology, Animals, Prospective Studies, Risk Factors, Milk, Lung Neoplasms/epidemiology, Lung Neoplasms/etiology, Dietary Calcium, Dairy Products
ABSTRACT
PURPOSE: To determine how participation in daily life is impacted during the first six months following a new cancer diagnosis and to identify risk factors for participation restrictions. Patient-reported outcomes (PROs) were used to suggest referrals to rehabilitation services. METHODS: Participants (n = 123) were adults (> 18 years) with newly diagnosed primary brain, breast, colorectal, or lung cancer. PROs were collected at baseline (within 30 days of diagnosis/treatment initiation) and at two and five months post-baseline. Daily life participation was assessed with the Community Participation Indicators (CPI; score range: 0-1) and the Patient-Reported Outcomes Measurement Information System (PROMIS) Ability to Participate measure (score range: 20-80; mean: 50, SD: 10). The PROMIS-43 profile was also completed. Linear mixed-effect models with random intercepts evaluated change in participation over time. RESULTS: The baseline total sample mean CPI score was 0.56; patients reported mildly impaired participation based on PROMIS scores (baseline: 46.19; 2-month follow-up: 44.81; 5 months: 44.84). However, no statistically significant changes in participation were observed over the study period. Risk factors for lower participation included receiving chemotherapy, lower physical function, higher anxiety and fatigue, and reduction in employment (p < 0.05). PROs indicated that roughly half of the participants might benefit from physical or occupational therapy or mental health support, but only 20-36% were referred by their medical team. CONCLUSION: People newly diagnosed with cancer experience impaired participation, but they are infrequently referred to supportive services such as rehabilitation. The use of PROs to assess participation, physical function, and mental health can promote access to supportive care services by identifying patients who may benefit from rehabilitation beyond those identified through routine clinical care.
Subjects
Neoplasms, Quality of Life, Adult, Humans, Longitudinal Studies, Mental Health, Neoplasms/therapy, Anxiety/etiology
ABSTRACT
BACKGROUND: Organ waste is a major cause of the donor liver shortage. Only roughly 67% of recovered organ donors have their liver used for transplant annually. A new technology, normothermic machine perfusion (NMP), offers a way to recover marginal and declined livers for transplant. We report interim results of the RESTORE trial (FDA investigational device exemption trial NCT04483102), which aims to transplant NMP-treated livers that would otherwise be discarded. STUDY DESIGN: Declined livers were screened for NMP eligibility (eg, donation after circulatory death [DCD] grafts with warm ischemic time <40 minutes, donation after brain death [DBD] grafts with cold ischemic time <8 hours). Livers meeting pre-NMP eligibility criteria received NMP using the OrganOx metra device for a minimum of 4 hours. All NMP-treated livers meeting the viability criteria were transplanted into consented recipients. RESULTS: Over 22 months, 60 declined livers from three organ procurement organizations (OPOs; 40 DCD and 20 DBD donor livers) were offered, and 22 livers (10 DCD and 12 DBD) met the pre-NMP eligibility criteria. After NMP, 16 of 22 livers passed viability testing and were transplanted (recipient median Model for End-Stage Liver Disease [MELD] score of 8, range 6 to 24), resulting in a 72.7% rescue rate (50% DCD, 91.7% DBD). The rate of early allograft dysfunction was 31.3%, but there were no graft-related deaths, primary nonfunction, or instances of nonanastomotic biliary strictures. CONCLUSIONS: Interim results of the RESTORE trial suggest that a sizable number of declined livers can be reclaimed. They are safe for transplantation and can enable lower-MELD patients at high risk of morbidity and mortality to receive lifesaving grafts, while offering OPOs a way to allocate more livers and reduce organ waste.
Subjects
End-Stage Liver Disease, Liver Transplantation, Humans, Liver Transplantation/methods, Organ Preservation/methods, Living Donors, Severity of Illness Index, Perfusion/methods, Tissue Donors, Graft Survival
ABSTRACT
There is a chronic shortage of donor lungs for pulmonary transplantation due, in part, to low lung utilization rates in the United States. We performed a retrospective cohort study using data from the Scientific Registry of Transplant Recipients database (2006-2019) and developed the lung donor (LUNDON) acceptability score. A total of 83 219 brain-dead donors were included and were randomly divided into derivation (n = 58 314, 70%) and validation (n = 24 905, 30%) cohorts. The overall lung acceptance was 27.3% (n = 22 767). Donor factors associated with lung acceptance were age, maximum creatinine, ratio of arterial partial pressure of oxygen to fraction of inspired oxygen, mechanism of death by asphyxiation or drowning, history of cigarette use (≥20 pack-years), history of myocardial infarction, chest x-ray appearance, bloodstream infection, and the occurrence of cardiac arrest after brain death. The prediction model had high discriminatory power (C statistic, 0.891; 95% confidence interval, 0.886-0.895) in the validation cohort. We developed a web-based, user-friendly tool (available at https://sites.wustl.edu/lundon) that provides the predicted probability of donor lung acceptance. The LUNDON score was also associated with recipient survival in patients with high lung allocation scores. In conclusion, the multivariable LUNDON score uses readily available donor characteristics to reliably predict lung acceptability. Widespread adoption of this model may standardize lung donor evaluation and improve lung utilization rates.
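The LUNDON score is a multivariable logistic model, and the web tool described above reports a predicted probability of acceptance. The conversion from a logistic linear predictor to a probability can be sketched as follows; the coefficients and donor values below are hypothetical placeholders, not the published model:

```python
import math

# Sketch of how a logistic prediction model yields a predicted probability:
# p = 1 / (1 + exp(-(intercept + sum(beta_i * x_i)))).
# All coefficients below are made-up placeholders, NOT the LUNDON model.

def predicted_probability(intercept: float,
                          donor: dict[str, float],
                          coefs: dict[str, float]) -> float:
    """Apply a logistic model to one donor's predictor values."""
    lp = intercept + sum(coefs[name] * value for name, value in donor.items())
    return 1.0 / (1.0 + math.exp(-lp))

# Hypothetical donor with two predictors (age, PaO2/FiO2 ratio)
coefs = {"age": -0.02, "pf_ratio": 0.004}
p = predicted_probability(-1.0, {"age": 45, "pf_ratio": 350}, coefs)
print(round(p, 3))  # 0.378
```

In a real model of this kind, categorical predictors (e.g., mechanism of death) enter as indicator variables with their own coefficients.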
Subjects
Lung Transplantation, Tissue and Organ Procurement, Humans, Young Adult, Adult, Retrospective Studies, Tissue Donors, Lung, Brain Death
ABSTRACT
BACKGROUND: Survival benefits of self-reported recreational physical activity (PA) during cancer survivorship are well documented in common cancer types, yet there are limited data on the associations between accelerometer-derived PA of all domains, sedentary behavior, and mortality in large, diverse cohorts of cancer survivors. METHODS: Participants included adults who reported a cancer diagnosis in the National Health and Nutrition Examination Survey and wore an accelerometer for up to 7 days in 2003-2006. Participants were followed for subsequent mortality through 2015. We examined the associations of light PA, moderate to vigorous PA, total PA, and sedentary behavior with all-cause mortality. Cox proportional hazards models estimated hazard ratios (HRs) and 95% confidence intervals (CIs), adjusting for demographics and health indicators. RESULTS: A total of 480 participants (mean age of 68.8 years [SD = 12.4] at the time of National Health and Nutrition Examination Survey assessment) reported a history of cancer. A total of 215 deaths occurred over the follow-up period. For every 1-h/d increase in light PA and moderate to vigorous PA (MVPA), cancer survivors had 49% (HR = 0.51, 95% CI = 0.34 to 0.76) and 37% (HR = 0.63, 95% CI = 0.40 to 0.99) lower hazards of all-cause mortality, respectively. Total PA demonstrated similar associations, with statistically significantly lower hazards of death for each additional hour per day (HR = 0.68, 95% CI = 0.54 to 0.85), as did every metabolic equivalent of task (MET)-hour per day increase in total PA estimates of energy expenditure (HR = 0.88, 95% CI = 0.82 to 0.95). More sedentary time (per 1 h/d) was associated with a higher, but not statistically significant, hazard of death (HR = 1.08, 95% CI = 0.94 to 1.23). CONCLUSIONS: These findings reinforce the current recommendations for cancer survivors to be physically active and underscore the continued need for widespread PA promotion for long-term survival in older cancer survivors.
Subjects
Cancer Survivors, Neoplasms, Adult, Humans, Aged, Sedentary Behavior, Nutrition Surveys, Exercise, Accelerometry
ABSTRACT
BACKGROUND: Despite an increased understanding of the impact of socioeconomic status on neurosurgical outcomes, the impact of neighborhood-level social determinants on patient-reported outcomes of lumbar spine surgery remains unknown. OBJECTIVE: To evaluate the impact of geographic social deprivation on the physical and mental health of lumbar surgery patients. METHODS: A single-center retrospective cohort study analyzing patients undergoing lumbar surgery for degenerative disease from 2015 to 2018 was performed. Surgeries were categorized as decompression only or decompression with fusion. The area deprivation index was used to define social deprivation. Study outcomes included preoperative and change in Patient-Reported Outcomes Measurement Information System (PROMIS) physical function (PF), pain interference (PI), depression, and anxiety scores (mean follow-up: 43.3 weeks). Multiple imputation was performed for missing data. One-way analysis of variance and multivariable linear regression were used to evaluate the association between area deprivation index and PROMIS scores. RESULTS: In our cohort of 2010 patients, those with the greatest social deprivation had significantly worse mean preoperative PROMIS scores than the least-deprived cohort (mean difference [95% CI]: PF: -2.5 [-3.7 to -1.4]; PI: 3.0 [2.0-4.1]; depression: 5.5 [3.4-7.5]; anxiety: 6.0 [3.8-8.2]; all P < .001), without significant differences in change in these domains at latest follow-up (PF: +0.5 [-1.2 to 2.2]; PI: -0.2 [-1.7 to 2.1]; depression: -2.0 [-4.0 to 0.1]; anxiety: -2.6 [-4.9 to 0.4]; all P > .05). CONCLUSION: Lumbar spine surgery patients with greater social deprivation present with worse preoperative physical and mental health but experience comparable benefit from surgery to patients with less deprivation, emphasizing the need to further understand the social and health factors that may affect both disease severity and access to care.
Subjects
Patient Reported Outcome Measures, Socioeconomic Disparities in Health, Humans, Retrospective Studies, Neurosurgical Procedures, Lumbosacral Region/surgery

ABSTRACT
PURPOSE: Circadian rhythm disruptors (e.g., night-shift work) are risk factors for breast cancer; however, studies on their association with prognosis are limited. A small but growing body of research suggests that altered sleep patterns and eating behaviours are potential mechanistic links between circadian rhythm disruptors and breast cancer. We therefore systematically summarised the literature examining the influence of circadian rhythm-disrupting behaviours on cancer outcomes in women with breast cancer. METHODS: A systematic search of five databases from inception to January 2021 was conducted. Original research published in English, assessing the relationship between post-diagnosis sleep patterns and eating behaviours and breast cancer outcomes, was considered. Risk of bias was assessed using the Newcastle-Ottawa Assessment Scale for Cohort Studies. RESULTS: Eight studies published original evidence addressing sleep duration and/or quality (k = 7) and eating time and frequency (k = 1). Longer sleep duration (≥ 9 h versus [referent range] 6-8 h) was consistently associated with increased risk of all outcomes of interest (HR range: 1.37-2.33). There was limited evidence to suggest that measures of better sleep quality are associated with lower risk of all-cause mortality (HR range: 0.29-0.97). Shorter nightly fasting duration (< 13 h versus ≥ 13 h) was associated with higher risk of all breast cancer outcomes (HR range: 1.21-1.36). CONCLUSION: Our review suggests that circadian rhythm-disrupting behaviours may influence cancer outcomes in women with breast cancer. While causality remains unclear, future research directions have been identified to further understand these associations. Additional well-designed studies, examining other exposures (e.g., light exposure, temporal eating patterns), biomarkers, and patient-reported outcomes, in diverse populations (e.g., breast cancer subtype-specific, socio-demographic diversity) are warranted.
Subjects
Breast Neoplasms, Cancer Survivors, Humans, Female, Breast Neoplasms/epidemiology, Breast Neoplasms/etiology, Circadian Rhythm, Sleep, Risk Factors

ABSTRACT
BACKGROUND: Although adolescent diet has been proposed to contribute to prostate cancer (PCa) development, no studies have investigated the relation between adolescent dietary patterns and PCa risk or mortality. METHODS: Using data from 164,079 men in the NIH-AARP Diet and Health Study, we performed factor analysis to identify dietary patterns at ages 12-13 years and then used Cox proportional hazards regression to estimate hazard ratios (HRs) and 95% confidence intervals (CIs) of total (n = 17,861), non-advanced (n = 15,499), advanced (n = 2362), and fatal PCa (n = 832). RESULTS: Although not entirely consistent across analyses, a higher adolescent plant-based pattern (characterised by vegetables, fruits, and dark bread) score was associated with slightly reduced risks of total (fully adjusted HR Q5 vs. Q1 = 0.93, 95% CI: 0.89-0.98, p trend = 0.003) and non-advanced PCa (HR = 0.91, 95% CI: 0.87-0.96, p trend < 0.001), whereas no associations were observed for advanced or fatal PCa, or for Western modern (characterised by sweets, processed meat, beef, cheese, and pizza) or Western traditional (characterised by gravy, eggs, potatoes, and white bread) patterns. CONCLUSION: We found evidence to support a modest, protective role for a plant-based dietary pattern during adolescence on PCa risk. If confirmed in future studies, our findings may help to inform the development of new, primary prevention strategies for PCa.
Subjects
Diet, Prostatic Neoplasms, Male, Animals, Cattle, Humans, Adolescent, Child, Risk Factors, Prostatic Neoplasms/epidemiology, Vegetables, Fruit, Proportional Hazards Models

ABSTRACT
Importance: Traumatic cervical spinal cord injury (SCI) can result in debilitating paralysis. Following cervical SCI, accurate early prediction of upper limb recovery can serve an important role in guiding the appropriateness and timing of reconstructive therapies. Objective: To develop a clinical prediction rule to prognosticate upper limb functional recovery after cervical SCI. Design, Setting, and Participants: This prognostic study was a retrospective review of a longitudinal cohort study including patients enrolled in the National SCI Model Systems (SCIMS) database in the US. Eligible patients were 15 years or older with tetraplegia (neurological level of injury C1-C8, American Spinal Injury Association [ASIA] Impairment Scale [AIS] A-D), with early (within 1 month of SCI) and late (1-year follow-up) clinical examinations from 2011 to 2016. The data analysis was conducted from September 2021 to June 2022. Main Outcomes and Measures: The primary outcome was a composite of dependency in the eating, bladder management, transfers, and locomotion domains of the Functional Independence Measure at 1-year follow-up. Each domain ranges from 1 to 7, with a lower score indicating greater functional dependence. Composite dependency was defined as a score of 4 or higher in at least 3 chosen domains. Multivariable logistic regression was used to predict the outcome based on early neurological variables. Discrimination was quantified using C statistics, and model performance was internally validated with bootstrapping and 10-fold cross-validation. The performance of the prediction score was compared with AIS grading. Data were split into derivation (2011-2014) and temporal-validation (2015-2016) cohorts. Results: Among 2373 patients with traumatic cervical SCI, 940 had complete 1-year outcome data (237 patients [25%] aged 60 years or older; 753 men [80%]).
The primary outcome was present in 118 patients (13%), which included 92 men (78%), 83 patients (70%) who were younger than 60 years, and 73 patients (62%) experiencing AIS grade A SCI. The variables significantly associated with the outcome were age (60 years or older: OR, 2.31; 95% CI, 1.26-4.19), sex (men: OR, 0.60; 95% CI, 0.31-1.17), light-touch sensation at the C5 (OR, 0.44; 95% CI, 0.44-1.01) and C8 (OR, 0.36; 95% CI, 0.24-0.53) dermatomes, and motor scores of the elbow flexors (C5) (OR, 0.74; 95% CI, 0.60-0.89) and wrist extensors (C6) (OR, 0.61; 95% CI, 0.49-0.75). A multivariable model including these variables had excellent discrimination in distinguishing dependent from independent patients in the temporal-validation cohort (C statistic, 0.90; 95% CI, 0.88-0.93). A clinical prediction score (range, 0 to 45 points) was developed based on these measures, with higher scores increasing the probability of dependency. The discrimination of the prediction score was significantly higher than that of AIS grading (change in AUC, 0.14; 95% CI, 0.10-0.18; P < .001). Conclusions and Relevance: The findings of this study suggest that this prediction rule may help prognosticate upper limb function following cervical SCI. This tool can be used to set patient expectations and rehabilitation goals and to aid decision-making regarding the appropriateness and timing of upper limb reconstructive surgeries.
Subjects
Cervical Cord, Neck Injuries, Spinal Cord Injuries, Male, Humans, Longitudinal Studies, Clinical Decision Rules, Upper Extremity

ABSTRACT
Importance: Cervical spinal cord injury (SCI) causes devastating loss of upper extremity function and independence. Nerve transfers are a promising approach to reanimate upper limbs; however, there remains a paucity of high-quality evidence supporting a clinical benefit for patients with tetraplegia. Objective: To evaluate the clinical utility of nerve transfers for reanimation of upper limb function in tetraplegia. Design, Setting, and Participants: In this prospective case series, adults with cervical SCI and upper extremity paralysis whose recovery plateaued were enrolled between September 1, 2015, and January 31, 2019. Data analysis was performed from August 2021 to February 2022. Interventions: Nerve transfers to reanimate upper extremity motor function with target reinnervation of elbow extension and hand grasp, pinch, and/or release. Main Outcomes and Measures: The primary outcome was motor strength measured by Medical Research Council (MRC) grades 0 to 5. Secondary outcomes included Sollerman Hand Function Test (SHFT); Michigan Hand Outcome Questionnaire (MHQ); Disabilities of Arm, Shoulder, and Hand (DASH); and 36-Item Short Form Health Survey (SF-36) physical component summary (PCS) and mental component summary (MCS) scores. Outcomes were assessed up to 48 months postoperatively. Results: Twenty-two patients with tetraplegia (median age, 36 years [range, 18-76 years]; 21 male [95%]) underwent 60 nerve transfers on 35 upper limbs at a median time of 21 months (range, 6-142 months) after SCI. At final follow-up, upper limb motor strength improved significantly: median MRC grades were 3 (IQR, 2.5-4; P = .01) for triceps, with 70% of upper limbs gaining an MRC grade of 3 or higher for elbow extension; 4 (IQR, 2-4; P < .001) for finger extensors, with 79% of hands gaining an MRC grade of 3 or higher for finger extension; and 2 (IQR, 1-3; P < .001) for finger flexors, with 52% of hands gaining an MRC grade of 3 or higher for finger flexion. 
The secondary outcomes of SHFT, MHQ, DASH, and SF-36 PCS scores improved beyond the established minimal clinically important difference. Both early (<12 months) and delayed (≥12 months) nerve transfers after SCI achieved comparable motor outcomes. Continual improvement in motor strength was observed in the finger flexors and extensors across the entire duration of follow-up. Conclusions and Relevance: In this prospective case series, nerve transfer surgery was associated with improvement of upper limb motor strength and functional independence in patients with tetraplegia. Nerve transfer is a promising intervention that is feasible in both subacute and chronic SCI.