Results 1 - 14 of 14
1.
J Bone Miner Res ; 39(1): 39-49, 2024 Mar 04.
Article in English | MEDLINE | ID: mdl-38630876

ABSTRACT

Aerobic exercise reduces circulating ionized Ca (iCa) and increases parathyroid hormone (PTH), but the cause and the consequences for Ca handling are unknown. The objective of this study was to determine the effects of strenuous exercise on Ca kinetics using dual stable Ca isotopes. Twenty-one healthy women (26.4 ± 6.7 yr) completed a randomized, crossover study entailing two 6-d iterations consisting of either 60 min of treadmill walking at 65% VO2max wearing a vest weighing 30% body weight on study days 1, 3, and 5 (exercise [EX]) or a rest iteration (REST). On day 1, participants received intravenous 42Ca and oral 44Ca. Isotope ratios were determined by thermal ionization mass spectrometry. Kinetic modeling determined fractional Ca absorption (FCA), Ca deposition (Vo+), resorption (Vo-) from bone, and balance (Vbal). Circulating PTH and iCa were measured before, during, and after each exercise/rest session. Data were analyzed by paired t-test or linear mixed models using SPSS. iCa decreased and PTH increased (P < .001) during each EX session and were unchanged during REST. On day 1, urinary Ca was lower in the EX pool (25 ± 11 mg) compared to REST (38 ± 16 mg, P = .001), but did not differ over the full 24-h collection (P > .05). FCA was greater during EX (26.6 ± 8.1%) compared to REST (23.9 ± 8.3%, P < .05). Vbal was less negative during EX (-61.3 ± 111 mg) vs REST (-108 ± 23.5 mg, P < .05), but Vo+ (574 ± 241 vs 583 ± 260 mg) and Vo- (-636 ± 243 vs -692 ± 252 mg) were not different (P > .05). The rapid reduction in circulating iCa may be due to a change in the miscible Ca pool, resulting in increased PTH and changes in intestinal absorption and renal Ca handling that support a more positive Ca balance.


Subject(s)
Calcium, Dietary , Calcium , Humans , Female , Calcium/metabolism , Cross-Over Studies , Parathyroid Hormone , Exercise , Intestinal Absorption
2.
Endocrinol Metab Clin North Am ; 52(4): 603-615, 2023 12.
Article in English | MEDLINE | ID: mdl-37865476

ABSTRACT

Despite the increasing prevalence of diabetes in populations experiencing humanitarian crises, along with evidence that people living with diabetes are at higher risk for poor outcomes in a crisis, diabetes care is not routinely included in humanitarian health interventions. Here we describe 4 factors that have contributed to these inequities and the lack of diabetes inclusion in humanitarian programmes: (1) evolving paradigms in humanitarian health care, (2) complexities of diabetes service provision in humanitarian settings, (3) social and cultural challenges, and (4) lack of financing. We also outline opportunities and possible interventions to address these challenges and improve diabetes care among crisis-affected populations.


Subject(s)
Delivery of Health Care , Diabetes Mellitus , Humans , Diabetes Mellitus/epidemiology , Diabetes Mellitus/therapy
3.
Crit Care Explor ; 5(11): e0974, 2023 Nov.
Article in English | MEDLINE | ID: mdl-38304708

ABSTRACT

BACKGROUND: Sepsis is a common and deadly syndrome, accounting for more than 11 million deaths annually. To develop a deeper understanding of the host and pathogen mechanisms contributing to poor outcomes in sepsis, and thereby possibly inform new therapeutic targets, sophisticated and expensive biorepositories are typically required. We propose that remnant biospecimens are an alternative for mechanistic sepsis research, although the viability and scientific value of such remnants are unknown. METHODS AND RESULTS: The Remnant Biospecimen Investigation in Sepsis study is a prospective cohort study of 225 adults (age ≥ 18 yr) presenting to the emergency department with community sepsis, defined by Sepsis-3 criteria within 6 hours of arrival. The primary objective was to determine the scientific value of a remnant biospecimen repository in sepsis linked to clinical phenotyping in the electronic health record. We will study candidate multiomic readouts of sepsis biology, governed by a conceptual model, and determine the precision, accuracy, integrity, and comparability of proteins, small molecules, lipids, and pathogen sequencing in remnant biospecimens compared with paired biospecimens obtained according to research protocols. Paired biospecimens will include plasma from sodium-heparin, EDTA, sodium fluoride, and citrate tubes. CONCLUSIONS: The study has received approval from the University of Pittsburgh Human Research Protection Office (Study 21120013). Recruitment began on October 25, 2022, with planned release of primary results anticipated in 2024. Results will be made available to the public, the funders, critical care societies, laboratory medicine scientists, and other researchers.

4.
JAMA Netw Open ; 5(7): e2220957, 2022 07 01.
Article in English | MEDLINE | ID: mdl-35834252

ABSTRACT

Importance: The effectiveness of monoclonal antibodies (mAbs), casirivimab-imdevimab and sotrovimab, is unknown in patients with mild to moderate COVID-19 caused by the SARS-CoV-2 Delta variant. Objective: To evaluate the effectiveness of mAb against the Delta variant compared with no mAb treatment and to ascertain the comparative effectiveness of casirivimab-imdevimab and sotrovimab. Design, Setting, and Participants: This study comprised 2 parallel studies: (1) a propensity score-matched cohort study of mAb treatment vs no mAb treatment and (2) a randomized comparative effectiveness trial of casirivimab-imdevimab and sotrovimab. The cohort consisted of patients who received mAb treatment at the University of Pittsburgh Medical Center outpatient infusion centers and emergency departments from July 14 to September 29, 2021. Participants were patients with a positive SARS-CoV-2 test result who were eligible to receive mAbs according to emergency use authorization criteria. Exposure: For the trial, patients were randomized to either intravenous casirivimab-imdevimab or sotrovimab according to a system therapeutic interchange policy. Main Outcomes and Measures: For the cohort study, risk ratio (RR) estimates for the primary outcome of hospitalization or death by 28 days were compared between mAb treatment and no mAb treatment using propensity score-matched models. For the comparative effectiveness trial, the primary outcome was hospital-free days (days alive and free of hospitalization) within 28 days after mAb treatment, where patients who died were assigned -1 day in a bayesian cumulative logistic model adjusted for treatment location, age, sex, and time. Inferiority was defined as a 99% posterior probability of an odds ratio (OR) less than 1. Equivalence was defined as a 95% posterior probability that the OR was within a given bound. 
Results: A total of 3069 patients (1023 received mAb treatment: mean [SD] age, 53.2 [16.4] years; 569 women [56%]; 2046 had no mAb treatment: mean [SD] age, 52.8 [19.5] years; 1157 women [57%]) were included in the prospective cohort study, and 3558 patients (mean [SD] age, 54 [18] years; 1919 women [54%]) were included in the randomized comparative effectiveness trial. In propensity score-matched models, mAb treatment was associated with reduced risk of hospitalization or death (RR, 0.40; 95% CI, 0.28-0.57) compared with no treatment. Both casirivimab-imdevimab (RR, 0.31; 95% CI, 0.20-0.50) and sotrovimab (RR, 0.60; 95% CI, 0.37-1.00) were associated with reduced hospitalization or death compared with no mAb treatment. In the clinical trial, 2454 patients were randomized to receive casirivimab-imdevimab and 1104 patients were randomized to receive sotrovimab. The median (IQR) hospital-free days were 28 (28-28) for both mAb treatments, the 28-day mortality rate was less than 1% (n = 12) for casirivimab-imdevimab and less than 1% (n = 7) for sotrovimab, and the hospitalization rate by day 28 was 12% (n = 291) for casirivimab-imdevimab and 13% (n = 140) for sotrovimab. Compared with patients who received casirivimab-imdevimab, those who received sotrovimab had a median adjusted OR for hospital-free days of 0.88 (95% credible interval, 0.70-1.11). This OR yielded 86% probability of inferiority for sotrovimab vs casirivimab-imdevimab and 79% probability of equivalence. Conclusions and Relevance: In this propensity score-matched cohort study and randomized comparative effectiveness trial, the effectiveness of casirivimab-imdevimab and sotrovimab against the Delta variant was similar, although the prespecified criteria for statistical inferiority or equivalence were not met. Both mAb treatments were associated with a reduced risk of hospitalization or death in nonhospitalized patients with mild to moderate COVID-19 caused by the Delta variant. 
Trial Registration: ClinicalTrials.gov Identifier: NCT04790786.


Subject(s)
COVID-19 Drug Treatment , SARS-CoV-2 , Antibodies, Monoclonal, Humanized , Antibodies, Neutralizing , Bayes Theorem , Cohort Studies , Female , Humans , Middle Aged , Prospective Studies
5.
Contemp Clin Trials ; 119: 106822, 2022 08.
Article in English | MEDLINE | ID: mdl-35697146

ABSTRACT

BACKGROUND: Monoclonal antibodies (mAb) that neutralize SARS-CoV-2 decrease hospitalization and death compared to placebo in patients with mild to moderate COVID-19; however, their comparative effectiveness is unknown. We report the comparative effectiveness of bamlanivimab, bamlanivimab-etesevimab, and casirivimab-imdevimab. METHODS: A learning health system platform trial in a U.S. health system enrolled patients meeting mAb Emergency Use Authorization criteria. An electronic health record-embedded application linked local mAb inventory to patient encounters and provided random mAb allocation. The primary outcome was hospital-free days to day 28. The primary analysis was a Bayesian model adjusting for treatment location, age, sex, and time. Inferiority was defined as a 99% posterior probability of an odds ratio < 1. Equivalence was defined as a 95% posterior probability that the odds ratio was within a given bound. FINDINGS: Between March 10 and June 25, 2021, 1935 patients received treatment. Median hospital-free days were 28 (IQR 28, 28) for each mAb. Mortality was 0.8% (1/128), 0.8% (7/885), and 0.7% (6/922) for bamlanivimab, bamlanivimab-etesevimab, and casirivimab-imdevimab, respectively. Relative to casirivimab-imdevimab (n = 922), median adjusted odds ratios were 0.58 (95% credible interval [CI] 0.30-1.16) and 0.94 (95% CI 0.72-1.24) for bamlanivimab (n = 128) and bamlanivimab-etesevimab (n = 885), respectively. These odds ratios yielded 91% and 94% probabilities of inferiority of bamlanivimab versus bamlanivimab-etesevimab and casirivimab-imdevimab, respectively, and an 86% probability of equivalence between bamlanivimab-etesevimab and casirivimab-imdevimab. INTERPRETATION: Among patients with mild to moderate COVID-19, bamlanivimab-etesevimab and casirivimab-imdevimab treatment yielded an 86% probability of equivalence, but no treatment met the prespecified criteria for statistical equivalence. Median hospital-free days to day 28 were 28 (IQR 28, 28) for each mAb.
FUNDING AND REGISTRATION: This work received no external funding. The U.S. government provided the reported mAb. This trial is registered at ClinicalTrials.gov, NCT04790786.


Subject(s)
COVID-19 , Learning Health System , Antibodies, Monoclonal , Antibodies, Monoclonal, Humanized , Antibodies, Neutralizing , Bayes Theorem , Humans , SARS-CoV-2
6.
JAMA Netw Open ; 5(4): e226920, 2022 04 01.
Article in English | MEDLINE | ID: mdl-35412625

ABSTRACT

Importance: Monoclonal antibody (mAb) treatment decreases hospitalization and death in high-risk outpatients with mild to moderate COVID-19; however, only intravenous administration has been evaluated in randomized clinical trials of treatment. Subcutaneous administration may expand outpatient treatment capacity and qualified staff available to administer treatment, but the association with patient outcomes is understudied. Objectives: To evaluate whether subcutaneous casirivimab and imdevimab treatment is associated with reduced 28-day hospitalization and death compared with nontreatment among mAb-eligible patients and whether subcutaneous casirivimab and imdevimab treatment is clinically and statistically similar to intravenous casirivimab and imdevimab treatment. Design, Setting, and Participants: This prospective cohort study evaluated high-risk outpatients in a learning health system in the US with mild to moderate COVID-19 symptoms from July 14 to October 26, 2021, who were eligible for mAb treatment under emergency use authorization. A nontreated control group of eligible patients was also studied. Exposures: Subcutaneous injection or intravenous administration of the combined single dose of 600 mg of casirivimab and 600 mg of imdevimab. Main Outcomes and Measures: The primary outcome was the 28-day adjusted risk ratio or adjusted risk difference for hospitalization or death. Secondary outcomes included 28-day adjusted risk ratios and differences in hospitalization, death, a composite end point of emergency department admission and hospitalization, and rates of adverse events. Among 1959 matched adults with mild to moderate COVID-19, 969 patients (mean [SD] age, 53.8 [16.7] years; 547 women [56.4%]) who received casirivimab and imdevimab subcutaneously had a 28-day rate of hospitalization or death of 3.4% (22 of 653 patients) compared with 7.0% (92 of 1306 patients) in nontreated controls (risk ratio, 0.48; 95% CI, 0.30-0.80; P = .002). 
Among 2185 patients treated with subcutaneous (n = 969) or intravenous (n = 1216; mean [SD] age, 54.3 [16.6] years; 672 women [54.4%]) casirivimab and imdevimab, the 28-day rate of hospitalization or death was 2.8% vs 1.7%, which resulted in an adjusted risk difference of 1.5% (95% CI, -0.6% to 3.5%; P = .16). Among all infusion patients, there was no difference in intensive care unit admission (adjusted risk difference, 0.7%; 95% CI, -3.5% to 5.0%) or need for mechanical ventilation (adjusted risk difference, 0.2%; 95% CI, -5.8% to 5.5%). Conclusions and Relevance: In this cohort study of high-risk outpatients with mild to moderate COVID-19 symptoms, subcutaneously administered casirivimab and imdevimab was associated with reduced hospitalization and death when compared with no treatment. These results provide preliminary evidence of potential expanded use of subcutaneous mAb treatment, particularly in areas that are facing treatment capacity and/or staffing shortages.


Subject(s)
Antineoplastic Agents, Immunological , COVID-19 Drug Treatment , Adult , Aged , Antibodies, Monoclonal/therapeutic use , Antibodies, Monoclonal, Humanized , Cohort Studies , Female , Humans , Infusions, Intravenous , Male , Middle Aged , Prospective Studies , SARS-CoV-2
7.
J Acad Nutr Diet ; 122(6): 1114-1128.e1, 2022 06.
Article in English | MEDLINE | ID: mdl-34601165

ABSTRACT

BACKGROUND: Vitamin D deficiency (VDD), defined as serum 25-hydroxyvitamin D (25[OH]D) levels < 20 ng/mL (to convert 25[OH]D ng/mL to nmol/L, multiply by 2.5), is prevalent in young adults and has been associated with adverse health outcomes, including stress fracture during periods of increased physical activity such as military training. Foods commonly consumed at breakfast provide an important source of vitamin D, yet breakfast skipping is common among young adults. However, whether breakfast skipping is associated with VDD in young adults is unclear. OBJECTIVES: This study aimed to determine whether breakfast skipping is associated with odds of VDD among recruits entering initial military training (IMT) and with changes in serum 25(OH)D during IMT. In addition, whether diet quality and vitamin D intake mediated these associations was determined. DESIGN: Secondary analysis of individual participant data collected during five IMT studies. Breakfast skipping (≥ 3 times/week) was self-reported. Dietary intake was determined using food frequency questionnaires, and vitamin D status was assessed using circulating 25(OH)D concentrations pre- and post-IMT. PARTICIPANTS AND SETTING: Participants were healthy US Army, US Air Force, and US Marine recruits (N = 1,569, 55% male, mean ± standard deviation age 21 ± 4 years) entering military service between 2010 and 2015 at Fort Jackson, SC; Fort Sill, OK; Lackland Air Force Base, TX; or the Marine Corps Recruit Depot, Parris Island, SC. MAIN OUTCOME MEASURES: Primary outcomes were VDD pre-IMT and change in 25(OH)D from pre- to post-IMT. STATISTICAL ANALYSIS PERFORMED: Associations were determined using multivariate-adjusted logistic and linear regression and mediation models. RESULTS: Forty-six percent of military recruits were classified as breakfast skippers pre-IMT, and 30% were VDD.
Breakfast skipping was associated with a higher odds of pre-IMT VDD (odds ratio 1.5, 95% CI 1.1 to 1.9), and lower vitamin D intake and diet quality were partial mediators of the association. Serum 25(OH)D concentrations improved (P = 0.01) among habitual breakfast skippers versus nonskippers during IMT; however, regression to the mean could not be ruled out. Neither change in diet quality nor vitamin D intake were associated with change in 25(OH)D concentrations during IMT. CONCLUSIONS: Breakfast skipping is prevalent among incoming military recruits and is associated with VDD. This relationship may be mediated by lower diet quality and vitamin D intake.


Subject(s)
Military Personnel , Vitamin D Deficiency , Adolescent , Adult , Breakfast , Diet , Female , Humans , Male , Vitamin D , Vitamin D Deficiency/epidemiology , Vitamins , Young Adult
8.
Br J Nutr ; 128(9): 1730-1737, 2022 11 14.
Article in English | MEDLINE | ID: mdl-34814952

ABSTRACT

Maintaining Mg status may be important for military recruits, a population that experiences high rates of stress fracture during initial military training (IMT). The objectives of this secondary analysis were to (1) compare dietary Mg intake and serum Mg in female and male recruits pre- and post-IMT, (2) determine whether serum Mg was related to parameters of bone health pre-IMT, and (3) determine whether Ca and vitamin D supplementation (Ca/vitamin D) during IMT modified serum Mg. Females (n 62) and males (n 51) consumed 2000 mg of Ca and 25 µg of vitamin D/d or placebo during IMT (12 weeks). Dietary Mg intakes were estimated using a FFQ, serum Mg was assessed, and peripheral quantitative computed tomography was performed on the tibia. Dietary Mg intakes for females and males pre-IMT were below the estimated average requirement and did not change with training. Serum Mg increased during IMT in females (0·06 ± 0·08 mmol/l) compared with males (-0·02 ± 0·10 mmol/l; P < 0·001) and in those consuming Ca/vitamin D (0·05 ± 0·09 mmol/l) compared with placebo (0·001 ± 0·11 mmol/l; P = 0·015). In females, pre-IMT serum Mg was associated with total bone mineral content (BMC; β = 0·367, P = 0·004) and robustness (β = 0·393, P = 0·006) at the distal 4 % site, with the stress-strain index of the polar axis (β = 0·334, P = 0·009) and robustness (β = 0·420, P = 0·004) at the 14 % diaphyseal site, and with BMC (β = 0·309, P = 0·009) and the stress-strain index of the polar axis (β = 0·314, P = 0·006) at the 66 % diaphyseal site. No significant relationships between serum Mg and bone measures were observed in males. Findings suggest that serum Mg may be modulated by Ca/vitamin D intake and may impact tibial bone health during training in female military recruits.


Subject(s)
Calcium , Military Personnel , Male , Humans , Female , Magnesium , Vitamin D , Bone Density , Dietary Supplements
9.
Appetite ; 142: 104348, 2019 11 01.
Article in English | MEDLINE | ID: mdl-31299192

ABSTRACT

Eating behaviors such as eating fast and ignoring internal satiety cues are associated with overweight/obesity, and may be influenced by environmental factors. This study examined changes in those behaviors, and associations between those behaviors and BMI, cardiometabolic biomarkers, and diet quality, in military recruits before and during initial military training (IMT), an environment wherein access to food is restricted. Eating rate and reliance on internal satiety cues were self-reported, and BMI, body fat, cardiometabolic biomarkers, and diet quality were measured in 1389 Army, Air Force and Marine recruits (45% female, mean ± SEM BMI = 24.1 ± 0.1 kg/m2) before and after IMT. Pre-IMT, habitually eating fast relative to slowly was associated with a 1.1 ± 0.3 kg/m2 higher BMI (P < 0.001), but not with other outcomes; whereas habitually eating until no food is left (i.e., ignoring internal satiety cues) was associated with lower diet quality (P < 0.001) and, in men, 1.6 ± 0.6% lower body fat (P = 0.03) relative to those who habitually stopped eating before feeling full. More recruits reported eating fast (82% vs 39%) and a reduced reliance on internal satiety cues (55% vs 16%) during IMT relative to pre-IMT (P < 0.001). Findings suggest that eating behaviors correlate with body composition and/or diet quality in young, predominantly normal-weight recruits entering the military, and that IMT is associated with potentially unfavorable changes in these eating behaviors.


Subject(s)
Body Mass Index , Feeding Behavior , Military Personnel , Self Report , Adolescent , Adult , Biomarkers/blood , Body Composition , Body Weight , Diet , Female , Humans , Male , Obesity/epidemiology , Overweight/epidemiology , Physical Fitness , Satiation , Surveys and Questionnaires , United States , Young Adult
10.
Bone ; 123: 224-233, 2019 06.
Article in English | MEDLINE | ID: mdl-30902791

ABSTRACT

Stress fractures are common overuse injuries caused by repetitive bone loading. These fractures are of particular concern for military recruits and athletes, resulting in attrition in up to 60% of recruits who sustain a fracture. Army and Navy recruits supplemented with daily calcium and vitamin D (Ca + D) demonstrated improved bone strength and reduced stress fractures. The aim of the current study was to evaluate whether Ca + D supplementation improves measures of bone health in recruits undergoing United States Marine Corps initial military training (IMT), and whether the effect of supplementation on indices of bone health varied by season. One hundred ninety-seven Marine recruits (n = 107 males, n = 90 females, mean age = 18.9 ± 1.6 y) were randomized to receive either Ca + D fortified snack bars (2000 mg Ca and 1000 IU vitamin D per day) or placebo, divided into twice-daily doses, during 12 weeks of IMT. Anthropometrics, fasted blood samples, and peripheral quantitative computed tomography (pQCT) scans of the tibial metaphysis and diaphysis were collected upon entrance to training and post-training (12 weeks later). Half of the volunteers entered training in July and the other half started in February. Time-by-group interactions were observed for vitamin D status (25OHD) and the bone turnover markers BAP, TRAP, and OCN: 25OHD increased and BAP, TRAP, and OCN all decreased in the Ca + D group (p < .05). Training increased distal tibia volumetric BMD (+1.9 ± 2.8%), BMC (+2.0 ± 3.1%), and bone strength index (BSI; +4.0 ± 4.0%) and diaphyseal BMC (+1.0 ± 2.2%) and polar stress strain index (SSIp; +0.7 ± 2.1%) independent of Ca + D supplementation (p < .05 for all). When analyzed by season, the change in BSI was greater in the Ca + D group compared with placebo in the summer iteration only (T*G; p < .05). No other effects of supplementation on bone tissue were observed.
When categorized by tertile of percent change in BSI, recruits demonstrating the greatest changes in BSI and 25OHD entered training with the lowest levels of 25OHD (p < .05). Overall, these results suggest that Ca + D supplementation reduced some markers of bone formation and resorption, and that supplementation prevented the decline in 25OHD over training in volunteers who started in the summer. Baseline 25OHD and its trajectory may impact bone responses to IMT, but little effect of Ca + D supplementation was observed at the investigated doses.


Subject(s)
Bone Density/drug effects , Calcium/administration & dosage , Dietary Supplements , Military Personnel , Seasons , Vitamin D/administration & dosage , Adolescent , Adult , Biomarkers/blood , Bone Density/physiology , Calcium/blood , Calcium, Dietary/administration & dosage , Calcium, Dietary/blood , Double-Blind Method , Female , Humans , Male , Physical Conditioning, Human/methods , Physical Conditioning, Human/physiology , Vitamin D/blood , Young Adult
11.
Am J Clin Nutr ; 109(1): 186-196, 2019 01 01.
Article in English | MEDLINE | ID: mdl-30615068

ABSTRACT

Background: Stress fracture risk is elevated during initial military training (IMT), particularly in lower-extremity bones such as the tibia. Although the etiology of stress fractures is multifactorial, lower bone strength increases risk. Objective: The objective of this study was to assess, through the use of peripheral quantitative computed tomography, whether adherence to a dietary pattern rich in calcium, potassium, and protein before IMT is positively associated with bone indexes in young adults entering IMT. Design: A cross-sectional analysis was performed with the use of baseline data from 3 randomized controlled trials in Army, Air Force, and Marine recruits (n = 401; 179 men, 222 women). Dietary intake was estimated from a food-frequency questionnaire. A dietary pattern characterized by calcium, potassium, and protein was derived via reduced rank regression and a pattern z score was computed for each volunteer, where higher scores indicated greater adherence to the pattern. At the 4% (metaphysis) and 14% (diaphysis) sites of the tibia, bone mineral content (BMC), volumetric bone mineral density, robustness, and strength indexes were evaluated. Associations between dietary pattern z score as the predictor variable and bone indexes as the response variables were evaluated by multiple linear regression. Results: Pattern z score was positively associated with BMC (P = 0.004) and strength (P = 0.01) at the metaphysis and with BMC (P = 0.0002), strength (P = 0.0006), and robustness (P = 0.02) at the diaphysis when controlling for age, sex, race, energy, smoking, education, and exercise. Further adjustment for BMI attenuated the associations, except with diaphyseal BMC (P = 0.005) and strength (P = 0.01). When height and weight were used in place of body mass index, the association with BMC remained (P = 0.046). 
Conclusions: A dietary pattern rich in calcium, potassium, and protein is positively associated with measures of tibia BMC and strength in recruits entering IMT. Whether adherence to this dietary pattern before IMT affects injury susceptibility during training remains to be determined. These trials were registered at clinicaltrials.gov as NCT01617109 and NCT02636348.


Subject(s)
Bone Density/physiology , Bone and Bones/physiology , Calcium, Dietary/administration & dosage , Dietary Proteins/administration & dosage , Military Personnel , Potassium, Dietary/administration & dosage , Adolescent , Cross-Sectional Studies , Diet Records , Dietary Supplements , Female , Humans , Male , Retrospective Studies , Surveys and Questionnaires , Tibia , Tomography, X-Ray Computed , United States , Young Adult
12.
JPEN J Parenter Enteral Nutr ; 43(1): 81-87, 2019 01.
Article in English | MEDLINE | ID: mdl-29846011

ABSTRACT

BACKGROUND: Malnutrition influences clinical outcomes. Although various screening tools are available to assess nutrition status, their use in the intensive care unit (ICU) has not been rigorously studied. Our goal was to compare the Nutrition Risk in Critically Ill (NUTRIC) to the Nutritional Risk Screening (NRS) 2002 in terms of their associations with macronutrient deficit in ICU patients. METHODS: We performed a retrospective analysis to investigate the relationship between NUTRIC vs NRS 2002 and macronutrient deficit (protein and calories) in critically ill patients. We performed linear regression analyses, controlling for age, sex, race, body mass index, and ICU length of stay. We then dichotomized our primary exposures and outcomes to perform logistic regression analyses, controlling for the same covariates. RESULTS: The analytic cohort included 312 adults. Mean NUTRIC and NRS 2002 scores were 4 ± 2 and 4 ± 1, respectively. Linear regression demonstrated that each increment in NUTRIC score was associated with a 49 g higher protein deficit (β = 48.70; 95% confidence interval [CI] 29.23-68.17) and a 752 kcal higher caloric deficit (β = 751.95; 95% CI 447.80-1056.09). Logistic regression demonstrated that NUTRIC scores >4 had over twice the odds of protein deficits ≥300 g (odds ratio [OR] 2.35; 95% CI 1.43-3.85) and caloric deficits ≥6000 kcal (OR 2.73; 95% CI 1.66-4.50) compared with NUTRIC scores ≤4. We did not observe an association of NRS 2002 scores with macronutrient deficit. CONCLUSION: Our data suggest that NUTRIC is superior to NRS 2002 for assessing malnutrition risk in ICU patients. Randomized, controlled studies are needed to determine whether nutrition interventions, stratified by NUTRIC score, can improve patient outcomes.


Subject(s)
Critical Illness , Intensive Care Units , Malnutrition/diagnosis , Mass Screening/methods , Nutrition Assessment , Nutritional Status , Adult , Aged , Body Mass Index , Critical Care , Dietary Proteins/administration & dosage , Energy Intake , Female , Humans , Length of Stay , Logistic Models , Male , Middle Aged , Odds Ratio , Retrospective Studies , Risk Assessment
13.
Nutr Clin Pract ; 34(3): 400-405, 2019 Jun.
Article in English | MEDLINE | ID: mdl-30207404

ABSTRACT

BACKGROUND: The Patient- And Nutrition-Derived Outcome Risk Assessment (PANDORA) was recently validated for predicting mortality in hospitalized patients; however, its utility in the intensive care unit (ICU) remains unknown. METHODS: We investigated whether PANDORA is associated with 30-, 90-, and 180-day mortality in critically ill surgical patients by performing logistic regressions, controlling for age, sex, race, body mass index, macronutrient deficit, and length of stay. The areas under the receiver operating characteristic curves (AUC) of PANDORA vs Acute Physiology and Chronic Health Evaluation (APACHE) II scores for mortality at each time point were also compared. RESULTS: A total of 312 patients comprised the analytic cohort. PANDORA was associated with mortality at 30 (OR 1.08; 95% CI 1.04-1.13; P < .001), 90 (OR 1.09; 95% CI 1.03-1.12; P < .001), and 180 days (OR 1.10; 95% CI 1.06-1.15; P < .001). PANDORA and APACHE II were comparable for mortality prediction at 30 (AUC: 0.69, 95% CI 0.62-0.76 vs 0.74, 95% CI 0.67-0.81; P = .29), 90 (AUC: 0.71, 95% CI 0.63-0.77 vs 0.74, 95% CI 0.67-0.80; P = .52), and 180 days (AUC: 0.73, 95% CI 0.67-0.79 vs 0.75, 95% CI 0.69-0.81; P = .66). CONCLUSION: In surgical ICU patients, PANDORA was associated with mortality and was comparable with APACHE II for mortality prediction at 30, 90, and 180 days after initiation of care. Prospective studies are needed to assess whether nutrition support, stratified by PANDORA scores, can improve outcomes in surgical ICU patients.


Subject(s)
Critical Illness/mortality , Nutritional Status , Postoperative Care , APACHE , Adult , Aged , Female , Hospitalization , Humans , Intensive Care Units , Male , Middle Aged , Nutrition Therapy/methods , Retrospective Studies , Risk Assessment , Severity of Illness Index
14.
J Am Coll Nutr ; 38(2): 171-178, 2019 02.
Article in English | MEDLINE | ID: mdl-30398960

ABSTRACT

OBJECTIVE: Food frequency questionnaires (FFQs) estimate habitual dietary intake and require evaluation in populations of interest in order to determine accuracy. Thus, the purpose of this study was to determine agreement between circulating biomarkers and FFQ-estimated dietary intake in a military population consuming all meals in a dining facility over 12 weeks. METHODS: The 2014 Block FFQ was administered and fasted blood samples were drawn to assess nutritional biomarkers at the end of a 12-week training period in male (n = 141) and female (n = 125) Marine recruits undergoing initial military training. FFQ estimates of alpha- and beta-carotene, folate, and fruit and vegetable intake and circulating concentrations of serum alpha- and beta-carotene and serum and erythrocyte folate were measured. Partial correlations were used in the full model, and weighted kappa coefficients were used to determine agreement between ranking quartiles of dietary intake estimates with corresponding biomarker status quartiles. RESULTS: Serum and dietary intake of alpha-carotene were positively associated in males (p = 0.009) and females (p < 0.001), as were serum and intake of beta-carotene (males, p = 0.002; females, p < 0.001). Alpha-carotene was positively associated with vegetable intake in males (p = 0.02) and beta-carotene with vegetable intake in females (p = 0.003). Serum folate in males (p = 0.002) and erythrocyte folate in females (p = 0.02) were associated with dietary folate intake. In females, the relationships between biomarker and dietary estimates yielded significant kappa coefficients. In males, a significant kappa coefficient was observed only for erythrocyte folate and dietary intake of folate. The kappa coefficient for serum and estimated intake of beta-carotene was not significant in males. CONCLUSION: Twelve-week habitual intake of alpha- and beta-carotene and folate was correlated with circulating biomarkers in a military training population.
The 2014 Block FFQ was able to accurately rank females into quartiles of nutrient status based on intake, while males were ranked less accurately than females.


Subject(s)
Carotenoids/blood , Folic Acid/blood , Military Personnel/statistics & numerical data , Nutritional Status , Vegetables , beta Carotene/blood , Biomarkers/blood , Diet/statistics & numerical data , Diet Surveys , Erythrocytes/chemistry , Female , Humans , Male , Surveys and Questionnaires , Young Adult