Results 1 - 20 of 109

1.
Pediatr Exerc Sci ; 36(1): 2-7, 2024 Feb 01.
Article in English | MEDLINE | ID: mdl-37343946

ABSTRACT

PURPOSE: A decline in youth running was observed at the start of the COVID-19 pandemic. We investigated whether the resumption of organized running after social distancing restrictions changed running habits or injury frequency in adolescent runners. METHODS: Adolescents (age = 16.1 [2.1] y) who participated in long-distance running activities completed an online survey in the Spring and Fall of 2020. Participants self-reported average weekly running habits and whether they sustained an injury during the Fall 2020 season. Poisson regression models and 1-way analysis of variance compared running habits while Fisher exact test compared differences in frequencies of injuries during Fall 2020 among season statuses (full, delayed, and canceled). RESULTS: All runners, regardless of season status, increased weekly distance during Fall 2020. Only runners with a full Fall 2020 season ran more times per week and more high-intensity runs per week compared with their Spring 2020 running habits. There were no differences in running volume or running-related injury frequency among Fall 2020 season statuses. CONCLUSIONS: There were no significant differences in running-related injury (RRI) frequency among runners, regardless of season status, following the resumption of cross-country. Health care providers may need to prepare for runners to increase running volume and intensity following the resumption of organized team activities.


Subject(s)
COVID-19 , Running , Humans , Adolescent , Pandemics , Prospective Studies , Risk Factors , Habits
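
The Poisson-regression comparison used in this study is straightforward to reproduce. Below is a minimal sketch, assuming hypothetical survey data and column names (runs_per_week, season_status); exponentiated coefficients are rate ratios against the reference season status:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical survey responses: weekly run counts by Fall 2020 season status.
df = pd.DataFrame({
    "runs_per_week": [6, 7, 5, 4, 5, 3, 4, 6, 5, 4, 3, 5],
    "season_status": ["full", "full", "full", "delayed", "delayed", "delayed",
                      "canceled", "canceled", "canceled", "full", "delayed", "canceled"],
})

# Poisson GLM with season status as a categorical predictor of a count outcome.
model = smf.glm("runs_per_week ~ C(season_status)", data=df,
                family=sm.families.Poisson()).fit()
# Exponentiated coefficients are rate ratios relative to the reference level.
print(np.exp(model.params))
```
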
2.
J Clin Densitom ; 26(3): 101370, 2023.
Article in English | MEDLINE | ID: mdl-37100686

ABSTRACT

INTRODUCTION/BACKGROUND: Trabecular bone score (TBS) is an indirect measurement of bone quality and microarchitecture determined from dual-energy X-ray absorptiometry (DXA) imaging of the lumbar spine. TBS predicts fracture risk independent of bone mass/density, suggesting this assessment of bone quality adds value to the understanding of patients' bone health. While lean mass and muscular strength have been associated with higher bone density and lower fracture risk among older adults, the literature is limited regarding the relationship of lean mass and strength with TBS. The purpose of this study was to determine associations of DXA-determined total body and trunk lean mass, maximal muscular strength, and gait speed as a measure of physical function, with TBS in 141 older adults (65-84 yr, 72.5 ± 5.1 yr, 74% women). METHODOLOGY: Assessments included lumbar spine (L1-L4) bone density and total body and trunk lean mass by DXA, lower body (leg press) and upper body (seated row) strength by one repetition maximum tests, hand grip strength, and usual gait speed. TBS was derived from the lumbar spine DXA scan. Multivariable linear regression determined the contribution of proposed predictors to TBS. RESULTS: After adjusting for age, sex, and lumbar spine bone density, upper body strength significantly predicted TBS (unadjusted/adjusted R² = 0.16/0.11, β coefficient = 0.378, p = 0.005), while total body lean mass index showed a trend in the expected direction (β coefficient = 0.243, p = 0.053). Gait speed and grip strength were not associated with TBS (p > 0.05). CONCLUSION: Maximum strength of primarily back muscles measured as the seated row appears important to bone quality as measured by TBS, independent of bone density. Additional research on exercise training targeting back strength is needed to determine its clinical utility in preventing vertebral fractures among older adults.


Subject(s)
Fractures, Bone , Osteoporotic Fractures , Humans , Female , Aged , Male , Cancellous Bone/diagnostic imaging , Hand Strength , Bone Density , Absorptiometry, Photon/methods , Lumbar Vertebrae/diagnostic imaging , Lumbar Vertebrae/physiology
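
A minimal sketch of the multivariable regression design described above: TBS modeled on upper-body strength while adjusting for age, sex, and lumbar spine BMD. All data and variable names here are hypothetical (randomly generated), so the fitted coefficients are illustrative only:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 141  # sample size reported in the abstract
df = pd.DataFrame({
    "tbs": rng.normal(1.30, 0.10, n),      # trabecular bone score (unitless)
    "row_1rm_kg": rng.normal(55, 12, n),   # seated-row one-repetition maximum
    "age": rng.uniform(65, 84, n),
    "female": rng.integers(0, 2, n),
    "ls_bmd": rng.normal(1.05, 0.15, n),   # lumbar spine BMD (g/cm^2)
})

# TBS on strength, adjusted for age, sex, and lumbar spine BMD.
fit = smf.ols("tbs ~ row_1rm_kg + age + female + ls_bmd", data=df).fit()
print(fit.params["row_1rm_kg"], fit.rsquared, fit.rsquared_adj)
```
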
3.
Br J Sports Med ; 55(6): 305-318, 2021 Mar.
Article in English | MEDLINE | ID: mdl-33122252

ABSTRACT

Despite the worldwide popularity of running as a sport for children, relatively little is known about its impact on injury and illness. Available studies have focused on adolescent athletes, but these findings may not be applicable to preadolescent and pubescent athletes. To date, there are no evidence or consensus-based guidelines identifying risk factors for injury and illness in youth runners, and current recommendations regarding suitable running distances for youth runners at different ages are opinion based. The International Committee Consensus Work Group convened to evaluate the current science, identify knowledge gaps, categorise risk factors for injury/illness and provide recommendations regarding training, nutrition and participation for youth runners.


Subject(s)
Running/injuries , Running/physiology , Adolescent , Age Factors , Body Mass Index , Body Size , Bone and Bones/physiology , Child , Death, Sudden, Cardiac/etiology , Foot/physiology , Humans , Muscle Strength , Nutritional Requirements , Physical Conditioning, Human/adverse effects , Physical Conditioning, Human/methods , Risk Factors , Sex Factors , Shoes , Stress, Mechanical
4.
J Sports Sci ; 39(23): 2727-2734, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34313528

ABSTRACT

Running-related injuries are prevalent in adolescent long-distance runners. The aim of our retrospective study was to compare differences in sport specialization, running habits, quality of life, and sleep habits among middle- and high-school long-distance runners of different injury statuses. Middle- and high-school long-distance runners across the United States were recruited via cross-country coaches and athletic directors between January and May 2020. Participants completed an online survey with questions related to demographics, sport specialization, running habits, quality of life, sleep, and self-reported injury history. Overall, 306 participants completed the survey (male = 107, female = 176, unspecified = 23; age = 15.7 ± 1.1 years). Of the participants, 178 (58.2%) reported no history of injury, 101 (33.0%) reported a previous injury, and 27 (8.8%) reported a current injury. Middle- and high-school runners with a current injury reported significantly lower overall health (p = .01) and average distance per run (p = .05) than uninjured runners. No significant differences were found between injury statuses for sport specialization level, quality of life, sleep habits, or running volume (p > .05). Runners with a self-reported previous or current injury do not appear to be classified as highly specialized runners more frequently than uninjured runners.


Subject(s)
Quality of Life , Running , Adolescent , Female , Humans , Male , Retrospective Studies , Schools , Specialization , United States
5.
J Pediatr Orthop ; 41(8): 507-513, 2021 Sep 01.
Article in English | MEDLINE | ID: mdl-34397783

ABSTRACT

BACKGROUND: There is significant emerging evidence that early sport specialization is a potential risk factor for injury in youth sports. Despite basketball being the most popular youth team sport in the United States, sport specialization research, specifically in youth basketball players, has been limited. The purpose of this paper was to examine the association of sport specialization behaviors with injury history by surveying a nationally representative sample of parents of youth basketball athletes. We hypothesized that athletes who specialized in basketball, participated on multiple teams at the same time, and traveled regularly for basketball competitions would be more likely to report a basketball-related injury in the previous year. METHODS: A nationally representative sample of 805 parents of 805 youth basketball players (female N=241, 29.9%; age: 12.9 ± 2.5 y) completed an online questionnaire that had 3 sections: (1) parent/child demographics, (2) child basketball participation information for the previous year, and (3) child basketball injury history in the previous year. Multivariate logistic regression examined the associations between variables of interest and injury history, adjusting for covariates. Odds ratios (ORs) with 95% confidence intervals (95% CI) were calculated for the variables of interest from the logistic regression model. RESULTS: Highly specialized athletes were more likely than low specialization athletes to report a history of basketball injury in the previous year [OR (95% CI): 2.47 (1.25-4.88), P=0.009]. The odds of reporting an injury in the previous year were twice as great among athletes who played on a basketball team at the same time as another sport team compared with those who played basketball only [OR (95% CI): 1.98 (1.30-3.01), P=0.001]. The odds of reporting an injury in the previous year were 3 times greater among athletes who received private coaching compared with those who did not [OR (95% CI): 2.91 (1.97-4.31), P<0.001]. CONCLUSION: Specialization in basketball, along with several other behaviors that have become typical of modern youth sport participation, was associated with reported injury history. Further prospective research is necessary to determine whether sport specialization behaviors increase the risk of injury in youth basketball. LEVEL OF EVIDENCE: Level III-cross-sectional study.


Subject(s)
Athletic Injuries , Basketball , Cumulative Trauma Disorders , Youth Sports , Adolescent , Athletic Injuries/epidemiology , Athletic Injuries/etiology , Child , Cross-Sectional Studies , Female , Humans , Risk Factors , Specialization , United States/epidemiology
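
The reported odds ratios follow the standard logistic-regression construction: OR = exp(β) and 95% CI = exp(β ± 1.96·SE). A short worked example that recovers the study's headline figure, assuming only that the interval was computed symmetrically on the log-odds scale:

```python
import math

# Reverse-engineer the logistic coefficient from the reported OR 2.47 (1.25-4.88).
beta = math.log(2.47)
# A symmetric CI on the log scale implies SE = (ln(upper) - ln(lower)) / (2 * 1.96).
se = (math.log(4.88) - math.log(1.25)) / (2 * 1.96)

ci_low = math.exp(beta - 1.96 * se)
ci_high = math.exp(beta + 1.96 * se)
print(f"OR = {math.exp(beta):.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
# -> OR = 2.47, 95% CI (1.25, 4.88)
```
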
6.
J Strength Cond Res ; 35(4): 1141-1148, 2021 Apr 01.
Article in English | MEDLINE | ID: mdl-30335714

ABSTRACT

ABSTRACT: Luedke, LE, Heiderscheit, BC, Williams, DSB, and Rauh, MJ. Factors associated with self-selected step rate in high school cross country runners. J Strength Cond Res 35(4): 1141-1148, 2021-Recommendations for step rate, or cadence, during distance running come from varying perspectives including performance, running economy, and injury risk. Studies of adult runners suggest that running experience and leg length may influence step rate, but limited evidence is available on factors that influence adolescent runners' step rates. The purpose was to evaluate relationships between running experience, anthropometric factors, and lower extremity muscle strength with self-selected step rate in adolescent runners. Sixty-eight high school cross country runners (47 young women; age 16.2 ± 1.3 years) reported height, body mass, and running experience. Mean step rate was assessed at 3.3 m·s⁻¹ and self-selected (mean 3.8 ± 0.5 m·s⁻¹) speeds. Leg length and peak isometric strength of the hip abductors, knee extensors, and flexors were also measured. Step rates at 3.3 m·s⁻¹ {r (95% confidence interval [CI]) = 0.44 [0.22, 0.61], p < 0.001} and self-selected (r [95% CI] = 0.45 [0.20, 0.66], p < 0.001) speeds were correlated with running experience. Step rates at 3.3 m·s⁻¹ and self-selected speeds were inversely associated with body mass (r [95% CI] = -0.32 [-0.52, -0.09], p = 0.007 and r [95% CI] = -0.34 [-0.53, -0.11], p = 0.005, respectively), height (r [95% CI] = -0.40 [-0.58, -0.18], p = 0.01 and r [95% CI] = -0.32 [-0.52, -0.09], p = 0.008, respectively), and leg length (r [95% CI] = -0.48 [-0.64, -0.27], p < 0.001 and r [95% CI] = -0.35 [-0.52, -0.12], p = 0.004, respectively). No significant relationships were found between isometric strength values and step rate at either speed (p > 0.05). Adolescent runners with greater running experience displayed higher step rates. Hence, the lower step rates in runners with less experience may be a factor in the higher injury risk previously reported in novice runners. Runners with shorter leg length displayed higher step rates. Step rate recommendations should consider runner experience and anthropometrics.


Subject(s)
Running , Adolescent , Adult , Biomechanical Phenomena , Female , Humans , Knee , Knee Joint , Lower Extremity , Schools
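
The bracketed correlation intervals above can be reproduced with the Fisher z-transform, a standard construction (assumed here; the paper may have used another method): z = atanh(r), SE = 1/√(n − 3), back-transformed with tanh.

```python
import math

def corr_ci(r: float, n: int) -> tuple[float, float]:
    """95% CI for a correlation coefficient via the Fisher z-transform."""
    z, se = math.atanh(r), 1.0 / math.sqrt(n - 3)
    return math.tanh(z - 1.96 * se), math.tanh(z + 1.96 * se)

# r = 0.44 between step rate at 3.3 m/s and running experience, n = 68 runners.
low, high = corr_ci(0.44, 68)
print(f"r = 0.44, 95% CI [{low:.2f}, {high:.2f}]")  # close to the reported [0.22, 0.61]
```
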
7.
J Arthroplasty ; 35(3): 683-689, 2020 03.
Article in English | MEDLINE | ID: mdl-31801659

ABSTRACT

BACKGROUND: Racial disparities in functional outcomes after total knee arthroplasty (TKA) exist. Whether differences in rehabilitation utilization contribute to these disparities remains to be investigated. METHODS: Among 8349 women enrolled in the prospective Women's Health Initiative cohort who underwent primary TKA between 2006 and 2013, rehabilitation utilization was determined through linked Medicare claims data. Postacute discharge destination (home, skilled nursing facility, and inpatient rehabilitation facility), facility length of stay, and number of home health physical therapy (HHPT) and outpatient physical therapy (OPPT) sessions were compared between racial groups. RESULTS: Non-Hispanic black women had worse physical function (median score, 65 vs 70) and higher likelihood of disability (13.2% vs 6.9%) than non-Hispanic white women before surgery. After TKA, black women were more likely to be discharged postacutely to an institutional facility (64.3% vs 54.5%) than white women, were more likely to receive HHPT services (52.6% vs 47.8%), and received more HHPT and OPPT sessions. After stratification by postacute discharge setting, the likelihood of receipt of HHPT or OPPT services was similar between racial groups. No significant difference in receipt of HHPT or OPPT services was found after use of propensity score weighting to balance health and medical characteristics indicating severity of need for physical therapy services. CONCLUSION: Rehabilitation utilization was generally comparable between black and white women who received TKA when accounting for need. There was no evidence of underutilization of post-TKA rehabilitation services, and thus disparities in post-TKA functional outcomes do not appear to be a result of inequitable receipt of rehabilitation care.


Subject(s)
Arthroplasty, Replacement, Knee , Healthcare Disparities , Aged , Black People , Female , Humans , Medicare , Patient Discharge , Prospective Studies , Skilled Nursing Facilities , United States , White People
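
A minimal sketch of propensity-score (inverse-probability) weighting of the kind described above, with entirely hypothetical data and variable names: model group membership from covariates, weight each subject by the inverse probability of the group they are in, then compare weighted outcome proportions:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 1000
covariates = rng.normal(size=(n, 3))   # health/medical characteristics
group = rng.integers(0, 2, n)          # 1 = comparison group, 0 = referent
received_pt = rng.integers(0, 2, n)    # outcome: received PT services

# 1) Propensity score: probability of group membership given covariates.
ps = LogisticRegression().fit(covariates, group).predict_proba(covariates)[:, 1]

# 2) Inverse-probability weights: 1/ps in the group, 1/(1 - ps) in the referent.
w = np.where(group == 1, 1.0 / ps, 1.0 / (1.0 - ps))

# 3) Covariate-balanced comparison of weighted outcome proportions.
p1 = np.average(received_pt[group == 1], weights=w[group == 1])
p0 = np.average(received_pt[group == 0], weights=w[group == 0])
print(f"weighted PT receipt: {p1:.3f} vs {p0:.3f}")
```
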
8.
J Strength Cond Res ; 32(6): 1692-1701, 2018 Jun.
Article in English | MEDLINE | ID: mdl-28930873

ABSTRACT

Brumitt, J, Heiderscheit, B, Manske, R, Niemuth, PE, Mattocks, A, and Rauh, MJ. Preseason functional test scores are associated with future sports injury in female collegiate athletes. J Strength Cond Res 32(6): 1692-1701, 2018-Recent prospective cohort studies have reported preseason functional performance test (FPT) measures and associations with future risk of injury; however, the findings associated with these studies have been equivocal. The purpose of this study was to determine the ability of a battery of FPTs as a preseason screening tool to identify female Division III (D III) collegiate athletes who may be at risk for a noncontact time-loss injury to the lower quadrant (LQ = low back and lower extremities). One hundred six female D III athletes were recruited for this study. Athletes performed 3 FPTs: standing long jump (SLJ), single-leg hop (SLH) for distance, and the lower extremity functional test (LEFT). Time-loss sport-related injuries were tracked during the season. Thirty-two (24 initial and 8 subsequent) time-loss LQ injuries were sustained during the study. Ten of the 24 initial injuries occurred at the thigh and knee. At-risk athletes with suboptimal FPT measures (SLJ ≤79% ht; SLH ≤64% ht; LEFT ≥118 seconds) had significantly greater rates of initial (7.2 per 1,000 athletic exposures [AEs]) and total (7.6 per 1,000 AEs) time-loss thigh or knee injuries than the referent group (0.9 per 1,000 AEs; 1.0 per 1,000 AEs, respectively). At-risk athletes were 9 times more likely to experience a thigh or knee injury (odds ratio [OR] = 9.7, confidence interval [CI]: 2.3-39.9; p = 0.002) than athletes in the referent group. At-risk athletes with a history of LQ sports injury and lower off-season training habits had an 18-fold increased risk of a time-loss thigh or knee injury during the season (adjusted OR = 18.7, CI: 3.0-118.1; p = 0.002). This battery of FPTs appears useful as a tool for identifying female D III athletes at risk of an LQ injury, especially to the thigh or knee region.


Subject(s)
Athletic Injuries/epidemiology , Exercise Test , Knee Injuries/epidemiology , Thigh/injuries , Adolescent , Back Injuries/epidemiology , Female , Humans , Knee Joint/physiopathology , Lumbosacral Region/injuries , Male , Odds Ratio , Prospective Studies , Risk Assessment/methods , United States/epidemiology , Universities , Young Adult
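
The exposure-adjusted rates above are simple arithmetic: injuries divided by athletic exposures, scaled to 1,000 AEs. The counts below are hypothetical, chosen to mirror the reported 7.2 vs 0.9 per 1,000 AEs:

```python
def rate_per_1000(injuries: int, athletic_exposures: int) -> float:
    """Injuries per 1,000 athletic exposures (AEs)."""
    return 1000.0 * injuries / athletic_exposures

at_risk = rate_per_1000(9, 1250)    # hypothetical: 9 injuries over 1,250 AEs
referent = rate_per_1000(2, 2250)   # hypothetical: 2 injuries over 2,250 AEs
print(f"{at_risk:.1f} vs {referent:.1f} per 1,000 AEs; "
      f"rate ratio {at_risk / referent:.1f}")
```
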
9.
Int J Sport Nutr Exerc Metab ; 26(1): 17-25, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26252427

ABSTRACT

Early detection of the female athlete triad is essential for the long-term health of adolescent female athletes. The purpose of this study was to assess relationships between common anthropometric markers (ideal body weight [IBW] via the Hamwi formula, youth-percentile body mass index [BMI], adult BMI categories, and body fat percentage [BF%]) and triad components (low energy availability [EA] measured by dietary restraint [DR], menstrual dysfunction [MD], and low bone mineral density [BMD]). In the sample (n = 320) of adolescent female athletes (age 15.9 ± 1.2 y), Spearman's rho correlations and multiple logistic regression analyses evaluated associations between anthropometric clinical cutoffs and triad components. All underweight categories for the anthropometric measures predicted greater likelihood of MD and low BMD. Athletes with an IBW <85% were nearly 4 times more likely to report MD (OR = 3.7, 95% CI [1.8, 7.9]) and to have low BMD (OR = 4.1, 95% CI [1.2, 14.2]). Those below the 5th percentile for age-specific BMI were 9 times more likely to report MD (OR = 9.1, 95% CI [1.8, 46.9]) and to have low BMD than those in the 50th to 85th percentile. Athletes with a high BF% were almost 3 times more likely to report DR (OR = 2.8, 95% CI [1.4, 6.1]). Our study indicates that low age-adjusted BMI and low IBW may serve as evidence-based clinical indicators that can be practically evaluated in the field to predict MD and low BMD in adolescents. These measures should be tested for their utility as tools to minimize risk for the triad.


Subject(s)
Body Mass Index , Body Weight , Female Athlete Triad Syndrome/epidemiology , Adolescent , Body Composition , Bone Density , Cross-Sectional Studies , Feeding Behavior , Female , Humans , Menstruation Disturbances/epidemiology , Risk Factors
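
The Hamwi estimate of ideal body weight used above has a common statement (assumed here): 100 lb for the first 60 inches of height plus 5 lb per additional inch for women, 106 lb plus 6 lb per inch for men, with percent IBW = weight/IBW × 100 and values below 85% flagged as underweight. A small sketch with a hypothetical athlete:

```python
def hamwi_ibw_lb(height_in: float, female: bool) -> float:
    """Ideal body weight (lb) under a common statement of the Hamwi formula."""
    base, per_inch = (100.0, 5.0) if female else (106.0, 6.0)
    return base + per_inch * max(height_in - 60.0, 0.0)

def percent_ibw(weight_lb: float, height_in: float, female: bool) -> float:
    return 100.0 * weight_lb / hamwi_ibw_lb(height_in, female)

# Hypothetical athlete: 5'5" (65 in), 105 lb.
pct = percent_ibw(105.0, 65.0, female=True)
flag = " -> below the 85% underweight cutoff" if pct < 85 else ""
print(f"{pct:.0f}% of IBW{flag}")  # 84% of IBW -> below the 85% underweight cutoff
```
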
10.
J Sport Rehabil ; 25(3): 219-26, 2016 Aug.
Article in English | MEDLINE | ID: mdl-25946403

ABSTRACT

CONTEXT: The Lower-Extremity Functional Test (LEFT) has been used to assess readiness to return to sport after a lower-extremity injury. Current recommendations suggest that women should complete the LEFT in 135 s (average; range 120-150 s) and men should complete the test in 100 s (average; range 90-125 s). However, these estimates are based on limited data and may not be reflective of college athletes. Thus, additional assessment, including normative data, of the LEFT in sport populations is warranted. OBJECTIVE: To examine LEFT times based on descriptive information and off-season training habits in NCAA Division III (DIII) athletes. In addition, this study prospectively examined the LEFT's ability to discriminate sport-related injury occurrence. DESIGN: Descriptive epidemiology. SETTING: DIII university. SUBJECTS: 189 DIII college athletes (106 women, 83 men) from 15 teams. MAIN OUTCOME MEASURES: LEFT times, preseason questionnaire, and time-loss injuries during the sport season. RESULTS: Men completed the LEFT (105 ± 9 s) significantly faster than their female counterparts (117 ± 10 s) (P < .0001). Female athletes who reported >3-5 h/wk of plyometric training during the off-season had significantly slower LEFT scores than those who performed ≤3 h/wk of plyometric training (P = .03). The overall incidence of a lower-quadrant (LQ) time-loss injury for female athletes was 4.5/1000 athletic exposures (AEs) and 3.7/1000 AEs for male athletes. Female athletes with slower LEFT scores (≥118 s) experienced a higher rate of LQ time-loss injuries than those with faster LEFT scores (≤117 s) (P = .03). CONCLUSION: Only off-season plyometric training practices seem to affect LEFT score times among female athletes. Women with slower LEFT scores are more likely to be injured than those with faster LEFT scores. Injury rates in men were not influenced by performance on the LEFT.


Subject(s)
Athletic Injuries/etiology , Athletic Performance/physiology , Leg Injuries/etiology , Lower Extremity/injuries , Plyometric Exercise , Return to Sport/physiology , Adolescent , Adult , Athletic Injuries/epidemiology , Athletic Injuries/physiopathology , Athletic Injuries/prevention & control , Female , Humans , Leg Injuries/epidemiology , Leg Injuries/physiopathology , Leg Injuries/prevention & control , Lower Extremity/physiopathology , Male , Prospective Studies , Reference Values , Risk Factors , Time Factors , United States/epidemiology , Universities , Young Adult
11.
J Shoulder Elbow Surg ; 24(7): 1005-13, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25957112

ABSTRACT

BACKGROUND: Approximately 6 million youngsters play organized baseball yearly, and injuries are common. Definition of risk factors for injuries in the throwing shoulder has largely been confined to the professional thrower. Unfortunately, these risk factors apply to only 1% of pitchers at risk for injury. Risk factors for injury in youth pitchers have received far less attention than those in more mature professional pitchers. Development of such an understanding would help clarify injury prevention efforts for the other 99% of pitchers actively participating in competitive baseball. This study aimed to determine the ability of range of motion (ROM) measures to predict arm injuries in baseball pitchers aged 8 to 18 years. METHODS: Supine passive shoulder ROM was assessed in 115 pitchers with a digital inclinometer. Two trials of ROM were measured before the season. Arm injuries were prospectively tracked. Receiver operating characteristic curves were used to identify athletes who were at high risk for injury. Statistical significance was set a priori (α = .05). RESULTS: There were 33 injured and 82 uninjured pitchers. Side-to-side differences of >15° in horizontal adduction and >13° in internal rotation may identify adolescent pitchers at 4 and 6 times greater risk of injury, respectively. CONCLUSION: Preseason ROM differences were able to identify those adolescents at high risk for injury during the season. It appears that the risk profile for adolescent pitchers includes horizontal adduction differences that differ from the established prospective profile in adult pitchers.


Subject(s)
Athletic Injuries/prevention & control , Baseball/injuries , Range of Motion, Articular/physiology , Risk Assessment/methods , Shoulder Joint/physiology , Adolescent , Athletic Injuries/physiopathology , Child , Humans , Male , Prospective Studies , Risk Factors , Shoulder Injuries
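
A generic sketch of the ROC-based cutoff selection described above, using hypothetical ROM data and the Youden index (J = sensitivity + specificity − 1) as the selection rule; the study's exact procedure may differ:

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(2)
# 33 injured and 82 uninjured pitchers, as in the abstract; ROM values hypothetical.
injured = np.r_[np.ones(33), np.zeros(82)]
rom_diff_deg = np.r_[rng.normal(17, 5, 33), rng.normal(10, 5, 82)]

fpr, tpr, thresholds = roc_curve(injured, rom_diff_deg)
best = np.argmax(tpr - fpr)  # Youden's J: maximize sensitivity + specificity - 1
print(f"suggested cutoff: >{thresholds[best]:.0f} degrees side-to-side difference")
```
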
12.
Pediatr Phys Ther ; 27(2): 126-33, 2015.
Article in English | MEDLINE | ID: mdl-25695196

ABSTRACT

PURPOSE: To examine relationships among age, gender, anthropometrics, and dynamic balance. METHODS: Height, weight, and arm and foot length were measured in 160 children with typical development aged 5 to 12 years. Dynamic balance was assessed using the Timed Up and Go (TUG) test, Pediatric Reach Test (PRT), and Pediatric Balance Scale (PBS). RESULTS: Moderate to good positive relationships (r = 0.61 and r = 0.56) were found between increasing age and PRT and PBS scores. A fair negative relationship (r = -0.49) was observed between age and TUG test. No significant gender-by-age group difference was observed. Age had the strongest influence on TUG and PBS scores; arm length had the strongest influence on PRT scores. CONCLUSIONS: Dynamic balance ability is directly related to chronological age. Age and arm length have the strongest relationships with balance scores. These findings may assist pediatric therapists in selecting dynamic balance tests according to age rather than specific diagnosis.


Subject(s)
Body Weights and Measures , Physical Therapy Modalities , Postural Balance , Age Factors , Child , Child, Preschool , Female , Humans , Male , Sex Factors
13.
J Orthop Sports Phys Ther ; 54(2): 1-13, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37970801

ABSTRACT

OBJECTIVE: To summarize and describe risk factors for running-related injuries (RRIs) among high school and collegiate cross-country runners. DESIGN: Descriptive systematic review. LITERATURE SEARCH: Four databases (Scopus, SPORTDiscus, CINAHL, Cochrane) were searched from inception to August 2023. STUDY SELECTION CRITERIA: Studies assessing RRI risk factors in high school or collegiate runners using a prospective design with at least 1 season of follow-up were included. DATA SYNTHESIS: Results across each study for a given risk factor were summarized and described. The NOS and GRADE frameworks were used to evaluate quality of each study and certainty of evidence for each risk factor. RESULTS: Twenty-four studies were included. Overall, study quality and certainty of evidence were low to moderate. Females or runners with prior RRI or increased RED-S (relative energy deficiency in sport) risk factors were most at risk for RRI, as were runners with a quadriceps angle of >20° and lower step rates. Runners with weaker thigh muscle groups had increased risk of anterior knee pain. Certainty of evidence regarding training, sleep, and specialization was low, but suggests that changes in training volume, poorer sleep, and increased specialization may increase RRI risk. CONCLUSION: The strongest predictors of RRI in high school and collegiate cross-country runners were sex and RRI history, which are nonmodifiable. There was moderate certainty that increased RED-S risk factors increased RRI risk, particularly bone stress injuries. There was limited evidence that changes in training and sleep quality influenced RRI risk, but these are modifiable factors that should be studied further in this population. J Orthop Sports Phys Ther 2024;54(2):1-13. Epub 16 November 2023. doi:10.2519/jospt.2023.11550.


Subject(s)
Running , Female , Humans , Prospective Studies , Risk Factors , Running/injuries , Knee Joint/physiology , Schools
14.
J Athl Train ; 2024 May 22.
Article in English | MEDLINE | ID: mdl-38775113

ABSTRACT

CONTEXT: Research that has examined the association between specialization and injury in basketball has been limited to cross-sectional or retrospective studies. OBJECTIVE: To determine whether specialization is a risk factor for injury among high school basketball athletes. DESIGN: Prospective cohort study. SETTING: Basketball players from 12 high schools participating in the National Athletic Treatment, Injury, and Outcomes Network Surveillance Program (NATION-SP) were recruited prior to the 2022-2023 interscholastic basketball season. PATIENTS OR OTHER PARTICIPANTS: 130 athletes (mean age (SD) = 15.6 (1.3); girls' basketball: n=68 (52.3%)). MAIN OUTCOME MEASURES: Participants completed a questionnaire prior to the start of their school basketball season that had questions regarding participation in various specialized sport behaviors. During the basketball season, the school's athletic trainer reported all athletic exposures (AEs) and injuries (regardless of time loss) for participating athletes into NATION-SP. Injury incidence (IR) and incidence rate ratios (IRR) with 95% confidence intervals [95%CI] were calculated for the specialized sport behaviors previously described. RESULTS: There was no difference in injury risk between highly specialized and low specialized athletes (IRR [95%CI]: 1.9 [0.9, 3.7]). Players who participated in basketball year-round were twice as likely to sustain an injury compared to those who did not play year-round (IRR [95%CI]: 2.1 [1.1, 3.6]). Similarly, players who reported participating in basketball skills camps were at increased risk of injury compared to athletes who did not participate in basketball skill camps (IRR [95%CI]: 2.5 [1.2, 5.7]). CONCLUSION: Injury risk related to sport specialization in basketball may be specific to certain behaviors such as year-round play and participation in skills camps. Validated measures of comprehensive sport activity are needed to better measure specialization in youth sports to better determine injury risk related to sport specialization and develop injury prevention programs for basketball athletes.
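
The incidence rate ratios above follow the standard construction: with injury counts a and b over exposures T1 and T2, IRR = (a/T1)/(b/T2) and SE(ln IRR) = √(1/a + 1/b). A worked example with hypothetical counts chosen to land near the reported 2.1:

```python
import math

def irr_ci(a: int, t1: float, b: int, t2: float) -> tuple[float, float, float]:
    """Incidence rate ratio and 95% CI for counts a, b over exposures t1, t2."""
    irr = (a / t1) / (b / t2)
    se = math.sqrt(1 / a + 1 / b)
    return irr, irr * math.exp(-1.96 * se), irr * math.exp(1.96 * se)

# Hypothetical: 30 injuries in 4,000 AEs (year-round) vs 15 in 4,200 AEs (not).
irr, lo, hi = irr_ci(30, 4000, 15, 4200)
print(f"IRR {irr:.1f} [{lo:.1f}, {hi:.1f}]")
```
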

15.
PM R ; 2024 May 31.
Article in English | MEDLINE | ID: mdl-38818973

ABSTRACT

BACKGROUND: Injury characteristics of high school track and field throwing athletes in the United States are not well studied. Understanding the epidemiology of injuries is important for identifying treatment and prevention strategies. OBJECTIVE: To describe injury rates and patterns in U.S. high school track and field throwing events from a longitudinal national sports injury surveillance system. DESIGN: Descriptive epidemiology study. SETTING: Data were provided by the National High School Sports Related Injury Surveillance System, High School RIO (Reporting Information Online). METHODS: Athletic trainers reported injury and exposure data through the High School RIO website on a weekly basis. An athlete exposure (AE) was defined as one athlete participating in one school-sanctioned practice or competition. Throwing events of discus, shot put, and javelin were analyzed in this study. MAIN OUTCOME MEASURES: Injury rates, rate ratios (RR), and injury proportion ratios (IPR). PARTICIPANTS: U.S. high school athletes. RESULTS: A total of 267 track and field throwing injuries occurred during 5,486,279 AEs. Overall, the rate of injuries in competition was higher than in practice (RR 1.35, 95% confidence interval [CI] 1.01-1.80). In practice, the rate of injuries was higher for girls than boys (RR 1.53, 95% CI 1.12-2.08). The most frequently injured body part was the shoulder (21.7%), followed by the ankle (16.5%) and knee (12.0%). The most common types of injury were muscle strains (26.1%) and ligament sprains (25.0%). Recurrent injuries accounted for a higher proportion of chronic injuries compared to new injuries (IPR 1.85, 95% CI 1.16-2.97). CONCLUSION: This study described injury characteristics of high school track and field throwing athletes from 2008 to 2019. Based on our results, injury prevention may be particularly important for female throwers with prior injury.

16.
J Sci Med Sport ; 26(6): 285-290, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37248163

ABSTRACT

OBJECTIVES: This study evaluated pathways to low energy availability in a sample of female adolescent athletes (n = 464). DESIGN: Cross-sectional. METHODS: Participants (age 13-18 y) underwent assessments for height, weight, eating attitudes and behaviors, and menstrual function. Bone mineral density and body composition were evaluated by dual-energy x-ray absorptiometry in a subset of participants (n = 209). Athletes were classified with clinical indicators of low energy availability if they met criteria for 1) primary or secondary amenorrhea or 2) clinical underweight status (body mass index-for-age < 5th percentile). Disordered eating was assessed using the Eating Disorder Examination Questionnaire. RESULTS: Thirty (6.5%) athletes exhibited clinical indicators of low energy availability, with higher estimates in leanness than non-leanness sports (10.9% vs. 2.1%, p < 0.005). Among athletes with clinical indicators of low energy availability, 80% (n = 24) did not meet criteria for disordered eating, eating disorder, or report the desire to lose weight. Athletes with (vs. without) clinical indicators of low energy availability exhibited lower lumbar spine (-1.30 ± 1.38 vs. -0.07 ± 1.21, p < 0.001) and total body (-0.30 ± 0.98 vs. 0.53 ± 0.97, p < 0.006) bone mineral density Z-scores. CONCLUSIONS: A majority of female adolescent athletes with clinical indicators of low energy availability did not exhibit characteristics consistent with intentional dietary restriction, supporting the significance of the inadvertent pathway to low energy availability and need for increased nutrition education in this population.


Subject(s)
Feeding and Eating Disorders , Sports , Female , Adolescent , Humans , Cross-Sectional Studies , Amenorrhea/epidemiology , Bone Density , Athletes , Absorptiometry, Photon
17.
Clin J Sport Med ; 22(2): 116-21, 2012 Mar.
Article in English | MEDLINE | ID: mdl-22343967

ABSTRACT

OBJECTIVE: Incidence rate (IR) of an ipsilateral or contralateral injury after anterior cruciate ligament reconstruction (ACLR) is unknown. The hypotheses were that the IR of anterior cruciate ligament (ACL) injury after ACLR would be greater than the IR in an uninjured cohort of athletes and would be greater in female athletes after ACLR than male athletes. DESIGN: Prospective case-control study. SETTING: Regional sports community. PARTICIPANTS: Sixty-three subjects who had ACLR and were ready to return to sport (RTS) and 39 control subjects. INDEPENDENT VARIABLES: Second ACL injury and sex. MAIN OUTCOME MEASURES: Second ACL injury and athletic exposure (AE) was tracked for 12 months after RTS. Sixteen subjects after ACLR and 1 control subject suffered a second ACL injury. Between- and within-group comparisons of second ACL injury rates (per 1000 AEs) were conducted. RESULTS: The IR of ACL injury after ACLR (1.82/1000 AE) was 15 times greater [risk ratio (RR) = 15.24; P = 0.0002] than that of control subjects (0.12/1000 AE). Female ACLR athletes demonstrated a 16 times greater rate of injury (RR = 16.02; P = 0.0002) than female control subjects. Female athletes were 4 times (RR = 3.65; P = 0.05) more likely to suffer a second ACL injury and 6 times (RR = 6.21; P = 0.04) more likely to suffer a contralateral injury than male athletes. CONCLUSIONS: An increased rate of second ACL injury after ACLR exists in athletes when compared with a healthy population. Female athletes suffer contralateral ACL injuries at a higher rate than male athletes and seem to suffer contralateral ACL injuries more frequently than graft re-tears. The identification of a high-risk group within a population of ACLR athletes is a critical step to improve outcome after ACLR and RTS.


Subject(s)
Anterior Cruciate Ligament Injuries , Anterior Cruciate Ligament/surgery , Athletic Injuries/epidemiology , Athletic Injuries/surgery , Adolescent , Case-Control Studies , Female , Humans , Incidence , Male , Prospective Studies , Recurrence , Risk Assessment , Sex Factors
18.
Mil Med ; 177(7): 845-9, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22808893

ABSTRACT

Many U.S. Marines have experienced routine combat deployments during Operation Iraqi Freedom, which present numerous occupational hazards that may result in low back pain (LBP). The objective of this retrospective cohort study was to identify new-onset LBP among Marines following initial deployment to Operation Iraqi Freedom. Active duty Marines deployed to Iraq or Kuwait between 2005 and 2008 were identified from deployment records and linked to medical databases (n = 36,680). The outcome of interest was an International Classification of Diseases, 9th Revision, Clinical Modification code indicating LBP (724.2) within 1 year postdeployment. Multivariate logistic regression examined the effect of occupation on LBP. Overall, 4.1% (n = 1,517) of Marines were diagnosed with LBP. After adjusting for covariates, the service/supply (odds ratio 1.33, 95% confidence interval, 1.12-1.59) and electrical/mechanical/craftsworker occupations (odds ratio 1.31, 95% confidence interval, 1.12-1.53) had higher odds of LBP when compared to the administrative/other referent group. Within these groups, the highest LBP prevalence was in the construction (8.6%) and law enforcement (6.2%) subgroups. Although infantry occupations purposefully engage the enemy and often face sustained physical rigors of combat, LBP was most prevalent in noninfantry occupations. Future studies should include detailed exposure histories to elucidate occupation-specific etiologies of LBP in order to guide prevention efforts.


Subject(s)
Low Back Pain/epidemiology , Military Personnel/statistics & numerical data , Occupational Diseases/epidemiology , Confidence Intervals , Construction Industry , Humans , Iraq War, 2003-2011 , Law Enforcement , Logistic Models , Male , Multivariate Analysis , Odds Ratio , Prevalence , Retrospective Studies , Transportation , United States/epidemiology
19.
Sports (Basel) ; 10(3), 2022 Mar 21.
Article in English | MEDLINE | ID: mdl-35324654

ABSTRACT

Trunk muscle endurance has been theorized to play a role in running kinematics and lower extremity injury. However, the evidence examining the relationships between static trunk endurance tests, such as plank tests, and lower extremity injury in athletes is conflicting. The purpose of this study was to assess if collegiate cross country and track-and-field athletes with shorter pre-season prone and side plank hold times would have a higher incidence of lower extremity time-loss overuse injury during their competitive sport seasons. During the first week of their competitive season, 75 NCAA Division III uninjured collegiate cross country and track-and-field athletes (52% female; mean age 20.0 ± 1.3 years) performed three trunk endurance plank tests. Hold times for prone plank (PP), right-side plank (RSP) and left-side plank (LSP) were recorded in seconds. Athletes were followed prospectively during the season for lower extremity overuse injury that resulted in limited or missed practices or competitions. Among the athletes, 25 (33.3%) experienced a lower extremity overuse injury. There were no statistically significant mean differences or associations found between PP, RSP or LSP plank test hold times (seconds) and occurrence of lower extremity overuse injury. In isolation, plank hold times appear to have limited utility as a screening test in collegiate track-and-field and cross country athletes.

20.
Orthop J Sports Med ; 10(1): 23259671211068079, 2022 Jan.
Article in English | MEDLINE | ID: mdl-35111863

ABSTRACT

BACKGROUND: Track and field (T&F) athletes compete in a variety of events that require different skills and training characteristics. Descriptive epidemiology studies often fail to describe event-specific injury patterns. PURPOSE: To describe the epidemiology of injuries in National Collegiate Athletic Association (NCAA) T&F by sex, setting (practice vs competition), and time of season (indoor vs outdoor) and to compare injury patterns by events within the sport. STUDY DESIGN: Descriptive epidemiology study. METHODS: Data were obtained from the NCAA Injury Surveillance Program for all indoor and outdoor T&F injuries during the academic years 2009-2010 to 2013-2014. Injury rates, injury rate ratios, and injury proportion ratios (IPRs) were reported and compared by sex, injury setting, season, and event. Analysis included time-loss as well as non-time-loss injuries. RESULTS: Over the 5 seasons, the overall injury rate was 3.99 injuries per 1000 athletic exposures (95% CI, 3.79-4.20). After controlling for injury diagnoses, women's T&F athletes experienced an 18% higher risk of injury (95% CI, 7% to 31%) and missed 41% more time after an injury (95% CI, 4% to 93%) when compared with men. Among all athletes, the injury risk during competition was 71% higher (95% CI, 50% to 95%) compared with practice and required 59% more time loss (95% CI, 7% to 135%). Distance running accounted for a significantly higher proportion of overuse injuries (IPR, 1.70; 95% CI, 1.40-2.05; P < .05) and required 168% more time loss (95% CI, 78% to 304%) than other events. The hip and thigh were the body regions most commonly injured; injury type, however, varied by T&F event. Sprinting accounted for the greatest proportion of hip and thigh injuries, distance running had the greatest proportion of lower leg injuries, and throwing reported the greatest proportion of spine and upper extremity injuries. CONCLUSION: Injury risk in NCAA T&F varied by sex, season, and setting. Higher injury rates were found in women versus men, indoor versus outdoor seasons, and competitions versus practices. The hip and thigh were the body regions most commonly injured; however, injury types varied by event. These findings may provide insight to programs aiming to reduce the risk of injury and associated time loss in collegiate T&F.
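
The injury proportion ratios (IPRs) reported here compare the share of injuries of a given type between events: IPR = p1/p2 with p = x/n, and SE(ln IPR) = √(1/x1 − 1/n1 + 1/x2 − 1/n2) on the log scale. A worked example with hypothetical counts chosen to land near the reported 1.70:

```python
import math

def ipr_ci(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float, float]:
    """Ratio of two proportions (x1/n1) / (x2/n2) with a 95% log-scale CI."""
    ipr = (x1 / n1) / (x2 / n2)
    se = math.sqrt(1 / x1 - 1 / n1 + 1 / x2 - 1 / n2)
    return ipr, ipr * math.exp(-1.96 * se), ipr * math.exp(1.96 * se)

# Hypothetical: 170 of 400 distance-running injuries were overuse,
# vs 250 of 1,000 injuries across all other events.
ipr, lo, hi = ipr_ci(170, 400, 250, 1000)
print(f"IPR {ipr:.2f} [{lo:.2f}, {hi:.2f}]")
```
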
