Results 1 - 20 of 113
1.
J Athl Train ; 2024 May 22.
Article in English | MEDLINE | ID: mdl-38775113

ABSTRACT

CONTEXT: Research that has examined the association between specialization and injury in basketball has been limited to cross-sectional or retrospective studies. OBJECTIVE: To determine whether specialization is a risk factor for injury among high school basketball athletes. DESIGN: Prospective cohort study. SETTING: Basketball players from 12 high schools participating in the National Athletic Treatment, Injury, and Outcomes Network Surveillance Program (NATION-SP) were recruited prior to the 2022-2023 interscholastic basketball season. PATIENTS OR OTHER PARTICIPANTS: 130 athletes (mean age (SD) = 15.6 (1.3); girls' basketball: n=68 (52.3%)). MAIN OUTCOME MEASURES: Participants completed a questionnaire prior to the start of their school basketball season that had questions regarding participation in various specialized sport behaviors. During the basketball season, the school's athletic trainer reported all athletic exposures (AEs) and injuries (regardless of time loss) for participating athletes into NATION-SP. Injury incidence (IR) and incidence rate ratios (IRR) with 95% confidence intervals [95%CI] were calculated for the specialized sport behaviors previously described. RESULTS: There was no difference in injury risk between highly specialized and low specialized athletes (IRR [95%CI]: 1.9 [0.9, 3.7]). Players who participated in basketball year-round were twice as likely to sustain an injury compared to those who did not play year-round (IRR [95%CI]: 2.1 [1.1, 3.6]). Similarly, players who reported participating in basketball skills camps were at increased risk of injury compared to athletes who did not participate in basketball skill camps (IRR [95%CI]: 2.5 [1.2, 5.7]). CONCLUSION: Injury risk related to sport specialization in basketball may be specific to certain behaviors such as year-round play and participation in skills camps. 
Validated measures of comprehensive sport activity are needed to better measure specialization in youth sports to better determine injury risk related to sport specialization and develop injury prevention programs for basketball athletes.
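The incidence rate ratios reported above can be reproduced with the standard log-normal approximation for rates per athletic exposure. The sketch below uses hypothetical injury counts and exposure totals, not the study's data.

```python
import math

def incidence_rate_ratio(cases_exposed, ae_exposed, cases_unexposed, ae_unexposed):
    """IRR with a 95% CI via the log-normal approximation.

    cases_* are injury counts; ae_* are athletic exposures (person-time).
    """
    irr = (cases_exposed / ae_exposed) / (cases_unexposed / ae_unexposed)
    se_log = math.sqrt(1 / cases_exposed + 1 / cases_unexposed)
    lo = math.exp(math.log(irr) - 1.96 * se_log)
    hi = math.exp(math.log(irr) + 1.96 * se_log)
    return irr, lo, hi

# Hypothetical counts, not the study's data:
irr, lo, hi = incidence_rate_ratio(30, 5000, 15, 5200)
print(f"IRR = {irr:.1f} [95% CI {lo:.1f}, {hi:.1f}]")
```

A CI that excludes 1.0, as for the year-round-play and skills-camp comparisons above, is what marks the ratio as statistically significant.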

2.
PM R ; 2024 May 31.
Article in English | MEDLINE | ID: mdl-38818973

ABSTRACT

BACKGROUND: Injury characteristics of high school track and field throwing athletes in the United States are not well studied. Understanding epidemiology of injuries is important to identify treatment and prevention strategies. OBJECTIVE: To describe injury rates and patterns in U.S. high school track and field throwing events from a longitudinal national sports injury surveillance system. DESIGN: Descriptive epidemiology study. SETTING: Data were provided by the National High School Sports Related Injury Surveillance System, High School RIO (Reporting Information Online). METHODS: Athletic trainers reported injury and exposure data through the High School RIO website on a weekly basis. An athlete exposure (AE) was defined as one athlete participating in one school-sanctioned practice or competition. Throwing events of discus, shot put, and javelin were analyzed in this study. MAIN OUTCOME MEASURES: Injury rate, rate ratios (RR), injury proportion ratios (IPR). PARTICIPANTS: U.S. high school athletes. RESULTS: A total of 267 track and field throwing injuries occurred during 5,486,279 AEs. Overall, the rate of injuries in competition was higher than in practice (RR 1.35, 95% confidence interval [CI] 1.01-1.80). In practice, the rate of injuries was higher for girls than boys (RR 1.53, 95% CI 1.12-2.08). The most frequently injured body part was the shoulder (21.7%), followed by the ankle (16.5%) and knee (12.0%). The most common types of injury were muscle strains (26.14%) and ligament sprains (25%). Recurrent injuries accounted for a higher proportion of chronic injuries compared to new injuries (IPR 1.85, 95% CI 1.16-2.97). CONCLUSION: This study described injury characteristics of high school track and field throwing athletes from 2008 to 2019. Based on our results, injury prevention may be particularly important for female throwers with prior injury.

3.
Pediatr Exerc Sci ; 36(1): 2-7, 2024 Feb 01.
Article in English | MEDLINE | ID: mdl-37343946

ABSTRACT

PURPOSE: A decline in youth running was observed at the start of the COVID-19 pandemic. We investigated whether the resumption of organized running after social distancing restrictions changed running habits or injury frequency in adolescent runners. METHODS: Adolescents (age = 16.1 [2.1] y) who participated in long-distance running activities completed an online survey in the Spring and Fall of 2020. Participants self-reported average weekly running habits and whether they sustained an injury during the Fall 2020 season. Poisson regression models and 1-way analysis of variance compared running habits while Fisher exact test compared differences in frequencies of injuries during Fall 2020 among season statuses (full, delayed, and canceled). RESULTS: All runners, regardless of season status, increased weekly distance during Fall 2020. Only runners with a full Fall 2020 season ran more times per week and more high-intensity runs per week compared with their Spring 2020 running habits. There were no differences in running volume or running-related injury frequency among Fall 2020 season statuses. CONCLUSIONS: There were no significant differences in running-related injury (RRI) frequency among runners, regardless of season status, following the resumption of cross-country. Health care providers may need to prepare for runners to increase running volume and intensity following the resumption of organized team activities.


Subject(s)
COVID-19 , Running , Humans , Adolescent , Pandemics , Prospective Studies , Risk Factors , Habits
4.
J Orthop Sports Phys Ther ; 54(2): 1-13, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37970801

ABSTRACT

OBJECTIVE: To summarize and describe risk factors for running-related injuries (RRIs) among high school and collegiate cross-country runners. DESIGN: Descriptive systematic review. LITERATURE SEARCH: Four databases (Scopus, SPORTDiscus, CINAHL, Cochrane) were searched from inception to August 2023. STUDY SELECTION CRITERIA: Studies assessing RRI risk factors in high school or collegiate runners using a prospective design with at least 1 season of follow-up were included. DATA SYNTHESIS: Results across each study for a given risk factor were summarized and described. The NOS and GRADE frameworks were used to evaluate quality of each study and certainty of evidence for each risk factor. RESULTS: Twenty-four studies were included. Overall, study quality and certainty of evidence were low to moderate. Females or runners with prior RRI or increased RED-S (relative energy deficiency in sport) risk factors were most at risk for RRI, as were runners with a quadriceps angle of >20° and lower step rates. Runners with weaker thigh muscle groups had increased risk of anterior knee pain. Certainty of evidence regarding training, sleep, and specialization was low, but suggests that changes in training volume, poorer sleep, and increased specialization may increase RRI risk. CONCLUSION: The strongest predictors of RRI in high school and collegiate cross-country runners were sex and RRI history, which are nonmodifiable. There was moderate certainty that increased RED-S risk factors increased RRI risk, particularly bone stress injuries. There was limited evidence that changes in training and sleep quality influenced RRI risk, but these are modifiable factors that should be studied further in this population. J Orthop Sports Phys Ther 2024;54(2):1-13. Epub 16 November 2023. doi:10.2519/jospt.2023.11550.


Subject(s)
Running , Female , Humans , Prospective Studies , Risk Factors , Running/injuries , Knee Joint/physiology , Schools
5.
J Sci Med Sport ; 26(6): 285-290, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37248163

ABSTRACT

OBJECTIVES: This study evaluated pathways to low energy availability in a sample of female adolescent athletes (n = 464). DESIGN: Cross-sectional. METHODS: Participants (age 13-18 y) underwent assessments for height, weight, eating attitudes and behaviors, and menstrual function. Bone mineral density and body composition were evaluated by dual-energy x-ray absorptiometry in a subset of participants (n = 209). Athletes were classified with clinical indicators of low energy availability if they met criteria for 1) primary or secondary amenorrhea or 2) clinical underweight status (body mass index-for-age < 5th percentile). Disordered eating was assessed using the Eating Disorder Examination Questionnaire. RESULTS: Thirty (6.5%) athletes exhibited clinical indicators of low energy availability, with higher estimates in leanness than non-leanness sports (10.9% vs. 2.1%, p < 0.005). Among athletes with clinical indicators of low energy availability, 80% (n = 24) did not meet criteria for disordered eating, eating disorder, or report the desire to lose weight. Athletes with (vs. without) clinical indicators of low energy availability exhibited lower lumbar spine (-1.30 ± 1.38 vs. -0.07 ± 1.21, p < 0.001) and total body (-0.30 ± 0.98 vs. 0.53 ± 0.97, p < 0.006) bone mineral density Z-scores. CONCLUSIONS: A majority of female adolescent athletes with clinical indicators of low energy availability did not exhibit characteristics consistent with intentional dietary restriction, supporting the significance of the inadvertent pathway to low energy availability and need for increased nutrition education in this population.


Subject(s)
Feeding and Eating Disorders , Sports , Female , Adolescent , Humans , Cross-Sectional Studies , Amenorrhea/epidemiology , Bone Density , Athletes , Absorptiometry, Photon
6.
J Clin Densitom ; 26(3): 101370, 2023.
Article in English | MEDLINE | ID: mdl-37100686

ABSTRACT

INTRODUCTION/BACKGROUND: Trabecular bone score (TBS) is an indirect measurement of bone quality and microarchitecture determined from dual-energy X-ray absorptiometry (DXA) imaging of the lumbar spine. TBS predicts fracture risk independent of bone mass/density, suggesting this assessment of bone quality adds value to the understanding of patients' bone health. While lean mass and muscular strength have been associated with higher bone density and lower fracture risk among older adults, the literature is limited regarding the relationship of lean mass and strength with TBS. The purpose of this study was to determine associations of DXA-determined total body and trunk lean mass, maximal muscular strength, and gait speed as a measure of physical function, with TBS in 141 older adults (65-84 yr, 72.5 ± 5.1 yr, 74% women). METHODOLOGY: Assessments included lumbar spine (L1-L4) bone density and total body and trunk lean mass by DXA, lower body (leg press) and upper body (seated row) strength by one repetition maximum tests, hand grip strength, and usual gait speed. TBS was derived from the lumbar spine DXA scan. Multivariable linear regression determined the contribution of proposed predictors to TBS. RESULTS: After adjusting for age, sex, and lumbar spine bone density, upper body strength significantly predicted TBS (unadjusted/adjusted R2 = 0.16/0.11, β coefficient = 0.378, p = 0.005), while total body lean mass index showed a trend in the expected direction (β coefficient = 0.243, p = 0.053). Gait speed and grip strength were not associated with TBS (p > 0.05). CONCLUSION: Maximum strength of primarily back muscles measured as the seated row appears important to bone quality as measured by TBS, independent of bone density. Additional research on exercise training targeting back strength is needed to determine its clinical utility in preventing vertebral fractures among older adults.
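An adjusted multivariable model of the kind described (TBS predicted from upper body strength while controlling for age, sex, and lumbar spine bone density) can be sketched with ordinary least squares. All values below are synthetic and the variable names and effect sizes are fabricated for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 141
# Synthetic covariates loosely modeled on the cohort description (not real data)
age = rng.uniform(65, 84, n)
sex = rng.integers(0, 2, n)            # 1 = female
bmd = rng.normal(1.0, 0.15, n)         # lumbar spine bone density, g/cm^2
row_1rm = rng.normal(60, 15, n)        # seated row one-repetition maximum, kg
# Fabricated generating model: strength contributes 0.002 TBS units per kg
tbs = 1.2 + 0.002 * row_1rm + 0.1 * bmd - 0.003 * (age - 70) + rng.normal(0, 0.05, n)

# Design matrix with intercept; the strength coefficient is adjusted for the others
X = np.column_stack([np.ones(n), age, sex, bmd, row_1rm])
beta, *_ = np.linalg.lstsq(X, tbs, rcond=None)
print("adjusted strength coefficient:", round(beta[4], 4))
```

The recovered coefficient on `row_1rm` is the adjusted association, analogous to the β = 0.378 (standardized) reported above.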


Subject(s)
Fractures, Bone , Osteoporotic Fractures , Humans , Female , Aged , Male , Cancellous Bone/diagnostic imaging , Hand Strength , Bone Density , Absorptiometry, Photon/methods , Lumbar Vertebrae/diagnostic imaging , Lumbar Vertebrae/physiology
7.
J Clin Med ; 11(21)2022 Nov 01.
Article in English | MEDLINE | ID: mdl-36362725

ABSTRACT

Despite its positive influence on physical and mental wellbeing, running is associated with a high incidence of musculoskeletal injury. Potential modifiable risk factors for running-related injury have been identified, including running biomechanics. Gait retraining is used to address these biomechanical risk factors in injured runners. While recent systematic reviews of biomechanical risk factors for running-related injury and gait retraining have been conducted, there is a lack of information surrounding the translation of gait retraining for injured runners into clinical settings. Gait retraining studies in patients with patellofemoral pain syndrome have shown a decrease in pain and increase in functionality through increasing cadence, decreasing hip adduction, transitioning to a non-rearfoot strike pattern, increasing forward trunk lean, or a combination of some of these techniques. This literature suggests that gait retraining could be applied to the treatment of other injuries in runners, although there is limited evidence to support this specific to other running-related injuries. Components of successful gait retraining to treat injured runners with running-related injuries are presented.

8.
Int J Sports Phys Ther ; 17(6): 1053-1062, 2022.
Article in English | MEDLINE | ID: mdl-36237650

ABSTRACT

Background: Female collegiate cross-country (XC) runners have a high incidence of running-related injury (RRI). Limited reports are available that have examined potential intrinsic factors that may increase RRI risk in this population. Purpose: To examine the relationships between RRI, hip muscle strength, and lower extremity running kinematics in female collegiate XC runners. Study Design: Prospective observational cohort. Methods: Participants included twenty female NCAA collegiate XC runners from Southern California universities who competed in the 2019-20 intercollegiate season. A pre-season questionnaire was used to gather demographic information. Hip muscle strength was measured with isokinetic dynamometry in a sidelying open-chain position and normalized by the runner's body weight (kg). Running kinematic variables were examined using Qualisys 3D Motion Capture and Visual 3D analysis. RRI occurrence was obtained via post-season questionnaires. Independent t-tests were used to determine mean differences between injured and non-injured runners for hip abductor muscle strength and selected running kinematics. Pearson correlation coefficients were calculated to examine relationships between hip muscle performance and kinematic variables. Results: End-of-the-season RRI information was gathered from 19 of the 20 participants. During the 2019-20 XC season, 57.9% (11 of 19) of the runners sustained an RRI. There were no significant differences between mean hip abductor normalized muscle strength (p=0.76) or mean normalized hip muscle strength asymmetry (p=0.18) of injured and non-injured runners during the XC season. Similarly, no significant differences were found between mean values of selected kinematic variables of runners who did and who did not report an RRI. 
Moderate relationships were found between hip abductor strength variables and right knee adduction at footstrike (r=0.50), maximum right knee adduction during stance (r=0.55), left supination at footstrike (r=0.48), right peak pronation during stance (r=-0.47), left supination at footstrike (r=0.51), and right peak pronation during stance (r=-0.54) (all p≤0.05). Conclusions: Hip abduction muscle strength, hip abduction strength asymmetry, and selected running kinematic variables were not associated with elevated risk of RRI in female collegiate XC runners.

9.
Gait Posture ; 98: 266-270, 2022 10.
Article in English | MEDLINE | ID: mdl-36209689

ABSTRACT

BACKGROUND: Lower cadence has been previously associated with injury in long-distance runners. Variations in cadence may be related to experience, speed, and anthropometric variables. It is unknown what factors, if any, predict cadence in healthy youth long-distance runners. RESEARCH QUESTION: Are demographic, anthropometric, and/or biomechanical variables able to predict cadence in healthy youth long-distance runners? METHODS: A cohort of 138 uninjured youth long-distance runners (M = 62, F = 76; Mean ± SD; age = 13.7 ± 2.7 yrs; mass = 47.9 ± 13.6 kg; height = 157.9 ± 14.5 cm; running volume = 19.2 ± 20.6 km/wk; running experience: males = 3.5 ± 2.1 yrs, females = 3.3 ± 2.0 yrs) were recruited for the study. Multiple linear regression (MLR) models were developed for the total sample and for each sex independently that only included variables that were significantly correlated to self-selected cadence. A variance inflation factor (VIF) assessed multicollinearity of variables. If VIF ≥ 5, variable(s) were removed and the MLR analysis was conducted again. RESULTS: For all models, VIF was > 5 between speed and normalized stride length, therefore we removed normalized stride length from all models. Only leg length and speed were significantly correlated (p < .001) with cadence in the regression models for the total sample (R2 = 51.9%) and females (R2 = 48.2%). The regression model for all participants was Cadence = -1.251 * Leg Length + 3.665 * Speed + 254.858. The regression model for females was Cadence = -1.190 * Leg Length + 3.705 * Speed + 249.688. For males, leg length, speed, and running experience were significantly predictive (p < .001) of cadence in the model (R2 = 54.7%). The regression model for males was Cadence = -1.268 * Leg Length + 3.471 * Speed - 1.087 * Running Experience + 261.378. SIGNIFICANCE: Approximately 50% of the variance in cadence was explained by the individual's leg length and running speed. 
Shorter leg lengths and faster running speeds were associated with higher cadence. For males, fewer years of running experience was associated with a higher cadence.
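The reported regression equations can be applied directly as prediction functions. The sketch below assumes leg length in centimeters and speed in meters per second, units the abstract does not state explicitly.

```python
def predicted_cadence_total(leg_length, speed):
    """Total-sample model from the abstract (assumed units: cm, m/s; steps/min)."""
    return -1.251 * leg_length + 3.665 * speed + 254.858

def predicted_cadence_male(leg_length, speed, experience_yrs):
    """Male model, which additionally subtracts for years of running experience."""
    return -1.268 * leg_length + 3.471 * speed - 1.087 * experience_yrs + 261.378

# Hypothetical runner: 80 cm leg length, 3.0 m/s self-selected speed
cadence = predicted_cadence_total(80, 3.0)  # ≈ 165.8 steps/min
```

Note the signs match the abstract's summary: longer legs lower predicted cadence, faster speeds raise it, and (for males) more experience lowers it.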


Subject(s)
Leg , Running , Male , Female , Humans , Adolescent , Child , Biomechanical Phenomena , Running/injuries , Anthropometry , Linear Models
10.
Orthop J Sports Med ; 10(9): 23259671221122356, 2022 Sep.
Article in English | MEDLINE | ID: mdl-36147792

ABSTRACT

Background: Tibial stress fracture (SFx) is the most common SFx of the lower extremity. Presently, diagnostic accuracy of clinical examination techniques for tibial SFx remains suboptimal. Purpose: To assess the diagnostic effectiveness of 5 clinical tests for tibial SFx individually versus a test item cluster. Study Design: Cohort study (diagnosis); Level of evidence, 3. Methods: A total of 50 patients with tibial pain (17 with bilateral symptoms) were assessed with 5 clinical examination tests (tibial fulcrum test, focal tenderness to palpation, heel percussion test, therapeutic ultrasound test, and 128-Hz tuning fork test) before they underwent diagnostic imaging (radionuclide bone scan). The application of the clinical tests was counterbalanced to minimize the likelihood of carryover effects. Patients provided a pain rating immediately before and after the application of each clinical test. Results: The prevalence of tibial SFx among the study participants was 52.2%. High levels of specificity were produced by the therapeutic ultrasound test (93.8%), tuning fork test (90.6%), and percussion test (90.6%). The fulcrum test had moderate to high specificity (84.4%). All tests demonstrated low levels of sensitivity, with the highest levels found for focal tenderness to palpation (48.6%) and fulcrum (45.7%). The fulcrum test provided the highest positive likelihood ratio (2.93), followed by the therapeutic ultrasound test (2.30). The fulcrum test had the lowest negative likelihood ratio (0.64), with the focal tenderness to palpation and tuning fork tests having negative likelihood ratios >1.0. Combinations of these clinical tests did not improve the prediction of tibial SFx above that observed among the individual tests. Conclusion: The clinical tests evaluated were generally highly specific, but all had low sensitivity. The fulcrum test provided the highest level of diagnostic accuracy; however, it was inadequate for definitive clinical management. 
Combining tests did not improve the diagnostic accuracy of tibial SFx.
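The reported metrics all follow from a standard 2×2 diagnostic table. In the sketch below, the counts are reconstructed so that the outputs reproduce the fulcrum test's reported figures (sensitivity 45.7%, specificity 84.4%, LR+ 2.93, LR− 0.64); the abstract does not give the actual table, so these counts are illustrative.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and likelihood ratios from a 2x2 table.

    tp/fn: test-positive/negative limbs with SFx on bone scan
    fp/tn: test-positive/negative limbs without SFx
    """
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    lr_positive = sensitivity / (1 - specificity)
    lr_negative = (1 - sensitivity) / specificity
    return sensitivity, specificity, lr_positive, lr_negative

# Reconstructed counts for the fulcrum test (illustrative, not the study's table)
sens, spec, lr_pos, lr_neg = diagnostic_metrics(tp=16, fp=5, fn=19, tn=27)
```

A high-specificity, low-sensitivity test like this is better at ruling in tibial SFx when positive than at ruling it out when negative, which is consistent with the authors' conclusion.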

11.
Sports (Basel) ; 10(3)2022 Mar 21.
Article in English | MEDLINE | ID: mdl-35324654

ABSTRACT

Trunk muscle endurance has been theorized to play a role in running kinematics and lower extremity injury. However, the evidence examining the relationships between static trunk endurance tests, such as plank tests, and lower extremity injury in athletes is conflicting. The purpose of this study was to assess if collegiate cross country and track-and-field athletes with shorter pre-season prone and side plank hold times would have a higher incidence of lower extremity time-loss overuse injury during their competitive sport seasons. During the first week of their competitive season, 75 NCAA Division III uninjured collegiate cross country and track-and-field athletes (52% female; mean age 20.0 ± 1.3 years) performed three trunk endurance plank tests. Hold times for prone plank (PP), right-side plank (RSP) and left-side plank (LSP) were recorded in seconds. Athletes were followed prospectively during the season for lower extremity overuse injury that resulted in limited or missed practices or competitions. Among the athletes, 25 (33.3%) experienced a lower extremity overuse injury. There were no statistically significant mean differences or associations found between PP, RSP or LSP plank test hold times (seconds) and occurrence of lower extremity overuse injury. In isolation, plank hold times appear to have limited utility as a screening test in collegiate track-and-field and cross country athletes.

12.
Orthop J Sports Med ; 10(1): 23259671211068079, 2022 Jan.
Article in English | MEDLINE | ID: mdl-35111863

ABSTRACT

BACKGROUND: Track and field (T&F) athletes compete in a variety of events that require different skills and training characteristics. Descriptive epidemiology studies often fail to describe event-specific injury patterns. PURPOSE: To describe the epidemiology of injuries in National Collegiate Athletic Association (NCAA) T&F by sex, setting (practice vs competition), and time of season (indoor vs outdoor) and to compare injury patterns by events within the sport. STUDY DESIGN: Descriptive epidemiology study. METHODS: Data were obtained from the NCAA Injury Surveillance Program for all indoor and outdoor T&F injuries during the academic years 2009-2010 to 2013-2014. Injury rates, injury rate ratios, and injury proportion ratios (IPRs) were reported and compared by sex, injury setting, season, and event. Analysis included time-loss as well as no-time loss injuries. RESULTS: Over the 5 seasons, the overall injury rate was 3.99 injuries per 1000 athletic-exposures (95% CI, 3.79-4.20). After controlling for injury diagnoses, women's T&F athletes experienced an 18% higher risk of injury (95% CI, 7% to 31%) and missed 41% more time after an injury (95% CI, 4% to 93%) when compared with men. Among all athletes, the injury risk during competition was 71% higher (95% CI, 50% to 95%) compared with practice and required 59% more time loss (95% CI, 7% to 135%). Distance running accounted for a significantly higher proportion of overuse injuries (IPR, 1.70; 95% CI, 1.40-2.05; P < .05) and required 168% more time loss (95% CI, 78% to 304%) than other events. The hip and thigh were the body regions most commonly injured; injury type, however, varied by T&F event. Sprinting accounted for the greatest proportion of hip and thigh injuries, distance running had the greatest proportion of lower leg injuries, and throwing reported the greatest proportion of spine and upper extremity injuries. CONCLUSION: Injury risk in NCAA T&F varied by sex, season, and setting. 
Higher injury rates were found in women versus men, indoor versus outdoor seasons, and competitions versus practices. The hip and thigh were the body regions most commonly injured; however, injury types varied by event. These findings may provide insight to programs aiming to reduce the risk of injury and associated time loss in collegiate T&F.

13.
J Am Nutr Assoc ; 41(6): 551-558, 2022 08.
Article in English | MEDLINE | ID: mdl-34032561

ABSTRACT

Background: Despite the evidence of an elevated prevalence of low bone mass in adolescent endurance runners, reports on dietary intake in this population are limited. Objectives: This study aimed to evaluate energy availability (EA) and dietary intake among 72 (n = 60 female, n = 12 male) high school cross-country runners. Methods: The sample consisted of a combined dataset of two cohorts. In both cohorts, the Block Food Frequency Questionnaire (FFQ; 2005 & 2014 versions) assessed dietary intake. Fat-free mass was assessed using dual-energy x-ray absorptiometry or bioelectrical impedance analysis. Results: Mean EA was less than recommended (45 kcal/kgFFM/day) among male (35.8 ± 14.4 kcal/kgFFM/day) and female endurance runners (29.6 ± 17.4 kcal/kgFFM/day), with 30.0% of males and 60.0% of females meeting criteria for low EA (<30 kcal/kgFFM/day). Calorie intake for male (2,614.2 ± 861.8 kcal/day) and female (1,879.5 ± 723.6 kcal/day) endurance runners fell below the estimated energy requirement for "active" boys (>3,100 kcal/day) and girls (>2,300 kcal/day). Female endurance runners' relative carbohydrate intake (4.9 ± 2.1 g/kg/day) also fell below recommended levels (6-10 g/kg/day). Male and female endurance runners exhibited below-recommended intakes of calcium, vitamin D, potassium, fruit, vegetables, grains, and dairy. Compared to male endurance runners, female endurance runners demonstrated lower relative intakes of energy (kcal/kg/day), protein (g/kg/day), fat (g/kg/day), fiber, vegetables, total protein, and oils. Conclusion: This study provides evidence of the nutritional risk of adolescent endurance runners and underscores the importance of nutritional support efforts in this population.
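Energy availability as used above is conventionally computed as dietary energy intake minus exercise energy expenditure, divided by fat-free mass. The sketch below applies the abstract's thresholds (45 kcal/kgFFM/day recommended, <30 low) to a hypothetical runner; the intermediate "reduced" band label is my own, not the study's terminology.

```python
def energy_availability(intake_kcal, exercise_kcal, ffm_kg):
    """EA in kcal per kg fat-free mass per day."""
    return (intake_kcal - exercise_kcal) / ffm_kg

def classify_ea(ea):
    """Bands based on the thresholds cited in the abstract (<30 low, 45 recommended)."""
    if ea < 30:
        return "low"
    if ea < 45:
        return "reduced"  # below recommended but above the low-EA criterion
    return "adequate"

# Hypothetical runner: 2000 kcal intake, 600 kcal training expenditure, 42 kg FFM
ea = energy_availability(2000, 600, 42)  # ≈ 33.3 kcal/kgFFM/day
print(classify_ea(ea))
```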


Subject(s)
Energy Intake , Nutritional Status , Adolescent , Eating , Female , Humans , Male , Nutritional Requirements , Vegetables , Vitamins
14.
Phys Sportsmed ; 50(6): 471-477, 2022 12.
Article in English | MEDLINE | ID: mdl-34176442

ABSTRACT

OBJECTIVES: Previous studies of parents of adolescent athletes identified a belief among parents in the importance of early specialization for skill development. However, it is unclear whether these attitudes and beliefs are also held among parents of athletes in baseball, the second-most popular boys' sport in the United States. The purpose of this study was to describe the knowledge, attitudes, and beliefs of parents of Little League baseball players regarding sport specialization and college scholarships. METHODS: Two hundred forty-four parents of Little League baseball players (female parents: 60.7%, parent age: 41.1 ± 6.2 years old, male children: 98.0%, child age: 9.5 ± 1.6 years old) completed an anonymous online questionnaire regarding parent attitudes and beliefs on sport specialization and college scholarships. RESULTS: Most parents (72.4%) felt that specialization would increase their child's baseball ability either 'quite a bit' or 'a great deal.' Fewer than half of all parents (42.0%) reported that specialization was either 'quite a bit' or 'a great deal' of a problem. Parents underestimated the availability of Division I college baseball scholarships (median [IQR]: 5 [4-10]) compared to the actual value of 11.7 scholarships per Division I roster. Only 10.2% of parents (N = 25) reported that they believed it was 'somewhat' or 'very' likely that their child would receive a college baseball scholarship. CONCLUSION: Further efforts are needed to understand parent attitudes and beliefs regarding sport specialization and college scholarships in various sports to better understand current trends in youth sport participation.


Subject(s)
Baseball , Youth Sports , Child , Adolescent , Male , Female , Humans , United States , Adult , Middle Aged , Fellowships and Scholarships , Specialization , Athletes , Parents , Attitude
15.
PM R ; 14(7): 793-801, 2022 07.
Article in English | MEDLINE | ID: mdl-34053194

ABSTRACT

BACKGROUND: Understanding the prevalence and factors associated with running-related injuries in middle school runners may guide injury prevention. OBJECTIVE: To determine the prevalence of running-related injuries and describe factors related to a history of injury. DESIGN: Retrospective cross-sectional study. SETTING: Survey distributed online to middle school runners. METHODS: Participants completed a web-based survey regarding prior running-related injuries, training, sleep, diet, and sport participation. MAIN OUTCOME MEASUREMENTS: Prevalence and characteristics differentiating girls and boys with and without running-related injury history adjusted for age. PARTICIPANTS: Youth runners (total: 2113, average age: 13.2 years; boys: n = 1255, girls: n = 858). RESULTS: Running-related injuries were more prevalent in girls (56% vs. 50%, p = .01). Ankle sprain was the most common injury (girls: 22.5%, boys: 21.6%), followed by patellofemoral pain (20.4% vs. 7.8%) and shin splints (13.6% vs. 5.9%); both were more prevalent in girls (p < .001). Boys more frequently reported plantar fasciitis (5.6% vs. 3.3%, p = .01), iliotibial band syndrome (4.1% vs. 1.4%, p = .001), and Osgood-Schlatter disease (3.8% vs. 1.2%, p = .001). Runners with a history of running-related injuries were older, ran greater average weekly mileage, ran faster, had fewer average hours of sleep on weekends, skipped more meals, missed breakfast, and consumed less milk (all p < .05). Girls with a history of running-related injuries reported higher dietary restraint scores, later age of menarche, more menstrual cycle disturbances, and higher likelihood of following vegetarian diets and an eating disorder diagnosis (all p < .05). Runners with no history of running-related injuries were more likely to have participated in ≥2 years of soccer or basketball (p < .001). CONCLUSIONS: Most middle school runners reported a history of running-related injuries, with certain injuries differing by gender. 
Modifiable factors with the greatest association with running-related injuries included training volume, dietary restraint, skipping meals, and less sleep. Sport sampling, including participation in ball sports, may reduce running-related injury risk in this population.


Subject(s)
Athletic Injuries , Iliotibial Band Syndrome , Adolescent , Athletic Injuries/epidemiology , Cross-Sectional Studies , Female , Humans , Male , Prevalence , Retrospective Studies , Schools
16.
PM R ; 14(9): 1056-1067, 2022 09.
Article in English | MEDLINE | ID: mdl-34251763

ABSTRACT

BACKGROUND: Bone stress injury (BSI) in youth runners is clinically important during times of skeletal growth and is not well studied. OBJECTIVE: To evaluate the prevalence, anatomical distribution, and factors associated with running-related BSI in boy and girl middle school runners. DESIGN: Retrospective cross-sectional study. SETTING: Online survey distributed to middle school runners. METHODS: Survey evaluated BSI history, age, grade, height, weight, eating behaviors, menstrual function, exercise training, and other health characteristics. MAIN OUTCOME MEASUREMENTS: Prevalence and characteristics associated with history of BSI, stratified by cortical-rich (eg, tibia) and trabecular-rich (pelvis and femoral neck) locations. PARTICIPANTS: 2107 runners (n = 1250 boys, n = 857 girls), age 13.2 ± 0.9 years. RESULTS: One hundred five (4.7%) runners reported a history of 132 BSIs, with higher prevalence in girls than boys (6.7% vs 3.8%, p = .004). The most common location was the tibia (n = 51). Most trabecular-rich BSIs (n = 16; 94% of total) were sustained by girls (pelvis: n = 6; femoral neck: n = 6; sacrum: n = 4). In girls, consuming <3 daily meals (odds ratio [OR] = 18.5, 95% confidence interval [CI] = 7.3, 47.4), eating disorder (OR = 9.8, 95% CI = 2.0, 47.0), family history of osteoporosis (OR = 6.9, 95% CI = 2.6, 18.0), and age (OR = 1.6, 95% CI = 1.0, 2.6) were associated with BSI. In boys, family history of osteoporosis (OR = 3.2, 95% CI = 1.2, 8.4), prior non-BSI fracture (OR = 3.2, 95% CI = 1.6, 6.7), and running mileage (OR = 1.1, 95% CI = 1.0, 1.1) were associated with BSI. Participating in soccer or basketball ≥2 years was associated with lower odds of BSI for both sexes. CONCLUSION: Whereas family history of osteoporosis and prior fracture (non-BSI) were most strongly related to BSI in the youth runners, behaviors contributing to an energy deficit, such as eating disorder and consuming <3 meals daily, also emerged as independent factors associated with BSI. 
Although the cross-sectional design limits causal inference, our findings suggest that promoting optimal skeletal health through nutrition, and through participation in other sports such as soccer and basketball, may address factors associated with BSI in this population.
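The odds ratios above come from standard 2×2 contingency-table arithmetic. A minimal sketch of a crude OR with a Wald 95% CI, using hypothetical counts rather than the study's data, might look like:

```python
import math

def odds_ratio_ci(exp_cases, exp_noncases, unexp_cases, unexp_noncases, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table of counts."""
    or_ = (exp_cases * unexp_noncases) / (exp_noncases * unexp_cases)
    # Standard error of log(OR) is the root of the summed reciprocal counts
    se = math.sqrt(1 / exp_cases + 1 / exp_noncases
                   + 1 / unexp_cases + 1 / unexp_noncases)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical: 10 of 40 exposed runners with BSI vs 20 of 200 unexposed
or_, lo, hi = odds_ratio_ci(10, 30, 20, 180)
print(f"OR = {or_:.1f} (95% CI {lo:.1f}, {hi:.1f})")  # OR = 3.0 (95% CI 1.3, 7.0)
```

A confidence interval whose lower bound stays above 1.0, as in the associations reported above, is what marks the exposure as statistically associated with BSI at the 5% level.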


Subject(s)
Osteoporosis , Running , Adolescent , Bone Density , Child , Cross-Sectional Studies , Female , Humans , Male , Prevalence , Retrospective Studies , Running/injuries , Schools
17.
J Sci Med Sport ; 25(3): 272-278, 2022 Mar.
Article in English | MEDLINE | ID: mdl-34756802

ABSTRACT

OBJECTIVES: This study aimed to investigate differences in stance phase pelvic and hip running kinematics based on maturation and sex among healthy youth distance runners. DESIGN: Cross-sectional. METHODS: 133 uninjured youth distance runners (M = 60, F = 73; age = 13.5 ± 2.7 years) underwent a three-dimensional running analysis on a treadmill at a self-selected speed (2.8 ± 0.6 m·s-1). Participants were stratified as pre-pubertal, mid-pubertal, or post-pubertal according to the modified Pubertal Maturational Observation Scale. Stance phase pelvis and hip range of motion (RoM) and peak joint positions were extracted. Two-way ANCOVAs (sex, maturation; covariate of running velocity) were used with the Bonferroni-Holm method to control for multiple comparisons at a target alpha level of 0.05. RESULTS: A two-way interaction between sex and maturation was detected (p = 0.009) for frontal plane pelvic obliquity RoM. Post-hoc analysis identified a maturation main effect only among females (p < 0.008). Pelvic obliquity RoM was significantly greater among post-pubertal compared to pre-pubertal females (p = 0.001). Significant main effects of sex (p = 0.02) and maturation (p = 0.01) were found for hip adduction RoM. Post-hoc analysis indicated a significant increase in hip adduction RoM from pre-pubertal to post-pubertal female runners (p = 0.001). A significant main effect of sex was found for peak hip adduction angle (p = 0.001), with female runners exhibiting greater peak hip adduction than males. CONCLUSIONS: Maturation influences pelvic and hip kinematics more strongly in female than in male runners, and sex differences became more pronounced during later stages of puberty. These differences may correspond to an increased risk for running-related injuries in female runners compared to male runners.
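The Bonferroni-Holm correction referenced above is a step-down procedure: p-values are sorted and tested against successively looser thresholds. A short sketch, with illustrative p-values rather than the study's, might look like:

```python
def holm_bonferroni(p_values, alpha=0.05):
    """Holm step-down: test ordered p-values at alpha/(m - rank)."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, idx in enumerate(order):
        if p_values[idx] <= alpha / (m - rank):
            reject[idx] = True
        else:
            break  # once one test fails, all larger p-values fail too
    return reject

# Smallest p is tested at 0.05/3, the next at 0.05/2, the last at 0.05
print(holm_bonferroni([0.009, 0.03, 0.04]))  # [True, False, False]
```

Note that 0.03 would pass an uncorrected 0.05 threshold but fails the Holm threshold of 0.025, which is how the procedure keeps the family-wise error rate at the target alpha.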


Subject(s)
Hip Joint , Knee Joint , Adolescent , Biomechanical Phenomena , Child , Cross-Sectional Studies , Female , Humans , Male , Pelvis
18.
J Athl Train ; 57(7): 672-677, 2022 Jul 01.
Article in English | MEDLINE | ID: mdl-34902855

ABSTRACT

CONTEXT: Running programs traditionally monitor external loads (eg, time and distance). Recent efforts have encouraged a more comprehensive approach to also account for internal loads (eg, intensity, measured as the session rating of perceived exertion [sRPE]). The combination of external and internal loads accounts for the possible interaction between these loads. Although weekly changes in training loads have been reported between external loads and the combination of external and internal loads during 2- and 4-week training cycles, no authors have indicated whether these differences occur during an entire cross-country season in high school runners. OBJECTIVE: To compare changes in training loads, as measured by (1) external loads and (2) combined external and internal loads in high school runners during an interscholastic cross-country season. DESIGN: Case series. SETTING: Community-based setting with daily online surveys. PATIENTS OR OTHER PARTICIPANTS: Twenty-four high school cross-country runners (females = 14, males = 10, age = 15.9 ± 1.1 years, running experience = 9.9 ± 3.2 years). MAIN OUTCOME MEASURE(S): Week-to-week percentage changes in training load were measured by external loads (time, distance) and combined external and internal loads (time × sRPE [timeRPE] and distance × sRPE [distanceRPE]). RESULTS: Overall, the average weekly change was 7.1% greater for distanceRPE than for distance (P = .04, d = 0.18). When the weekly running duration decreased, we found the average weekly change was 5.2% greater for distanceRPE than for timeRPE (P = .03, d = 0.24). When the weekly running duration was maintained or increased, the average weekly change was 10% to 15% greater when external and internal loads were combined versus external loads alone, but these differences were nonsignificant (P = .11-.22, d = 0.19-0.34). CONCLUSIONS: Progression in the training load may be underestimated when relying solely on external loads. 
The interaction between internal loads (sRPE) and external loads (distance or time) appears to provide a different measure of the training stresses experienced by runners than external loads alone.
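The combined external × internal load described above is simple arithmetic: each session's distance (or time) is multiplied by its sRPE, summed per week, and compared week to week. A sketch with made-up sessions shows how distanceRPE can rise while distance alone is flat:

```python
def session_load(distance_km, srpe):
    """Combined load for one session: external (km) x internal (sRPE, 0-10)."""
    return distance_km * srpe

def weekly_change_pct(prev, curr):
    """Week-to-week percentage change in an accumulated load."""
    return (curr - prev) / prev * 100

# Same 24 km in both weeks, but the sessions felt harder in week 2
week1 = [(8, 4), (10, 6), (6, 3)]   # (distance_km, sRPE) per session
week2 = [(8, 6), (10, 7), (6, 5)]
dist_change = weekly_change_pct(sum(d for d, _ in week1),
                                sum(d for d, _ in week2))
load_change = weekly_change_pct(sum(session_load(d, r) for d, r in week1),
                                sum(session_load(d, r) for d, r in week2))
print(dist_change, round(load_change, 1))  # 0.0 34.5
```

Tracking distance alone would report no progression here, while the combined measure shows a substantial jump, which is the underestimation the abstract warns about.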


Subject(s)
Physical Conditioning, Human , Running , Male , Female , Humans , Adolescent , Physical Exertion , Seasons , Schools
19.
J Athl Train ; 57(9-10): 937-945, 2022 Sep 01.
Article in English | MEDLINE | ID: mdl-36638338

ABSTRACT

BACKGROUND: The incidence of second anterior cruciate ligament (ACL) injury after ACL reconstruction (ACLR) is high in young, active populations. Failure to successfully meet return-to-sport (RTS) criteria may identify adult athletes at risk of future injury; however, these studies have yet to assess skeletally mature adolescent athletes. OBJECTIVE: To determine if failure to meet RTS criteria would identify adolescent and young adult athletes at risk for future ACL injury after ACLR and RTS. The tested hypothesis was that the risk of a second ACL injury after RTS would be lower in participants who met all RTS criteria compared with those who failed to meet all criteria before RTS. DESIGN: Prospective case-cohort (prognosis) study. SETTING: Laboratory. PATIENTS OR OTHER PARTICIPANTS: A total of 159 individuals (age = 17.2 ± 2.6 years, males = 47, females = 112). MAIN OUTCOME MEASURE(S): Participants completed an RTS assessment (quadriceps strength, functional hop tests) and the International Knee Documentation Committee patient survey (0 to 100 scale) after ACLR and were then tracked for occurrence of a second ACL tear. Athletes were classified into groups that passed all 6 RTS tests at a criterion level of 90% (or 90 of 100) limb symmetry and were compared with those who failed to meet all criteria. Crude odds ratios and 95% CIs were calculated to determine if passing all 6 RTS measures resulted in a reduced risk of second ACL injury in the first 24 months after RTS. RESULTS: Thirty-five (22%) of the participants sustained a second ACL injury. At the time of RTS, 26% achieved ≥90 on all tests, and the remaining athletes scored less than 90 on at least 1 of the 6 assessments. The second ACL injury incidence did not differ between those who passed all RTS criteria (28.6%) and those who failed at least 1 criterion (19.7%, P = .23). Subgroup analysis by graft type also indicated no differences between groups (P > .05). 
CONCLUSIONS: Current RTS criteria at a 90% threshold did not identify active skeletally mature adolescent and young adult athletes at high risk for second ACL injury.
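The pass/fail classification above rests on limb symmetry: each measure is expressed as the involved limb's performance relative to the uninvolved limb, and an athlete passes only if every measure clears the 90% threshold. A minimal sketch with a hypothetical athlete (the raw values are invented, not from the study) might look like:

```python
def limb_symmetry_index(involved, uninvolved):
    """LSI: involved-limb performance as a percentage of the uninvolved limb."""
    return involved / uninvolved * 100

def passes_rts(scores, threshold=90.0):
    """Pass only if every measure meets the symmetry (or IKDC) threshold."""
    return all(s >= threshold for s in scores)

# Hypothetical athlete: strength and one hop test as raw per-limb values,
# remaining measures already expressed as LSI % or a 0-100 IKDC score
scores = [limb_symmetry_index(320, 350),  # quadriceps strength
          limb_symmetry_index(140, 150),  # single hop distance
          88.0, 95.0, 92.0, 91.0]         # other hop tests + IKDC
print(passes_rts(scores))  # False: the 88.0 hop test is below 90
```

The all-or-nothing rule is what makes the criterion strict: a single sub-threshold test fails the athlete, which is also why only 26% of this cohort passed all 6 measures.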


Subject(s)
Anterior Cruciate Ligament Injuries , Anterior Cruciate Ligament Reconstruction , Male , Adolescent , Female , Young Adult , Humans , Adult , Anterior Cruciate Ligament Injuries/surgery , Return to Sport , Knee , Lower Extremity , Muscle Strength
20.
Front Sports Act Living ; 3: 696264, 2021.
Article in English | MEDLINE | ID: mdl-34553139

ABSTRACT

Purpose: The COVID-19 pandemic impacted the sporting and exercise activities of millions of youth. Running is an activity that could be maintained while social distancing restrictions were implemented during the pandemic. However, a recent study has indicated that youth runners reported lower running distance, frequency, and intensity during COVID-19. The reason for this reduction and the impact on overall well-being is unknown. Therefore, the purpose of this study was to determine if the social distancing restrictions during the 2020 COVID-19 pandemic influenced running motives, socialization, wellness, and mental health in youth long-distance runners. Methods: A customized, open online questionnaire was provided to runners 9-19 years of age who participated in long-distance running activities including team/club cross-country, track and field (distances ≥800 m), road races, or recreational running. Participants responded to questions about demographics, motive for running, and wellness (sleep quality, anxiety, running enjoyment, food consumption quality) 6 months before as well as during social distancing restrictions due to COVID-19. Wilcoxon signed-rank tests compared differences for ratio data and Chi-square tests were used to compare proportions before and during COVID-19 social distancing restrictions. Statistical significance was set at p ≤ 0.05. Results: A total of 287 youth long-distance runners (male = 124, female = 162, unspecified = 1; age = 15.3 ± 1.7 years; running experience = 5.0 ± 2.3 years) participated. Compared to their pre-COVID-19 responses, youth long-distance runners reported lower overall motivation to run (p < 0.001) and changes to most motive rankings (p < 0.001 to p = 0.71). The proportion of youth running alone increased during COVID-19 (65.8%) compared to pre-COVID-19 (13.8%, p < 0.001).
Youth long-distance runners also reported less running enjoyment (p = 0.001), longer sleep duration (p < 0.001), lower sleep quality (p = 0.05), more anxiety (p = 0.043), and lower quality of food consumed (p < 0.001) during COVID-19 social distancing restrictions. Conclusion: The COVID-19 social distancing restrictions resulted in significant decreases in motivation and enjoyment of running. The removal of competition and team-based interactions likely contributed to these decreases in this population. Continuing team-based activities (e.g., virtually) during social distancing may help maintain the motivation of youth long-distance runners. Reduced running occurred concurrently with reduced overall well-being of youth long-distance runners during the COVID-19 pandemic.
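The proportion comparisons above used chi-square tests. A minimal Pearson chi-square statistic for a 2×2 table, with hypothetical counts chosen only to approximate the reported running-alone percentages (a paired before/during design would strictly call for McNemar's test), might look like:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: ~13.8% of 287 ran alone before, ~65.8% during
chi2 = chi_square_2x2(40, 247, 189, 98)
print(chi2 > 10.83)  # True: exceeds the p < 0.001 critical value (1 df)
```

With one degree of freedom, any statistic above 10.83 corresponds to p < 0.001, consistent with the change in running-alone proportions reported above.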
