Results 1 - 20 of 21
1.
Front Pain Res (Lausanne) ; 4: 1175574, 2023.
Article in English | MEDLINE | ID: mdl-37654909

ABSTRACT

Introduction: Chronic pain and associated interference with daily activities are common in the military and impact Force readiness. Chronic pain affects one-third of service members and is a leading cause of medical non-readiness (MNR) in the military. Research suggests that underlying psychological mechanisms related to trait coping styles and pain interference (PI) affect functional outcomes, but little research exists examining this relationship within an Army population. The purpose of this study was to examine the combined effects of PI and coping on U.S. Army soldier readiness by using annual well-being data from the Global Assessment Tool (GAT) and medical non-readiness (MNR) based on duty restriction records. Methods: The sample comprised 866,379 soldiers who completed the GAT between 2014 and 2017 with no duty restrictions at the time of baseline GAT completion; subjects were observed through 2018 for duty restrictions. Parametric survival regression models with a Weibull distribution predicted demographic-adjusted hazards of MNR by dichotomized PI (no PI/PI) and beneficial/non-beneficial use of GAT coping components (good coping, bad coping, catastrophizing-flexibility, and catastrophizing-hopelessness). Incident MNR was evaluated for all duty restrictions, and stratified by selected body systems (upper extremity, lower extremity, psychiatric). Results: Among soldiers with PI, hazards were higher in those reporting non-beneficial coping styles (bad coping, hopelessness) and lower in those reporting beneficial coping styles (good coping, flexibility). Across all coping styles, PI/coping interactions were particularly strong for catastrophizing-hopelessness and when examining MNR from psychiatric conditions. Discussion: These findings suggest some synergistic associations between pain and coping that may impact pain-related occupational disability. Coping skills may be an effective interventional target for chronic pain reduction/prevention within military programs, such as the Master Resilience Training Course offered to soldiers in the Army. Further research should assess whether early coping style interventions can reduce pain-related outcomes.
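
As a rough illustration of the parametric survival approach described above, the sketch below fits a Weibull survival regression with the lifelines library. The data frame, the column names (time_to_mnr, mnr_event, pain_interference, hopelessness), and the simulated effects are all invented placeholders, and lifelines uses an accelerated-failure-time parameterization, which is not necessarily the exact specification the authors used.

```python
# Hedged sketch: a Weibull survival regression for time to medical non-readiness
# (MNR), with dichotomized pain interference and a non-beneficial coping marker
# as covariates. All data below are simulated placeholders.
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(0)
n = 300
pain_interference = rng.integers(0, 2, size=n)
hopelessness = rng.integers(0, 2, size=n)

# Simulate shorter times to MNR when PI and non-beneficial coping are present.
scale = 900.0 / (1 + 0.8 * pain_interference + 0.5 * hopelessness)
time = rng.weibull(1.3, size=n) * scale
event = (time < 730).astype(int)              # administrative censoring at 2 years
time = np.minimum(time, 730)

df = pd.DataFrame({
    "time_to_mnr": time,
    "mnr_event": event,
    "pain_interference": pain_interference,
    "hopelessness": hopelessness,
})
df["pi_x_hopelessness"] = df["pain_interference"] * df["hopelessness"]  # interaction

aft = WeibullAFTFitter()
aft.fit(df, duration_col="time_to_mnr", event_col="mnr_event")
print(aft.summary)                            # coefficients with 95% CIs
```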

2.
J Athl Train ; 58(6): 511-518, 2023 Jun 01.
Article in English | MEDLINE | ID: mdl-36583956

ABSTRACT

CONTEXT: The US Army embedded injury-prevention experts (IPEs), specifically athletic trainers and strength and conditioning coaches, into initial entry training (IET) to limit musculoskeletal (MSK) conditions and their negative consequences. However, little is known about the financial impact of IPEs. OBJECTIVE: To assess whether IPEs were associated with fewer sunk training costs due to MSK-related early discharges from service. DESIGN: Retrospective cohort study. SETTING: Database of US Army soldiers' administrative, medical, and readiness records. PATIENTS OR OTHER PARTICIPANTS: A total of 198 166 soldiers (age = 20.7 ± 3.2 years, body mass index = 24.4 ± 3.5 kg/m2) who began IET during 2014 to 2017. MAIN OUTCOME MEASURE(S): Early discharge from service was defined as occurring within 6 months of beginning IET. All IET sites employed IPEs from 2011 to 2017, except for 2 sites during April to November 2015. Soldiers who began IET at these 2 sites during these times were categorized as not having IPE exposure. All others were categorized as having IPE exposure. The unadjusted association between IPE access and MSK-related early discharge from service was assessed using logistic regression. Financial impact was assessed by quantifying differences in yearly sunk costs between groups with and those without IPE exposure and subtracting IPE hiring costs. RESULTS: Among 14 094 soldiers without IPE exposure, 2.77% were discharged early for MSK-related reasons. Among 184 072 soldiers with IPE exposure, 1.01% were discharged. Exposure to IPEs was associated with reduced odds of MSK-related early discharge (odds ratio = 0.36, 95% CI = 0.32, 0.40, P < .001) and a decrease in yearly sunk training costs of $11.19 to $20.00 million. CONCLUSIONS: Employing IPEs was associated with reduced sunk costs because of fewer soldiers being discharged from service early for MSK-related reasons. Evidence-based recommendations should be developed for guiding policy on the roles and responsibilities of IPEs in the military to reduce negative outcomes from MSK conditions and generate a positive return on investment.


Subject(s)
Military Personnel , Musculoskeletal Diseases , Humans , Adolescent , Young Adult , Adult , Retrospective Studies , Social Behavior , Body Mass Index
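
The unadjusted odds ratio reported above can be approximately reproduced from the published group sizes and discharge percentages. In the sketch below, the 2 × 2 cell counts are rounded back-calculations from those figures rather than the study's raw data, and the interval is a standard Wald interval.

```python
# Hedged sketch: reconstruct the unadjusted odds ratio for MSK-related early
# discharge from the reported group sizes and percentages. Cell counts are
# rounded back-calculations, not the study's raw data.
import math

n_no_ipe, pct_no_ipe = 14094, 0.0277     # no IPE exposure, 2.77% discharged
n_ipe, pct_ipe = 184072, 0.0101          # IPE exposure, 1.01% discharged

a = round(n_ipe * pct_ipe)               # discharged, IPE exposure
b = n_ipe - a                            # retained, IPE exposure
c = round(n_no_ipe * pct_no_ipe)         # discharged, no IPE exposure
d = n_no_ipe - c                         # retained, no IPE exposure

odds_ratio = (a / b) / (c / d)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # Wald standard error
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}, {hi:.2f})")
# Prints roughly OR = 0.36 (95% CI 0.32, 0.40), matching the abstract.
```
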
3.
BMJ Mil Health ; 169(4): 327-334, 2023 Aug.
Article in English | MEDLINE | ID: mdl-34373349

ABSTRACT

INTRODUCTION: Minimising temporary and permanent disability associated with musculoskeletal conditions (MSK-D) is critical to the mission of the US Army. Prior research has identified potentially actionable risk factors for overall military disability and its MSK-D subset, including elevated body mass index, tobacco use and physical fitness. However, prior work does not appear to have addressed the impact of these factors on MSK-D when controlling for a full range of factors that may affect health behaviours, including aptitude scores that may serve as a proxy for health literacy. Identifying risk factors for MSK-D when providing control for all such factors may inform efforts to improve military readiness. METHODS: We studied 494 757 enlisted Army soldiers from 2014 to 2017 using a combined medical and administrative database. Leveraging data from the Army's digital 'eProfile' system of duty restriction records, we defined MSK-D as the first restriction associated with musculoskeletal conditions and resulting in the inability to deploy or train. We used multivariable Cox proportional hazards regression to assess the associations between incident MSK-D and selected risk factors including aptitude scores, physical fitness test scores, body mass index and tobacco use. RESULTS: Among the subjects, 281 278 (45.14%) experienced MSK-D. In the MSK-D hazards model, the highest effect size was for failing the physical fitness test (adjusted HR=1.63, 95% CI 1.58 to 1.67, p<0.001) compared with scoring ≥290 points. CONCLUSIONS: The analysis revealed the strongest associations between physical fitness and MSK-D. Additional efforts are warranted to determine potential mechanisms for the observed associations between selected factors and MSK-D.


Subject(s)
Military Personnel , Musculoskeletal Diseases , Humans , Risk Factors , Physical Fitness , Musculoskeletal Diseases/epidemiology , Musculoskeletal Diseases/etiology , Body Mass Index
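
A minimal sketch of a multivariable Cox proportional hazards model for time to incident MSK-D, in the spirit of the analysis described above, using the lifelines library. The columns (days_to_mskd, mskd_event, failed_apft, tobacco_use, bmi) and the simulated data are invented placeholders rather than the study's variables.

```python
# Hedged sketch: Cox proportional hazards model of time to first MSK-related
# duty restriction (MSK-D). All data below are simulated placeholders.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 400
failed_apft = rng.integers(0, 2, size=n)         # failed physical fitness test
tobacco_use = rng.integers(0, 2, size=n)
bmi = rng.normal(26, 3, size=n)

# Simulate a higher hazard of MSK-D for soldiers who failed the fitness test.
baseline = rng.exponential(900, size=n)
time = baseline / np.exp(0.5 * failed_apft + 0.2 * tobacco_use)
event = (time < 1460).astype(int)                # censor follow-up at four years
time = np.minimum(time, 1460)

df = pd.DataFrame({
    "days_to_mskd": time,
    "mskd_event": event,
    "failed_apft": failed_apft,
    "tobacco_use": tobacco_use,
    "bmi": bmi,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_mskd", event_col="mskd_event")
cph.print_summary()                              # adjusted hazard ratios with 95% CIs
```
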
4.
J Foot Ankle Surg ; 62(2): 327-332, 2023.
Article in English | MEDLINE | ID: mdl-36137898

ABSTRACT

Tri-plane corrective Lapidus surgery has been described as advantageous with respect to its anatomic basis and outcomes. Because the procedure has been broadly publicized, changes in overall Lapidus procedure rates due to increased numbers of patients opting for the tri-plane approach could have occurred. Data supporting this possibility appear to be lacking. We employed official personnel and health records of the total active-duty US military to conduct a retrospective cohort study of Lapidus surgery rates before and after the advent of the tri-plane corrective Lapidus procedure. Least-squares and locally weighted scatterplot smoother regression functions were used to confirm time trends. Sociodemographic and occupational traits of Lapidus patients were compared using 2-sided t tests and chi-square tests. Lapidus surgery rates among hallux valgus patients decreased during 2014 to 2016 and increased during 2017 to 2021. While multiple factors might explain these trends, they coincide with the advent of and advocacy for tri-plane Lapidus surgery. The results support the possibility that its rise influenced overall Lapidus rates in this population. As these findings represent limited evidence of such an influence, further research is required to confirm a causal link. If such a link is found, and if the ongoing research suggests that superior outcomes are associated with tri-plane Lapidus surgery, substantial implications could exist for this population. Benefits might include enhanced medical readiness due to the importance of lower extremity function during military duties. Additional research is needed to confirm the impact of the procedure and to determine whether Lapidus surgery rate patterns in civilian populations mirror these findings.


Subject(s)
Bunion , Hallux Valgus , Military Personnel , Humans , Arthrodesis/methods , Retrospective Studies , Hallux Valgus/surgery
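
The two trend checks named above, a least-squares line and a locally weighted scatterplot smoother (LOWESS), can be sketched as follows with numpy and statsmodels; the yearly rates are made-up placeholders, not the study's data.

```python
# Hedged sketch: least-squares and LOWESS fits to yearly Lapidus surgery rates.
# The rates below are hypothetical, chosen only to mimic a dip-then-rise pattern.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

years = np.arange(2014, 2022)
rates = np.array([4.1, 3.8, 3.5, 3.7, 4.0, 4.4, 4.9, 5.3])   # per 1,000 hallux valgus patients (invented)

slope, intercept = np.polyfit(years, rates, deg=1)            # least-squares trend
smoothed = lowess(rates, years, frac=0.6)                     # locally weighted smoother

print(f"least-squares slope: {slope:.3f} per year")
for year, value in smoothed:                                  # columns: year, smoothed rate
    print(f"  {int(year)}: {value:.2f}")
```
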
5.
Am J Public Health ; 111(11): 2064-2074, 2021 11.
Article in English | MEDLINE | ID: mdl-34499537

ABSTRACT

Objectives. To describe health-related behaviors or indicators associated with overall health and well-being using the Global Assessment Tool (GAT), a health behavior and psychosocial questionnaire completed annually by US Army personnel. Methods. We analyzed GAT responses from 2014 to 2018, consisting of 367 000 to 449 000 respondents per year. We used generalized estimating equations to predict the presence of each health behavior or indicator, aggregated by year and stratified on various demographics. Results. Key findings included decreases from 2014 to 2018 in risky health behaviors such as hazardous drinking (7.5% decrease) and tobacco use (7.9% decrease), dietary supplement use (5.0% to 10.6% decrease, depending on type), self-reported musculoskeletal injury (5.1% decrease), and pain interference (3.6% decrease). Physical activity, sleep, and nutritional habits largely remained consistent over time. Conclusions. In the Army, tobacco, alcohol, and risky dietary supplement usage appears to be declining, whereas lifestyle health behaviors have been stable. Whether these trends reflect responses to health education is unknown. The GAT provides useful insights into the health of the Army, which can be leveraged when developing health-related educational programs and policies. Public Health Implications. Health behaviors that have changed less over time (e.g., nutrition, sleep) may require novel approaches compared with those that changed more (e.g., dietary supplement use, drinking). (Am J Public Health. 2021;111(11):2064-2074. https://doi.org/10.2105/AJPH.2021.306456).


Subject(s)
Health Status Indicators , Military Personnel , Self Report , Adolescent , Adult , Aged , Female , Health Surveys , Humans , Longitudinal Studies , Male , Middle Aged , Population Surveillance , United States
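
A minimal sketch of a generalized estimating equations (GEE) model for one repeated binary GAT indicator, using statsmodels with an exchangeable working correlation to account for repeated annual responses per soldier; the soldiers, years, and tobacco_use values are simulated for illustration only.

```python
# Hedged sketch: GEE logistic model of a binary health indicator over repeated
# annual survey responses, clustered by soldier. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for soldier_id in range(200):
    base = rng.normal()                               # soldier-level propensity
    for years_since_2014 in range(5):                 # survey years 2014-2018
        logit = -1.0 + base - 0.1 * years_since_2014  # gentle downward trend
        p = 1.0 / (1.0 + np.exp(-logit))
        rows.append({"soldier_id": soldier_id,
                     "years_since_2014": years_since_2014,
                     "tobacco_use": int(rng.random() < p)})
df = pd.DataFrame(rows)

# Exchangeable working correlation handles repeated responses per soldier.
model = smf.gee("tobacco_use ~ years_since_2014", groups="soldier_id", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```
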
6.
Clin J Sport Med ; 31(2): e80-e85, 2021 Mar 01.
Article in English | MEDLINE | ID: mdl-30260813

ABSTRACT

OBJECTIVE: To compare the epidemiology of concussion between athletes who are deaf or hard-of-hearing (D/HoH) and athletes who are hearing. DESIGN: Descriptive epidemiology study. SETTING: Data were collected from 2 Division III athletic programs. One institution is the world's only university designed to be barrier-free for students who are D/HoH. PARTICIPANTS: Six hundred ninety-three athletes who are D/HoH and 1284 athletes who are hearing were included in this study. Athletes participated in collegiate athletics during the 2012 to 2013 through the 2016 to 2017 academic years. INTERVENTIONS: Concussion data were provided by the athletic training staff at each institution. MAIN OUTCOME MEASURES: Concussion counts, concussion rate, and injury rate ratios (IRRs) with 95% confidence intervals (95% CIs). RESULTS: Thirty athletes who are D/HoH and 104 athletes who are hearing suffered concussions. Athletes who are hearing had an increased injury rate compared with athletes who are D/HoH for all sports combined (IRR = 1.87, 95% CI, 1.26-2.78). Football athletes who are hearing also had an increased injury rate compared with football athletes who are D/HoH (IRR = 3.30, 95% CI, 1.71-6.37). Concussion rate was higher for male athletes who are hearing than male athletes who are D/HoH (IRR = 2.84, 95% CI, 1.62-4.97). No other significant differences regarding concussion risk were identified. CONCLUSIONS: Athletes who are D/HoH in sex-comparable sports may not have a higher rate of concussion than athletes who are hearing. Rate of concussion in football may be greater among athletes who are hearing compared with athletes who are D/HoH.


Subject(s)
Athletic Injuries/epidemiology , Brain Concussion/epidemiology , Deafness/epidemiology , Hearing Loss/epidemiology , Adolescent , Athletic Injuries/complications , Baseball/injuries , Basketball/injuries , Brain Concussion/complications , Comorbidity , Deafness/complications , Female , Football/injuries , Hearing Loss/complications , Humans , Incidence , Male , Soccer/injuries , United States/epidemiology , Young Adult
8.
Mil Med ; 184(11-12): e773-e780, 2019 12 01.
Article in English | MEDLINE | ID: mdl-31125066

ABSTRACT

INTRODUCTION: Musculoskeletal injuries (MSK-I) in the U.S. military accounted for more than four million medical encounters in 2017. The Military Entrance Processing Screen to Assess Risk of Training (MEPSTART) was created to identify MSK-I risk during the first 180 days of military service. METHODS: Active duty applicants to the United States Army, Navy, Air Force, and Marine Corps between February 2013 and December 2014 who consented to participate completed a behavioral and injury history questionnaire and the MEPSTART screen [Functional Movement Screen (FMS), Y-Balance Test (YBT), Landing Error Scoring System (LESS), and Overhead Squat assessment (OHS)] the day they shipped to basic training. Male (n = 1,433) and female (n = 281) applicants were enrolled, and MSK-I were tracked for 180 days. Binomial logistic regression and multivariate Cox proportional hazards modeling were used to assess relationships between MEPSTART screens and MSK-I independent of age, BMI, sex, Service, injury history, and smoking status. Analyses were performed and finalized in 2017. RESULTS: The only functional screen related to injury was the LESS score. Compared to those with good LESS scores, applicants with poor LESS scores had lower odds of MSK-I (OR = 0.54, 95% CI = 0.30-0.97, p = 0.04) and a lower instantaneous risk of MSK-I during the first 180 days (HR = 0.58, 95% CI = 0.34-0.96, p = 0.04). However, secondary receiver operating characteristic (ROC) analyses revealed poor discriminative value (AUC = 0.49, 95% CI = 0.43-0.54). CONCLUSIONS: Functional performance did not predict future injury risk during the first 180 days of service. Poor LESS scores were associated with lower injury risk, but ROC analyses revealed little predictive value and limited clinical usefulness. Comprehensive risk reduction strategies may be preferable for mitigating MSK-I in military training populations.


Subject(s)
Military Personnel/education , Risk Assessment/standards , Teaching/standards , Adolescent , Female , Humans , Logistic Models , Male , Military Personnel/statistics & numerical data , ROC Curve , Risk Assessment/methods , Risk Assessment/statistics & numerical data , Risk Factors , Teaching/statistics & numerical data , United States , Young Adult
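
The secondary ROC check described above can be illustrated with scikit-learn's roc_auc_score; the scores and outcomes below are synthetic and deliberately unrelated, so the AUC lands near 0.5, the "no discrimination" benchmark against which the reported AUC of 0.49 is judged.

```python
# Hedged sketch: discriminative value of a screening score via ROC AUC.
# Synthetic, unrelated scores and outcomes give an AUC close to 0.5.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
less_style_score = rng.integers(0, 18, size=n)   # hypothetical LESS-style scores
injury = rng.integers(0, 2, size=n)              # injuries unrelated to the score

print(f"AUC = {roc_auc_score(injury, less_style_score):.2f}")
```
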
10.
Int J Sports Phys Ther ; 13(6): 1008-1014, 2018 Dec.
Article in English | MEDLINE | ID: mdl-30534466

ABSTRACT

BACKGROUND: Upper extremity injuries commonly occur in baseball players and can often necessitate surgical interventions. Athletes recovering from previous surgeries may be at greater risk of a secondary injury due to potential residual deficits in global movement. Identifying individuals with residual movement dysfunction following surgery during a pre-participation examination may help health care professionals identify baseball players who may be at a greater risk of re-injury in their throwing arms so that appropriate interventions can be developed. PURPOSE: The purpose of this study was to assess relationships between history of shoulder or elbow surgeries and Functional Movement Screen™ (FMS™) shoulder mobility scores or Selective Functional Movement Assessment (SFMA) upper extremity patterns in collegiate baseball players. STUDY DESIGN: Cohort study. METHODS: One hundred seventy-six healthy, male, Division III collegiate baseball players (mean age = 19.65 ± 1.52 years) underwent preseason screening using the FMS™ shoulder mobility screen and SFMA upper extremity patterns. Total FMS™ scores were dichotomized into "good" and "poor" groups (good = 2 or 3, poor = 0 or 1). SFMA scores were dichotomized into "good" and "poor" groups (good = functional non-painful (FN); poor = dysfunctional painful (DP), dysfunctional non-painful (DN), and functional painful (FP)). Dichotomized FMS™ and SFMA scores were compared to questionnaire data regarding history of shoulder or elbow surgeries. RESULTS: Thirty participants (17%) reported a previous shoulder or elbow surgery in their dominant arms. Past surgeries in the shoulder or elbow were not related to FMS™ (odds ratio [OR] = 0.74, 95% confidence interval [CI] = 0.30, 1.82, p = 0.52) or SFMA performance (OR = 0.93, 95% CI = 0.38, 2.27, p = 0.88) independent of grade and playing position. CONCLUSION: History of shoulder or elbow surgery was not related to performance on the FMS™ shoulder mobility test or SFMA upper extremity patterns. Differences in the dates of surgery at the time of testing, and sport-specific adaptations of the upper extremities that are common in baseball players due to the cumulative tissue stress from years of throwing at the collegiate level, may explain these nonsignificant findings. LEVEL OF EVIDENCE: Level 3.

11.
J Athl Train ; 53(9): 906-914, 2018 Sep.
Article in English | MEDLINE | ID: mdl-30284458

ABSTRACT

CONTEXT: Data regarding the epidemiology of emergency-transport incidents (ETIs) of patients with sport-related injuries are lacking. Understanding the use of emergency services by athletic trainers can help improve emergency preparedness and prehospital care for injured student-athletes. OBJECTIVE: To determine the frequencies and types of ETIs resulting from athletic participation. DESIGN: Descriptive epidemiology study. SETTING: Participating colleges and high schools during 2009-2010 to 2014-2015 and 2011-2012 to 2013-2014, respectively. PATIENTS OR OTHER PARTICIPANTS: Student-athletes in 23 high school and 25 intercollegiate sports. MAIN OUTCOME MEASURE(S): Data on injuries requiring emergency transport were collected by each team's athletic trainer via their respective online injury-tracking software. Athletic trainers also collected data on athlete-exposures (AEs). Emergency-transport incident frequencies and injury rates per 10 000 AEs with 95% confidence intervals (CIs) were reported. For each ETI, the sport, body part, injury mechanism, and final diagnosis were recorded. RESULTS: A total of 339 and 146 ETIs were reported in collegiate and high school players, respectively. Collegiate women's ice hockey had the highest ETI rate (1.28/10 000 AEs; 95% CI = 0.71, 1.86). In high school, football had the highest rate at 0.80 per 10 000 AEs (95% CI = 0.64, 0.97). Athletes with head or face injuries required the most transports in college (n = 71, 20.9%) and high school (n = 33, 22.6%) across all sports. Strains (n = 50, 14.7%) and fractures (n = 35, 24.0%) were the leading diagnoses for patients undergoing transport in college and high school, respectively. CONCLUSIONS: Athletic trainers should maintain a high level of emergency preparedness when working with sports that have high rates and numbers of ETIs. Athletes with injuries to the head/face required the most frequent transport across competition levels. Athletic trainers should have the appropriate equipment and protocols in place to handle these patients. Future researchers should examine the differences between field and hospital diagnoses to help improve prehospital care and decrease the likelihood of unnecessary emergency transports.


Subject(s)
Athletic Injuries/epidemiology , Emergency Medical Services/statistics & numerical data , Transportation/statistics & numerical data , Adolescent , Athletes , Civil Defense , Craniocerebral Trauma/epidemiology , Female , Football/injuries , Fractures, Bone/epidemiology , Hockey/injuries , Humans , Incidence , Male , Schools , Students , United States/epidemiology , Universities , Young Adult
12.
J Athl Train ; 53(1): 35-42, 2018 Jan.
Article in English | MEDLINE | ID: mdl-29314871

ABSTRACT

CONTEXT: The fourth edition of the Preparticipation Physical Evaluation recommends functional testing for the musculoskeletal portion of the examination; however, normative data across sex and grade level are limited. Establishing normative data can provide clinicians reference points with which to compare their patients, potentially aiding in the development of future injury-risk assessments and injury-mitigation programs. OBJECTIVE: To establish normative functional performance and limb-symmetry data for high school-aged male and female athletes in the United States. DESIGN: Cross-sectional study. SETTING: Athletic training facilities and gymnasiums across the United States. PATIENTS OR OTHER PARTICIPANTS: A total of 3951 male and female athletes who participated on high school-sponsored basketball, football, lacrosse, or soccer teams enrolled in this nationwide study. MAIN OUTCOME MEASURE(S): Functional performance testing consisted of 3 evaluations. Ankle-joint range of motion, balance, and lower extremity muscular power and landing control were assessed via the weight-bearing ankle-dorsiflexion-lunge, single-legged anterior-reach, and anterior single-legged hop-for-distance (SLHOP) tests, respectively. We used 2-way analyses of variance and χ2 analyses to examine the effects of sex and grade level on ankle-dorsiflexion-lunge, single-legged anterior-reach, and SLHOP test performance and symmetry. RESULTS: The SLHOP performance differed between sexes (males = 187.8% ± 33.1% of limb length, females = 157.5% ± 27.8% of limb length; t = 30.3, P < .001). A Cohen d value of 0.97 indicated a large effect of sex on SLHOP performance. We observed differences for SLHOP and ankle-dorsiflexion-lunge performance among grade levels, but these differences were not clinically meaningful. CONCLUSIONS: We demonstrated differences in normative data for lower extremity functional performance during preparticipation physical evaluations across sex and grade levels. The results of this study will allow clinicians to compare sex- and grade-specific functional performances and implement approaches for preventing musculoskeletal injuries in high school-aged athletes.


Subject(s)
Athletes , Athletic Injuries/physiopathology , Athletic Performance/physiology , Physical Conditioning, Human/methods , Risk Assessment , Schools , Adolescent , Athletic Injuries/epidemiology , Athletic Injuries/prevention & control , Cross-Sectional Studies , Female , Humans , Incidence , Male , Musculoskeletal Physiological Phenomena , Sex Factors , United States/epidemiology , Young Adult
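
The reported effect size can be approximately recomputed from the published means and standard deviations. The sketch below assumes a simple pooled-SD definition of Cohen's d with roughly equal group sizes, since the abstract does not give the exact group ns.

```python
# Hedged sketch: recompute Cohen's d for the single-legged hop test from the
# published means and SDs, assuming a simple pooled SD with roughly equal
# group sizes (the abstract does not report exact group ns).
import math

mean_m, sd_m = 187.8, 33.1     # males, % of limb length
mean_f, sd_f = 157.5, 27.8     # females, % of limb length

pooled_sd = math.sqrt((sd_m ** 2 + sd_f ** 2) / 2)
cohens_d = (mean_m - mean_f) / pooled_sd
print(f"Cohen's d ≈ {cohens_d:.2f}")   # close to the reported 0.97
```
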
13.
J Athl Train ; 53(11): 1025-1036, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30715912

ABSTRACT

CONTEXT: The advent of Web-based sports injury surveillance via programs such as the High School Reporting Information Online system and the National Collegiate Athletic Association Injury Surveillance Program has aided the acquisition of boys' and men's basketball injury data. OBJECTIVE: To describe the epidemiology of injuries sustained in high school boys' basketball in the 2005-2006 through 2013-2014 academic years and collegiate men's basketball in the 2004-2005 through 2013-2014 academic years using Web-based sports injury surveillance. DESIGN: Descriptive epidemiology study. SETTING: Online injury surveillance from basketball teams of high school boys (annual average = 100) and collegiate men (annual average = 55). PATIENTS OR OTHER PARTICIPANTS: Boys' and men's basketball players who participated in practices and competitions during the 2005-2006 through 2013-2014 academic years in high school or the 2004-2005 through 2013-2014 academic years in college. MAIN OUTCOME MEASURES: Athletic trainers collected time-loss (≥24 hours) injury and exposure data. Injury rates per 1000 athlete-exposures (AEs) were calculated. Injury rate ratios (IRRs) with 95% confidence intervals (CIs) compared injury rates by school size or division, time in season, event type, and competition level. RESULTS: The High School Reporting Information Online system documented 3056 time-loss injuries during 1 977 480 AEs; the National Collegiate Athletic Association Injury Surveillance Program documented 4607 time-loss injuries during 868 631 AEs. The injury rate was higher for college than for high school (5.30 versus 1.55/1000 AE; IRR = 3.43; 95% CI = 3.28, 3.59). The injury rate was higher for competitions than for practices in both high school (IRR = 2.38; 95% CI = 2.22, 2.56) and college (IRR = 2.02; 95% CI = 1.90, 2.14). The most common injuries at both levels were ligament sprains, muscle/tendon strains, and concussions; most injuries affected the ankle, knee, and head/face. Injuries were most often caused by contact with another player or noncontact mechanisms. CONCLUSIONS: Injury rates were greater among collegiate players compared with high school players and were greater during competitions than practices at both levels. Distributions of injuries by body part, diagnoses, and mechanisms of injury were similar, suggesting that athletes at both levels may benefit from similar injury-prevention strategies.


Subject(s)
Athletic Injuries/epidemiology , Basketball/injuries , Internet , Adolescent , Athletes , Brain Concussion/epidemiology , Humans , Incidence , Male , Schools , Soft Tissue Injuries/epidemiology , Sprains and Strains , Students , United States , Universities , Young Adult
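
The college-versus-high-school injury rate ratio and its confidence interval can be reproduced from the reported injury counts and athlete-exposures with the standard log-scale interval, as sketched below.

```python
# Hedged sketch: reproduce the college-vs-high-school injury rate ratio from
# the reported time-loss injury counts and athlete-exposures, with a standard
# log-scale 95% confidence interval.
import math

hs_injuries, hs_ae = 3056, 1_977_480
col_injuries, col_ae = 4607, 868_631

hs_rate = 1000 * hs_injuries / hs_ae          # about 1.55 per 1000 AEs
col_rate = 1000 * col_injuries / col_ae       # about 5.30 per 1000 AEs

irr = col_rate / hs_rate
se_log_irr = math.sqrt(1 / col_injuries + 1 / hs_injuries)
lo = math.exp(math.log(irr) - 1.96 * se_log_irr)
hi = math.exp(math.log(irr) + 1.96 * se_log_irr)

print(f"IRR = {irr:.2f} (95% CI {lo:.2f}, {hi:.2f})")
# Prints roughly IRR = 3.43 (95% CI 3.28, 3.59), matching the abstract.
```
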
14.
J Athl Train ; 53(11): 1037-1048, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30715913

ABSTRACT

CONTEXT: The advent of Web-based sports injury surveillance via programs such as the High School Reporting Information Online system and the National Collegiate Athletic Association Injury Surveillance Program has aided the acquisition of girls' and women's basketball injury data. OBJECTIVE: To describe the epidemiology of injuries sustained in high school girls' basketball in the 2005-2006 through 2013-2014 academic years and collegiate women's basketball in the 2004-2005 through 2013-2014 academic years using Web-based sports injury surveillance. DESIGN: Descriptive epidemiology study. SETTING: Online injury surveillance from basketball teams in high school girls (annual average = 100) and collegiate women (annual average = 57). PATIENTS OR OTHER PARTICIPANTS: Girls' and women's basketball players who participated in practices and competitions during the 2005-2006 through 2013-2014 academic years in high school or the 2004-2005 through 2013-2014 academic years in college. MAIN OUTCOME MEASURE(S): Certified athletic trainers collected time-loss (≥24 hours) injury and exposure data. Injury rates per 1000 athlete-exposures (AEs) were calculated. Injury rate ratios (IRRs) with 95% confidence intervals (CIs) were used to compare injury rates by school size or division, time in season, event type, and competition level. RESULTS: The High School Reporting Information Online system documented 2930 time-loss injuries during 1 609 733 AEs; the National Collegiate Athletic Association Injury Surveillance Program documented 3887 time-loss injuries during 783 600 AEs. The injury rate was higher in college than in high school (4.96 versus 1.82/1000 AEs; IRR = 2.73; 95% CI = 2.60, 2.86). The injury rate was higher in competitions than in practices for both high school (IRR = 3.03; 95% CI = 2.82, 3.26) and collegiate (IRR = 1.99; 95% CI = 1.86, 2.12) players. The most common injuries at both levels were ligament sprains, concussions, and muscle/tendon strains; the majority of injuries affected the ankle, knee, and head/face. These injuries were often caused by contact with another player or a noncontact mechanism. CONCLUSIONS: Injury rates were higher in collegiate than in high school athletes and in competitions than in practices. Similarities in distributions of injuries by body parts, specific diagnoses, and mechanisms of injury suggest that both levels may benefit from similar injury-prevention strategies.


Subject(s)
Athletic Injuries/epidemiology , Basketball/injuries , Internet , Adolescent , Athletes , Brain Concussion/epidemiology , Female , Humans , Incidence , Schools , Soft Tissue Injuries/epidemiology , Sprains and Strains/epidemiology , Students , United States , Universities , Young Adult
15.
Int J Sports Phys Ther ; 12(6): 960-966, 2017 Nov.
Article in English | MEDLINE | ID: mdl-29158957

ABSTRACT

BACKGROUND: The shoulder mobility screen of the Functional Movement Screen™ (FMS™) and the upper extremity patterns of the Selective Functional Movement Assessment (SFMA) assess global, multi-joint movement capabilities in the upper extremities. Identifying which assessment can most accurately determine if baseball players are at an increased risk of experiencing overuse symptoms in the shoulder or elbow throughout a competitive season may reduce throwing-related injuries requiring medical attention. PURPOSE: The purpose of this study was to determine if preseason FMS™ or SFMA scores were related to overuse severity scores in the shoulder or elbow during the preseason and competitive season. STUDY DESIGN: Cohort study. METHODS: Sixty healthy, male, Division III collegiate baseball players (mean age = 20.1 ± 2.0 years) underwent preseason testing using the FMS™ shoulder mobility screen and SFMA upper extremity patterns. Their scores were dichotomized into good and bad movement scores, and were compared to weekly questionnaires registering overuse symptoms and pain severity in the shoulder or elbow during the season. RESULTS: Poor FMS™ performance was associated with an increased likelihood of experiencing at least one overuse symptom during the preseason independent of grade and position (adjusted odds ratio [OR] = 5.14, p = 0.03). Poor SFMA performance was associated with an increased likelihood of experiencing at least one overuse symptom during the preseason (adjusted OR = 6.10, p = 0.03) and during the competitive season (adjusted OR = 17.07, p = 0.03) independent of grade and position. CONCLUSION: FMS™ shoulder mobility and SFMA upper extremity pattern performance were related to the likelihood of experiencing overuse symptoms during a baseball season. Participants with poor FMS™ performances may be more likely to experience at least one overuse symptom in their shoulder or elbow during the preseason. Additionally, individuals with poor SFMA performances may be more likely to report overuse symptoms during the preseason or competitive season. LEVEL OF EVIDENCE: Level 3.

16.
J Athl Train ; 52(5): 464-473, 2017 May.
Article in English | MEDLINE | ID: mdl-28414917

ABSTRACT

CONTEXT: Variations in knee-sprain incidence among competition levels are unclear but may help inform prevention strategies in American football players. OBJECTIVE: To describe the epidemiology of knee sprains in youth, high school, and collegiate football players. DESIGN: Descriptive epidemiology study. SETTING: Injury and athlete-exposure (AE) data were collected from 3 injury-surveillance programs at the youth, high school, and collegiate competition levels. PATIENTS OR OTHER PARTICIPANTS: Data from 310 youth, 184 high school, and 71 collegiate football team-seasons were collected during the 2012 through 2014 seasons. MAIN OUTCOME MEASURE(S): Knee-sprain rates and risks were calculated for each competition level. Injury rate ratios (IRRs) and risk ratios (RRs) compared knee-sprain rates by competition level. Injury proportion ratios (IPRs) compared differences in surgery needs, recurrence, injury mechanism, and injury activity by competition level. RESULTS: Knee-sprain rates in youth, high school, and collegiate football were 0.16/1000 AEs, 0.25/1000 AEs, and 0.69/1000 AEs, respectively. Knee-sprain rates increased as the competition level increased (high school versus youth: IRR = 1.60; 95% confidence interval [CI] = 1.12, 2.30; collegiate versus high school: IRR = 2.73; 95% CI = 2.38, 3.96). Knee-sprain risk was highest in collegiate (4.3%), followed by high school (2.0%) and youth (0.5%) athletes. Knee-sprain risk increased as the competition level increased (high school versus youth: RR = 3.73; 95% CI = 2.60, 5.34; collegiate versus high school: RR = 2.14; 95% CI = 1.83, 2.51). Collegiate football had the lowest proportion of knee sprains that were noncontact injuries (collegiate versus youth: IPR = 0.54; 95% CI = 0.31, 0.95; collegiate versus high school: IPR = 0.59; 95% CI = 0.44, 0.79) and the lowest proportion that occurred while being tackled (collegiate versus youth: IPR = 0.44; 95% CI = 0.26, 0.76; collegiate versus high school: IPR = 0.71; 95% CI = 0.51, 0.98). CONCLUSIONS: Knee-sprain incidence was highest in collegiate football. However, level-specific variations in the distributions of knee sprains by injury activity may highlight the need to develop level-specific policies and prevention strategies that ensure safe sports play.


Subject(s)
Athletic Injuries , Football/injuries , Knee Injuries , Sprains and Strains , Adolescent , Athletes/statistics & numerical data , Athletic Injuries/epidemiology , Athletic Injuries/etiology , Athletic Injuries/prevention & control , Humans , Incidence , Knee Injuries/epidemiology , Knee Injuries/etiology , Knee Injuries/prevention & control , Male , Needs Assessment , Seasons , Sprains and Strains/epidemiology , Sprains and Strains/etiology , Sprains and Strains/prevention & control , United States/epidemiology , Universities/statistics & numerical data , Young Adult
17.
Am J Sports Med ; 45(2): 417-425, 2017 Feb.
Article in English | MEDLINE | ID: mdl-28146396

ABSTRACT

BACKGROUND: Variations in ankle injury rates and distributions among competition levels are unclear, but such data may help inform strategies to prevent ankle sprains during American football. PURPOSE: To describe the epidemiological patterns of ankle sprains in youth, high school (HS), and collegiate American football. STUDY DESIGN: Descriptive epidemiological study. METHODS: Data regarding youth, HS, and college football athletes were collected from 3 injury surveillance programs: (1) the Youth Football Safety Study (YFSS), (2) the National Athletic Treatment, Injury and Outcomes Network (NATION), and (3) the National Collegiate Athletic Association (NCAA) Injury Surveillance Program (ISP). During the 2012-2014 seasons, the YFSS, NATION, and NCAA ISP included 310, 184, and 71 football team-seasons, respectively. Athletic trainers (ATs) attended each practice and game and reported injuries and athlete-exposures (AEs) via their preferred injury documentation application. Ankle sprain rates for each type of ankle sprain were calculated overall, by event type (ie, practices and games), and specifically for severe injuries (ie, participation restriction time >21 days) and recurrent injuries (as defined by ATs). Rate ratios (RRs) were used to compare ankle sprain rates by competition level and event type. Injury proportion ratios (IPRs) were used to compare differences in severity, surgical needs, recurrence, injury mechanism, and injury activity by competition level. RRs and IPRs with 95% confidence intervals excluding 1.00 were considered statistically significant. RESULTS: A total of 124, 897, and 643 ankle sprains were reported in youth, HS, and college football, respectively. This led to respective rates of 0.59, 0.73, and 1.19 sprains per 1000 AEs. The ankle sprain rate in college football was higher than the rates in HS (RR = 1.64; 95% CI, 1.48-1.82) and youth (RR = 2.01; 95% CI, 1.65-2.43) football. The proportion of ankle sprains that were recurrent in youth football was higher than the proportions in HS (IPR = 2.73; 95% CI, 1.68-4.50) and college (IPR = 2.19; 95% CI, 1.33-3.61) football. CONCLUSION: Ankle sprain rates were highest in college athletes. However, level-specific variations in ankle sprain severity and recurrence may highlight the need to develop level-specific policies and prevention strategies to reduce injury incidence.


Subject(s)
Ankle Injuries/epidemiology , Football/injuries , Adolescent , Ankle Injuries/etiology , Athletic Injuries/epidemiology , Athletic Injuries/etiology , Child , Humans , Male , Seasons , United States/epidemiology , Young Adult
18.
J Athl Train ; 51(8): 658-661, 2016 Aug.
Article in English | MEDLINE | ID: mdl-27808574

ABSTRACT

CONTEXT: Musculoskeletal injury-prediction methods vary and may have limitations that affect the accuracy of results and clinical meaningfulness. BACKGROUND: Research examining injury risk factors is meaningful, but attempting to extrapolate injury risk from studies that do not prospectively assess injury occurrence may limit clinical applications. Injury incidence is a vital outcome measure, which allows for the appropriate interpretation of injury-prediction analyses; a lack of injury-incidence data may decrease the accuracy and increase the uncertainty of injury-risk estimates. Extrapolating results that predict an injury risk factor to predicting actual injuries may lead to inappropriate clinical decision-making models. CONCLUSIONS: Improved understanding of the limitations of injury-prediction methods, specifically those that do not prospectively assess injuries, will allow clinicians to better assess the clinical meaningfulness of the results.


Subject(s)
Athletic Injuries/epidemiology , Musculoskeletal System/injuries , Humans , Incidence , Prospective Studies , Risk Assessment , Risk Factors
19.
Clin J Sport Med ; 26(6): 435-444, 2016 Nov.
Article in English | MEDLINE | ID: mdl-26978166

ABSTRACT

OBJECTIVE: A stated goal of the preparticipation physical evaluation (PPE) is to reduce musculoskeletal injury, yet the musculoskeletal portion of the PPE is reportedly of questionable use in assessing lower extremity injury risk in high school-aged athletes. The objectives of this study were to (1) identify clinical assessment tools demonstrated to effectively determine lower extremity injury risk in a prospective setting, and (2) critically assess the methodological quality of prospective lower extremity risk assessment studies that use these tools. DATA SOURCES: A systematic search was performed in PubMed, CINAHL, UpToDate, Google Scholar, Cochrane Reviews, and SportDiscus. Inclusion criteria were prospective injury risk assessment studies involving athletes primarily ages 13 to 19 that used screening methods that did not require highly specialized equipment. Methodological quality was evaluated with a modified Physiotherapy Evidence Database (PEDro) scale. MAIN RESULTS: Nine studies were included. The mean modified PEDro score was 6.0/10 (SD, 1.5). Multidirectional balance (odds ratio [OR], 3.0; CI, 1.5-6.1; P < 0.05) and physical maturation status (P < 0.05) were predictive of overall injury risk, knee hyperextension was predictive of anterior cruciate ligament injury (OR, 5.0; CI, 1.2-18.4; P < 0.05), hip external:internal rotator strength ratio was predictive of patellofemoral pain syndrome (P = 0.02), and foot posture index was predictive of ankle sprain (r = -0.339, P = 0.008). CONCLUSIONS: Minimal prospective evidence supports or refutes the use of the functional musculoskeletal exam portion of the current PPE to assess lower extremity injury risk in high school athletes. Limited evidence does support inclusion of multidirectional balance assessment and physical maturation status in a musculoskeletal exam, as both are generalizable risk factors for lower extremity injury.


Subject(s)
Leg Injuries , Physical Examination/methods , Adolescent , Adolescent Development , Humans , Muscle Strength , Postural Balance , Risk Factors
20.
Int J Sports Phys Ther ; 10(5): 622-7, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26491612

ABSTRACT

BACKGROUND: The Functional Movement Screen (FMS™) has been suggested for use in predicting injury risk in active populations, but time constraints may limit use of the screening test battery. Identifying one component of the FMS™ that can predict which individuals may perform poorly on the entire test, and therefore should undergo the full group of screening maneuvers, may reduce time constraints and increase pre-participation screening utilization. PURPOSE: The purpose of this study was to determine if performance on the FMS™ overhead deep squat test (DS) could predict performance on the entire FMS™. STUDY DESIGN: Cohort study. METHODS: One hundred and three collegiate athletes underwent offseason FMS™ testing. The DS and adjusted FMS™ composite scores were dichotomized into low performance and high performance groups with athletes scoring below 2 on the DS categorized as low performance, and athletes with adjusted FMS™ composite scores below 12 categorized as low performance. Scores of 2 or above and 12 or above were considered high performances for the DS test and adjusted FMS™ composite score respectively, and therefore low risk for movement dysfunction and potentially, injury. RESULTS: Individuals categorized as low performance as a result of the DS test had lower adjusted FMS™ composite scores (p < 0.001). DS scores were positively correlated with adjusted FMS™ composite scores (ρ = 0.50, p < 0.001). Binomial logistic regression identified an odds ratio of 3.56 (95% CI: 1.24, 10.23, p = 0.018) between DS and FMS™ performance categories. CONCLUSIONS: Performance on the DS test may predict performance on the FMS™ and help identify individuals who require further musculoskeletal assessment. Further research is needed to determine if DS performance can predict asymmetries during the FMS™. LEVEL OF EVIDENCE: Level 3.
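
A minimal sketch of the dichotomization and binomial logistic regression described above, using statsmodels; the DS and composite scores are synthetic, so the resulting odds ratio is illustrative and will not match the reported 3.56.

```python
# Hedged sketch: dichotomize a deep squat (DS) score and an FMS™-style
# composite score, then estimate their association with binomial logistic
# regression. All scores below are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 103
ds_score = rng.integers(0, 4, size=n)                       # 0-3, like an FMS™ item
composite = 4 * ds_score + rng.integers(0, 10, size=n)      # loosely related composite

df = pd.DataFrame({
    "low_ds": (ds_score < 2).astype(int),           # low performance on DS test
    "low_composite": (composite < 12).astype(int),  # low adjusted composite score
})

fit = smf.logit("low_composite ~ low_ds", data=df).fit(disp=False)
odds_ratio = np.exp(fit.params["low_ds"])
ci_low, ci_high = np.exp(fit.conf_int().loc["low_ds"])
print(f"OR = {odds_ratio:.2f} (95% CI {ci_low:.2f}, {ci_high:.2f})")
```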
