Results 1 - 20 of 39
1.
J Dance Med Sci ; 27(3): 173-179, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37264604

ABSTRACT

INTRODUCTION: Understanding the physical and mental health of collegiate dancers is important for developing appropriate screening protocols and treatment interventions. This study aims to provide descriptive data on the overall health, injury burden, and well-being of a group of collegiate dancers, including the interactions between injury, nutrition, and mental health, to provide insight for wellness screening and interventions in collegiate dance programs. METHODS: Members of the School of Dance at the University of Utah were sent an electronic general health survey. The survey included questions regarding medical history, family history, injuries, diet, sleep quality, symptoms of depression and anxiety, and history of eating disorders. RESULTS: Of the 231 dancers who received the survey, 198 responded (response rate = 85.7%). Fifty-two percent of respondents had an active injury. Symptoms of depression and anxiety were common (35.4%), and 37.4% of the dancers were interested in receiving mental health support. Symptoms of depression and anxiety had a significant association with both a history of injury and active injuries (P = .033 and .039, respectively). History of eating disorder was also significantly associated with active injuries (P = .005). The most commonly injured body area was the ankle or foot (n = 144, 72.7%), followed by the lower leg or shin (n = 76, 38.4%) and the knee (n = 61, 30.8%). Over a quarter of the dancers (n = 54, 27.3%) reported having trouble sleeping, and 9.1% reported a history of eating disorder. CONCLUSIONS: This study highlights the important interplay between mental health, sleep, nutrition, and injury. These results show that in a group of collegiate dancers, active injuries and mental health concerns are common, and that there are statistically significant associations between injury, nutrition, and mental health. These data provide insight into factors that affect dancer wellness and help inform future screening and intervention protocols for dance programs.
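The associations reported above (injury history or active injury vs. symptoms of depression/anxiety, P = .033 and .039) are the kind of 2x2 contingency-table comparisons typically tested with a chi-square or Fisher's exact test; the abstract does not name the test, so the sketch below is purely illustrative, and the counts in it are invented rather than taken from the study.

```python
# Hypothetical sketch: testing whether an active injury is associated with
# depression/anxiety symptoms using a 2x2 contingency table.
# The counts are invented for illustration; they are NOT the study's data.
from scipy.stats import chi2_contingency, fisher_exact

#            symptoms  no symptoms
table = [[45, 58],   # active injury
         [25, 70]]   # no active injury

chi2, p_chi2, dof, expected = chi2_contingency(table)
odds_ratio, p_fisher = fisher_exact(table)

print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p_chi2:.3f}")
print(f"Fisher's exact: OR = {odds_ratio:.2f}, p = {p_fisher:.3f}")
```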


Subjects
Dance, Humans, Dance/injuries, Lower Extremity, Ankle Joint, Universities
3.
Br J Sports Med ; 2022 Dec 13.
Article in English | MEDLINE | ID: mdl-36588428

ABSTRACT

OBJECTIVE: To describe the epidemiology of injuries at the Tokyo 2020 Paralympic Games, including injuries sustained in the new sports of badminton and taekwondo. METHODS: Injury data were obtained daily via the established web-based injury and illness surveillance system (WEB-IISS; 81 countries, 3836 athletes) and local organising committee medical facilities (81 countries, 567 athletes). Univariate unadjusted incidences (injuries per 1000 athlete days with 95% CIs), injury proportion (IP, %) and injury burden (days lost per 1000 athlete days) are reported. RESULTS: A total of 4403 athletes (1853 women, 2550 men) from 162 countries were monitored prospectively during the 3-day pre-competition and 12-day competition periods (66 045 athlete days). 386 injuries were reported in 352 athletes (IP=8.0%) with an incidence of 5.8 per 1000 athlete days (95% CI 5.3 to 6.5). Football 5-a-side (17.2), taekwondo (16.0), judo (11.6) and badminton (9.6) had the highest incidence. There was a higher incidence of injuries in the pre-competition period than in the competition period (7.5 vs 5.4; p=0.0053). Acute (sudden onset) injuries and injuries to the shoulder (0.7) and hand/fingers (0.6) were most common. Injury burden was 10.9 (8.6-13.8), with 35% of injuries resulting in time loss from training and competition. CONCLUSION: Compared with previous Paralympic Games, there was a reduction in injury incidence but higher injury burden at the Tokyo 2020 Paralympic Games. The new sports of taekwondo and badminton had a high injury incidence, with the highest injury burden in taekwondo, compared with other sports. These findings provide epidemiological data to inform injury prevention measures for high-risk sports.
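The incidence and burden figures above follow from the reported totals (386 injuries and 66 045 athlete days); the sketch below reproduces that arithmetic, substituting an exact Poisson interval for whichever CI method the surveillance study actually used, and back-calculating total days lost from the stated burden of 10.9 since it is not reported directly.

```python
# Sketch of the incidence/burden arithmetic reported above.
# The exact Poisson CI is an assumption; the study may have used another method.
from scipy.stats import chi2

injuries = 386
athlete_days = 66_045

incidence = injuries / athlete_days * 1000  # injuries per 1000 athlete days
ci_lo = chi2.ppf(0.025, 2 * injuries) / 2 / athlete_days * 1000
ci_hi = chi2.ppf(0.975, 2 * (injuries + 1)) / 2 / athlete_days * 1000
print(f"incidence = {incidence:.1f} per 1000 athlete days "
      f"(95% CI {ci_lo:.1f} to {ci_hi:.1f})")

# Burden: days lost per 1000 athlete days. Total days lost is not reported;
# ~720 is back-calculated from the stated burden of 10.9.
days_lost = 720
burden = days_lost / athlete_days * 1000
print(f"burden = {burden:.1f} days lost per 1000 athlete days")
```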

4.
Br J Sports Med ; 2022 Dec 13.
Article in English | MEDLINE | ID: mdl-36588431

ABSTRACT

OBJECTIVE: To describe the incidence and burden of illness at the Tokyo 2020 Paralympic Games, which was organised with strict COVID-19 countermeasures. METHODS: Daily illnesses were recorded via the web-based injury and illness surveillance system (teams with their own medical staff; n=81), and local polyclinic services (teams without their own medical staff; n=81). Illness proportion, incidence and burden were reported for all illnesses and in subgroups by sex, age, competition period, sports and physiological system. RESULTS: 4403 athletes (1853 female and 2550 male) from 162 countries were monitored for the 15-day period of the Tokyo Paralympic Games (66 045 athlete days). The overall incidence of illnesses per 1000 athlete days was 4.2 (95% CI 3.8 to 4.8; 280 illnesses). The highest incidences were in wheelchair tennis (7.1), shooting (6.1) and the new sport of badminton (5.9). A higher incidence was observed in female compared with male athletes (5.1 vs 3.6; p=0.005), as well as during the precompetition versus competition period (7.0 vs 3.5; p<0.0001). Dermatological and respiratory illnesses had the highest incidence (1.1 and 0.8, respectively). Illness burden was 4.9 days per 1000 athlete days, and 23% of illnesses resulted in time loss from training/competition of >1 day. CONCLUSION: The incidence of illness at the Tokyo 2020 Paralympic Games was the lowest yet recorded in either the summer or winter Paralympic Games. Dermatological and respiratory illnesses were the most common, with the burden of respiratory illness being the highest, largely due to time loss associated with COVID-19 cases. Infection countermeasures appeared successful in reducing respiratory and overall illness, suggesting implementation in future Paralympic Games may mitigate illness risk.

5.
Curr Sports Med Rep ; 20(6): 291-297, 2021 Jun 01.
Article in English | MEDLINE | ID: mdl-34099606

ABSTRACT

A web-based injury surveillance system was implemented through a collaboration between University of Utah researchers and the National Interscholastic Cycling Association (NICA) to better understand injury characteristics in mountain biking. Data were collected from NICA leagues during the 2018 and 2019 seasons. Injuries were tracked in 41,327 student-athlete-years, identifying 1750 unique injuries during 1155 injury events. Rider-dependent and rider-independent variables were analyzed. The most commonly reported injuries were concussion (23.6%), injuries to the wrist/hand (22.3%), and shoulder (15.6%). Half of all injury events occurred on downhills. Injury rates differed significantly between men and women (2.69% vs 3.21%, respectively; P = 0.009). Women sustained more lower-limb injuries (37.8% vs 28.3%; P = 0.003). Nearly 50% of crashes resulted in an emergency room visit. Youth mountain bike racing is a rapidly growing sport, and acute traumatic injuries are common. Injury surveillance system data are now being used to inform injury prevention strategies and direct future research.


Subjects
Bicycling/injuries, Students/statistics & numerical data, Athletes/statistics & numerical data, Bicycling/statistics & numerical data, Brain Concussion/epidemiology, Female, Hand Injuries/epidemiology, Humans, Lower Extremity/injuries, Male, Off-Road Motor Vehicles/statistics & numerical data, Population Surveillance/methods, Sex Distribution, Shoulder Injuries/epidemiology, Students/classification, Universities/statistics & numerical data, Wrist Injuries/epidemiology, Youth Sports/injuries
6.
J Sci Med Sport ; 24(10): 1032-1037, 2021 Oct.
Article in English | MEDLINE | ID: mdl-32546436

ABSTRACT

OBJECTIVES: To describe the design and implementation of an injury surveillance system for youth mountain bike racing in the United States, and to report preliminary first-year results. DESIGN: Descriptive sports injury epidemiology study. METHODS: After two and a half years of development and extensive beta-testing, an electronic injury surveillance system went live in January, 2018. An automated email is sent to a Designated Reporter on each team, with links to the injury reporting form. Data collected include demographic information, injured body part, injury diagnosis, trail conditions and other factors associated with injury occurrence. RESULTS: 837 unique injuries were reported in 554 injury events among 18,576 student-athletes. The overall injury event proportion was 3.0%. The most common injury among student-athletes was concussion/possible concussion (22.2%), followed by injuries to the wrist and hand (19.0%). Among 8,738 coaches, there were 134 unique injuries reported that occurred in 68 injury events, resulting in an overall injury event proportion of 0.8%. The shoulder (38.2%) was the most commonly injured body part among coaches. Injuries among coaches tended to more frequently result in fractures, dislocations and hospital admission compared with injuries among student-athletes. Among student-athletes, female riders sustained lower limb injuries more than male riders (34.0% vs. 20.7%, p<0.001). CONCLUSIONS: A nationwide injury surveillance system for youth mountain bike racing was successfully implemented in the United States. Overall injury event proportions were relatively low, but many injury events resulted in concussions/possible concussions, fractures, dislocations and 4 weeks or longer of time loss from riding.
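The paragraph above outlines the data elements captured by the NICA reporting form (demographics, injured body part, diagnosis, trail conditions and other event factors). A hypothetical sketch of how such a record might be structured is shown below; all field names and types are illustrative assumptions, not the system's actual schema.

```python
# Hypothetical record structure for a youth MTB injury surveillance report.
# Field names/types are illustrative assumptions, not the NICA system's schema.
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class Injury:
    body_part: str              # e.g. "wrist/hand", "shoulder", "head"
    diagnosis: str              # e.g. "concussion/possible concussion", "fracture"
    time_loss_weeks: float      # weeks of riding lost
    hospital_admission: bool = False

@dataclass
class InjuryEvent:
    event_date: date
    rider_role: str             # "student-athlete" or "coach"
    rider_age: int
    rider_sex: str
    trail_conditions: str       # e.g. "dry", "muddy", "loose over hard"
    terrain: str                # e.g. "downhill", "flat", "climb"
    injuries: List[Injury] = field(default_factory=list)

# One injury event can hold several unique injuries, mirroring the
# 837-injuries / 554-events counts reported above.
event = InjuryEvent(date(2018, 9, 15), "student-athlete", 15, "F", "dry", "downhill",
                    [Injury("wrist/hand", "fracture", 6.0, False)])
print(event)
```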


Subjects
Athletic Injuries/epidemiology, Bicycling/injuries, Population Surveillance/methods, Athletic Injuries/prevention & control, Female, Humans, Incidence, Longitudinal Studies, Male, Prospective Studies, United States/epidemiology
7.
Drug Test Anal ; 11(11-12): 1747-1754, 2019 Nov.
Article in English | MEDLINE | ID: mdl-31697019

ABSTRACT

Understanding and characterizing confounding factors to the Athlete Biological Passport (ABP) is crucial for the reliable interpretation of biological profiles in the antidoping field. The physiological effects on hematological parameters and plasma volume (PV) following competition in a long-distance triathlon, as seen in the Ironman discipline, have yet to be fully described and are the focus of this study. Complete blood count tests were conducted on 19 Ironman triathletes before and after an Ironman triathlon to characterize changes in hematological parameters and the effect on ABP interpretation, as it was hypothesized that changes in plasma volume may result in atypical ABP profiles. Baseline blood samples were collected from the athletes prior to the event, and one sample was collected per day for up to 1 week following the race. Differences were observed between the male and female athletes across multiple parameters. Most importantly for the ABP, decreases in hemoglobin concentration (HGB) and hematocrit (HCT) were identified post-race, with the largest decreases identified on day +2. The average HGB returned to pre-race baseline levels on day +5. Beginning 5-6 days after the race, increases in the reticulocyte percentage (Ret%) were identified. Atypical Passport Findings were identified in 32% (6/19) of the ABPs, flagged mainly because of atypical hemoglobin concentrations and, in one instance, because the OFF-score exceeded the adaptive model limits. These results provide a timeline of hematological changes and the expected shifts in plasma volume following an Ironman triathlon, offering important data for the reliable interpretation of ABP profiles in this setting.
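The OFF-score referenced above combines hemoglobin concentration and reticulocyte percentage; a commonly cited formulation is OFF-score = Hb [g/L] − 60·√(Ret%). The sketch below shows that calculation with invented example values; the ABP adaptive model itself applies individualized limits, so this is illustrative only.

```python
# Illustrative OFF-score calculation using the commonly cited formulation
# OFF = Hb [g/L] - 60 * sqrt(Ret%). Example values are invented; the ABP
# adaptive model sets individualized reference limits per athlete.
import math

def off_score(hgb_g_per_dl: float, ret_percent: float) -> float:
    """OFF-score from hemoglobin (g/dL) and reticulocyte percentage."""
    hgb_g_per_l = hgb_g_per_dl * 10.0
    return hgb_g_per_l - 60.0 * math.sqrt(ret_percent)

# A post-race HGB drop followed by a reticulocyte rebound pushes the score down:
print(off_score(15.0, 1.0))   # baseline-like values -> 90.0
print(off_score(13.8, 1.6))   # post-race pattern    -> ~62.1
```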


Subjects
Running, Adult, Athletes, Athletic Performance, Blood Cell Count, Female, Hematocrit, Humans, Male, Middle Aged, Physical Endurance, Plasma Volume
8.
Clin J Sport Med ; 29(4): 329-335, 2019 07.
Article in English | MEDLINE | ID: mdl-31241537

ABSTRACT

OBJECTIVE: To characterize factors associated with helmet use and risk-taking behavior among recreational skiers and snowboarders. DESIGN: Observational study. SETTING: Large, western United States mountain resort. PARTICIPANTS: 1285 male and female recreational skiers and snowboarders were interviewed during a single winter ski season. INDEPENDENT VARIABLES: Helmet use, demographic, and sport-related characteristics. MAIN OUTCOME MEASURES: Brief Sensation Seeking Scale (BSSS) as a measure of risk-taking behavior and self-reported risk compensation. RESULTS: Of the respondents (N = 1285), 17.5%, 12.5%, and 70.0% reported that they never, sometimes, and always wore a helmet, respectively. Multiple linear regression analysis showed that individuals reporting sometimes wearing a helmet had significantly higher BSSS scores than those reporting never wearing a helmet (P = 0.031) or always wearing it (P = 0.018). Male gender, younger age, snowboarding, higher perceived sport ability, more days per year skiing or snowboarding, and more time spent in the terrain park were significantly associated with higher BSSS scores (P < 0.05). Logistic regression analysis focusing on subgroups of respondents who reported either sometimes or always wearing a helmet indicated that the odds of taking more risks when wearing a helmet were 75% higher for inconsistent helmet users than for those who reported always wearing a helmet (P = 0.06). CONCLUSIONS: Inconsistent helmet users have characteristics of risk-taking behavior and risk compensation. Male gender, younger age, snowboarding, higher perceived sport ability, and more time spent on the mountain and in the terrain park are also important determinants of risk-taking behavior.


Subjects
Head Protective Devices/statistics & numerical data, Risk-Taking, Skiing, Adolescent, Adult, Age Factors, Aged, Aged, 80 and over, Athletic Injuries/prevention & control, Child, Female, Humans, Male, Middle Aged, Multivariate Analysis, Regression Analysis, Sex Factors, United States, Young Adult
9.
J Clin Endocrinol Metab ; 104(3): 906-914, 2019 03 01.
Article in English | MEDLINE | ID: mdl-30295816

ABSTRACT

Context: Clomiphene is a performance-enhancing drug commonly abused by males in sport, but the extent to which testosterone increases in healthy males following its use is unknown. In addition, evidence suggests that clomiphene, a mixture of cis- and trans-isomers zuclomiphene and enclomiphene, is detectable in urine for months following use; the isomer-specific urinary detection window has yet to be characterized in a controlled study. Objective: To determine the effect of once-daily, 30-day clomiphene treatment on serum testosterone and gonadotropin levels in the subject population studied and the urinary clearance and detection window of clomiphene isomers following administration for antidoping purposes. Participants and Design: Twelve healthy males aged 25 to 38 years, representing a recreational athlete population, participated in this open-label, single-arm study. Intervention: Oral clomiphene citrate (50 mg) was self-administered once daily for 30 days. Serum and urine samples were collected at baseline and at days 7, 14, 21, 28, 30, 32, 35, 37, 44, 51, and 58; urine collections continued periodically up to day 261. Results: Mean testosterone, LH, and FSH levels increased 146% (SEM, ±23%), 177% (±34%), and 170% (±33%), respectively, during treatment compared with baseline. Serum drug concentrations and urinary excretion were nonuniform among individuals as isomeric concentrations varied. The zuclomiphene urinary detection window ranged from 121 to >261 days. Conclusions: Clomiphene significantly raised serum testosterone and gonadotropin levels in healthy men and thus can be abused as a performance-enhancing drug. Such abuse is detectable in urine for ≥4 months following short-term use.


Subjects
Clomiphene/adverse effects, Hypothalamo-Hypophyseal System/drug effects, Performance-Enhancing Substances/adverse effects, Testis/drug effects, Administration, Oral, Adult, Clomiphene/administration & dosage, Clomiphene/urine, Doping in Sports/methods, Doping in Sports/prevention & control, Follicle Stimulating Hormone/blood, Gonadotropins/blood, Gonadotropins/metabolism, Healthy Volunteers, Humans, Hypothalamo-Hypophyseal System/metabolism, Luteinizing Hormone/blood, Male, Performance-Enhancing Substances/administration & dosage, Performance-Enhancing Substances/urine, Self Administration, Testis/metabolism, Testosterone/blood, Testosterone/metabolism
10.
Phys Sportsmed ; 46(3): 349-354, 2018 09.
Article in English | MEDLINE | ID: mdl-29333913

ABSTRACT

OBJECTIVES: To examine whether changing weigh-in from the same day of the match to the day before the match and prohibiting 6-oz gloves are associated with fatalities in boxing matches sanctioned by the Japan Boxing Commission (JBC). METHODS: We analyzed the rates of boxing fatalities before and after the two rule changes above via secondary analysis of data. Demographics and boxing records of deceased boxers were examined using descriptive statistics, the exact binomial test, the Mann-Whitney-Wilcoxon test, and Fisher's exact test. RESULTS: A total of 38 boxers (23.9 ± 3.3 years of age) have reportedly died of injuries sustained in JBC-sanctioned boxing matches since 1952. Changing weigh-in to the day before the match or prohibiting 6-oz gloves was not significantly associated with the rates of boxing fatalities 5 years and 10 years before and after the rule changes (p > 0.05). Deceased boxers after these rule changes were significantly older, completed significantly more rounds in the final match, and were significantly less likely to lose the previous match (prior to the final match) and to do so by knockout (p < 0.05). CONCLUSION: Changing weigh-in to the day before the match and prohibiting 6-oz gloves may not result in reducing boxing fatalities.


Subjects
Athletic Injuries/mortality, Boxing/injuries, Adult, Athletic Injuries/prevention & control, Boxing/standards, Humans, Japan, Male, Young Adult
11.
J Strength Cond Res ; 32(2): 396-408, 2018 Feb.
Article in English | MEDLINE | ID: mdl-28135222

ABSTRACT

Teramoto, M, Cross, CL, Rieger, RH, Maak, TG, and Willick, SE. Predictive validity of National Basketball Association Draft Combine on future performance. J Strength Cond Res 32(2): 396-408, 2018-The National Basketball Association (NBA) Draft Combine is an annual event where prospective players are evaluated in terms of their athletic abilities and basketball skills. Data collected at the Combine should help NBA teams select the right players for the upcoming NBA draft; however, its value for predicting future performance of players has not been examined. This study investigated the predictive validity of the NBA Draft Combine on future performance of basketball players. We performed a principal component analysis (PCA) on the 2010-2015 Combine data to reduce correlated variables (N = 234), a correlation analysis on the Combine data and future on-court performance to examine relationships (maximum pairwise N = 217), and a robust principal component regression (PCR) analysis to predict first-year and 3-year on-court performance from the Combine measures (N = 148 and 127, respectively). Three components were identified within the Combine data through PCA (= Combine subscales): length-size, power-quickness, and upper-body strength. As per the correlation analysis, the individual Combine items for anthropometrics, including height without shoes, standing reach, weight, wingspan, and hand length, as well as the Combine subscale of length-size, had positive, medium-to-large-sized correlations (r = 0.313-0.545) with defensive performance quantified by Defensive Box Plus/Minus. The robust PCR analysis showed that the Combine subscale of length-size was the predictor most significantly associated with future on-court performance (p ≤ 0.05), including Win Shares, Box Plus/Minus, and Value Over Replacement Player, followed by upper-body strength. In conclusion, the NBA Draft Combine has value for predicting future performance of players.
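The study described above used PCA followed by a robust principal component regression. The sketch below strings together an ordinary (non-robust) scaler, PCA, and linear regression pipeline in scikit-learn as a simplified stand-in; the data, dimensions, and outcome name are invented for illustration.

```python
# Simplified (non-robust) principal component regression, as a stand-in for
# the robust PCR described above. Data and dimensions are invented.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(148, 10))             # 148 players x 10 Combine measures (fake)
y = X[:, 0] * 0.5 + rng.normal(size=148)   # fake on-court outcome, e.g. Win Shares

pcr = make_pipeline(StandardScaler(), PCA(n_components=3), LinearRegression())
pcr.fit(X, y)

print("explained variance ratios:", pcr.named_steps["pca"].explained_variance_ratio_)
print("component coefficients:", pcr.named_steps["linearregression"].coef_)
```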


Subjects
Athletic Performance/statistics & numerical data, Basketball/physiology, Body Weights and Measures, Exercise Test, Humans, Male, Principal Component Analysis, Prospective Studies, Reproducibility of Results, Young Adult
12.
Orthop J Sports Med ; 5(11): 2325967117740862, 2017 Nov.
Article in English | MEDLINE | ID: mdl-29226165

ABSTRACT

BACKGROUND: Concussion prevention in the National Football League (NFL) is an important priority for player safety. The NFL now has modified game schedules, and one concern is that unconventional game schedules, such as a shortened rest period due to playing on a Thursday rather than during the weekend, may lead to an increased risk of injuries. HYPOTHESIS: Unconventional game schedules in the NFL are associated with an increased rate of concussion. STUDY DESIGN: Descriptive epidemiological study. METHODS: This study analyzed concussions and game schedules over the NFL regular seasons from 2012 to 2015 (4 years). Documented numbers of concussions, identified by use of the online database PBS Frontline Concussion Watch, were summarized by regular-season weeks. Association of days of rest and game location (home, away, or overseas) with the rate of concussion was examined by use of the χ2 test. Logistic regression analysis was performed to examine the relationships of days of rest and home/away games to the risk of repeated concussions, with adjustment for player position. RESULTS: A total of 582 concussions were analyzed in this study. A significantly greater number of concussions occurred in the second half of the season (P < .01). No significant association was found between the rate of concussion and the days of rest, game location, or timing of the bye week by the team or the opponent (P > .05). Game schedules were not significantly associated with the occurrence of repeat concussions (P > .05). CONCLUSION: Unconventional game schedules in the NFL, including playing on Thursday and playing overseas, do not seem to put players at increased risk of concussions.

13.
Wilderness Environ Med ; 28(3): 185-196, 2017 Sep.
Article in English | MEDLINE | ID: mdl-28755819

ABSTRACT

OBJECTIVE: To gather epidemiologic data on injury type, treatment, and recovery from rock climbing injuries. METHODS: DESIGN: retrospective cross-sectional study. SETTING: web-based survey. PARTICIPANTS: rock climbers who sustained a climbing-related injury during the prior 24 months. Criteria for inclusion: aged ≥18 years; participation in rock climbing at least 4 times per year in the United States. INTERVENTIONS: none. MAIN OUTCOME MEASURES: percentage of injured climbers seeking medical care, providers seen, subspecialty referral, development of chronic problems, factors affecting return to climbing, injuries by climbing type, body region, and injury type. RESULTS: Data were collected over a 60-day period using the Research Electronic Data Capture (REDCap) survey system. A total of 708 surveys were collected from 553 male and 155 female climbers. A total of 1397 injuries were reported, and 975 were suitable for analysis. The most common provider initially seen was a primary care provider. Subspecialty referral was commonly obtained. Injury patterns differed by climbing type. The percentage of respondents who returned to climbing before their injury was fully healed was 51.1%, and 44.9% of respondents developed chronic problems related to their climbing injury. Twenty-eight percent of respondents were unable to return to their previous level of climbing performance. Several factors were associated with delayed recovery from climbing injury. CONCLUSIONS: A significant number of climbers sought healthcare after injury. A majority of climbers who sought treatment were referred to subspecialist providers. About one-half of climbers were symptomatic when they returned to climbing and developed chronic problems after injury. Factors associated with slower return to climbing included increasing age, smoking, fractures, and surgery.


Subjects
Athletic Injuries/epidemiology, Athletic Injuries/therapy, Mountaineering/injuries, Adult, Athletic Injuries/etiology, Athletic Injuries/rehabilitation, Cross-Sectional Studies, Female, Humans, Male, Mountaineering/statistics & numerical data, Retrospective Studies, United States/epidemiology, Young Adult
14.
J Sci Med Sport ; 20(3): 230-235, 2017 Mar.
Article in English | MEDLINE | ID: mdl-27622705

ABSTRACT

OBJECTIVES: Injury management is critical in the National Basketball Association (NBA), as players experience a wide variety of injuries. Recently, it has been suggested that game schedules, such as back-to-back games and four games in five days, increase the risk of injuries in the NBA. The aim of this study was to examine the association between game schedules and player injuries in the NBA. DESIGN: Descriptive epidemiology study. METHODS: The present study analyzed game injuries and game schedules in the 2012-13 through 2014-15 regular seasons. Game injuries by game schedules and players' profiles were examined using an exact binomial test, the Fisher's exact test and the Mann-Whitney-Wilcoxon test. A Poisson regression analysis was performed to predict the number of game injuries sustained by each player from game schedules and injured players' profiles. RESULTS: There were a total of 681 cases of game injuries sustained by 280 different players during the three years (total N=1443 players). Playing back-to-back games or playing four games in five days alone was not associated with an increased rate of game injuries, whereas a significant positive association was found between game injuries and playing away from home (p<0.05). Playing back-to-back games and away games were significant predictors of frequent game injuries (p<0.05). CONCLUSIONS: Game schedules could be one factor that impacts the risk of game injuries in the NBA. The findings could be useful for designing optimal game schedules in the NBA as well as helping NBA teams make adjustments to minimize game injuries.
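The Poisson regression described above models per-player injury counts as a function of schedule variables. A minimal sketch of that model form using the statsmodels formula API is shown below; the column names, data, and coefficients are invented purely to illustrate the approach, not to reproduce the study's results.

```python
# Illustrative Poisson regression of per-player game-injury counts on
# schedule exposure variables (column names and data are invented).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "injuries": rng.poisson(0.5, n),          # injuries per player over a season
    "back_to_backs": rng.integers(10, 25, n), # back-to-back games played
    "away_games": rng.integers(35, 45, n),    # away games played
    "age": rng.integers(19, 38, n),
})

model = smf.glm("injuries ~ back_to_backs + away_games + age",
                data=df, family=sm.families.Poisson()).fit()
print(model.summary())
print(np.exp(model.params))  # incidence rate ratios
```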


Subjects
Appointments and Schedules, Athletic Injuries/epidemiology, Basketball/injuries, Adult, Athletic Injuries/etiology, Humans, Sleep Deprivation/complications, United States/epidemiology, Young Adult
15.
Drug Test Anal ; 8(11-12): 1197-1203, 2016 Nov.
Article in English | MEDLINE | ID: mdl-27714988

ABSTRACT

The laboratory profile of intranasal testosterone gel has not been previously reported from an anti-doping perspective. Because intranasal testosterone gel is newly available as a commercial product, we sought to examine the laboratory parameters following administration of this formulation, with particular attention to anti-doping guidelines. Five healthy and active male subjects were administered testosterone intranasal gel three times daily for four weeks, using a pattern of five consecutive days on, two days off. Urine was collected after each five-day round of drug administration and analyzed using a full steroid screen and isotope ratio mass spectrometry (IRMS). Windows of detection for elevated testosterone/epitestosterone (T/E) and other steroid ratios, World Anti-Doping Agency (WADA) athlete biological passport (ABP) findings, and IRMS results were analyzed in this study. In the 0-24 h window post-administration, 70% of samples were flagged with a suspicious steroid profile and 85% were flagged as atypical passport findings according to the WADA ABP steroid module. In the 24-48 h window, 0% of samples displayed suspicious steroid profiles while 40% resulted in atypical passport findings. IRMS testing confirmed the presence of exogenous testosterone in 90% and 40% of samples in the 0-24 h and 24-48 h windows post-administration, respectively. Additionally, IRMS data were analyzed to determine commonalities in the population changes in δ13C values of testosterone, androsterone, etiocholanolone, 5αAdiol, and 5βAdiol. Though no metabolic trend specific to the route of administration was discernible, we found that intranasal testosterone gel is detectable using conventional anti-doping tests. Copyright © 2016 John Wiley & Sons, Ltd.


Subjects
Administration, Intranasal/methods, Androsterone/analysis, Biomarkers/analysis, Carbon Isotopes/chemistry, Epitestosterone/analysis, Etiocholanolone/analysis, Gas Chromatography-Mass Spectrometry/methods, Steroids/analysis, Substance Abuse Detection/methods, Testosterone/administration & dosage, Androsterone/chemistry, Athletes, Biomarkers/metabolism, Doping in Sports, Epitestosterone/chemistry, Etiocholanolone/chemistry, Humans, Mass Spectrometry, Steroids/chemistry, Testosterone/chemistry, Time Factors
16.
J Strength Cond Res ; 30(5): 1379-90, 2016 May.
Article in English | MEDLINE | ID: mdl-27100168

ABSTRACT

The National Football League (NFL) Scouting Combine is held each year before the NFL Draft to measure athletic abilities and football skills of college football players. Although the NFL Scouting Combine can provide the NFL teams with an opportunity to evaluate college players for the upcoming NFL Draft, its value for predicting future success of players has been questioned. This study examined whether the NFL Combine measures can predict future performance of running backs (RBs) and wide receivers (WRs) in the NFL. We analyzed the 2000-09 Combine data of RBs (N = 276) and WRs (N = 447) and their on-field performance for the first 3 years after the draft and over their entire careers in the NFL, using correlation and regression analyses, along with a principal component analysis (PCA). The results of the analyses showed that, after accounting for the number of games played, draft position, height (HT), and weight (WT), the 10-yard dash time was the most important predictor of rushing yards per attempt over the first 3 years (p = 0.002) and over entire careers (p < 0.001) for RBs. For WRs, vertical jump was significantly associated with receiving yards per reception over the first 3 years (p = 0.001) and over entire careers (p = 0.004) in the NFL, after adjusting for the covariates above. Furthermore, HT was most important in predicting future performance of WRs. The analyses also revealed that the 8 athletic drills in the Combine seemed to have construct validity. It seems that the NFL Scouting Combine has some value for predicting future performance of RBs and WRs in the NFL.


Subjects
Athletic Performance/physiology, Exercise/physiology, Football/physiology, Running/physiology, Humans, Male, Predictive Value of Tests, Regression Analysis
17.
PM R ; 8(3 Suppl): S125-32, 2016 Mar.
Article in English | MEDLINE | ID: mdl-26972261

ABSTRACT

Historical reports of doping in sports date as far back as the ancient Greek Olympic Games. The anti-doping community considers doping in sports to be cheating and a violation of the spirit of sport. During the past century, there has been an increasing awareness of the extent of doping in sports and the health risks of doping. In response, the anti-doping movement has endeavored to educate athletes and others about the health risks of doping and promote a level playing field. Doping control is now undertaken in most countries around the world and at most elite sports competitions. As athletes have found new ways to dope, however, the anti-doping community has endeavored to strengthen its educational and deterrence efforts. It is incumbent upon sports medicine professionals to understand the health risks of doping and all doping control processes.


Subjects
Athletes/legislation & jurisprudence, Doping in Sports/trends, Sports/legislation & jurisprudence, Humans
18.
Am J Sports Med ; 44(6): 1455-62, 2016 Jun.
Article in English | MEDLINE | ID: mdl-26920432

ABSTRACT

BACKGROUND: The incidence rates (IRs) and factors associated with injuries in the sport of Paralympic athletics (track and field) have not been comprehensively and prospectively studied. PURPOSE: To determine injury IRs, characteristics of injuries, and associated factors in the sport of athletics at the London 2012 Paralympic Games. STUDY DESIGN: Cohort study; Level of evidence, 2. METHODS: A total of 977 athletes competing in the sport of athletics were followed over a total 10-day competition period of the Paralympic Games. Daily injury data were obtained via 2 databases: (1) a custom-built, web-based injury and illness surveillance system (WEB-IISS), maintained by team medical personnel; and (2) the organizing committee database, maintained by medical providers in the medical stations operated by the London Organising Committee of the Olympic and Paralympic Games. Athlete impairment and event discipline were obtained via the International Paralympic Committee athlete database. IRs (injuries per 1000 athlete-days) by impairment, event discipline, sex, and age were examined. RESULTS: The overall IR was 22.1 injuries per 1000 athlete-days (95% CI, 19.5-24.7). In track disciplines, ambulant athletes with cerebral palsy experienced a lower incidence of injuries (IR, 10.2; 95% CI, 4.2-16.2) when compared with ambulant athletes from other impairment categories. Athletes in seated throwing experienced a higher incidence of injuries (IR, 23.7; 95% CI, 17.5-30.0) when compared with athletes in wheelchair racing (IR, 10.6; 95% CI, 5.5-15.6). In both track and field disciplines, the majority of injuries did not result in time loss from competition or training. Ambulant athletes experienced the greatest proportion of injuries to the thigh (16.4% of all injuries; IR, 4.0), observed predominantly in track athletes. Wheelchair or seated athletes experienced the greatest proportion of injuries to the shoulder/clavicle (19.3% of all injuries; IR, 3.4), observed predominantly in field athletes. CONCLUSION: This is the first prospective cohort study examining injury IRs and associated factors in the sport of athletics at the Paralympic Games. Injury patterns were specific to the event discipline and athlete impairment. The majority of injuries occurred to the thigh (ambulant athletes) or shoulder/clavicle (wheelchair or seated athletes) and did not result in time loss.


Subjects
Athletes, Athletic Injuries/epidemiology, Disabled Persons, Track and Field/injuries, Adolescent, Adult, Aged, Athletes/statistics & numerical data, Athletic Injuries/etiology, Disabled Persons/statistics & numerical data, Female, Humans, Incidence, London/epidemiology, Male, Middle Aged, Prospective Studies, Risk Factors, Young Adult
19.
PM R ; 8(6): 545-52, 2016 06.
Article in English | MEDLINE | ID: mdl-26454234

ABSTRACT

BACKGROUND: The epidemiology of injury in Paralympic football has received little attention. A study of all sports at the London 2012 Paralympic Games identified football 5-a-side as the sport with the highest injury rate, meriting further detailed analysis, which may facilitate the development of strategies to prevent injuries. OBJECTIVE: To examine the injury rates and risk factors associated with injury in Paralympic football. DESIGN: Secondary analysis of a prospective cohort study of injuries to football 5-a-side and football 7-a-side athletes. SETTING: London 2012 Paralympic Games. PARTICIPANTS: Participants included 70 football 5-a-side athletes and 96 football 7-a-side athletes. Athletes from all but one country chose to participate in this study. METHODS: The Paralympic Injury and Illness Surveillance System was used to track injuries during the Games, with data entered by medical staff. MAIN OUTCOME MEASUREMENTS: Injury incidence rate (IR) and injury incidence proportion (IP). RESULTS: The overall IR for football 5-a-side was 22.4 injuries/1000 athlete-days (95% confidence interval [CI], 14.1-33.8) with an IP of 31.4 injuries per 100 athletes (95% CI, 20.9-43.6). In 5-a-side competition, 62.5% of injuries were associated with foul play. The overall IR for football 7-a-side was 10.4 injuries/1000 athlete-days (95% CI, 5.4-15.5), with an IP of 14.6 injuries per 100 athletes (95% CI, 7.5-21.6). The most commonly injured body region in both sports was the lower extremity. CONCLUSIONS: To our knowledge, this study is the first to examine IR and risk factors associated with injury in Paralympic football. Future studies are needed to determine mechanisms of injury and independent risk factors for injury, thus informing prevention strategies.


Subjects
Athletic Injuries/epidemiology, Anniversaries and Special Events, Football, Humans, Incidence, London, Prospective Studies
20.
Orthop J Sports Med ; 3(12): 2325967115620365, 2015 Dec.
Article in English | MEDLINE | ID: mdl-26740957

ABSTRACT

BACKGROUND: The majority of studies on concussion in the National Football League (NFL) focus on testing, evaluation, and outcomes. Meanwhile, there is a paucity of research on how a team's style of play influences the risk of concussion. HYPOTHESIS: Style of play, such as offensive and defensive strategies, is associated with the rate of concussions in the NFL. STUDY DESIGN: Descriptive epidemiology study. METHODS: The current study retrospectively analyzed data from the 2012 to 2014 NFL regular seasons. Reported numbers of concussions were stratified by each team and each position and were compared based on style of play, including offensive scheme (West Coast offense, Air Coryell offense, or other offensive schemes) and defensive alignment (3-4 or 4-3), attempts statistics, per-drive statistics, and offensive and defensive production, along with strength of schedule (SoS) and team quality measured by the simple rating system (SRS). Data analyses included descriptive statistics, 1-way analysis of variance, correlation analysis, and regression analysis. RESULTS: There were 437 documented concussions during the 2012 to 2014 NFL regular seasons, with a mean of 4.6 concussions per season per team. In general, players most involved in pass plays reported more concussions. The number of concussions sustained by offensive players was significantly higher among the teams adopting the West Coast offense (mean, 3.0) than among those utilizing the Air Coryell offense (mean, 1.6; P = .006) or those with non-West Coast offenses combined (mean, 1.9; P = .004). The multiple regression analysis revealed that use of the West Coast offense (yes/no), SoS, and SRS together explained 25.3% of the variance in the number of concussions sustained by offensive players. After accounting for SRS, the West Coast offense was found to be a significant predictor of the number of concussions (P = .007), while there was a tendency for SoS to be inversely associated with the number of concussions (P = .105). None of the variables for attempts statistics, per-drive statistics, and offensive production were significantly associated with the number of concussions in the regression analysis. CONCLUSION: In the NFL, players most involved in pass plays appear to be at increased risk for concussions. The West Coast offense may be associated with a greater risk of concussion. Furthermore, teams with easier schedules may have more players sustaining concussions.
