Results 1 - 20 of 39
1.
Article in English | MEDLINE | ID: mdl-38348303

ABSTRACT

Objective: Lower extremity ankle and knee injuries occur at a high rate in the National Basketball Association (NBA), often requiring surgical intervention. This study aimed to identify surgical rates and risk factors for surgical intervention in ankle and knee injuries in NBA players using multivariate analysis. Methods: Player demographics, performance metrics, advanced statistics, and injury characteristics were recorded using publicly available data. To standardize injury events over multiple years, injury events per 1000 athlete-exposures (AE; one AE = one player participating in one game) were calculated. Descriptive analysis and multivariate logistic regression were completed to identify associations with surgical intervention in ankle and knee injuries. Results: A total of 1153 ankle and knee injuries were included in the analysis, with 73 (6.33%) lower extremity injuries treated with surgery. Knee injuries had a higher incidence of surgical intervention (0.23 per 1000 AE) than ankle injuries (0.04 per 1000 AE). The most frequent surgical knee injury was a meniscus tear treated with meniscus repair (0.05 per 1000 AE), and the most frequent ankle surgery was surgical debridement (0.01 per 1000 AE). Multivariate logistic regression indicated that lower extremity injuries requiring surgery were associated with more minutes per game played (odds ratio [OR] 1.13; p = 0.02), a greater usage rate (OR 1.02; p < 0.001), the center position (OR 1.64; 95% CI 1.2-2.24; p = 0.002), and a lower player efficiency rating (OR 0.96; p < 0.001). Conclusion: Knee surgery was significantly more frequent than ankle surgery despite similar injury rates per 1000 exposures. The center position carried the greatest risk for lower extremity injury requiring surgery, followed by minutes played, while a higher player efficiency rating was protective against surgical intervention. Developing strategies to address these factors will help in the management and prevention of lower extremity injuries requiring surgical intervention.
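
As a rough illustration of the rate standardization and regression described in this abstract, the sketch below computes injury events per 1000 athlete-exposures and fits a logistic regression for surgical intervention. The file name and column names (minutes_per_game, usage_rate, is_center, per, surgery) are hypothetical, not the authors' dataset.

```python
# Hedged sketch, not the study's code: rate per 1000 athlete-exposures (AE) and a
# multivariate logistic regression for surgical intervention. Inputs are assumed.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def injuries_per_1000_ae(n_injuries: int, n_athlete_exposures: int) -> float:
    """Injury events per 1000 athlete-exposures (1 AE = one player in one game)."""
    return 1000.0 * n_injuries / n_athlete_exposures

# One row per injury event; 'surgery' is 1 if treated operatively, else 0.
df = pd.read_csv("nba_ankle_knee_injuries.csv")          # hypothetical file
X = sm.add_constant(df[["minutes_per_game", "usage_rate", "is_center", "per"]])
fit = sm.Logit(df["surgery"], X).fit()
print(np.exp(fit.params))        # exponentiated coefficients = odds ratios
print(np.exp(fit.conf_int()))    # 95% CIs for the odds ratios
```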

2.
Phys Sportsmed ; 52(2): 160-166, 2024 Apr.
Article in English | MEDLINE | ID: mdl-36990061

ABSTRACT

OBJECTIVE: To describe the epidemiology, mechanisms, treatment, and disability for facial injuries in National Basketball Association (NBA) athletes. METHODS: This was a retrospective descriptive epidemiological chart review using the NBA Electronic Medical Record (EMR) system. Injuries reported in games, practices, and other activities were used for all data analyses except game incidence rates, which were calculated as game-related facial injuries per total athlete-exposures (player-games). RESULTS: There were 440 facial injuries among 263 athletes during the 5 NBA seasons, with an overall single-season risk of 12.6% and a game incidence of 2.4 per 1000 athlete-exposures (95% CI: 2.18-2.68). The majority of injuries were lacerations (n = 159, 36.1%), contusions (n = 99, 22.5%), or fractures (n = 67, 15.2%), with ocular injuries (n = 163, 37.0%) being the most common location. Sixty (13.6%) injuries resulted in at least one NBA game missed (224 cumulative player-games), with ocular injuries resulting in the most cumulative games missed (n = 167, 74.6%). Nasal fractures (n = 39, 58.2%) were the most common fracture location, followed by ocular fractures (n = 12, 17.9%), but were less likely to lead to games missed (median = 1, IQR: 1-3) than ocular fractures (median = 7, IQR: 2-10). CONCLUSIONS: An average of one in eight NBA players sustained a facial injury each season, with ocular injuries being the most common location. While most facial injuries are minor, serious injuries, especially ocular fractures, can result in games missed.
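
For context, the game-incidence figure quoted above (injuries per 1000 athlete-exposures with a 95% CI) can be approximated as below; the event and exposure counts in the example are made up, not taken from the study.

```python
# Minimal sketch: incidence per 1000 athlete-exposures with a Poisson-based
# normal-approximation 95% CI. Counts below are placeholders, not study data.
import math

def incidence_per_1000(events: int, exposures: int):
    rate = 1000.0 * events / exposures
    se = 1000.0 * math.sqrt(events) / exposures   # SE assuming Poisson-distributed counts
    return rate, (rate - 1.96 * se, rate + 1.96 * se)

rate, ci = incidence_per_1000(events=335, exposures=138_000)
print(f"{rate:.2f} per 1000 AEs (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```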


Subject(s)
Basketball , Eye Injuries , Bone Fractures , Humans , Retrospective Studies , Basketball/injuries , Incidence , Eye Injuries/epidemiology , Bone Fractures/epidemiology
3.
Saudi J Ophthalmol ; 37(3): 222-226, 2023.
Article in English | MEDLINE | ID: mdl-38074301

ABSTRACT

PURPOSE: To report the financial loss, demographic metrics, and mechanisms of injury associated with eye injuries in the National Basketball Association (NBA) from the 2010-2011 to 2017-2018 seasons. METHODS: We performed a retrospective review of eye injuries in the NBA from the 2010-2011 to 2017-2018 seasons using publicly available information from the Basketball Reference and Pro Sports Transactions websites. Only injuries of the eye and adnexa that caused players to miss games in the regular season and playoffs were included in the study. Financial loss was calculated from the regular-season salary of the players and normalized for inflation with 2018 as the base year. RESULTS: There were 30 eye injuries causing a total of 106 missed games and $7,486,770 in financial losses across eight seasons. Linear regressions showed moderately positive upward trends in eye injuries (Pearson's r = 0.68, P = 0.07; increase of 0.79 injuries per year per 1000 game-days) and financial losses (Pearson's r = 0.67, P = 0.07; increase of $185.75 per year per 1000 game-days) over time. There were significantly more games missed due to orbital fractures than due to contusions/lacerations (11.5 vs. 2.8 missed games, P = 0.01). CONCLUSION: We demonstrate an increasing trend of eye injuries in the NBA, resulting in increasing financial loss. Injury type varies and affects the number of games missed.
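
A minimal sketch of the trend analysis described above, using placeholder season counts (not the study's data): Pearson correlation and a least-squares slope over seasons, plus a simple CPI-style normalization to 2018 dollars.

```python
# Illustrative only: season-level trend test and inflation normalization.
# Injury counts and CPI values are placeholders, not figures from the study.
import numpy as np
from scipy import stats

seasons = np.arange(2010, 2018)                  # 2010-11 through 2017-18 seasons
injuries = np.array([2, 1, 3, 4, 3, 5, 6, 6])    # placeholder injury counts

r, p = stats.pearsonr(seasons, injuries)
slope, _ = np.polyfit(seasons, injuries, 1)      # injuries gained per season
print(f"Pearson r = {r:.2f}, P = {p:.2f}, slope = {slope:.2f}")

def to_2018_dollars(nominal: float, cpi_year: float, cpi_2018: float) -> float:
    """Normalize a nominal dollar amount to 2018 dollars via a CPI ratio."""
    return nominal * (cpi_2018 / cpi_year)
```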

4.
Orthop J Sports Med ; 11(10): 23259671231202973, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37810744

ABSTRACT

Background: Shoulder and elbow function is essential to basic basketball actions. Outside of anterior shoulder instability, injuries to these joints are not well characterized in National Basketball Association (NBA) players. Purpose: To describe the epidemiology and associated risk factors of shoulder and elbow injuries in NBA players and identify factors that influence player performance upon return to play. Study Design: Descriptive epidemiology study. Methods: Historical injury data from the 2015-2020 NBA seasons were retrieved from Pro Sports Transactions, a public online database. An injury was defined as a health-related concern resulting in an absence of ≥1 NBA games. Primary measures included pre- and postinjury player efficiency rating (PER) and true shooting percentage (TS%) with interquartile ranges (IQRs), stratified by extremity dominance. Multivariate logistic regression analyses with stepwise selection were performed to identify risk factors associated with return-to-play performance. Results: A total of 192 shoulder and elbow injuries were sustained among 126 NBA athletes, with incidence rates of 1.11 per 1000 game exposures (GEs) and 0.30 per 1000 GEs, respectively. Sprain/strain and general soreness were the most common injury types in both the shoulder and the elbow. In the 2 years after injury, baseline PER was achieved in all groups except players with dominant shoulder injuries (baseline PER, 16 [IQR, 14-18] vs 2-year PER, 13 [IQR, 11-16]; P = .012). Younger age was associated with quicker return to baseline PER (odds ratio, 0.77 [95% CI, 0.67-0.88]). Shoulder and elbow injuries did not negatively influence TS% upon return to play (baseline TS%, 0.55 [IQR, 0.51-0.58] vs 1-year TS%, 0.55 [IQR, 0.52-0.58]; P = .13). Conclusion: Dominant shoulder injuries negatively influenced PER during the first 2 seasons upon return to play in NBA players. Therefore, expectations that players with this type of injury immediately achieve baseline statistical production should be tempered. Shooting accuracy appears to remain unaffected after shoulder or elbow injury.
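
True shooting percentage (TS%), one of the outcome measures above, is a rate-adjusted efficiency metric; a commonly used formulation (the abstract does not spell out its exact formula) is sketched below.

```python
# Conventional TS% formula for box-score data; the 0.44 free-throw weight is the
# standard approximation and may differ from the study's exact definition.
def true_shooting_pct(points: float, fga: float, fta: float) -> float:
    """TS% = PTS / (2 * (FGA + 0.44 * FTA))."""
    return points / (2.0 * (fga + 0.44 * fta))

# Example: 25 points on 18 field-goal attempts and 6 free-throw attempts.
print(f"{true_shooting_pct(25, 18, 6):.3f}")   # ~0.606
```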

5.
BMC Sports Sci Med Rehabil ; 15(1): 130, 2023 Oct 12.
Article in English | MEDLINE | ID: mdl-37828552

ABSTRACT

BACKGROUND: Anterior cruciate ligament (ACL) injuries are among the most common injuries in the National Basketball Association (NBA), and it is important to investigate the actual nature of the injury because it can impair a player's performance after returning to play. Although the moment of injury has been investigated, the details of the movements and circumstances leading to injury in basketball games are unknown. This study aimed to clarify the actions leading to ACL injuries and to investigate their characteristics, based on YouTube video analyses of NBA players. METHODS: Players with ACL injuries in the NBA were identified through web-based research over 10 seasons (2011/2012-2021/2022, through October 2021), yielding 29 recorded videos of ACL injuries. Actions were categorized based on basketball-specific movements, and whether the player was in contact with an opponent and, if so, the location of that contact were analyzed at two time points: the injury frame (IF) and one step before the injury frame (IF-1). The step on which the injured leg landed was counted as either the first or second step after ball possession. RESULTS: The majority (68.2%) of ACL injuries occurred during the 2-steps phase of offensive actions (a player may take only two steps after gaining possession, which we defined as the 2-steps phase), most notably during the first step (80.0%). Of the players injured during the 2-steps phase, 73.3% received contact to an area other than the knee (indirect contact) at IF-1, and 81.8% of that contact was located in the upper body contralateral to the injured knee. Among players injured during the 2-steps phase, indirect contact at IF-1 was significantly more frequent than no contact with other players (p = 0.042). CONCLUSIONS: The inclusion of pre-injury play and contact in the analysis is a novel aspect of this study. Through YouTube-based video analysis, this study revealed that ACL injuries in NBA players tend to be characterized by specific types of actions and by the timing and location of contact.
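
The contact comparison reported above (indirect contact versus no contact at IF-1) is, in essence, a small-sample proportion comparison; one way such a comparison could be run is sketched below with placeholder counts, since the abstract does not state the exact test used.

```python
# Illustrative proportion test with placeholder counts; not a reproduction of the
# study's statistic, whose exact test is not specified in the abstract.
from scipy.stats import binomtest

n_two_steps = 15    # placeholder: players injured during the 2-steps phase
n_indirect = 11     # placeholder: of those, with indirect contact at IF-1

result = binomtest(n_indirect, n_two_steps, p=0.5, alternative="greater")
print(f"indirect-contact share = {n_indirect / n_two_steps:.1%}, p = {result.pvalue:.3f}")
```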

6.
Cureus ; 15(7): e42499, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37637654

ABSTRACT

Introduction Basketball players are at increased risk of thumb collateral ligament injury (ulnar collateral ligament (UCL) and radial collateral ligament (RCL)). Methods National Basketball Association (NBA) players with thumb collateral ligament surgery were identified using publicly available data. Performance statistics, the injured ligament (UCL or RCL), return to sport (RTS) time, laterality, and injury dates were recorded. Cases were matched 1:1 with controls based on age (±1 year), body mass index (BMI), NBA experience (±1 year), and performance statistics prior to the index date. RTS was defined as playing in one NBA game postoperatively. Career longevity was evaluated. Summary statistics were calculated, and Student's t-tests (α = 0.001) were performed. Results All 47 players identified with thumb collateral ligament surgeries returned to sport. Thirty-three players (age: 26.9 ± 3.0 years) had one year of postoperative NBA experience available for performance analysis. Career length (cases: 9.6 ± 4.1 years; controls: 9.4 ± 4.3 years) was not significantly different between groups (p > 0.001). For players who returned within the same season (n = 20), time to RTS was 7.1 ± 2.4 weeks; after off-season or season-ending surgery (n = 13), RTS time was 28.4 ± 18.7 weeks. Neither ligament (UCL, n = 7; RCL, n = 10; unknown, n = 16) showed an identifiable difference in career length between groups. Career length, games per season, and performance were not different for players who underwent surgery on their dominant thumb (63.6%, 21/33) compared with controls (p > 0.001). Conclusion The RTS rate is high in NBA athletes undergoing thumb collateral ligament surgery. Players do not experience decreased performance or career length after thumb collateral ligament surgery, regardless of whether the dominant or non-dominant thumb was injured.
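
A sketch of the 1:1 case-control matching step described in the methods is shown below; the column names and the closest-BMI tie-break are assumptions, since the abstract does not detail the matching algorithm.

```python
# Hedged sketch of 1:1 matching of surgical cases to controls on age (±1 year),
# BMI, and NBA experience (±1 year). Column names and the tie-breaking rule
# (closest BMI) are assumptions, not the study's documented procedure.
import pandas as pd

def match_controls(cases: pd.DataFrame, pool: pd.DataFrame) -> pd.DataFrame:
    matches = []
    available = pool.copy()
    for _, case in cases.iterrows():
        eligible = available[
            (available["age"].sub(case["age"]).abs() <= 1)
            & (available["experience"].sub(case["experience"]).abs() <= 1)
        ]
        if eligible.empty:
            continue
        # Break ties by closest BMI, then remove the chosen control from the pool.
        best = eligible.loc[eligible["bmi"].sub(case["bmi"]).abs().idxmin()]
        matches.append({"case_id": case["player_id"], "control_id": best["player_id"]})
        available = available.drop(best.name)
    return pd.DataFrame(matches)
```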

7.
Orthop J Sports Med ; 11(7): 23259671231184459, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37529529

ABSTRACT

Background: Ankle injuries are more common in the National Basketball Association (NBA) compared with other professional sports. Purpose/Hypothesis: The purpose of this study was to report the incidence and associated risk factors of ankle injuries in NBA athletes. It was hypothesized that factors associated with an increased physiologic burden, such as minutes per game (MPG), usage rate, and associated lower extremity injury, would be associated with increased ankle injury risk and time loss. Study Design: Descriptive epidemiology study. Methods: Ankle injury data from the 2015-2016 through 2020-2021 NBA seasons were evaluated. The truncated 2019-2020 season due to the COVID-19 pandemic was omitted. The primary outcome was the incidence of ankle injuries, reported per 1000 game-exposures (GEs). Secondary analysis was performed to identify risk factors for ankle injuries through bivariate analysis and multivariable logistic regression of player demographic characteristics, performance statistics, injury characteristics, and previous lower extremity injuries. Factors influencing the time loss after injury were assessed via a negative binomial regression analysis. Results: A total of 554 ankle injuries (4.06 injuries per 1000 GEs) were sustained by NBA players over 5 NBA seasons, with sprain/strain the most common injury type (3.71 injuries per 1000 GEs). The majority of ankle injury events (55%) resulted in 2 to 10 game absences. The likelihood of sustaining an ankle injury was significantly associated with a greater number of games played (P = .029) and previous injury to the hip, hamstring, or quadriceps (P = .004). Increased length of absence due to ankle injury was associated with greater height (P = .019), MPG (P < .001), usage rate (P = .025), points per game (P = .011), and a prior history of foot (P = .003), ankle (P < .001), and knee injuries (P < .001). Conclusion: The incidence of ankle injuries was 4.06 per 1000 GEs in professional basketball players. Games played and prior history of hip, hamstring, or quadriceps injuries were found to be risk factors for ankle injuries. Factors associated with physiologic burden such as MPG and usage rate were associated with an increased time loss after injury.
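
The time-loss model described above is a negative binomial regression; a minimal sketch of such a model, with hypothetical column names and without the study's full covariate set, is given below.

```python
# Hedged sketch: negative binomial regression of games missed on player and injury
# covariates. File and column names are assumptions, not the study's dataset.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("nba_ankle_injuries.csv")   # hypothetical file, one row per injury
X = sm.add_constant(df[["height", "minutes_per_game", "usage_rate",
                        "points_per_game", "prior_ankle_injury"]])
fit = sm.GLM(df["games_missed"], X, family=sm.families.NegativeBinomial()).fit()
print(fit.summary())    # positive coefficients imply longer absences after injury
```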

8.
Front Psychol ; 13: 951779, 2022.
Article in English | MEDLINE | ID: mdl-36483704

ABSTRACT

This study aimed to compare differences in match performance between home and away games before and after the COVID-19 lockdown and to identify the key factors for match success with and without spectators. The sample consisted of 1,549 basketball matches, including 971 games of the 2019-2020 regular season played before the COVID-19 lockdown and 578 ghost matches (played without spectators) of the 2020-2021 regular season. The independent t-test was used to explore differences before and after COVID-19, while univariate and multivariable logistic regression models were used to identify the key factors for match success with and without spectators. Our study identified that offensive rebounds were the only indicator differentiating home and away games after the COVID-19 lockdown. Furthermore, home teams won more matches than away teams before the COVID-19 lockdown, whereas home advantage had no impact on winning matches afterward. Our study suggests that crowd support may play a key role in winning games in the NBA. Furthermore, both before and after the COVID-19 lockdown, free throws made, three-point field goals made, defensive rebounds, assists, steals, personal fouls, and opponent quality were key factors differentiating wins from losses. These findings can help coaches and coaching staff make informed decisions and better prepare match strategies.
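
For illustration, the home-versus-away comparison described above can be run as an independent-samples t-test; the sketch below uses Welch's variant and hypothetical file and column names.

```python
# Illustrative comparison (assumed column names) of a match statistic between
# home and away teams, mirroring the independent t-tests described above.
import pandas as pd
from scipy import stats

games = pd.read_csv("nba_2020_21_ghost_games.csv")      # hypothetical file
home = games.loc[games["is_home"] == 1, "offensive_rebounds"]
away = games.loc[games["is_home"] == 0, "offensive_rebounds"]

t, p = stats.ttest_ind(home, away, equal_var=False)      # Welch's t-test
print(f"t = {t:.2f}, p = {p:.3f}")
```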

9.
Front Sports Act Living ; 4: 977692, 2022.
Article in English | MEDLINE | ID: mdl-36329855

ABSTRACT

Within the National Basketball Association (NBA), players and teams maintain that having healthy players sit out some games during the regular season may help them be more productive during the post-season. This decision not to play in order to rest the player, aptly noted as a DNP-Rest decision on injury reports, is in line with team and player goals and fits with a growing body of evidence supporting the value of rest for health and performance. However, these practices conflict with some goals of the league, which has a vested interest in having top talent play to attract broadcasters, advertisers, and live spectators, and thus enhance viewership. The current study is among the first to test the theory that strategically resting healthy players during the regular season results in better performance, as indicated by Player Efficiency Rating (PER) and Win Shares, during the post-season. Using data from the 2016-17 through 2020-21 NBA seasons, we found insufficient evidence that resting more games during the regular season results in better post-season performance. Findings from a nested case-control study of 184 players (92 cases; 92 controls) also showed no differences in the change in performance from the regular season to the post-season between players who received rest during the regular season and matched controls. Although the restorative effects of rest might be considerable in the short term, the current study provides additional evidence to suggest that the impact may not carry over into the post-season.

10.
Orthop J Sports Med ; 10(10): 23259671221126485, 2022 Oct.
Article in English | MEDLINE | ID: mdl-36225389

ABSTRACT

Background: Players in the National Basketball Association (NBA) are at risk for lower extremity stress fractures, partly because of the sport's high-intensity demand on the lower body. Purpose: To provide insight on the identification and management of potential risk factors associated with lower extremity stress fractures in NBA athletes. Study Design: Case series; Level of evidence, 4. Methods: A retrospective study was conducted using the NBA electronic medical record database for all players who were on an NBA roster for ≥1 game from the 2013-2014 through 2018-2019 seasons. Player characteristics, games missed, and treatment methodology were independently analyzed. Results were presented as incidence per 1000 player-games. Results: There were 22 stress fractures identified in 20 NBA players over the course of 6 years, with an average of 3.67 stress fractures per year and an incidence of 0.12 stress fractures per 1000 player-games. Most stress fractures occurred in the foot (17/22), and 45% (10/22) of stress fractures were treated surgically, with the most common site of operation being the navicular. On average, approximately 37 games and 243 days were missed per stress fracture injury. There was no significant difference in time to return to play between high-risk stress fractures treated operatively versus nonoperatively (269.2 vs 243.8 days; P = .82). Conclusion: The overall incidence of stress fractures in NBA players was 0.12 per 1000 player-games, and a high percentage of players returned to NBA activity after the injury. There was a relatively even distribution between high-risk stress fractures treated operatively and nonoperatively. When comparing high-risk stress fractures treated operatively to ones treated nonoperatively, no significant difference in average time to return to play in the NBA was found.

11.
Orthop J Sports Med ; 10(9): 23259671221121116, 2022 Sep.
Article in English | MEDLINE | ID: mdl-36081413

ABSTRACT

Background: Health and safety concerns surrounding the coronavirus disease 2019 (COVID-19) pandemic led the National Basketball Association (NBA) to condense and accelerate the 2020 season. Although prior literature has suggested that inadequate rest may lead to an increased injury risk, the circumstances surrounding this season offer a unique opportunity to evaluate player safety in the setting of reduced interval rest. Hypothesis: We hypothesized that the condensed 2020 NBA season resulted in an increased overall injury risk as compared with the 2015 to 2018 seasons. Study Design: Descriptive epidemiology study. Methods: A publicly available database, Pro Sports Transactions, was queried for injuries that forced players to miss ≥1 game between the 2015 and 2020 seasons. Data from the 2019 season were omitted given the abrupt suspension of that league year. All injury incidences were calculated per 1000 game-exposures (GEs). The primary outcome was the overall injury proportion ratio (IPR) between the 2020 season and previous seasons. Secondary measures included injury incidences stratified by type, severity, age, position, and minutes per game. Results: A total of 4346 injuries occurred over a 5-season span among 2572 unique player-seasons. The overall incidence of injury during the 2020 season was 48.20 per 1000 GEs, decreasing to 39.97 per 1000 GEs when COVID-19 was excluded. Even with this exclusion, the overall injury rate in 2020 remained significantly greater (IPR, 1.42 [95% CI, 1.32-1.52]) than that of the 2015 to 2018 seasons (28.20 per 1000 GEs). On closer evaluation, the most notable increases in the 2020 season occurred among minor injuries requiring only a 1-game absence (IPR, 1.53 [95% CI, 1.37-1.70]) and in players who were aged 25 to 29 years (IPR, 1.57 [95% CI, 1.40-2.63]), averaging ≥30.0 minutes per game (IPR, 1.67 [95% CI, 1.47-1.90]), and playing the point guard position (IPR, 1.67 [95% CI, 1.44-1.95]). Conclusion: Players in the condensed 2020 NBA season had a significantly higher incidence of injuries when compared with the prior 4 seasons, even when excluding COVID-19-related absences. This rise is consistent with findings from the other congested NBA seasons, 1998 and 2011. These findings suggest that condensing the NBA schedule is associated with an increased risk to player health and safety.
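
The injury proportion ratio (IPR) reported above is a ratio of two incidence rates; a minimal sketch of the calculation, with a log-normal 95% CI and placeholder counts, is shown below.

```python
# Sketch of an injury proportion ratio (IPR) between two periods, with a
# log-normal 95% CI for the ratio of two Poisson rates. Inputs are placeholders.
import math

def injury_proportion_ratio(events_a, exposures_a, events_b, exposures_b):
    rate_a = events_a / exposures_a
    rate_b = events_b / exposures_b
    ipr = rate_a / rate_b
    # Standard error of log(rate ratio) for Poisson counts.
    se_log = math.sqrt(1.0 / events_a + 1.0 / events_b)
    lo = math.exp(math.log(ipr) - 1.96 * se_log)
    hi = math.exp(math.log(ipr) + 1.96 * se_log)
    return ipr, (lo, hi)

ipr, ci = injury_proportion_ratio(1200, 30_000, 3146, 111_500)  # placeholder counts
print(f"IPR {ipr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```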

12.
Front Psychol ; 13: 917980, 2022.
Article in English | MEDLINE | ID: mdl-36160580

ABSTRACT

The main research question addressed in this study is whether and how shooting patterns and field-goal accuracy have changed in the NBA over the past decade. This study analyzes NBA game data from the 2011-2012 regular season to the 2020-2021 season. Field goal attempts are grouped into five categories by shooting distance. The Mann-Kendall trend test was employed to examine whether changes over the years are statistically significant (p < 0.05). Each game was divided into sixteen equal 3-minute segments to examine shooting patterns across different game segments. Results reveal an increasing trend in the percentage of 3-point attempts, which has nearly doubled from 22% to 39%. Meanwhile, the percentage of field goals attempted within the range of 16-24 ft has decreased from 20% to 10%. Field-goal accuracy has shown an increasing trend for all shooting distances except 3-pointers. The second and fourth 3-minute segments within each quarter have the highest number of field goals. The first quarter has higher shooting accuracy than the remaining three quarters. In addition, results reveal that the last 3 minutes of each quarter have the lowest shooting accuracy. Reasons for the patterns of field goals in different segments are discussed from the perspectives of game rule changes, fatigue effects, and coaches' game strategies. The reasons for changes in activity level and performance in different quarters are also discussed. This study offers new insights into the changes in basketball shooting patterns and accuracy in NBA games over the past decade. Practical implications of this study for basketball players, coaches, and sports psychologists, as well as its strengths and limitations, are discussed.
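
The Mann-Kendall trend test used above is a nonparametric test for monotonic trend; a minimal implementation (normal approximation, no tie correction) is sketched below with placeholder values.

```python
# Minimal Mann-Kendall trend test (normal approximation, no tie correction) for a
# yearly series such as the league-wide share of 3-point attempts. Values are
# placeholders; the study's exact implementation may differ.
import itertools
import math
from scipy.stats import norm

def mann_kendall(series):
    n = len(series)
    # S statistic: sum of signs over all ordered pairs (earlier, later).
    s = sum((b > a) - (b < a) for a, b in itertools.combinations(series, 2))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))          # two-sided p-value
    return s, z, p

three_pt_share = [0.22, 0.24, 0.26, 0.27, 0.29, 0.32, 0.34, 0.36, 0.38, 0.39]
print(mann_kendall(three_pt_share))
```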

13.
Chronobiol Int ; 39(10): 1399-1410, 2022 10.
Article in English | MEDLINE | ID: mdl-35980109

ABSTRACT

This investigation aimed to clarify the influence of circadian change and travel distance on National Basketball Association (NBA) team performance using a dataset from the 2014-2018 seasons. Data from 9,840 games were acquired from an open-access source. Game point differential and team free-throw percentage served as outcome variables. Time zone change (TZΔ) captured the raw circadian delay/advance from travel for a game, and adjusted TZΔ (AdjTZΔ) extended TZΔ by allowing for acclimation to a new time zone. We further categorized AdjTZΔ into AdjTZΔ_A, which assumed travel the day before each game, and AdjTZΔ_B, which assumed teams spent as many days in their home city as possible. Travel distance for each game was calculated. Linear mixed-effects modeling estimated associations, with games nested within team and year. Adjusted associations accounted for differences in team ability, whether the game was home or away, and whether the game occurred on the second half of a back-to-back game sequence. Greater circadian misalignment, regardless of delay or advance, and increasing travel distance negatively influenced NBA game performance. Yet the results suggest that performance outcomes may be more influenced by travel distance than by circadian misalignment. Moreover, circadian misalignment and travel distance interacted to significantly influence game point differential. Furthermore, differences in results were observed between the AdjTZΔ_A and AdjTZΔ_B analyses, which suggests that subtle differences in constructed travel schedules can have a notable impact on NBA performance outcomes. Lastly, playing on the second half of a back-to-back sequence emerged as a robust predictor of performance disadvantage, which corroborates the existing literature and provides further support for NBA schedule changes intended to enhance competitive equity by reducing the number of back-to-back games across a season. These findings can help guide NBA teams on key strategies for reducing travel-related disadvantages and inform schedule makers on critical factors to prioritize in future schedules to attenuate competitive inequity from travel. Furthermore, they can help direct teams toward scenarios that are best targeted for load management, given the cumulative disadvantage arising from travel-related factors, opponent quality, game location, and game sequence.
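
A hedged sketch of the mixed-effects specification described above is given below; the file name, column names, and exact fixed-effect terms are assumptions rather than the published model.

```python
# Hedged sketch: game point differential modeled on circadian and travel covariates,
# with a random intercept per team-season. Column names are assumptions; the
# published model includes additional terms and alternative AdjTZΔ codings.
import pandas as pd
import statsmodels.formula.api as smf

games = pd.read_csv("nba_travel_2014_2018.csv")   # hypothetical file
games["team_year"] = games["team"] + "_" + games["season"].astype(str)

model = smf.mixedlm(
    "point_diff ~ adj_tz_delta * travel_distance + is_home + back_to_back + opp_strength",
    data=games,
    groups=games["team_year"],       # games nested within team and year
).fit()
print(model.summary())
```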


Subject(s)
Athletic Performance , Basketball , Circadian Rhythm , Humans , Retrospective Studies , Travel , Travel-Related Illness
14.
Orthop J Sports Med ; 10(7): 23259671221105257, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35898206

ABSTRACT

Background: The extent to which concussions affect Women's National Basketball Association (WNBA) athletes has not been thoroughly examined. Purpose: To evaluate the incidence and impact of concussion injuries occurring in the WNBA. Study Design: Descriptive epidemiology study. Methods: Publicly available records were searched to identify all documented basketball-related concussions from the 1997 to 2020 WNBA seasons. Player demographics, injury details, and basketball career information were collected. Concussion incidence and return-to-play (RTP) timing were evaluated before and after the institution of the WNBA concussion protocol in 2012. Minutes per game and game score per minute were compared 5 games before and 5 games after the concussion was sustained. Player game availability and RTP performance were also compared with an age-, body mass index-, position-, and experience-matched control group of players who did not sustain any injuries during the index season. Results: A total of 70 concussions among 55 players were reported in the WNBA from 1997 to 2020, with a mean incidence of 2.9 ± 2.3 concussions per season. After the implementation of the WNBA concussion protocol, the incidence significantly increased from 1.7 to 5.0 concussions per season (P < .001). All players returned after a first-time concussion, missing a mean of 3.8 ± 4.7 games and 17.9 ± 20.7 days. After the adoption of the concussion protocol, the time to RTP significantly increased in both games missed (P = .006) and days missed (P = .006). Minutes per game and game score per minute were not significantly affected by sustaining a concussion (P = .451 and P = .826, respectively). Conclusion: Since the adoption of the WNBA concussion protocol in the 2012 season, the incidence of concussions has increased significantly. Athletes demonstrated a high rate of RTP after missing a median of 4 games, and the time to RTP increased after the institution of the concussion protocol. Player game availability and performance within the same season were not significantly affected by concussion injuries after a successful RTP.

15.
Prev Med Rep ; 25: 101661, 2022 Feb.
Article in English | MEDLINE | ID: mdl-35127348

ABSTRACT

This is the first real-world study to examine the association of a voluntary 16-ounce (oz) portion-size cap on sugar-sweetened beverages (SSBs) at a sports arena with the volume of SSBs and food calories purchased and consumed during basketball games. We analyzed cross-sectional survey data collected from March through June 2014, after the portion-cap policy was put in place, from adults exiting a Brooklyn, NY, USA arena with a 16-oz portion-size restriction (Barclays, n = 464) and a Manhattan, NY, USA arena with no portion-size restriction (Madison Square Garden, control, n = 295). Linear regression models adjusting for sex, age, BMI, ethnicity, race, marital status, education, and income were used to compare the two arenas during the post-implementation period. The survey response rate was 45.9% and equivalent between venues. Among all arena goers, participants at Barclays purchased significantly fewer SSB oz (-2.24 oz, 95% CI [-3.95, -0.53], p = .010) and consumed significantly fewer SSB oz (-2.34 oz, 95% CI [-4.01, -0.68], p = .006) compared with MSG after adjusting for covariates. Among those buying at least one SSB, Barclays participants purchased on average 11.03 fewer SSB oz (95% CI [4.86, 17.21], p < .001) and consumed 12.10 fewer SSB oz (95% CI [5.78, 18.42], p < .001). There were no statistically significant differences between arenas in food calories or event satisfaction. In addition, no participant reported forgoing a drink because of the smaller size. An SSB portion-size cap was associated with purchasing and consuming fewer SSB oz without evidence of decreased satisfaction with the event experience.
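
The adjusted arena comparison described above amounts to a linear regression with an arena indicator plus demographic covariates; a sketch with assumed variable names follows.

```python
# Sketch of the adjusted comparison: ounces of SSB purchased regressed on an arena
# indicator and demographic covariates. File and variable names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

survey = pd.read_csv("arena_exit_survey_2014.csv")   # hypothetical file
model = smf.ols(
    "ssb_oz_purchased ~ barclays + sex + age + bmi + C(ethnicity) + C(race)"
    " + C(marital_status) + C(education) + C(income_bracket)",
    data=survey,
).fit()
# The coefficient on 'barclays' estimates the adjusted difference in SSB ounces
# purchased at the capped arena relative to the control arena.
print(model.params["barclays"], model.conf_int().loc["barclays"])
```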

16.
Am J Sports Med ; 50(5): 1416-1429, 2022 04.
Article in English | MEDLINE | ID: mdl-34213367

ABSTRACT

BACKGROUND: Numerous studies have reported the incidence and outcomes of injuries in the men's and women's National Basketball Association (NBA and WNBA, respectively). PURPOSE: To synthesize published data regarding the incidence and outcomes of all injuries in the NBA and WNBA in a comprehensive review. STUDY DESIGN: Systematic review; Level of evidence, 4. METHODS: Following PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines, we searched 3 electronic databases (PubMed, MEDLINE, Embase) for studies of all levels of evidence since 1990 pertaining to injuries sustained by active players in the NBA and WNBA. Studies were excluded if the cohort of interest included ≤3 active players. RESULTS: The initial search of the 3 databases yielded 1253 unique studies, of which 49 met final inclusion criteria for this review. Only 4 studies included athletes in the WNBA. Based on the mean annual incidence, the 5 most common orthopaedic sports injuries sustained in the NBA were concussions (9.5-14.9 per year), fractures of the hand (3.5-5.5 per year), lower extremity stress fractures (4.8 per year), meniscal tears (2.3-3.3 per year), and anterior cruciate ligament tears (1.5-2.6 per year). Cartilage defects treated using microfracture, Achilles tendon ruptures, and anterior cruciate ligament injuries were 3 injuries that led to significant reductions in performance measurements after injury. CONCLUSION: With advances in sports technology and statistical analysis, there is rapidly growing interest in injuries among professional basketball athletes. High-quality prospective studies are needed to understand the prevalence and effect of injuries on player performance and career length. This information can inform preventative and treatment measures taken by health care providers to protect players and guide safe return to play at a high level.


Subject(s)
Anterior Cruciate Ligament Injuries , Athletic Injuries , Basketball , Brain Concussion , Knee Injuries , Athletic Injuries/epidemiology , Basketball/injuries , Female , Humans , Male
17.
Orthop J Sports Med ; 9(9): 23259671211030473, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34527757

ABSTRACT

BACKGROUND: Achilles tendon ruptures are devastating in elite athletes. There are currently no studies examining the effects of Achilles tendon rupture on performance outcomes in the Women's National Basketball Association (WNBA). HYPOTHESIS: Athletes in the WNBA who sustained an Achilles tendon rupture and underwent subsequent surgical repair will sustain declines in performance metrics when compared with their preinjury statistics and matched controls. STUDY DESIGN: Cohort study; Level of evidence, 3. METHODS: Seventeen WNBA players who sustained an Achilles tendon rupture from 2000 to 2019 were identified through publicly available injury reports and player profiles. Athlete information collected included age, body mass index, position, and service in the WNBA when the tear occurred. Statistics were collected for 1 season before and 2 seasons after the injury, and the player efficiency rating (PER) was calculated. Players were matched to uninjured controls by service in the WNBA, position, and performance statistics. RESULTS: On average, players were 28 years of age at the time of Achilles tendon rupture, with a service time in the WNBA of 6.5 years. Four players never returned to play in the WNBA, while 7 players failed to play more than 1 season. Players who did return played significantly fewer minutes per game compared with preinjury in both postinjury seasons 1 and 2 (mean difference, -6.11 and -6.54 min/game, respectively; P < .01 for both) and had a significantly decreased PER in postinjury season 2 (mean difference, -2.53; P = .024). After returning to play, the injured players experienced significant decreases when compared with controls in field goals (-0.85 vs +0.20; P = .047), free throws (-1.04 vs +0.12; P < .01), steals (-0.48 vs +0.24; P = .018), and points scored (-2.89 vs +0.58; P = .014). CONCLUSION: WNBA players experienced significant decreases in performance metrics after Achilles tendon rupture compared with their preinjury levels and compared with uninjured controls. Overall, 23.5% of players failed to return to the WNBA, while 41.2% failed to play for more than 1 season.

18.
Orthop J Sports Med ; 9(9): 23259671211025305, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34504899

ABSTRACT

BACKGROUND: The extent to which patellar tendinopathy affects National Basketball Association (NBA) athletes has not been thoroughly elucidated. PURPOSE: To assess the impact patellar tendinopathy has on workload, player performance, and career longevity in NBA athletes. STUDY DESIGN: Cohort study; Level of evidence, 3. METHODS: NBA players diagnosed with patellar tendinopathy between the 2000-2001 and 2018-2019 seasons were identified through publicly available data. Characteristics, return to play (RTP), player statistics, and workload data were compiled. The season of diagnosis was set as the index year, and the statistical analysis compared post- versus preindex data acutely and in the long term, both within the injured cohort and with a matched healthy NBA control cohort. RESULTS: A total of 46 NBA athletes were included in the tendinopathy group; all 46 players returned to the NBA after their diagnosis. Compared with controls, the tendinopathy cohort had longer careers (10.50 ± 4.32 vs 7.18 ± 5.28 seasons; P < .001) and played more seasons after return from injury (4.26 ± 2.46 vs 2.58 ± 3.07 seasons; P = .001). Risk factors for patellar tendinopathy included increased workload before injury (games started, 45.83 ± 28.67 vs 25.01 ± 29.77; P < .001) and time played during the season (1951.21 ± 702.09 vs 1153.54 ± 851.05 minutes; P < .001) and during games (28.71 ± 6.81 vs 19.88 ± 9.36 minutes per game; P < .001). Players with increased productivity as measured by player efficiency rating (PER) were more likely to develop patellar tendinopathy compared with healthy controls (15.65 ± 4.30 vs 12.76 ± 5.27; P = .003). When comparing metrics from 1 year preinjury, there was a decrease in games started at 1 year postinjury (-12.42 ± 32.38 starts; P = .028) and total time played (-461.53 ± 751.42 minutes; P = .001); however, PER at 1 and 3 years after injury was unaffected compared with corresponding preinjury statistics. CONCLUSION: NBA players with a higher PER and significantly more playing time were more likely to be diagnosed with patellar tendinopathy. Player performance was not affected by the diagnosis of patellar tendinopathy, and athletes were able to RTP without any impact on career longevity.

19.
Orthop J Sports Med ; 9(5): 23259671211002296, 2021 May.
Article in English | MEDLINE | ID: mdl-34017878

ABSTRACT

This 2020 NBA Orthobiologics Consensus Statement provides a concise summary of available literature and practical clinical guidelines for team physicians and players. We recognize that orthobiologic injections are a generally safe treatment modality with a significant potential to reduce pain and expedite early return to play in specific musculoskeletal injuries. The use of orthobiologics in sports medicine to safely reduce time loss and reinjury is of considerable interest, especially as it relates to the potential effect on a professional athlete. While these novel substances have potential to enhance healing and regeneration of injured tissues, there is a lack of robust data to support their regular use at this time. There are no absolutes when considering the implementation of orthobiologics, and unbiased clinical judgment with an emphasis on player safety should always prevail. Current best evidence supports the following key points: There is support for the use of leukocyte-poor platelet-rich plasma in the treatment of knee osteoarthritis. There is support for consideration of using leukocyte-rich platelet-rich plasma for patellar tendinopathy. The efficacy of using mesenchymal stromal cell injections in the management of joint and soft tissue injuries remains unproven at this time. There are very few data to suggest that current cell therapy treatments lead to any true functional tissue regeneration. Meticulous and sterile preparation guidelines must be followed to minimize the risk for infection and adverse events if these treatments are pursued. Given the high variability in orthobiologic formulations, team physicians must stay up-to-date with the most recent peer-reviewed literature and orthobiologic preparation protocols for specific injuries. Evidence-based treatment algorithms are necessary to identify the optimal orthobiologic formulations for specific tissues and injuries in athletes. Changes in the regulatory environment and improved standardization are required given the exponential increase in utilization as novel techniques and substances are introduced into clinical practice.

20.
Phys Sportsmed ; 49(3): 271-277, 2021 09.
Article in English | MEDLINE | ID: mdl-34010095

ABSTRACT

Background: Achilles tendon ruptures are devastating injuries for National Basketball Association (NBA) players: prior studies have demonstrated decreased performance following return to play, but none have evaluated the effect of injury on rate-adjusted contextual statistics to assess the true change in performance. Additionally, there is a paucity of data on the independent impact on defensive performance following return. Hypothesis: Compared to both control-matched peers and players' preoperative careers, we hypothesize that player production based on rate-adjusted contextual statistics will significantly decline following Achilles tendon rupture. Study design: Retrospective cohort study. Methods: Publicly available NBA injury data on Achilles tendon ruptures were reviewed from the 1996-1997 to the 2016-2017 seasons. Controls were matched based on height, position, age, and rate-adjusted statistics. Extracted data included Value over Replacement Player rating, Box Plus-Minus, Win Shares, offensive rating, defensive rating, and time to return to play, and were collected for the season before and the two seasons following injury. Results: Twenty-five NBA players with surgically treated complete Achilles ruptures met inclusion and exclusion criteria. The return-to-play rate for Achilles tendon ruptures from 1996-1997 to 2016-2017 was 80%, with a mean recovery period of 311.0 ± 100.9 days. After 2 years, performance had significantly declined for Value over Replacement Player rating, Box Plus-Minus, and offensive rating compared with both matched controls and preinjury levels. However, there was no significant effect on defensive rating (P = 0.38). After two seasons, returning players had a Value over Replacement Player rating that was 24.1% below pre-injury levels, contributed 1.4 fewer points per 100 possessions by Box Plus-Minus, and yielded 2.4 fewer wins by Win Shares. Conclusions: Achilles tendon rupture results in significant decreases in offensive production and career longevity. The injury does not have a significant impact on defensive production. Clinical relevance: Achilles tendon ruptures significantly affect basketball players' ability to return to play and their in-game performance. Level of evidence: 3.


Subject(s)
Achilles Tendon , Athletic Injuries/epidemiology , Athletic Performance , Basketball , Return to Sport , Achilles Tendon/injuries , Basketball/injuries , Humans , Retrospective Studies , Rupture