1.
BMJ Open Sport Exerc Med ; 10(1): e001815, 2024.
Article in English | MEDLINE | ID: mdl-38268523

ABSTRACT

Objectives: To describe the injury profile of a novel format cricket competition ('The Hundred') and compare injury incidence and prevalence between the men's and women's competitions. Methods: Medical staff prospectively collected injury data from the eight men's and women's teams during the 2021-2023 competitions. Injury definitions and incidence calculations followed the international consensus statement. Results: In the men's competition, 164 injuries were recorded, compared with 127 in the women's competition. Tournament injury incidence was 36.6 (95% CI 31.4 to 42.7) and 32.5 (95% CI 27.3 to 38.7)/100 players/tournament in the men's and women's competitions, respectively. Non-time-loss incidence (men's 26.6 (95% CI 22.2 to 31.8), women's 24.6 (95% CI 20.1 to 30.0)/100 players/tournament) was higher than time-loss incidence (men's 10.0 (95% CI 7.5 to 13.5), women's 7.9 (95% CI 5.6 to 11.3)/100 players/tournament). Injury prevalence was 2.9% and 3.6% in the men's and women's competitions, respectively. Match fielding was the most common activity at the time of injury in both competitions. The thigh and the hand were the most common body locations for time-loss injury in the men's and women's competitions, respectively. Conclusion: A similar injury profile was observed between the men's and women's competitions. Preventative strategies targeting thigh injuries in the men's competition and hand injuries in the women's competition would be beneficial. Compared with published injury rates, the men's Hundred presents a greater injury risk than Twenty20 (T20) cricket but a similar risk to one-day cricket, while the women's Hundred presents a similar injury risk to both T20 and one-day cricket. Additional years of data are required to confirm these findings.
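The incidence figures above follow the consensus exposure convention of injuries per 100 players per tournament. A minimal Python sketch of how such a rate and a Poisson-based confidence interval might be computed is below; the counts are taken from the abstract, but the squad size and the log-normal interval method are assumptions, not the study's actual calculation.

```python
import math

def rate_per_100(n_injuries: int, player_tournaments: int) -> float:
    """Injuries per 100 players per tournament (consensus-style exposure rate)."""
    return 100 * n_injuries / player_tournaments

def poisson_rate_ci(n_injuries: int, rate: float, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% CI for a rate, treating the injury count as Poisson
    (log-normal approximation; the paper's exact method may differ)."""
    se_log = 1 / math.sqrt(n_injuries)
    return rate * math.exp(-z * se_log), rate * math.exp(z * se_log)

# Illustrative figures: 164 injuries, 8 squads of an assumed 15 players, 3 tournaments.
player_tournaments = 8 * 15 * 3
rate = rate_per_100(164, player_tournaments)
lo, hi = poisson_rate_ci(164, rate)
print(f"{rate:.1f} (95% CI {lo:.1f} to {hi:.1f}) injuries/100 players/tournament")
```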

2.
J Sci Med Sport ; 27(1): 25-29, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37953165

ABSTRACT

OBJECTIVES: To explore whether injury profiles and mechanisms differ between red ball (First-Class multi-day) cricket and white ball (One-Day and Twenty20 limited-overs) cricket in elite men's domestic cricket from 2010 to 2019. DESIGN: Retrospective cohort analysis. METHODS: Injury incidence was calculated according to the updated international consensus statement on injury surveillance in cricket, along with seasonal days lost and injury severity descriptive statistics. RESULTS: Across both cricket types, bowling resulted in the most seasonal days lost (mean 1942, 95% confidence interval: 1799-2096) and the highest mean injury severity (30 days, 95% confidence interval: 28-33), with the lumbar spine the body region with the most seasonal days lost from bowling (mean 432 seasonal days; 95% confidence interval: 355-525). Injury incidence was higher in white ball than in red ball cricket (per unit of time), with bowling (and its various phases) the most frequently occurring mechanism in both cricket types (white ball: 67.0 injuries per 1000 days of play [95% confidence interval: 59.6-75.3]; red ball: 32.4 injuries per 1000 days of play [95% confidence interval: 29.1-36.1]). When bowling, the abdomen and thigh were the body regions most injured in white ball (13.4 injuries per 1000 days of play [95% confidence interval: 10.3-17.4]) and red ball (6.4 injuries per 1000 days of play [95% confidence interval: 5.0-8.2]) cricket, respectively. Overall, clear differences emerged in the nature and mechanism of injuries between red ball and white ball cricket. CONCLUSIONS: Bowling presents the highest injury risk across both cricket types; the findings also highlight the increased risk of injury from diving during fielding and from running between the wickets when batting in shorter white ball cricket.
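The mean severity and seasonal days lost estimates above are reported with confidence intervals; one common route to an interval for a skewed quantity such as mean days lost per injury is a percentile bootstrap. The sketch below is illustrative only, using simulated time-loss durations in place of the surveillance data, and does not claim to be the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated per-injury time-loss durations (days); stand-ins for the surveillance data.
days_lost = rng.gamma(shape=1.2, scale=25.0, size=200)

def bootstrap_mean_ci(x: np.ndarray, n_boot: int = 10_000, alpha: float = 0.05):
    """Percentile bootstrap CI for a mean, one plausible way to obtain
    severity intervals like those quoted in the abstract."""
    means = np.array([rng.choice(x, size=x.size, replace=True).mean()
                      for _ in range(n_boot)])
    return x.mean(), np.quantile(means, [alpha / 2, 1 - alpha / 2])

mean_days, (lo, hi) = bootstrap_mean_ci(days_lost)
print(f"mean severity {mean_days:.0f} days (95% CI {lo:.0f} to {hi:.0f})")
```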


Subject(s)
Athletic Injuries; Cricket Sport; Gryllidae; Running; Male; Animals; Humans; Athletic Injuries/epidemiology; Athletic Injuries/etiology; Retrospective Studies
3.
Psychol Sport Exerc ; 68: 102447, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37162794

ABSTRACT

The COVID-19 pandemic has had a profound impact on many people's lives, including the use of bio-secure environments to facilitate the continuation of professional sport. Although it is well documented that the pandemic has negatively impacted mental health, the impact of bio-bubbles on mental health is yet to be investigated. In the present study we sought to identify the impact of bio-bubbles on the mental health of those residing within, and then to explore the underlying mechanism of any such impact. Individuals (n = 68) who resided in England and Wales Cricket Board (ECB) created bio-bubbles between March 2020 and April 2021 provided data, regarding their time inside and outside of bio-bubbles, on measures of mental health and basic psychological need satisfaction and frustration. Analysis revealed that bio-bubbles increased anxiety and depression and reduced wellbeing. Additionally, MEMORE mediation analyses revealed that autonomy frustration mediated the relationship between bubble status and all mental health markers. Furthermore, compared to men, women were more likely to experience elevated levels of anxiety and depression inside the bubble. The findings suggest that bio-bubbles negatively impact mental health and further suggest that satisfaction and frustration of basic psychological needs is a contributing factor. Findings suggest organizations tasked with creating bio-bubbles would do well to tailor their environment with an awareness of the importance of basic psychological needs and sex differences in relation to mental health. To the best of our knowledge, this research represents the first investigation of the impact of bio-bubbles on mental health.
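MEMORE itself is an SPSS/SAS macro for two-condition within-participant mediation; a rough Python analogue of the underlying logic is sketched below with simulated scores, since the questionnaire data are not available. In this simplified form, the indirect effect is a*b, where a is the mean change in the mediator between conditions and b is the slope of the outcome change on the mediator change, with a bootstrap interval; the published macro's exact estimator may differ.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated per-participant scores inside vs. outside the bio-bubble (illustrative only).
n = 68
frustration_in  = rng.normal(3.4, 0.8, n)
frustration_out = rng.normal(2.9, 0.8, n)
anxiety_in  = 0.6 * frustration_in  + rng.normal(0, 0.5, n)
anxiety_out = 0.6 * frustration_out + rng.normal(0, 0.5, n)

def indirect_effect(m_in, m_out, y_in, y_out):
    """Two-condition within-participant mediation, simplified:
    a = mean change in the mediator, b = slope of outcome change on mediator change."""
    m_diff, y_diff = m_in - m_out, y_in - y_out
    a = m_diff.mean()
    b = np.polyfit(m_diff, y_diff, 1)[0]
    return a * b

def bootstrap_ci(n_boot=5000, alpha=0.05):
    idx = np.arange(n)
    estimates = []
    for _ in range(n_boot):
        s = rng.choice(idx, size=n, replace=True)
        estimates.append(indirect_effect(frustration_in[s], frustration_out[s],
                                         anxiety_in[s], anxiety_out[s]))
    return np.quantile(estimates, [alpha / 2, 1 - alpha / 2])

print(indirect_effect(frustration_in, frustration_out, anxiety_in, anxiety_out))
print(bootstrap_ci())
```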

4.
BMJ Open Sport Exerc Med ; 9(2): e001481, 2023.
Article in English | MEDLINE | ID: mdl-37073173

ABSTRACT

Elite adult male fast bowlers have high lumbar spine bone mineral, particularly on the contralateral side to their bowling arm. Bone is thought to possess its greatest ability to adapt to loading during adolescence, but it is unknown at what age the greatest changes in lumbar bone mineral and asymmetry develop in fast bowlers. Objectives: This study aims to evaluate the adaptation of the lumbar vertebrae in fast bowlers compared with controls and how this is associated with age. Methods: 91 male fast bowlers and 84 male controls aged 14-24 years had between one and three annual anterior-posterior lumbar spine dual-energy X-ray absorptiometry scans. Total (L1-L4) and regional ipsilateral and contralateral L3 and L4 (relative to the bowling arm) bone mineral density and content (BMD/C) were derived. Multilevel models examined the differences in lumbar bone mineral trajectories between fast bowlers and controls. Results: For L1-L4 BMC and BMD, and at contralateral BMD sites, fast bowlers demonstrated a greater negative quadratic pattern in their accrual trajectories than controls. Fast bowlers had greater increases in L1-L4 BMC between 14 and 24 years (55%) than controls (41%). Within-vertebra asymmetry was evident in all fast bowlers and increased by up to 13% in favour of the contralateral side. Conclusions: Lumbar vertebral adaptation to fast bowling substantially increased with age, particularly on the contralateral side. The greatest accrual occurred during late adolescence and early adulthood, which may correspond with the increasing physiological demands of adult professional sport.
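A multilevel (mixed-effects) model with a group-by-quadratic-age interaction is one plausible form for the accrual trajectories described above. The statsmodels sketch below uses simulated scans, since the DXA data are not available, and the model specification is an assumption rather than the paper's exact formulation.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Simulated longitudinal data: one to three annual scans per participant (illustrative only).
rows = []
for sid in range(175):
    group = "fast_bowler" if sid < 91 else "control"
    age0 = rng.uniform(14, 22)
    for visit in range(rng.integers(1, 4)):
        age = age0 + visit
        growth = 0.060 if group == "fast_bowler" else 0.045
        bmd = 0.9 + growth * (age - 14) - 0.002 * (age - 14) ** 2 + rng.normal(0, 0.04)
        rows.append({"id": sid, "group": group, "age": age, "bmd": bmd})
df = pd.DataFrame(rows)
df["age_c"] = df["age"] - df["age"].mean()

# Random-intercept model with a group-specific quadratic age trajectory.
model = smf.mixedlm("bmd ~ group * (age_c + I(age_c ** 2))", df, groups=df["id"])
print(model.fit(reml=True).summary())
```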

5.
Eur J Sport Sci ; 23(5): 667-675, 2023 May.
Article in English | MEDLINE | ID: mdl-35414351

ABSTRACT

The aim of this study was to determine whether bone mineral density (BMD) and bone asymmetry differ between female cricket fast bowlers, spin bowlers and batters. BMD was determined at the total body, lumbar spine and proximal femurs in 12 fast bowlers, 13 batters and 11 spin bowlers from pre-season DXA scans. High Z-scores at the total body, lumbar spine and proximal femur were observed in all cricketers (mean Z-scores: +1.4 to +3.3) compared with a general age-matched reference population. Fast bowlers had significantly greater BMD on the contralateral side of the lumbar spine compared with the ipsilateral side (p = 0.001, 5.9-12.1%). No asymmetry was found between hips in any group. All cricket positions demonstrated high BMD at all measured sites. The lumbar spine of fast bowlers is asymmetric, with significantly greater BMD on the contralateral side of the spine, particularly at L4, possibly in response to the asymmetric lumbar loading patterns observed in bowling. Highlights: (1) Elite female cricketers demonstrate high BMD at total body, lumbar spine and proximal femur sites, regardless of playing position, compared with a general age- and ethnic-group-matched reference population. (2) Fast bowlers have greater BMD on the contralateral (opposite to the bowling arm) side of the lumbar spine compared with the ipsilateral side, while a symmetrical pattern was observed in spin bowlers and batters. (3) No asymmetry in BMD or section modulus between hips was observed at any proximal femur site for any cricket position.
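The 5.9-12.1% side-to-side differences quoted above are simple percentage asymmetries. A tiny sketch, assuming asymmetry is expressed relative to the ipsilateral value (the paper's exact convention may differ), with hypothetical L4 values:

```python
def percent_asymmetry(contralateral_bmd: float, ipsilateral_bmd: float) -> float:
    """Contralateral-vs-ipsilateral asymmetry as a percentage of the ipsilateral value."""
    return 100 * (contralateral_bmd - ipsilateral_bmd) / ipsilateral_bmd

# Hypothetical L4 values (g/cm^2) for a single fast bowler.
print(f"{percent_asymmetry(1.32, 1.21):.1f}% higher on the contralateral side")
```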


Subject(s)
Bone Density; Sports; Humans; Female; Bone Density/physiology; Sports/physiology; Bone and Bones; Absorptiometry, Photon; Lumbosacral Region
6.
J Sci Med Sport ; 25(10): 828-833, 2022 Oct.
Article in English | MEDLINE | ID: mdl-36064501

ABSTRACT

OBJECTIVES: The aims of this study were to determine whether lumbar areal bone mineral density differed between cricket fast bowlers with and without lumbar stress fracture, and whether bone mineral density trajectories differed between groups during rehabilitation. DESIGN: Cross-sectional and cohort. METHODS: 29 elite male fast bowlers received a post-season anteroposterior lumbar dual-energy X-ray absorptiometry scan and a lumbar magnetic resonance imaging scan to determine stress fracture status. Participants were invited for three additional scans across the 59 weeks post baseline or diagnosis of injury. Bone mineral density was measured at L1-L4 and ipsilateral and contralateral L3 and L4 sites. Independent-sample t-tests determined baseline differences in bone mineral density and multilevel models were used to examine differences in bone mineral density trajectories over time between injured and uninjured participants. RESULTS: 17 participants with lumbar stress fracture had lower baseline bone mineral density at L1-L4 (7.6%, p = 0.034) and contralateral sites (8.8-10.4%, p = 0.038-0.058) than uninjured participants. Bone mineral density at all sites decreased 1.9-3.0% by 20-24 weeks before increasing to above baseline levels by 52 weeks post injury. CONCLUSIONS: Injured fast bowlers had lower lumbar bone mineral density at diagnosis that decreased following injury and did not return to baseline until up to a year post-diagnosis. Localised maladaptation of bone mineral density may contribute to lumbar stress fracture. Bone mineral density loss following injury may increase risk of recurrence, therefore fast bowlers require careful management when returning to play.


Subject(s)
Fractures, Stress; Spinal Fractures; Absorptiometry, Photon/adverse effects; Athletes; Bone Density; Cross-Sectional Studies; Fractures, Stress/diagnostic imaging; Humans; Lumbar Vertebrae/diagnostic imaging; Lumbar Vertebrae/injuries; Male
7.
J Sports Sci ; 40(12): 1336-1342, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35635278

ABSTRACT

Cricket fast bowling is associated with a high prevalence of lumbar bone stress injuries (LBSI), especially in adolescent bowlers, which has not been sufficiently explained by risk factors identified in adult players. This prospective study aimed to examine the incidence of LBSI in adolescent fast bowlers and to identify potential risk factors. Forty asymptomatic male fast bowlers (aged 14-17 years) received baseline and annual lumbar dual-energy X-ray absorptiometry (DXA) and magnetic resonance imaging (MRI) scans, and musculoskeletal and bowling workload assessments; 22 were followed up after one year. LBSI prevalence at baseline and annual incidence were calculated. Potential risk factors were compared between the injured and uninjured groups using t-tests with Hedges' g effect sizes. At baseline, 20.5% of participants had at least one LBSI. Subsequent LBSI incidence was 27.3 ± 18.6 injuries per 100 players per year (mean ± 95% CI). Injured bowlers were older on average at the beginning of the season preceding injury (16.8 versus 15.6 years, g = 1.396, P = 0.047). LBSI risk may coincide with increases in bowling workload and intensity as bowlers step up playing levels to more senior teams during late adolescence, whilst the lumbar spine is immature and less robust.
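The group comparisons here use t-tests with Hedges' g. A short sketch of the effect-size calculation with scipy is below; the age values are made up for illustration and are not the cohort's data.

```python
import numpy as np
from scipy import stats

def hedges_g(x, y):
    """Hedges' g: Cohen's d with the small-sample bias correction."""
    nx, ny = len(x), len(y)
    pooled_sd = np.sqrt(((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1))
                        / (nx + ny - 2))
    d = (np.mean(x) - np.mean(y)) / pooled_sd
    return d * (1 - 3 / (4 * (nx + ny) - 9))   # small-sample correction factor

# Illustrative ages at the start of the season preceding injury (not the study data).
injured_age = np.array([17.1, 16.9, 16.4, 17.0, 16.6])
uninjured_age = np.array([15.2, 15.9, 15.4, 16.0, 15.7, 15.3, 15.8])
result = stats.ttest_ind(injured_age, uninjured_age)
print(f"g = {hedges_g(injured_age, uninjured_age):.2f}, "
      f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```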


Subject(s)
Athletic Injuries; Back Injuries; Sports; Adolescent; Adult; Athletic Injuries/epidemiology; Athletic Injuries/etiology; Humans; Lumbar Vertebrae/diagnostic imaging; Lumbar Vertebrae/injuries; Male; Prospective Studies; Risk Factors
8.
J Sci Med Sport ; 25(6): 474-479, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35246382

ABSTRACT

OBJECTIVES: To describe hamstring injury incidence across competition formats, activity at the time of injury, and time of season, facilitating the identification of injury risk factors in elite men's senior First-Class County Cricket. DESIGN: Prospective cohort. METHODS: Hamstring time-loss injury incidence (by format, activity, and time of season) was calculated for elite men's senior First-Class County Cricket seasons 2010 to 2019. RESULTS: The diagnosis with the highest seasonal incidence was 'Biceps femoris strain grade 1-2' (2.5 injuries/100 players). Hamstring injury incidence was highest in One-Day cricket (mean 27.2 injuries/1000 team days). Running between the wickets when batting was the activity associated with the highest incidence in the shorter competition formats (8.4 and 4.8 injuries/1000 team days for One-Day and T20, respectively). Bowling delivery stride or follow-through was the activity with the highest incidence in the longer multi-day Test format (mean 2.3 injuries/1000 team days), although similar incidence was observed across all formats for this activity. Most injuries were sustained at the start of the season (April; 22.7 injuries/1000 team days), with significantly fewer injuries at the end of the season (September; 4.1 injuries/1000 team days). CONCLUSIONS: Similar bowling injury incidence across formats suggests hamstring injury risk is associated more with the activity itself, whereas injury risk when batting was susceptible to changes in match intensity. The notably higher (albeit non-significant) incidence in April may point to a lack of preparedness to meet the physical demands of the start of the season. The findings have practical relevance for practitioners, identifying potential opportunities for preventative strategies.


Subject(s)
Athletic Injuries; Leg Injuries; Soft Tissue Injuries; Athletic Injuries/epidemiology; Athletic Injuries/etiology; Humans; Incidence; Leg Injuries/epidemiology; Male; Prevalence; Prospective Studies; Wales/epidemiology
9.
Int J Sports Med ; 43(4): 381-386, 2022 Apr.
Article in English | MEDLINE | ID: mdl-34535018

ABSTRACT

This study describes hand fracture and dislocation injuries in terms of anatomical distribution, incidence and impact on playing time in registered professional adult male players of all 18 First-Class England and Wales County Cricket clubs over a five-year period from 2010 to 2014. Prospectively collected injury surveillance data for 1st and 2nd Team matches (Twenty20, One-Day and four-day) and training were analysed. There were 109 hand fractures and 53 dislocations. Hand injury was most common during fielding (60%, 98/162) compared with batting, bowling or wicket-keeping. Exposed parts of the hand, including the tips of all digits, the index finger, the thumb ray and the little finger ray, were most frequently injured, accounting for 78% (125/160) of injuries for which anatomical location was recorded. Match injury incidence for batsmen was highest in four-day matches (0.071 injuries per 1000 overs batted), but for other player roles it was highest in Twenty20 matches (0.587 per 1000 overs bowled). Player unavailability for selection to play was incurred in 82% (89/109) of hand fractures but only 47% (25/53) of dislocations. This study clarifies the hand fracture and dislocation injury burden for this population.


Subject(s)
Athletic Injuries; Hand Injuries; Sports; Adult; Athletic Injuries/epidemiology; Athletic Injuries/etiology; England/epidemiology; Hand Injuries/epidemiology; Humans; Incidence; Male; Wales/epidemiology
10.
Clin J Sport Med ; 32(3): e300-e307, 2022 05 01.
Article in English | MEDLINE | ID: mdl-34009794

ABSTRACT

OBJECTIVE: To determine if playing position, a higher playing standard, and nonhelmet use are related to increased odds of joint-specific injury and concussion in cricket. DESIGN: Cross-sectional cohort. PARTICIPANTS: Twenty-eight thousand one hundred fifty-two current or former recreational and high-performance cricketers registered on a national database were invited to participate in the Cricket Health and Wellbeing Study. Eligibility requirements were age ≥18 years and having played ≥1 cricket season. INDEPENDENT VARIABLES: Main playing position (bowler/batter/all-rounder), playing standard (high-performance/recreational), and helmet use (always/most of the time/occasionally/never). MAIN OUTCOME MEASURES: Cross-sectional questionnaire data included cricket-related injury (hip/groin, knee, ankle, shoulder, hand, back) resulting in ≥4 weeks of reduced exercise and self-reported concussion history. Crude and adjusted (adjusted for seasons played) odds ratios and 95% confidence intervals (CIs) were estimated using logistic regression. RESULTS: Of 2294 participants (59% current cricketers; 97% male; age 52 ± 15 years; played 29 ± 15 seasons; 62% recreational cricketers), 47% reported cricket-related injury and 10% reported concussion. Bowlers had greater odds of hip/groin [odds ratio (95% CI), 1.9 (1.0-3.3)], knee [2.0 (1.4-2.8)], shoulder [2.9 (1.8-4.5)], and back [2.8 (1.7-4.4)] injury compared with batters. High-performance cricketers had greater odds of injury and concussion than recreational cricketers. Wearing a helmet most of the time [2.0 (1.4-3.0)] or occasionally [1.8 (1.3-2.6)] was related to higher odds of self-reported concussion compared with never wearing a helmet. Concussion rates were similar in cricketers who always and never wore a helmet. CONCLUSIONS: A higher playing standard and bowling (compared with batting) were associated with greater odds of injury. Wearing a helmet occasionally or most of the time was associated with higher odds of self-reported concussion compared with never wearing a helmet.
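Crude and adjusted odds ratios of this kind are typically obtained from logistic regression. A minimal statsmodels sketch is below, adjusting for seasons played as in the abstract; the data are simulated stand-ins for the questionnaire responses, and the variable names are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)

# Simulated cross-sectional records standing in for the questionnaire data.
n = 2294
df = pd.DataFrame({
    "bowler": rng.integers(0, 2, n),                      # 1 = bowler, 0 = batter
    "seasons_played": rng.normal(29, 15, n).clip(min=1),
})
logit_p = -1.5 + 0.7 * df["bowler"] + 0.01 * df["seasons_played"]
df["knee_injury"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# Adjusted odds ratio for bowlers vs. batters, controlling for seasons played.
fit = smf.logit("knee_injury ~ bowler + seasons_played", data=df).fit(disp=0)
or_point = np.exp(fit.params["bowler"])
or_lo, or_hi = np.exp(fit.conf_int().loc["bowler"])
print(f"adjusted OR {or_point:.2f} (95% CI {or_lo:.2f} to {or_hi:.2f})")
```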


Subject(s)
Athletic Injuries; Brain Concussion; Sports; Adolescent; Adult; Aged; Athletic Injuries/epidemiology; Brain Concussion/epidemiology; Cross-Sectional Studies; Female; Head Protective Devices; Humans; Male; Middle Aged
11.
Int J Sports Med ; 43(6): 526-532, 2022 06.
Article in English | MEDLINE | ID: mdl-34555858

ABSTRACT

This study aimed to investigate the impact of COVID-19 enforced prolonged training disruption and a shortened competitive season on in-season injury and illness rates. Injury incidence and percentage proportion were calculated for the 2020 elite men's senior domestic cricket season and compared with a historical average from five previous regular seasons (2015 to 2019 inclusive). The injury profile for the shortened 2020 season was generally equivalent to what would be expected in a regular season, except for a significant increase in medical illness as a proportion of time loss (17% compared with a historic average of 6%) and of in-season days lost (9% compared with a historic average of 3%) due to COVID-19 related instances (most notably precautionary isolation due to contact with a confirmed or suspected COVID-19 case). There was a significant increase in the proportion of in-season days lost to thigh injuries (24% compared with 9%) and a significant decrease in the proportion of days lost to hand (4% compared with 12%) and lumbar spine (7% compared with 21%) injuries. These findings enhance understanding of the impact a prolonged period of training disruption and a shortened season can have on cricket injuries and the challenges faced by practitioners under such circumstances.


Subject(s)
Athletic Injuries; COVID-19; Leg Injuries; Athletic Injuries/epidemiology; COVID-19/epidemiology; Humans; Incidence; Male; Seasons
12.
Int J Sports Med ; 43(4): 344-349, 2022 Apr.
Article in English | MEDLINE | ID: mdl-34560790

ABSTRACT

This exploratory retrospective cohort analysis aimed to explore whether algorithmic models can identify important risk factors that may otherwise not be apparent; the association of these factors with injury was then assessed with more conventional data models. Participants were players registered on the England and Wales Cricket Board women's international development pathway (n=17) from April 2018 to August 2019, aged between 14 and 23 years (mean 18.2±1.9) at the start of the study period. Two supervised learning techniques (a decision tree and a random forest, with traditional and conditional algorithms) and generalised linear mixed effect models explored associations between risk factors and injury. The supervised learning models did not predict injury (decision tree and random forest area under the curve [AUC] of 0.66 and 0.72 for the conditional algorithms) but did identify important risk factors. The best-fitting generalised linear mixed effect model for predicting injury (Akaike Information Criterion [AIC]=843.94, conditional r-squared=0.58) contained smoothed differential 7-day load (P<0.001), average broad jump scores (P<0.001) and 20 m speed (P<0.001). Algorithmic models identified novel injury risk factors in this population, which can guide practice and which future confirmatory studies can now investigate.
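Random forest AUC values like those quoted can be reproduced in outline with scikit-learn. The sketch below uses simulated monitoring variables (load, jump, sprint) in place of the pathway data, and cross-validated predicted probabilities for the AUC; it illustrates the general approach rather than the study's exact conditional-inference implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(4)

# Simulated weekly monitoring rows (load, broad jump, 20 m time) with an injury label.
n = 600
X = np.column_stack([
    rng.normal(0, 1, n),       # smoothed differential 7-day load (standardised)
    rng.normal(2.1, 0.2, n),   # broad jump (m)
    rng.normal(3.2, 0.15, n),  # 20 m sprint time (s)
])
y = (rng.random(n) < 1 / (1 + np.exp(-(-2 + 0.8 * X[:, 0])))).astype(int)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
proba = cross_val_predict(rf, X, y, cv=5, method="predict_proba")[:, 1]
print("cross-validated AUC:", round(roc_auc_score(y, proba), 2))
print("feature importances:", rf.fit(X, y).feature_importances_.round(2))
```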


Subject(s)
Athletic Injuries; Cricket Sport; Adolescent; Female; Humans; Young Adult; Algorithms; England/epidemiology; Retrospective Studies; Risk Factors; Cricket Sport/injuries; Athletic Injuries/epidemiology
13.
Med Sci Sports Exerc ; 54(3): 438-446, 2022 03 01.
Article in English | MEDLINE | ID: mdl-34711706

ABSTRACT

INTRODUCTION: Localized bone mineral density (BMD) adaptation of the lumbar spine, particularly on the contralateral side to the bowling arm, has been observed in elite male cricket fast bowlers. No study has investigated this in adolescents, or the role of fast bowling technique in lumbar BMD adaptation. This study aims to investigate lumbar BMD adaptation in adolescent cricket fast bowlers, and its relationship with fast bowling technique. METHODS: Thirty-nine adolescent fast bowlers underwent an anteroposterior dual-energy X-ray absorptiometry scan of the lumbar spine. Hip, lumbopelvic and thoracolumbar joint kinematics, and vertical ground reaction kinetics, were determined using three-dimensional motion capture and force plates. Significant partial (covariate: fat-free mass) and bivariate correlations of the technique parameters with whole lumbar (L1-L4) BMD and BMD asymmetry (L3 and L4) were advanced as candidate variables for multiple stepwise linear regression. RESULTS: Adolescent fast bowlers demonstrated high lumbar Z-scores (+1.0; 95% confidence interval [CI], 0.7-1.4) and significantly greater BMD on the contralateral side of L3 (9.0%; 95% CI, 5.8%-12.1%) and L4 (8.2%; 95% CI, 4.9%-11.5%). Maximum contralateral thoracolumbar rotation and maximum ipsilateral lumbopelvic rotation in the period between back foot contact and ball release (BR), as well as contralateral pelvic drop at front foot contact, were identified as predictors of L1-L4 BMD, explaining 65% of the variation. Maximum ipsilateral lumbopelvic rotation between back foot contact and BR, as well as ipsilateral lumbopelvic rotation and contralateral thoracolumbar side flexion at BR, were predictors of lumbar asymmetry within L3 and L4. CONCLUSIONS: Thoracolumbar and lumbopelvic motion are implicated in the etiology of the unique lumbar bone adaptation observed in fast bowlers, whereas vertical ground reaction force, independent of body mass, was not. This may further implicate the osteogenic potential of torsional rather than impact loading in exercise-induced adaptation.
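The screening step described, partial correlations with fat-free mass as a covariate, can be sketched by residualising both variables on the covariate and correlating the residuals. The values below are simulated and the variable names illustrative; this is one standard way to compute a partial correlation, not necessarily the software route the authors used.

```python
import numpy as np

rng = np.random.default_rng(5)

def partial_corr(x, y, covariate):
    """Partial correlation of x and y controlling for one covariate:
    residualise both variables on the covariate, then correlate the residuals."""
    def residuals(v):
        slope, intercept = np.polyfit(covariate, v, 1)
        return v - (slope * covariate + intercept)
    return np.corrcoef(residuals(x), residuals(y))[0, 1]

# Illustrative values: trunk rotation (deg), L1-L4 BMD (g/cm^2), fat-free mass (kg).
n = 39
ffm = rng.normal(55, 6, n)
rotation = rng.normal(30, 5, n)
bmd = 0.5 + 0.008 * ffm + 0.004 * rotation + rng.normal(0, 0.03, n)
print(round(partial_corr(rotation, bmd, ffm), 2))
```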


Subject(s)
Bone Density/physiology; Cricket Sport/physiology; Lumbar Vertebrae/physiology; Movement/physiology; Absorptiometry, Photon; Adolescent; Biomechanical Phenomena; Humans; Male
14.
BMJ Open Sport Exerc Med ; 7(4): e001128, 2021.
Article in English | MEDLINE | ID: mdl-34950503

ABSTRACT

OBJECTIVES: To investigate the observable player behaviours and features of both concussive (HS-C) and non-concussive (HS-NC) helmet strikes and describe their impact on playing performance. METHODS: Elite male cricketers sustaining helmet strikes between the 2016 and 2018 seasons were identified by the England and Wales Cricket Board. Medical records identified players sustaining a concussion and those in whom concussion was excluded. Retrospective cohort analysis was performed on batting and bowling performance data available for these players in the 2 years prior to and 3 months post helmet strike. Video analysis of available incidents was conducted to describe the characteristics of the helmet strike and subsequent observable player behaviours. The HS-C and HS-NC cohorts were compared. RESULTS: Data were available for 194 helmet strikes, of which 56 (29%) resulted in concussion. No significant differences were seen in playing performance in the 3 months post concussive helmet strike; however, a significant decline in batting performance was seen in this period in the HS-NC group (p<0.001). Video features signifying motor incoordination were most useful in identifying concussion post helmet strike, although typical features suggesting transient loss of consciousness were not seen. Features such as a longer pause before the batsman resumed play and the level of concern shown by other players were also useful. CONCLUSION: HS-NC may be more significant for player performance than previously thought. Guidance for using video replay to identify concussion in cricket may need to be modified when compared with other field sports.

16.
Int J Sports Med ; 42(12): 1058-1069, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34255324

ABSTRACT

A review of literature on the role of fomites in transmission of coronaviruses informed the development of a framework, which was used to qualitatively analyse a cricket case study, where equipment is shared and passed around, and to identify potential mitigation strategies. A range of pathways were identified that might in theory allow coronavirus transmission from an infected person to a non-infected person via communal or personal equipment fomites, or both. Eighteen percent of potential fomite-based interactions were found to be non-essential to play, including all contact with another person's equipment. Six opportunities to interrupt the transmission pathway were identified, including the recommendation to screen participants for symptoms prior to play. Social distancing between participants and avoiding unnecessary surface contact provide two opportunities: first, to avoid equipment exposure to infected respiratory droplets, and second, to avoid uninfected participants touching potential fomites. Hand sanitisation and equipment sanitisation provide two further opportunities by directly inactivating coronavirus. Preventing players from touching their mucosal membranes with their hands represents the sixth potential interruption. Whilst potential fomite transmission pathways were identified, evidence suggests that viral load will be substantially reduced during surface transfer. Mitigation strategies could further reduce potential fomites, suggesting that, by comparison, direct airborne transmission presents the greater risk in cricket.


Subject(s)
COVID-19/transmission; Fomites/virology; Pandemics/prevention & control; Sports Equipment; COVID-19/prevention & control; Hand/virology; Humans; Physical Distancing; Touch
17.
BMJ Open Sport Exerc Med ; 7(1): e000910, 2021.
Article in English | MEDLINE | ID: mdl-33537152

ABSTRACT

INTRODUCTION: Epidemiology reporting within the cricketing medical literature has emerged over the past 2 years, with a focus on physical injuries. Despite mental health in elite sport gaining increasing recognition, few studies have addressed mental health symptoms and disorders within cricket. Recently, cricketers have been prominent in the mainstream media describing their lived experiences of mental illness. As a result, some have withdrawn from competition and suggested there is an unmet need for mental health services within the sport. OBJECTIVES: (i) To appraise the existing evidence on mental health symptoms and disorders amongst cricketers. (ii) To provide guidance on shaping mental health research and services within cricket. DESIGN: A narrative review of the literature from inception of available databases until 26 July 2019, with analysis and recommendations. RESULTS: Five studies were included in this narrative review. Studies covered a range of mental health symptoms and disorders, including distress, anxiety, depression, sleep disturbance, suicide, adverse alcohol use, illicit drug use, eating disorders and bipolar disorder. Results indicated that cricketers are at high risk for distress, anxiety, depression and adverse alcohol use. When compared with the general population, cricketers are more likely to experience anxiety and depressive symptoms. Rates of suicide were proposed to be high for test cricketers. Overall, studies to date have been of low quality, demonstrating non-rigorous research methods. Some studies have relied on non-validated questionnaires to collect self-reported data on mental health symptoms and disorders, while others have presented biographical data obtained through searches of the media. CONCLUSIONS: The results of this narrative review highlight the lack of evidence underpinning mental health services for athletes within cricket. We suggest the following recommendations for future research and practice: (i) normalising mental health symptoms and disorders; (ii) working with and helping vulnerable demographic segments within the target population; (iii) designing and implementing early recognition systems of mental health symptoms and disorders; (iv) addressing the mental health needs of cricketers on a population basis.

18.
Int J Sports Med ; 42(5): 407-418, 2021 May.
Article in English | MEDLINE | ID: mdl-33511617

ABSTRACT

A review of risk factors affecting airborne transmission of SARS-CoV-2 was synthesised into an 'easy-to-apply' visual framework. Using this framework, video footage from two cricket matches was visually analysed: one from before the COVID-19 pandemic and one 'COVID-19 aware' game played in early 2020. Each occasion on which one participant could be exposed to biological secretions belonging to another participant was recorded as an exposure, as was the estimated severity of exposure as defined from the literature. Events were rated based on distance between subjects, relative orientation of the subjects, droplet-generating activity performed (e.g., talking) and event duration. In the analysis, each risk category was reviewed independently as well as the compound effect of an exposure, i.e., the product of the scores across all categories. With the application of generic, non-cricket-specific social distancing recommendations and general COVID-19 awareness, the number of exposures per 100 balls was reduced by 70%. More impressive were the 98% decrease in the most severe compound ratings (those with two or more categories scored at the highest severity) and the 96% reduction in exposures with a proximity of less than 1 m. Analysis of the factors affecting transmission risk indicated that cricket was likely to present a low risk, although this conclusion is somewhat arbitrary in the absence of a comparison with a non-cricketing activity.
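The compound effect of an exposure is described as the product of the category scores. A small illustrative sketch follows; the 1-3 scales and the example event are assumptions for demonstration, not the published framework's actual weightings.

```python
from dataclasses import dataclass
from math import prod

@dataclass
class ExposureEvent:
    """One interaction between two participants, scored on the framework's categories
    (the 1-3 scales here are illustrative, not the published weightings)."""
    distance: int      # 3 = <1 m, 1 = well distanced
    orientation: int   # 3 = face-to-face, 1 = back-to-back
    activity: int      # 3 = shouting/appealing, 1 = quiet
    duration: int      # 3 = prolonged, 1 = momentary

    def compound_rating(self) -> int:
        # Compound effect of an exposure: the product of the category scores.
        return prod((self.distance, self.orientation, self.activity, self.duration))

wicket_celebration = ExposureEvent(distance=3, orientation=3, activity=3, duration=2)
print(wicket_celebration.compound_rating())   # 54
```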


Subject(s)
Air Microbiology; COVID-19/transmission; Cricket Sport; Physical Distancing; Aerosols; Cough/virology; Environmental Exposure; Humans; Pandemics; Respiration; Risk Factors; SARS-CoV-2; Sneezing; Social Interaction
19.
Med Sci Sports Exerc ; 53(3): 581-589, 2021 03 01.
Article in English | MEDLINE | ID: mdl-32910096

ABSTRACT

INTRODUCTION: Lumbar bone stress injuries (LBSI) are the most prevalent injury in cricket. Although fast bowling technique has been implicated in the etiology of LBSI, no previous study has attempted to prospectively analyze fast bowling technique and its relationship to LBSI. The aim of this study was to explore technique differences between elite cricket fast bowlers with and without subsequent LBSI. METHODS: Kinematic and kinetic technique parameters previously associated with LBSI were determined for 50 elite male fast bowlers. Group means were compared using independent-samples t-tests to identify differences between bowlers with and without a prospective LBSI. Significant parameters were advanced as candidate variables for a binary logistic regression analysis. RESULTS: Of the 50 bowlers, 39 sustained a prospective LBSI. Significant differences were found between injured and noninjured bowlers in rear knee angle, rear hip angle, thoracolumbar side flexion angle, and thoracolumbar rotation angle at back foot contact; the front hip angle, pelvic tilt orientation, and lumbopelvic angle at front foot contact; the thoracolumbar side flexion angle at ball release; and the maximal front hip angle and ipsilateral pelvic drop orientation. A binary logistic model consisting of rear hip angle at back foot contact and lumbopelvic angle at front foot contact correctly classified 88% of fast bowlers according to injury history, with both parameters significantly associated with the odds of sustaining an LBSI (odds ratios 0.88 and 1.25, respectively). CONCLUSIONS: Lumbopelvic motion is implicated in the etiology of LBSI in fast bowling, with inadequate lumbopelvifemoral complex control as a potential cause. This research will aid the identification of fast bowlers at risk of LBSI, as well as enhance coaching and rehabilitation of fast bowlers recovering from LBSI.
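The 88% figure is a classification accuracy from a two-predictor binary logistic model. A brief scikit-learn sketch with simulated kinematic values (not the bowlers' data) shows how such a model, its in-sample accuracy and per-degree odds ratios would be obtained; it is a sketch of the general technique, not the authors' exact analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(7)

# Simulated kinematics: rear hip angle at back foot contact, lumbopelvic angle at front foot contact.
n = 50
X = np.column_stack([rng.normal(150, 10, n), rng.normal(20, 6, n)])
y = (rng.random(n) < 1 / (1 + np.exp(-(-0.1 * (X[:, 0] - 150) + 0.2 * (X[:, 1] - 20))))).astype(int)

model = LogisticRegression().fit(X, y)
accuracy = accuracy_score(y, model.predict(X))   # in-sample, as reported in the abstract
odds_ratios = np.exp(model.coef_[0])             # per-degree odds ratios for each predictor
print(f"classification accuracy {accuracy:.0%}, odds ratios {odds_ratios.round(2)}")
```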


Subject(s)
Biomechanical Phenomena/physiology; Cricket Sport/injuries; Lumbar Vertebrae/injuries; Athletes; Athletic Injuries/etiology; Cricket Sport/physiology; Foot/physiology; Hip Joint/physiology; Humans; Knee Joint/physiology; Logistic Models; Male; Posture/physiology; Prospective Studies; Range of Motion, Articular/physiology; Regression Analysis; Spine/physiology; Young Adult
20.
J Sci Med Sport ; 24(2): 141-145, 2021 Feb.
Article in English | MEDLINE | ID: mdl-32839107

ABSTRACT

OBJECTIVES: To examine the relationship between injuries and team success in professional cricket. DESIGN: Prospective cohort analysis. METHODS: A prospective cohort of all match time-loss injuries and County Championship point tallies for nine seasons (2010 to 2018 inclusive) for all 18 First-Class County Cricket (FCCC) teams in England and Wales. Two injury measures, match time-loss injury incidence and injury burden, were assessed for within-team (linear mixed model on season-to-season changes) and between-team (correlation on differences averaged over all seasons) effects. County Championship league points tally was used as the measure of team success. RESULTS: A moderate negative correlation was found between injury burden and team performance (r=-0.36; 90% CI -0.66 to 0.05; likely negative, P=0.15). A reduction in match injury incidence of 2 match time-loss injuries per 1000 days of play (90% CI 1.4-2.9, P=0.10) within a team, or a reduction in match injury burden of 75 days per 1000 days of play (90% CI 50-109, P=0.053) in any given season, was associated with the smallest worthwhile change in County Championship points (+13 points) for Division 1, but not for Division 2. CONCLUSION: Moderate reductions in injury burden are associated with potentially worthwhile effects on performance for a domestic cricket team in County Championship Division 1.
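The between-team analysis reports a Pearson correlation with a 90% CI; a Fisher-z interval is one standard way to produce such an interval. The sketch below uses simulated team-level values for injury burden and championship points, so it illustrates the calculation rather than reproducing the study's result.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

def pearson_with_fisher_ci(x, y, conf=0.90):
    """Pearson r with a Fisher-z confidence interval (90% here, matching the paper's reporting)."""
    r = np.corrcoef(x, y)[0, 1]
    z, se = np.arctanh(r), 1 / np.sqrt(len(x) - 3)
    crit = stats.norm.ppf(0.5 + conf / 2)
    lo, hi = np.tanh([z - crit * se, z + crit * se])
    return r, lo, hi

# Simulated team-level averages: injury burden (days lost/1000 days of play) vs. points.
burden = rng.normal(250, 60, 18)
points = 220 - 0.15 * burden + rng.normal(0, 20, 18)
r, lo, hi = pearson_with_fisher_ci(burden, points)
print(f"r = {r:.2f} (90% CI {lo:.2f} to {hi:.2f})")
```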


Subject(s)
Athletic Injuries/epidemiology; Athletic Performance; Competitive Behavior/physiology; Cricket Sport/injuries; Athletic Injuries/prevention & control; England/epidemiology; Humans; Incidence; Prospective Studies; Wales/epidemiology