ABSTRACT
INTRODUCTION: Long QT syndrome (LQTS) is a cardiac channelopathy characterized by QT prolongation and a potential for arrhythmic syncope, sudden cardiac arrest, or death (SCA/SCD). It has been speculated that patients with LQTS might have a primary sinoatrial node (SAN) phenotype of chronotropic insufficiency (CI). This has not previously been demonstrated convincingly because of the potentially confounding effects of beta blocker (BB) therapy. Herein, we set out to determine whether untreated patients with LQTS demonstrate intrinsic CI. METHODS AND RESULTS: A retrospective review of all treadmill exercise stress tests (TEST) was performed on patients with one of the three most common LQTS genotypes: LQT1, LQT2, and LQT3. For each patient, the first TEST completed while off BB was analyzed. Patients with prior left cardiac sympathetic denervation (LCSD) therapy were excluded. CI was defined as achieving < 85% of the age- and gender-predicted peak heart rate (HR) and/or < 80% of the predicted HR reserve (HRR). Overall, 463 LQTS patients (245 LQT1, 125 LQT2, and 93 LQT3) were included (267 female [58%]; mean age at time of TEST, 29 ± 17 years). Mean % predicted peak HR for all LQTS patients was 87.6% (range 42.9% - 119.1%) and mean % predicted HRR was 80% (range 19.1% - 153%). Overall, half of all LQTS patients (n = 234; 51%) displayed CI; 64% of patients with LQT1 (n = 157), 37% with LQT2 (n = 46), and 33% with LQT3 (n = 31). Patients with LQT1 were more likely to exhibit CI than patients with LQT2 (p < .0001) or LQT3 (p < .0001). CI was significantly more common in LQT1 compared to controls (p < .0001), while there was no difference between LQT2 (p = .5) or LQT3 and controls (p > .9). Presence of CI was not a predictor of LQTS-associated symptoms, BB side effects, or likelihood of future breakthrough cardiac events (BCE). CONCLUSIONS: Patients with LQTS, particularly LQT1, demonstrate a SAN phenotype of CI. If assessing BB therapy effect by impact on peak HR, the patient's pretreatment peak HR, rather than an age- and gender-predicted maximum HR, should be used.
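A minimal sketch of the CI criteria described above, assuming the conventional 220 − age estimate of maximum HR (the abstract does not state which prediction equation was used); the function names and example values are illustrative only.

```python
def predicted_max_hr(age):
    """Age-predicted maximum HR; 220 - age is assumed here because the
    abstract does not specify which prediction equation was used."""
    return 220 - age

def chronotropic_insufficiency(age, resting_hr, peak_hr):
    """Flag CI per the abstract's definition: % predicted peak HR < 85%
    and/or % predicted HR reserve < 80%."""
    pred_max = predicted_max_hr(age)
    pct_peak = 100 * peak_hr / pred_max
    pct_hrr = 100 * (peak_hr - resting_hr) / (pred_max - resting_hr)
    return {
        "pct_predicted_peak_hr": pct_peak,
        "pct_predicted_hrr": pct_hrr,
        "ci": pct_peak < 85 or pct_hrr < 80,
    }

# Example: a 30-year-old with resting HR 65 and peak HR 150 on TEST
print(chronotropic_insufficiency(30, 65, 150))
```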
ABSTRACT
Exercise stress testing (EST) is commonly used to evaluate chest pain, with some laboratories using 85% of the age-predicted maximum heart rate (APMHR) as an endpoint for EST. The APMHR is often calculated using the formula 220 - age. However, the accuracy of this formula and of 85% APMHR as an endpoint may be questioned. Moreover, failing to reach 85% of APMHR (known as chronotropic insufficiency) may also indicate poor cardiovascular prognosis, but measurements such as percentage heart rate reserve (%HRR), maximum rate pressure product (MRPP), and the maximum metabolic equivalents of task (METs) reached during EST may provide better prediction of cardiovascular outcomes than failure to reach 85% of APMHR. There is a need to incorporate comprehensive measurements to improve the diagnostic and prognostic capabilities of EST.
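As a worked illustration of the formula cited above, a short sketch computing APMHR and the 85% endpoint; the function names and values are hypothetical.

```python
def apmhr(age):
    # Conventional age-predicted maximum HR (220 - age), as cited in the abstract
    return 220 - age

def reached_85_pct(age, peak_hr):
    """True if the patient reached 85% of APMHR during the EST."""
    return peak_hr >= 0.85 * apmhr(age)

print(apmhr(50), reached_85_pct(50, 150))   # APMHR 170 bpm; 85% target = 144.5 bpm
```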
ABSTRACT
BACKGROUND: People with HIV (PWH) have lower exercise capacity than peers without HIV, which may be explained by chronotropic incompetence, the inability to increase heart rate during exercise. METHODS: The Exercise for Healthy Aging Study included adults aged 50 to 75 years with and without HIV. Participants completed 12 weeks of moderate-intensity exercise, before randomization to moderate or high intensity for 12 additional weeks. We compared adjusted heart rate reserve (AHRR; chronotropic incompetence <80%) on cardiopulmonary exercise testing by HIV serostatus and change from baseline to 12 and 24 weeks using mixed effects models. RESULTS: Among 32 PWH and 37 controls (median age, 56 years; 7% female), 28% of PWH vs 11% of controls had chronotropic incompetence at baseline (P = .067). AHRR was lower among PWH (91% vs 101%; difference, 10%; 95% CI, 1.9%-18.9%; P = .02). At week 12, AHRR normalized among PWH (+8%; 95% CI, 4%-11%; P < .001) and was sustained at week 24 (+5%; 95% CI, 1%-9%; P = .008) versus no change among controls (95% CI, -4% to 4%; P = .95; interaction P = .004). After 24 weeks of exercise, 15% of PWH and 10% of controls had chronotropic incompetence (P = .70). CONCLUSIONS: Chronotropic incompetence contributes to reduced exercise capacity among PWH and improves with exercise training.
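The analysis described (AHRR compared by serostatus and over time with mixed-effects models) could be sketched roughly as below; the data are synthetic and the column names hypothetical, so this only illustrates the model structure, not the study's actual code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 69                                    # 32 PWH + 37 controls
pid = np.repeat(np.arange(n), 3)          # three visits: baseline, week 12, week 24
week = np.tile([0, 12, 24], n)
hiv = np.repeat((np.arange(n) < 32).astype(int), 3)
# Synthetic AHRR: lower at baseline for PWH, improving with training
ahrr = 100 - 10 * hiv + 0.3 * week * hiv + rng.normal(0, 8, n * 3)

df = pd.DataFrame({"pid": pid, "week": week, "hiv": hiv, "ahrr": ahrr})
# Random intercept per participant; fixed effects for serostatus, time, interaction
model = smf.mixedlm("ahrr ~ hiv * week", df, groups=df["pid"]).fit()
print(model.summary())
```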
Subjects
Exercise, HIV Infections, Heart Rate, Humans, Female, Middle Aged, Male, HIV Infections/physiopathology, Aged, Heart Rate/physiology, Exercise/physiology, Healthy Aging/physiology
ABSTRACT
Background: Rating of perceived exertion (RPE) is considered a valid method for prescribing prolonged aerobic steady-state exercise (SSE) intensity due to its association with physiological indicators of exercise intensity, such as oxygen uptake (V̇O2) or heart rate (HR). However, these associations between psychological and physiological indicators of exercise intensity were found during graded exercise tests (GXT) but are currently used to prescribe SSE intensity even though the transferability and validity of the relationships found during GXT to SSE were not investigated. The present study aims to verify whether (a) RPE-HR or RPE-V̇O2 relations found during GXTs are valid during SSEs, and (b) the duration and intensity of SSE affect these relations. Methods: Eight healthy and physically active males (age 22.6 ± 1.2 years) were enrolled. On the first visit, pre-exercise (during 20 min standing) and maximal (during a GXT) HR and V̇O2 values were measured. Then, on separate days, participants performed 4 SSEs on the treadmill by running at 60% and 80% of the HR reserve (HRR) for 15 and 45 min (random order). Individual linear regressions between GXTs' RPE (dependent variable) and HRR and V̇O2 reserve (V̇O2R) values (computed as the difference between maximal and pre-exercise values) were used to predict the RPE associated with %HRR (RPEHRR) and %V̇O2R (RPEV̇O2R) during the SSEs. For each relation (RPE-%HRR and RPE-%V̇O2R), a three-way factorial repeated measures ANOVA (α = 0.05) was used to assess if RPE (dependent variable) was affected by exercise modality (i.e., RPE recorded during SSE [RPESSE] or GXT-predicted), duration (i.e., 15 or 45 min), and intensity (i.e., 60% or 80% of HRR). Results: The differences between RPESSE and GXT-predicted RPE, which were assessed by evaluating the effect of modality and its interactions with SSE intensity and duration, showed no significant differences between RPESSE and RPEHRR. However, when RPESSE was compared with RPEV̇O2R, although modality or its interactions with intensity were not significant, there was a significant (p = 0.020) interaction effect of modality and duration yielding a dissociation between changes of RPESSE and RPEV̇O2R over time. Indeed, RPESSE did not change significantly (p = 0.054) from SSE of 15 min (12.1 ± 2.0) to SSE of 45 min (13.5 ± 2.1), with a mean change of 1.4 ± 1.8, whereas RPEV̇O2R decreased significantly (p = 0.022) from SSE of 15 min (13.7 ± 3.2) to SSE of 45 min (12.4 ± 2.8), with a mean change of -1.3 ± 1.5. Conclusion: The transferability of the individual relationships between RPE and physiological parameters found during GXT to SSE should not be assumed, as shown by the results of this study. Therefore, future studies modelling how the exercise prescription method used (e.g., RPE, HR, or V̇O2) and SSE characteristics (e.g., exercise intensity, duration, or modality) affect the relationships between RPE and physiological parameters are warranted.
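A minimal sketch of the individual GXT calibration described above: RPE regressed on %HRR across GXT stages, with the fitted line then used to predict the RPE expected at the %HRR held during an SSE. All numbers are illustrative.

```python
import numpy as np

# GXT stage data for one participant (illustrative values only)
pct_hrr_gxt = np.array([30, 45, 60, 75, 90])     # %HRR at each stage
rpe_gxt = np.array([9, 11, 13, 15, 17])          # Borg 6-20 RPE at each stage

slope, intercept = np.polyfit(pct_hrr_gxt, rpe_gxt, 1)

def predicted_rpe(pct_hrr_sse):
    """GXT-predicted RPE at the %HRR maintained during the SSE."""
    return slope * pct_hrr_sse + intercept

# Predict RPE for SSEs run at 60% and 80% HRR
print(predicted_rpe(60), predicted_rpe(80))
```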
Subjects
Exercise Test, Exercise, Heart Rate, Oxygen Consumption, Physical Exertion, Humans, Male, Heart Rate/physiology, Physical Exertion/physiology, Oxygen Consumption/physiology, Young Adult, Exercise Test/methods, Exercise/physiology, Exercise/psychology, Adult, Perception/physiology
ABSTRACT
PURPOSE: Waste collection is considered particularly heavy work, although no previous study has investigated the strain of bulk waste collection. The aim of this study is to determine the workload of bulk waste workers in practice. METHOD: We conducted a cross-sectional field study. Fourteen male volunteers from the bulk waste collection service of the municipal sanitation department in Hamburg, Germany, were included. Performance was determined by cardiopulmonary exercise testing under laboratory conditions. During the shift, each worker was accompanied by a researcher, and heart rate (HR) was recorded under field conditions using an HR watch with a chest-belt system. We examined mean HR, relative heart rate (RHR), relative aerobic strain (RAS), calculated oxygen uptake (V̇O2), and the individual ventilatory threshold 1 (VT1) as parameters of workload during daily work. RESULTS: During the shift, mean HR was 102 bpm (SD 10.2), RHR 36.9%, calculated V̇O2 1267 ml/min (SD 161), RAS 49.4% (SD 9.3), and V̇O2 relative to VT1 75% (SD 18.5). There was no significant difference between oxygen consumption during the main task of lifting and carrying bulky waste and the individual V̇O2 at VT1. CONCLUSION: Although the burden of the main task of lifting and carrying bulky waste is very high (at VT1 for more than 3 h), interruptions from other tasks or formal breaks spread the burden over the entire shift. The total workload exceeded most recommendations in the literature across the different work periods. However, the total burden remains below VT1, the only parameter that takes individual endurance performance into account. We again recommend VT1 as an individual upper limit for prolonged occupational work.
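One common way to obtain "calculated oxygen uptake" from field HR is an individual HR-VO2 regression derived from the laboratory CPET; whether the study used exactly this approach is an assumption, and the sketch below uses invented values.

```python
import numpy as np

# Individual HR-VO2 calibration from the lab CPET (illustrative values)
hr_lab = np.array([80, 100, 120, 140, 160])        # CPET heart rates (bpm)
vo2_lab = np.array([600, 1100, 1600, 2100, 2600])  # CPET VO2 (ml/min)
slope, intercept = np.polyfit(hr_lab, vo2_lab, 1)

vt1_vo2 = 1700                                     # VO2 at ventilatory threshold 1 (ml/min)
shift_hr = 102                                     # mean HR during the work shift (bpm)
vo2_est = slope * shift_hr + intercept             # estimated field VO2 from worn HR
print(f"Estimated shift VO2: {vo2_est:.0f} ml/min "
      f"({100 * vo2_est / vt1_vo2:.0f}% of VT1)")
```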
ABSTRACT
Introduction: Heart rate (HR) monitors are rarely used by people living with disabilities (PLWD), and their accuracy is undocumented. Thus, this study aims to describe the HR response during the Team Twin co-running program and, secondly, to assess the agreement and accuracy of HR monitors among PLWD. Methods: This 16-week single-arm observational study included 18 people with various disabilities. During the study, the subjects wore a Garmin Vivosmart 4 watch (wrist). To evaluate agreement and accuracy we applied Garmin's HRM-DUAL™ chest-worn HR monitors for comparison with the Vivosmart 4. The HR response analysis was performed descriptively and with a mixed regression model. The HR agreement and accuracy procedure was conducted on a subsample of five subjects and analyzed using Lin's concordance analysis, Bland and Altman's limits of agreement, and Cohen's kappa analysis of intensity zone agreement. This study was prospectively registered at ClinicalTrials.gov (NCT04536779). Results: The subjects had a mean age of 35 (±12.6) years; 61% were male, 72% had cerebral palsy, and 85% were GMFCS level IV-V. HR was monitored for 202:10:33 (HH:MM:SS), with a mean HR of 90 ± 17 bpm during training and races. A total of 19% of the time was spent in intensity zones between light and moderate (30%-59% HR reserve) and 1% in vigorous (60%-84% HR reserve). The remaining 80% was in the very light intensity zone (<29% HR reserve). HR was highest at the start of races and training sessions and steadily decreased. Inter-rater agreement was high (k = 0.75), limits of agreement were between -16 and 13 bpm, and accuracy was acceptable (Rc = 0.86). Conclusion: Disability type, individual, and contextual factors will likely affect HR responses and the agreement and accuracy of HR monitors for PLWD. The Vivosmart 4, while overall accurate, had low precision due to high variability in the estimation. These findings highlight the methodological and practical difficulties of using HR monitors to measure HR, and thus physical activity, in adapted sports activities for severely disabled individuals.
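A minimal sketch of the agreement statistics named above (Bland-Altman limits of agreement and Lin's concordance correlation coefficient) applied to synthetic paired wrist/chest HR samples; it illustrates the calculations, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)
chest = rng.normal(90, 15, 500)                 # reference HR from chest strap (bpm)
wrist = chest + rng.normal(-1.5, 7, 500)        # wrist HR with bias and noise

diff = wrist - chest
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient."""
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return 2 * cov / (vx + vy + (mx - my) ** 2)

print(f"bias={bias:.1f} bpm, LoA=({loa[0]:.1f}, {loa[1]:.1f}), "
      f"CCC={lins_ccc(wrist, chest):.2f}")
```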
ABSTRACT
Background: People with HIV (PWH) have lower exercise capacity compared to HIV-uninfected peers, which may be explained by chronotropic incompetence (CI), the inability to increase heart rate during exercise. Methods: The Exercise for Healthy Aging Study included adults aged 50-75 years with and without HIV. Participants completed 12 weeks of moderate-intensity exercise before randomization to moderate or high intensity for 12 additional weeks. We compared adjusted heart rate reserve (AHRR; CI <80%) on cardiopulmonary exercise testing by HIV serostatus, and change from baseline to 12 and 24 weeks, using mixed effects models. Results: Among 32 PWH and 37 controls (median age 56 years, 7% female, mean BMI 28 kg/m2), 28% of PWH compared to 11% of controls had CI at baseline (p=0.067). AHRR was lower among PWH (91 vs 102%; difference 11%, 95% CI 2.5-19.7; p=0.01). At week 12, AHRR normalized among PWH (+8%, 95% CI 4-11; p<0.001) and was sustained at week 24 (+5%, 95% CI 1-9; p=0.008) compared to no change among controls (95% CI -4 to 4; p=0.95; interaction p=0.004). After 24 weeks of exercise, only 15% of PWH and 10% of controls had CI (p=0.70). Conclusions: Chronotropic incompetence contributes to reduced exercise capacity among PWH and improves with exercise training.
ABSTRACT
This study examined the effect of 10 weeks of interval training (IT) at varying intensities on serum muscle damage indicators and antioxidant capacity, and determined its effect on the 800-m records of adolescent middle-distance runners. Twenty male high-school middle-distance runners were randomized to the high-intensity IT (HIIT; n=10) or medium-intensity IT (MIIT; n=10) group. Three sessions/week were performed for 10 weeks (30 sessions in total); each IT session lasted 60 min. The high and medium exercise intensities were set at 90%-95% and 60%-70% of heart rate reserve (HRR), respectively. Intensity at rest was 40% HRR for both groups. Weight training was performed at 60%-70% of one repetition maximum for two sessions/week. The changes in serum muscle damage indicators and antioxidant capacity in the two groups were measured, and their effects on the 800-m records were analyzed. The 10-week training reduced serum muscle damage indicators in middle-distance runners, but only the HIIT group displayed a decrease in creatine kinase. Regarding the change in antioxidant capacity, the two groups demonstrated no significant change in malondialdehyde (MDA), whereas the HIIT group exhibited a significant increase in superoxide dismutase (SOD). IT also reduced the 800-m times of the middle-distance runners, and the effect was stronger in the HIIT group. In conclusion, 10 weeks of HIIT can have a positive effect on muscle damage indicators, significantly increase SOD as a key indicator of antioxidant capacity, and improve the 800-m records of middle-distance runners.
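The %HRR training zones above translate into target heart rates via the Karvonen formula; a short sketch, with illustrative resting and maximal HR values, is shown below.

```python
def target_hr(hr_rest, hr_max, pct_hrr):
    """Karvonen formula: target HR = resting HR + fraction of HR reserve."""
    return hr_rest + pct_hrr * (hr_max - hr_rest)

hr_rest, hr_max = 60, 200   # illustrative values for one runner
zones = [("HIIT", 0.90, 0.95), ("MIIT", 0.60, 0.70), ("rest interval", 0.40, 0.40)]
for label, lo, hi in zones:
    print(f"{label}: {target_hr(hr_rest, hr_max, lo):.0f}"
          f"-{target_hr(hr_rest, hr_max, hi):.0f} bpm")
```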
ABSTRACT
In North America, Hispanic migrant farmworkers are exposed to occupational ergonomic risks. Due to cultural differences in the perception and reporting of effort and pain, it was unknown whether standardized subjective ergonomic assessment tools could accurately estimate their directly measured physical effort. This study investigated whether the subjective scales widely used in exercise physiology were associated with direct measures of metabolic load and muscle fatigue in this population. Twenty-four migrant apple harvesters participated in this study. The Borg RPE in Spanish and the Omni RPE with pictures of tree-fruit harvesters were used to assess overall effort at four time points during a full-day 8-h work shift. The Borg CR10 was used to assess local discomfort at the shoulders. To determine whether there were associations between the subjective and direct measures of overall exertion, we conducted linear regressions of the percentage of heart rate reserve (%HRR) on the Borg RPE and Omni RPE. For local discomfort, the median power frequency (MPF) of trapezius electromyography (EMG) was used to represent muscle fatigue. Full-day measurements of muscle fatigue were then regressed on the Borg CR10 changes from the beginning to the end of the work shift. The Omni RPE was found to be correlated with %HRR. In addition, the Borg RPE was correlated with %HRR after the break but not after work. These scales might be useful in certain situations. For local discomfort, the Borg CR10 was not correlated with the MPF of the EMG and therefore could not replace direct measurement.
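A minimal sketch of computing the median power frequency of an EMG epoch from its Welch power spectral density, the fatigue marker referred to above; the signal here is synthetic and only stands in for a real trapezius EMG recording.

```python
import numpy as np
from scipy.signal import welch

fs = 1000                                            # sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(2)
emg = rng.normal(0, 1, t.size) * np.sin(2 * np.pi * 80 * t)  # crude EMG-like signal

f, pxx = welch(emg, fs=fs, nperseg=512)              # power spectral density
cum = np.cumsum(pxx)
mpf = f[np.searchsorted(cum, cum[-1] / 2)]           # frequency splitting power in half
print(f"Median power frequency: {mpf:.1f} Hz")       # a drop over the shift suggests fatigue
```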
Subjects
Farmers, Workload, Humans, Exercise Test, Electromyography, Muscle Fatigue/physiology, Physical Exertion/physiology, Heart Rate/physiology
ABSTRACT
Although the treadmill and cycle ergometer are commonly used for exercise stress electrocardiography (ECG) testing, they are often difficult to use with children. We herein evaluated the utility and safety of the 2-minute jump test (2MJT) as a simple, alternative exercise test. One hundred patients (60 male), with an average age at study commencement of 10.7±3.5 years (mean±standard deviation) and no exercise restriction, who underwent a cardiac check-up between November 2020 and March 2022 at the study center were included. After a resting ECG was recorded, the patients jumped for 2 minutes during ECG recording, and the change in heart rate (HR), ECG findings, and occurrence of adverse events were investigated. Patients jumped 185±60 times in two minutes, and their HR increased from 76±13 beats/min at rest to 172±18 beats/min at the peak of the test. Ninety patients (90%) attained the ideal target HR of > 150 beats/min. During the recovery period after loading, five patients had abnormal ECG findings (ventricular extrasystoles, second-degree atrioventricular block, and atrial extrasystoles in two, two, and one patient, respectively), which resolved spontaneously within three minutes. Our findings suggest that the 2MJT is a useful and safe exercise test capable of inducing a sufficient increase in HR in a short time in children.
ABSTRACT
Depression in athletes is prevalent, and antidepressant treatment may have a cardiovascular impact. We present a case, documented by serial exercise testing, of exertional intolerance due to chronotropic incompetence associated with tricyclic antidepressant use. This case underscores the importance of understanding the mechanism of action and side effects of antidepressants. (Level of Difficulty: Advanced.).
ABSTRACT
Background: The percentages of heart rate (%HRR) or oxygen uptake (%V̇O2R) reserve are used interchangeably for prescribing aerobic exercise intensity due to their assumed 1:1 relationship, although its validity is debated. This study aimed to assess if %HRR and %V̇O2R show a 1:1 relationship during steady-state exercise (SSE) and if exercise intensity and duration affect their relationship. Methods: Eight physically active males (age 22.6 ± 1.2 years) were enrolled. Pre-exercise and maximal HR and V̇O2 were assessed on the first day. In the following 4 days, different SSEs were performed (running) combining the following randomly assigned durations and intensities: 15 min, 45 min, 60% HRR, 80% HRR. Post-exercise maximal HR and V̇O2 were assessed after each SSE. Using pre-exercise and post-exercise maximal values, the average HR and V̇O2 of the last 5 min of each SSE were converted into percentages of the reserves (%RES), which were computed in a 3-way RM-ANOVA (α = 0.05) to assess if they were affected by the prescription parameter (HRR or V̇O2R), exercise intensity (60% or 80% HRR), and duration (15 or 45 min). Results: The %RES values were not affected by the prescription parameter (p = 0.056) or its interactions with intensity (p = 0.319) or duration and intensity (p = 0.117), while the parameter and duration interaction was significant (p = 0.009). %HRRs and %V̇O2Rs did not differ in the 15-min SSEs (mean difference [MD] = 0.7 percentage points, p = 0.717), whereas %HRR was higher than %V̇O2R in the 45-min SSEs (MD = 6.7 percentage points, p = 0.009). Conclusion: SSE duration affects the %HRR-%V̇O2R relationship, with %HRRs higher than %V̇O2Rs in SSEs of longer duration.
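A minimal sketch of converting steady-state HR and VO2 into percentages of their reserves, the quantities compared above; all values are illustrative.

```python
def pct_reserve(value, rest, maximum):
    """Fraction of the reserve (max - rest) used, expressed as a percentage."""
    return 100 * (value - rest) / (maximum - rest)

hr_rest, hr_max = 55, 195          # bpm
vo2_rest, vo2_max = 5.0, 55.0      # ml/kg/min
hr_sse, vo2_sse = 160, 40.0        # averages over the last 5 min of an SSE

print(f"%HRR = {pct_reserve(hr_sse, hr_rest, hr_max):.1f}, "
      f"%VO2R = {pct_reserve(vo2_sse, vo2_rest, vo2_max):.1f}")
```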
Subjects
Exercise Test, Oxygen Consumption, Male, Humans, Young Adult, Adult, Heart Rate/physiology, Oxygen Consumption/physiology, Exercise/physiology, Oxygen
ABSTRACT
PURPOSE: The present study aimed at designing a simple, affordable, yet safe transfer assistive device (TAD) for lower limb impaired individuals in the context of a developing country. METHODS: A preliminary study comprising a pilot survey that incorporated stakeholders' views was carried out to design and develop the proposed device, which has a unique armpit-support feature. To evaluate the present TAD in terms of user comfort and level of physical strain, 19 healthy students serving as "patients" participated in a laboratory-simulated setting. Data were collected on users' physiologic effort and rating of perceived exertion (RPE) using a heart rate monitoring device (Polar RS 400) and the Borg scale, respectively. RESULTS: Statistical analysis showed that the regression equation for predicting RPE from HR accounted for 31.3% of the variance in RPE, and the ANOVA indicated that the model was statistically significant (p < 0.013). The estimated strain level was computed in terms of %HRR; the physical strain averaged over the subjects who performed the task (n = 19) was 16.21 ± 7.64% HRR (mean ± SD), a relatively low strain level compared with previous research reports. CONCLUSION: The present device offers a potentially affordable solution for reducing the fatigue and strain that may develop during unassisted transfers. In addition, the unique armpit-support feature contributed to dynamic contact force reduction and serves as a double safety support. Implications for rehabilitation: the use of a transfer assistive device is associated with increased patient satisfaction and privacy of users; it improves patient adherence and cooperation with caregivers in rehabilitation centers; and it encourages rehabilitation settings to use transfer assistive devices instead of manual handling, thereby supporting recovery.
Subjects
Self-Help Devices, Wheelchairs, Developing Countries, Humans, Patient Satisfaction, Surveys and Questionnaires
ABSTRACT
This study explored the correlation between heart rate reserve (HRR) and coronary flow velocity reserve (CFVR), using adenosine stress echocardiography (SE), in patients with angina and no obstructive coronary artery disease (ANOCA). A total of 111 ANOCA patients who underwent adenosine SE were enrolled and divided into two groups: an impaired CFVR group (CFVR < 2) and a control group (CFVR ≥ 2). The relationships between HRR and impaired CFVR were explored overall and in subgroups by sex. A reduced HRR during adenosine infusion was seen in ANOCA patients with impaired CFVR (25.73 ± 8.39 vs. 34.30 ± 19.93, P < 0.001). Compared with their respective controls, the blunted HRR response to adenosine was more pronounced in female patients (women: 27.21 ± 8.01 vs. 39.48 ± 10.57, P < 0.001; men: 24.05 ± 8.70 vs. 29.12 ± 8.69, P = 0.041). A strong association between CFVR and a blunted HRR was observed in women (r = 0.46, P < 0.001), whereas no association was found in men (r = 0.18, P = 0.199). In women, multivariate logistic regression identified HRR as the strongest negative predictor of impaired CFVR [HR (95% CI) = 0.854 (0.764-0.956), P = 0.006]. Based on the ROC curve, HRR < 35% was a strong indicator of impaired CFVR, with an AUC of 0.838, sensitivity of 70%, and specificity of 87% in females. A blunted HRR was seen in patients with impaired CFVR, with the most pronounced effect in female patients. A blunted HRR < 35% is intricately linked to impaired CFVR in women with ANOCA beyond the value of traditional risk factors, and could ultimately contribute to risk stratification of impaired CFVR in such patients.
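A rough sketch of the ROC analysis described, using synthetic data: because a lower HRR indicates impairment, the negated HRR serves as the classifier score, and the Youden index picks a cutoff analogous to the reported HRR < 35%.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(3)
impaired = rng.integers(0, 2, 200)                        # 1 = impaired CFVR (< 2)
hrr = np.where(impaired, rng.normal(26, 8, 200), rng.normal(37, 10, 200))

auc = roc_auc_score(impaired, -hrr)                       # lower HRR -> higher risk
fpr, tpr, thr = roc_curve(impaired, -hrr)
best = np.argmax(tpr - fpr)                               # Youden index
print(f"AUC = {auc:.2f}, optimal cutoff ~ HRR < {-thr[best]:.0f}%")
```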
ABSTRACT
This study aimed to examine the changes in blood fatigue indicators, inflammatory markers, and stress hormones following 8 weeks of intensity interval training in sprinters, and to investigate the effects on changes in their 100-m sprint records. Twenty sprinters from a boys' high school were equally assigned to high-intensity and medium-intensity interval training groups, and three 60-min interval training sessions were performed per week for 8 weeks, for a total of 24 sessions. Exercise intensity was defined as 85%-95% and 75%-85% of heart rate reserve for high- and medium-intensity training, respectively. At rest, both groups had an exercise intensity of 60% of heart rate reserve. Our results showed decreased fatigue indicators, inflammatory markers, and stress hormone levels after both high-intensity and medium-intensity interval training, with no difference between the training levels. In addition, the 100-m sprint records differed between the high- and medium-intensity interval training groups, based on the lactate dehydrogenase and adrenocorticotropic hormone levels. In conclusion, medium-intensity interval training at a heart rate reserve of ≥75% can have a positive effect on blood fatigue indicators, inflammatory markers, and stress hormones in sprinters. Specifically, the changes in adrenocorticotropic hormone level seen in the high-intensity interval training group were found to have a significant effect on the 100-m sprint records.
ABSTRACT
BACKGROUND: Age-predicted maximum heart rate (APMHR) has been demonstrated to be a poor predictor of future cardiovascular (CV) events and has yet to be validated as a termination point during exercise testing. In contrast, maximum rate pressure product (MRPP) is recognized as a strong predictor of CV outcome, with superior CV event prediction over APMHR. Heart rate reserve (HRR) has been shown to be a powerful predictor of CV mortality during exercise testing; however, thus far this has not been confirmed for non-fatal CV events. The aim of this study was to compare APMHR, MRPP and HRR as predictors of CV events following otherwise negative exercise treadmill testing. METHODS: After exclusions, 1080 patients being investigated for coronary artery disease performed an exercise stress echocardiogram (ESE) to volitional fatigue on a motorised treadmill. Blood pressure was measured manually, and ultrasound images were acquired as per current American Society of Echocardiography guidelines. Rate pressure product and HRR were calculated throughout the test and maximum values were identified. Patients were followed for a mean of 5.3±2.6 years. RESULTS: From receiver operating characteristic analysis, cut points were established for APMHR (94.6%; AUC 0.687), MRPP (25,085; AUC 0.729) and HRR% (95.9; AUC 0.688). MRPP outperformed both APMHR and HRR% for the prediction of future CV events. Furthermore, on Cox proportional hazards analysis MRPP was the strongest uni- and multivariate predictor (p<0.0001), with APMHR and HRR% failing to reach statistical significance. CONCLUSIONS: The current study demonstrates the substantial prognostic power of MRPP over both APMHR and HRR% to predict CV events following an otherwise negative ESE for myocardial ischemia.
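A minimal sketch of the three exercise-test predictors compared above, assuming the conventional 220 − age estimate for APMHR (the abstract does not state the equation used) and applying the reported cut points for illustration.

```python
def est_predictors(age, hr_rest, hr_peak, sbp_peak):
    apmhr = 220 - age                                # assumed prediction equation
    pct_apmhr = 100 * hr_peak / apmhr                # % of APMHR achieved
    mrpp = hr_peak * sbp_peak                        # rate pressure product (bpm x mmHg)
    pct_hrr = 100 * (hr_peak - hr_rest) / (apmhr - hr_rest)
    return {
        "pct_APMHR": pct_apmhr, "above_cut_APMHR": pct_apmhr >= 94.6,
        "MRPP": mrpp,           "above_cut_MRPP": mrpp >= 25085,
        "pct_HRR": pct_hrr,     "above_cut_HRR": pct_hrr >= 95.9,
    }

print(est_predictors(age=58, hr_rest=68, hr_peak=155, sbp_peak=175))
```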
ABSTRACT
BACKGROUND: Exercise performance depends on the ability of the cardiovascular system to respond to a wide range of metabolic demands and physical exertion. OBJECTIVES: To investigate the effects of habitual smoking on heart rate response and heart rate recovery after a step test in athletes. METHODS: Seventy-eight physically healthy, active athletes (45 non-smokers and 33 smokers) aged 27±8 years participated in this study. All participants completed the International Physical Activity Questionnaire (IPAQ) and performed the six-minute step test. Cardiovascular parameters (resting heart rate, peak heart rate, heart rate at 1 min after testing, heart rate recovery, recovery time, blood pressure at rest, and post-test blood pressure) were recorded. RESULTS: Smoker athletes had a higher resting heart rate (76 ± 9 bpm vs. 72 ± 10 bpm, p<0.05), maximum heart rate (154 ± 18 bpm vs. 147 ± 17 bpm, p<0.05) and recovery time (7 min 25 sec ± 6 min 31 sec vs. 4 min 21 sec ± 4 min 30 sec, p<0.05) than non-smoker athletes. Scores from the IPAQ were approximately the same (M=7927 ± 10303 vs. M=6380 ± 4539, p<0.05). CONCLUSION: Smoking was found to affect athletes' cardiovascular fitness. Changes in athletes' heart rate recovery and recovery time reflect the adaptation of cardiovascular function to training requirements.
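Heart rate recovery here is presumably the drop from peak HR to the HR measured 1 min after the test; a tiny sketch, with invented 1-min values, is shown below.

```python
def heart_rate_recovery(hr_peak, hr_1min_post):
    """Assumed definition: peak HR minus HR one minute after the step test."""
    return hr_peak - hr_1min_post

# Peak HRs match the reported group means; the 1-min values are invented
print("smokers:", heart_rate_recovery(154, 132), "bpm")
print("non-smokers:", heart_rate_recovery(147, 118), "bpm")
```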
Subjects
Athletes, Exercise Test/methods, Exercise/physiology, Heart Rate/physiology, Non-Smokers, Smokers, Adult, Humans, Male, Oxygen Consumption, Physical Endurance/physiology, Young Adult
ABSTRACT
BACKGROUND: Previously, young males administered 200 mg/week of testosterone enanthate during 28 days of energy deficit (EDef) gained lean mass and lost less total mass than controls (Optimizing Performance for Soldiers I study, OPS I). Despite that benefit, physical performance deteriorated similarly in both groups. However, some experimental limitations may have precluded detection of performance benefits, as the performance measures employed lacked military relevance, and the EDef employed did not elicit the magnitude of stress typically experienced by Soldiers conducting operations. Additionally, the testosterone administered required weekly injections, elicited supra-physiological concentrations, and caused marked suppression of endogenous testosterone upon cessation. Therefore, this follow-on study will address those limitations and examine testosterone's efficacy for preserving Soldier performance during strenuous operations. METHODS: In OPS II, 32 males will participate in a randomized, placebo-controlled, double-blind trial. After baseline testing, participants will be administered either testosterone undecanoate (750 mg) or placebo before completing four consecutive, 5-day cycles simulating a multi-stressor, sustained military operation (SUSOPS). SUSOPS will consist of two low-stress days (1000 kcal/day exercise-induced EDef; 8 h/night sleep), followed by three high-stress days (3000 kcal/day and 4 h/night). A 23-day recovery period will follow SUSOPS. Militarily relevant physical performance is the primary outcome. Secondary outcomes include 4-compartment body composition, muscle and whole-body protein turnover, intramuscular mechanisms, biochemistries, and cognitive function/mood. CONCLUSIONS: OPS II will determine whether testosterone undecanoate safely enhances performance, while attenuating muscle and total mass loss, without impairing cognitive function, during and in recovery from SUSOPS. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT04120363.
ABSTRACT
Cardiovascular disease (CVD) risk factors cluster in an individual. Exercise is universally recommended to prevent and treat CVD. Yet, clinicians lack guidance on how to design an exercise prescription (ExRx) for patients with multiple CVD risk factors. To address this unmet need, we developed a novel clinical decision support system to prescribe exercise (prioritize personalize prescribe exercise [P3-EX]) for patients with multiple CVD risk factors, founded upon the evidence-based recommendations of the American College of Sports Medicine (ACSM) and the American Heart Association. To develop P3-EX, we integrated (1) the ACSM exercise preparticipation health screening recommendations; (2) an adapted American Heart Association Life's Simple 7 cardiovascular health scoring system; (3) adapted ACSM strategies for designing an ExRx for people with multiple CVD risk factors; and (4) the ACSM frequency, intensity, time, and type (FITT) principle of ExRx. We have tested the clinical utility of P3-EX within a university-based online graduate program in ExRx among students including physicians, physical therapists, registered dietitians, exercise physiologists, kinesiologists, fitness industry professionals, and kinesiology educators in higher education. The support system P3-EX has proven to be an easy-to-use, guided, and time-efficient evidence-based approach to ExRx for patients with multiple CVD risk factors that has applicability to other chronic diseases and health conditions. Further evaluation is needed to better establish its feasibility, acceptability, and clinical utility as an ExRx tool.
ABSTRACT
BACKGROUND: Peak oxygen uptake (peak VO2) and heart rate reserve (HRR) are independent prognostic markers of cardiovascular disease. However, the impact of peak VO2 and HRR on long-term prognosis after off-pump coronary artery bypass grafting (OP-CABG) remains unclear. HYPOTHESIS: To determine the prognostic impact of peak VO2 and HRR in patients after OP-CABG. RESULTS: We enrolled 327 patients (mean age, 65.1 ± 9.3 years; male, 80%) who underwent OP-CABG and participated in early phase II cardiac rehabilitation. All participants underwent cardiopulmonary exercise testing (CPET) at the beginning of such rehabilitation. Overall, 48 (14.6%) patients died during the median follow-up period of 103 months. Non-survivors had significantly lower peak VO2 (10.6 ± 0.5 vs. 13.7 ± 0.2 ml/kg/min, p < .01) and HRR (24.2 ± 1.8 vs. 32.7 ± 0.8 beats/min, p < .01) than survivors. In both groups, peak VO2 significantly correlated with HRR (p < .01). Moreover, patients were divided into four groups according to peak VO2 and HRR levels for predicting total mortality. The low-peak VO2/low-HRR group had a significantly higher mortality risk than the other groups (hazard ratio, 5.61; 95% confidence interval, 2.59-12.16; p < .01). After adjusting for confounding factors, peak VO2 and HRR were independently associated with total mortality (both p < .05). CONCLUSIONS: HRR is a simple parameter of CPET and an important prognostic marker for the risk stratification of total mortality, even in patients with low peak VO2 after OP-CABG.
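A rough sketch (synthetic data, illustrative cut points) of the kind of survival analysis described: patients flagged as low/high peak VO2 and low/high HRR, with a Cox proportional hazards model for total mortality, here via the lifelines package.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 327
peak_vo2 = rng.normal(13.5, 3.0, n)                 # ml/kg/min (synthetic)
hrr = rng.normal(32, 8, n)                          # beats/min (synthetic)
low_vo2 = (peak_vo2 < 13.5).astype(int)             # illustrative cut point
low_hrr = (hrr < 30).astype(int)                    # illustrative cut point

# Synthetic follow-up times and deaths, with higher hazard for low-VO2/low-HRR
risk = 0.8 * low_vo2 + 0.8 * low_hrr
time = rng.exponential(100 / np.exp(risk))          # months of follow-up
event = (rng.uniform(size=n) < 0.15 * np.exp(risk) / np.exp(risk).max()).astype(int)

df = pd.DataFrame({"time": time, "event": event, "low_vo2": low_vo2, "low_hrr": low_hrr})
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
cph.print_summary()                                  # hazard ratios for the two flags
```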