Modelling the HRV Response to Training Loads in Elite Rugby Sevens Players

New paper in collaboration with my colleagues Sean Williams, Dan Howells et al. Full-text link below.

Modelling the HRV Response to Training Loads in Elite Rugby Sevens Players

Key Points

  • A systems theory approach can be used to describe the variation in chronic HRV responses to training within elite Rugby Sevens players.
  • For the majority of athletes, model parameters can be used to accurately predict future responses to training stimuli.
  • Responses that diverge from the predicted values may serve as a useful flag for the investigation of changes in lifestyle factors.
  • Internal training load measures (sRPE) markedly outperformed external load measures (HSD) in predicting future HRV responses to training stimuli.

Abstract

A systems modelling approach can be used to describe and optimise responses to training stimuli within individuals. However, the requirement for regular maximal performance testing has precluded the widespread implementation of such modelling approaches in team-sport settings. Heart rate variability (HRV) can be used to measure an athlete’s adaptation to training load, without disrupting the training process. As such, the aim of the current study was to assess whether chronic HRV responses, as a representative marker of training adaptation, could be predicted from the training loads undertaken by elite Rugby Sevens players. Eight international male players were followed prospectively throughout an eight-week pre-season period, with HRV and training loads (session-RPE [sRPE] and high-speed distance [HSD]) recorded daily. The Banister model was used to estimate vagally-mediated chronic HRV responses to training loads over the first four weeks (tuning dataset); these estimates were then used to predict chronic HRV responses in the subsequent four-week period (validation dataset). Across the tuning dataset, high correlations were observed between modelled and recorded HRV for both sRPE (r = 0.66 ± 0.32) and HSD measures (r = 0.69 ± 0.12). Across the sRPE validation dataset, seven of the eight athletes met the criterion for validity (typical error <3% and Pearson r >0.30), compared to one athlete in the HSD validation dataset. The sRPE validation data produced likely lower mean bias values, and most likely higher Pearson correlations, compared to the HSD validation dataset. These data suggest that a systems theory approach can be used to accurately model chronic HRV responses to internal training loads within elite Rugby Sevens players, which may be useful for optimising the training process on an individual basis.
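
For readers unfamiliar with the Banister model referenced in the abstract, here is a minimal Python sketch of the two-component impulse-response structure, fit to synthetic data. All loads, parameter values, and noise levels below are made up for illustration; they are not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def banister_response(loads, p0, k1, tau1, k2, tau2):
    """Daily predicted response: baseline (p0) plus a slow-decaying
    'fitness' term minus a fast-decaying 'fatigue' term, each driven
    by the preceding training loads."""
    pred = np.empty(len(loads))
    fitness = fatigue = 0.0
    for t, w in enumerate(loads):
        pred[t] = p0 + k1 * fitness - k2 * fatigue
        # today's load only influences subsequent days
        fitness = fitness * np.exp(-1.0 / tau1) + w
        fatigue = fatigue * np.exp(-1.0 / tau2) + w
    return pred

def fit_banister(loads, observed, x0):
    """Least-squares estimate of the five model parameters."""
    sse = lambda p: float(np.sum((observed - banister_response(loads, *p)) ** 2))
    return minimize(sse, x0, method="Nelder-Mead",
                    options={"maxiter": 20000, "fatol": 1e-8}).x

# Synthetic 4-week "tuning" block: daily sRPE-style loads and a noisy
# LnRMSSD-like response generated from known (invented) parameters
rng = np.random.default_rng(1)
loads = rng.uniform(200.0, 600.0, 28)
observed = banister_response(loads, 80.0, 0.004, 25.0, 0.010, 6.0)
observed = observed + rng.normal(0.0, 0.3, 28)

params = fit_banister(loads, observed, x0=(80.0, 0.003, 20.0, 0.008, 5.0))
pred = banister_response(loads, *params)
r = float(np.corrcoef(observed, pred)[0, 1])  # tuning-set correlation
```

In the study itself, parameters tuned on the first four weeks were then used to predict the subsequent four weeks, judged against the criterion of a typical error <3% and Pearson r >0.30.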

HRV responses to in-season training among D-1 college football players

During spring training camp, we found that Linemen demonstrated the greatest reductions in LnRMSSD at ~20 h post-training, followed by Mid-Skill and Skill, possibly reflecting inadequate cardiovascular recovery between consecutive-day sessions for the larger players, despite lower PlayerLoad values. (Full-text available here)

Our first follow-up study during the early part of the competitive season found the same position-based trend, where Linemen demonstrated the greatest reductions in LnRMSSD at ~20 h post-training, followed by Mid-Skill and Skill. However, the magnitude of the in-season reductions in LnRMSSD was smaller relative to spring camp. We speculate that both reduced PlayerLoad values (15-22% lower than spring camp) and adaptation to intense preseason training in the heat and humidity during the preceding weeks account for the smaller LnRMSSD reductions observed during the early part of the competitive season. (Full-text available here: Cardiac_Autonomic_Responses_to_In_Season_Training_Among_Division1_College_Football_Players)

Cardiac-Autonomic Responses to In-Season Training Among Division-1 College Football Players.

Despite having to endure a rigorous in-season training schedule, research evaluating daily physiological recovery status markers among American football players is limited. The purpose of this study was to determine if recovery of cardiac-autonomic activity to resting values occurs between consecutive-day, in-season training sessions among college football players. Subjects (n = 29) were divided into groups based on position: receivers and defensive backs (SKILL, n = 10); running backs, linebackers and tight-ends (MID-SKILL, n = 11) and linemen (LINEMEN, n = 8). Resting heart rate (RHR) and the natural logarithm of the root-mean square of successive differences multiplied by twenty (LnRMSSD) were acquired at rest in the seated position prior to Tuesday and Wednesday training sessions and repeated over three weeks during the first month of the competitive season. A position × time interaction was observed for LnRMSSD (p = 0.04), but not for RHR (p = 0.33). No differences in LnRMSSD between days were observed for SKILL (Tuesday = 82.8 ± 9.3, Wednesday = 81.9 ± 8.7, p > 0.05). Small reductions in LnRMSSD were observed for MID-SKILL (Tuesday = 79.2 ± 9.4, Wednesday = 76.2 ± 9.5, p < 0.05) and LINEMEN (Tuesday = 79.4 ± 10.5, Wednesday = 74.5 ± 11.5, p < 0.05). The individually averaged changes in LnRMSSD from Tuesday to Wednesday were related to PlayerLoad (r = 0.46, p = 0.02) and body mass (r = -0.39, p = 0.04). Cardiac-parasympathetic activity did not return to resting values for LINEMEN or MID-SKILL prior to the next training session. Larger reductions in LnRMSSD tended to occur in players with greater body mass despite having performed lower workloads, though some individual variability was observed. These findings may have implications for how coaches and support staff address training and recovery interventions for players demonstrating inadequate cardiovascular recovery between sessions.

Figure 1
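
As a concrete illustration of the LnRMSSD metric used in the abstract above (the natural log of RMSSD in milliseconds, multiplied by twenty), here is a short Python sketch; the R-R interval series is invented for the example.

```python
import math

def ln_rmssd(rr_ms):
    """LnRMSSD on the 0-100 scale used in the study: the natural log
    of RMSSD (root-mean square of successive R-R interval differences,
    in ms), multiplied by 20."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return 20.0 * math.log(rmssd)

# Hypothetical resting R-R intervals in milliseconds
rr = [812, 845, 790, 860, 798, 830, 805, 851]
score = ln_rmssd(rr)
```

A score in the high 70s to low 80s on this scale is consistent with the Tuesday resting values reported for the position groups above.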

Our next paper, currently in production, will feature HRV responses among positions throughout the entire preparatory and competitive season.

Interpreting HRV Trends in Athletes: High Isn’t Always Good and Low Isn’t Always Bad

This article was written for the FreelapUSA site. The intro is posted below. You can follow the link for the full article. Thanks to Christopher Glaeser from Freelap for inviting my contribution as I’ve found this site to be a great resource.

Interpreting HRV Trends in Athletes: High Isn’t Always Good and Low Isn’t Always Bad

Heart rate variability (HRV) monitoring has become increasingly popular in both competitive and recreational sports and training environments due to the development of smartphone apps and other affordable field tools. Though the concept of HRV is relatively simple, its interpretation can be quite complex. As a result, considerable confusion surrounds HRV data interpretation. I believe much of this confusion can be attributed to the overly simplistic guidelines that have been promoted for the casual, non-expert end user.

In the context of monitoring fatigue or training status in athletes, a common belief is that high HRV is good and low HRV is bad. In terms of the overall trend, the corresponding belief is that increasing HRV trends are good, indicating positive adaptation or increases in fitness, while decreasing trends are bad, indicating fatigue accumulation or “overtraining” and performance decrements. In this article, I address the common notions of both acute and longitudinal trend interpretation, and discuss why and when these interpretations may or may not be appropriate. We will briefly explore where these common interpretations or “rules” come from within the literature, and then discuss some exceptions to these rules.

Continue reading article on FreelapUSA