When it comes to analysis, it’s easy to be blinkered and focused purely on outcomes. In the context of high-level sport, this is obvious: the match. Watch games, find flaws and strengths, and draw conclusions from there. Make no mistake – matches and competition are where you can gather the most efficient data points. However, when you take a step back from it all, how much of a team’s week is actually oriented this way, i.e. playing games? Not much. Within a typical schedule there will be 1-2 matches and 1 full rest day, while the other 5 days will be training: the balance and allocation of resources/workflows is all off.
Coaches and their support staff put great effort into designing training sessions on the pitch, but the actual diagnosis of the effectiveness of training and the growth of the team/individual is an afterthought. How, as analysts, can we help change that by increasing our outputs in these settings and bridging the gap between practice (where much of development is made) and competitive fixtures (where this development is shown)? This article is just that: a “how-to” of sorts that presents ways to maximise training workflows and the outputs gathered from practice.

Video Analysis
Video is the most natural way to aid development in training – pretty much every session at the top level is filmed, and there are a number of ways you can use this footage to both improve the quality of sessions going forward and enhance how players individually/collectively learn from their work on the training ground. Even if you don’t have access to high quality equipment or software, the concepts presented will help you fine-tune your collection of video in a way which supports more efficient information gathering.
The Overall/Collective
First off, beyond simply recording all of your training sessions, the analyst needs to find simple, effective ways to break down that film and use it to their advantage. A usual practice runs around ninety minutes to two hours – “splitting” it out drill by drill into digestible sizes is massively important. However, even within that there can be a lot of unnecessary filler. When creating a live coding window, I always tag each bout of a particular drill with a “Successful” or “Unsuccessful” tag. Even the best organised, most thought-out training sessions will have huge margins of error (after all, that’s the point of training), so adding these extra filters onto your sessions cuts down on the time you need in the review process. Going into your work week, that day’s exercises should always have a goal in mind:
- What concepts of our game model do we want to emphasise?
- How can the above connect to that week’s match/opponents?
- Can we do all of this effectively without overloading the players?
The aforementioned tags can be used for the players – i.e. daily video review sessions which help display coaching concepts – or internally amongst the staff: for databasing purposes (allowing them to plan future sessions or tweak them as needed), and for coach development, to see how they themselves can improve.

The Individual
In terms of the individual side of video analysis, alongside the tags mentioned previously, I always make sure my Sportscode windows (or whatever analysis software I’m using) feature the ability to tag every specific player who is partaking in an exercise. When dealing with a squad of 20+ players, these basic additions allow you to further cut the unnecessary fat off your drills. On top of this, you can also add similar “Success/Failure” tags for individual players for use in one-on-one review sessions. In my past work with teams, I always made it a point to sit down with individuals as much as possible to provide them with personalised video coaching. This kind of private analysis is very commonly used with match footage – the day after a game, you sit down with an athlete and walk through the positives and negatives. However, using training footage in addition to this gives the player a better picture of what they need to improve on. While matches contain obvious external factors which inhibit (or simply make it tougher for) players to show off various skillsets, the demands placed on them in training are hyper-focused and singular by comparison. By showing players how they perform in training sessions and pointing out where they need to improve and where they shine, the picture is crystal clear – it requires much less “translation.” Video is the best tool for this in general, but making sure your review process is tailored to the individual is massively important.
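To make the tagging idea concrete, here is a minimal Python sketch of what per-player tagged instances might look like once pulled out of a coding window. The field names, players, and timestamps are hypothetical illustrations, not any software’s actual export format:

```python
from collections import namedtuple

# One tagged instance from a session timeline (illustrative fields only).
Clip = namedtuple("Clip", ["drill", "bout", "player", "outcome", "start_s", "end_s"])

session = [
    Clip("pressing_triggers", 1, "Player A", "Successful", 120, 135),
    Clip("pressing_triggers", 1, "Player B", "Unsuccessful", 120, 135),
    Clip("pressing_triggers", 2, "Player A", "Unsuccessful", 300, 318),
    Clip("rondo_6v2", 1, "Player A", "Successful", 900, 960),
]

def player_clips(clips, player, outcome=None):
    """Cut the unnecessary fat: keep only the clips featuring one player,
    optionally filtered by Success/Failure for a one-on-one review."""
    return [c for c in clips
            if c.player == player and (outcome is None or c.outcome == outcome)]

# Pull everything "Player A" got wrong, ready for a personalised session.
review = player_clips(session, "Player A", outcome="Unsuccessful")
```

With a 20+ player squad, a filter like this is what turns two hours of footage into a five-minute individual review reel.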
Data Analysis
The data side of training analysis is arguably the toughest to define and gauge accurately – even the most sophisticated clubs don’t have outputs in practice settings (outside of physical/GPS data), so this has largely been an afterthought in most scenarios. Data providers don’t collect training data, for obvious practical reasons, so if you want objective, evidence-based metrics throughout the week, you need to be creative and do a bit of leg work.
Basic Concepts
The starting point when looking to self-collect data is what is important to you – namely, the team’s game model and KPIs: any extra “noise” in your collection will be wasted time. Looking at the example below, you have to find home-brewed ways to quantify and assess all of these concepts to match. In a previous article I’ve written here on Soccer Detail for my Analysis In Action series, I go into more detail on how to self-collect data in this manner – which can be found here.

Once you’ve found the KPIs in accordance with your team, it all comes down to how you actually want to collect the data in practice – video, as ever, is the best platform in a training scenario, linking back to the previous section. If you have access to advanced video analysis software, you can tag these KPIs during or after training to produce some kind of objective measurement of success/failure/overall concepts in this environment. To give a scenario: if you were training pressing triggers, you could easily gain basic percentages of how your team performed in regaining the ball by simply adding tags to every round of a drill. If this is done consistently, you can see how your success rate trends over a period of weeks – allowing the analyst to provide the coaches with information on whether the sessions they are putting on are having a tangible effect.
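Sticking with the pressing-triggers scenario, a short Python sketch shows how those per-round tags could become a weekly trend. The data here is invented purely for illustration:

```python
from collections import defaultdict

# Each entry: (training week, did we regain the ball this round?)
# – exactly the Success/Failure tag applied live to each round of the drill.
rounds = [
    (1, True), (1, False), (1, False), (1, True),
    (2, True), (2, True), (2, False),
    (3, True), (3, True), (3, True), (3, False),
]

def weekly_success_rate(tagged_rounds):
    """Basic percentage of successful regains per training week."""
    totals, wins = defaultdict(int), defaultdict(int)
    for week, regained in tagged_rounds:
        totals[week] += 1
        wins[week] += int(regained)
    return {week: round(100 * wins[week] / totals[week], 1)
            for week in sorted(totals)}

trend = weekly_success_rate(rounds)  # {1: 50.0, 2: 66.7, 3: 75.0}
```

A rising curve like this is exactly the evidence a coach needs that the pressing work is landing; a flat or falling one is a prompt to redesign the drill.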
The ability of instructors in the sports world to accurately remember events lies somewhere around 40-60%, as per a study by Peter Laird & Laura Waters published in the International Journal of Performance Analysis in Sport in 2008. Arming yourself with this information in both the training and match environments empowers you and the rest of the staff to have increased clarity in all your processes. Even if you’re a performance analyst, coach, etc. without software which allows for highly sophisticated tagging of phases for each drill, you can still do this sort of collection in whatever way you can – handwritten notes will do just fine!
Applying/Using The Data
Once you have all your data from training – be it linked to video (my preferred method), in basic spreadsheet format, or even handwritten notes – how can you apply it and make sense of it? After all, the hard part about training is parsing the information: the demands placed on players are obviously very different from a match, and metrics don’t translate across at a 1:1 ratio. Because of this, there is some level of subjectivity to it all (alongside the basic fact that most of your metrics are based off a subjective model of play). To put it quite simply, there is no real right or wrong way, but I can share how I like to present training data, based on my experience.
Ultimately, what is the end goal of training/practice? To prepare the team for match day. I’d say the best word to describe the data you collect and categorise outside of game day would be “proxy” metrics. While there is always going to be some level of subjectivity to training data, arming yourself with as much information as possible helps you gauge collective/individual performance across multiple periods of the week – training is where growth and success are built: gaining the full picture in this area allows you to see more clearly why in-game results are the way they are.

The presentation of data is a key part of performance analysis on the whole, and these home-brewed metrics are no different. Personally, I always like to tie everything I show off to video – it’s the most relatable medium across the board. Because of this, I tie all the numbers and information into the video analysis sessions I put on: not so much “literally” showing the numbers (i.e. the pressing example I gave earlier), but using the data to inform you and the staff of what needs to be trained more or less throughout the week. If you are consistently failing to get what you want from sessions focusing on final third play, and the numbers match that, it is a good sign that you need to put more effort and time into this side of the game.
Databasing/Organisation
This concept has been touched on throughout, but having a sophisticated databasing system ties all sides of video and data analysis in a training setting together – maximising their effectiveness and helping you do what is actually important: the analysis itself. If you utilise all the concepts highlighted before, it’ll make this area much simpler.
If you film every session throughout the season, you will have a lot of footage at your disposal, but you can easily get lost in the sheer depth of it all. Personally, while I do record practices from start to finish, I never upload the full recordings to the video hosting server/service of my choice. In training (as in matches, but that’s another subject entirely) there is a lot of dead time – essentially, periods which are not important. I’ve already mentioned ways to tag video to break it down or split it up; once you have these sessions in their many specific parts, you need to organise them in a way which makes sense:
- Date of the session
- Phase of play (when thinking of the match) which is focused on
- Specific demands of the drill (11v11 vs. small sided, man-up v. man down, shadow play v. opposed, etc.)
If you look at your footage through this lens, you can categorise it in a way which is easy to understand – whenever you need to reference back to a session to show it to your players in film review, or to see how a particular drill ran (the design concepts, etc.) and what went well or poorly, it’ll be a very simple process.
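As a sketch of that organisation in practice, the three organising principles above (date, phase of play, drill demands) can be treated as plain metadata fields and filtered on demand. The file names and field values here are purely illustrative, not a prescribed naming scheme:

```python
# Hypothetical catalogue of split drill clips, one dict per clip.
catalogue = [
    {"date": "2024-03-04", "phase": "build-up", "format": "11v11", "file": "s01_buildup.mp4"},
    {"date": "2024-03-05", "phase": "pressing", "format": "small-sided", "file": "s02_press.mp4"},
    {"date": "2024-03-12", "phase": "pressing", "format": "11v11", "file": "s03_press.mp4"},
]

def find_sessions(entries, **criteria):
    """Return every clip whose metadata matches all the given criteria,
    e.g. all pressing work regardless of drill format."""
    return [e for e in entries
            if all(e.get(k) == v for k, v in criteria.items())]

pressing_clips = find_sessions(catalogue, phase="pressing")
```

Whether this lives in a spreadsheet, a folder convention, or your video platform’s own labels matters far less than the fields staying consistent across the season.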
The data side of the organisational process is much the same – if you have clear, concise metrics that remain consistent throughout the season, looking back at the data keeps it all sensible and applicable. How you do this is up to you: you can link it all to your video, build bespoke dashboards to reference back to, and more – the choice is yours. However, it’s vitally important that it doesn’t slow down your workflows and become a constant inconvenience to work with.
Final Thoughts
Ultimately, this article reflects how I’ve used training to my advantage throughout my professional career – you might disagree with some parts of how to go about it in practice (pun intended), but regardless, training is very much an undervalued part of the analysis workflow. If you only use matches as a way to gauge individual and team performance, you’ll have at most two per week to dissect; a more refined/regimented approach to training breakdowns increases that tenfold. Whether it be through complex breakdowns of footage that match your team’s style of play, or self-collecting data to add objective numbers to practice, there is so much you can add in this field. As an analyst (or a coach, if this is your remit), your job is to find ways to empower those you work alongside with information, and the training pitch is where this information has been an afterthought for far too long. Regardless of the level you work at, all the concepts presented are applicable – it just requires the workflows to be put in place! From there, the insights drawn from training will be of vital importance.