This is my first piece on cricket and quite possibly the only one for the foreseeable future. It's a game I followed a lot as a kid, so I'm familiar with how it works, but I'm a noob at its modern-day intricacies. I'll keep it short, technical and to-the-point. Non-followers of cricket or corporate finance may find it difficult to relate to this article.
So, I was going through an article the other day about a metric that is supposed to indicate how useful a player is in T20 cricket. Similar to how batting average is an excellent indicator of a batsman's performance in tests, the metric strike-rate + average is used as a rough measure of a player's effectiveness in T20s. My short criticism: strike-rate is what a player scores per 100 balls, while average is what he scores before getting out, so the metric is biased towards players with high strike-rates, since on average players (batsmen, bowlers and keepers) get out well before facing 100 balls. Now, one can argue that strike-rates matter more than averages in T20s and that the metric accommodates this increased importance by implicitly giving more weight to the strike-rate. However, the likelihood that this implicit weightage is correct is very low (although the metric scores very high on simplicity, since strike-rates and averages are readily available). I would instead suggest calculating the average number of balls a player faces before getting out, and from that, the actual weights that should be given to the average and strike-rate respectively. But why should strike-rates have a higher weightage in T20s at all?
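To make that suggestion concrete, here's a rough sketch in Python of the kind of calculation I mean. The weighting rule and all the numbers are my own assumptions for illustration, not anything from the article:

```python
# Sketch: weight average vs strike-rate by balls faced per dismissal.
# The weighting rule and the career numbers below are illustrative
# assumptions, not the article's actual formula.

def weighted_metric(runs, balls, dismissals):
    average = runs / dismissals           # runs per dismissal
    strike_rate = 100 * runs / balls      # runs per 100 balls
    balls_per_dismissal = balls / dismissals
    # If a player typically faces far fewer than 100 balls, his average is
    # earned over much less exposure than the strike-rate's 100-ball basis;
    # scale the strike-rate's weight accordingly (one possible choice).
    w_sr = min(balls_per_dismissal / 100, 1.0)
    return w_sr * strike_rate + (1 - w_sr) * average

# Made-up career line: 1200 runs off 800 balls, dismissed 40 times.
print(weighted_metric(runs=1200, balls=800, dismissals=40))
```

The point is only that the weight should come out of the data (balls faced per dismissal) rather than being fixed at an implicit 50:50 by simple addition.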
Now, anyone with even the most remote knowledge of cricket would know by intuition that strike-rates matter more in the shorter versions of the game than in longer versions like tests. It works something like this: when there is a smaller risk of the team being bowled out well before its allotted overs (or, in tests, time, to avoid draws), the team can afford to lose wickets sooner and thus play a high-risk game that allows a few wickets to fall. High risk tends to give high returns: players are expected to (and do) score much faster in the shorter versions of the game, but they may get out quicker. In the longer versions, the cost of falling wickets is much higher, and so scoring runs matters more than scoring runs fast. To draw an imperfect comparison with the world of credit risk and finance: a team getting bowled out well before its allotted quota of overs or time = default (it must be noted that there is no allotted quota of survival for corporations); a faster scoring rate = financial or other forms of leverage (let's keep it financial leverage to keep things simple); and the different forms of cricket represent different worlds in which the risk of default is affected differently by financial leverage (again keeping in mind that corporations have no expected time-span of survival).
So, when player A scores a 3-ball 12 and player B scores a 20-ball 30, you can judge which one is better (with all the imperfections of using statistics to model real-world decisions) by going back to the metric: combine A's strike-rate and 'average' (400, 12) with appropriate weightage given to each, depending upon which form of cricket we are looking at, and compare it with the metric value obtained from B's strike-rate and average (150, 30); a worked comparison follows below. However, some complexities that are very difficult to factor into the model include: (1) batsmen are often expected to stay at the crease longer and stabilise the innings; (2) performances in clutch situations, especially while chasing, reward a higher average more than a higher strike-rate; and (3) a reliable middle order (higher average) often gives the top-order batsmen more confidence and may result in a higher strike-rate for them.
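Here is that A-vs-B comparison worked through, with the caveat that the two strike-rate weights below are numbers I'm inventing purely to show the effect; nothing here tells us what the correct weights are:

```python
# Compare A (SR 400, avg 12) against B (SR 150, avg 30) under two
# assumed weightings. The weights are made up for illustration only.

def combine(sr, avg, w_sr):
    return w_sr * sr + (1 - w_sr) * avg

for label, w_sr in [("test-like, w_sr=0.05", 0.05),
                    ("T20-like,  w_sr=0.50", 0.50)]:
    a = combine(400, 12, w_sr)   # A's 3-ball 12
    b = combine(150, 30, w_sr)   # B's 20-ball 30
    print(f"{label}: A={a:.1f}  B={b:.1f}")

# Output: under w_sr=0.05, B (36.0) beats A (31.4); under w_sr=0.50,
# A (206.0) beats B (90.0). The ranking flips entirely with the weight,
# which is why getting the weight right matters so much.
```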
Going back to the finance terminology, and to how financial leverage is affected when a player scores a 60-ball 80 instead of a 10-ball 30: he is scoring for his team (earning profits for the corporation) at a slower rate (the returns are lower), but he creates lower financial risk for the team. Until now, we have assumed that it is not possible for a batsman to consistently score at higher speeds without risking dismissal. But there are also individuals who score at very high strike-rates without taking much risk (here I'm deviating slightly from risk measured at the team level to risk at the individual level); these are akin to the market-beaters of the investing world, who manage higher returns without taking on much risk. Risk here is slightly different from what we commonly talk about in cricket: it doesn't really matter how 'risky' or aerial a cricketer's shots are; what matters is the probability of him getting out and thereby increasing the team's probability of default. A batsman getting out can be considered a default in itself, and is much simpler to measure than the risk of the team getting all out faster because a particular batsman got out (a conditional probability: Sachin in the late 90s, if out quickly, would have left the team with a much higher probability of getting all out than any other Indian batsman would have).

And so we return to the metric, which weighs and combines average and strike-rate (somehow) to reward players who score consistently fast without getting out themselves. The most obvious flaw with the metric (which exists with plain averages as well) is that players who get out in the last few overs while taking extreme personal risks, without having a substantial impact on the team getting all out (defaulting), are punished by it: it uses the probability of a player getting out instead of the more accurate quantity, the marginal increase in the probability of the team getting all out due to that player getting out. And, due to all these complexities and more, human judgement is extremely important to compensate for the lack of common sense that statistics have.
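To pin down what I mean by that "marginal increase", here's a toy sketch; the probabilities are completely made up, and a real estimate would need ball-by-ball historical data:

```python
# Toy sketch of the "marginal default probability" idea. The two
# probabilities below are invented for illustration.

# P(team all out early | this batsman dismissed cheaply), e.g. Sachin
# in the late 90s, versus the baseline when he isn't out cheaply.
p_all_out_given_out_early = 0.45
p_all_out_otherwise = 0.15

# The quantity the metric *should* penalise: how much this player's
# dismissal raises the team's probability of default (getting all out).
marginal_default_risk = p_all_out_given_out_early - p_all_out_otherwise
print(f"marginal increase in P(all out): {marginal_default_risk:.2f}")

# A tail-ender slogging in the 19th over barely moves this number, so his
# dismissal shouldn't be punished the way a raw average punishes it.
```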