“As Fama put it, life always has a fat tail.”
Roger Lowenstein
The real world is a strange place. Nothing works as it should, and never is that more true than in the risk management business. Black Swan events that statistically should occur once every few thousand days occur much more often than that. The smartest minds in the world inside Long Term Capital Management went bankrupt on what was, statistically, an infinitesimal probability. Casinos that manage risk recognise that Martingale betting systems are real: statistically they should not work, yet time and time again gamblers have beaten the roulette wheel with them, to the extent that they are now banned the world over.
Of course, statistical outcomes are influenced by many exogenous factors. Roulette wheels and balls have slight, imperceptible differences. Liquidity in finance makes a huge difference. Correlations can be very misleading and can change rapidly in totally unforeseen ways. Human nature plays a large part in Black Swans. The list goes on, and many of these things are imperceptible. However, the basic analysis that many risk managers rely on is flawed when you are dealing with non-normal distributions, particularly the fat-tailed distributions we see in the crypto market.
This paper aims to outline how we at LibertyRoad Capital utilise Extreme Value Theory to calculate Extreme Value at Risk (“EVaR”), which we will demonstrate can be very useful as a forward-looking indicator in risk analysis, especially when it is incorporated into the investment process as a risk-sizing metric.
This paper is not intended as a quantitative paper. References are provided for those readers who wish to dive deeper into the subject. It is intended, firstly, to outline the (well-known) deficiencies in traditional Value at Risk and, secondly, to show that risk analysis is not just about monitoring static risk limits. EVaR, when applied correctly inside the investment process, can be a valuable source of Alpha in that it demonstrably improves risk-adjusted returns.
The genesis of how LibertyRoad incorporates EVT in utilising EVaR came from The Cambridge Strategy, which Russell Thompson co-founded in 2003, and which pioneered risk management inside the investment process, firstly via “risk of ruin”, then Omega Functions and then Extreme Value Theory.
The implementation of improving risk management techniques and the use of EVaR as a trade sizing tool is documented in the paper “Risk Management at the Trade Level in Foreign Exchange Strategies”, which is available on the LibertyRoad website. Risk Management was incorporated into the investment process via Risk Adjusted Trade Size (“RATS”), initially using a Sharpe ratio-based calculation (“SRATS”), then Omega Functions via the CS Ratio (“CS RATS”), and finally via Extreme Value Theory (“ERATS”).
The results on improving risk management are below:
It can be seen that Cambridge saw a 30% increase in risk-adjusted returns solely from incorporating a dynamic risk management mechanism utilising Extreme Value Theory into the investment process. Cumulative return increased, and Annualised Volatility decreased. Risk management can be a source of Alpha.
However, EVT is not a panacea. November 2022 proved that. It is a tool, and many weapons in the risk management arsenal need to be utilised to develop robust dynamic risk management protocols that can deal with a “life that always has a fat tail”. This paper looks at the impact of FTX on The LibertyRoad risk management process and the practical implementation of EVT and EVaR in the process during November/December 2022. It further deals with the lessons learned and improvements made in our risk management system, and finally, what other factors outside of EVT need to be considered for anyone interested in implementing holistic risk oversight.
1. Introduction
“If you want to get an idea of a friend’s temperament, ethics, and personal elegance, you need to look at him under the tests of severe circumstances, not under the regular rosy glow of daily life.”
Nassim Nicholas Taleb, The Black Swan: The Impact of the Highly Improbable
Risk Management is an evolutionary subject. It is often thought of as dry and static: risk managers monitor risk against pre-set limits based on an analysis of historical returns, underpinned by pre-set assumptions about the distribution of those returns. That is the fundamental basis of Value at Risk (“VaR”) models, where methods such as variance-covariance or historical simulation are utilised. The distribution of returns is analysed under an assumption, such as that returns are normally distributed, and a level of loss is estimated at a given level of confidence, typically 95% or 99%.
However, there are several fundamental problems with this type of approach:
1. The VaR approach will fail if the distribution of returns is non-normal, in particular, if it is fat-tailed.
2. The approach tells us absolutely nothing about the potential size of the loss beyond the level of confidence.
3. The approach is inherently static. There is no reason why risk should not be dynamically adjusted and targeted based on changing market conditions and data points, especially those that indicate that the distribution of returns is getting fatter.
A very intuitive and helpful way to address these issues is via Extreme Value Theory (“EVT”). EVT does not assume a normal distribution and is intended to model the tails of a distribution beyond the level of confidence, giving much better visibility of the size of potential losses in Black Swan environments. It is also well suited to being dynamically adjusted as market conditions change, because the tail is modelled specifically and the model is therefore particularly sensitive to, and adjusts quickly to, left-hand tail events. With a variance-covariance model or historical simulation, one outlying data point is unlikely to affect the underlying VaR in a significant way. EVT can therefore be a source of alpha: not just a tool for monitoring risk, but an intrinsic part of the investment process.
The purpose of this paper is to articulate the concept of EVT as opposed to VaR to a non-quantitative reader. Anybody wishing to dive deeper into the subject should explore the reference papers footnoted. Educated risk-taking is a good practice. Nobody ever did well in life without taking risks. Avoid painting the Sistine Chapel floor, and look at the ceiling.
2. Value at Risk
“If you hear a “prominent” economist using the word ‘equilibrium,’ or ‘normal distribution,’ do not argue with him; just ignore him, or try to put a rat down his shirt.”
Nassim Nicholas Taleb
VaR measures the maximum expected loss over a given time period at a given confidence level that may arise from uncertain market factors. It is mathematically expressed as:

VaR_α = −(μ + Z_α·σ)

Where:

μ is the mean of returns, σ is the standard deviation of returns and Z_α is the standardised value at the α significance level. It is intuitively easy to understand. It assumes a normal distribution of returns and looks solely at the left-hand tail, so risk is viewed as a bad thing. It allows somebody to view the risk in a portfolio and gain an insight into the likelihood of a loss exceeding a certain amount. For example, a daily VaR of 5% at 2 standard deviations of confidence implies that the portfolio would reasonably expect to lose more than 5% on 2.28 days out of every 100, and that 95.44 of every 100 days would have a return of between -5% and +5%.
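As a minimal sketch of this parametric calculation using the notation above (the figures are hypothetical and the function name is ours, chosen for illustration):

```python
from scipy.stats import norm

def parametric_var(mu: float, sigma: float, alpha: float = 0.0228) -> float:
    """Variance-covariance daily VaR under a normal assumption.

    mu, sigma : mean and standard deviation of daily returns.
    alpha     : significance level (0.0228 corresponds to a one-sided
                2-standard-deviation confidence level).
    Returns the loss, as a positive number, expected to be exceeded on
    roughly alpha of all days.
    """
    z_alpha = norm.ppf(alpha)          # standardised value Z_alpha (negative)
    return -(mu + z_alpha * sigma)

# Hypothetical inputs: mean daily return 0.1%, daily volatility 2.5%
print(parametric_var(mu=0.001, sigma=0.025))   # roughly 0.049, i.e. ~4.9%
```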
VaR can be calculated in several ways. The three most common are:
- Variance-Covariance
It uses the variances and covariances of assets for VaR calculation and is a parametric method as it depends on the parameters of the probability distribution of price changes or returns. It assumes that asset returns are normally distributed around the mean of the bell-shaped probability distribution. Assets may have a tendency to move up and down together or against each other. This method assumes that the standard deviation of asset returns and the correlations between asset returns are constant over time.
- Monte Carlo Simulation
A large number of simulations are run with a random process for a variable of interest (such as asset returns in finance) covering a wide range of possible scenarios. These variables are drawn from pre-specified probability distributions that are assumed to be known. Monte Carlo simulations inherently try to recreate the distribution of the return of a position from which VaR can be computed.
- Historical Simulation
This is a non-parametric method of VaR calculation, as VaR is estimated directly from historical returns without estimating distribution parameters. The methodology is based on the assumption that the pattern of historical returns is indicative of the pattern of future returns. The first step is to collect data on movements in market variables (such as equity prices, interest rates, commodity prices, etc.) over a long time period. The distribution of historical returns is then sorted in ascending order (essentially from the worst to the best returns observed over the period), and the VaR for a one-day time horizon can be read off at the selected confidence level (probability), as in the sketch below.
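A minimal sketch of the historical simulation approach (the data and the simple quantile rule are hypothetical, for illustration only):

```python
import numpy as np

def historical_var(returns: np.ndarray, alpha: float = 0.05) -> float:
    """Non-parametric (historical simulation) VaR.

    Sorts observed returns from worst to best and reads off the loss at the
    alpha quantile, with no distributional assumption."""
    sorted_returns = np.sort(returns)                      # worst first
    index = int(np.floor(alpha * len(sorted_returns)))
    return -sorted_returns[index]                          # loss as a positive number

# Hypothetical example: 1,000 days of fat-tailed toy returns
rng = np.random.default_rng(0)
daily_returns = rng.standard_t(df=3, size=1000) * 0.02
print(historical_var(daily_returns, alpha=0.05))
```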
It is not my intention to outline the computational methods behind each approach, or their respective advantages and disadvantages, but a detailed analysis of each method can be found in Jorion (1997)[1] and Dowd (1998).[2]
3. Extreme Value Theory
“Million to one chances crop up nine times out of ten.”
Terry Pratchett
The main objective of EVT is to make inferences about sample extrema (maxima or minima).
In order to implement EVT, certain decisions and assumptions must be made. The following outlines, at a high level, what LibertyRoad does in the practical implementation of EVT in the crypto space; Section 4 then outlines the differences between the traditional way of calculating EVaR using EVT and the way that LibertyRoad calculates EVaR. The overall process is as follows:
- Determine the distribution to be used.
- Decide on a sampling strategy (typically Block Maxima versus Peaks over Threshold).
- Decide on a sampling threshold.
- Model the samples with given distribution and convert to Extreme Value at Risk (EVaR).
1. Determine the distribution to be used.
In this context, the so-called Generalised Extreme Value (“GEV”) distribution plays a central role. By the Fisher–Tippett theorem, it can be shown that for a broad class of distributions, the normalised sample maxima (i.e. the highest values in a sequence of iid random variables) converge towards the GEV distribution as the sample size increases.
The mathematics is involved, but essentially the GEV incorporates three specific types of probability distribution based on a shape factor, ξ, which can be >0, <0, or =0. The GEV includes three extreme value distributions as special cases: the Frechet distribution (ξ > 0, fat tailed), the Weibull distribution (ξ < 0, short tailed), and the Gumbel distribution (ξ = 0, thin tailed). For our purposes, EVT is interested in the GEV where the shape factor ξ > 0, which allows us to model the tail with a distribution of the Frechet type. The two obvious choices are the t-distribution and the Generalised Pareto Distribution (“GPD”). The normal distribution is excluded from the Frechet class by definition, as it does not have a shape factor ξ > 0. So the first step in modelling the distribution of any tail is the choice of a qualifying distribution.
Extreme Value Theory practitioners tend to gravitate toward using the Generalised Pareto Distribution for modelling fat tails. The original Pareto distribution was developed by Vilfredo Pareto, and is used to describe the distribution of incomes.[3]
The Generalised Pareto Distribution is defined by its distribution function:

G(y) = 1 − (1 + ξy/β)^(−1/ξ) for ξ ≠ 0, and G(y) = 1 − exp(−y/β) for ξ = 0,

where y ≥ 0 is the excess over the threshold, ξ is the shape parameter and β > 0 is the scale parameter.
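As an illustration of fitting this distribution to tail data (a sketch on hypothetical data, not our production code), scipy's genpareto can be fitted to a set of excesses over a threshold:

```python
import numpy as np
from scipy.stats import genpareto

# Hypothetical daily losses (positive numbers) and an illustrative threshold u
rng = np.random.default_rng(1)
losses = rng.standard_t(df=3, size=2000) * 0.02
losses = losses[losses > 0]
u = np.quantile(losses, 0.90)

exceedances = losses[losses > u] - u                 # excesses over the threshold
xi, loc, beta = genpareto.fit(exceedances, floc=0)   # shape xi, scale beta
print(f"shape (xi) = {xi:.3f}, scale (beta) = {beta:.3f}")
# xi > 0 indicates a fat (Frechet-type) tail
```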
2. Decide on a Sampling Strategy.
EVT relies on a means of producing sample data and the extremes of the distribution. The two main ways of doing this are Block Maxima (“BM”) and Peaks over Threshold (“PoT”).
With Block Maxima, the data is divided up into discrete blocks, and the highest observation in each block is taken. With Peaks over Threshold, all observations over a given threshold are included in the sampled data.
Block Maxima versus Peak Over Threshold
Both methodologies have advantages and disadvantages, but at LibertyRoad we use Peaks over Threshold. PoT is more efficient where complete (time) series without gaps are available, as all values exceeding a certain threshold can serve as a basis for model fitting. Fitting distributions to Block Maxima data can be a wasteful approach, as only one value per block is used for modelling, while the threshold-excess approach potentially provides more information on extremes, as the sketch below illustrates.
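The difference between the two sampling strategies can be shown in a short sketch (the data, block size and threshold below are arbitrary, purely for illustration):

```python
import numpy as np

def block_maxima(losses: np.ndarray, block_size: int = 21) -> np.ndarray:
    """Block Maxima: keep only the single largest loss in each block
    (e.g. one value per roughly-monthly block of daily data)."""
    n_blocks = len(losses) // block_size
    trimmed = losses[: n_blocks * block_size].reshape(n_blocks, block_size)
    return trimmed.max(axis=1)

def peaks_over_threshold(losses: np.ndarray, u: float) -> np.ndarray:
    """Peaks over Threshold: keep every observation exceeding the threshold u."""
    return losses[losses > u]

# Hypothetical daily loss series
rng = np.random.default_rng(2)
losses = np.abs(rng.standard_t(df=3, size=1000)) * 0.02
print(len(block_maxima(losses)), "block maxima vs",
      len(peaks_over_threshold(losses, u=np.quantile(losses, 0.90))), "exceedances")
```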
3. Picking the Threshold.
Setting the threshold is often done using the conditional mean of the exceedance size over the threshold (given that an exceedance occurred). This is often referred to as the Mean Excess Function (“MEF”) and can be written as:

e(u) = E[X − u | X > u]
The choice of the threshold is important. It must be considered extreme, and the biggest challenge here is:
“The threshold must be sufficiently high to ensure that the asymptotics underlying the GPD approximation are reliable, thus reducing the bias. However, the reduced sample size for high thresholds increases the variance of the parameter estimates.”
What this essentially means is that there is a trade-off: if the threshold is too low, the GPD approximation of the tail is unreliable (bias), while if the threshold is too high, the sample of exceedances becomes so small that the parameters cannot be estimated reliably (variance).
So where to pick the threshold? There are several ways to do this. At LibertyRoad, we utilise our own AI to compare actual against forecasted returns and place the threshold in a dynamic way where there is a decoupling. We believe it provides valuable Alpha and conservative risk management where the predictive ability of our models may get overwhelmed by Black Swan events. This is covered in the next section.
However, historically and in practice, the threshold is set where the Mean Excess Function becomes linear. We know that the MEF is a linear function of the threshold in the case of the Generalised Pareto Distribution, and we further know that, in practice, for lower threshold levels the MEF is not linear and the distribution is not GPD. Therefore, we can compute the MEF for different levels of the threshold and set the threshold u at the point where the function becomes linear.
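A simple empirical version of this check (a sketch with hypothetical data; in practice the MEF is usually inspected on a mean-excess plot rather than printed) might look like the following:

```python
import numpy as np

def mean_excess(losses: np.ndarray, u: float) -> float:
    """Empirical Mean Excess Function: average exceedance size over u,
    conditional on an exceedance occurring."""
    excess = losses[losses > u] - u
    return excess.mean() if len(excess) else np.nan

# Evaluate the MEF over a grid of candidate thresholds; the region where the
# curve becomes (roughly) linear in u is a candidate for the threshold.
rng = np.random.default_rng(3)
losses = np.abs(rng.standard_t(df=3, size=2000)) * 0.02
candidates = np.quantile(losses, np.linspace(0.5, 0.98, 25))
for u in candidates[::5]:
    print(f"u = {u:.4f}  MEF = {mean_excess(losses, u):.4f}")
```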
4. Calculating Extreme Value at Risk.
Once we know the values of the parameters of the GPD distribution, we can use them to calculate the value at risk. We start with the fact that the GPD is a good approximation of the excess distribution function:

F_u(y) = P(X − u ≤ y | X > u) ≈ G(y) = 1 − (1 + ξy/β)^(−1/ξ)

In addition, F(u) can be numerically approximated from the data as:

F(u) ≈ (N − N_u) / N

where N denotes the total number of data points (including profits), and N_u denotes the number of exceedances over the threshold u.

We then denote the probability level of the (1-day) VaR by α, i.e. we want to calculate VaR_1(1 − α). Given that the random variable X denotes 1-day losses taken with a positive sign, we have:

P(X > VaR_1(1 − α)) = α

In addition, let x_p be such that x_p + u = VaR_1(1 − α). Hence one has:

α = P(X > u + x_p) = P(X > u) · P(X − u > x_p | X > u) ≈ (N_u / N) · (1 + ξ x_p / β)^(−1/ξ)

Implying that:

x_p = (β / ξ) · [((N / N_u) · α)^(−ξ) − 1]

Finally:

EVaR = VaR_1(1 − α) = u + (β / ξ) · [((N / N_u) · α)^(−ξ) − 1]
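A minimal sketch of this calculation, assuming daily losses are held in a NumPy array and using scipy's generalised Pareto fit (the data and the 90th-percentile threshold below are hypothetical):

```python
import numpy as np
from scipy.stats import genpareto

def evar_from_gpd(losses: np.ndarray, u: float, alpha: float = 0.0228) -> float:
    """Extreme Value at Risk from a GPD fitted to exceedances over u.

    Implements VaR_1(1 - alpha) = u + (beta/xi) * (((N/N_u) * alpha)**(-xi) - 1),
    as derived above. A sketch under the stated assumptions, not production code."""
    exceedances = losses[losses > u] - u
    xi, _, beta = genpareto.fit(exceedances, floc=0)
    n, n_u = len(losses), len(exceedances)
    return u + (beta / xi) * (((n / n_u) * alpha) ** (-xi) - 1.0)

# Hypothetical fat-tailed daily loss series
rng = np.random.default_rng(4)
losses = np.abs(rng.standard_t(df=3, size=2000)) * 0.02
u = np.quantile(losses, 0.90)
print(evar_from_gpd(losses, u))
```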
4. LibertyRoad Practical Implementation of Extreme Value Theory and the Calculation of Extreme Value at Risk (“EVaR”).
“60% of the time, it works every time.”
Brian Fantana, Anchorman
“In theory, theory and practice are the same. In practice, they are not.”
Albert Einstein
Practice makes perfect. So how do we calculate an EVaR in the real world utilising EVT?
Firstly, we decide on a distribution with which to model the sample we obtain for the left-hand tail. At LibertyRoad, we utilise the Generalised Extreme Value framework, in the special case of a Frechet-type distribution, and then use the Generalised Pareto Distribution to model our sample distribution.
The sample is generated by taking every option in the portfolio and calculating Black-Scholes pricing on each option using a Monte Carlo simulation model. Since we are, by definition, interested in sample extrema, we conservatively use large variations in the two main option variables: underlying spot price and implied volatility. Moving these two variables produces a distribution of returns, and that becomes the sample distribution. The same distribution is also used in the calculation of our risk matrix shown in Section 6, which has separate risk limits attached to it.[4]
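As a rough illustration of this kind of simulation (a simplified sketch only: the book, shock sizes and function names below are hypothetical assumptions, not our production model), a Black-Scholes repricing of a small options book under joint spot and implied-volatility shocks could look like this:

```python
import numpy as np
from scipy.stats import norm

def bs_price(S, K, T, r, sigma, is_call=True):
    """Black-Scholes price of a European option."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    if is_call:
        return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)
    return K * np.exp(-r * T) * norm.cdf(-d2) - S * norm.cdf(-d1)

def simulate_pnl(portfolio, S0, vol0, r=0.0, n_sims=10_000,
                 spot_shock=0.40, vol_shock=1.00, seed=0):
    """Monte Carlo P&L of an options book under deliberately wide, fat-tailed
    shocks to spot and implied volatility. Shock sizes are assumptions."""
    rng = np.random.default_rng(seed)
    base = sum(q * bs_price(S0, K, T, r, vol0, c) for q, K, T, c in portfolio)
    spots = S0 * (1 + spot_shock * rng.standard_t(df=3, size=n_sims) / 3)
    vols = np.clip(vol0 * (1 + vol_shock * rng.standard_t(df=3, size=n_sims) / 3), 0.01, None)
    pnl = np.array([
        sum(q * bs_price(s, K, T, r, v, c) for q, K, T, c in portfolio) - base
        for s, v in zip(spots, vols)
    ])
    return pnl   # the left-hand tail of this distribution feeds the EVT step

# Hypothetical short-option book: (quantity, strike, years-to-expiry, is_call)
book = [(-10, 1300.0, 30 / 365, True), (-10, 1100.0, 30 / 365, False)]
pnl = simulate_pnl(book, S0=1200.0, vol0=0.80)
print(np.quantile(pnl, [0.01, 0.05]))
```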
The sample distribution is collected using a Peaks over Threshold technique. It allows us to obtain the maximum sample size without missing clusters, which is the main disadvantage of Block Maxima.
Finally, the sample is taken utilising a threshold cut-off. We look at the sample and establish the Mean Excess Function, which is the conditional mean of the exceedance size over the threshold. From the Generalised Pareto Distribution, we know that the MEF is linear above higher thresholds u, so we calculate the MEF for each value of u utilising the GPD and observe where it becomes linear.
Mathematically this is:

e(u) = E[X − u | X > u] = (β + ξu) / (1 − ξ), for ξ < 1,

which is linear in u when the excesses follow a GPD.
The threshold is then conditionally set at the point where the distribution becomes linear. This is easily seen below.
Mean Excess Function Plotted Against Daily Losses.
However, at LibertyRoad, we then employ a machine learning algorithm that tracks the values predicted using the threshold u against the values actually observed, to see how linear the actual distribution is above u. The algorithm allows us to adjust the threshold u slightly based on real-world observations plotted against the predicted distribution. That allows the threshold to adjust dynamically to moving markets, and we believe it provides us with valuable alpha.
Predicting the Threshold
Using the parameters in Section 3, we now know all of the parameters of the Generalised Pareto Distribution, and we then solve for EVaR from the mathematics given in the previous section, to 2 standard deviations of confidence.
Also importantly, because the Mean Excess Function is linear, we now also have visibility of predictions of potential losses above two standard deviations of confidence.
The end result is a much more accurate, conservative and dynamically adjusted measurement of VaR than the traditional measurement of VaR.
The impact can actually be quite significant, and in complex financial derivatives with several embedded types of risk, the EVaR can be significantly higher than that predicted by normal estimations of VaR.
Below is a comparison of the VaR outputs at different significance levels, output from our Python models into our UI. At 99% confidence, it can easily be seen that EVaR is double traditional VaR. That may go a long way towards explaining why Black Swans occur much more often than could reasonably be expected when traditional parametric estimates of VaR are utilised.
Parametric VaR compared to EVaR.
It is all programmed in Python and delivered to our User Interface (“UI”), where we show an EVaR for each product live, in real time. We store every EVaR for each of our products each minute and also provide a live risk graph showing Delta and EVaR for each product over the last 24 hours.
LibertyRoad – Live EVaR Screen in the UI
This is the real world implementation of risk management that delivers Alpha and consequently much better risk adjusted returns, as we can see from November 2022.
5. LibertyRoad – Real World Implementation of EVaR. FTX and the 8th November 2022.
“Nothing is perfect; nothing is imperfect. Perfection and imperfection reside in your perception.”
Debasish Mridha MD
LibertyRoad actively targets a given level of risk on an annualised basis that, over the medium term, should generate a given return and, therefore, a given level of risk adjusted return.
The level of risk is directly affected by the amount of risk taken on each trade, so that the portfolio in its entirety targets a given level of risk. This is done at a granular level by applying an Extreme Risk Adjusted Trade Size (“ERATS”) to each trade we execute. The summation of all the trades in a portfolio generates a portfolio Extreme Value at Risk, a level of risk that we would expect over the medium term to generate a given return.
Therefore, the ERATS is directly affected by the level of computed EVaR, whereby any movement in the level of expected shortfall in the distribution above our threshold results in an adjustment to the ERATS, which consequently adjusts overall targeted risk.
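To illustrate the idea only (the function, parameters and scaling rule below are hypothetical and are not LibertyRoad's actual ERATS formula), trade sizing that scales inversely with the computed EVaR relative to a target might look like this:

```python
def risk_adjusted_trade_size(base_size: float, target_evar: float, current_evar: float,
                             floor: float = 0.25, cap: float = 1.5) -> float:
    """Illustrative EVaR-driven trade sizing: scale a base trade size down as the
    computed EVaR rises above target, and up (within a cap) when it falls below.
    All parameters here are hypothetical."""
    scale = target_evar / current_evar
    return base_size * min(max(scale, floor), cap)

# Hypothetical: EVaR spikes from 2% to 18% of NAV; the next trade is cut to the floor
print(risk_adjusted_trade_size(base_size=1_000_000, target_evar=0.02, current_evar=0.18))
```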
It is a proactive and conservative approach to risk management: the level of EVaR at higher levels of confidence will be considerably higher than that predicted by a traditional VaR model, and it gives much more visibility of the actual level of potential loss beyond the traditional standard-deviation level of confidence used with VaR.
To give an idea of how powerful this approach can be, below is the LibertyRoad EVaR for our Ethereum Negative Gamma strategy (which was the worst affected by FTX) from the 6th of November to the 8th of November. The collapse of FTX was a true Black Swan event in the crypto markets. The prices of digital assets collapsed on the 8th of November, and ETH fell 37% in 2 days. Not only that, but volatility in ETH went up 300%, and liquidity in the derivatives markets collapsed as many market makers either pulled out of the market, or were forced out from bankruptcy due to having funds on the FTX exchange.
The collapse in digital assets began on the 8th of November at about 5pm, as can be seen below.
Ethereum Price Movement November 6th – November 9th
Looking at the chart below, you can see that the LibertyRoad EVaR was stable and contained for the two days before the FTX collapse, and then, around 4 am on the morning of the 8th of November, it suddenly spiked by 900%. This had the effect of firstly reducing the Extreme Risk Adjusted Trade Size in the portfolio for any new trades and secondly reducing the portfolio EVaR by reducing the Delta of the portfolio. Unfortunately, it was not possible to reduce the Vega of the book significantly before FTX collapsed, due to a lack of liquidity, but the impact was still significant.
From the graph below, you can see the impact this move had on our portfolio, as around 3 am, the portfolio started to reduce, and between 3 am and 9 am, the Delta position went from 100% to flat.
Although our EVaR measurements increased and risk was reduced ahead of the collapse of FTX, this did not prevent LibertyRoad from losing 27% in November. However, December saw a sharp recovery in returns, as we were able to avoid a much worse outcome by Delta hedging the portfolio. Unfortunately, liquidity was particularly poor over that period, primarily due to many market makers who had funds on FTX dropping out of the market.
Hedging the Vega of the book was very difficult, and we had to run the Vega position through November, manage the Negative Gamma and let the options positions expire in December, which then saw a sharp improvement in returns.
In normal circumstances, we could have reduced the Vega over a few days and hedged the portfolio against the huge surge in volatility, but the net November–December loss was actually very satisfying given the huge moves, and this was largely attributable to being able to Delta hedge the portfolio in advance, by utilising EVaR, on the night of November 7th/8th. As the next section covers, however, EVaR is not everything; a strong risk management process has many facets. Despite the bad month in November, cutting our Delta before the move saved significant losses and greatly enhanced the risk-adjusted returns.
6. How to Develop a Robust Comprehensive Risk Management Process
“Overconfidence in numbers and quantitative techniques and in our ability to represent extreme events should be subject to severe criticism because it lulls us into a false sense of security.”
Thomas S. Coleman, A Practical Guide to Risk Management
Utilise the best risk reporting framework available to you, but it is extremely important to understand what your risk system is telling you. Risk metrics should be easily available, clearly presented and understood by their recipients. Limits should be monitored and breaches addressed. The limits are not inviolate; they can be changed if appropriate.
LibertyRoad’s main risk measurement is EVaR. It is shown live for each strategy in our user interface, and is particularly powerful because it is easily understood. It encompasses all elements of risk as a distribution of returns is generated from a Monte Carlo simulation, and that simulation encompasses both price movements and volatility movements. Data is still a problem, however. Derivatives are still relatively new, and only about three years of data is available on volatility, so it is important to recognise deficiencies in your systems due to factors such as that.
Employ independent oversight. The reporting framework needs to communicate both to the practitioners who are taking risk and to the senior management who are monitoring it. It is very good practice to summarise all the risk parameters in one report and have a non-trading risk manager/compliance officer sign off the risk limits and their utilisation each day. Very importantly, somebody outside of the risk takers must have the ability to access the trading platforms and cut or close the risk.
Risk often has many constituents; understand the impact of each one. As November 8th showed, true Black Swan events tend to decimate long-term correlations, and risk elements such as credit and volatility can become very closely correlated. For options, for example, LibertyRoad utilises a live in-house risk matrix where price movements can be isolated from volatility movements. The matrix can also add hedges to determine the impact they have on the overall portfolio.
A Scenario Grid from the LibertyRoad User Interface.
Risk must be managed and implemented by experienced professionals who understand the process. There is a price for everything, and there is no substitute for experience. Hard-earned experience often allows you to see the wood for the trees, isolate and close the really dangerous parts of a portfolio, and manage the less dangerous ones. That is exactly what happened in November 2022. It might have been possible to buy enough volatility to hedge the book, but losses would definitely have been much greater in November, and additionally there would have been no ability to make the returns in December, as the options would have been cut. Ultimately, considering we had short-term options positions, which would all expire in November and December, we covered the directional risk and managed the volatility risk. Experience is worth paying for in risk management, especially with true Black Swans such as 9/11, Lehman and FTX.
Implement a risk management framework which is proactive and forward-looking. If the granularity on the tail distribution starts predicting higher potential losses resulting from an extreme event, as we saw in the early hours of November 8th, reduce risk. In LibertyRoad, our EVaR framework has a direct impact on the size of the trades being placed. If EVaR starts rising, the risk taken will go down, and the targeted risk will be reduced. In periods of low EVaR, the shortfall can be made up by targeting higher returns and risk.
Risks are not all about the potential loss predicted by value at risk in all its forms. You must also have a framework to manage:
1. Liquidity:
It is the most important. Risk can generally be reduced and managed in liquid markets. In November 2022, LibertyRoad reduced all our exposures so that none went beyond December 2022. In that way, we were able to Delta hedge the book and let the extremely unprofitable options that had produced the losses in November roll off in December, in the form of extreme theta bleeding back into the book. Options positions 3, 6 or 12 months out in that environment would have been extremely problematic to manage. However, in retrospect, some of the Open Interest (“OI”) positions by contract were too large, and we have revised our liquidity limits down post the 8th of November.
2. Position Sizing:
It is related to liquidity, but it is extremely important to have a well-structured portfolio that is not too big in a particular position, so that it does not become vulnerable to other market actors or effectively unmanageable due to its size. That was the problem Long Term Capital Management faced.
In complex derivative books, a well-laddered maturity profile is desirable, where the concentration risk is limited both by strike and by maturity. This can easily be achieved by utilising Open Interest sub-limits in the risk management process.
3. Counterparty:
Counterparty assessment is very important. Prime Brokers, Custodians, OTC trading counterparties and exchanges need to be regularly assessed to ensure they are fit for purpose. For LibertyRoad, this involves employing a traffic light system where a number of considerations are assessed on a regular basis to ensure that counterparty risk is kept to a minimum. Communication with your service providers is equally important.
Considerations in this regard include:
- Credit Rating
- Reserve Backing for Exchanges
- Balance sheet and Profitability Analysis
- Security Review
- Sector Risk
- Individual Exposure Risk
- Legal Relationship Analysis
- Quality of Board Oversight
- Quality of Management Team
- Risk Management Processes in Place
4. Credit:
Credit risk needs to be addressed for those counterparties that trade in debt.
7. Lessons Learned from the FTX Collapse
“There are certain life lessons that you can only learn in the struggle.”
Idowu Koyenikan, Wealth for All: Living a Life of Success at the Edge of Your Ability
It is very important to learn from experience, and FTX was certainly a life lesson in the overall struggle. On the 8th of November, 7-day ATM Implied Volatility in ETH had the biggest one-day rise in its history, going from 83.15% on the 8th of November to 182.95% on the 9th of November.
Liquidity evaporated overnight. Many market makers were forced out of business due to losses or having funds on FTX. It was a true Black Swan event, and it is important to learn the lessons from it. Therefore, we have improved our risk management further.
- We have been even more conservative on the levels of confidence employed in our EVaR risk management process to better identify and model Black Swans, especially where the data is still not very mature.
- We are now working on an even more advanced probability matrix with both EVaR and VaR produced at different levels.
- We have introduced an automatic delta hedging process where our AI models automatically delta hedge within certain parameters. While this might not have helped on the 8th of November, as the delta was already covered, it would have significantly improved the ongoing management of the negative gamma and definitely would have improved returns in November.
- We have implemented improvements to the AI generation of the threshold picked for the left hand tail, which will improve the sampling for the distribution to be modelled.
- We have dramatically increased the Vega movements in our Monte Carlo simulations to more accurately reflect the extreme moves we saw in volatility.
- We are working on some independent liquidity indicators and limits to integrate into our overall system.
8. Conclusion
“We all have s**t on our shoes. We’ve just got to realize it so we don’t track it into the house.”
Karl Marlantes, What It is Like to Go to War
Risk is everywhere. It is unavoidable. But asset managers are mandated to manage that risk and, as much as possible, make sure it stays outside the front door and does not break into the house. No risk management system is infallible (just ask Long Term Capital Management), but sophisticated, forward-looking risk management is an essential part of the armoury and does produce significant improvements in risk-adjusted returns. That is unarguable.
Despite the losses in our LibertyRoad Ethereum strategies in November 2022, Section 5 outlines how utilising EVT and EVaR helped limit those losses, which were primarily due to a lack of liquidity in volatility products. By running a flat, Delta-hedged portfolio after the huge spike in EVaR the night before the collapse, we were able to navigate a very difficult period, manage the negative gamma (which was admittedly difficult) and allow the losing option positions to mature in December; over the combined two months, the returns were acceptable given the size of the moves.
Risk management must be embedded in the investment process. LibertyRoad utilises a targeted level of risk, which has an expected level of return. That risk is primarily driven by the Risk Adjusted Trade size, which will proactively add or reduce risk and is cognizant of the annualised level of risk and whether it is running hot or cold and will adjust accordingly depending on market conditions.
A solid risk management system, however, is not all about the statistical calculation of a distribution of returns generated by a given portfolio. It is much broader and wider than that. Section 6 highlights just some of the considerations that need to be implemented in a solid, professional, risk management framework.
1. Jorion, P. (1997): Value at Risk – The New Benchmark for Controlling Market Risk. McGraw-Hill, New York.
2. Dowd, K. (1998): Beyond Value at Risk. Wiley, Chichester.
3. See Jurca, Univerzita Komenského, Chapter 4.
4. LibertyRoad utilises three “red line” risk limits: 1. a daily EVaR limit; 2. a live Delta versus underlying leverage limit; and 3. a ruin limit utilising an unhedged % move in spot and an unhedged % move in volatility. It is in this last limit that the risk matrix is used.