Almanak x Bancor: Protocol Economy Assessment Report
We’re Almanak: a revenue-optimization and risk-management platform for DeFi & Gaming.
We utilize agent-based simulations to advise web3 communities on strategic decisions and parameters.
Our mission is to leverage data science and traditional-finance knowledge to maximize protocols’ profitability, while simultaneously ensuring their economic security.
We’re proud to present our first major publication: “The Almanak x Bancor Protocol Economy Assessment Report”, which we have prepared for the Bancor Network.
(You can download the report here).
Bancor Network is a household name within DeFi. As the protocol responsible for inventing the first automated market maker (AMM) liquidity pool in 2017, it has remained at the forefront of DeFi innovation with features such as Single-Sided Liquidity, while simultaneously developing one of the most knowledgeable communities in the space. You can learn more about the Bancor Network here.
This report is just the beginning. At Almanak, we’re interested in providing the Bancor DAO with continuous strategic-intelligence support, and we have submitted an official engagement proposal on the Bancor DAO’s forum.
The following article summarizes the report’s findings and explains our methodology.
Bancor’s Economy Assessment Report by Almanak:
In the aftermath of the May 2022 crash, Almanak identified areas of improvement to both enhance the recovery plan and introduce sustainable protocol risk-management mechanisms that optimize the ratio of capital efficiency to deficit. Almanak’s proposed optimization methodology is based on the design of “protocol levers” and their dynamic calibration through an agent-based simulation framework. The framework uses evolutionary algorithms and domain expertise to build a set of dynamic decision-making solutions, implemented within a scalable, modular, and composable architecture. This offers the flexibility to design and deploy tailored data-driven agents that replicate actions on the Bancor protocol. Almanak’s framework also integrates validation methods to ensure proper generalization. The main result of the protocol risk-management framework is that protocol profitability is a function of volatility and arbitrage volume, which can be actively managed through concerted tuning of the pools’ dynamic swapping fees and trading liquidity (TL).
Bancor V3 Recent Updates and Current State:
During the June 2022 market downturn, Bancor V3 experienced a high volume of withdrawals that triggered Liquidity Protection, which covered them with BNT. The resulting liquidations caused a significant divergence of the BNT price from other assets, and Bancor governance decided to disable Liquidity Protection. As of late August 2022, the protocol-wide deficit amounts to $46M, while Bancor V3 token pools hold a surplus of ~$7.6M in their V2 pool counterparts, which the Bancor DAO will need to deliberate on how to resolve.
To respond quickly and tackle the recovery, the protocol was updated (the V2.1 Vortex was set to 100% and the V3 Vortex to 90%), allowing V2.1 to collect 100% of network fees to buy BNT, swap the BNT for vBNT, and burn the vBNT, and V3 to collect 90% of network fees to buy BNT. Additionally, to address the pools’ deficit, 90% of BNT-side aggregated trade fees are burnt by the Vortex.
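The fee routing above can be sketched in a few lines. This is a hypothetical illustration of the accounting only; the rates are those quoted above, but the function and field names are our own and do not correspond to Bancor’s actual contracts (the sketch also assumes a 1:1 BNT-to-vBNT swap purely for simplicity).

```python
V21_VORTEX_RATE = 1.00  # V2.1 routes 100% of network fees to the Vortex
V3_VORTEX_RATE = 0.90   # V3 routes 90% of network fees to the Vortex

def route_fees(v21_fees_bnt: float, v3_fees_bnt: float) -> dict:
    """Split collected network fees (denominated in BNT) per the Vortex settings."""
    v21_to_vortex = v21_fees_bnt * V21_VORTEX_RATE
    v3_to_vortex = v3_fees_bnt * V3_VORTEX_RATE
    return {
        # V2.1 path: buy BNT, swap BNT for vBNT, burn the vBNT
        # (1 BNT ~ 1 vBNT assumed here for illustration only).
        "vbnt_burned_v21": v21_to_vortex,
        # V3 path: routed fees are used to buy BNT.
        "bnt_bought_v3": v3_to_vortex,
        "fees_retained_v3": v3_fees_bnt * (1 - V3_VORTEX_RATE),
    }
```

For example, with 100 BNT of V2.1 fees and 200 BNT of V3 fees, all 100 BNT on the V2.1 side end up burnt as vBNT, while 180 BNT on the V3 side are used for BNT buy-backs.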
Objective and Approach:
The report presents Almanak’s approach to designing an optimization solution for Bancor V3’s risk-reward ratio between impermanent loss (IL) and protocol revenue. The solution improves Bancor’s short-term deficit time-to-recovery and contributes to the protocol’s overall long-term sustainability.
We describe the economic and technical situation Bancor currently finds itself in. We then explain our methodology for designing the optimization solution, discussing the protocol levers and agent design as well as the simulation-engine architecture and solution validation. The training of agents’ interactions with the environment and the simulation process itself are also outlined.
The numerical assessment is performed on the following assets: ETH, WBTC, LINK and DAI. Due to the structure of Bancor’s Omnipools, BNT is necessarily included, though not analyzed separately.
Our methodology belongs to the field of sensitivity analysis: it aims to determine which system input variables (i.e., optimization levers) contribute most to the quantities of interest derived from the system output. The goal is to produce a set of dynamic, data-driven rules that can be integrated into the client’s architecture to enhance protocol risk management. We rely on expert domain investigation and numerical validation using screening, measures of importance, and deep exploration of variation ranges over mixtures of historical and simulated data. By building convex sets of parameters and factors, we use evolutionary optimization schemes to design representative sets of scenarios that inform the relationships between inputs and outputs.
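To give a flavor of an evolutionary optimization scheme over protocol levers, here is a minimal sketch. The objective function, lever ranges, and selection/mutation settings are toy assumptions for illustration; the report’s actual objective is the simulated profitability and deficit described below, not this closed-form stand-in.

```python
import random

def objective(fee: float, tl: float) -> float:
    # Toy stand-in for simulated protocol profit: peaks at fee=0.3% and TL=0.6
    # (illustrative values only).
    return -((fee - 0.003) ** 2) - (tl - 0.6) ** 2

def evolve(generations=50, pop_size=20, seed=0):
    rng = random.Random(seed)
    # Each individual is one lever combination: (swapping fee, TL fraction).
    pop = [(rng.uniform(0.0, 0.01), rng.uniform(0.0, 1.0)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: objective(*p), reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection keeps the best half
        children = [
            (max(0.0, f + rng.gauss(0, 0.0005)),              # mutate fee lever
             min(1.0, max(0.0, t + rng.gauss(0, 0.05))))      # mutate TL lever
            for f, t in parents
        ]
        pop = parents + children  # elitism: parents survive alongside children
    return max(pop, key=lambda p: objective(*p))

best_fee, best_tl = evolve()
```

In the real framework the fitness of each lever combination would come from running the agent-based simulation described later in this article, not from an analytic function.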
Given Bancor V3’s observed challenges, Almanak has investigated potential optimization levers within Bancor V3 to mitigate the risk of realized (confirmed withdrawals of user deposits) and unrealized (value at risk due to changes in asset prices) vault deficit, and to maximize the protocol’s profit.
To ensure governance continuity, Almanak also prioritizes parameters that Bancor DAO can act upon. Levers are broadly screened and filtered through domain expertise involving fundamental and qualitative analysis to isolate the most potent parameters affecting the two main metrics of interest: protocol capital efficiency and protocol deficit. Our investigation thus leads the optimization framework to focus on the following two axes:
- Maximization of the revenue of the protocol (Protocol Capital Efficiency)
- Minimization of protocol exposure to unrealized IL (Protocol Deficit)
With these axes as north stars, the following parameters were retained as subject to optimization:
- Pool swapping fees: Introduction and optimization of dynamic fees per pool to maximize protocol revenue from trading. Swapping fees are by far the protocol’s most important revenue stream and are therefore one of the main parameters to tune. The problem primarily boils down to estimating the current volatility and how much arbitrage trading “should” take place. Based on the volatility, the arbitrage share, and the recurrence of trades, swapping fees are adjusted, which in turn affects the number of incoming trades (the higher the fees, the lower that number). The swapping fee is calibrated as the sum of a base fee and a risk add-on that includes arbitrage and volatility factors, and is optimized during our simulations.
- Pool depth: Introduction and optimization of dynamic on-curve trading liquidity to minimize protocol exposure to further impermanent loss. We do this by dynamically changing the pool size, moving trading liquidity to an idle, staked-liquidity state, thereby changing the constant of the CPMM pool equation, which can be interpreted as TL. Changes are triggered when volatility crosses predefined thresholds; both the extent of the change and the threshold itself are optimized during our simulations.
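The two levers can be sketched as follows. The functional forms, weights, and thresholds below are illustrative assumptions of ours, not the report’s calibrated values; in practice both functions’ parameters are exactly what the evolutionary optimization tunes.

```python
def dynamic_fee(base_fee: float, volatility: float, arb_share: float,
                w_vol: float = 0.5, w_arb: float = 0.2) -> float:
    """Swapping fee = base fee + risk add-on (volatility and arbitrage factors).

    w_vol and w_arb are hypothetical weights subject to optimization.
    """
    risk_add_on = w_vol * volatility + w_arb * arb_share
    return base_fee + risk_add_on

def trading_liquidity(staked: float, volatility: float,
                      vol_threshold: float = 0.05, cut: float = 0.5) -> float:
    """Move part of staked liquidity off-curve when volatility crosses a threshold."""
    if volatility > vol_threshold:
        return staked * (1 - cut)  # shrink on-curve TL to limit IL exposure
    return staked                  # calm market: keep full depth on-curve
```

For example, with a 0.2% base fee, 1% volatility, and a 10% arbitrage share, the sketch yields a 2.7% swapping fee, and a volatility reading above the 5% threshold would halve the on-curve trading liquidity.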
Virtual environments are simulated to replicate real-world conditions in terms of both chain mechanisms and market conditions. For each simulated environment, we reproduce the mechanics of the participants present on-chain. To achieve this, we define multiple programmatic agents that can act upon, and receive feedback from, the simulated environments. Agents are of three types: Liquidity Providers, Traders, and Arbitrageurs.
Each agent’s actions and interactions affect the live simulated environment, which in turn affects other agents, their behavior, and their subsequent actions. An agent’s behavior in an environment is modeled after real-world market participants via a combination of machine learning, probabilistic programming, and domain expertise.
Each simulation run is therefore unique, as the combinations of agents and their parameters differ. The simulation environments are seeded by price trajectories, generated in large numbers to cover as many potential scenarios as possible and to validate the protocol lever combinations. Price trajectories can be combinations of historical prices, recombinations of prices from various time periods, or variations with black-swan events injected into historical and simulated trajectories.
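A heavily simplified skeleton of such a simulation loop might look like the following. It assumes a toy constant-product pool and three stylized agent types; every behavior and parameter here is an illustrative placeholder, not one of the report’s calibrated, data-driven agents.

```python
import random

class Pool:
    """Toy CPMM pool with a flat swapping fee."""
    def __init__(self, x: float, y: float, fee: float):
        self.x, self.y, self.fee = x, y, fee

    def swap_x_for_y(self, dx: float) -> float:
        dx_net = dx * (1 - self.fee)              # fee taken on the input side
        dy = self.y * dx_net / (self.x + dx_net)  # preserves x * y = k
        self.x += dx
        self.y -= dy
        return dy

def simulate(prices, seed=0):
    rng = random.Random(seed)
    pool = Pool(1_000.0, 1_000.0, fee=0.003)
    fees = 0.0
    for p in prices:  # one step per external price observation
        # Trader: uninformed noise flow of random size.
        size = rng.uniform(0.0, 5.0)
        pool.swap_x_for_y(size)
        fees += size * pool.fee
        # Arbitrageur: trades only when the pool price exceeds the
        # external price by more than the fee.
        if pool.y / pool.x > p * (1 + pool.fee):
            pool.swap_x_for_y(1.0)
            fees += 1.0 * pool.fee
        # Liquidity Provider: occasional proportional deposit.
        if rng.random() < 0.1:
            pool.x *= 1.01
            pool.y *= 1.01
    return fees

total_fees = simulate([1.0] * 100)
```

The point of the skeleton is the feedback loop: each agent’s trade changes the pool state that the next agent observes, which is what makes each seeded run unique.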
Validating the efficacy of the simulation is a complex task. Each simulation run produces simulation logs, agent logs, and operational metrics, and we validate these logs to ensure the system operates as intended. To measure the efficacy of our output, it undergoes the overfitting tests mentioned above. To measure the efficacy of the simulation and agent-based modeling solution, we perform walk-forward optimization and tests, which not only validate our approach but also help refine hyperparameters.
A further hyperparameter optimization treats the data input as a parameter in itself. Data-drift detection is used to ascertain the optimal amount and use of data, while walk-forward testing and data-drift detection together validate the “longevity” of each output, i.e., how long it remains strong before a new set of recommendations is required. Agents undergo a similar validation, including the use of out-of-sample datasets to determine whether their fundamental approach has been effective, in addition to a comparison of real-life execution behavior against simulated behavior using our domain expertise.
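The walk-forward idea itself is simple to sketch: optimize on a rolling in-sample window, then evaluate on the next out-of-sample window, and roll forward. The window sizes and splitting scheme below are illustrative assumptions, not the report’s actual configuration.

```python
def walk_forward(data, train_size: int, test_size: int):
    """Yield successive (train, test) windows over sequential data.

    Each train window is used for optimization; the following test
    window checks the result out-of-sample before rolling forward.
    """
    start = 0
    while start + train_size + test_size <= len(data):
        train = data[start : start + train_size]
        test = data[start + train_size : start + train_size + test_size]
        yield train, test
        start += test_size  # roll forward by one test block

# Ten sequential observations split into 4-in-sample / 2-out-of-sample windows.
splits = list(walk_forward(list(range(10)), train_size=4, test_size=2))
```

With ten observations this yields three train/test windows, each test block immediately following its training block, so no optimization ever sees its own evaluation data.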
The general tendency of the solution points to larger trading liquidity and lower swapping fees in times of medium volatility. For extreme market regimes, the solutions indicate that profitability may be optimized by maintaining either (i) very high fees and very low TL when the market faces high volatility, or (ii) very low fees and very high TL in a flat market. Overall, the favored solution at this stage improves profitability by 12% and burns an additional 15% of vBNT over time.
Follow our impact:
You can find Almanak on Twitter, LinkedIn and here, on Medium.
You can read more about who we are on our website.
For engagements & questions, please contact us via our website’s form or simply send us a DM through one of the channels. We’re always happy to chat!
The next wave of innovation in crypto will be driven by data.