FRM Part II Cheat Sheet
Cost of liquidation, LVaR

Dollar spread: s_$ = ask − bid
Proportional bid-offer spread: s_% = (ask − bid)/mid
Cost of liquidation: CoL = Σ_i (s_%,i / 2) · α_i, where α_i is the dollar size of position i.
Liquidity-Adjusted VaR: LVaR = VaR + Σ_i 0.5(µ_i + λσ_i)·α_i, where µ_i is the mean spread, σ_i the spread volatility, and λ the multiplier for the desired quantile.
Optimal liquidation strategy: minimize VaR_α + Σ_i 0.5 q_i p(q_i) over the trade sizes q_i (market risk plus liquidation cost).

Early warning indicators

The M.E.R.I.T. framework is the one for designing Early Warning Indicators (EWI):
- [M]easure: build measures that are granular, sharp, and forward-looking with respect to the balance-sheet structure.
- [E]scalation: define how measures escalate in case of problems.
- [R]eporting: define a (daily) report.
- [I]ntegrated system: an IT system good enough to support the above.
- [T]hreshold: define thresholds for each measure (green, yellow, red).

Liquidity and reserves management

We can define the "liquidity position" of a bank as: NetLiquidityPosition = SupplyOfLiquidity − DemandForLiquidity. On the two sides we can identify:
- Supply of liquidity: deposits, revenues from sales of services, customer loan repayments, sales of assets, borrowings.
- Demand for liquidity: deposit withdrawals, volume of acceptable loan requests, repayment of borrowings, operating expenses, dividend payments.
The difference is never 0: there is always a liquidity surplus or deficit. How to meet liquidity demand?
- Asset conversion: sell assets to transform them into cash.
- Borrowed liquidity: borrow cash.
The best approach is a balance between the two.
There are 4 methods used to estimate and meet liquidity needs:
- Sources and uses of funds approach: based on econometric models, you estimate the sources and uses of funds, e.g. estimatedΔInDeposits − estimatedΔInLoans gives the expected gap. Important: you work with the changes Δ. NB: source = max(0, ΔInLiabilities) − min(0, ΔInAssets), while use = min(0, ΔInLiabilities) − max(0, ΔInAssets).
- Structure of funds approach: assign each source and use of funds to a category (hot money, vulnerable funds, stable funds) with a buffer percentage.
- Liquidity indicators approach: use indicators like the cash position indicator, the liquid securities indicator, the pledged securities ratio.
- Market signals approach: more connected to governance and trust signals from the market.

Liquidity funding risk

Liquidity Coverage Ratio under Basel III: LCR = HighQualityLiquidAssets / NetCashOutflowsOver30Days.
Net Stable Funding Ratio: NSFR = AvailableAmountOfStableFunding / RequiredAmountOfStableFunding. In a CRR-like way, these two quantities are found by applying regulatory factors to each balance-sheet category.
Note that both measures are reports implying a screening of all assets/liabilities (LCR focuses more on assets, NSFR more on liabilities).

Investment function

There are two main classes of instruments banks invest in:
- Money market instruments: maturity less than one year.
- Capital market instruments: longer maturities.
There are 10 factors that influence the choice of an instrument:
1. Expected rate of return: the YTM of the investment.
2. Tax exposure: you need to consider the after-tax return, using the marginal income tax rate (the tax on the firm's income, e.g. 21% for a corporate bond, 0 for a municipal bond): BeforeTaxYield × (1 − MarginalIncomeTaxRate) = AfterTaxYield. The TEY (tax-equivalent yield) is the inverse transformation. For bank-qualified municipal bonds you can also deduct part of the interest expense of funding the purchase (usually 80%).
3. Credit risk.
4. Interest rate risk.
5. Business risk.
6. Liquidity risk.
7. Call risk.
8. Prepayment risk.
9. Inflation risk.
10. Pledging requirements: sometimes, by law, collateral or assets must be pledged to back other liabilities; these are pledging requirements.

Liquidity and leverage

Leverage: L = A/E = A/(A − D) = 1 + D/E.
ROE and leverage: ROE = L × ROA − (L − 1) × CostOfDebt, where L is leverage.
Leverage effect: ∂ROE/∂L = ROA − CostOfDebt. This says how much ROE increases for each unit of leverage you add.
Short selling has great (implicit) leverage.
Leverage on derivatives is not trivial. For linear derivatives the asset is usually the notional of the underlying, while for options it is the delta-equivalent; you record the same value on both the asset and liability side (commitment approach).
An investor prefers to use leverage on volatile securities (leverage is a risk that makes more sense with risky assets).
Another way to say "asset liquidity risk and liability liquidity risk" is "transaction liquidity risk and funding liquidity risk".
Liquidity-Adjusted VaR over a liquidation horizon: if you liquidate a position uniformly over T days, you are exposed to a loss VaR(1-day) × sqrt((1 + T)(1 + 2T)/(6T)). Then you add the worst-case spread cost.
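As a numerical check of the spread-cost LVaR and the liquidation-horizon adjustment above, a minimal sketch (position values, spreads and quantile multipliers are made up for illustration):

```python
import math

def lvar_exogenous(var, positions):
    """LVaR = VaR + sum_i 0.5*(mu_i + lam*sigma_i)*V_i.

    positions: list of (value V_i, mean spread mu_i,
    spread volatility sigma_i, spread quantile multiplier lam)."""
    liquidity_cost = sum(0.5 * (mu + lam * sigma) * v
                         for v, mu, sigma, lam in positions)
    return var + liquidity_cost

def horizon_adjusted_var(var_1day, T):
    """Scale 1-day VaR for a uniform liquidation over T days:
    VaR * sqrt((1+T)(1+2T)/(6T))."""
    return var_1day * math.sqrt((1 + T) * (1 + 2 * T) / (6 * T))

# Hypothetical: 1M position, 1% mean spread, 0.5% spread vol, 95% quantile
print(lvar_exogenous(50_000, [(1_000_000, 0.01, 0.005, 1.645)]))
print(horizon_adjusted_var(50_000, 5))
```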
Monitoring liquidity

A cash flow can be deterministic or stochastic in both time and amount. A bank must deal with both funding risk (funding its activities becomes too expensive) and liquidity risk (inability to repay obligations). Two concepts: the Term Structure of Expected Cumulated Cash Flows (TSECCF) is what the bank expects to receive and pay; the Liquidity Generation Capacity (LGC) is the ability of the bank to generate liquidity when needed. The term structure of expected liquidity is the sum of the two (TSL = TSECCF + LGC), and the TSL must stay positive. These curves help identify liquidity gaps or stress liquidity. Indeed, once the cash flows are defined, one can define the Cash Flow at Risk (CFaR), i.e. the stress of the TSECCF at a certain confidence level. From here you can also build term structures of unexpected positive and negative cash flows (the α and 1 − α quantiles of CFaR). The term structure of expected cash flows is just the ordered vector of all expected cash flows. There is also the TSAA (term structure of available assets).

Liquidity stress testing, reporting and contingency plan

Funding liquidity supports: operational liquidity (day-to-day operations), contingent liquidity (the buffer one holds in case of stress), strategic liquidity (long-term projects), restricted liquidity (e.g. collateralization). An important purpose of a stress test is to understand how much contingent liquidity is needed. The point is to design a model for stressing liquidity to determine the correct buffer: starting from the present Liquid Asset Buffer (LAB), one computes stressed inflows and outflows, obtaining a stressed LAB.
When reporting a liquidity stress test, the main reports are the cash-flow survival horizon and the maturity gap report. Both have time on the x-axis and cash flows on the y-axis, but one represents the total cumulated cash flow, the other the cash flows coming from specific instruments.
The so-called CFP (contingency funding plan) is a set of procedures/policies that an institution builds to avoid problems during stressed liquidity periods. It involves governance, business lines, and monitoring.

Failure mechanics of dealer banks

The main tasks of a dealer bank are: securities dealing, underwriting, trading, prime brokerage, and OTC derivatives (all sell-side operations). One of the main risks is in the financing, for instance the flight of short-term creditors or of prime-brokerage clients. Ideas for mitigation: laddering maturities, diversifying lines of credit, creating cash buffers.

Illiquid assets

There exists an illiquidity risk premium. Beware of selection bias (overestimates alpha and underestimates beta), survivorship bias, and infrequent sampling (smoothed returns); filter the time series with an AR model to unsmooth it.

Deposit and non-deposit funding

The key concept here is that Basel III regulation gives more importance to deposits as a source of stable funding. Indeed, non-deposit funding is more volatile because of rating degradation or default, while deposits, even though they can be withdrawn, carry far less bank-run risk. The Net Stable Funding Ratio captures this.

Liquidity transfer pricing

Ideally a bank should charge funding costs to its business lines. How are they charged? As a performance correction on the business. There are 3 approaches: zero cost of funding (no spread is charged for funding), average pool cost of funding (take the average expense across funding sources and charge it), and matched-maturity cost of funding (you do something similar to a swap term structure). The last one is the best. Remember this formula for contingent liquidity pricing: charge = (limit − drawn)/limit × likelihoodOfDrawdown × bpsOfTermLiquidity.
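The contingent liquidity pricing formula above can be sketched numerically (limit, drawn amount, drawdown likelihood and term-liquidity spread are illustrative):

```python
def contingent_liquidity_charge(limit, drawn, p_draw, term_liq_bps):
    """Charge for the undrawn part of a committed line:
    (limit - drawn)/limit * likelihoodOfDrawdown * bpsOfTermLiquidity."""
    undrawn_fraction = (limit - drawn) / limit
    return undrawn_fraction * p_draw * term_liq_bps

# Hypothetical: 100M line, 40M drawn, 30% drawdown likelihood, 50 bps
print(contingent_liquidity_charge(100e6, 40e6, 0.30, 50))  # ~9 bps
```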
Credit Risk Management

Analyzing banking risk

Having a lending policy is really important for a bank. It must contain: types of loans, exposure limits (on outstandings), concentration limits, set maturities, oversight of impairments, collateral requirements. As internal classification of credit grades in banking, we can distinguish 5 categories: standard or passed (no problem), mentioned or watched (some overdue payment, or the borrower's situation changed), substandard (risk of default), doubtful (likely to default), and loss (defaulted or non-performing). Non-performing loans (NPL) are an important category in the credit world. To prevent losses, banks set loan-loss provisions (estimating the expected loss) and reserves (set aside from retained earnings). Under IFRS there are 3 stages of provisioning a bank needs to put aside (impairment model): for performing loans you just estimate the expected loss over 12 months; for underperforming loans, the lifetime expected loss; for non-performing assets, the lifetime ECL plus the impact on interest revenues. The board must challenge the loan situation. This is quite different from Basel, which tries to estimate the unexpected loss.
A famous metric is RAROC = loanRevenues / CapitalAtRisk. Remember: Capital at Risk = Unexpected Loss.

Assessing credit risk

When assessing credit risk, we need to identify a Probability of Default (PD). There are 3 broad approaches:
- Data-driven (empirical models): the most famous is the Altman Z-score, but nowadays there are many; machine learning falls within this class. Retail banking in the end falls in this category (assess credit quality from a few regressors like salary; see FICO).
- Structural models: like Merton, Moody's KMV, CreditMetrics. The idea is to infer the PD from the internal health of a company (for instance assets vs liabilities; endogenous).
- Reduced-form models: like CreditRisk+. The idea is to connect the PD to exogenous factors, like statistics on the same sector; you see the default event through the market.
Merton: equity is a call option with underlying equal to the firm's assets and strike equal to its liabilities; the probability of default is the probability that this call expires unexercised (assets below liabilities). You have to work with real-world probabilities (µ as drift).
CreditRisk+: calibrates the default events and credit losses with a Poisson distribution; sometimes the hazard rate is itself stochastic.
CreditMetrics: uses a (Markov) transition matrix to calculate a VaR on the portfolio. It gives an estimate of the credit loss distribution and accounts for credit migration.
Altman Z-score: the regressors are X1 = working capital/total assets, X2 = retained earnings/total assets, X3 = EBIT/total assets, X4 = equity/liabilities, X5 = sales/total assets. If the outcome is greater than 3, default is unlikely; 2.7–3 is "on alert"; 1.8–2.7 means a good chance of default; below 1.8 default is very likely.

Capital structure in bank

A bank's capital is usually divided into: book capital (actual Tier 1 + Tier 2), regulatory capital (the minimum book capital imposed by the regulator to maintain solvency), and economic capital (also known as risk capital: the estimate of the unexpected loss of the credit portfolio). How to estimate it all? 1) First of all, ExpectedLoss = PD × EAD × LGD. This comes from the idea that the loss follows a binomial distribution. If that is the case, the distribution has a volatility that defines the UL: UnexpectedLoss = sqrt(PD(1 − PD)) × EAD × LGD. Given a capital multiplier CM we can define the economic capital at a certain confidence level: EconomicCapital = UL × CM.
Knowing the UL of a single loan, if a portfolio contains n similar loans with default correlation ρ, then we can estimate: UL_portfolio = UL_i × sqrt(n + ρ(n² − n)), and the unexpected loss contribution of each loan is (approximately, for large n) ULC_i = UL_i × sqrt(ρ). NB: the unexpected loss contribution differs from the UL of the single loan when there is correlation. The former relation holds only when all loans are equal; otherwise ULC_i = (Σ_j UL_j ρ_ij / UL_p) × UL_i. To find the economic capital for a position, multiply the ULC by the capital multiplier CM; likewise for the whole portfolio.
Unexpected loss in general: UL = EAD × sqrt(PD × σ²_LGD + LGD² × σ²_PD), where σ_PD is usually estimated assuming a binomial distribution of default: σ_PD = sqrt(PD(1 − PD)).
When thinking about economic capital, consider the structure of the debt in terms of repayment priority in case of default: first of all depositors, then senior debt holders, then junior debt holders, then equity owners. So, usually, the threshold of economic capital sits between junior and senior debt, meaning the economic capital is also the loss at which senior debt holders start to bear losses. So TotalAssets − (Depositors + SeniorDebt) = Junior + Equity is the economic capital; i.e. (Junior + Equity)/TotalAssets represents the percentage loss that must be exceeded only in, say, 1% of cases (given a certain distribution of asset returns).
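The EL/UL chain above can be checked with a small sketch (PD, EAD, LGD, correlation and the capital multiplier are illustrative numbers):

```python
import math

def expected_loss(pd, ead, lgd):
    """EL = PD * EAD * LGD."""
    return pd * ead * lgd

def unexpected_loss(pd, ead, lgd, sigma_lgd=0.0):
    """UL = EAD * sqrt(PD*sigma_LGD^2 + LGD^2*sigma_PD^2),
    with sigma_PD^2 = PD*(1-PD) from the binomial assumption."""
    var_pd = pd * (1 - pd)
    return ead * math.sqrt(pd * sigma_lgd**2 + lgd**2 * var_pd)

def portfolio_ul(ul_single, n, rho):
    """n identical loans with default correlation rho:
    UL_p = UL_i * sqrt(n + rho*(n^2 - n))."""
    return ul_single * math.sqrt(n + rho * (n**2 - n))

# Hypothetical loan: PD = 2%, EAD = 1M, LGD = 40% (sigma_LGD = 0)
el = expected_loss(0.02, 1_000_000, 0.40)    # 8,000
ul = unexpected_loss(0.02, 1_000_000, 0.40)  # 56,000
econ_cap = ul * 6                            # capital multiplier CM = 6
print(el, ul, econ_cap, portfolio_ul(ul, 100, 0.15))
```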
Estimate and mitigate credit exposure

Under Basel II there is the so-called IRB approach. For estimating the PD (and therefore regulatory capital) of a position you can employ a single-factor model (Vasicek, used in IRB): you take a market factor m and define the distance to default as k(i) − βm, where k(i) = N⁻¹(PD). You divide it by σ = sqrt(1 − β²) and get the Z-score; the conditional PD = N(Z). Remember: if 2 credits follow the single-market-factor model, the correlation between them is β_i β_j. In the context of copulas we get something similar to the above concept (marginal probabilities of the market and idiosyncratic returns, linked through a correlation and united by the copula).
Structured products are often used to mitigate credit exposures. These are often quite complex credit-risk products, for instance covered bonds, CLOs, CDOs. They usually have an important correlation structure, and the main idea is to use loans as collateral for a debt issuance. They are interesting for transferring credit risk.
CLO: a structured product with tranching. Important: you divide your pool of loans in an overcollateralized way; the equity tranche takes the overcollateral. You also define a cushion, a "diversion" threshold, above which the equity tranche starts to be paid; otherwise cash goes directly to the overcollateral account. This is OC(T) + R(T), where R is the recovery at time T. There is a waterfall of payments, following tests that should be run by the trustee/custodian. To price this instrument properly you should account for the correlations between loans; it is not simple to estimate losses in these products as you do not know the correlation.
NB: Credit VaR means quantileOfLoss − ExpectedLoss.
Default correlation under the single-factor model: given π_{1,2} the joint probability of default and p_1, p_2 the single PDs, one can get the correlation as:
ρ = (π_{1,2} − p_1 p_2) / (sqrt(p_1(1 − p_1)) × sqrt(p_2(1 − p_2))).

xVA, credit metrics

General correction to the market value: adjustedValue = value − CVA + DVA − FVA − KVA − MVA. CVA: credit value adjustment, comes from the possibility that the counterparty defaults. DVA: debit value adjustment, comes from the possibility that you yourself default. FVA: funding value adjustment, comes from the need for funding. KVA: capital value adjustment, comes from the need for regulatory capital. MVA: margin value adjustment, comes from the need to post margin.
The exposure: there exist the Expected Exposure EE = E[max(v, 0)], the Expected Positive Exposure EPE = average of EE over the life = (1/T) Σ_i EE(i), the Effective Expected Positive Exposure EEPE (take only the increasing profile of EE, i.e. its running maximum, and compute the EPE of that), and the Potential Future Exposure PFE = the α% quantile of max(futureValue, 0). PFE and EE are really important for xVA, as they are used for computing CVA and DVA (together with the PD). In the end CVA = LGD × EPE × PD, even if the real formulas are a bit more sophisticated. The only trick here is to plug in the correct LGD and PD. With two counterparties, say A and B: if A has to compute the CVA, she needs LGD_B, PD_B and her own exposure profile EPE_A. This is also called UCVA as it doesn't consider the possibility of a default of A. The BCVA, instead, also considers the default of A, so BCVA = CVA + DVA. In a bilateral CVA: CVA_A = LGD_B × EPE_A × PD_B × (1 − PD_A) and DVA_A = LGD_A × ENE_A × PD_A × (1 − PD_B) (the latter is similar to the CVA of B).
NB: you don't have credit exposure if you sell an option (you received the premium upfront).
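The simplified bilateral CVA/DVA formulas above can be sketched directly (LGDs, PDs and exposure profiles are hypothetical inputs):

```python
def ucva(lgd_cpty, epe_self, pd_cpty):
    """Unilateral CVA: CVA_A = LGD_B * EPE_A * PD_B."""
    return lgd_cpty * epe_self * pd_cpty

def bcva(lgd_b, epe_a, pd_b, lgd_a, ene_a, pd_a):
    """Bilateral adjustments with survival weighting:
    CVA_A = LGD_B*EPE_A*PD_B*(1-PD_A), DVA_A = LGD_A*ENE_A*PD_A*(1-PD_B)."""
    cva = lgd_b * epe_a * pd_b * (1 - pd_a)
    dva = lgd_a * ene_a * pd_a * (1 - pd_b)
    return cva, dva

# Hypothetical parameters for counterparty A facing B
cva, dva = bcva(lgd_b=0.6, epe_a=2e6, pd_b=0.03,
                lgd_a=0.6, ene_a=1.5e6, pd_a=0.01)
print(cva, dva, cva - dva)  # BCVA charge = CVA - DVA
```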
Market Risk management

VaR

Historical VaR: non-parametric; you take the historical realizations of returns and order them. Filtered historical simulation: here the idea is to account for heteroskedasticity of returns. The idea is that R_t = σ_t ε_t, meaning that sigma is not constant. So you need a GARCH (σ²_t = ω + αR²_{t−1} + βσ²_{t−1}), an IGARCH (like EWMA: σ²_t = αR²_{t−1} + (1 − α)σ²_{t−1}), or more often an AGARCH (σ²_t = ω + α(R_{t−1} − γ)² + βσ²_{t−1}) to estimate the local volatility. Then you divide each actual return by its local volatility and multiply back by the current volatility: R_t^filtered = σ_T × R_t/σ_t. This way you get a different quantile.
Coherent risk measures: a generalized (spectral) risk measure is defined as M_φ = ∫₀¹ q_p φ(p) dp, where q_p is the p-quantile and φ(p) is the risk-aversion function; φ must be weakly increasing to have a coherent risk measure. A coherent risk measure is a measure that satisfies: monotonicity (for X ≥ Y, ρ(X) ≤ ρ(Y): if the expected value of a portfolio is greater, we expect its VaR to be smaller); positive homogeneity (ρ(λX) = λρ(X): if you scale the market value, VaR scales accordingly); translation invariance (ρ(X + C) = ρ(X) − C: adding cash decreases VaR); subadditivity (ρ(X + Y) ≤ ρ(X) + ρ(Y)).
For normal and lognormal value at risk you need to distinguish the following dynamics. Normal: VaR_T = −µT + σ√T ε_α, where ε_α is the standard normal quantile (e.g. 1.645 at 95%). Lognormal: VaR_T = 1 − exp[(µ − 0.5σ²)T − σ√T ε_α]; again ε_α gives the confidence level.
On a QQ-plot (empirical quantiles on the x-axis) just remember fat vs thin tails: on the right the points bend respectively below (fat) or above (thin) the line; on the left it is the opposite. To have both tails fat, or both thin, you need an S shape.

Extreme Value theory

The idea of extreme value theory is to give a parametric estimate of the tail only. Under the peaks-over-threshold (POT) approach, you specify a threshold u and approximate the exceedance distribution P(X − u ≤ x | X > u) = (F(x + u) − F(u))/(1 − F(u)) with a generalized Pareto distribution:
G_{ξ,β}(x) = 1 − (1 + ξx/β)^(−1/ξ) if ξ ≠ 0; G_{ξ,β}(x) = 1 − exp(−x/β) if ξ = 0.
The idea of POT is to take the observations beyond the threshold and fit this distribution. Putting together the formulas for P and for G and isolating F(x), one can get a formula for the VaR given the other parameters (using N_u/n as the estimate of 1 − F(u)):
VaR = u + (β/ξ) × {[(n/N_u)(1 − α)]^(−ξ) − 1}, and ES = VaR/(1 − ξ) + (β − ξu)/(1 − ξ).
Another approach is block maxima, which takes only the maximum value per block. The maxima have a known limiting distribution, the GEV (generalized extreme value):
H_{ξ,µ,σ}(z) = exp(−(1 + ξz)^(−1/ξ)) if ξ ≠ 0; exp(−e^(−z)) if ξ = 0, where z = (x − µ)/σ.
In risk management, multivariate extreme value distributions and copulas (e.g. Gumbel, Fréchet) are also often used.

Backtesting

All backtests are based on the binomial distribution. The idea is that, given a certain confidence level, breaches of the VaR should follow a binomial distribution: f(x) = C(T, x) p^x (1 − p)^(T−x). With the central limit theorem one can use z = (x − pT)/sqrt(p(1 − p)T), which follows a N(0, 1). We can define thresholds of z (each threshold will define Type I and Type II errors). This kind of backtest is called unconditional coverage, as it does not account for clustering of exceedances. Other tests account for conditional coverage: Christoffersen's is one of them, which also tests independence between exceedances. The Basel Committee uses an unconditional coverage test called the traffic-light approach.
Type I error: rejecting a model that is true. Type II error: accepting a model that is wrong. If you shrink the acceptance region, you are increasing the power of the test (power is 1 − the Type II error probability).
The Kupiec (POF) test can be seen as an extension of the z formula above. The test statistic
LR_POF = −2 ln[ (1 − p)^(T−x) p^x / ((1 − x/T)^(T−x) (x/T)^x) ]
follows a χ² with one degree of freedom.
Christoffersen extends this test by adding an independence test. The idea is a multinomial count of transitions T_00, T_01, T_10, T_11 (no-exceedance/exceedance days followed by no-exceedance/exceedance days). The independence statistic
LR_ind = −2 ln[ (1 − π)^(T00+T10) π^(T01+T11) / ((1 − π_0)^(T00) π_0^(T01) (1 − π_1)^(T10) π_1^(T11)) ]
follows a χ²₁; the null hypothesis is that the exceedance probability does not depend on the previous day, i.e. π_0 = π_1 (= π).
A challenging one is the so-called Probability Integral Transform (PIT) test. It is a test for the entire distribution. Ideally, if you map each return you see in the time series back to its model CDF value, you should get a uniform distribution over a long horizon. If your model is conservative you get a centered peak; if it is aggressive you see a lot of mass near 0 and 1.
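The Kupiec POF statistic above is easy to compute directly; a minimal sketch with an illustrative backtest sample (250 days, 7 breaches of a 99% VaR):

```python
import math

def kupiec_pof(T, x, p):
    """Kupiec proportion-of-failures LR statistic, ~ chi2(1) under H0.
    LR = -2 * ln[ (1-p)^(T-x) p^x / ((1-x/T)^(T-x) (x/T)^x) ]."""
    phat = x / T
    log_null = (T - x) * math.log(1 - p) + x * math.log(p)
    log_alt = (T - x) * math.log(1 - phat) + x * math.log(phat)
    return -2.0 * (log_null - log_alt)

lr = kupiec_pof(250, 7, 0.01)
print(lr, lr > 3.84)  # reject at 5% if LR exceeds the chi2(1) 95% quantile (~3.84)
```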
Stochastic interest rate models

Important concepts in stochastic interest rate modelling are mean reversion and half-life. In the drift a(b − r_t)dt, r is a mean-reverting (Ornstein-Uhlenbeck) process, a is the speed of the reversion, and b the long-term mean. The half-life is defined as T_{1/2} = ln(2)/a. It represents the persistence of shocks (faster mean reversion dampens the volatility of long-term rates). In an Ornstein-Uhlenbeck process (Vasicek, Gauss+, and so on) the expected rate decays towards its long-term level: E[R_T] = b + (R_0 − b)e^{−aT}. If we regress the 3-year rate on the 2-year rate, the slope estimates e^{−aT}, since R_{3y} − b ≈ (R_{2y} − b)e^{−aT}. In Vasicek, the expected rate after t years is r_0 e^{−kt} + θ(1 − e^{−kt}). The Gauss+ model captures the hump-shaped volatility of rates, while Vasicek has a monotonically decreasing volatility. Remember that this modelling tries to go beyond pure expectations theory, which works only with short-to-middle maturities.
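The Vasicek expected-rate formula and the half-life can be sketched together (r0, θ and k are made-up parameters); at t equal to the half-life, the gap to the long-term mean halves:

```python
import math

def vasicek_expected_rate(r0, k, theta, t):
    """E[r_t] = r0 * exp(-k*t) + theta * (1 - exp(-k*t))."""
    decay = math.exp(-k * t)
    return r0 * decay + theta * (1 - decay)

def half_life(k):
    """Time for the gap to the long-term mean to halve: ln(2)/k."""
    return math.log(2) / k

# Hypothetical: r0 = 5%, long-term mean theta = 3%, speed k = 0.2
t_half = half_life(0.2)
print(t_half, vasicek_expected_rate(0.05, 0.2, 0.03, t_half))  # rate ~ 4%
```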
Fixed income and term structure modelling

Yield-based hedging (based on DV01) is not always the best choice, as it doesn't capture historical correlation (it makes only price-related assumptions). So often a correction on the hedge ratio, based on the beta of the regression of the hedged yield on the hedging yield, is applied: h = (DV01_hedged / DV01_hedging) × β_{hedged,hedging}.
When you want to price a call on a bond you can use trees.
The implied drift of rates in a risk-neutral world is given by: Δr_up p_up + Δr_down p_down.
The general expected return depends on: E[dp/p] = f(t)dt − D E[dr] + ½ C σ² dt (carry, duration and convexity terms).
For a risk-averse investor the expected return contains a term λ that already embeds convexity and represents a risk premium: E[dp/p] = r_0 dt + λD dt.
Convexity adjustment: in general it is a correction due to the stochastic nature of the underlying. In the case of a bond, if you are asked to compute it, ideally you first construct the price under pure expectations (for instance, using the expected rate) and then the price using a tree in which each scenario has a certain probability of occurring; subtracting the two prices you get the adjustment.
In case a rate must be recovered from a price, remember: per-period rate = (1/price)^{1/(f·y)} − 1, with f compounding periods per year and maturity y years.

Interest rate model primer

Model 1: dr = σ dW
Model 2: dr = λ dt + σ dW
Ho-Lee: dr = λ(t) dt + σ dW
Vasicek: dr = k(θ − r) dt + σ dW
Model 3: dr = λ(t) dt + σ(t) dW
Model 3 (special case): dr = λ(t) dt + σe^{−αt} dW
CIR (Cox-Ingersoll-Ross): dr = k(θ − r) dt + σ√r dW
Lognormal (Model 4): dr = ar dt + σr dW
Lognormal: d[ln(r)] = a(t) dt + σ dW
Lognormal (Black-Karasinski): d[ln(r)] = k(t)[ln θ(t) − ln r] dt + σ(t) dW
Tuckman explains in depth how to get spot rates from a model of short rates. The key ideas here are that the drift captures the difference in probability between an upward and a downward movement, and that volatility lowers the term structure of spot rates (convexity adjustment).
The Gauss+ model includes three factors, each modeled as an Ornstein-Uhlenbeck process:
dr = −a_r (r − m) dt (short-term)
dm = −a_m (m − l) dt + σ_m (ρ dW_1 + sqrt(1 − ρ²) dW_2) (medium-term)
dl = −a_l (l − µ) dt + σ_l dW_1 (long-term)
It is used because it fits well the dynamics and the shape of the term structure, and also of the volatility of the term structure. It is used in proprietary trading strategies and in market-maker valuation.

Volatility smile

The main info to know: the smile captures the higher weight of the tails. For instance, the smile of FX options rises at high and low strikes because jumps in FX are more probable than those implied by lognormal probability. This is reflected in a higher price for deep out-of-the-money or deep in-the-money options. For equity, the tail that is more probable than the lognormal one is the left tail (large losses or down shocks). This results in a smirk for equity options, i.e. higher prices for in-the-money calls and out-of-the-money puts.
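The convexity-adjustment recipe above can be checked with a toy two-scenario example (rates and probabilities are made up): by Jensen's inequality, the expected price across scenarios exceeds the price at the expected rate, and the difference is the adjustment.

```python
def zero_price(rate, t):
    """Price of a unit zero-coupon bond, annual compounding."""
    return 1.0 / (1.0 + rate) ** t

# Hypothetical: the 1y rate is 4% or 6% with probability 1/2 each
scenarios = [(0.04, 0.5), (0.06, 0.5)]
expected_rate = sum(r * p for r, p in scenarios)           # 5%
price_at_expected = zero_price(expected_rate, 1)           # pure expectations
expected_price = sum(zero_price(r, 1) * p for r, p in scenarios)
convexity_adj = expected_price - price_at_expected         # > 0 by Jensen
print(price_at_expected, expected_price, convexity_adj)
```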
… 2) Standardised approach: the idea is to assign different weights to the (8) business lines, each again using as notional the average gross income over the past three years: CapitalRequirement = Σ_i GI_i · β_i, where the betas are given by regulation. 3) Then there is the Advanced Measurement Approach (AMA), which calibrates a Poisson distribution for incident frequency and a Weibull for severity, and then runs a Monte Carlo.
… in which the netting is now taken into account. The NRR is the ratio between the exposure with netting and the exposure without netting.
Further, the Basel I review introduced a capital requirement for assets in the trading book, with a standardized approach and an internal models approach. The Basel I review also added a Tier 3 capital. In 1999 the revision towards Basel II started, lasting up to 2004. The idea was to implement the other 2 pillars and to include operational risk.
… (a breach requires presenting a plan for recovery) and the minimum capital requirement (MCR, an absolute level not to be breached).
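The standardised-approach capital charge above is a simple weighted sum; a sketch with illustrative gross-income figures (the betas shown are the regulatory business-line values, e.g. retail banking 12%, commercial banking 15%, trading & sales 18%):

```python
def standardised_capital(lines):
    """Basel II standardised approach for operational risk:
    K = sum_i GI_i * beta_i over the business lines, where GI_i is
    the 3-year average gross income of line i."""
    return sum(gi * beta for gi, beta in lines)

# Illustrative 3-year average gross incomes with regulatory betas
lines = [(300e6, 0.12),  # retail banking
         (200e6, 0.15),  # commercial banking
         (100e6, 0.18)]  # trading & sales
print(standardised_capital(lines))  # ~84,000,000
```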