I'm trying to implement the SABR volatility model using the sdeint
package. The model consists of two equations:
dx1 = 0*x1*dt + x2*x1^b*dWt
dx2 = 0*x2*dt + a*x2*dZt
I tried to adapt the second example from the official documentation, but I ran into the following error:
SDEValueError: y0 has length 2. So G must either be a single function
returning a matrix of shape (2, m), or else a list of m separate
functions each returning a column of G, with shape (2,)
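If I read the error correctly, G is supposed to return a 2-D array of shape (2, m), one column per Wiener process, rather than a 1-D vector. Roughly like this placeholder (the coefficients here are made up, not the SABR ones):

import numpy as np

def G_shape_example(x, t):
    # one row per state variable, one column per independent Wiener process
    return np.array([[0.1 * x[0], 0.0       ],
                     [0.0,        0.2 * x[1]]])   # shape (2, 2)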
My code:
import numpy as np
import sdeint

A = np.array([[0, 0],
              [0, 0]])
a = 1
b = 0.5
B = np.diag([1, a]).T

tspan = np.linspace(0.0, 10.0, 10001)
x0 = np.array([3.0, 3.0])

def f(x, t):
    return A.dot(x)

def G(x, t):
    BB = np.array([x[1]*x[0]**b, x[1]])
    print(BB)
    return B.dot(BB)

result = sdeint.itoint(f, G, x0, tspan)
The problem is that your G returns B.dot(BB), which is a 1-D array of shape (2,), while sdeint needs the full diffusion matrix. With two independent noise sources it should be a square diagonal matrix in your case:
def G(x, t):
    # diagonal diffusion matrix: x2*x1^b drives dW_t, a*x2 drives dZ_t
    BB = np.diag([x[1]*x[0]**b, a*x[1]])
    return BB
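Putting it together, a minimal end-to-end sketch (assuming dW_t and dZ_t are independent, which is what a diagonal diffusion matrix encodes; a correlated SABR would need off-diagonal terms):

import numpy as np
import sdeint

a = 1.0   # vol-of-vol
b = 0.5   # CEV exponent beta
A = np.zeros((2, 2))   # zero drift in both equations

tspan = np.linspace(0.0, 10.0, 10001)
x0 = np.array([3.0, 3.0])   # initial forward x1 and volatility x2

def f(x, t):
    # drift term: zero for both components
    return A.dot(x)

def G(x, t):
    # diffusion matrix: x2*x1^b drives dW_t, a*x2 drives dZ_t
    return np.diag([x[1] * x[0]**b, a * x[1]])

result = sdeint.itoint(f, G, x0, tspan)   # shape (len(tspan), 2)

One caveat: with b = 0.5 the first component can wander below zero on a long path, at which point x[0]**b produces NaNs; clipping, e.g. max(x[0], 0.0)**b, is a common workaround if that happens.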