Motivation and description

I want to use an ODE solver in a PyTorch ML stack.

It's pretty easy to write a Julia function that takes the initial conditions of a differential equation, uses DifferentialEquations.jl to solve it, and returns the solution. That Julia function will be automatically differentiable, and I'd like to be able to convert it into a PyTorch-compatible object (with gradient information preserved).
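As a concrete illustration (a minimal sketch, not from the issue), such a function might be defined from Python via juliacall; the name solve_decay is made up, and this assumes DifferentialEquations.jl is installed in the active Julia environment:

from juliacall import Main as jl

jl.seval("using DifferentialEquations")
# Hypothetical example: exponential decay du/dt = -u on t in (0, 1),
# returning the state at the final time.
solve_decay = jl.seval("""
function solve_decay(u0)
    prob = ODEProblem((u, p, t) -> -u, u0, (0.0, 1.0))
    sol = solve(prob, Tsit5())
    return sol.u[end]
end
""")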
Possible Implementation
######################## Package #########################
from juliacall import Main as jl
import numpy as np
import torch
from torch.autograd import Function, gradcheck

loss = jl.seval("loss(f, grad) = x -> (sum(pyconvert(Array, f(x)) .* grad))")
try:
    gradient = jl.seval("using ForwardDiff: gradient; gradient")
except:
    jl.seval("import Pkg; Pkg.add(\"ForwardDiff\")")
    gradient = jl.seval("using ForwardDiff: gradient; gradient")


class CallJuliaFunction(Function):
    @staticmethod
    def forward(ctx, f, x):
        ctx.f = f
        ctx.save_for_backward(x)
        np_x = x.detach().numpy()
        jl_res = f(np_x)
        np_res = np.array(jl_res)
        torch_res = torch.from_numpy(np_res)
        return torch_res

    @staticmethod
    def backward(ctx, grad_output):
        f = ctx.f
        x, = ctx.saved_tensors
        np_x = x.detach().numpy()
        np_grad_output = grad_output.detach().numpy()
        ls = loss(f, np_grad_output)
        jl_grad = gradient(ls, np_x)
        np_grad = np.array(jl_grad)
        torch_grad = torch.from_numpy(np_grad)
        return None, torch_grad


######################## Tests ##########################
x = torch.randn(3, 3, dtype=torch.double, requires_grad=True)
f = jl.seval("f(x) = 2 .* x")
f2 = lambda x: f(x)  # hack to work around https://github.com/JuliaPy/PythonCall.jl/issues/390

# Use it by calling the apply method:
print(x)
output = CallJuliaFunction.apply(f, x)
print(output)
output = CallJuliaFunction.apply(f2, x)
print(output)

# gradcheck takes a tuple of tensors as input, checks if your gradient
# evaluated with these tensors is close enough to numerical
# approximations, and returns True if they all verify this condition.
input = (f2, torch.randn(3, 3, dtype=torch.double, requires_grad=True),)
test = gradcheck(CallJuliaFunction.apply, input, eps=1e-6, atol=1e-4)
print(test)
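To connect this back to the motivation, here is a hypothetical follow-up sketch (not part of the proposal above) of how such a custom Function could be dropped into a PyTorch model as a torch.nn.Module; JuliaLayer is a made-up name:

import torch

class JuliaLayer(torch.nn.Module):
    """Wraps a (differentiable) Julia function as a PyTorch layer."""
    def __init__(self, julia_fn):
        super().__init__()
        self.julia_fn = julia_fn

    def forward(self, x):
        return CallJuliaFunction.apply(self.julia_fn, x)

# e.g. layer = JuliaLayer(f2); y = layer(x); y.sum().backward()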
This feels almost like a reverse https://github.com/rejuvyesh/PyCallChainRules.jl to me. Torch.jl may not be the best place for it, since that library brings a lot of unnecessary baggage that one wouldn't need if one is already using PyTorch. That said, I'm sure people could get some use out of it being in a package.