Add stricter gradient check for log marginal likelihood in Gaussian Processes #31543
Conversation
Somehow, some commits from other branches got mixed up in this PR, hence the weird number of commit messages (the files should be restored now).
Are we going to go with `scipy.differentiate` as suggested by @lorentzenchr?
@conradstevens yes, your initial `manual_grad` implementation is similar to …
First step in fixing the error in the log marginal likelihood gradient calculations in Gaussian Processes, as noticed in #31366. Also related to #31289.
What does this implement/fix? Explain your changes.
Implements a stricter test for `test_lml_gradient` by replacing the manual gradient calculation using `scipy.optimize.approx_fprime` with `scipy.differentiate.derivative`, and by calculating the gradient over several different `length_scale` values.

@conradstevens and @lorentzenchr, any suggestions are welcome (cc @ogrisel and @antoinebaker).
TO DO (perhaps within this PR or separately):

- Fix the log marginal likelihood gradient with respect to `theta` under `GaussianProcessRegressor`
- Fix the gradient calculation (the term `b`) in `_BinaryGaussianProcessClassifierLaplace`