Scoping of "front-end" work #262

Open
st-- opened this issue Jan 20, 2022 · 6 comments

st-- (Member) commented Jan 20, 2022

We were discussing having a concerted effort at giving us a nicer frontend for fitting GPs. Before we sit down and do that, let's discuss: what should be in scope? What should be out of scope? Who should be the target audience?

willtebbutt (Member) commented

For "phase 1" of this stuff, the kind of thing I'm imagining is providing simple wrapper code around the stuff that we use day-to-day. In particular, I still want to have complete freedom to define a kernel in whichever way I fancy using ParameterHandling.

In the first instance, I'm imagining maybe a 1-function API:

fit(build_kernel_or_gp, initial_parameters, x::AbstractVector, y::AbstractVector{<:Real})

which outputs a PosteriorGP. build_kernel_or_gp would need to map from the initial_parameters to either a valid Kernel or AbstractGP, and initial_parameters would need to be ParameterHandling.jl-friendly.
(There are probably some kwargs we would want to permit, to customise e.g. the mean function or the optimiser.)
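
For concreteness, here's roughly what I have in mind on the user side (a rough sketch only; the names and the exact ParameterHandling usage are illustrative, not a settled design):

using AbstractGPs, KernelFunctions, ParameterHandling

# ParameterHandling.jl-friendly initial parameters, with positivity constraints.
initial_parameters = (
    variance = positive(1.0),
    lengthscale = positive(2.5),
)

# Map the (constrained) parameter values to a Kernel.
build_kernel_or_gp(θ) = θ.variance * with_lengthscale(SEKernel(), θ.lengthscale)

# The proposed entry point (not implemented yet):
# posterior_gp = fit(build_kernel_or_gp, initial_parameters, x, y)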

So the kind of user I'm imagining is someone who would be happy specifying build_kernel and initial_parameters.

We could of course add reasonable defaults for these, such as an SEKernel with learnable variance and lengthscale.

You can imagine other methods of fit which would work with pseudo-point approximations and non-Gaussian likelihoods.
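
For example (purely hypothetical signatures, just to illustrate the shape such methods could take):

fit(build_kernel_or_gp, initial_parameters, x, y; inducing_inputs=z)  # pseudo-point approximation
fit(build_kernel_or_gp, initial_parameters, x, y; likelihood=BernoulliLikelihood())  # non-Gaussian likelihood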

Basically, I want to produce something which is flexible enough to still be useful for us on a regular basis, but which is simple enough / has enough functionality already implemented to be useful to someone who doesn't know much about GPs.

theogf (Member) commented Jan 31, 2022

So the kind of user I'm imagining is someone who would be happy specifying build_kernel and initial_parameters.
We could of course add reasonable defaults for these, such as an SEKernel with learnable variance and lengthscale.

How about writing a function which takes a given built kernel and automatically creates build_kernel with the right parametrization (positive lengthscale, etc.)?

willtebbutt (Member) commented

I'm in favour of that being a thing, but I would rather have it in addition to what I've proposed above. If you've got what I've proposed, I think you could implement what you're proposing as some functionality that lives on top of it, i.e.

function fit(kernel, x::AbstractVector, y::AbstractVector{<:Real})
    build_kernel, initial_parameters = sensibly_named_function(kernel)
    return fit(build_kernel, initial_parameters, x, y)
end
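
For a plain SEKernel, a minimal sketch of what that helper might look like (sensibly_named_function is just the placeholder name from the snippet above; handling arbitrary kernel structures would need more machinery):

# Placeholder helper: turn a built kernel into (build_kernel, initial_parameters).
function sensibly_named_function(::SEKernel)
    initial_parameters = (variance = positive(1.0), lengthscale = positive(1.0))
    build_kernel(θ) = θ.variance * with_lengthscale(SEKernel(), θ.lengthscale)
    return build_kernel, initial_parameters
end

sensibly_named_function(SEKernel()) would then return exactly the pair that the one-argument fit above forwards to the original method.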

theogf (Member) commented Jan 31, 2022

Yes exactly!
It would be the same idea that we have with vec_of_vecs, where we expect a given structure internally but provide an easy wrapper to give a nice UX.

This would be much more user-friendly than having to build the whole function + parameters structure!

willtebbutt (Member) commented

Indeed, while retaining flexibility when needed!

simsurace (Member) commented

An attempt at this was started at https://github.com/JuliaGaussianProcesses/AutoGPs.jl
