
RFC: Sketch of vector-valued GP functionality #218

Draft: wants to merge 5 commits into base: master
Conversation

willtebbutt
Member

Summary

What: Proposal for a vector-valued GP.

Why: It might be a convenient thing to have around for some use-cases in which people really do think about their multi-output GPs as being vector-valued.

Why: It's easy to provide, because we already have the functionality; we just need to package it nicely.

Proposed changes

Add a vector-valued GP. Not proposing to export it at this point, because I don't yet know what I think about it.

Contains a sketch of the implementation in which the intent is hopefully clear. Does not contain proper testing, and I don't believe that the API is complete.

What alternatives have you considered?

I considered not bothering with this, and I still think that's a good option -- I really don't know whether or not I like this, but I can imagine that it would ease cognitive burden in some common workflows.

Breaking changes

No breaking changes.

src/vector_valued_gp.jl (Outdated)
@@ -0,0 +1,77 @@
# Represents a GP whose output is vector-valued.
struct VectorValuedGP{Tf<:AbstractGP}
f::Tf
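The diff context above only shows the wrapper field. As a self-contained sketch of the intended shape (the `num_outputs` field is assumed from later comments in the thread; the real version constrains `Tf<:AbstractGP` from AbstractGPs.jl):

```julia
# Hypothetical fleshed-out sketch; only the `f` field appears in the diff
# context above, and "multi-output-ness" of `f` is not enforced by the type.
struct VectorValuedGP{Tf}
    f::Tf              # underlying multi-output GP
    num_outputs::Int   # number of outputs of the vector-valued process
end

v = VectorValuedGP("mock-mo-gp", 3)
v.num_outputs  # 3
```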
Member Author

f must be, in some sense, a multi-output GP. We don't have a way to enforce that, though; my feeling is that's fine?

Member

We could make AbstractGP parametric on the kernel type, e.g.
struct GP{Tm<:MeanFunction,Tk<:Kernel} <: AbstractGP{Tk}
with a struct VectorValuedGP{Tk<:MOKernel,Tf<:AbstractGP{Tk}}?

Member

@theogf theogf Aug 27, 2021

This would stop you from using any composition on it, though...
This looks like a "trait" kind of problem. One could define is_multioutput and recursively check for any multi-output kernel like:

is_multioutput(k::MOKernel) = true
is_multioutput(k::Kernel) = false
is_multioutput(k::TransformedKernel) = is_multioutput(k.kernel)
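Mocking this up with stand-in types (the real `Kernel`, `MOKernel`, and `TransformedKernel` live in KernelFunctions.jl) shows the recursion bottoming out correctly:

```julia
# Stand-in types for illustration only; the real ones are KernelFunctions.jl's.
abstract type Kernel end
struct MOKernel <: Kernel end      # stand-in multi-output kernel
struct PlainKernel <: Kernel end   # stand-in ordinary single-output kernel
struct TransformedKernel{K<:Kernel} <: Kernel
    kernel::K                      # wrapped kernel
end

is_multioutput(::MOKernel) = true
is_multioutput(::Kernel) = false
is_multioutput(k::TransformedKernel) = is_multioutput(k.kernel)

is_multioutput(TransformedKernel(MOKernel()))     # true: recurses through the wrapper
is_multioutput(TransformedKernel(PlainKernel()))  # false: fallback method applies
```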

Member

Actually, the thing on which to parametrise the type should perhaps be the domain on which the GP is defined - that could be made to work for transformations as well then. But perhaps too much effort for it to be worth it. And I'm assuming that the "multi-output" kernel would generally be the outermost: In the end the question is just whether the final kernel object accepts the (all_the_features, output_index) tuple correctly.

Member Author

This would stop you from using any composition on it, though...

Indeed. We've got a number of AbstractGPs now which don't have a kernel field.

And I'm assuming that the "multi-output" kernel would generally be the outermost: In the end the question is just whether the final kernel object accepts the (all_the_features, output_index) tuple correctly.

Exactly. Throwing an error if we find out that the inputs aren't compatible with the GP to which they're supplied isn't super nice, but it's what we do throughout the ecosystem at the minute. The reason being that giving the GP knowledge about its domain is a massive hassle, and it seems to work fine most of the time 🤷

Member

# I gave up figuring out how to properly subtype MatrixDistribution, but I want this to
# subtype a distribution type which indicates that samples from this distribution produce a
# matrix of size num_features x num_outputs, or something like that.
struct FiniteVectorValuedGP{Tv<:VectorValuedGP, Tx<:AbstractVector, TΣy<:Real}
Member Author

TΣy<:Real because I'm lazy and haven't figured out what more general semantics might look like here.
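One hedged sketch of what more general semantics could look like (the `_noise_cov` helper and its dispatch scheme are hypothetical, not from the PR): dispatch on the noise type, lifting a scalar to isotropic noise over the flattened observations while letting richer types pass through.

```julia
using LinearAlgebra

# Hypothetical helper: lift whatever was supplied for Σy to a covariance over
# the n = num_features * num_outputs flattened observations.
_noise_cov(σ²::Real, n::Int) = Diagonal(fill(σ², n))  # isotropic noise
_noise_cov(d::Diagonal, n::Int) = d                   # heteroscedastic noise
_noise_cov(Σ::AbstractMatrix, n::Int) = Σ             # fully correlated noise

_noise_cov(0.1, 4)  # Diagonal with 0.1 on all four entries
```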

willtebbutt and others added 2 commits August 27, 2021 16:56
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
# I gave up figuring out how to properly subtype MatrixDistribution, but I want this to
# subtype a distribution type which indicates that samples from this distribution produce a
# matrix of size num_features x num_outputs, or something like that.
struct FiniteVectorValuedGP{Tv<:VectorValuedGP, Tx<:AbstractVector, TΣy<:Real}
Contributor

[JuliaFormatter] reported by reviewdog 🐶

Suggested change
struct FiniteVectorValuedGP{Tv<:VectorValuedGP, Tx<:AbstractVector, TΣy<:Real}
struct FiniteVectorValuedGP{Tv<:VectorValuedGP,Tx<:AbstractVector,TΣy<:Real}

@willtebbutt
Member Author

Added posterior implementation.

@codecov-commenter

Codecov Report

Merging #218 (fa1ceaa) into master (df23fdf) will decrease coverage by 1.59%.
The diff coverage is 81.08%.


@@            Coverage Diff             @@
##           master     #218      +/-   ##
==========================================
- Coverage   98.02%   96.43%   -1.60%     
==========================================
  Files          10       11       +1     
  Lines         355      393      +38     
==========================================
+ Hits          348      379      +31     
- Misses          7       14       +7     
Impacted Files Coverage Δ
src/vector_valued_gp.jl 81.08% <81.08%> (ø)
src/finite_gp_projection.jl 100.00% <0.00%> (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update df23fdf...fa1ceaa.

Comment on lines +20 to +23
# Construct equivalent FiniteGP.
x_f = KernelFunctions.MOInputIsotopicByOutputs(vx.x, vx.v.num_outputs)
f = vx.v.f
fx = f(x_f, vx.Σy)
Member

@st-- st-- Aug 30, 2021

The amount of code-duplication in here seems to beg for something like

function _vector_equivalent_finitegp(vx::FiniteVectorValuedGP)
    x_f = KernelFunctions.MOInputIsotopicByOutputs(vx.x, vx.v.num_outputs)
    f = vx.v.f
    fx = f(x_f, vx.Σy)
    return fx
end

(or _construct_equivalent_finitegp)
What do you think?
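As a mock illustration of the deduplication pattern (stand-in types and values; the real statistics would call AbstractGPs' `mean`/`var` on the equivalent `FiniteGP`), every statistic then collapses to "build the flattened equivalent, compute, reshape":

```julia
# Stand-in for FiniteVectorValuedGP, illustration only.
struct MockFiniteVectorValuedGP
    x::Vector{Float64}   # inputs
    num_outputs::Int
end

# Stand-in for _vector_equivalent_finitegp: returns a flat vector of length
# num_features * num_outputs rather than a real FiniteGP.
_equivalent_flat(vx) = collect(1.0:(length(vx.x) * vx.num_outputs))

# Each statistic reuses the single helper and reshapes the same way.
mock_mean(vx) = reshape(_equivalent_flat(vx), length(vx.x), vx.num_outputs)
mock_var(vx) = reshape(2 .* _equivalent_flat(vx), length(vx.x), vx.num_outputs)

vx = MockFiniteVectorValuedGP([0.1, 0.2], 3)
size(mock_mean(vx))  # (2, 3)
```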

Comment on lines +28 to +30
# Construct the matrix-version of the quantity.
M = reshape(m, length(vx.x), vx.v.num_outputs)
return M
Member

Could also have a

function _vectorvaluedgp_reshape(vx::FiniteVectorValuedGP, a)
    return reshape(a, length(vx.x), vx.v.num_outputs)
end

NB: should the return type be a ColVecs?
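On the `ColVecs` question, a pure-Base sketch of the two candidate shapes (assuming 2 inputs and 3 outputs; wrapping in KernelFunctions' `RowVecs(M)`, equivalently `ColVecs` of the transpose, would give the vector-of-vectors view without copying):

```julia
a = collect(1.0:6.0)   # flattened quantity, length = num_features * num_outputs
M = reshape(a, 2, 3)   # Matrix form: row i is the output vector at input i
M[1, :]                # [1.0, 3.0, 5.0] (reshape is column-major)

# "Vector of per-input output vectors", the view RowVecs(M) would provide:
rows = [M[i, :] for i in 1:size(M, 1)]
```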

@willtebbutt willtebbutt marked this pull request as draft September 15, 2023 19:26