Sampling from priors works perfectly:
```julia
using Turing # v0.21.10

@model function test()
    x ~ Normal()
    y ~ Uniform()
    return 0.0
end

rand(test())
# (x = 1.310427089277206, y = 0.2538675545828133)
```
I wonder if it would be possible to implement `quantile` to sample from the prior using the quantile transform:
```julia
u = [0.1, 0.5]
quantile(test(), u)
# MethodError: no method matching length(::DynamicPPL.Model{Main.var"workspace#336".var"#test#1", (), (), (), Tuple{}, Tuple{}, DynamicPPL.DefaultContext})
# The function `length` exists, but no method is defined for this combination of argument types.
```
The quantile transform simply maps a `u ~ Uniform(0, 1)` to the desired distribution, e.g.,
```julia
using Distributions

quantile(Normal(), 0.5)
# 0.0
```
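Under the hood this would just mean mapping each component of `u` through the inverse CDF of the corresponding prior. A minimal sketch of the idea for a flat list of univariate priors (`prior_quantile` is a hypothetical helper, not existing Turing.jl API; a real implementation would need to walk the model's priors via DynamicPPL):

```julia
using Distributions

# Hypothetical helper: map each uniform draw through the inverse CDF
# (quantile function) of its prior. Assumes univariate priors, in order.
prior_quantile(priors, u) = map(quantile, priors, u)

prior_quantile([Normal(), Uniform()], [0.1, 0.5])
# 2-element Vector{Float64}:
#  -1.2815515655446004
#   0.5
```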
Use cases:

- The hypercube parametrization u ∈ [0,1]ᴺ ↦ θ is used e.g. in nested sampling.
- It is effective in inference because fixed-length steps in u-space result in prior-dependent step lengths in the original θ parameter space.
- It can be used to unify inference over discrete and continuous variates (e.g. jaxns); see the example after this list.
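On the last point: Distributions.jl already defines `quantile` for discrete univariate distributions, so the same u ↦ θ map covers both kinds of variates:

```julia
using Distributions

# The inverse CDF of a discrete distribution returns a value from its
# (discrete) support, so continuous and discrete priors are handled uniformly.
quantile(Poisson(3), 0.5)
# 3
```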
If this could be implemented, hooking up NestedSamplers.jl would be relatively simple:
```julia
# From the README of https://github.com/TuringLang/NestedSamplers.jl
using Turing, NestedSamplers
using LinearAlgebra: I

d = 10
@model function funnel()
    θ ~ Truncated(Normal(0, 3), -3, 3)
    z ~ MvNormal(zeros(d - 1), exp(θ) * I)
    return x ~ MvNormal(z, I)
end

model = funnel()
data = model()

prior_transform(X) = quantile(funnel(), X)
logl(X) = loglikelihood(funnel() | (x = data,), prior_transform(X))
NestedModel(logl, prior_transform)
```
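For reference, running it would then look roughly like the README's sampling call (I'm assuming the `Nested(ndims, nactive)` constructor and the `dlogz` keyword from the NestedSamplers.jl README; untested):

```julia
# Hypothetical end-to-end run, assuming quantile(::Model, u) exists:
nested_model = NestedModel(logl, prior_transform)
spl = Nested(d, 1000)  # 10 parameters (θ plus 9 components of z), 1000 live points
chain, state = sample(nested_model, spl; dlogz = 0.2)
```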
Since this is the way many random variates are generated computationally anyway, maybe implementing this would be relatively easy?
I could give it a try with some pointers, since I only recently picked up Julia again and don't yet know Turing.jl's internals.