Individual PK Parameters following Bayesian Estimation

Hello,
I am trying to run the inspect function following Bayesian estimation to get the individual random effects, but I am getting the following error:

ERROR: type BayesMCMCResults has no field approx
Stacktrace:
 [1] getproperty(x::Pumas.BayesMCMCResults{Pumas.BayesLogDensity{PumasModel{ParamSet{NamedTuple{(:θ, :Ω, :σ), Tuple{Constrained{DiagNormal, VectorDomain{Vector{Float64}, Vector{Float64}, Vector{Float64}}}, InverseWishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}}, Gamma{Float64}}}}, var"#5#17", var"#6#18", var"#8#20", var"#10#22", Central1, var"#11#23", var"#14#26", Nothing}, Vector{Subject{NamedTuple{(:CONC,), Tuple{Vector{Union{Missing, Float64}}}}, T2, Vector{Pumas.Event{Float64, Float64, Float64, Float64, Float64, Float64, Int64}}, Vector{Float64}} where T2}, Vector{Float64}, ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(Pumas.logdensity), Float64}, Float64, 8, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(Pumas.logdensity), Float64}, Float64, 8}}}, DiffResults.MutableDiffResult{1, Float64, Tuple{Vector{Float64}}}, NamedTuple{(), Tuple{}}}, Vector{Vector{Vector{Float64}}}, Vector{Vector{NamedTuple}}}, f::Symbol)
   @ Base ./Base.jl:33
 [2] inspect(fpm::Pumas.BayesMCMCResults{Pumas.BayesLogDensity{PumasModel{ParamSet{NamedTuple{(:θ, :Ω, :σ), Tuple{Constrained{DiagNormal, VectorDomain{Vector{Float64}, Vector{Float64}, Vector{Float64}}}, InverseWishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}}, Gamma{Float64}}}}, var"#5#17", var"#6#18", var"#8#20", var"#10#22", Central1, var"#11#23", var"#14#26", Nothing}, Vector{Subject{NamedTuple{(:CONC,), Tuple{Vector{Union{Missing, Float64}}}}, T2, Vector{Pumas.Event{Float64, Float64, Float64, Float64, Float64, Float64, Int64}}, Vector{Float64}} where T2}, Vector{Float64}, ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(Pumas.logdensity), Float64}, Float64, 8, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(Pumas.logdensity), Float64}, Float64, 8}}}, DiffResults.MutableDiffResult{1, Float64, Tuple{Vector{Float64}}}, NamedTuple{(), Tuple{}}}, Vector{Vector{Vector{Float64}}}, Vector{Vector{NamedTuple}}})
   @ Pumas /builds/PumasAI/PumasSystemImages-jl/.julia/packages/Pumas/HDuXQ/src/estimation/diagnostics.jl:1096
 [3] top-level scope
   @ REPL[13]:1

Is there a different way to run the inspect function that is specific to Bayesian estimation?

Could you please post your model and some simulated data for us to debug the error?

Hi Dr. Storopoli,
Thanks for the reply. Please find the code attached:

using Pumas, MCMCChains, CSV, DataFrames, LinearAlgebra

model_bayes = @model begin
  @param begin
    θ ~ Constrained(
      MvNormal(
        [3.0, 5.0],
        Diagonal(ones(2))
      ),
      lower = zeros(2),
      upper = fill(50.0, 2),
      init  = [3.0,5.0])
    Ω ~ InverseWishart(6, diagm(fill(0.2, 2)) .* (6 + 3 + 1))
    σ ~ Gamma(1.0, 0.5)
  end

  @random begin
    η ~ MvNormal(Ω)
  end

  @pre begin
    CL = θ[1]*(WT/50)^0.75*exp(η[1])
    Vc = θ[2]*(WT/50)*exp(η[2])
  end

  @covariates WT

  @dynamics Central1

  @derived begin
    μ := @. Central / Vc
    CONC ~ @. Normal(μ, σ)
  end
end

df = CSV.read("./df_PUMAS.csv", DataFrame)

estimation = read_pumas(df,
    observations = [:CONC],
    id = :ID,
    time = :TIME,
    evid = :EVID,
    mdv = :MDV,
    amt = :AMT,
    rate = :RATE,
    cmt = :CMT,
    covariates = [:WT]
    )

param = (θ = [3.0, 5.0], Ω = [1.0 0.0; 0.0 1.0], σ = 0.5)

result = fit(model_bayes, estimation, param, Pumas.BayesMCMC();
  nsamples=2000, nadapts=1000)

and the dataset (attached).

@ahmed.salem it’s not easy to get the individual random effects right now. I would wait for the next release, where we have many more Bayesian features coming that let you access samples from the posterior or posterior-predictive distribution of any quantity in the model.


Hi @mohamed82008, thanks for the reply. I have a follow-up question. The fit output reports the following information:

Iterations        = 1:2000
Thinning interval = 1
Chains            = 1
Samples per chain = 2000

My question is about how to specify how many chains to use (say, for example, I want to run 4 chains). Is there an argument inside the fit function to set the number of chains, or do I need some other maneuver? The same question applies to thinning.

I am also a little unfamiliar with the nadapts argument. Is it similar to the burn-in phase? If not, how can I specify the number of samples in the burn-in phase?

Finally, I am getting warning messages like these during the fitting process:

┌ Warning: The current proposal will be rejected due to numerical error(s).
│   isfinite.((θ, r, ℓπ, ℓκ)) = (true, false, false, false)
└ @ AdvancedHMC /builds/PumasAI/PumasSystemImages-jl/.julia/packages/AdvancedHMC/MIxdK/src/hamiltonian.jl:47
┌ Warning: The current proposal will be rejected due to numerical error(s).
│   isfinite.((θ, r, ℓπ, ℓκ)) = (true, false, false, false)
└ @ AdvancedHMC /builds/PumasAI/PumasSystemImages-jl/.julia/packages/AdvancedHMC/MIxdK/src/hamiltonian.jl:47
┌ Warning: The current proposal will be rejected due to numerical error(s).
│   isfinite.((θ, r, ℓπ, ℓκ)) = (true, false, false, false)
└ @ AdvancedHMC /builds/PumasAI/PumasSystemImages-jl/.julia/packages/AdvancedHMC/MIxdK/src/hamiltonian.jl:47
┌ Warning: timestamp of type Missing unknown
└ @ MCMCChains /builds/PumasAI/PumasSystemImages-jl/.julia/packages/MCMCChains/VwH27/src/chains.jl:364
┌ Warning: timestamp of type Missing unknown
└ @ MCMCChains /builds/PumasAI/PumasSystemImages-jl/.julia/packages/MCMCChains/VwH27/src/chains.jl:364
Chains MCMC chain (2000×6×1 Array{Float64, 3}):

Do these warnings indicate a problem with the model specification?

Thanks!

how to specify how many chains I would like to use (say for example I want to run 4 chains).

There is an nchains keyword argument you can specify in the fit function. In the next release, it will default to 4.
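For example, reusing the fit call from the original post (a sketch based on the `nchains` keyword described above):

```julia
# Sketch: run 4 chains by passing the `nchains` keyword to `fit`.
# Reuses `model_bayes`, `estimation`, and `param` from the code above.
result = fit(model_bayes, estimation, param, Pumas.BayesMCMC();
             nsamples = 2000, nadapts = 1000, nchains = 4)
```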

I am also a little bit unfamiliar with nadapts argument… is this similar to the burn in phase ?

No, nadapts is different from burn-in. In Pumas, we use the No-U-Turn Sampler (NUTS), a variant of the Hamiltonian Monte Carlo algorithm that is more robust and was popularised by Stan; our NUTS implementation and the Stan implementation are pretty much identical. NUTS works in two phases: a sampling-with-adaptation phase followed by a sampling-only phase. In the first phase, the algorithm fine-tunes its parameters, e.g. the so-called step size and mass matrix, to achieve a certain target acceptance ratio. Once the fine-tuning is done, adaptation stops. nadapts specifies how many of the nsamples iterations are spent doing adaptation alongside sampling. You can choose to discard all the adaptation steps as “burn-in”, but that might be too much to throw away.

if no, how can I specify the number of samples to be in the burn in phase

In the next release, there is a function that lets you do burn-in and thinning as a post-processing step.

Finally, I am getting warning messages like these during the fitting process
Do these warnings indicate a problem with the model specification ?

The timestamp warning is fixed in the next release. The numerical-error warning is not a big deal if it only shows up a few times. During sampling, the sampler can sometimes wander off to parameter values that either cause the differential equation solver to fail to converge, or cause the likelihood to be 0 (log-likelihood = -Inf). In these cases you get numerical errors, and the NUTS sampler will “reject” the problematic parameter value. For well-defined models, these numerical errors happen more commonly in the early part of sampling than in the later parts. Good initialisation can also help avoid them. If you are getting too many such warnings, it is a sign that your model “blows up” too often, so you can try restricting the parameter space by using stronger priors centred around parameter values that are known to “work”, i.e. values with non-zero likelihood for which the ODE solver converges. Alternatively, start with a simpler model and build up the complexity step by step until sampling breaks, then try to understand why it breaks.
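As an illustration of the “stronger priors” suggestion, the θ prior from the model above could be tightened by shrinking its prior standard deviations (the 0.25 below is a hypothetical placeholder, not a recommendation):

```julia
# Hypothetical: a tighter prior on θ, centred at values known to "work".
# The original model used unit variances, i.e. Diagonal(ones(2)).
θ ~ Constrained(
  MvNormal([3.0, 5.0], Diagonal(fill(0.25^2, 2))),  # variances 0.0625 instead of 1.0
  lower = zeros(2),
  upper = fill(50.0, 2),
  init  = [3.0, 5.0])
```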


Great, thanks for the detailed reply! The warning appears just two or three times at the beginning of the sampling process and the parameter estimates make sense, so I do not think it is a problem.

Hi @mohamed82008 ,

I currently have access to Pumas v2.1. Could you please let me know what functions to use to get the Bayesian individual posteriors?

Thanks

@ahmed.salem, I think you can find more in the docs: Estimating Parameters of Pumas Models · Pumas.

You would probably need to convert the output to an MCMCChains.jl object, e.g.

my_chain = Chains(fit_result.chain)

Hi @storopoli
I tried the function you provided but it gives the following error

julia> my_chain = Chains(result_1.chain)
ERROR: type BayesMCMCResults has no field chain
Stacktrace:
 [1] getproperty(x::Pumas.BayesMCMCResults{Vector{Pumas.ThreadedBayesLogDensity{PumasModel{(θ = 2, Ω = 3, σ = 1), 2, ParamSet{NamedTuple{(:θ, :Ω, :σ), Tuple{Constrained{FullNormal, VectorDomain{Vector{Float64}, Vector{Int64}, Vector{Float64}}}, InverseWishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}}, Gamma{Float64}}}}, var"#5#17", var"#6#18", var"#8#20", var"#10#22", Central1, var"#11#23", var"#14#26", Nothing}, NamedTuple{(:θ, :Ω, :σ), Tuple{Vector{Float64}, Matrix{Float64}, Float64}}, Vector{Subject{NamedTuple{(:CONC,), Tuple{Vector{Union{Missing, Float64}}}}, T2, Vector{Pumas.Event{Float64, Float64, Float64, Float64, Float64, Float64, Int64}}, Vector{Float64}} where T2}, Vector{Vector{Float64}}, Vector{Vector{Float64}}, Vector{ForwardDiff.GradientConfig{ForwardDiff.Tag{Pumas.Tag1, Float64}, Float64, 8, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Pumas.Tag1, Float64}, Float64, 8}}}}, Vector{DiffResults.MutableDiffResult{1, Float64, Tuple{Vector{Float64}}}}, NamedTuple{(), Tuple{}}}}, Vector{Vector{Vector{Float64}}}, Vector{AbstractMCMC.SamplingStats}}, f::Symbol)
   @ Base ./Base.jl:42
 [2] top-level scope
   @ ~/data/code/BAYESIAN_PUMAS/Code.jl:60

Also, I think the documentation has not been updated yet to include information about getting individual parameters or individual posteriors.

Hi @mohamed82008, I am following up on my last message now that I have access to Pumas 2.1. Could you please let me know which new features allow accessing the individual posteriors of the PK/PD parameters of each subject included in the analysis?