JAGS: Invalid vector argument to ilogit - jags

I am trying to run the following model. It has several logit links in it, which are in loops.
model {
  for(i in 1:8){
    year.eff[i] ~ dnorm(0, tau.yr)
    annual.controlrenest[i] <- pow(ilogit(beta0 + year.eff[i] + beta.renest), 37)
    annual.treatmentrenest[i] <- pow(ilogit(beta0 + year.eff[i] + beta.treatment + beta.renest), 37)
    annual.controlnest[i] <- pow(ilogit(beta0 + year.eff[i]), 37)
    annual.treatmentnest[i] <- pow(ilogit(beta0 + year.eff[i] + beta.treatment), 37)
  }
  # Likelihood
  for(i in 1:539){
    for(j in (enter.nest[i]+1):left.nest[i]){
      logit(phinest[i,j]) <- beta0 + year.eff*year.nest[i] +
        beta.treatment*Treatment[i] + beta.renest*Renest[i]
      mu[i,j] <- phinest[i,j] * n[i,j-1]
      n[i,j] ~ dbern(mu[i,j])
    }
  }
}
However, I am getting the following error and can't seem to figure out where the issue is. I have gotten this error before when my logit link wasn't in a loop (you can't vectorize a logit link in JAGS).
Error in checkForRemoteErrors(lapply(cl, recvResult)) :
3 nodes produced errors; first error: RUNTIME ERROR:
Invalid vector argument to ilogit
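One thing worth checking (a guess from the code as posted, not a confirmed diagnosis): in the likelihood, year.eff is a length-8 vector, so year.eff*year.nest[i] turns the whole linear predictor into a vector, and logit(phinest[i,j]) <- ... is rewritten internally as phinest[i,j] <- ilogit(...), which then receives a vector argument. If year.nest[i] is meant to pick out the year of nest i (i.e. it is an integer in 1:8), indexing rather than multiplying keeps the expression scalar, roughly:
for(i in 1:539){
  for(j in (enter.nest[i]+1):left.nest[i]){
    # year.eff indexed by year, assuming year.nest[i] is an integer in 1:8
    logit(phinest[i,j]) <- beta0 + year.eff[year.nest[i]] +
      beta.treatment*Treatment[i] + beta.renest*Renest[i]
    mu[i,j] <- phinest[i,j] * n[i,j-1]
    n[i,j] ~ dbern(mu[i,j])
  }
}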

Related

Error in update.jags(model, n.iter, ...) : Error in node sd[1] Invalid parent values

I am getting an error in node sd[1]; the compiler says "invalid parent values". I am working with a Gaussian mixture model for the "galaxies" data from the "MASS" package in R.
library(rjags)
library(MASS)
library(mcsm)
data("galaxies")
summary(galaxies)
y = galaxies
ngroups = 2
jags_data = list(y=y, n=length(y), ngroups=ngroups)
gaussmodel = "
model {
  for (i in 1:n) {
    y[i] ~ dnorm(mu[z[i]], tau[z[i]])
    z[i] ~ dcat(group_probs)
  }
  group_probs ~ ddirich(d)
  for (j in 1:ngroups) {
    mu_raw[j] ~ dnorm(0, 1E-6)
    tau[j] ~ dgamma(0.001, 0.001)
    sd[j] = pow(tau[j], -0.5)
    d[j] = 2
  }
  mu = sort(mu_raw)
}
"
model = jags.model(textConnection(gaussmodel), data=jags_data,
n.chains=4)
update(model,n.iter=1E4)
samples = coda.samples(model=model, variable.names=c("mu", "sd", "group_probs"), n.iter=1E4, thin=5)
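This may or may not be the actual cause, but one way to rule out a bad starting state is to give jags.model explicit initial values, so that sd[j] = pow(tau[j], -0.5) is evaluated from finite, positive tau from the first iteration onward. A rough, untested sketch (mu_raw, tau and z are the names used in the model above):
inits <- function() {
  # start each chain from values on the scale of the data
  list(mu_raw = sort(rnorm(ngroups, mean(y), sd(y))),
       tau = rep(1 / var(y), ngroups),
       z = sample(1:ngroups, length(y), replace = TRUE))
}
model = jags.model(textConnection(gaussmodel), data = jags_data,
                   inits = inits, n.chains = 4)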
I don't know rjags and Bayesian analysis in much detail, but I think your problem is in the sd line of the code, sd[j] = pow(tau[j], -0.5).
I believe the -0.5 is the problem. I am not sure whether you intended the exponent to be negative, but it seemed to cause problems to surface in your Dirichlet model.
Taking away the negative sign seemed to do the trick.

JAGS gets weird, occasionally returns a "redefining node" error, depending on the data

I created my code based on this:
http://users.aims.ac.za/~mackay/BUGS/Manual05/Examples1/node29.html
Now I simulate data with different seeds. Strangely enough, some seeds give me an "Attempt to redefine node dN[1,1]" error on line 18, while others do not. Could someone help, please? By the way, why is dN[1,1] on line 18 in the first place? How does JAGS count lines?
error message :
"
RUNTIME ERROR:
Compilation error on line 18.
Attempt to redefine node dN[1,1]
"
bugsmodel <- "
# Set up data
data{
  for(i in 1:N)
  {
    for(j in 1:bigt)
    {
      Y[i,j] <- step(obs.t[i] - t[j] + eps)
      dN[i, j] <- Y[i, j] * step(t[j + 1] - obs.t[i] - eps) * fail[i]
    }
  }
}
# Model
model
{
  for(i in 1:N){
    betax[i,1] <- 0
    for(k in 2:(p+1)){
      betax[i,k] <- betax[i,k-1] + beta[k-1]*x[i,k-1]
    }
  }
  for(j in 1:bigt) {
    for(i in 1:N) {
      dN[i, j] ~ dpois(Idt[i, j])                        # Likelihood
      Idt[i, j] <- Y[i, j] * exp(betax[i,p+1]) * dL0[j]  # Intensity
    }
    dL0[j] ~ dgamma(mu[j], c)
    mu[j] <- dL0.star[j] * c  # prior mean hazard
  }
  c <- 0.001
  r <- 0.1
  for (j in 1 : bigt) {
    dL0.star[j] <- r * (t[j + 1] - t[j])
  }
  for(k in 1:p){
    beta[k] ~ dnorm(0.0,0.000001)
  }
}"

RJAGS output Node inconsistent with parents

Hi everyone, I'm new to JAGS and am currently doing Bayesian inference with MCMC through R2jags. I've been trying my best to debug my code, but I'm stuck with this error: "Error in node e1[3] Node inconsistent with parents".
e1<-c(1,1,0,1,1,0,0,1,0,0,1,0,1,1,1,1,1,0,0,1,1,0,1,1,1,1,0,0,1,1,0,0,0,0,0,1,1,1,1,1,0,0,0,1,1,0,1,1,0,0,1,1,1,1,0,1)
e2<-c(1,1,0,1,1,1,0,1,0,1,1,1,0,1,1,1,1,0,1,1,1,1,0,0,1,1,1,1,1,1,1,1,0,1,1,0,1,1,0,1,1,0,0,1,1,1,0,1,1,0,1,1,1,1,1)
c1<-c(2412,3485,881,1515,1824,1603,865,2638,332013,7379,1,1189,1,106,278,1406,9408,21596,15880,833,543,611,272,7883,
1,15091,11642,849,203,566,425,1,125124,687196,21377,3901,1131,543,1,21218,1118,5519,434800,1288,4700,820,659,6644,
1198,3581,1013,1021,5877,833,1,11797)
c2<-c(1189,905,902,1154,20896,14973,1665,1,1096,309,641,1,935,282,
1,566,2245,112,18366,1096,1476,1,2486,1131,607,67,19390,284,
641,566,1154,1,2672,4857,1131,1231,4594,655,1127,4187,1223,417,
3381,1,1006,1,1920,4964,1911,765 ,876,14,942,849,4130)
n<-c(56,55)
model <- function() {
  # control
  for(i in 1:n[1]){
    e1[i] ~ dbern(p1[i])
    c1[i] ~ dlnorm(eta[1], lambda[1])
    p1[i] <- ilogit(alpha[1] + beta[1]*c1[i])
  }
  # treatment
  for(i in 1:n[2]){
    e2[i] ~ dbern(p2[i])
    c2[i] ~ dlnorm(eta[2], lambda[2])
    p2[i] <- ilogit(alpha[2] + beta[2]*c2[i])
  }
  for (t in 1:2) {
    eta[t] ~ dgamma(9, 2)
    lambda[t] ~ dunif(0, 100)
    alpha[t] ~ dnorm(0, 1)
    beta[t] ~ ddexp(0, gamma)
  }
  gamma ~ dunif(1, 10)
}
library(R2jags)
dataJags <- list("n", "c1", "c2", "e1", "e2")
params <- c("eta", "lambda", "alpha", "beta")
inits <- function(){
  list(eta = runif(2, 0, 1), lambda = runif(2, 0, 1),
       alpha = runif(2, 0, 1), beta = runif(2, 0, 1))
}
n.iter <- 10000
n.burnin <- 5000
n.thin <- floor((n.iter - n.burnin)/500)
flxpin <- jags(dataJags, inits, params, model.file = model,
               n.chains = 3, n.iter = n.iter, n.burnin = n.burnin,
               n.thin = n.thin, DIC = TRUE, progress.bar = "text")
I hope someone can show me how to debug this. Thanks a lot.
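One pattern worth checking, offered as a guess rather than a verified fix: c1 and c2 contain values in the hundreds of thousands, so with beta initialized from runif(2, 0, 1) the linear predictor alpha[1] + beta[1]*c1[i] is enormous and ilogit() returns exactly 1, which is inconsistent with an observed e1[i] of 0 (e1[3] is 0). Rescaling the covariate keeps the predictor in a numerically sane range, for example (sketch for the control arm; the treatment arm would change the same way):
  # control (sketch: put the cost covariate on the log scale)
  for(i in 1:n[1]){
    e1[i] ~ dbern(p1[i])
    c1[i] ~ dlnorm(eta[1], lambda[1])
    p1[i] <- ilogit(alpha[1] + beta[1]*log(c1[i]))
  }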

Is possible to define a random limit for a loop in JAGS?

I am trying to implement a Weibull proportional hazards model with a cure fraction, following the approach outlined by Chen, Ibrahim and Sinha (1999), "A New Bayesian Model for Survival Data with a Surviving Fraction". However, I am not sure whether it is possible to define a random limit for a loop in JAGS.
library(R2OpenBUGS)
library(rjags)
set.seed(1234)
censored <- c(1, 1)
time_mod <- c(NA, NA)
time_cens <- c(5, 7)
tau <- 4
design_matrix <- rbind(c(1, 0, 0, 0), c(1, 0.2, 0.2, 0.04))
jfun <- function() {
  for(i in 1:nobs) {
    censored[i] ~ dinterval(time_mod[i], time_cens[i])
    time_mod[i] <- ifelse(N[i] == 0, tau, min(Z))
    for (k in 1:N[i]){
      Z[k] ~ dweib(1, 1)
    }
    N[i] ~ dpois(fc[i])
    fc[i] <- exp(inprod(design_matrix[i, ], beta))
  }
  beta[1] ~ dnorm(0, 10)
  beta[2] ~ dnorm(0, 10)
  beta[3] ~ dnorm(0, 10)
  beta[4] ~ dnorm(0, 10)
}
inits <- function() {
  time_init <- rep(NA, length(time_mod))
  time_init[which(!status)] <- time_cens[which(!status)] + 1
  out <- list(beta = rnorm(4, 0, 10),
              time_mod = time_init,
              N = rpois(length(time_mod), 5))
  return(out)
}
data_base <- list('time_mod' = time_mod, 'time_cens' = time_cens,
                  'censored' = censored, 'design_matrix' = design_matrix,
                  'tau' = tau,
                  'nobs' = length(time_cens[!is.na(time_cens)]))
tc1 <- textConnection("jmod", "w")
write.model(jfun, tc1)
close(tc1)
# Calling JAGS
tc2 <- textConnection(jmod)
j <- jags.model(tc2,
                data = data_base,
                inits = inits(),
                n.chains = 1,
                n.adapt = 1000)
I observed the below error:
Error in jags.model(tc2, data = data_base, inits = inits(), n.chains = 1, :
RUNTIME ERROR:
Compilation error on line 6.
Unknown variable N
Either supply values for this variable with the data
or define it on the left hand side of a relation.
I am not entirely certain, but I am pretty sure that you cannot declare a random number of nodes in BUGS in general, so this would not be a JAGS-specific quirk.
Nevertheless, you can work around it.
Since BUGS is a declarative language rather than a procedural one, it is enough to declare an arbitrary but deterministic number of nodes (say, "large enough") and then associate only a random number of them with a distribution and with observed data, leaving the remaining nodes deterministic.
Once you have observed the maximum value of N[i] (let's say N.max), you can pass it as a parameter to JAGS and then change this code of yours:
for (k in 1:N[i]){
  Z[k] ~ dweib(1, 1)
}
into this:
for (k in 1:N.max){
  if (k <= N[i]){
    Z[k] ~ dweib(1, 1)
  } else {
    Z[k] <- 0
  }
}
I hope this will do the trick in your case, so please give feedback later on whether it does.
Needless to say, if you have some non-zero observed data associated with a deterministic Z[k], then all hell breaks loose inside JAGS...
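One caveat to the snippet above: JAGS has no if/else control flow inside a model block, so the conditional is best read as pseudocode for the idea rather than something that will compile as written. A rough, untested translation of the same trick into actual JAGS syntax, assuming N.max is passed in the data and N[i] never exceeds it, would look something like:
for (k in 1:N.max){
  Z[k] ~ dweib(1, 1)
}
for (i in 1:nobs) {
  for (k in 1:N.max){
    # entries beyond N[i] are padded with tau so they cannot win the minimum
    Zpad[i, k] <- ifelse(k <= N[i], Z[k], tau)
  }
  time_mod[i] <- min(Zpad[i, ])  # equals tau when N[i] == 0
}
This would replace both the original time_mod[i] line and the inner k loop; like the original post, it shares the same latent Z[k] across observations.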

using data.table with multiple threads in R

Is there a way to utilize multiple threads for computation with data.table in R? For example, let's say I have the following data.table:
dtb <- data.table(id=rep(1:10000, 1000), x=1:1e7)
setkey(dtb, id)
f <- function(m) { #some really complicated function }
res <- dtb[,f(x), by=id]
Is there a way to get R to multithread this if f takes a while to compute? And if f is quick, will multithreading help, or will most of the time be taken by data.table splitting things up into groups?
I am not sure this is strictly "multi-threading", but perhaps you meant a multi-core solution? If so, then look at this earlier answer: Performing calculations by subsets of data in R, found with a search for "[r] [data.table] parallel".
Edit: roughly a doubling of speed on a 4-core machine, though my system monitor suggests only 2 cores were used during the mclapply call. Code copied from this thread: http://r.789695.n4.nabble.com/Access-to-local-variables-in-quot-j-quot-expressions-tt2315330.html#a2315337
require(data.table)
require(parallel)  # for mclapply

mk.fake.df <- function (n.groups=10000, n.per.group=70) {
  data.frame(grp=rep(1:n.groups, each=n.per.group),
             age=rep(0:(n.per.group-1), n.groups),
             x=rnorm(n.groups * n.per.group),
             ## These don't do anything, but only exist to give
             ## the table a similar size to the real data.
             y1=rnorm(n.groups * n.per.group),
             y2=rnorm(n.groups * n.per.group),
             y3=rnorm(n.groups * n.per.group),
             y4=rnorm(n.groups * n.per.group)) }

mk.fake.dt <- function (fake.df) {
  fake.dt <- as.data.table(fake.df)
  setkey(fake.dt, grp, age)
  fake.dt
}

cumsum.lag <- function (x) {
  x.prev <- c(0, x[-length(x)])
  cumsum(x.prev)
}

calc.fake.dt.lapply <- function (dt) { # use base lapply for comparison
  lapply(6*c(1000,1:4,6,8,10),
         function(critical.age) {
           dt$tmp <- pmax((dt$age < critical.age) * dt$x, 0)
           dt[, cumsum.lag(tmp), by = grp]$V1})
}

calc.fake.dt.mclapply <- function (dt) { # same work, forked across cores
  mclapply(6*c(1000,1:4,6,8,10),
           function(critical.age) {
             dt$tmp <- pmax((dt$age < critical.age) * dt$x, 0)
             dt[, cumsum.lag(tmp), by = grp]$V1})
}

df <- mk.fake.df()
dt <- mk.fake.dt(df)

system.time(res.dt.mclapply <- calc.fake.dt.mclapply(dt))
#   user  system elapsed
#  1.896   4.413   1.210
system.time(res.dt.lapply <- calc.fake.dt.lapply(dt))
#   user  system elapsed
#  1.391   0.793   2.175
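Applying the same pattern to the dtb/f example from the question would look roughly like this (a sketch, not benchmarked; it splits the keyed ids into one chunk per core and rebinds the per-chunk results):
require(parallel)
ids <- unique(dtb$id)
# one chunk of ids per available core
chunks <- split(ids, cut(seq_along(ids), detectCores(), labels = FALSE))
res.list <- mclapply(chunks,
                     function(ix) dtb[id %in% ix, f(x), by = id],
                     mc.cores = detectCores())
res <- rbindlist(res.list)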
