Fixing a parameter of a random effect with gamm in R

Do you know if it is possible to fix a parameter of a random effect when using gamm from the mgcv package?
For example, for a random effect distributed as N(0, c), can c be fixed at a certain value instead of being estimated?
I searched the documentation but could not find anything.
Thank you in advance!
Best regards,
Bayesianboy

Related

Does JAGS have a log function for bases other than e?

I am trying to calculate a log with base 2 in JAGS but can't find a way to implement this.
The documentation doesn't seem to cover it, and I am hoping I am missing something,
or that someone knows a workaround.
Thanks in advance for any help,
Benny
Log base 2 (the binary logarithm) can be calculated with the change-of-base identity log2(x) = log(x) / log(2). As an example in R using the natural log:
log_2_result <- log(15, base = 2)
log_2_trick <- log(15) / log(2)
identical(log_2_result, log_2_trick)
[1] TRUE
JAGS has the log function, so you could use the same approach as log_2_trick above. One important thing to note, however: because log is also a link function in JAGS, you can only pass it a scalar.
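Inside a JAGS model block, the same trick is just a deterministic node; x and y here are placeholder node names:

```
model {
  # binary log via the change-of-base identity; x is a scalar node
  y <- log(x) / log(2)
}
```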

How to get all solutions that achieve the optimal result in Gurobi

I have an implementation in Gurobi in Python. My problem has different choices of parameters that all reach the optimal result, and I need every solution that achieves the optimal objective value. How can I get them? I know the code below, but it returns just one solution.
if m.status == GRB.Status.OPTIMAL:
    for v in m.getVars():
        print(v.varName, v.x)
This can be achieved using the SolutionPool. You can specify how many solutions you want Gurobi to return.
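As a hedged sketch (the toy model is made up; the parameter and attribute names PoolSearchMode, PoolSolutions, PoolGap, SolutionNumber, SolCount, and Xn come from the Gurobi documentation, so check them against your version):

```python
from gurobipy import GRB, Model

# Toy model with several optima: pick any 2 of 3 binary variables.
m = Model()
x = m.addVars(3, vtype=GRB.BINARY)
m.setObjective(x.sum(), GRB.MAXIMIZE)
m.addConstr(x.sum() <= 2)

m.setParam(GRB.Param.PoolSearchMode, 2)   # systematically search for the n best solutions
m.setParam(GRB.Param.PoolSolutions, 100)  # store up to 100 of them
m.setParam(GRB.Param.PoolGap, 0.0)        # keep only solutions matching the optimal objective
m.optimize()

# Iterate over the stored solutions; v.Xn reads the value of v in the
# solution selected by the SolutionNumber parameter.
for i in range(m.SolCount):
    m.setParam(GRB.Param.SolutionNumber, i)
    print([(v.varName, round(v.Xn)) for v in m.getVars()])
```

With PoolGap set to 0.0, only solutions whose objective equals the optimum are retained, which is exactly the "all optimal solutions" case asked about.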

PySCIPOpt/SCIP - Branching/Separation with fractional variable

I began using PySCIPOpt/SCIP for a Coursera course on discrete optimization. I need to implement a simple separation based on a fractional variable and wonder how to do it. The online SCIP literature does not provide a relevant example.
Is there any Python example I could take inspiration from for my assignment?
Thank you for the answer. Indeed, I spent some hours reading the SCIP documentation and have trouble interfacing SCIP methods from Python.
I have been able to implement a simple constraint handler in Python to add the first type of cuts, and I'd like to add a separator for the second type.
The latter are typically x = 0 or x = 1 cuts based on fractional x values; I struggle more with the syntax (addCut()) and with using the generic methods than with the process itself.
A Python example, a bit more involved than tsp.py, would greatly help me.
Your question is quite broad. I'll try to give some hints on where to look for an answer:
- general information on separators in SCIP
- the difference between constraint handlers and separators
- a Python example of a separator implemented as a constraint handler to solve the Traveling Salesman Problem
I can try to answer your question more precisely if you explain your application/problem in more detail.
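For the separator scaffolding specifically, here is a hedged sketch. The class and method names (Sepa, sepaexeclp, createEmptyRowSepa, addVarToRow, addCut, includeSepa) are from the PySCIPOpt interface as I know it, so verify them against your installed version; the separation logic itself is a placeholder you would replace with your own check on the fractional LP solution:

```python
from pyscipopt import Model, Sepa, SCIP_RESULT

class MySepa(Sepa):
    # Scaffolding only: the "violated" test below is a placeholder for
    # your own separation logic over the fractional LP solution.
    def sepaexeclp(self):
        lp_vals = {v.name: v.getLPSol() for v in self.model.getVars()}
        # ... inspect lp_vals and decide whether some inequality is violated ...
        violated = False  # placeholder decision
        if not violated:
            return {"result": SCIP_RESULT.DIDNOTFIND}
        # Build a cut of the form  sum_j coef_j * x_j <= 1  (rhs chosen for illustration)
        row = self.model.createEmptyRowSepa(self, "mycut", lhs=None, rhs=1.0)
        self.model.cacheRowExtensions(row)
        for v in self.model.getVars():
            self.model.addVarToRow(row, v, 1.0)  # cut coefficients
        self.model.flushRowExtensions(row)
        self.model.addCut(row, forcecut=False)
        self.model.releaseRow(row)
        return {"result": SCIP_RESULT.SEPARATED}

m = Model()
# ... build your model here ...
m.includeSepa(MySepa(), "mysepa", "illustrative separator", priority=0, freq=1)
```

The same row-building calls (createEmptyRowSepa, addVarToRow, addCut) are what the TSP constraint-handler example uses, so it is a good reference for the syntax.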

How to use DEoptim in J/addon/math

I am trying to find the global maximum of a function and found this package in J. However, after reading through the examples in the demo I still do not know how to use it. Can anyone give an example of how to find the optimized parameters of a multi-parameter function, such as the function written below? I'd appreciate any help.
The epsilon values are given data points, and omega and alpha both range between 0 and 1.

Does Julia have a way to solve for unknown variables

Is there a function in Julia similar to the Solver function in Excel, where I can provide an equation and it will solve for the unknown variable? If not, does anybody know the math behind Excel's Solver function?
I am not expecting anybody to solve the equation, but if it helps:
Price = Earnings_1/(1+r)^1 + Earnings_2/(1+r)^2 + Earnings_3/(1+r)^3 + Earnings_4/(1+r)^4 + Earnings_5/(1+r)^5 + (Earnings_5 * RiskFreeRate)/((1+r)^5 * (1 - RiskFreeRate))
The known variables are: Price, All Earnings, and RiskFreeRate. I am just trying to figure out how to solve for r.
Rewrite this as an expression f(r) = 0 by subtracting Price from both sides. Now it's a rootfinding problem. If you're only solving for one variable (which looks to be the case), then Roots.jl is a good choice.
fzero(f, a::Real, b::Real)
will search for a solution between a and b, for example, and the docs list more algorithm choices for when you don't know a bracketing range and can only give an initial guess.
In addition, KINSOL in Sundials.jl is good when you know you're starting close to a multidimensional root. For a multidimensional problem that needs some robustness to the initial condition, I'd recommend NLsolve.jl.
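To illustrate the bracketed search that fzero(f, a, b) performs, here is the same idea written out in plain Python. All input numbers (earnings, price, rate) are hypothetical, and f encodes the pricing equation with Price moved to the left-hand side:

```python
def f(r):
    """Pricing gap: PV of earnings plus terminal term, minus price (made-up numbers)."""
    earnings = [5.0, 6.0, 7.0, 8.0, 9.0]
    price, rf = 30.0, 0.03
    pv = sum(e / (1 + r) ** t for t, e in enumerate(earnings, start=1))
    return pv + (earnings[-1] * rf) / ((1 + r) ** 5 * (1 - rf)) - price

def bisect(g, a, b, tol=1e-12):
    """Bracketing search: g(a) and g(b) must have opposite signs."""
    ga = g(a)
    if ga * g(b) > 0:
        raise ValueError("root not bracketed")
    while b - a > tol:
        m = (a + b) / 2
        if ga * g(m) <= 0:
            b = m        # root lies in [a, m]
        else:
            a, ga = m, g(m)  # root lies in [m, b]
    return (a + b) / 2

r = bisect(f, 0.0, 1.0)  # the implied discount rate
```

A bracketing method like this is slow but robust: it only needs a sign change between a and b, which is why fzero's interval form is a safe first choice.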
There's nothing out of the box, no. Root finding is a science in itself.
Luckily for you, your function has an analytic first derivative with respect to r. That means you can use Newton-Raphson, which should be very stable for your function.
I'm sure you're aware that your function behaves badly around r = -1.
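A sketch of the Newton-Raphson route in plain Python, with the derivative computed analytically from the formula above; the input numbers and helper names (npv_gap, npv_gap_prime) are hypothetical:

```python
def npv_gap(r, price, earnings, rf):
    """f(r): PV of earnings plus terminal term, minus price; its root is the implied r."""
    n = len(earnings)
    pv = sum(e / (1 + r) ** t for t, e in enumerate(earnings, start=1))
    terminal = (earnings[-1] * rf) / ((1 + r) ** n * (1 - rf))
    return pv + terminal - price

def npv_gap_prime(r, earnings, rf):
    """Analytic derivative of npv_gap with respect to r."""
    n = len(earnings)
    d = -sum(t * e / (1 + r) ** (t + 1) for t, e in enumerate(earnings, start=1))
    d -= n * (earnings[-1] * rf) / ((1 + r) ** (n + 1) * (1 - rf))
    return d

def newton(price, earnings, rf, r0=0.0, tol=1e-12, max_iter=100):
    """Newton-Raphson iteration r <- r - f(r)/f'(r); start well away from r = -1."""
    r = r0
    for _ in range(max_iter):
        step = npv_gap(r, price, earnings, rf) / npv_gap_prime(r, earnings, rf)
        r -= step
        if abs(step) < tol:
            return r
    raise RuntimeError("Newton-Raphson did not converge")

earnings = [5.0, 6.0, 7.0, 8.0, 9.0]  # hypothetical inputs
r = newton(price=30.0, earnings=earnings, rf=0.03)
```

Because f is decreasing and convex for r > -1, Newton started at r0 = 0 converges monotonically here; the singularity at r = -1 is exactly why the starting point matters.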
