number of times, a posterior distribution of the parameter of interest will emerge from the
proportion of time the sampler spends at each point in the parameter space.
The Gibbs Sampler Algorithm for the Royle and Nichols (2003) Model
Step 1: Selecting the initial values for r, λ, and N_i.
Iteration 1
r^(1): random number chosen from a Uniform(0,1) distribution.
So, q^(1) = logit[r^(1)].
λ^(1): random number chosen from a Gamma(a,b) distribution, where a and b are the shape and scale
parameters initially selected.
N_i^(1): random number chosen from a Poisson[λ^(1)] distribution, where i = 1, 2, ..., R sites.
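Step 1 might be sketched in NumPy as follows. The number of sites R and the hyperparameter values a and b are illustrative assumptions, not values from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

R = 50           # number of sites (hypothetical)
a, b = 1.0, 1.0  # Gamma shape and scale, chosen by the analyst (hypothetical)

# Step 1: initial values
r = rng.uniform(0.0, 1.0)    # r^(1) ~ Uniform(0, 1)
q = np.log(r / (1.0 - r))    # q^(1) = logit(r^(1))
lam = rng.gamma(a, b)        # lambda^(1) ~ Gamma(a, b) (shape, scale)
N = rng.poisson(lam, size=R) # N_i^(1) ~ Poisson(lambda^(1)), i = 1..R

print(r, lam, N[:5])
```

Any set of starting values in the support of the priors will do; the chain forgets the initialization after burn-in.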
Step 2: Updating the values of r, λ, and N_i.
Iteration j [ranging from 2 to a large number]
[N_i^(j) | w_i, λ^(j-1), r^(j-1)]: random number drawn according to Equation 2-10, where
i = 1, 2, ..., R sites.
[λ^(j) | {N_i^(j)}]: random number drawn according to Equation 2-8. The '{}' indicates the
entire vector of site abundances.
[r^(j) | {w_i}, {N_i^(j)}]: random number drawn according to the proportionality relationship of
Equation 2-9. Consequently, r^(j) = e^(q^(j)) / [1 + e^(q^(j))].
Step 2 is repeated a large number of times. Using Equations 2-8 and 2-10, the updates
for N_i^(j) and λ^(j) can be made quite directly in the Gibbs sampler. However, making the updates
for r^(j) requires the use of the Metropolis algorithm (Gelman et al., 1995) with a Gaussian
proposal distribution, since Equation 2-9 is only a proportionality relationship.
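Putting Steps 1 and 2 together, a minimal Metropolis-within-Gibbs sketch is given below. Since Equations 2-8 through 2-10 are not reproduced here, the sketch substitutes standard stand-ins: detections w_i ~ Binomial(T, 1 - (1-r)^(N_i)), a conjugate Gamma draw for λ in place of Equation 2-8, a truncated grid 0..n_max for the discrete N_i conditional in place of Equation 2-10, and a Uniform(0,1) prior on r for the Metropolis step on q = logit(r). All data, hyperparameters, and tuning constants are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# --- Hypothetical simulated data: R sites, T visits (not from the text) ---
R, T = 50, 5
true_r, true_lam = 0.4, 2.0
N_true = rng.poisson(true_lam, R)
w = rng.binomial(T, 1.0 - (1.0 - true_r) ** N_true)  # detections per site

a, b = 1.0, 1.0   # Gamma shape/scale hyperparameters (assumed)
n_max = 40        # truncation point of the N_i support (assumed)
n_grid = np.arange(n_max + 1)
# log(n!) for the Poisson term, computed once
log_fact = np.concatenate([[0.0], np.cumsum(np.log(np.arange(1.0, n_max + 1)))])

def inv_logit(q):
    return 1.0 / (1.0 + np.exp(-q))

def log_post_q(q, N):
    """Unnormalized log full conditional of q = logit(r), assuming a
    Uniform(0,1) prior on r (last two terms are the logit Jacobian)."""
    r = inv_logit(q)
    p = 1.0 - (1.0 - r) ** N  # per-visit detection probability at each site
    with np.errstate(divide="ignore", invalid="ignore"):
        ll = (np.where(w > 0, w * np.log(p), 0.0)
              + np.where(T - w > 0, (T - w) * np.log1p(-p), 0.0))
    return ll.sum() + np.log(r) + np.log(1.0 - r)

# Step 1: initial values
r0 = rng.uniform()
q = np.log(r0 / (1.0 - r0))  # q^(1) = logit(r^(1))
lam = rng.gamma(a, b)        # lambda^(1) ~ Gamma(a, b)
N = rng.poisson(lam, R)      # N_i^(1) ~ Poisson(lambda^(1))

# Step 2: iterate the three conditional draws
n_iter, burn = 3000, 1000
r_draws, lam_draws = [], []
for j in range(n_iter):
    r = inv_logit(q)
    # [N_i | w_i, lam, r]: discrete conditional on the grid (Eq. 2-10 stand-in)
    p = 1.0 - (1.0 - r) ** n_grid
    with np.errstate(divide="ignore", invalid="ignore"):
        ll = (np.where(w[:, None] > 0, w[:, None] * np.log(p)[None, :], 0.0)
              + np.where((T - w)[:, None] > 0,
                         (T - w)[:, None] * np.log1p(-p)[None, :], 0.0))
    lp = ll + n_grid * np.log(lam) - log_fact  # + Poisson(n | lam) log pmf
    lp -= lp.max(axis=1, keepdims=True)
    prob = np.exp(lp)
    prob /= prob.sum(axis=1, keepdims=True)
    u = rng.uniform(size=(R, 1))               # vectorized categorical draw
    N = (prob.cumsum(axis=1) < u).sum(axis=1)

    # [lam | {N_i}]: conjugate Gamma draw (Eq. 2-8 stand-in)
    lam = rng.gamma(a + N.sum(), 1.0 / (1.0 / b + R))

    # [r | {w_i}, {N_i}]: Metropolis step with a Gaussian proposal on q
    q_prop = q + rng.normal(0.0, 0.5)
    if np.log(rng.uniform()) < log_post_q(q_prop, N) - log_post_q(q, N):
        q = q_prop

    if j >= burn:
        r_draws.append(inv_logit(q))
        lam_draws.append(lam)

print("posterior means:", np.mean(r_draws), np.mean(lam_draws))
```

Working on q rather than r lets the Gaussian proposal roam the whole real line, and the back-transform r = e^q / (1 + e^q) keeps every proposed detection probability in (0, 1); the Uniform(0,1) prior on r then contributes the Jacobian term log[r(1-r)] on the q scale.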
Simulation Design
Royle and Nichols (2003) have already demonstrated the performance of the model in a variety
of large-sample situations and have established that likelihood-based inference about λ works
reasonably well even for low values of r and T when R is 200
or greater. However, in their simulation design, they have chosen values for the true value of λ