Given a two-dimensional parameter $(\theta_1,\theta_2)$ and a target density $\pi(\theta_1,\theta_2)$, you can implement numerous Metropolis-Hastings moves, among which:
- generate a proposed value from a joint distribution with density $\varpi(\theta_1,\theta_2)$ and accept it with the corresponding Metropolis-Hastings acceptance probability;
- generate a proposed value one parameter at a time from a one-dimensional proposal density $\varpi_i(\theta_i)$ and accept each such proposal with its own Metropolis-Hastings acceptance probability.
In particular, if the component-wise proposal is the full conditional of $\theta_i$ given the other component, the acceptance probability is identically one and this corresponds to the regular Gibbs sampler, as illustrated below.
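The one-parameter-at-a-time strategy can be sketched in R as follows, on a toy target (a correlated bivariate normal, which is not the posterior discussed below) with a symmetric random-walk proposal; the target, proposal scale, and chain length are all illustrative choices:

```r
# One-variable-at-a-time random-walk Metropolis-Hastings (illustrative sketch:
# the target is a toy correlated bivariate normal, not the posterior below)
log_target <- function(th) {
  rho <- 0.8   # correlation of the toy N(0,Sigma) target
  -(th[1]^2 - 2*rho*th[1]*th[2] + th[2]^2) / (2*(1 - rho^2))
}
set.seed(1)
Nsim <- 10^4
chain <- matrix(0, Nsim, 2)            # chain started at (0,0)
for (t in 2:Nsim) {
  th <- chain[t-1, ]
  for (i in 1:2) {                     # update each component in turn
    prop <- th
    prop[i] <- th[i] + rnorm(1)        # symmetric random-walk proposal
    # symmetric proposal: the MH acceptance ratio reduces to a target ratio
    if (log(runif(1)) < log_target(prop) - log_target(th)) th <- prop
  }
  chain[t, ] <- th
}
colMeans(chain)                        # sample means, near the true (0,0)
```

Replacing the random-walk step on component $i$ by a draw from the full conditional of $\theta_i$ would make every proposal accepted, recovering the Gibbs sampler.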
[Excerpt from our book Introducing Monte Carlo Methods with R, Example 7.3, pp.202-203, with a few typos corrected]
Consider the posterior distribution on $(\theta,\sigma^2)$ associated with the joint model
\begin{eqnarray}
X_i &\sim& {\cal N}(\theta,\sigma^2), \quad i=1, \ldots, n, \\
\theta &\sim& {\cal N}(\theta_0,\tau^2)\,,\quad
\sigma^2 \sim {\cal IG}(a,b),\nonumber
\end{eqnarray}
where ${\cal IG}(a,b)$ is the inverted Gamma distribution
(that is, the distribution of the inverse of a Gamma variable),
with density $b^a (1/x)^{a+1}e^{-b/x}/\Gamma(a)$ and with $\theta_0, \tau^2, a, b$ specified.
Writing $\mathbf{x} = (x_1, \ldots,x_n)$, the posterior distribution on $(\theta,\sigma^2)$ is given by
\begin{eqnarray}\label{eq:firstjoint}
f(\theta,\sigma^2|\mathbf{x}) &\propto& \left[\frac{1}{(\sigma^2)^{n/2}}e^{-\sum_i(x_i-\theta)^2\big/2\sigma^2}\right] \\
&&\times \left[\frac{1}{\tau}e^{-(\theta-\theta_0)^2/2\tau^2}\right] \times \left[\frac{1}{(\sigma^2)^{a+1}}e^{-b/\sigma^2}\right],\nonumber
\end{eqnarray}
from which we can get the full conditionals of $\theta$ and $\sigma^2$. (Note that this is not a regular conjugate setting in
that integrating $\theta$ or $\sigma^2$ in this density does not produce a standard density.)
We have
\begin{eqnarray}\label{eq:firstposterior}
\pi(\theta | \mathbf{x},\sigma^2) &\propto& e^{-\sum_i(x_i-\theta)^2/2\sigma^2}\,e^{-(\theta-\theta_0)^2/2\tau^2}\,, \\
&&\\
\pi(\sigma^2 | \mathbf{x},\theta) &\propto& \left(\frac{1}{\sigma^2}\right)^{n/2+a+1}\exp\left\{-\frac{1}{\sigma^2} \left(\frac{1}{2}\sum_i(x_i-\theta)^2+b\right)\right\}\,.
\end{eqnarray}
These densities correspond to
$$
\theta | \mathbf{x},\sigma^2\sim
{\cal N}\left(\frac{\sigma^2}{\sigma^2+n \tau^2}\;\theta_0 + \frac{n\tau^2}{\sigma^2+n \tau^2}
\;\bar x, \; \frac{\sigma^2 \tau^2}{\sigma^2+n \tau^2}\right)
$$
and
$$
\sigma^2 | \mathbf{x},\theta \sim {\cal IG}\left(\frac{n}{2}+a, \frac{1}{2}\sum_i( x_i - \theta)^2+b \right),
$$
where $\bar x$ is the empirical average of the observations,
as the full conditional distributions to be used in a Gibbs sampler.
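Since R has no built-in inverted-gamma sampler, ${\cal IG}(a,b)$ draws are obtained as reciprocals of ${\cal G}a(a,b)$ draws, which is how the code below simulates $\sigma^2$. A quick sanity check, with illustrative (not the example's) values of $a$ and $b$:

```r
# IG(a,b) draws as reciprocals of Gamma(shape=a, rate=b) draws;
# for a>1 the IG(a,b) mean is b/(a-1), which serves as a sanity check
set.seed(42)
a <- 3; b <- 2                     # illustrative hyperparameter values
sims <- 1/rgamma(10^5, shape=a, rate=b)
mean(sims)                         # should be close to b/(a-1)=1
```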
A study on metabolism in 15-year-old females yielded the following data, denoted by $\mathbf{x}$,
> x=c(91,504,557,609,693,727,764,803,857,929,970,1043,
+ 1089,1195,1384,1713)
corresponding to their energy intake, measured in megajoules,
over a 24-hour period. Using the normal model above, with $\theta$ corresponding to the
true mean energy intake, the Gibbs sampler can be implemented as
> xbar=mean(x)
> n=length(x)
> sh1=(n/2)+a
> sigma2=theta=rep(0,Nsim) #init arrays
> sigma2[1]=1/rgamma(1,shape=a,rate=b) #init chains
> B=sigma2[1]/(sigma2[1]+n*tau2)
> theta[1]=rnorm(1,mean=B*theta0+(1-B)*xbar,sd=sqrt(tau2*B))
> for (i in 2:Nsim){
+ B=sigma2[i-1]/(sigma2[i-1]+n*tau2)
+ theta[i]=rnorm(1,mean=B*theta0+(1-B)*xbar,sd=sqrt(tau2*B))
+ ra1=(1/2)*(sum((x-theta[i])^2))+b
+ sigma2[i]=1/rgamma(1,shape=sh1,rate=ra1)
+ }
where the number of iterations Nsim and the hyperparameters theta0, tau2, a, and b are specified values.
The posterior means of $\theta$ and $\sigma^2$ are 872.402 and 136,229.2, respectively, giving 369.092 as an estimate of $\sigma$.
Histograms of the posterior distributions of $\theta$ and $\sigma$ are given in the following Figure.
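For completeness, the run can be reproduced end-to-end as below; the hyperparameter values (theta0, tau2, a, b, Nsim) are illustrative guesses rather than the settings behind the numbers above, so the summaries will differ somewhat:

```r
# Self-contained rerun of the Gibbs sampler above; theta0, tau2, a, b, Nsim
# are illustrative choices, not the values used for the results quoted above
set.seed(2016)
x=c(91,504,557,609,693,727,764,803,857,929,970,1043,
    1089,1195,1384,1713)
n=length(x); xbar=mean(x)
theta0=850; tau2=1e5; a=3; b=3          # hypothetical hyperparameters
Nsim=10^4
sigma2=theta=rep(0,Nsim)                # init arrays
sigma2[1]=1/rgamma(1,shape=a,rate=b)    # init chains
sh1=(n/2)+a
B=sigma2[1]/(sigma2[1]+n*tau2)
theta[1]=rnorm(1,mean=B*theta0+(1-B)*xbar,sd=sqrt(tau2*B))
for (i in 2:Nsim){
  B=sigma2[i-1]/(sigma2[i-1]+n*tau2)
  theta[i]=rnorm(1,mean=B*theta0+(1-B)*xbar,sd=sqrt(tau2*B))
  sigma2[i]=1/rgamma(1,shape=sh1,rate=(1/2)*sum((x-theta[i])^2)+b)
}
mean(theta); mean(sigma2); sqrt(mean(sigma2))  # posterior summaries
hist(theta,main="theta"); hist(sqrt(sigma2),main="sigma")
```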
