I am working on Bayesian estimation: suppose that $X_1,\dots, X_n$ is an iid sample from Uniform$[0,\theta]$. Assume a Pareto prior for $\theta$, i.e. $\theta\sim Pareto(\alpha,\beta)$ with density $$ f(\theta)=\frac{\alpha\beta^\alpha}{\theta^{\alpha+1}}, \quad \theta\ge \beta,\ \alpha>0,\ \beta>0. $$
(1) What is the Bayes estimator of $\theta$? Do the prior and posterior belong to the same family of distributions (i.e., is the prior conjugate)?
(2) What does this estimator converge to as $n\to \infty$?
My work is as follows.
The prior distribution is $$ \pi(\theta)=\frac{\alpha\beta^\alpha}{\theta^{\alpha+1}}I[\theta\ge \beta] $$ and the likelihood function is $$ f(X|\theta)=\prod_{i=1}^n f(x_i;\theta)=\frac{1}{\theta^n}I[0\le X_{(1)}\le X_{(n)}\le \theta], $$ where $X_{(1)}\le \dots\le X_{(n)}$ are the order statistics.
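Just to convince myself that the likelihood depends on the data only through $n$ and $X_{(n)}$, here is a tiny numerical sketch (the sample size, seed and $\theta=2$ below are arbitrary choices of mine, not part of the problem):

```python
import numpy as np

def uniform_loglik(theta, x):
    """Log-likelihood of an iid Uniform[0, theta] sample:
    -n * log(theta) when theta >= X_(n), and -inf otherwise."""
    x = np.asarray(x)
    if theta < x.max():               # indicator I[X_(n) <= theta] is violated
        return -np.inf
    return -x.size * np.log(theta)

rng = np.random.default_rng(0)
x = rng.uniform(0, 2.0, size=10)      # theta = 2 is an illustrative value
for t in (1.0, x.max(), 2.5):
    print(t, uniform_loglik(t, x))
```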
Then the posterior distribution satisfies $$ \pi(\theta|X)\propto \pi(\theta)f(X|\theta)=\frac{\alpha\beta^\alpha}{\theta^{n+\alpha+1}}I[0\le X_{(1)}\le X_{(n)}\le \theta]\,I[\theta\ge \beta]. $$
So for $\beta\ge X_{(n)}$, $\pi(\theta|X)\propto \frac{\alpha\beta^\alpha}{\theta^{n+\alpha+1}}I[\theta\ge \beta]$.
For $\beta<X_{(n)}$, $\pi(\theta|X)\propto \frac{\alpha\beta^\alpha}{\theta^{n+\alpha+1}}I[\theta\ge X_{(n)}]$.
But I have no idea what the distribution of $\theta|X$ is. Does this mean $\theta|X\sim Pareto(n+\alpha,\beta)$?
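I tried to check this numerically by normalizing the kernel $\theta^{-(n+\alpha+1)}$ on a grid and comparing it with a Pareto density whose scale is $\max(\beta, X_{(n)})$ (my attempt to cover both cases at once); the values $\alpha=3$, $\beta=0.5$, $\theta=2$, $n=20$ are arbitrary choices for the experiment:

```python
import numpy as np
from scipy import stats, integrate

rng = np.random.default_rng(1)
alpha, beta, theta_true, n = 3.0, 0.5, 2.0, 20      # illustrative values only
x = rng.uniform(0, theta_true, size=n)

# posterior kernel: theta^-(n+alpha+1) on [max(beta, X_(n)), infinity)
lo = max(beta, x.max())
grid = np.linspace(lo, lo + 5, 20001)
kernel = grid ** (-(n + alpha + 1))
post = kernel / integrate.trapezoid(kernel, grid)    # normalize numerically

# candidate closed form: Pareto(n + alpha) with scale max(beta, X_(n))
cand = stats.pareto.pdf(grid, n + alpha, scale=lo)

print(np.max(np.abs(post - cand)))                   # small if the guess is right
```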
For the Bayes estimator under squared error loss, $$ \hat{\theta}=E[\theta|X]=\int_{\mathbb{R}} \theta\, \pi(\theta|x)\,d\theta. $$
When $\beta\ge X_{(n)}$, $$ \hat{\theta}=E[\theta|X]=\int_\beta^\infty \theta\, \frac{(n+\alpha)\beta^{n+\alpha}}{\theta^{n+\alpha+1}}\,d\theta=\frac{(n+\alpha)\beta}{n+\alpha-1}. $$
When $\beta< X_{(n)}$, the pdf of $\theta|X$ is $$ g(\theta|x)=\frac{(n+\alpha)(X_{(n)})^{n+\alpha}}{\theta^{n+\alpha+1}}I[\theta\ge X_{(n)}], $$ and then $$ \hat{\theta}=E[\theta|X]=\frac{(n+\alpha)X_{(n)}}{n+\alpha-1}. $$
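As a sanity check of this posterior-mean formula, I compared it with a numerical integral of $\theta$ against the (conjectured) Pareto posterior density; again the parameter values are arbitrary:

```python
import numpy as np
from scipy import stats, integrate

rng = np.random.default_rng(2)
alpha, beta, theta_true, n = 3.0, 0.5, 2.0, 20       # illustrative values only
x = rng.uniform(0, theta_true, size=n)
m = max(beta, x.max())                               # lower end of the posterior support

# closed-form posterior mean, covering both cases via m = max(beta, X_(n))
closed_form = (n + alpha) * m / (n + alpha - 1)

# brute force: integrate theta * posterior pdf numerically
grid = np.linspace(m, m + 10, 200001)
pdf = stats.pareto.pdf(grid, n + alpha, scale=m)
numeric = integrate.trapezoid(grid * pdf, grid)

print(closed_form, numeric)                          # the two should agree closely
```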
It seems that this result is not right, because I expected the Bayes estimator to be a weighted average of the prior mean ($E[\theta]=\frac{\alpha\beta}{\alpha-1}$) and the sample mean $\bar{X}$.
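To get a feel for question (2), I also tabulated the estimator for increasing $n$ (again with arbitrary values $\theta=2$, $\alpha=3$, $\beta=0.5$, so the prior mean is $\alpha\beta/(\alpha-1)=0.75$); in this experiment the posterior mean seems to track $X_{(n)}$ rather than any blend with the prior mean or $\bar{X}$:

```python
import numpy as np

rng = np.random.default_rng(3)
alpha, beta, theta_true = 3.0, 0.5, 2.0              # illustrative values only
print("prior mean:", alpha * beta / (alpha - 1))

for n in (5, 50, 500, 5000):
    x = rng.uniform(0, theta_true, size=n)
    m = max(beta, x.max())
    bayes = (n + alpha) * m / (n + alpha - 1)        # posterior mean derived above
    print(n, "X_(n) =", round(x.max(), 4),
          "posterior mean =", round(bayes, 4),
          "Xbar =", round(x.mean(), 4))
```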
