Suppose that I have $X_{i} \overset{i.i.d.}{\sim} P$ with $E[X_{i}]=\mu$ and $V[X_{i}] = \sigma^{2}<\infty$.
Then by the central limit theorem I know that:
\begin{align} \sqrt{n} (\bar{X}_{n} - \mu) \overset{d}{\to} N(0,\sigma^{2}) \end{align}
where $\bar{X}_{n}$ is the sample average. Suppose for some silly reason I know the value of $\sigma^{2}$. Then this asymptotic approximation allows me to justify confidence sets for $\mu$ of the form:
\begin{align} \bar{X}_{n} \pm q_{\alpha/2} \sqrt{\frac{\sigma^{2}}{n}} \end{align}
where $q_{\alpha/2}$ is the $(\alpha/2)$ quantile of the standard normal (so $q_{\alpha/2} < 0$). In particular:
\begin{align} \lim_{n \to \infty} P \left(q_{\alpha/2} \leq \sqrt{n} \frac{(\bar{X}_{n} - \mu)}{\sigma} \leq -q_{\alpha/2} \right) &= 1-\alpha\\ \implies \lim_{n \to \infty} P \left(\bar{X}_{n} + q_{\alpha/2}\frac{\sigma}{\sqrt{n}} \leq \mu \leq \bar{X}_{n} - q_{\alpha/2}\frac{\sigma}{\sqrt{n}} \right) &= 1-\alpha \end{align}

For simplicity, let:
$$CI_{1} = \left[\bar{X}_{n} + q_{\alpha/2}\frac{\sigma}{\sqrt{n}} ,\; \bar{X}_{n} - q_{\alpha/2}\frac{\sigma}{\sqrt{n}} \right].$$

Now suppose that I am a strange statistician and, rather than the confidence interval constructed above, I prefer a confidence interval (for whatever reason) of my own making:
$$CI_{2} = \left[\bar{X}_{n} + q_{\alpha/2}\frac{\sigma}{\sqrt{n}} + b_{n} ,\; \bar{X}_{n} - q_{\alpha/2}\frac{\sigma}{\sqrt{n}} - b_{n}\right],$$
where $b_{n} = o(n^{-1/2})$ is some vanishing deterministic sequence. Note that $CI_{2}$ also provides $1-\alpha$ coverage probability asymptotically: since $\sqrt{n}\, b_{n} \to 0$, the endpoints of $CI_{2}$ differ from those of $CI_{1}$ by $o(\sigma/\sqrt{n})$, so the limiting coverage is unchanged.
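To make this concrete, here is a minimal simulation sketch in Python (numpy/scipy). The normal data-generating process, the parameter values, and the particular choice $b_{n} = 1/n$ (which is $o(n^{-1/2})$) are all my own illustrative assumptions; the point is only that both intervals cover $\mu$ at roughly the nominal rate for a moderately large $n$:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Illustrative assumptions: P = N(mu, sigma^2) with sigma known, b_n = 1/n (which is o(n^{-1/2})).
mu, sigma = 2.0, 3.0
alpha = 0.05
q = norm.ppf(alpha / 2)          # lower-tail quantile, so q < 0
n, n_sims = 500, 20_000
b_n = 1.0 / n

cover1 = cover2 = 0
for _ in range(n_sims):
    xbar = rng.normal(mu, sigma, size=n).mean()
    half = -q * sigma / np.sqrt(n)                # half-width of CI_1
    ci1 = (xbar - half, xbar + half)              # CI_1
    ci2 = (xbar - half + b_n, xbar + half - b_n)  # CI_2: each endpoint shifted by b_n
    cover1 += ci1[0] <= mu <= ci1[1]
    cover2 += ci2[0] <= mu <= ci2[1]

print("CI_1 coverage:", cover1 / n_sims)  # ~0.95
print("CI_2 coverage:", cover2 / n_sims)  # ~0.95, since b_n is tiny relative to half
```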
My question: is there any reason to prefer $CI_{1}$ to $CI_{2}$? Asymptotically they are the same, so I suspect any reason would need to appeal to finite-sample arguments. For example, I can always construct the sequence $b_{n}$ such that $CI_{1}$ and $CI_{2}$ are VERY different in finite samples. So what statistical justification would lead someone to use $CI_{1}$ rather than $CI_{2}$? Is there a name for the desirable property $CI_{1}$ possesses that $CI_{2}$ does not?
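As a concrete sketch of what I mean (the sequence and the numbers are hypothetical choices of mine): take $b_{n} = -100/n^{0.6}$, which is deterministic and $o(n^{-1/2})$, yet at $n = 100$ (with $\sigma = 3$, $\alpha = 0.05$) the two intervals look nothing alike:

```python
import numpy as np
from scipy.stats import norm

sigma, alpha, n = 3.0, 0.05, 100
q = norm.ppf(alpha / 2)              # q < 0

half1 = -q * sigma / np.sqrt(n)      # half-width of CI_1
b_n = -100 / n**0.6                  # o(n^{-1/2}), but large in magnitude at n = 100
half2 = half1 - b_n                  # half-width of CI_2

print(half1)   # ~0.59
print(half2)   # ~6.9: CI_2 is more than ten times wider than CI_1
```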
Thanks so much!