Fisher's information associated with a statistic $T$ is the Fisher information associated with the distribution of that statistic:
$$I_T(\theta) = \mathbb E_\theta\Big[\Big(\frac{\partial}{\partial \theta}\log f^T_\theta(T(X))\Big)^\prime\,\frac{\partial}{\partial \theta}\log f^T_\theta(T(X))\Big]$$
It is thus possible to compare Fisher informations across statistics. For instance, Fisher's information associated with a sufficient statistic is the same as that of the entire sample $X$. At the other end of the spectrum, Fisher's information provided by an ancillary statistic is null.
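As a quick illustration in the Normal setting used below: for an iid $\mathcal N(\theta,1)$ sample, the sufficient statistic $\bar X_n\sim\mathcal N(\theta,1/n)$ satisfies
$$I_{\bar X_n}(\theta)=\mathbb E_\theta\big[\{n(\bar X_n-\theta)\}^2\big]=n=I_X(\theta)\,,$$
while an ancillary statistic like $X_1-X_2\sim\mathcal N(0,2)$ has a $\theta$-free distribution and hence zero information.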
Finding Fisher's information provided by the sample median is somewhat of a challenge. However, running a Monte Carlo experiment with $n$ large shows that the variance of the median is approximately 1.5-1.7 times larger than the variance of the empirical mean, which suggests a Fisher information approximately 1.5-1.7 times smaller for the median, as sketched below.
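For illustration, here is a minimal version of such an experiment (my own sketch, with arbitrary choices of sample size and number of replications):

```python
import numpy as np

# Monte Carlo comparison of the variance of the sample median with that of
# the sample mean for N(theta, 1) data; taking theta = 0 costs no generality
# since both estimators are location equivariant.
rng = np.random.default_rng(0)
n, reps = 10**3, 10**4          # arbitrary illustration choices
x = rng.standard_normal((reps, n))
ratio = np.median(x, axis=1).var() / x.mean(axis=1).var()
print(ratio)                    # typically lands in the quoted 1.5-1.7 range
```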
The exact expression of the (constant) Fisher information about $\theta$ attached to the median statistic $X_{(n/2)}$ of a $\mathcal N(\theta,1)$ sample ($n$ even) is
$$1 - \mathbb E_0\left[\frac{\partial^2}{\partial \theta^2}\left\{ (n/2 - 1) \log \Phi (X_{(n/2)}) + (n - n/2) \log\Phi (-X_{(n/2)})\right\}\right]$$
where the expectation is taken under $\theta=0$. It can also be written as
$$1+n\mathbb E[Z_{n/2:n}\varphi(Z_{n/2:n})]-n\mathbb E[Z_{n/2:n-1}\varphi(Z_{n/2:n-1})]\\
+\frac{n(n-1)}{n/2-2}\,\mathbb E[\varphi(Z_{n/2-2:n-2})^2]+\frac{n(n-1)}{n-n/2-1}\,\mathbb E[\varphi(Z_{n/2:n-2})^2]\tag{1}$$
(after correction of a typo in the thesis), where $Z_{k:m}$ denotes the $k$-th order statistic of an iid $\mathcal N(0,1)$ sample of size $m$.
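Both representations are amenable to Monte Carlo evaluation. As an illustration (my own sketch, not the thesis code), the first expression can be approximated by simulating $X_{(n/2)}$ under $\theta=0$ through the classical Beta representation $X_{(k)}\overset{d}{=}\Phi^{-1}(B)$ with $B\sim\mathcal Be(k,n-k+1)$, using the closed-form second derivatives of $\log\Phi(x)$ and $\log\Phi(-x)$:

```python
import numpy as np
from scipy.stats import norm

# Monte Carlo sketch of 1 - E_0[(n/2-1) l1''(X) + (n-n/2) l2''(X)], with
# l1(x) = log Phi(x), l2(x) = log Phi(-x) and X the median order statistic
# X_(n/2), simulated via the Beta representation U_(k:n) ~ Beta(k, n-k+1).
rng = np.random.default_rng(1)
n, reps = 10**4, 10**5
k = n // 2
x = norm.ppf(rng.beta(k, n - k + 1, size=reps))   # draws of X_(n/2) at theta=0
phi, Phi = norm.pdf(x), norm.cdf(x)
l1pp = -(x * phi * Phi + phi**2) / Phi**2              # (log Phi)''(x)
l2pp = (x * phi * (1 - Phi) - phi**2) / (1 - Phi)**2   # (log Phi(-.))''(x)
info = 1 - np.mean((k - 1) * l1pp + (n - k) * l2pp)
print(n / info)   # ratio of full-sample to median information, around 1.57
```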
As stated in this same thesis,

"The median order statistics contain the most information about $\theta$. (...) For $n = 10$, the $X_{5:10}$ and $X_{6:10}$ each contain 0.6622 times the total information in the sample. For $n = 20$, the proportion of information contained in the median statistic is 0.6498."
Since $1/0.6498=1.5389$, it is already close to $1/\{4\varphi(0)^2\}=\pi/2\approx 1.5708$ (the asymptotic ratio, as $4\varphi(0)^2=2/\pi$), while a Monte Carlo approximation of the ratio of the total information $n$ to (1) returns $1.5706$ for $n=10^4$.
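For completeness, here is one way to reproduce that Monte Carlo approximation of (1) (again a sketch, under my reading that $Z_{k:m}$ is the $k$-th order statistic of $m$ iid $\mathcal N(0,1)$ variates and that the last two terms of (1) are expectations):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

def E(m, j, fn, reps=10**6):
    """Monte Carlo estimate of E[fn(Z_{j:m})], using the Beta representation
    Z_{j:m} = Phi^{-1}(B) with B ~ Beta(j, m - j + 1)."""
    z = norm.ppf(rng.beta(j, m - j + 1, size=reps))
    return fn(z).mean()

phi = norm.pdf
n = 10**4
k = n // 2
# the four Monte Carlo terms of (1), in the order they appear above
info = (1
        + n * E(n, k, lambda z: z * phi(z))
        - n * E(n - 1, k, lambda z: z * phi(z))
        + n * (n - 1) / (k - 2) * E(n - 2, k - 2, lambda z: phi(z)**2)
        + n * (n - 1) / (n - k - 1) * E(n - 2, k, lambda z: phi(z)**2))
print(n / info)   # total-to-median information ratio; compare with 1.5706
```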