Given a $K$-dimensional real-valued vector $\mathbf{z} = (z_1, z_2, \ldots, z_K)$, I know that the softmax function returns a vector $\sigma(\mathbf{z})$ with positive elements summing to 1 via the following formula:
$$ \sigma(\mathbf{z})_j = \frac{e^{z_j}}{\sum_{k=1}^K e^{z_k}}, \quad j = 1, \ldots, K $$
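For concreteness, here's a minimal NumPy sketch of that formula (the function name `softmax` and the max-subtraction trick for numerical stability are just my own choices, not part of the question):

```python
import numpy as np

def softmax(z):
    """Standard softmax: exponentiate each entry and normalize so the result sums to 1."""
    z = np.asarray(z, dtype=float)
    e = np.exp(z - z.max())  # subtract the max for numerical stability; ratios are unchanged
    return e / e.sum()

print(softmax([1.0, 2.0, 3.0]))  # ~[0.090, 0.245, 0.665]
```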
Recently a colleague mentioned to me a variant of the softmax function, which I'll call $\sigma'(\cdot)$, that takes the following form:
$$ \sigma'(\mathbf{z})_j = \frac{\alpha^{r(\mathbf{z}, j)}}{\sum_{k=1}^K \alpha^{r(\mathbf{z}, k)}}, \quad j = 1, \ldots, K $$
Here, $\alpha > 0$ is a constant and $r(\mathbf{z}, j)$ is the rank of $z_j$ within the vector $\mathbf{z}$: the smallest value takes rank 1, the second smallest rank 2, and so on, up to the largest value, which takes rank $K$.
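To make sure I'm describing it correctly, here's a small NumPy sketch of $\sigma'$ as I understand it (the name `rank_softmax` and the tie-breaking via `argsort` are my own; my colleague didn't say how ties should be handled):

```python
import numpy as np

def rank_softmax(z, alpha=2.0):
    """Rank-based softmax variant: weights depend only on the ranks of z, not its values."""
    z = np.asarray(z, dtype=float)
    # ranks: smallest value gets rank 1, largest gets rank K (ties broken arbitrarily by argsort)
    ranks = np.empty(len(z), dtype=float)
    ranks[np.argsort(z)] = np.arange(1, len(z) + 1)
    w = alpha ** ranks
    return w / w.sum()

print(rank_softmax([0.1, 5.0, -3.0], alpha=2.0))  # ranks (2, 3, 1) -> [4, 8, 2] / 14
```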
Does this softmax variant (or a similar one based on the ranks of a vector's values instead of the values themselves) have a name, and is it used in practice?