In statistics, we often assume that a particular variable follows a certain distribution. For example, if we know $Y \in \{0, 1\}$, then we can assume $Y \sim \text{Bernoulli}(p)$, since the Bernoulli is essentially the only way to model a binary random variable. This assumption allows us to rely on tools such as logistic regression that make specific parametric assumptions about the distribution of $Y$.
However, the relationship between the type of data and the distribution is not always so clear-cut. If we know that $Y$ represents a count of something, then there are multiple options (Binomial, Negative Binomial, Poisson, etc.), depending on how that count is generated. Furthermore, other distributions purport to model specific types of data, such as the Beta for proportions and the Gamma for time-to-event data.
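For instance, here is a quick sketch (using scipy, with parameter values I picked arbitrarily for illustration) of two count distributions that share the same support, and even the same mean, yet are clearly not the same distribution:

```python
from scipy import stats

# Two count distributions with the same support {0, 1, 2, ...} and the same mean.
mean = 5
poisson = stats.poisson(mu=mean)

# Negative binomial with n "successes" and success probability p has mean n*(1-p)/p.
n, p = 5, 0.5              # mean = 5 * 0.5 / 0.5 = 5
negbin = stats.nbinom(n, p)

print(" k   Poisson pmf   NegBin pmf")
for k in range(16):
    print(f"{k:2d}   {poisson.pmf(k):.4f}        {negbin.pmf(k):.4f}")

# Same support and same mean, yet different distributions: the negative binomial
# is overdispersed (variance n*(1-p)/p**2 = 10), while the Poisson has variance 5.
```

So knowing only "$Y$ is a count" does not pin down which of these models is appropriate.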
Hence, we can make logical statements like "If $Y \sim \text{Beta}$, then $Y$ is a proportion." But what about the converse? Is it true that "If $Y$ is a proportion, then $Y \sim \text{Beta}$"? When can we say "If $Y$ is a _______, then $Y \sim f_Y(y)$"? Intuitively, it seems possible for the Bernoulli ("If $Y \in \{0, 1\}$, then $Y \sim \text{Bernoulli}(p)$"), but I have no proof of this. Do any other distributions possess this property? And how would you prove something like this?
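My suspicion is that the converse fails for the Beta. For example (assuming I have not made an error), consider an equal mixture of two Betas:

$$
f_Y(y) = \tfrac{1}{2}\, f_{\text{Beta}(2,\,10)}(y) + \tfrac{1}{2}\, f_{\text{Beta}(10,\,2)}(y), \qquad y \in (0, 1).
$$

This $Y$ is supported on $(0,1)$, so it is a "proportion" in the same sense, but its density has two interior modes (near $0.1$ and $0.9$), and as far as I can tell no single Beta density has more than one interior mode, so $Y$ cannot be Beta-distributed.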
EDIT: To clarify, I suppose a more specific question is: when does the support of a random variable determine its distributional family (leaving the parameters free)?
