Here are some resources that can help.
The UCLA Introduction to Generalized Linear Mixed Models nicely works up from linear mixed models in general to what's meant by a "generalized" linear mixed model.
There is a related R Data Analysis Example on Mixed Effects Logistic Regression that goes into great detail on how to set up and work with the results of a mixed logistic regression.
This site's lmer cheat sheet explains how the notation corresponds to what coefficient values the model will try to estimate.
For both your models, what's hidden is the default logit (log-odds) link of the binomial generalized linear model. If $y$ is the 1/0 outcome, then both models fit the logit of the expectation of $y$, $E(y)$, in terms of a linear predictor $\eta$:
$$\text{logit}(E(y))=\eta. $$
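As a small numeric sketch of what that link function does (plain Python, names `logit` and `inv_logit` are just illustrative):

```python
import math

def logit(p):
    """Log-odds: maps a probability in (0, 1) to the whole real line."""
    return math.log(p / (1 - p))

def inv_logit(eta):
    """Inverse link: maps a linear predictor back to a probability."""
    return 1 / (1 + math.exp(-eta))

# A linear predictor of 0 corresponds to a probability of 0.5:
p = inv_logit(0.0)   # 0.5

# The two functions undo each other, so fitting on the logit scale
# still lets you recover predicted probabilities afterward:
eta = logit(0.8)
p_back = inv_logit(eta)   # 0.8
```

This is why software can report results either on the log-odds scale (the scale of $\eta$) or on the probability scale.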
The linear predictor is what you provide on the right side of the formula, and differs for your two models. For a mixed model, this is often written in a matrix form:
$$\eta= X\beta +Z \gamma,$$
where $X$ represents the fixed-effect predictors with corresponding coefficients $\beta$, and $Z$ is the random-effects design matrix with corresponding random-effect values $\gamma$. Each of your models has only a single fixed-effect predictor (inst or inst_diff_1).
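To make the matrix form concrete, here is a hedged NumPy sketch with invented dimensions (3 hypothetical fish, 2 observations each; none of the numbers come from your data):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 6  # total observations
# Fixed-effects design: an intercept column plus one predictor column:
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta = np.array([-0.5, 1.2])  # hypothetical fixed-effect coefficients

# Random-intercept design: an indicator matrix mapping each of the
# 6 observations to one of 3 fish (each fish observed twice):
Z = np.kron(np.eye(3), np.ones((2, 1)))
gamma = rng.normal(scale=0.7, size=3)  # one random intercept per fish

eta = X @ beta + Z @ gamma        # linear predictor, one value per observation
p = 1 / (1 + np.exp(-eta))        # inverse logit -> P(y = 1) per observation
```

The key point is that $Z\gamma$ just adds each fish's own offset to the rows belonging to that fish; the inverse-logit step then turns every row's $\eta$ into a probability.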
The cheat sheet shows that the first model has random intercepts in the linear predictor associated with fish_ID; those are modeled as having a Gaussian distribution with 0 mean and variance estimated from the data. The second model has both random intercepts and random slopes (with respect to inst_diff_1) associated with fish_ID, again modeled as Gaussian distributions with 0 means and variances estimated from the data. The cheat sheet shows that, in the form you specified, a correlation between the random intercepts and slopes is also estimated.
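To visualize what the second model assumes about those random effects, here is a simulation sketch: per-fish intercepts and slopes drawn from a bivariate Gaussian with 0 means, where the two standard deviations and the correlation play the role of the quantities the model estimates (all numbers here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

n_fish = 100
sd_int, sd_slope, rho = 0.8, 0.3, -0.4  # hypothetical SDs and correlation

# Covariance matrix of (random intercept, random slope):
cov = np.array([[sd_int**2,               rho * sd_int * sd_slope],
                [rho * sd_int * sd_slope, sd_slope**2]])

# One (intercept, slope) pair per fish, jointly Gaussian with 0 means:
effects = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n_fish)
intercepts, slopes = effects[:, 0], effects[:, 1]

# For fish i, the random-effects contribution to the linear predictor is
#   intercepts[i] + slopes[i] * inst_diff_1,
# added on top of the fixed-effect part.
```

Estimating `rho` jointly with the two variances is what the default `(inst_diff_1 | fish_ID)` form does; a model with `||` instead would fix that correlation at 0.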