Assume I have a high school statistics class under my belt:
One common misunderstanding is that 95% efficacy means that in the Pfizer clinical trial, 5% of vaccinated people got COVID. But that's not true; the actual percentage of vaccinated people in the Pfizer (and Moderna) trials who got COVID-19 was about a hundred times less than that: 0.04%.
What the 95% actually means is that vaccinated people had a 95% lower risk of getting COVID-19 compared with the control group participants, who weren't vaccinated. In other words, vaccinated people in the Pfizer clinical trial were 20 times less likely than the control group to get COVID-19.
-Reference: https://www.livescience.com/covid-19-vaccine-efficacy-explained.html
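To make the quoted numbers concrete, here is a minimal sketch of the efficacy arithmetic. The case counts are assumptions for illustration, loosely based on the published Pfizer trial (roughly 8 cases among ~21,700 vaccinated participants vs 162 among ~21,700 placebo participants), not exact trial data:

```python
# Sketch: vaccine efficacy as relative risk reduction.
# Counts below are illustrative approximations of the Pfizer trial arms.

def vaccine_efficacy(cases_vax, n_vax, cases_placebo, n_placebo):
    """Efficacy = 1 - (attack rate among vaccinated / attack rate among placebo)."""
    rate_vax = cases_vax / n_vax
    rate_placebo = cases_placebo / n_placebo
    return 1 - rate_vax / rate_placebo

# Attack rate among vaccinated: 8 / 21,720, about 0.04%
print(f"Vaccinated attack rate: {8 / 21_720:.2%}")

# Efficacy: 1 - (8/21,720) / (162/21,728), about 95%
print(f"Efficacy: {vaccine_efficacy(8, 21_720, 162, 21_728):.0%}")
```

This also shows the "20 times" figure: 162 placebo cases vs 8 vaccinated cases is a roughly 20-fold difference in risk, which is the same statement as a 95% relative risk reduction.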
I would like to understand, with a "concrete" example, how these numbers are calculated, so I can understand the contours, limitations, and assumptions of what it means for efficacy to be 95% (mRNA COVID-19 vaccines) vs ~70% (JNJ).
A concrete example, or a pointer to such an example written at a college freshman level, would be appreciated.