
I would like to use a Bayesian model to determine which position (X, Y) is most likely the best position to score a goal in soccer. For this I have a dataset from a soccer club with all of its goals and shots, with X and Y coordinates and the type of shot, covering several seasons. To make it clearer, imagine we have the following visualization of goals scored at certain positions:

[plot: goals at their (X, Y) positions on the pitch]

In this plot I didn't show the non-goals, but I hope it makes things clearer. As I said, I also have the type of each shot/goal (`shotType`, which enters the model as `(1|shotType)`). A goal/shot can be taken in the following ways:

  • LeftFoot
  • RightFoot
  • Head
  • OtherBodyPart

I was thinking of using a Bayesian binary logistic regression (if you think something else is better, feel free to let me know), since we have a binary target (goal or no goal), two numeric variables, and one grouping variable. So I fitted the following model (in `result2`, 0 is no goal and 1 is a goal):

library(brms)  
fit <- brm(result2 ~ X + Y + (1|shotType), 
           data = df, 
           family = bernoulli(link = "logit"))
fit
#> Warning: There were 55 divergent transitions after warmup. Increasing
#> adapt_delta above 0.8 may help. See
#> http://mc-stan.org/misc/warnings.html#divergent-transitions-after-warmup
#>  Family: bernoulli 
#>   Links: mu = logit 
#> Formula: result2 ~ X + Y + (1 | shotType) 
#>    Data: df (Number of observations: 2728) 
#>   Draws: 4 chains, each with iter = 2000; warmup = 1000; thin = 1;
#>          total post-warmup draws = 4000
#> 
#> Group-Level Effects: 
#> ~shotType (Number of levels: 4) 
#>               Estimate Est.Error l-95% CI u-95% CI Rhat Bulk_ESS Tail_ESS
#> sd(Intercept)     1.02      0.69     0.24     2.76 1.01      525      771
#> 
#> Population-Level Effects: 
#>           Estimate Est.Error l-95% CI u-95% CI Rhat Bulk_ESS Tail_ESS
#> Intercept   -12.18      1.16   -14.51    -9.96 1.00      988      606
#> X            11.34      1.04     9.27    13.39 1.00     2463     2335
#> Y             0.23      0.47    -0.66     1.13 1.01     2025     2420
#> 
#> Draws were sampled using sampling(NUTS). For each parameter, Bulk_ESS
#> and Tail_ESS are effective sample size measures, and Rhat is the potential
#> scale reduction factor on split chains (at convergence, Rhat = 1).

Created on 2023-11-01 with reprex v2.0.2
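
As an aside, the divergent-transition warning above suggests its own remedy. A minimal re-fit sketch, same model and data, with only the sampler control tightened (0.95 is one common choice above the 0.8 default):

fit <- brm(result2 ~ X + Y + (1|shotType),
           data = df,
           family = bernoulli(link = "logit"),
           # smaller step sizes during adaptation to avoid divergences
           control = list(adapt_delta = 0.95))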

But I am now at the point that I don't understand how to interpret these results. I was assuming you could get something like a credible interval for the best range of positions to score a goal, for example X in (0.8, 0.9) and Y in (0.25, 0.3), which is like a square (good to know: the X and Y values are in the range 0 to 1, not in meters). So I was wondering if anyone could help me interpret these results, and whether I'm using the right model for determining the most likely positions to score a goal.


Here is some reproducible data for `df`:

structure(list(result2 = c(0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 
1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 
1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 
0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1, 0, 0, 
0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 1, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 
0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 
0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 1, 0, 
0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 
1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 
0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 
0, 0, 0, 1, 0), X = c(0.736, 0.895, 0.706, 0.901, 0.852, 0.952, 
0.959, 0.917, 0.937, 0.763, 0.892, 0.732, 0.933, 0.885, 0.85, 
0.852, 0.937, 0.972, 0.917, 0.872, 0.778, 0.84, 0.859, 0.917, 
0.809, 0.953, 0.841, 0.917, 0.734, 0.974, 0.877, 0.913, 0.907, 
0.952, 0.714, 0.803, 0.907, 0.964, 0.885, 0.759, 0.914, 0.853, 
0.907, 0.887, 0.739, 0.943, 0.812, 0.816, 0.822, 0.927, 0.926, 
0.913, 0.917, 0.907, 0.9, 0.902, 0.972, 0.78, 0.783, 0.93, 0.756, 
0.973, 0.926, 0.78, 0.917, 0.88, 0.958, 0.917, 0.869, 0.876, 
0.818, 0.949, 0.916, 0.75, 0.938, 0.727, 0.958, 0.774, 0.923, 
0.934, 0.851, 0.743, 0.9, 0.925, 0.98, 0.898, 0.86, 0.78, 0.876, 
0.86, 0.785, 0.91, 0.704, 0.729, 0.964, 0.879, 0.869, 0.844, 
0.841, 0.883, 0.853, 0.967, 0.95, 0.966, 0.901, 0.903, 0.893, 
0.857, 0.85, 0.734, 0.952, 0.873, 0.853, 0.903, 0.773, 0.772, 
0.855, 0.8, 0.804, 0.923, 0.839, 0.853, 0.918, 0.93, 0.973, 0.879, 
0.815, 0.916, 0.885, 0.897, 0.871, 0.717, 0.905, 0.839, 0.691, 
0.76, 0.89, 0.937, 0.844, 0.875, 0.885, 0.893, 0.982, 0.851, 
0.906, 0.738, 0.901, 0.888, 0.891, 0.775, 0.873, 0.957, 0.885, 
0.844, 0.752, 0.894, 0.876, 0.909, 0.912, 0.761, 0.752, 0.885, 
0.849, 0.862, 0.904, 0.857, 0.891, 0.919, 0.796, 0.861, 0.768, 
0.755, 0.855, 0.735, 0.763, 0.751, 0.864, 0.768, 0.785, 0.794, 
0.91, 0.814, 0.876, 0.909, 0.9, 0.755, 0.833, 0.848, 0.689, 0.845, 
0.815, 0.906, 0.797, 0.914, 0.854, 0.808, 0.921, 0.878, 0.734, 
0.773, 0.856, 0.97, 0.861, 0.888, 0.972, 0.912, 0.914, 0.911, 
0.907, 0.948, 0.876, 0.905, 0.859, 0.721, 0.914, 0.957, 0.721, 
0.806, 0.882, 0.963, 0.746, 0.951, 0.905, 0.762, 0.706, 0.955, 
0.829, 0.728, 0.889, 0.913, 0.86, 0.811, 0.896, 0.878, 0.942, 
0.9, 0.81, 0.875, 0.881, 0.941, 0.746, 0.816, 0.923, 0.894, 0.788, 
0.865, 0.864, 0.782, 0.932, 0.856, 0.723, 0.886, 0.92, 0.917, 
0.764, 0.968, 0.88, 0.915, 0.774, 0.911, 0.904, 0.743, 0.958, 
0.961, 0.867, 0.809, 0.913, 0.927, 0.803, 0.778, 0.952, 0.872, 
0.874, 0.875, 0.854, 0.879, 0.885, 0.786, 0.968, 0.863, 0.897, 
0.809, 0.927, 0.943, 0.775, 0.782, 0.969, 0.917, 0.879, 0.963, 
0.919, 0.86, 0.921, 0.849, 0.905, 0.76, 0.967, 0.692, 0.934, 
0.895, 0.767, 0.812, 0.811, 0.958, 0.93, 0.962, 0.741, 0.747, 
0.76, 0.853, 0.911, 0.889, 0.811, 0.791, 0.789, 0.867, 0.82, 
0.75, 0.872, 0.967, 0.903, 0.846, 0.797, 0.908, 0.905, 0.771, 
0.933, 0.72, 0.912, 0.979, 0.885, 0.86, 0.959, 0.977, 0.921, 
0.872, 0.9, 0.912, 0.754, 0.836, 0.89, 0.786, 0.832, 0.899, 0.838, 
0.808, 0.839, 0.916, 0.861, 0.929, 0.818, 0.902, 0.799, 0.924, 
0.963, 0.903, 0.732, 0.869, 0.758, 0.846, 0.872, 0.965, 0.793, 
0.728, 0.886, 0.806, 0.823, 0.925, 0.906, 0.96, 0.885, 0.846, 
0.806, 0.903, 0.88, 0.883, 0.799, 0.821, 0.883, 0.949, 0.886, 
0.865, 0.478, 0.917, 0.876, 0.721, 0.843, 0.866, 0.906, 0.791, 
0.954, 0.768, 0.869, 0.906, 0.955, 0.924, 0.893, 0.855, 0.855, 
0.862, 0.792, 0.973, 0.907, 0.744, 0.876, 0.838, 0.755, 0.672, 
0.909, 0.681, 0.939, 0.756, 0.783, 0.86, 0.895, 0.92, 0.817, 
0.855, 0.883, 0.873, 0.872, 0.881, 0.875, 0.902, 0.889, 0.846, 
0.727, 0.854, 0.903, 0.916, 0.86, 0.806, 0.893, 0.866, 0.807, 
0.974, 0.813, 0.961, 0.905, 0.882, 0.919, 0.758, 0.905, 0.667, 
0.929, 0.808, 0.804, 0.967, 0.763, 0.715, 0.786, 0.731, 0.71, 
0.775, 0.935, 0.801, 0.91, 0.798, 0.932, 0.773, 0.829, 0.818, 
0.961, 0.758, 0.893, 0.753, 0.919, 0.895, 0.873, 0.82, 0.885, 
0.93, 0.917, 0.866, 0.848, 0.793, 0.901, 0.885, 0.927, 0.924, 
0.855, 0.88, 0.855, 0.863, 0.892, 0.819, 0.829, 0.871, 0.893, 
0.909, 0.758, 0.885, 0.786, 0.924, 0.87, 0.72, 0.775, 0.848, 
0.983, 0.75), Y = c(0.702, 0.597, 0.479, 0.484, 0.659, 0.34, 
0.716, 0.524, 0.439, 0.682, 0.699, 0.515, 0.678, 0.375, 0.277, 
0.579, 0.53, 0.543, 0.342, 0.48, 0.434, 0.696, 0.75, 0.355, 0.618, 
0.608, 0.299, 0.397, 0.512, 0.417, 0.472, 0.32, 0.371, 0.512, 
0.643, 0.532, 0.388, 0.579, 0.5, 0.562, 0.382, 0.748, 0.48, 0.569, 
0.617, 0.589, 0.605, 0.355, 0.612, 0.693, 0.767, 0.687, 0.456, 
0.593, 0.65, 0.396, 0.68, 0.395, 0.347, 0.437, 0.367, 0.412, 
0.287, 0.627, 0.528, 0.555, 0.691, 0.427, 0.367, 0.539, 0.658, 
0.567, 0.569, 0.587, 0.458, 0.404, 0.615, 0.495, 0.723, 0.424, 
0.539, 0.709, 0.476, 0.6, 0.41, 0.629, 0.606, 0.573, 0.297, 0.72, 
0.364, 0.481, 0.482, 0.567, 0.64, 0.551, 0.311, 0.658, 0.608, 
0.662, 0.6, 0.416, 0.597, 0.647, 0.485, 0.534, 0.545, 0.521, 
0.553, 0.382, 0.569, 0.383, 0.407, 0.326, 0.388, 0.472, 0.543, 
0.645, 0.589, 0.521, 0.548, 0.717, 0.539, 0.666, 0.419, 0.33, 
0.318, 0.66, 0.5, 0.522, 0.618, 0.585, 0.347, 0.357, 0.565, 0.684, 
0.417, 0.502, 0.65, 0.693, 0.5, 0.68, 0.396, 0.569, 0.569, 0.64, 
0.476, 0.739, 0.591, 0.394, 0.558, 0.483, 0.274, 0.314, 0.699, 
0.47, 0.361, 0.324, 0.546, 0.448, 0.602, 0.5, 0.377, 0.654, 0.721, 
0.342, 0.453, 0.473, 0.261, 0.496, 0.713, 0.346, 0.482, 0.623, 
0.469, 0.394, 0.623, 0.62, 0.633, 0.579, 0.563, 0.576, 0.536, 
0.256, 0.435, 0.496, 0.678, 0.52, 0.579, 0.515, 0.684, 0.392, 
0.443, 0.395, 0.383, 0.339, 0.539, 0.482, 0.618, 0.536, 0.599, 
0.461, 0.321, 0.418, 0.597, 0.485, 0.436, 0.588, 0.421, 0.419, 
0.756, 0.575, 0.592, 0.488, 0.516, 0.56, 0.578, 0.736, 0.587, 
0.503, 0.503, 0.587, 0.445, 0.569, 0.576, 0.488, 0.555, 0.433, 
0.41, 0.497, 0.545, 0.728, 0.655, 0.328, 0.456, 0.638, 0.439, 
0.779, 0.365, 0.563, 0.338, 0.383, 0.449, 0.474, 0.501, 0.629, 
0.405, 0.6, 0.486, 0.623, 0.521, 0.597, 0.503, 0.705, 0.376, 
0.516, 0.648, 0.52, 0.668, 0.453, 0.678, 0.31, 0.598, 0.582, 
0.391, 0.6, 0.38, 0.503, 0.533, 0.664, 0.489, 0.615, 0.306, 0.541, 
0.57, 0.623, 0.5, 0.681, 0.499, 0.281, 0.455, 0.51, 0.565, 0.666, 
0.333, 0.338, 0.439, 0.343, 0.53, 0.491, 0.5, 0.462, 0.43, 0.519, 
0.668, 0.419, 0.466, 0.599, 0.614, 0.645, 0.522, 0.472, 0.491, 
0.503, 0.505, 0.466, 0.688, 0.6, 0.727, 0.666, 0.73, 0.6, 0.51, 
0.73, 0.696, 0.3, 0.611, 0.672, 0.54, 0.591, 0.487, 0.692, 0.392, 
0.493, 0.421, 0.475, 0.576, 0.541, 0.513, 0.567, 0.5, 0.63, 0.612, 
0.35, 0.576, 0.503, 0.32, 0.494, 0.366, 0.685, 0.291, 0.491, 
0.561, 0.645, 0.623, 0.555, 0.439, 0.272, 0.328, 0.534, 0.538, 
0.666, 0.496, 0.564, 0.325, 0.188, 0.416, 0.56, 0.558, 0.423, 
0.581, 0.504, 0.407, 0.649, 0.485, 0.324, 0.349, 0.51, 0.596, 
0.579, 0.5, 0.558, 0.579, 0.454, 0.34, 0.582, 0.674, 0.431, 0.458, 
0.437, 0.664, 0.371, 0.978, 0.562, 0.575, 0.6, 0.554, 0.606, 
0.367, 0.358, 0.328, 0.333, 0.604, 0.433, 0.319, 0.46, 0.367, 
0.417, 0.376, 0.647, 0.605, 0.343, 0.549, 0.46, 0.611, 0.345, 
0.624, 0.629, 0.324, 0.467, 0.567, 0.651, 0.663, 0.611, 0.364, 
0.421, 0.445, 0.665, 0.388, 0.369, 0.431, 0.637, 0.709, 0.339, 
0.473, 0.747, 0.5, 0.684, 0.701, 0.612, 0.56, 0.456, 0.64, 0.524, 
0.684, 0.421, 0.658, 0.49, 0.478, 0.31, 0.434, 0.31, 0.371, 0.299, 
0.515, 0.536, 0.567, 0.567, 0.533, 0.534, 0.676, 0.431, 0.452, 
0.64, 0.456, 0.573, 0.358, 0.646, 0.695, 0.403, 0.718, 0.611, 
0.501, 0.304, 0.423, 0.639, 0.573, 0.614, 0.627, 0.564, 0.5, 
0.285, 0.573, 0.557, 0.472, 0.66, 0.358, 0.5, 0.559, 0.668, 0.458, 
0.569, 0.384, 0.284, 0.233, 0.328, 0.661, 0.403, 0.482, 0.672, 
0.286, 0.5, 0.753, 0.508, 0.548, 0.65, 0.483, 0.266, 0.482, 0.461
), shotType = c("LeftFoot", "RightFoot", "RightFoot", "Head", 
"LeftFoot", "RightFoot", "LeftFoot", "Head", "Head", "RightFoot", 
"RightFoot", "LeftFoot", "RightFoot", "RightFoot", "RightFoot", 
"RightFoot", "OtherBodyPart", "Head", "RightFoot", "RightFoot", 
"RightFoot", "RightFoot", "RightFoot", "RightFoot", "RightFoot", 
"RightFoot", "LeftFoot", "RightFoot", "LeftFoot", "Head", "Head", 
"RightFoot", "Head", "RightFoot", "RightFoot", "RightFoot", "LeftFoot", 
"Head", "LeftFoot", "RightFoot", "RightFoot", "RightFoot", "Head", 
"RightFoot", "RightFoot", "Head", "RightFoot", "RightFoot", "RightFoot", 
"LeftFoot", "LeftFoot", "Head", "OtherBodyPart", "RightFoot", 
"LeftFoot", "LeftFoot", "LeftFoot", "RightFoot", "LeftFoot", 
"RightFoot", "LeftFoot", "RightFoot", "RightFoot", "RightFoot", 
"Head", "LeftFoot", "LeftFoot", "Head", "RightFoot", "LeftFoot", 
"LeftFoot", "Head", "LeftFoot", "LeftFoot", "RightFoot", "RightFoot", 
"LeftFoot", "RightFoot", "LeftFoot", "Head", "RightFoot", "RightFoot", 
"LeftFoot", "Head", "LeftFoot", "LeftFoot", "RightFoot", "RightFoot", 
"LeftFoot", "RightFoot", "LeftFoot", "RightFoot", "RightFoot", 
"RightFoot", "LeftFoot", "LeftFoot", "LeftFoot", "RightFoot", 
"RightFoot", "RightFoot", "RightFoot", "LeftFoot", "Head", "LeftFoot", 
"Head", "Head", "Head", "LeftFoot", "LeftFoot", "LeftFoot", "Head", 
"LeftFoot", "RightFoot", "RightFoot", "RightFoot", "LeftFoot", 
"LeftFoot", "RightFoot", "LeftFoot", "Head", "RightFoot", "LeftFoot", 
"Head", "RightFoot", "RightFoot", "LeftFoot", "RightFoot", "RightFoot", 
"RightFoot", "Head", "LeftFoot", "RightFoot", "RightFoot", "RightFoot", 
"RightFoot", "RightFoot", "LeftFoot", "Head", "LeftFoot", "RightFoot", 
"LeftFoot", "LeftFoot", "RightFoot", "RightFoot", "RightFoot", 
"RightFoot", "LeftFoot", "LeftFoot", "RightFoot", "RightFoot", 
"Head", "LeftFoot", "RightFoot", "RightFoot", "RightFoot", "RightFoot", 
"LeftFoot", "RightFoot", "Head", "RightFoot", "LeftFoot", "RightFoot", 
"RightFoot", "RightFoot", "RightFoot", "RightFoot", "RightFoot", 
"RightFoot", "LeftFoot", "LeftFoot", "LeftFoot", "RightFoot", 
"Head", "RightFoot", "RightFoot", "LeftFoot", "RightFoot", "RightFoot", 
"RightFoot", "RightFoot", "LeftFoot", "RightFoot", "Head", "RightFoot", 
"RightFoot", "LeftFoot", "LeftFoot", "RightFoot", "LeftFoot", 
"RightFoot", "RightFoot", "RightFoot", "RightFoot", "LeftFoot", 
"LeftFoot", "LeftFoot", "Head", "RightFoot", "RightFoot", "RightFoot", 
"RightFoot", "Head", "LeftFoot", "RightFoot", "Head", "Head", 
"LeftFoot", "LeftFoot", "Head", "RightFoot", "LeftFoot", "LeftFoot", 
"LeftFoot", "RightFoot", "RightFoot", "Head", "RightFoot", "RightFoot", 
"LeftFoot", "LeftFoot", "RightFoot", "Head", "RightFoot", "LeftFoot", 
"LeftFoot", "RightFoot", "LeftFoot", "RightFoot", "RightFoot", 
"RightFoot", "LeftFoot", "LeftFoot", "LeftFoot", "RightFoot", 
"LeftFoot", "LeftFoot", "LeftFoot", "LeftFoot", "LeftFoot", "Head", 
"LeftFoot", "LeftFoot", "RightFoot", "RightFoot", "LeftFoot", 
"RightFoot", "LeftFoot", "RightFoot", "RightFoot", "LeftFoot", 
"RightFoot", "RightFoot", "Head", "Head", "RightFoot", "Head", 
"LeftFoot", "LeftFoot", "LeftFoot", "RightFoot", "RightFoot", 
"RightFoot", "RightFoot", "Head", "RightFoot", "RightFoot", "RightFoot", 
"Head", "RightFoot", "RightFoot", "LeftFoot", "LeftFoot", "LeftFoot", 
"LeftFoot", "RightFoot", "LeftFoot", "RightFoot", "RightFoot", 
"Head", "LeftFoot", "RightFoot", "RightFoot", "RightFoot", "LeftFoot", 
"RightFoot", "LeftFoot", "RightFoot", "RightFoot", "RightFoot", 
"Head", "Head", "LeftFoot", "LeftFoot", "Head", "LeftFoot", "RightFoot", 
"Head", "LeftFoot", "RightFoot", "RightFoot", "LeftFoot", "LeftFoot", 
"RightFoot", "RightFoot", "Head", "LeftFoot", "RightFoot", "LeftFoot", 
"RightFoot", "RightFoot", "RightFoot", "RightFoot", "RightFoot", 
"RightFoot", "RightFoot", "LeftFoot", "RightFoot", "RightFoot", 
"RightFoot", "Head", "LeftFoot", "LeftFoot", "LeftFoot", "RightFoot", 
"Head", "RightFoot", "Head", "RightFoot", "Head", "RightFoot", 
"LeftFoot", "RightFoot", "RightFoot", "RightFoot", "Head", "Head", 
"RightFoot", "RightFoot", "LeftFoot", "RightFoot", "LeftFoot", 
"LeftFoot", "RightFoot", "RightFoot", "RightFoot", "LeftFoot", 
"LeftFoot", "RightFoot", "LeftFoot", "Head", "RightFoot", "LeftFoot", 
"RightFoot", "Head", "Head", "RightFoot", "RightFoot", "RightFoot", 
"RightFoot", "RightFoot", "LeftFoot", "RightFoot", "RightFoot", 
"RightFoot", "Head", "LeftFoot", "RightFoot", "RightFoot", "Head", 
"Head", "LeftFoot", "RightFoot", "LeftFoot", "RightFoot", "RightFoot", 
"RightFoot", "LeftFoot", "RightFoot", "LeftFoot", "Head", "RightFoot", 
"RightFoot", "LeftFoot", "RightFoot", "LeftFoot", "RightFoot", 
"LeftFoot", "RightFoot", "LeftFoot", "RightFoot", "RightFoot", 
"RightFoot", "RightFoot", "Head", "RightFoot", "Head", "RightFoot", 
"RightFoot", "LeftFoot", "RightFoot", "RightFoot", "RightFoot", 
"LeftFoot", "LeftFoot", "RightFoot", "LeftFoot", "LeftFoot", 
"RightFoot", "RightFoot", "LeftFoot", "Head", "RightFoot", "RightFoot", 
"RightFoot", "RightFoot", "RightFoot", "LeftFoot", "LeftFoot", 
"LeftFoot", "LeftFoot", "LeftFoot", "LeftFoot", "LeftFoot", "LeftFoot", 
"RightFoot", "RightFoot", "RightFoot", "RightFoot", "RightFoot", 
"RightFoot", "LeftFoot", "LeftFoot", "RightFoot", "LeftFoot", 
"RightFoot", "RightFoot", "RightFoot", "LeftFoot", "Head", "RightFoot", 
"Head", "LeftFoot", "LeftFoot", "RightFoot", "Head", "LeftFoot", 
"LeftFoot", "LeftFoot", "RightFoot", "RightFoot", "RightFoot", 
"RightFoot", "RightFoot", "RightFoot", "Head", "LeftFoot", "LeftFoot", 
"RightFoot", "LeftFoot", "LeftFoot", "LeftFoot", "LeftFoot", 
"LeftFoot", "LeftFoot", "RightFoot", "RightFoot", "Head", "LeftFoot", 
"RightFoot", "RightFoot", "RightFoot", "RightFoot", "LeftFoot", 
"RightFoot", "LeftFoot", "RightFoot", "LeftFoot", "RightFoot", 
"Head", "RightFoot", "LeftFoot", "LeftFoot", "RightFoot", "RightFoot", 
"LeftFoot", "RightFoot", "LeftFoot", "LeftFoot", "LeftFoot", 
"Head", "LeftFoot", "LeftFoot", "RightFoot", "Head", "RightFoot", 
"RightFoot", "LeftFoot", "RightFoot", "RightFoot", "RightFoot"
)), row.names = c(997L, 691L, 1052L, 713L, 364L, 1413L, 1542L, 
1900L, 1418L, 1662L, 1072L, 2001L, 24L, 669L, 292L, 1240L, 1361L, 
2308L, 213L, 505L, 1871L, 2573L, 1892L, 207L, 1272L, 1628L, 2099L, 
337L, 1448L, 1273L, 454L, 2668L, 100L, 928L, 642L, 1980L, 672L, 
732L, 2356L, 1709L, 2713L, 104L, 295L, 2213L, 12L, 1730L, 2567L, 
475L, 2471L, 205L, 1884L, 403L, 2082L, 1456L, 1754L, 693L, 1427L, 
450L, 1869L, 707L, 2658L, 1308L, 1868L, 2397L, 2277L, 2564L, 
420L, 1045L, 184L, 2273L, 2113L, 235L, 1836L, 944L, 2008L, 2144L, 
1380L, 549L, 1109L, 498L, 1607L, 1111L, 1657L, 2043L, 587L, 2415L, 
1393L, 2449L, 71L, 555L, 1706L, 802L, 1412L, 1890L, 2138L, 2544L, 
1951L, 141L, 773L, 1256L, 1745L, 1989L, 2346L, 2098L, 2421L, 
2155L, 2112L, 344L, 606L, 795L, 1741L, 2218L, 1119L, 974L, 2360L, 
2641L, 2357L, 153L, 1289L, 2115L, 771L, 751L, 1190L, 147L, 182L, 
1019L, 302L, 2682L, 150L, 172L, 2286L, 2189L, 2539L, 1633L, 134L, 
2663L, 1126L, 2403L, 1330L, 370L, 2568L, 2464L, 272L, 2037L, 
698L, 2394L, 232L, 2071L, 442L, 428L, 937L, 2627L, 1422L, 985L, 
1860L, 1478L, 136L, 269L, 48L, 374L, 853L, 807L, 1443L, 1209L, 
1031L, 2255L, 1761L, 966L, 1493L, 2555L, 1430L, 2728L, 1471L, 
167L, 2411L, 2271L, 1705L, 858L, 2100L, 81L, 668L, 948L, 1546L, 
654L, 2535L, 1985L, 953L, 2079L, 1797L, 2508L, 421L, 652L, 1760L, 
2030L, 2530L, 525L, 1700L, 896L, 1847L, 1554L, 2094L, 2345L, 
1629L, 950L, 1202L, 1778L, 1197L, 1866L, 1804L, 2450L, 2662L, 
1874L, 2592L, 2118L, 2426L, 933L, 1930L, 596L, 1118L, 1362L, 
1454L, 1501L, 2561L, 2052L, 1408L, 2183L, 2106L, 1846L, 338L, 
1283L, 1216L, 777L, 2396L, 2461L, 2350L, 2036L, 1453L, 260L, 
2691L, 202L, 877L, 378L, 511L, 967L, 494L, 528L, 1849L, 2107L, 
1133L, 2153L, 708L, 859L, 2085L, 330L, 783L, 2281L, 59L, 2709L, 
746L, 866L, 2369L, 1120L, 1041L, 1434L, 2306L, 1442L, 755L, 2613L, 
792L, 1743L, 598L, 2013L, 1410L, 971L, 481L, 2351L, 2027L, 1873L, 
1328L, 2048L, 935L, 1219L, 1934L, 198L, 1532L, 629L, 319L, 1947L, 
661L, 1496L, 999L, 864L, 51L, 2174L, 2011L, 2574L, 1507L, 2366L, 
2515L, 373L, 703L, 479L, 2451L, 473L, 2448L, 765L, 1725L, 814L, 
46L, 1092L, 2503L, 124L, 463L, 1178L, 1525L, 1945L, 610L, 701L, 
574L, 34L, 1246L, 2047L, 2408L, 638L, 1487L, 2416L, 784L, 2127L, 
1131L, 2297L, 1348L, 1902L, 357L, 1245L, 2436L, 1626L, 1230L, 
1349L, 1659L, 1363L, 590L, 1187L, 1803L, 177L, 1581L, 246L, 1166L, 
1591L, 532L, 1450L, 2163L, 588L, 1966L, 945L, 2509L, 18L, 1834L, 
1577L, 2051L, 2181L, 1012L, 1499L, 1864L, 224L, 33L, 893L, 1829L, 
2221L, 1914L, 2095L, 2303L, 1823L, 2717L, 1617L, 2130L, 286L, 
2628L, 2684L, 440L, 1121L, 2265L, 868L, 1020L, 2191L, 2393L, 
523L, 522L, 1472L, 2639L, 1500L, 1795L, 744L, 1560L, 384L, 1769L, 
2407L, 1248L, 2643L, 1300L, 2654L, 2531L, 2497L, 80L, 487L, 161L, 
259L, 2248L, 1391L, 158L, 212L, 947L, 123L, 684L, 1887L, 759L, 
2438L, 1002L, 356L, 145L, 1236L, 743L, 894L, 552L, 2458L, 1415L, 
132L, 2237L, 1170L, 1513L, 99L, 1372L, 2223L, 2553L, 1490L, 1086L, 
2065L, 2246L, 640L, 1150L, 878L, 1912L, 250L, 924L, 1537L, 382L, 
1268L, 1370L, 1903L, 1449L, 2429L, 695L, 149L, 895L, 885L, 2309L, 
2075L, 2337L, 2637L, 1586L, 930L, 641L, 1299L, 393L, 1083L, 567L, 
242L, 2349L, 1800L, 984L, 307L, 1350L, 2249L, 780L, 2205L, 1876L, 
1196L, 2311L, 1508L, 2540L, 1395L, 2669L, 430L, 2062L, 2417L, 
1547L, 712L, 30L, 1341L, 367L, 1204L, 1921L, 1529L, 256L, 2190L, 
1241L, 1224L, 2251L, 305L, 521L, 1323L), class = "data.frame")
Quinten

  • Just a quick comment: logistic regression assumes that the log-odds of scoring a goal vary linearly with the x-y coordinates on the field. I don't think this is plausible a priori, and furthermore it won't have a well-defined maximum (the predicted log-odds of a goal will be described by a tilted plane). I think the model would need at least quadratic and interaction terms as well; alternatively, something more sophisticated like a generalized additive model (GAM). A sketch of both options follows these comments. – matteo Nov 06 '23 at 15:52
  • Why not say that the best position is the one whose 1-meter circle contains the greatest number of goal shots? That may not be Bayesian, but it's as simple as you could want and reasonable enough. – Matt F. Nov 07 '23 at 01:21
  • The best place may be roughly X = 1, Y = 0.5, in front of the middle of the net; and if so, it's probably not worth the time to do elaborate modeling. – Matt F. Nov 07 '23 at 07:29
  • A classic Bayesian example of coordinate inference is "Gull's Lighthouse Problem". PDFs (including Gull's) and videos discussing Bayesian coordinate inference are available on the internet. – krkeane Nov 09 '23 at 13:26
  • Bayesian methods are about expressing and manipulating uncertainty of knowledge. For continuous parameters, such as those in your problem, it's not Bayesian in spirit to focus on the single most likely *point*. See the discussion at https://stats.stackexchange.com/a/148475/43149 – krkeane Nov 09 '23 at 13:33
  • Have you thought of re-parameterizing it in terms of angle and distance? – Björn Nov 12 '23 at 14:06
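
Following up on matteo's comment, here is a rough sketch of the two more flexible surfaces suggested there. Both are assumptions about functional form, not part of the original question, and `fit_quad` / `fit_smooth` are hypothetical names:

# Option 1: add quadratic and interaction terms so the log-odds surface
# can curve and have an interior maximum instead of a tilted plane
fit_quad <- brm(result2 ~ X + Y + I(X^2) + I(Y^2) + X:Y + (1|shotType),
                data = df, family = bernoulli(link = "logit"))

# Option 2: a GAM-style tensor-product smooth over the pitch
# (brms accepts mgcv-style smooth terms such as t2())
fit_smooth <- brm(result2 ~ t2(X, Y) + (1|shotType),
                  data = df, family = bernoulli(link = "logit"))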

1 Answer


"the best position"

You could say that the best position is the position where we have the largest probability of a goal. That is

$$p(goal|x,y)$$

With a Bayesian approach you would compute

$$p(goal|x,y,data)$$

and you could have some parametric model that describes $p(goal|x,y)$ based on a few parameters $\beta_i$, for which we have prior distributions. Then, based on the data, we can update the distributions of those parameters and compute the posterior.

$$f(\beta|data) \propto f(data|\beta) f(\beta)$$

With the posterior $f(\beta|data)$ in hand, you can also describe $p(goal|x,y,data)$ and figure out which position is best.
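
In the brms setup from the question, this posterior quantity can be approximated directly from the fitted model. A minimal sketch, assuming the `fit` object above (the grid resolution and the fixed shot type are illustrative choices):

# posterior predicted P(goal) over a grid of positions, one row per draw
grid <- expand.grid(X = seq(0.65, 1, by = 0.01),
                    Y = seq(0.1, 0.9, by = 0.01),
                    shotType = "RightFoot")
epred <- posterior_epred(fit, newdata = grid)  # draws x grid points
grid$p_goal <- colMeans(epred)                 # posterior mean P(goal) per cell
grid[which.max(grid$p_goal), ]                 # most promising position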


The computations are not easy, and the choice of the model with coefficients $\beta$ and of their priors is subjective. I would instead go with Matt F.'s comment and just compute, at every point $(x,y)$, the frequency of goals among all attempts within a certain range.
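
A minimal sketch of that local-frequency idea, assuming the question's 0-1 coordinates (the radius `r = 0.05` stands in for the "1-meter circle", since the units are not meters):

# fraction of attempts within radius r of (x, y) that were goals
local_goal_rate <- function(x, y, df, r = 0.05) {
  near <- (df$X - x)^2 + (df$Y - y)^2 <= r^2
  if (!any(near)) return(NA_real_)
  mean(df$result2[near])
}
grid <- expand.grid(X = seq(0.65, 1, by = 0.01), Y = seq(0.1, 0.9, by = 0.01))
grid$rate <- mapply(local_goal_rate, grid$X, grid$Y, MoreArgs = list(df = df))
grid[which.max(grid$rate), ]  # neighborhood with the highest observed goal rate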

  • Thank you for your clear answer! I was really looking for a way to describe this with a Bayesian approach. I will have a further look at the coefficients and Matt F.'s comment, but this points me in the right direction. – Quinten Nov 13 '23 at 12:14