I apologize in advance if this question is poorly posed: I'm an astronomer, not a statistician. My question is specifically aimed at helping me figure out whether Gaussian-process regression is an appropriate technique for my problem.
Using a telescope and a fiber-fed spectrograph, my project has taken the optical spectrum of a galaxy at many locations. The sampling pattern for a single pointing is in the first image, and is repeated three times in total, with different spatial offsets, in order to fill in the gaps (second image). Ideally, I would want to construct estimates of certain quantities over a grid covering the galaxy.
My naive method would be to analyze each fiber's spectrum separately, so that I had $3 N_{fibers}$ point estimates of the quantities of interest, and then construct a Gaussian process to estimate those quantities everywhere. Alternatively, I could construct a Gaussian process for the spectra themselves, evaluate the GP on my grid of choice, and then derive the quantities I'm interested in from the gridded spectra. However, I'm not sure either is even a valid approach, since my observations are not point-like: each fiber has a finite footprint on the sky, and the dithered footprints overlap one another.
Unlike, for example, soil scientists, who might sample dirt at a single well-localized spot and then move 50 meters away and repeat, my observations overlap spatially, and each fiber integrates all of the light that falls within its aperture. It's not obvious to me that I would be allowed to neglect any spatial variation that may exist within a given measurement. In other words, is a Gaussian process even valid when individual sampling footprints are not small? Can I build in an additional spatial term to account for the light "mixing" within a single fiber?
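To make the aperture question concrete: one standard change-of-support device from geostatistics (block kriging) treats a fiber measurement as the average of the latent field over the aperture, so the covariance between two fiber measurements becomes the point kernel averaged over all pairs of points in the two footprints. The sketch below estimates that averaged covariance by Monte Carlo; the circular aperture, fiber radius, and kernel length scale are illustrative assumptions, not my instrument's values.

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf(a, b, ell=3.0):
    """Squared-exponential point kernel between two sets of 2-D points."""
    d2 = np.sum((a[:, None, :] - b[None, :, :])**2, axis=-1)
    return np.exp(-0.5 * d2 / ell**2)

def disk_samples(center, radius, n=200):
    """Uniform Monte Carlo samples inside a circular fiber aperture."""
    r = radius * np.sqrt(rng.random(n))
    th = 2 * np.pi * rng.random(n)
    return np.asarray(center) + np.column_stack([r * np.cos(th),
                                                 r * np.sin(th)])

def aperture_cov(c1, c2, radius=1.0, ell=3.0, n=200):
    """Covariance of two fiber-averaged observations: the point kernel
    averaged over all pairs of points drawn from the two apertures."""
    s1 = disk_samples(c1, radius, n)
    s2 = disk_samples(c2, radius, n)
    return rbf(s1, s2, ell).mean()

# Averaging over the aperture smooths the kernel: the "variance" of a
# single fiber is below the point-kernel value k(c, c) = 1.
c_fiber = aperture_cov([0.0, 0.0], [0.0, 0.0])
c_far = aperture_cov([0.0, 0.0], [8.0, 0.0])
```

Building the full data covariance from `aperture_cov` instead of the point kernel is exactly the "additional spatial term" asked about above: it accounts for the light mixing within each fiber and for the overlap between dithered fibers.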
Addendum: Traditionally, spectra are simply interpolated, resampled onto a grid, and then analyzed, which also strikes me as extremely wrong--but if I'm going to rain on colleagues' parades, I want to at least present an alternative method.

