I want to interpolate a gridded dataset using Python. For example, a 10x10 array should become a 20x20 array, with the added values filled in by interpolation (linear, cubic). The problem is that the data does not completely fill a regular grid, so the values outside the target area should be ignored during interpolation. Let's say all the green data around the borders should be ignored:
If they are not ignored, the interpolation gets distorted along the border margin. This potential solution does not do what I am looking for. Is there really no easy way to ignore NaN values or masked data?
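For illustration, here is a minimal sketch of the kind of thing I imagine, assuming scipy.interpolate.griddata is an acceptable tool (the grid and NaN pattern are just made up for the example):

```python
import numpy as np
from scipy.interpolate import griddata

# 10x10 example grid; NaN marks the cells outside the target area
data = np.random.rand(10, 10)
data[0, :] = np.nan    # invalid border row (the "green" data)
data[:, -1] = np.nan   # invalid border column

# coordinates of the coarse grid
y, x = np.mgrid[0:10, 0:10]

# keep only the valid points, so the NaNs never enter the interpolation
valid = ~np.isnan(data)
points = np.column_stack((x[valid], y[valid]))
values = data[valid]

# 20x20 target grid over the same extent
yi, xi = np.mgrid[0:9:20j, 0:9:20j]

# target points outside the convex hull of the valid data come back as NaN
fine = griddata(points, values, (xi, yi), method='linear')
```

Points outside the convex hull of the valid data come back as NaN, which is roughly the border behaviour I want, but I am not sure this is the right approach for my case.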
I have been struggling with this issue for hours now and tried to solve it with:
But I can't get it to work.
The only hack I came up with is to set all values outside the target area to a very high number like 999999, and then mask every interpolated value that is greater than the highest value inside the target area. Obviously this only works for linear interpolation (cubic interpolation can distort the valid data without crossing that threshold), so I don't think it's an appropriate solution in most cases.
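To make the hack concrete, this is roughly what I am doing (a sketch; the sentinel value 999999 and the threshold choice are arbitrary):

```python
import numpy as np
from scipy.interpolate import griddata

SENTINEL = 999999.0

data = np.random.rand(10, 10)
data[0, :] = SENTINEL   # values outside the target area
data[:, -1] = SENTINEL

# interpolate the full grid, sentinels included
y, x = np.mgrid[0:10, 0:10]
points = np.column_stack((x.ravel(), y.ravel()))
values = data.ravel()

yi, xi = np.mgrid[0:9:20j, 0:9:20j]
fine = griddata(points, values, (xi, yi), method='linear')

# mask everything that was pulled up by the sentinel
vmax = values[values < SENTINEL].max()
fine = np.ma.masked_greater(fine, vmax)
```

With method='linear', any interpolated value that mixes in the sentinel is pulled far above the threshold and gets masked; with method='cubic', the sentinel causes overshoot that distorts nearby valid values without necessarily exceeding the threshold, so the mask no longer catches it.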