
I have a set of rasters in DTED format which I load into a PostGIS database using the raster2pgsql command-line tool.
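
A minimal sketch of such a load command (the file pattern, tile size, table and database names are assumptions, not taken from the question):

# assumed DTED level 1 files, 100x100 tiles, target table public.rasters
# -I builds a GiST index, -C applies raster constraints, -M runs VACUUM ANALYZE
raster2pgsql -s 4326 -t 100x100 -I -C -M /path/to/dted/*.dt1 public.rasters | psql -d yourdb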

Each raster is simply stored in its own row, described by a rid and a value in raster format.

Now, I want to create a database function that takes the longitude and latitude of a point and returns the value of the pixel corresponding to that point.

The problem I have is that it takes quite a lot of time (3-4 seconds), because the database runs on an Odroid board.

I know that the dataset I process is quite big (the rasters cover the whole of the UK), but as I am not very familiar with PostgreSQL and PostGIS, I suspect it can be done faster.

This is what I've done so far:

SELECT ST_Value(rast, ST_GeomFromText(CONCAT('POINT(', $1, ' ', $2, ')'), 4326))
FROM (
    SELECT * FROM rasters
    WHERE rast && ST_GeomFromText(CONCAT('POINT(', $1, ' ', $2, ')'), 4326)
) x;

$1 and $2 are longitude and latitude, respectively.
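
A minimal sketch of such a function, using ST_SetSRID(ST_MakePoint(...)) instead of string concatenation (the function name get_pixel_value is an assumption, and the named parameters replace $1/$2):

-- get_pixel_value is a made-up name; lon/lat stand in for the $1/$2 placeholders above
CREATE OR REPLACE FUNCTION get_pixel_value(lon double precision, lat double precision)
RETURNS double precision AS $$
    SELECT ST_Value(rast, ST_SetSRID(ST_MakePoint(lon, lat), 4326))
    FROM rasters
    -- ST_Intersects lets the planner use the GiST index on the tile footprints
    WHERE ST_Intersects(rast, ST_SetSRID(ST_MakePoint(lon, lat), 4326))
    LIMIT 1;  -- assumes one matching tile is enough; tiles may touch at their edges
$$ LANGUAGE sql STABLE;

It could then be called as, for example, SELECT get_pixel_value(-0.1278, 51.5074); (placeholder coordinates).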

  • Did you cut the raster into tiles when you imported it to PostGIS (parameter -t width x height)? – mutolisp Oct 16 '14 at 15:02
  • Yes, I did. It improved performance a little. I should probably also add that the database is on an Odroid board, so it works significantly slower than on a desktop PC. I was just wondering if I can somehow change the approach to raster processing so it does fewer unnecessary computations. For example, initially I was calling the ST_Value function on all of the rasters and then looking for the row which actually contained a value. It was the simplest approach but worked much slower. – zedsdead Oct 16 '14 at 15:51
  • Using ST_SetSRID(ST_MakePoint($1, $2), 4326) in lieu of string concats might save you some time if there are enough iterations. – Scro Oct 16 '14 at 15:51
  • It does not seem to help a lot, but thank you. I was thinking about adding another column when the table is created that would simply contain a bounding box for a single raster. Maybe this way the right raster could be found faster... Also, I was wondering if pre-calculating the pixel position in the raster from the corner coordinates and the pixel step in lon/lat might help... If anyone has any thoughts on that I will be grateful for sharing them :) – zedsdead Oct 16 '14 at 16:02
  • Maybe you could try to use EXPLAIN to check where the bottleneck is. – mutolisp Oct 16 '14 at 16:54
  • What are you using for storage? If you're using a slow external HDD I would be suspicious that's your bottleneck. I would suggest installing NewRelic's monitoring or something similar to help determine what's causing the slowdown. I've seen similar slowdowns with PostgreSQL on other machines with moderate CPU and RAM, but using a slow 5200 RPM drive. – RustProof Labs Nov 06 '15 at 22:40
  • Because you say you are new to PostgreSQL and PostGIS, I suggest you try the guides in this answer to increase your database performance: http://gis.stackexchange.com/a/194576/72697 – kttii Dec 14 '16 at 17:20
  • What is your tile size? Have you added a spatial index? Constraints? All of these can help speed up queries https://gis.stackexchange.com/a/239246/37341 – nronnei May 04 '17 at 21:35
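
A sketch of the index and constraints suggested in the comments above, assuming the tiles are in a table called rasters with a raster column rast (unnecessary if raster2pgsql was run with -I and -C):

-- GiST index on the tile footprints (raster2pgsql -I does the same thing)
CREATE INDEX rasters_rast_convexhull_idx ON rasters USING gist (ST_ConvexHull(rast));
-- register SRID, scale, alignment, etc. as constraints (raster2pgsql -C does this)
SELECT AddRasterConstraints('rasters'::name, 'rast'::name);
-- refresh planner statistics after bulk loading
VACUUM ANALYZE rasters;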

1 Answer


You could try this:

-- calculate and store the point geometry just once
WITH point_geom AS
(
    SELECT ST_SetSRID(ST_GeomFromText('POINT(' || $1 || ' ' || $2 || ')'), 4326) AS geom
)
-- your subquery is probably unnecessary; the alias "x" isn't used
SELECT ST_Value(rast, point_geom.geom)
FROM rasters, point_geom
WHERE rast && point_geom.geom;
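
To see whether the spatial index is actually used and where the time goes, as suggested in the comments, the same lookup can be run under EXPLAIN (the coordinates are placeholders, and ST_Intersects is assumed in place of &&):

EXPLAIN (ANALYZE, BUFFERS)
SELECT ST_Value(rast, ST_SetSRID(ST_MakePoint(-0.1278, 51.5074), 4326))
FROM rasters
WHERE ST_Intersects(rast, ST_SetSRID(ST_MakePoint(-0.1278, 51.5074), 4326));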

But the real problem is raster querying; tiling the dataset should speed up queries. You can try to use PostGIS WKT Raster and follow this tutorial.

I hope it will be useful,

Benno