
I'm using the free TIGER geocoder to get coordinates, but performance is really slow for the bigger states. For example:

A query for a small state can take 300 ms, while a query for a big state can take about a minute. My guess is that the machine runs out of memory for the big states.
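For reference, this kind of timing can be measured from psql with \timing; a typical geocoder call (the address string here is only an illustrative example) looks like:

```sql
-- Enable client-side timing in psql:
-- \timing on

-- Geocode a single address, asking for at most 1 candidate.
SELECT g.rating,
       ST_X(g.geomout) AS lon,
       ST_Y(g.geomout) AS lat,
       pprint_addy(g.addy) AS normalized
FROM geocode('1600 Pennsylvania Ave NW, Washington, DC 20500', 1) AS g;
```

Limiting the result count (the second argument to geocode()) can itself make a noticeable difference on large states.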

  • What are the specs of your server? RAM is generally the most important. I suspect, though, that you can get better performance by tuning your setup (your postgresql.conf, and you may not have all the geocoder's indexes in place). – Regina Obe Jul 12 '17 at 17:32
  • @LR1234567 It has 8 GB of RAM. How can I check the indexes? – Luis Ramon Ramirez Rodriguez Jul 13 '17 at 01:19
  • Another factor is I/O (read/write) speed; using an SSD will improve it. See http://dracodoc.github.io/2015/11/17/Geocoding/ for large amounts of geocoding with the TIGER geocoder and PostGIS; you can get your 300 ms down below 100 ms. – Mapperz Jul 13 '17 at 04:02
  • There are some PostgreSQL tweaks that can be applied, like shared_buffers; see https://gis.stackexchange.com/questions/150825/how-fast-should-i-expect-postgis-to-geocode-well-formatted-addresses – Mapperz Jul 13 '17 at 04:04
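The shared_buffers suggestion above refers to settings in postgresql.conf. As a rough sketch for an 8 GB server (these values are illustrative starting points, not tuned for any particular workload):

```
# postgresql.conf -- illustrative starting values for an 8 GB server
shared_buffers = 2GB          # ~25% of RAM is a common starting point
work_mem = 64MB               # per-sort/hash-join memory; geocoder queries join many tables
maintenance_work_mem = 512MB  # speeds up VACUUM and CREATE INDEX
effective_cache_size = 6GB    # planner hint: shared buffers + OS file cache
```

Changing shared_buffers requires a server restart; the other settings take effect on reload.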

1 Answer


With 8 GB you shouldn't be getting such poor performance.

If you are using the PostGIS TIGER geocoder, make sure you ran these steps after your data load, as documented in step 9 of https://postgis.net/docs/manual-2.3/postgis_installation.html#install_tiger_geocoder_extension:

SELECT install_missing_indexes();
VACUUM ANALYZE VERBOSE tiger.addr;
VACUUM ANALYZE VERBOSE tiger.edges;
VACUUM ANALYZE VERBOSE tiger.faces;
VACUUM ANALYZE VERBOSE tiger.featnames;
VACUUM ANALYZE VERBOSE tiger.place;
VACUUM ANALYZE VERBOSE tiger.cousub;
VACUUM ANALYZE VERBOSE tiger.county;
VACUUM ANALYZE VERBOSE tiger.state;
VACUUM ANALYZE VERBOSE tiger.zip_lookup_base;
VACUUM ANALYZE VERBOSE tiger.zip_state;
VACUUM ANALYZE VERBOSE tiger.zip_state_loc;
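To answer the comment about how to check the indexes: install_missing_indexes() creates any indexes the geocoder expects that are missing, and you can also inspect a table's indexes directly via the pg_indexes system view. A quick check (using tiger.featnames as an example table) might look like:

```sql
-- List the indexes that currently exist on one of the tiger tables:
SELECT indexname, indexdef
FROM pg_indexes
WHERE schemaname = 'tiger'
  AND tablename = 'featnames';

-- Returns true after creating any missing geocoder indexes
-- (a no-op if everything is already in place):
SELECT install_missing_indexes();
```

If install_missing_indexes() had real work to do, rerun the VACUUM ANALYZE statements above afterwards so the planner has fresh statistics.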
Regina Obe