I'm trying to develop a web map with a clickable choropleth layer of streets in a large city. There are approximately 40k polylines in this layer, producing a 21 MB GeoJSON file. That is too much data to send to the client at once, and an unnecessary amount to render at the city scale. Our data is in PostGIS and we'd like to use Leaflet for the front-end. I have investigated the more common Python web frameworks, but I haven't found documentation, tutorials, or Q&A about solving something similar with GeoDjango or GeoAlchemy2.
The answer to Adding/Removing Leaflet GeoJSON layers shows how to load different layers at different zoom levels (and cache them), but doesn't cover filtering the data by bounding box on the server. This answer offers the rather unhelpful solution of "just load all the data client-side first". The answer to How to add a bounding box filter to this leaflet WFS request? shows how to load data within the bounding box on every map move, but, unlike the first answer, it appears to discard and redraw the layer every time the map is moved.
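To make the question concrete, here is the kind of server-side bounding-box filter I have in mind, as a minimal, untested GeoDjango sketch; the Street model, its geom and value fields, and the bbox query parameter are made up for illustration:

```python
# Minimal sketch of a bbox-filtered GeoJSON endpoint (untested).
# The Street model, field names, and the "bbox" parameter are assumptions.
from django.contrib.gis.db import models
from django.contrib.gis.geos import Polygon
from django.core.serializers import serialize
from django.http import HttpResponse


class Street(models.Model):
    name = models.CharField(max_length=100)
    value = models.FloatField()                  # choropleth attribute
    geom = models.LineStringField(srid=4326)


def streets_in_bbox(request):
    # Expect ?bbox=xmin,ymin,xmax,ymax, e.g. from Leaflet's
    # map.getBounds().toBBoxString() on each moveend event.
    xmin, ymin, xmax, ymax = (float(v) for v in request.GET["bbox"].split(","))
    bbox = Polygon.from_bbox((xmin, ymin, xmax, ymax))

    # Return only the streets that intersect the current viewport.
    qs = Street.objects.filter(geom__intersects=bbox)
    geojson = serialize(
        "geojson", qs, geometry_field="geom", fields=("name", "value")
    )
    return HttpResponse(geojson, content_type="application/json")
```

Something like this would cut the transfer size per request, but it still redraws the whole layer on every map move, which is exactly the behaviour I'd like to avoid.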
geojson-vt by Mourner solves the client-side rendering problem (106 MB of zip codes, wow!) but doesn't seem to solve the data-transfer issue. The post does hint that this has traditionally been done server-side:
"The best way to optimize the data for all zoom levels and screens is to cut it into vector tiles. Traditionally, this is done on the server, using tools like Mapnik and PostGIS."
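If server-side tiling is indeed the way to go, I can picture roughly how it would look in Python. Below is a rough, untested sketch using Flask, psycopg2, and PostGIS's ST_AsMVT/ST_TileEnvelope (which require PostGIS 3.0+); the streets table, its columns, the SRID (4326), and the connection string are all assumptions:

```python
# Rough sketch of a tile endpoint that lets PostGIS do the slicing (untested).
# Table name, columns, SRID, and connection string are assumptions.
import psycopg2
from flask import Flask, Response

app = Flask(__name__)
conn = psycopg2.connect("dbname=citymap")   # one global connection, for brevity only
conn.autocommit = True

TILE_SQL = """
WITH bounds AS (
    SELECT ST_TileEnvelope(%(z)s, %(x)s, %(y)s) AS geom
),
mvtgeom AS (
    SELECT ST_AsMVTGeom(ST_Transform(s.geom, 3857), bounds.geom) AS geom,
           s.name, s.value
    FROM streets s, bounds
    WHERE s.geom && ST_Transform(bounds.geom, 4326)
)
SELECT ST_AsMVT(mvtgeom.*, 'streets') FROM mvtgeom;
"""


@app.route("/tiles/<int:z>/<int:x>/<int:y>.mvt")
def tile(z, x, y):
    # Each request returns only the streets falling in one tile,
    # already clipped and encoded as a Mapbox Vector Tile.
    with conn.cursor() as cur:
        cur.execute(TILE_SQL, {"z": z, "x": x, "y": y})
        row = cur.fetchone()
    mvt = row[0] if row and row[0] is not None else b""
    return Response(bytes(mvt), mimetype="application/vnd.mapbox-vector-tile")
```

Presumably a plugin such as Leaflet.VectorGrid would then consume these tiles on the client, though I don't know whether per-feature click handling works as smoothly there as with a plain GeoJSON layer.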
What Python web framework can load and serve slices of a large number of polylines, filtered by zoom level and bounding box, for display in Leaflet?