Subdivision of geographic data is a panacea to problems you didn’t know you had.
Maybe you deal with vector data, so you pre-tile your vector data to ship to the browser to render – you’re makin’ smaller data. Maybe you use cutting-edge PostGIS, so you apply ST_Subdivide to keep your data smaller than the database page size, like Paul Ramsey describes here. Smaller’s better. Or perhaps you are forever reprojecting your data in strange ways, across problematic boundaries, or need to buffer in an optimal coordinate system to avoid distortion. Regardless of the reason, smaller is better.
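For the ST_Subdivide case, a minimal sketch (database, table, and vertex threshold here are all hypothetical; Paul’s post has the real reasoning):
psql -d gis -c "CREATE TABLE parcels_subdivided AS SELECT id, ST_Subdivide(geom, 128) AS geom FROM parcels;"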
Maybe you aren’t doing vector work, but raster. What’s the equivalent tiling process? I wrote about this for GeoServer almost 5 (eep!) years ago (with a slightly more recent follow-up), and much of what I wrote still applies:
- Pre-tile your raw data into modest chunks
- Use GeoTIFF so you can use its internal data structures to get even smaller tiles inside your tiles
- Create pyramids / pre-summarized data as tiles too.
gdal_retile.py -v -r bilinear -levels 4 -ps 6144 6144 -co "TILED=YES" -co "BLOCKXSIZE=256" -co "BLOCKYSIZE=256" -s_srs EPSG:3734 -targetDir aerial_2011 --optfile list.txt
Let’s break this down a little:
First we choose our resampling method for our pyramids (bilinear). Lanczos would also be fine here.
-r bilinear
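If you wanted Lanczos instead, the only change would be the flag itself:
-r lanczos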
Next we set the number of resampling levels. This will depend on the size of the dataset.
-levels 4
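As a rough back-of-the-envelope (assuming, say, a mosaic around 80,000 px on its longest side), you want enough levels that the coarsest pyramid shrinks the whole thing down to roughly one output tile:
python3 -c "import math; print(math.ceil(math.log2(80000 / 6144)))"   # 4 levels for this hypothetical mosaic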
Next we specify the pixel and line size of the output GeoTIFFs. This can be pretty large, but we probably want to avoid a size that forces the use of BigTIFF (i.e., files over 4 GB).
-ps 6144 6144
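As a sanity check, assuming plain 3-band, 8-bit aerials, one 6144×6144 output tile is only about 108 MB uncompressed, comfortably under the 4 GB BigTIFF line:
python3 -c "print(6144 * 6144 * 3 / 1024**2)"   # 108.0 MB per uncompressed tile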
Now we get into the GeoTIFF data structure: we internally tile the TIFFs and make those tiles 256×256 pixels. We could also choose 512. We’re just aiming to have our internal tile size near the size that we are going to send to the browser.
-co "TILED=YES" -co "BLOCKXSIZE=256" -co "BLOCKYSIZE=256"
Finally, we specify our source coordinate system (this is State Plane Ohio), our output directory (which needs to be created ahead of time), and our input file list.
-s_srs EPSG:3734 -targetDir aerial_2011 --optfile list.txt
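The only prep those last two flags need is the directory itself and a plain text file listing the inputs, one per line (raw_tifs/ is a hypothetical source folder):
mkdir -p aerial_2011
ls raw_tifs/*.tif > list.txt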
That’s it. Now you have a highly optimized raster dataset that can:
- get the level of detail necessary for a given request, and
- extract only the data necessary for a given request.
Pretty much any geospatial solution that uses GDAL can leverage this work for very fast rendering of raster data into a tile cache. If space is an issue, apply compression options that match your use case.
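For RGB aerials, JPEG-in-TIFF is one reasonable option (lossless folks would swap in DEFLATE instead); it just means tacking extra creation options onto the same command:
gdal_retile.py -v -r bilinear -levels 4 -ps 6144 6144 -co "TILED=YES" -co "BLOCKXSIZE=256" -co "BLOCKYSIZE=256" -co "COMPRESS=JPEG" -co "PHOTOMETRIC=YCBCR" -co "JPEG_QUALITY=85" -s_srs EPSG:3734 -targetDir aerial_2011 --optfile list.txt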