I am doing a two-step process:

- Polygonize a PNG raster via `gdal_polygonize.py`
- Add a custom field named `id`
- Upload the resulting vector to PostGIS via `ogr2ogr`

I am currently doing it in two steps, like:
```shell
id=e07c16a1-2de7-4221-8b18-754c5e8f07bb
gdal_polygonize.py "${id}.png" -f "ESRI Shapefile" "${id}.shp"
ogr2ogr -dialect "SQLITE" -sql "SELECT *, '${id}' AS id FROM \"${id}\"" \
    -append -f "PostgreSQL" 'PG:host={host} user={user} dbname={db}' \
    "${id}.shp" -nln {schema}."{table}" \
    --config PG_USE_COPY=YES -progress -lco PRECISION=NO
```

(Note the shapefile path must be double-quoted, `"${id}.shp"`; with single quotes the `${id}` variable is not expanded.)
This means writing a shapefile to disk and reading it back, after which I also need to remove the shapefile.

I would like to simply pipe the output of `gdal_polygonize.py` into `ogr2ogr`, so I avoid the write/read round trip, which adds significant latency to the whole operation. Something like:
```shell
gdal_polygonize.py "${id}.png" -f "ESRI Shapefile" /vsistdout/ | ogr2ogr ...
```
Is this even possible? If so, how would I read that file in the second command?
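One way such a pipe could work (an untested sketch): a shapefile is a multi-file format and cannot be streamed, but a single-stream format such as GeoJSON can go through `/vsistdout/` and be read back from `/vsistdin/`. The `-q` flag matters because `gdal_polygonize.py` otherwise prints progress to stdout, which would corrupt the stream:

```shell
id=e07c16a1-2de7-4221-8b18-754c5e8f07bb

# Stream GeoJSON through the pipe instead of writing a shapefile.
# -q suppresses gdal_polygonize.py's progress output on stdout.
gdal_polygonize.py -q "${id}.png" -f "GeoJSON" /vsistdout/ | \
  ogr2ogr -append -f "PostgreSQL" 'PG:host={host} user={user} dbname={db}' \
    /vsistdin/ -nln {schema}."{table}" \
    --config PG_USE_COPY=YES -lco PRECISION=NO
```

The custom `id` field would then have to be added afterwards in the database (or via `-sql` against the stream, though the layer name of a GeoJSON stream depends on the GDAL version, so that part is not shown here).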
You can use `-f` also in `gdal_polygonize.py`: write the vectors directly into PostGIS, then run `ALTER TABLE` + `UPDATE` in the database (or with `ogrinfo` and `-sql`) as a post-processing step. For a more sophisticated solution you should probably use Python.
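A sketch of that approach (the `{host}`/`{schema}`/`{table}` placeholders are the same illustrative ones as in the question, and the exact layer-name handling may need adjusting for your PostGIS setup):

```shell
id=e07c16a1-2de7-4221-8b18-754c5e8f07bb

# Polygonize straight into PostGIS: no intermediate file on disk.
gdal_polygonize.py -q "${id}.png" \
  -f "PostgreSQL" 'PG:host={host} user={user} dbname={db}' \
  "{schema}.{table}"

# Post-process: add the custom id column and populate it.
ogrinfo 'PG:host={host} user={user} dbname={db}' \
  -sql "ALTER TABLE {schema}.{table} ADD COLUMN id text"
ogrinfo 'PG:host={host} user={user} dbname={db}' \
  -sql "UPDATE {schema}.{table} SET id = '${id}'"
```

With `ogrinfo -sql` against a PostgreSQL datasource, the statement is passed through to the database, so plain DDL/DML like the above works.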