[pgpointcloud] pdal hexbin
Howard Butler
howard at hobu.co
Mon Aug 3 13:48:03 PDT 2015
> On Aug 3, 2015, at 3:23 PM, <Dean.Hand.ctr at nga.mil> <Dean.Hand.ctr at nga.mil> wrote:
>
> Classification: UNCLASSIFIED//FOUO
> ======================================================
>
> I'm using writers.pgpointcloud in my hexbin pipeline with --pipeline-serialization.
> Is there a method in the pipeline to create a geom column from the boundary metadata in the hexbin output file? The idea is that I can create a view from this column using the polygons.
> Along with that, I'd like to create columns from other metadata fields to use as feature data in that new view.
Dean,
There isn't a way to do this *inline* with pipeline operations; you have to do it manually at this time. You probably need a script that does some combination of the following steps (a rough sketch follows the list):
* Use --pipeline-serialization as part of execution of the pipeline that inserts the data
* Pick out the pcid from the options on the `writers.pgpointcloud` node.
* Read the serialized metadata to grab the POLYGON WKT produced by the hexbin filter
* Use SQL to update the row in the point cloud table that has the given pcid, setting a geometry built from that WKT
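Here is a minimal Python sketch of steps 2 through 4. Treat everything in it as an assumption to adapt: the metadata key path ("filters.hexbin" -> "boundary"), the table name, the patch column name ("pa", the writer's default), the connection string, the SRID, and the choice of keying on the pcid with PC_PCId() against the patch column.

```python
import json

import psycopg2

# Assumed names; adjust to your environment.
METADATA_FILE = "pipeline-metadata.json"  # metadata dumped from the pipeline run
PCID = 1                                  # pcid given to writers.pgpointcloud
TABLE = "patches"                         # table the writer inserted into
SRID = 4326                               # SRID for the boundary polygon

# Pull the POLYGON WKT out of the hexbin filter's metadata.  The exact
# key path depends on how you serialized the metadata; this one is a guess.
with open(METADATA_FILE) as f:
    meta = json.load(f)
wkt = meta["filters.hexbin"]["boundary"]

conn = psycopg2.connect("dbname=pointclouds")
with conn:
    with conn.cursor() as cur:
        # One-time: add a geometry column to hold the boundary
        # (skip this if the column already exists).
        cur.execute(
            "SELECT AddGeometryColumn(%s, 'boundary', %s, 'POLYGON', 2)",
            (TABLE, SRID))
        # Stamp the boundary onto the rows whose patches carry this pcid.
        # 'pa' is the default patch column name for writers.pgpointcloud.
        sql = ("UPDATE {0} SET boundary = ST_GeomFromText(%s, %s) "
               "WHERE PC_PCId(pa) = %s").format(TABLE)
        cur.execute(sql, (wkt, SRID, PCID))
conn.close()
```

Keying on the pcid keeps each boundary tied to its own run if several pipelines write into the same table; if there is only one, a plain UPDATE without the WHERE clause is enough.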
We are currently implementing support in `filters.programmable` to allow access to all the stage metadata as a JSON tree, which will be useful for inline operations, but it still won't be a complete solution for getting the pcid after all the data has been inserted. We could consider some kind of "post execution operation" where the metadata JSON tree (and the pipeline serialization output) are passed into a script that runs at the completion of the Stage::execute() call, but that isn't much better than coordinating things yourself.
Hope this helps,
Howard