[pgpointcloud] stuck getting data into pgpointcloud

Adam Steer Adam.Steer at anu.edu.au
Thu Jul 21 23:14:07 PDT 2016


OK - dropping the PCID option from my pipeline.xml did the trick, and I have points in a table of patches!

I’d like to know why the schema didn’t work out - but I can tinker and figure that out later.
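
In case it helps anyone searching the archive later: the working writer block is just the pipeline from my earlier mail with the pcid option removed, so the writer generates and registers its own schema. Roughly (connection details elided as before, and I think the table is the same act_patches one):

    <Writer type="writers.pgpointcloud">
        <Option name="connection">host='localhost' dbname='….' user='….' password='….'</Option>
        <Option name="table">act_patches</Option>
        <Option name="srid">28355</Option>
        <Filter type="filters.chipper">
            <Option name="capacity">5000</Option>
            <Reader type="readers.las">
                <Option name="filename">./subsets/ACT2015-C3-ELL_6926094_55_0002_0002_7_12.las</Option>
                <Option name="spatialreference">EPSG:28355</Option>
            </Reader>
        </Filter>
    </Writer>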

Thanks everyone

Adam

> On 22 Jul 2016, at 11:32 AM, Andrew Bell <andrew.bell.ia at gmail.com> wrote:
> 
> The PDAL pgpointcloud writer will create an appropriate schema automatically.
> 
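> If you want to see what the writer registered, something like this should list it afterwards (pointcloud_formats is where the schemas live):
> 
> SELECT pcid, srid FROM pointcloud_formats ORDER BY pcid;
> 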
> On Jul 21, 2016 8:24 PM, "Adam Steer" <Adam.Steer at anu.edu.au> wrote:
> Hi Remi, all
> 
> Moving on - I realised that my pipeline was not limiting the output of PDAL properly, so I made a new schema to fit all 16 dimensions. It works (with one mismatched data type), for example:
> 
> pcbm_pc=> insert into act_points (pt) select pc_makepoint(3, array[692236.790,6094672.340,615.530,93.000,1.000,2.000,1.000,0.000,5.000,-12.000,0.000,601.000,436302.342,0.000,0.000,0.000]) as values;
> WARNING:  Value -12 truncated to 0 to fit in uint16_t
> 
> …where the array of values is a direct copy and paste from pdal translate filename.las stdout | head. And I get the results back as expected:
> 
> pcbm_pc=> SELECT PC_AsText(pt) FROM act_points LIMIT 2;
>                                    pc_astext
> -------------------------------------------------------------------------------
>  {"pcid":3,"pt":[692237,6.09466e+06,604.97,89,3,3,1,0,3,0,0,583,364567,0,0,0]}
>  {"pcid":3,"pt":[692237,6.09467e+06,615.53,93,1,2,1,0,5,0,0,601,436302,0,0,0]}
> (2 rows)
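> 
> (The truncation warning above looks like the -12 - ScanAngleRank, I think - landing in a dimension I declared as uint16_t; presumably switching that one to a signed interpretation would fix it, e.g.
> 
>     <pc:interpretation>int16_t</pc:interpretation>
> 
> ...that's the "one mismatched data type" I mentioned, separate from the patch problem below.)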
> 
> ….but when I try to make a patch, I’m missing something:
> 
> pcbm_pc=> select * from act_points limit 1;
>  id |                                                             pt
> ----+----------------------------------------------------------------------------------------------------------------------------
>   1 | 010300000048E17A9419202541AE47E10AD23F5741F6285C8FC2E78240590003000300010000000300000000004702736891ED5B401641000000000000
> (1 row)
> 
> pcbm_pc=> insert into act_patches (pa) values ('010300000048E17A9419202541AE47E10AD23F5741F6285C8FC2E78240590003000300010000000300000000004702736891ED5B401641000000000000');
> ERROR:  pc_patch_from_wkb: unknown compression '-1803886264' requested
> LINE 1: insert into act_patches (pa) values ('010300000048E17A941920…
> 
> …which is sort of similar to the message I get back from my PDAL pipeline (though that one complains about a wkb and data size mismatch rather than unknown compression):
> 
> # pdal pipeline postgis_pc_pipeline.xml
> PDAL: ERROR:  pc_patch_uncompressed_from_wkb: wkb size and expected data size do not match
> LINE 1: INSERT INTO "act_patches" ("pa") VALUES ('010300000000000000...
> 
> My schema says:
> 
>  <pc:metadata>
>     <Metadata name="compression">none</Metadata>
>   </pc:metadata>
> 
> So - I have an issue creating patches. Again, thanks in advance for any insight!
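> 
> (I suspect part of it is that I'm pasting a *point* WKB into a *patch* column - if I read the docs right, patches are normally built up from points with the PC_Patch aggregate, something like:
> 
> INSERT INTO act_patches (pa) SELECT PC_Patch(pt) FROM act_points;
> 
> ...rather than inserting a point's hex directly. That doesn't explain the PDAL error above, though.)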
> 
> Regards
> Adam
> 
> 
> 
> > On 21 Jul 2016, at 9:06 AM, Adam Steer <Adam.Steer at anu.edu.au> wrote:
> >
> > Thanks Rémi,
> >
> > I will do some tests and report back. I had not even considered this approach!
> >
> > Regards
> >
> > Adam
> >
> >> On 20 Jul 2016, at 10:04 PM, Rémi Cura <remi.cura at gmail.com> wrote:
> >>
> >> Hey,
> >> As you seem to be discovering the pgpointcloud world,
> >> maybe you could take this step by step.
> >> First create your point XML schema and test it within the database
> >> (meaning create a random point and patch using it).
> >>
> >> Then you can use LAStools to output a few actual points as text,
> >> and manually create pgpoints from those values (casting the points back to text to check that your precision/offsets and so on are OK).
> >>
> >> When you are sure everything works, then you can use pdal.
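> >>
> >> For example, a first sanity check might look something like this (values made up, using the pcid 2 from your schema below):
> >>
> >> SELECT PC_AsText(PC_MakePoint(2, ARRAY[436302.3, 692236.8, 6094672.3, 615.5, 93, 1, 2]));
> >> SELECT PC_AsText(PC_Patch(pt))
> >>   FROM (SELECT PC_MakePoint(2, ARRAY[436302.3, 692236.8, 6094672.3, 615.5, 93, 1, 2]) AS pt) AS q;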
> >>
> >> Cheers,
> >> Rémi-C
> >>
> >> 2016-07-20 2:52 GMT+02:00 Adam Steer <Adam.Steer at anu.edu.au>:
> >> Hi all
> >>
> >> I am attempting to ingest some points into postGIS-pointcloud, and basically failing at the start:
> >>
> >> - I want to ingest time,x,y,z,intensity,return number and classification
> >> - I’ve used pdal info --schema to get the numeric types, sizes and field names of my .LAS file (v1.4)
> >> - I’ve (maybe) built a format schema based on those (see below)
> >> - …and used a PDAL pipeline to try to ingest them.
> >>
> >> but I’ve managed to map something badly - PDAL returns:
> >>
> >> #> pdal pipeline postgis_pc_pipeline.xml
> >> PDAL: ERROR:  pc_patch_uncompressed_from_wkb: wkb size and expected data size do not match
> >> LINE 1: INSERT INTO "blocks" ("pa") VALUES ('01020000000000000087130...
> >>
> >> While I’ve worked with points a lot, I’ve not used any clever things like postGIS pointcloud to manage them (because PhD, time, and triage between working with what exists and learning new things) - and now I’m learning on the fly (with a need to learn fast!)
> >>
> >> PDAL tells me I’m clearly missing something about the relationship between the size of things in my schema, and the size of things PDAL is getting from my .LAS file.
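> >>
> >> (For reference, I'm comparing the dimension names, types and sizes reported by
> >>
> >>     pdal info --schema ./subsets/ACT2015-C3-ELL_6926094_55_0002_0002_7_12.las
> >>
> >> ...against the XML schema below.)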
> >>
> >> Also I’m trying to be lazy - I should probably work out the scales and offsets correctly, and store XYZ as long integers. I wanted to try floating point storage (as real, which if I read correctly should be big enough).
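> >>
> >> (If I do end up doing it properly, I gather a scaled-integer X dimension would look roughly like the pgpointcloud examples - something along these lines, with the scale picked to suit the data:
> >>
> >>  <pc:dimension>
> >>    <pc:position>2</pc:position>
> >>    <pc:size>4</pc:size>
> >>    <pc:description>X coordinate as scaled integer</pc:description>
> >>    <pc:name>X</pc:name>
> >>    <pc:interpretation>int32_t</pc:interpretation>
> >>    <pc:scale>0.01</pc:scale>
> >>  </pc:dimension>
> >>
> >> ...but for now I'd like to see floating point work.)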
> >>
> >> Any help will be highly appreciated!
> >>
> >> Thanks
> >>
> >> Adam
> >>
> >> ----
> >> Here is my pipeline:
> >>
> >> # cat postgis_pc_pipeline.xml
> >>
> >> <?xml version="1.0" encoding="utf-8"?>
> >> <Pipeline version="1.0">
> >>    <Writer type="writers.pgpointcloud">
> >>        <Option name="connection">host='localhost' dbname='….' user='….' password='….'</Option>
> >>        <Option name="table">blocks</Option>
> >>        <Option name="srid">28355</Option>
> >>        <Option name="pcid">2</Option>
> >>        <Filter type="filters.chipper">
> >>            <Option name="capacity">5000</Option>
> >>            <Reader type="readers.las">
> >>                <Option name="filename">./subsets/ACT2015-C3-ELL_6926094_55_0002_0002_7_12.las</Option>
> >>                <Option name="spatialreference">EPSG:28355</Option>
> >>            </Reader>
> >>        </Filter>
> >>    </Writer>
> >> </Pipeline>
> >>
> >> (notes here - should I use a filter to get *just* the fields I want - or does PDAL handle extracting only what exists in the schema?)
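> >>
> >> (If I'm reading the PDAL docs right, the pgpointcloud writer may have an output_dims option for exactly this - something like
> >>
> >>     <Option name="output_dims">X,Y,Z,Intensity,ReturnNumber,Classification,GpsTime</Option>
> >>
> >> ...but I haven't tried it, so the question stands.)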
> >>
> >> …and my schema SQL:
> >>
> >> INSERT INTO pointcloud_formats (pcid, srid, schema) VALUES (2, 28335,
> >> '<?xml version="1.0" encoding="UTF-8"?>
> >> <pc:PointCloudSchema xmlns:pc="http://pointcloud.org/schemas/PC/1.1"
> >>    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
> >>    <pc:dimension>
> >>    <pc:position>1</pc:position>
> >>    <pc:size>8</pc:size>
> >>    <pc:description>Time as seconds of GPS week, as real
> >>    </pc:description>
> >>    <pc:name>T</pc:name>
> >>    <pc:interpretation>real</pc:interpretation>
> >>    <pc:scale>1</pc:scale>
> >>  </pc:dimension>
> >>  <pc:dimension>
> >>    <pc:position>2</pc:position>
> >>    <pc:size>8</pc:size>
> >>    <pc:description>X coordinate as real.</pc:description>
> >>    <pc:name>X</pc:name>
> >>    <pc:interpretation>real</pc:interpretation>
> >>    <pc:scale>1</pc:scale>
> >>  </pc:dimension>
> >>  <pc:dimension>
> >>    <pc:position>3</pc:position>
> >>    <pc:size>8</pc:size>
> >>    <pc:description>Y coordinate as real</pc:description>
> >>    <pc:name>Y</pc:name>
> >>    <pc:interpretation>real</pc:interpretation>
> >>    <pc:scale>1</pc:scale>
> >>  </pc:dimension>
> >>  <pc:dimension>
> >>    <pc:position>4</pc:position>
> >>    <pc:size>8</pc:size>
> >>    <pc:description>Z coordinate is ellipsoidal height as real
> >>    </pc:description>
> >>    <pc:name>Z</pc:name>
> >>    <pc:interpretation>real</pc:interpretation>
> >>    <pc:scale>1</pc:scale>
> >>  </pc:dimension>
> >>  <pc:dimension>
> >>    <pc:position>5</pc:position>
> >>    <pc:size>6</pc:size>
> >>    <pc:description>The intensity value is the integer representation
> >>                    of the pulse return magnitude. This value is optional
> >>                    and system specific. However, it should always be
> >>                    included if available.</pc:description>
> >>    <pc:name>Intensity</pc:name>
> >>    <pc:interpretation>uint16_t</pc:interpretation>
> >>    <pc:scale>1</pc:scale>
> >>  </pc:dimension>
> >>  <pc:dimension>
> >>    <pc:position>6</pc:position>
> >>    <pc:size>1</pc:size>
> >>    <pc:description>Return number</pc:description>
> >>    <pc:name>ReturnNumber</pc:name>
> >>    <pc:interpretation>uint16_t</pc:interpretation>
> >>    <pc:scale>1</pc:scale>
> >>  </pc:dimension>
> >>  <pc:dimension>
> >>    <pc:position>7</pc:position>
> >>    <pc:size>1</pc:size>
> >>    <pc:description>Classification if supplied (ASPRS LAS guidelines)</pc:description>
> >>    <pc:name>Class</pc:name>
> >>    <pc:interpretation>uint16_t</pc:interpretation>
> >>    <pc:scale>1</pc:scale>
> >>  </pc:dimension>
> >>  <pc:metadata>
> >>    <Metadata name="compression">dimensional</Metadata>
> >>  </pc:metadata>
> >> </pc:PointCloudSchema>');
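> >>
> >> (Two things I'm unsure about in the above: the srid is 28335 here but 28355 in the pipeline, and some of the sizes may not match the interpretations - e.g. Intensity is uint16_t, which is 2 bytes, but has size 6. A self-consistent version of that dimension would presumably be:
> >>
> >>  <pc:dimension>
> >>    <pc:position>5</pc:position>
> >>    <pc:size>2</pc:size>
> >>    <pc:description>Pulse return magnitude</pc:description>
> >>    <pc:name>Intensity</pc:name>
> >>    <pc:interpretation>uint16_t</pc:interpretation>
> >>    <pc:scale>1</pc:scale>
> >>  </pc:dimension>
> >>
> >> ...though I don't know whether either of those explains the size mismatch error.)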
> >>
> >
> 
> _______________________________________________
> pgpointcloud mailing list
> pgpointcloud at lists.osgeo.org
> http://lists.osgeo.org/mailman/listinfo/pgpointcloud


