[pdal] pgpointcloud write incorrect wkb size

adam steer adam.d.steer at gmail.com
Thu Apr 12 06:15:35 PDT 2018


Hi Andrew and Lars,

I would also greatly appreciate any insight. Some time ago I hit the same
issue and used exactly Andrew's advice as a workaround, but I never had
time to understand *why* I couldn't use a predefined schema, or to run
more experiments.

Thanks,

Adam

On 12 Apr. 2018 22:58, "Andrew Bell" <andrew.bell.ia at gmail.com> wrote:

I'd have to do some more work to figure out the discrepancy, but if your
goal is to use PDAL to read/write to postgres, you don't need to do all the
setup that you've done.  PDAL will handle this for you.  Simply point PDAL
at your (empty) database and let it do the setup and writing for you.

Normally PDAL will create a schema that writes X, Y, and Z as
double-precision values.  You can have it write 32-bit scaled integers
instead by providing scaling factors (see
https://www.pdal.io/stages/writers.pgpointcloud.html for more info).  I
think the issue is that the schema PDAL wants to use isn't the one you've
created.
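
A sketch of such a pipeline: the scale_x/scale_y/scale_z options are the
documented writers.pgpointcloud scaling factors, the connection values are
placeholders, and (if I'm reading the docs right) leaving out "pcid" lets
PDAL register its own schema in pointcloud_formats rather than trying to
match an existing one:

```json
{
  "pipeline": [
    {
      "type": "readers.text",
      "filename": "data.txt",
      "separator": ";"
    },
    {
      "type": "filters.chipper",
      "capacity": 100
    },
    {
      "type": "writers.pgpointcloud",
      "connection": "host='localhost' dbname='aaa' port='5433' user='aaa' password='aaa'",
      "table": "pcpoint",
      "compression": "none",
      "scale_x": 0.01,
      "scale_y": 0.01,
      "scale_z": 0.01
    }
  ]
}
```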

If I have time I'll try to reproduce your steps so that I can give a more
precise answer.
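
For what it's worth, the reported sizes look consistent with a
dimension-type mismatch. A back-of-the-envelope check, assuming the custom
schema declares X/Y/Z as int32 (as in the pgpointcloud README example,
minus Intensity) while PDAL's default schema stores them as doubles:

```python
# Sketch of the WKB size arithmetic, assuming pgpointcloud's uncompressed
# layouts: a point WKB is endian byte + uint32 pcid + point data; a patch
# WKB is endian byte + uint32 pcid + uint32 compression + uint32 npoints
# + point data.

# PC_MakePoint against a 3-dimension int32 schema: 3 x 4-byte dims.
point_wkb = 1 + 4 + 3 * 4          # 17 bytes, matching the SQL-shell insert

# A one-point patch where X, Y, Z are 8-byte doubles (PDAL's default).
patch_wkb = 1 + 4 + 4 + 4 + 3 * 8  # 37 bytes, matching PDAL's insert

print(point_wkb, patch_wkb)  # 17 37
```

If that accounting is right, the 17- vs 37-byte difference is exactly the
patch header plus double-vs-int32 storage, not corruption.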

On Thu, Apr 12, 2018 at 6:14 AM, Lars <laasunde at hotmail.com> wrote:

> Hello
>
> Using PDAL 1.5, PostgreSQL 9.5 and PostGIS bundle 2.3 on Windows 7.
>
>
> We have followed the instructions found at
> https://github.com/pgpointcloud/pointcloud
> (Created database, added extensions, created pcpoint and pcpatch tables).
>
>
> We have inserted a schema into pointcloud_formats (same as
> https://github.com/pgpointcloud/pointcloud but without Intensity).
>
>
> We have successfully inserted data into the pcpoint table from the SQL shell.
> INSERT INTO pcpoint(pa) VALUES(PC_MakePoint(1, ARRAY[1,2,3]));
> This results in a data blob that is 17 bytes.
>
>
> However, using a PDAL pipeline to insert data into the table causes the
> following error:
> pc_patch_uncompressed_from_wkb: wkb size and expected size does not match.
>
>
> The SQL log file shows that PDAL executes the following SQL statement:
> INSERT INTO pcpoint(pa) VALUES('0103000000........0840');
> This data blob is 37 bytes and is very different from the test above (17 bytes).
>
>
> The pipeline script looks like this:
> {
>   "pipeline": [
>     {
>       "type": "readers.text",
>       "filename": "data.txt",
>       "separator": ";"
>     },
>     {
>       "type":"filters.chipper",
>       "capacity":100
>     },
>     {
>       "type": "writers.pgpointcloud",
>       "connection": "host='localhost' dbname='aaa' port='5433' user='aaa'
> password='aaa'",
>       "table": "pcpoint",
>       "pcid":"1",
>       "compression": "none"
>     }
>   ]
> }
> The data.txt file looks like this:
> X;Y;Z
> 1.00;2.00;3.00
>
>
> What are we doing wrong? How can I make the PDAL pipeline produce the
> correct data blob that matches the schema used in the db?
>
>
> Thanks.
>
>
>
> _______________________________________________
> pdal mailing list
> pdal at lists.osgeo.org
> https://lists.osgeo.org/mailman/listinfo/pdal
>



-- 
Andrew Bell
andrew.bell.ia at gmail.com