[pdal] How to handle las files that don't fit in memory

Tom van Tilburg tom.van.tilburg at gmail.com
Wed Dec 16 14:38:10 PST 2015


I have a 3 GB laszip (LAZ) file and am trying to run the PDAL splitter filter
on it. PDAL tries to load the entire file into memory (about 30 GB once
decompressed), which doesn't fit, and it segfaults.
Is there an option to first split the large LAS file into smaller blocks
based on extent, by simply iterating over all points and writing them
directly to the output instead of caching everything in memory first?
I tried the crop filter as well, but it also seems to load all points into
memory first.
This is possible with LAStools, but I would prefer to keep my process within
a PDAL-based script.
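
For concreteness, below is a rough sketch of the kind of streaming split I
mean, done outside PDAL with the laspy Python library. Everything in it is
illustrative: the file names, the two-tile layout, and the chunk size are
made up, it assumes laspy 2.x (whose chunk_iterator yields point records with
scaled .x access), and reading .laz additionally needs a lazrs or laszip
backend installed.

import copy

import laspy
import numpy as np

CHUNK = 1_000_000  # points per read; bounds peak memory instead of file size

with laspy.open("input.laz") as reader:  # hypothetical input path
    header = reader.header
    # Split the full extent into two tiles along the x midpoint.
    mid_x = (header.mins[0] + header.maxs[0]) / 2.0

    # Each writer gets its own header copy, since writers update point counts.
    with laspy.open("west.las", mode="w", header=copy.deepcopy(header)) as west, \
         laspy.open("east.las", mode="w", header=copy.deepcopy(header)) as east:
        for points in reader.chunk_iterator(CHUNK):
            mask = np.asarray(points.x) < mid_x  # scaled x coordinates
            if mask.any():
                west.write_points(points[mask])
            if (~mask).any():
                east.write_points(points[~mask])

Something equivalent driven by a PDAL pipeline, with no intermediate caching
of the whole cloud, is what I'm after.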

Best,
 Tom
