[pdal] entwine crash with large number of duplicate points

Jim Klassen klassen.js at gmail.com
Mon Feb 24 10:10:43 PST 2020


It looks like entwine crashes if it is presented with a large number of duplicate points.

I was seeing 'entwine build' segfault on one of about 33,000 LAZ files (all produced by the same PDAL pipeline).  Running 'entwine build' on that file alone, and even on certain subsets of its points, caused the segfault; other subsets of points from the same LAZ ran fine.  I've got it down to a 24 kB LAZ file (70,000 points) that still triggers the segfault.
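
The invocation is nothing special; something like the following reproduces it (file names here are placeholders, not the exact paths I used):

    entwine build -i crash-subset.laz -o ./entwine-out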

After rebuilding entwine in debug mode, it doesn't segfault; instead, an assertion is tripped:

entwine/builder/chunk-cache.cpp:68: void entwine::ChunkCache::insert(entwine::Voxel&, entwine::Key&, const entwine::ChunkKey&, entwine::Clipper&): Assertion `ck.depth() < maxDepth' failed.

My guess is that identical points can never be separated by further octree subdivision, so insertion keeps recursing deeper until it runs past the maximum tree depth.

I have tried entwine commits e61e33a (Feb 17, 2020) and 878acc8 (Sep 2, 2019); both behave the same.

Using 'pdal translate' to convert the LAZ to a text file, I found that the reduced file contains 65,599 points that are identical across all dimensions.
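
For anyone who wants to check their own data for this, something along these lines counts exact duplicates (file names are placeholders; the text writer emits one point per line, so identical lines are identical points across the written dimensions):

    pdal translate reduced.laz reduced.txt
    sort reduced.txt | uniq -c | sort -rn | head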

If I remove the duplicate points from the original LAZ using PDAL's filters.sample with radius=0.0001, the resulting LAZ file (4.5 million points, down from 11.2 million) builds fine with entwine.
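
The deduplication can be done in one step with 'pdal translate'; something like this (file names are placeholders).  filters.sample keeps at most one point per radius neighborhood, so a tiny radius collapses exact duplicates while leaving distinct points essentially untouched:

    pdal translate input.laz deduped.laz sample --filters.sample.radius=0.0001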


