[pdal] entwine crash with large number of duplicate points

Connor Manning connor at hobu.co
Mon Feb 24 10:30:56 PST 2020


If you can email me directly with a file that reproduces the issue, I can
take a look.

On Mon, Feb 24, 2020 at 12:10 PM Jim Klassen <klassen.js at gmail.com> wrote:

> It looks like entwine crashes if it is presented with a large number of
> duplicate points.
>
> I was seeing 'entwine build' segfault on one of about 33,000 laz files
> (all produced by the same pdal pipeline).  Running 'entwine build' on this
> file alone and even just on certain subsets of the points in it caused the
> segfault.  Other subsets of points from the same LAZ ran fine.  I've got it
> down to a 24 kB LAZ file (70,000 points) that still causes the segfault.
>
> After rebuilding entwine in debug mode it doesn't segfault, but an assert
> gets tripped instead:
>
>     entwine/builder/chunk-cache.cpp:68: void entwine::ChunkCache::insert(
>         entwine::Voxel&, entwine::Key&, const entwine::ChunkKey&,
>         entwine::Clipper&): Assertion `ck.depth() < maxDepth' failed.
>
> I have tried entwine e61e33a (Feb 17 2020) and 878acc8 (Sep 2 2019) and
> both behave the same.
>
> Using pdal translate to convert the LAZ to a text file, I found the reduced
> file contains 65,599 points that are identical across all dimensions.
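>
> (A quick cross-check without going through a text file, sketched with the
> PDAL Python bindings and numpy; "reduced.laz" is a placeholder for the
> small test file:)
>
>     import json
>     import pdal
>     import numpy as np
>
>     # Read every dimension into a structured numpy array and count the
>     # records that are identical across all dimensions.
>     spec = {"pipeline": [{"type": "readers.las", "filename": "reduced.laz"}]}
>     pipeline = pdal.Pipeline(json.dumps(spec))
>     pipeline.execute()
>     points = pipeline.arrays[0]
>     print(len(points) - len(np.unique(points)), "exact duplicates")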
>
> If I remove the duplicate points from the original LAZ using PDAL's
> filters.sample.radius=0.0001, the resulting LAZ file works fine with entwine
> (reduced to 4.5 million points down from 11.2 million points).
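>
> (For anyone reproducing that workaround, the same filters.sample step as
> a standalone pipeline; a sketch, with placeholder filenames and the
> radius value above:)
>
>     import json
>     import pdal
>
>     # Drop near-coincident points with filters.sample before running
>     # 'entwine build'.  Filenames are placeholders; radius as above.
>     spec = {"pipeline": [
>         {"type": "readers.las", "filename": "original.laz"},
>         {"type": "filters.sample", "radius": 0.0001},
>         {"type": "writers.las", "filename": "dedup.laz",
>          "compression": "laszip"},
>     ]}
>     pipeline = pdal.Pipeline(json.dumps(spec))
>     pipeline.execute()
>     print(len(pipeline.arrays[0]), "points kept in dedup.laz")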
>