<div dir="ltr">If you can email me directly with a file that reproduces the issue I can take a look.</div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, Feb 24, 2020 at 12:10 PM Jim Klassen <<a href="mailto:klassen.js@gmail.com">klassen.js@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">It looks like entwine crashes if it is presented with a large number of duplicate points.<br>
<br>
I was seeing 'entwine build' segfault on one of about 33,000 laz files (all produced by the same pdal pipeline). Running 'enwtine build' on this file alone and even just on certain subsets of the points in it caused the segfault. Other subsets of points from the same LAZ ran fine. I've got it down to a 24 kB LAZ file (70,000 points) that still causes the segfault.<br>
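
For reference, the failing run is just a plain build of the reduced file, something like this (paths are placeholders):

    # hypothetical paths; reduced.laz is the 24 kB reproducer
    entwine build -i reduced.laz -o ./reduced-ept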

After rebuilding entwine in debug mode it no longer segfaults, but an assertion fails instead:

entwine/builder/chunk-cache.cpp:68: void entwine::ChunkCache::insert(entwine::Voxel&, entwine::Key&, const entwine::ChunkKey&, entwine::Clipper&): Assertion `ck.depth() < maxDepth' failed.

I have tried entwine e61e33a (Feb 17 2020) and 878acc8 (Sep 2 2019) and both behave the same.

Using 'pdal translate' to convert the LAZ to a text file, I found the reduced file contains 65,599 points that are identical across all dimensions.
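
The check was nothing fancy; assuming a one-point-per-line text dump (filenames are placeholders), counting repeated lines shows the duplicates:

    # hypothetical filenames; writers.text is inferred from the .txt extension
    pdal translate reduced.laz reduced.txt
    sort reduced.txt | uniq -c | sort -rn | head -n 3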

If I remove the duplicate points from the original LAZ using PDAL's filters.sample with radius=0.0001, the resulting LAZ file works fine with entwine (reduced to 4.5 million points from 11.2 million points).
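
For completeness, the dedup step was along these lines (filenames are placeholders):

    # hypothetical filenames; filters.sample keeps no two points closer than the given radius
    pdal translate original.laz deduped.laz sample --filters.sample.radius=0.0001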