[Liblas-devel] problems with lasblock and big datasets
Howard Butler
hobu.inc at gmail.com
Fri Sep 10 09:06:28 EDT 2010
On Sep 10, 2010, at 2:47 AM, Hugo Ledoux wrote:
> Greetings,
>
> While testing lasblock yesterday (code checked out a few days ago and compiled under Linux with GCC 4.4), I ran into a problem with a big dataset:
>
> hugo at TUDLEDOUX:~/data/las/ahn2_original/gefilterd$ time lasblock -c 10000 -p 3 bigone.las
> terminate called after throwing an instance of 'std::length_error'
> what(): vector::reserve
> Aborted
>
> real 0m5.066s
> user 0m4.620s
> sys 0m0.432s
>
>
> Reading how lasblock works, I did expect it to struggle with very large datasets, but here it crashed after only about 5 seconds. The dataset "bigone.las" has ~280 million points and is 5.3 GB. With smaller datasets (~20M points) it works flawlessly.
>
> Is it simply that allocating two arrays of 280M entries is too much, causing it to abort?
Yep, though it's actually trying to reserve 3 * 280M entries, and depending on the underlying STL implementation of std::vector::reserve, it may really be trying to allocate all of that memory up front.
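For illustration, here is a minimal standalone C++ sketch (not lasblock's actual code; double is just a stand-in element type) of why a reserve of that size can abort the process: a request larger than the vector's max_size() throws std::length_error, the typical outcome on a 32-bit build, while a legal-but-huge request can instead throw std::bad_alloc when the allocator can't satisfy it:

#include <cstddef>
#include <iostream>
#include <stdexcept>
#include <vector>

int main()
{
    std::vector<double> v;
    std::size_t const points = 280000000;  // ~280M points, as in bigone.las
    std::size_t const wanted = 3 * points; // one entry per sort array

    try {
        v.reserve(wanted);  // ~6.3 GiB of doubles, requested up front
    } catch (std::length_error const& e) {
        // wanted > v.max_size(): typical on a 32-bit build
        std::cerr << "length_error: " << e.what() << '\n';
    } catch (std::bad_alloc const&) {
        // the size was legal, but the allocator couldn't provide the memory
        std::cerr << "bad_alloc\n";
    }
    return 0;
}

When the exception is left uncaught, as in your run, the C++ runtime calls std::terminate, which produces exactly the "terminate called after throwing an instance of 'std::length_error' ... Aborted" output shown above.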
You're the first to hit the limit, although we've been running it on 64-bit hardware with substantial RAM without much trouble. At this point, I can't offer much in the way of workarounds other than the following:
las2las2 --split 1024 bigone.las
lasblock bigone_1.las
lasblock bigone_2.las
...
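If the split produces many pieces, a small shell loop saves typing. This is just a sketch that assumes the split files follow the bigone_N.las naming shown above:

for f in bigone_*.las; do
    lasblock "$f"
done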
In the interim, I'll see if there are any simple ways to avoid allocating so much memory for the sort(s).
Howard