[Liblas-devel] chopping up .las

Peter Tittmann pwtittmann at ucdavis.edu
Thu Dec 16 15:11:07 EST 2010


 Greetings all,


I wanted to see if someone could advise on the most efficient way to use LibLAS for the following:


I have a series of 200-500 MB .las files with classified points.


I have a C++ program that reads in .txt files and doesn't handle them well when they're bigger than about 10 MB. I have written some Python to batch the C++ app over multiple tiles. Obviously, using LibLAS to load the .las files directly into the app would be best, but due to time/resource constraints that's not going to happen.
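The batching script is essentially just this (the app name "pointapp" and its one-filename-per-run interface are stand-ins for the real program):

    import glob
    import subprocess

    # Run the C++ app once per small text tile.
    # "./pointapp" and the single-filename argument are placeholders
    # for the actual program and its interface.
    for txt in sorted(glob.glob("tiles/*.txt")):
        subprocess.check_call(["./pointapp", txt])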


I need to subset the large tiles spatially and by return number/classification. My idea at the moment is to write a script to batch las2las2 over each combination of spatial extent and point class, then use the Python API to produce text files that are digestible by the app.
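Roughly, I'm picturing something like this for the las2las2 step (the --extent and --keep-classes option names are from memory and would need to be checked against las2las2 --help; the extents and class codes below are dummy values):

    import subprocess

    # Dummy tile extents (minx, miny, maxx, maxy) and class codes;
    # real values would come from the project tiling scheme.
    tile_extents = [(0.0, 0.0, 500.0, 500.0), (500.0, 0.0, 1000.0, 500.0)]
    classes = [2, 5]

    for i, (minx, miny, maxx, maxy) in enumerate(tile_extents):
        for cls in classes:
            out = "subsets/tile%03d_class%d.las" % (i, cls)
            # Option names are my best recollection of las2las2's
            # filtering flags; verify against las2las2 --help.
            subprocess.check_call([
                "las2las2", "-i", "big.las", "-o", out,
                "--extent", "%f %f %f %f" % (minx, miny, maxx, maxy),
                "--keep-classes", str(cls),
            ])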


My question (finally) is whether this is the best way to approach this problem from an efficiency standpoint. I have used the Python API to read through the points in the original (large) .las files and spit out text files matching my criteria, but it's very brute force and slow.
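For reference, the brute-force version is essentially the following (assuming I'm remembering the point attribute names in the Python bindings correctly, and hard-coding ground/first-return as the example criteria):

    from liblas import file

    # Read the large tile, keep first-return ground points (class 2),
    # and dump x y z to a text file the C++ app can read.
    las = file.File("big.las", mode="r")
    out = open("ground_first.txt", "w")
    for p in las:
        if p.classification == 2 and p.return_number == 1:
            out.write("%.2f %.2f %.2f\n" % (p.x, p.y, p.z))
    out.close()
    las.close()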


Thanks in advance,


Peter


-- 
Peter Tittmann
UC Davis
Geography Graduate Group
