[Liblas-devel] Point limit in the software
Howard Butler
hobu.inc at gmail.com
Mon Feb 23 15:06:12 EST 2009
On Feb 23, 2009, at 12:36 PM, Neil Woodhouse wrote:
> Howard,
>
> Could you see any issue with not writing the header until the end?
> What the code is doing right now is just adding points to the LAS file
> and writing the header just before closing. That way I can accumulate
> the bounds of the data as I go and commit them at the end.
Our implementation of lasinfo's "repair" operation does this by
opening the file in append mode and just writing the bounds. You can
cheat like this as long as you don't try to do things like change VLRs
or do any other operation that would attempt to change the size of the
header + the VLR block. I would try writing the file and then opening
it up again in append mode and changing the counts/bounds as you need
to.
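If you just want to see the shape of the trick, here is a bare-bones
sketch of what that "open it back up and patch the header" step amounts
to. This is not the libLAS API; it pokes the standard LAS 1.1 header
layout directly (the point count is a 4-byte unsigned at byte 107, and
the max/min bounds are six doubles starting at byte 179), so treat it as
an illustration rather than anything blessed:

    import struct

    def patch_header(path, count, mins, maxs):
        # rewrite the point count and bounds of an existing LAS 1.1 file
        f = open(path, 'r+b')
        try:
            f.seek(107)                       # Number of point records
            f.write(struct.pack('<L', count))
            f.seek(179)                       # Max X, Min X, Max Y, Min Y, Max Z, Min Z
            f.write(struct.pack('<6d',
                                maxs[0], mins[0],
                                maxs[1], mins[1],
                                maxs[2], mins[2]))
        finally:
            f.close()

    patch_header('1.1_1.las', 460000000,
                 (470692.447538, 4602888.904642, 16.0),
                 (470692.447538, 4602888.904642, 16.0))

As long as nothing before the point block changes size, patching in
place like this is safe.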
> Thinking about it, this seems a bit odd, because one should write the
> header at the start :).
Doing it the other way -- writing all of the points and then going back
to the beginning of the file to write the header (do you move the
existing points, or write a hole of blank bytes as a placeholder, etc.?)
-- seems even crazier :) The thing that causes the trouble is the
damn VLR block. It can be any size. If I had my druthers, the VLR
records would be written at the end of the file, denoted by some
scheme or with their location hinted at in the header. This would allow
us to modify the VLRs or the header without having to rewrite the points
(and to change the values of points too, in limited cases).
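To put numbers on that with the file further down: the header is 227
bytes, and the two LASF_Projection VLRs occupy 54 + 64 and 54 + 39 bytes
(a 54-byte VLR header plus the record payload), so the points start at
227 + 118 + 93 = 438, which is exactly the "Offset to Point Data" that
lasinfo reports. Grow either VLR by a byte and everything from byte 438
onward has to shift.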
>
>
> I figured that the number of points should not be a problem,
> because the tools on Isenburg's site work well with the data, though
> it takes a while to get the data out. I did this so that I could rule
> out the viewer, which seems stable, but the data has a lot of points
> in it.
>
> I am trying out a few things currently. Do you think it wise to
> treat the LAS file as an append operation and update the header after
> each block that is being processed? I am looking into logging the
> error information from the processing, which may give me some clues
> as to what the issue may be.
>
Using svn trunk, the following Python code successfully wrote out a
LAS 1.1, point data format 1 file with 460,000,000 points in one shot.
I was able to issue `lasinfo -c -i 1.1_1.las` and have it report
summaries as well. I do not adulterate the header after I am done
writing the points; I just tear the writer down with the f.close() call.
This was on my Mac Pro, and it wrote a 12 GB file in about 35 minutes
or so.
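As a rough sanity check: point data format 1 records are 28 bytes
apiece, so 460,000,000 points comes to about 460,000,000 * 28 = 12.9
billion bytes, which squares with the ~12 GB file, and 35 minutes works
out to roughly 220,000 points written per second on this machine.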
I think something is wrong with the order of operations you are doing
to the file when you finish up writing it. We'll need to see some
code to be able to dig any further.
> from liblas import color, file, guid, header, point, srs
> import datetime
>
> p = point.Point()
> p.flightline_edge = 0
> p.return_number = 1
> p.classification = 0
> p.scan_angle = -13
> p.x = 470692.447538
> p.y = 4602888.904642
> p.z = 16.0
> c = color.Color()
>
> c.red=255
> c.green=12
> c.blue=234
> p.color = c
> p.time = datetime.datetime(2008,3,19)
> p.classification = 2
> p.return_number = 2
>
> s = srs.SRS()
> s.proj4 = '+proj=utm +zone=15 +ellps=NAD83 +datum=NAD83 +units=m +no_defs '
>
> g = guid.GUID(key='8388f1b8-aa1b-4108-bca3-6bc68e7b062e')
>
> def write_file(version, format):
>     h = header.Header()
>     h.guid = g
>     h.date = datetime.datetime.now()
>     h.dataformat_id = format
>     h.major_version = 1
>     h.minor_version = version
>     h.min = [p.x, p.y, p.z]
>     h.max = [p.x, p.y, p.z]
>     h.point_return_count = [0L, 1L, 0L, 0L, 0L, 0L, 0L, 0L]
>     h.srs = s
>     h.date = p.time
>
>     f = file.File('/Volumes/Data/1.%d_%d.las' % (version, format),
>                   mode='w', header=h)
>     for i in xrange(460000000):
>         f.write(p)
>     f.close()
>
> write_file(1,1)
>
> fire:Data hobu$ /tmp/lasjunk/bin/lasinfo -c 1.1_1.las
>
> ---------------------------------------------------------
> Header Summary
> ---------------------------------------------------------
> File Name: 1.1_1.las
> Version: 1.1
> Source ID: 0
> Reserved: 0
> Project ID/GUID: '8388f1b8-aa1b-4108-bca3-6bc68e7b062e'
> System Identifier: 'libLAS'
> Generating Software: 'libLAS 1.2'
> File Creation Day/Year: 78/2008
> Header Size 227
> Offset to Point Data 438
> Number Var. Length Records 2
> Point Data Format 1
> Point Data Record Length 28
> Number of Point Records 460000000
> Number of Points by Return 0 1 0 0 0
> Scale Factor X Y Z 0.01 0.01 0.01
> Offset X Y Z 0.000000 0.000000 0.000000
> Min X Y Z 470692.447538 4602888.904642 16.000000
> Max X Y Z 470692.447538 4602888.904642 16.000000
> Spatial Reference +proj=utm +zone=15 +ellps=GRS80
> +datum=NAD83 +units=m +no_defs
>
> ---------------------------------------------------------
> VLR Summary
> ---------------------------------------------------------
> User: 'LASF_Projection' - Description: ''
> ID: 34735 Length: 64
>
> User: 'LASF_Projection' - Description: ''
> ID: 34737 Length: 39
>
>
> ---------------------------------------------------------
> Point Inspection Summary
> ---------------------------------------------------------
> Header Point Count: 460000000
> Actual Point Count: 460000000
>
> Minimum and Maximum Attributes (min,max)
> ---------------------------------------------------------
> Min X,Y,Z: 470692.440000,4602888.900000,16.000000
> Max X,Y,Z: 470692.440000,4602888.900000,16.000000
> Bounding Box: 470692.44,4602888.90,470692.44,4602888.90
> Time: 1205902800.000000,1205902800.000000
> Return Number: 2,2
> Return Count: 0,0
> Flightline Edge: 0,0
> Intensity: 0,0
> Scan Direction Flag: 0,0
> Scan Angle Rank: -13,-13
> Classification: 2,2
> Minimum Color: 0 0 0
> Maximum Color: 0 0 0
>
> Number of Points by Return
> ---------------------------------------------------------
> (0) 0 (1) 460000000 (2) 0 (3) 0 (4) 0
> Total Points: 460000000
>
> Number of Returns by Pulse
> ---------------------------------------------------------
> (1) 0 (2) 0 (3) 0 (4) 0 (5) 0 (6) 0 (7) 0
> Total Pulses: 0
>
> Actual number of points by return
> is different from header (actual, header):
> (0,0) (460000000,1) (0,0) (0,0) (0,0)
>
> Point Classifications
> ---------------------------------------------------------
> 460000000 Ground (2)
>