[postgis-users] Massive Lidar Dataset Datatype Suggestions?
Gerry Creager N5JXS
gerry.creager at tamu.edu
Sat Nov 13 16:11:37 PST 2004
Hmmm... Can we start thinking in terms of a NetCDF data structure?
Robert W. Burgholzer wrote:
> Perhaps I've weighed in on this before, so excuse me if I sound like a
> broken record. This situation appears to me to be best suited to a
> raster analysis. I would like to see (help to develop) this type of
> functionality in postgis.
>
> From what I have seen, there are no raster components to the OGC specs
> for SQL. IMHO, this is a logical next step, and also one that could be
> potentially very powerful for the integration of environmental data such
> as the lidar elevations and so forth. Of course, with anything
> worthwhile, this will require some time and expertise - something I am
> more than willing to be involved in. If anyone could point me to
> documents on how to outline a raster specification for SQL, that
> would be great.
>
> A potential set of components:
> - Raster as rectangular feature
> - Geometry column holds boundaries of raster rectangle
> - raster data as file pointer or large data object (BLOB?)
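The components Rob lists might look something like the following in hypothetical PostGIS-style SQL (table and column names are my own invention, not an existing spec):

```sql
-- Hypothetical raster table: one row per raster tile.
-- The geometry column holds the tile's bounding rectangle, so the
-- existing GiST indexing and spatial operators still apply.
CREATE TABLE raster_tiles (
    rid        serial PRIMARY KEY,
    extent     geometry,           -- boundary of the raster rectangle (POLYGON)
    width      integer,            -- pixel dimensions of the tile
    height     integer,
    pixel_size double precision,   -- georeferencing: ground units per pixel
    data       bytea               -- the raster data itself (BLOB), or a file pointer
);

-- A spatial query can then find the tiles covering an area of interest
-- and hand their data to a raster-processing function:
SELECT rid, data
  FROM raster_tiles
 WHERE extent && GeomFromText('POLYGON((0 0, 0 100, 100 100, 100 0, 0 0))', -1);
```

Whether `data` holds the bytes directly or a pointer to a file on disk is the main design trade-off Rob raises; either choice fits this schema.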
>
>
> Rob
>
> At 12:34 PM 11/13/2004 -0800, you wrote:
>
>> Greetings all,
>>
>> I asked some questions about massive point datasets in April, before
>> the implementation of LWGEOM. Then I fell off the face of the earth
>> for a while. Now I'm back working on this issue again and would love
>> some input.
>>
>> I have a LIDAR dataset of 473 million points with 2 point geometries
>> (first & last returns), timestamp and 2 laser intensity returns
>> (integer).
>>
>> I am trying to figure out the best setup for storing, extracting and
>> processing this dataset. By the way, this is a smallish dataset; we will be
>> processing projects of 2 billion+ points in the near future.
>>
>> I have currently compiled & installed PostgreSQL 8.0 beta 4, with the
>> PostGIS 0.9 release, GEOS 2.0.1, and PROJ 4. This is on Fedora Core 2,
>> on an SMP 733MHz IntelliStation with 1GB RAM and a 160GB HDD.
>>
>> I am using HWGEOM with WKT right now, and managed to create a table
>> with oid & the_geom, one point per return.
>> The upload took 48 hours and is roughly 85GB.
>> The GiST indexing took ~80 hours and is ~35GB.
>>
>> This is obviously non-optimal considering we now have WKB and LWGEOM
>> to play with. I couldn't get LWGEOM to install properly from the cvs
>> extract, which is why I reverted to the 0.9 version.
>>
>> So, any suggestions on how to get the full 9-column dataset uploaded
>> with a more efficient data structure? (Note: the current machine is just
>> a test machine; production will have a LOT more drive space.)
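One common pattern for a load of this size (a sketch only, not tested against this dataset; the table and column names are assumed, and the exact nine columns would need adjusting to the real data) is to bulk-load through COPY and build the spatial indexes only after the load completes:

```sql
-- Assumed layout matching the description above: two point geometries
-- (first & last return), a timestamp, and two integer intensity values.
CREATE TABLE lidar_returns (
    gid           serial PRIMARY KEY,
    first_return  geometry,   -- POINT
    last_return   geometry,   -- POINT
    capture_time  timestamp,
    intensity_1   integer,
    intensity_2   integer
);

-- Bulk-load from a prepared tab-separated file; COPY is far faster than
-- hundreds of millions of individual INSERT statements.
COPY lidar_returns (first_return, last_return, capture_time, intensity_1, intensity_2)
FROM '/data/lidar/returns.tsv';

-- Build GiST indexes only after the load, then gather statistics.
-- (Older PostGIS releases require the operator class spelled out, e.g.
--  USING GIST (first_return GIST_GEOMETRY_OPS).)
CREATE INDEX lidar_first_gist ON lidar_returns USING GIST (first_return);
CREATE INDEX lidar_last_gist  ON lidar_returns USING GIST (last_return);
VACUUM ANALYZE lidar_returns;
```

Indexing after the load avoids paying the GiST maintenance cost on every inserted row, which is likely part of the 48-hour upload and 80-hour indexing times reported above.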
>>
>> Also, I intend to perform a fair amount of point processing inside the
>> database using either PL/pgSQL or the Java API. Is this a bad idea?
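In-database processing with PL/pgSQL is workable when it can be expressed set-wise. A hypothetical example (function, table, and column names are all assumed, not from the original post) that computes the horizontal separation between the two returns of a pulse:

```sql
-- Hypothetical PL/pgSQL helper: horizontal separation between the first
-- and last return of one LIDAR pulse. Column names are assumed.
CREATE OR REPLACE FUNCTION return_separation(first_g geometry, last_g geometry)
RETURNS double precision AS '
BEGIN
    RETURN distance(first_g, last_g);
END;
' LANGUAGE 'plpgsql' IMMUTABLE;

-- Used set-wise, the work stays inside the database:
SELECT gid, return_separation(first_return, last_return) AS sep
  FROM lidar_returns
 WHERE return_separation(first_return, last_return) > 0.5;
```

Row-at-a-time loops over hundreds of millions of points will be slow in any procedural language, though; a set-based query like the one above will generally scale better than cursoring the points out to a Java client and back.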
>>
>> Thanks for any input.
>>
>> ________________
>> Collin Bode
>> GIS Informatics Researcher
>> Power Lab, Integrative Biology
>> University of California, Berkeley
>> _______________________________________________
>> postgis-users mailing list
>> postgis-users at postgis.refractions.net
>> http://postgis.refractions.net/mailman/listinfo/postgis-users
>
>
> Robert Burgholzer
> Environmental Engineer
> MapTech Inc.
> phone: 804-869-3066
> http://www.maptech-inc.com/
--
Gerry Creager -- gerry.creager at tamu.edu
Network Engineering -- AATLT, Texas A&M University
Cell: 979.229.5301 Office: 979.458.4020 FAX: 979.847.8578
Page: 979.228.0173
Office: 903A Eller Bldg, TAMU, College Station, TX 77843