[postgis-users] Re: [GMT-HELP] gridding density of ship tracks?
Brent Wood
pcreso at pcreso.com
Thu Jul 12 13:40:34 PDT 2007
I figured it was at least as relevant here, so here's a copy.
Brent Wood
--- Kurt Schwehr <schwehr at ccom.unh.edu> wrote:
>
> Hi Brent,
>
> That sounds much like what I am trying to do. I have the data in GMT
> multisegment lines, PostgreSQL/PostGIS, and SQLite for various tasks.
> Do you have a paper or tech report on what you did?
It's being written at present, as a client report for a government dept; whether it
makes it out of the grey literature is yet to be decided.
The summary below should cover it for you though.
I used a script to generate 5000m x 5000m cells in a custom local equal area
projection created in the spatial_ref_sys table, then added another EPSG:4326
polygon column to this table & updated it with the lat/long transform() of
the equal area cells, so I had equal area cells in a lat/long projection to
work with my lat/long trackline data.
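A rough sketch of that step, using the pre-ST_ PostGIS function names of the day; the table and column names (cells, geom_eqarea, geom_4326) and the custom SRID 27201 are illustrative, not the actual schema:

```sql
-- Cell polygons assumed already generated by a script into
-- cells(cell_id, geom_eqarea) with a custom equal-area SRID (27201
-- here) registered in spatial_ref_sys.
-- Add a lat/long copy of each cell so cells & tracklines share a SRID.
ALTER TABLE cells ADD COLUMN geom_4326 geometry;
UPDATE cells SET geom_4326 = transform(geom_eqarea, 4326);
CREATE INDEX cells_4326_gist ON cells USING gist (geom_4326);
```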
Tracks were then buffered to polygons (to give swept areas) using a transform()
to equal area for the buffer(), then a transform() back to lat/long, i.e.:
transform(buffer(transform(trackline,27201), doorspread/2, 8), 4326)
& these areas were joined (cropped) at cell boundaries so each track was
divided into cell-based sections. This allows easy access to the number of
tracks per cell & the aggregate swept area per cell (which can exceed the cell
area where a cell is encountered often).
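A minimal sketch of the cell-cropping join; the swept-area table name and columns (swept(track_id, geom_4326)) are hypothetical, and intersection()/intersects() are the old un-prefixed PostGIS names:

```sql
-- One row per (track, cell) pair: the piece of each swept-area
-- polygon falling inside each cell.
CREATE TABLE swept_by_cell AS
SELECT c.cell_id,
       s.track_id,
       intersection(s.geom_4326, c.geom_4326) AS geom_4326
  FROM swept s, cells c
 WHERE s.geom_4326 && c.geom_4326          -- bbox filter, uses GiST index
   AND intersects(s.geom_4326, c.geom_4326);

-- Tracks per cell & aggregate swept area per cell, with areas
-- computed back in the equal-area projection.
SELECT cell_id,
       count(DISTINCT track_id)               AS n_tracks,
       sum(area(transform(geom_4326, 27201))) AS swept_m2
  FROM swept_by_cell
 GROUP BY cell_id;
```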
Merging the areas per cell to give the % of each cell that has been fished is
done with geomunion(), dissolving cell-trackline boundaries to give a single
multipolygon per cell covering the fished region in each cell. This gave a few
problems, which an update to GEOS v3 (latest RC version) pretty much remedied.
With up to thousands of polygons in a cell to merge, it wasn't pretty
topologically speaking.
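The dissolve-and-percentage step might look like the following, assuming the hypothetical swept_by_cell table above and geomunion() as the old aggregate union; the 5000 x 5000 divisor is the cell area in m^2:

```sql
-- Dissolve overlapping swept-area pieces into one multipolygon per
-- cell, then express the fished area as a % of the 25 km^2 cell.
SELECT cell_id,
       area(transform(geomunion(geom_4326), 27201))
         / (5000.0 * 5000.0) * 100 AS pct_fished
  FROM swept_by_cell
 GROUP BY cell_id;
```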
Once this geoprocessing was complete, it was just basic queries to
summarise/analyse the data, & a script to loop through species, years, methods
etc to use GMT to iteratively & semi-automagically generate some 2000 maps of
appropriate subsets of the data extracted from the database on the fly as it
went.
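One map's worth of extraction for GMT might be sketched like this; the species/year/method columns (and their values) are assumed for illustration, carried through from the track records:

```sql
-- Per-cell fished % for one species/year/method subset, with a cell
-- centroid in lat/long for GMT to grid & plot.
SELECT c.cell_id,
       x(centroid(c.geom_4326)) AS lon,
       y(centroid(c.geom_4326)) AS lat,
       area(transform(geomunion(s.geom_4326), 27201))
         / (5000.0 * 5000.0) * 100 AS pct_fished
  FROM swept_by_cell s JOIN cells c USING (cell_id)
 WHERE s.species = 'HOK' AND s.year = 2005 AND s.method = 'BT'
 GROUP BY c.cell_id, c.geom_4326;
```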
All done on a dual core Athlon64 box with the data on striped WD Raptor SATA
drives & 4GB memory with 64bit OpenSuSE 10.2 & everything compiled from scratch.
The Raptors gave about 150MB/sec reads, which helps a lot :-)
My problem now is how to best manage this number of maps in the report :-)
Hope this helps, if you need more info, let me know.
Cheers,
Brent
>
> Thanks,
> -kurt
>
>
> >
> > Hi Kurt,
> >
> > Another option for you.
> >
> > I have done this exercise with 1,500,000 ship tracks to model fishing
> effort.
> > I've used GMT as the cartographic tool, but the data was stored & the cell
> > counts done in PostGIS. Being able to group data by vessel category, target
> > species, etc, was also necessary, and SQL with PostGIS extensions provides
> lots
> > of functionality and flexibility to manage/query/overlay the data.
> >
> > I also turned tracklines into polygons based on wingspread/doorspread for
> > trawlers, so swept area polygons could be analysed by cell, not just the
> > tracklines.
> >
> >
> > Cheers,
> >
> > Brent Wood
> >
> >
> >
>
>