[Landsat-pds] Status

Sundwall, Jed jsundwal at amazon.com
Fri Jan 23 16:13:41 PST 2015


Team: thank you all for working with us on this project. I just want to point out that we recently acquired our 10,000th scene. Looking forward to acquiring many, many more, but just wanted to celebrate such a nice round number as we get started.

Thank you Frank for getting this going, and thank you Amit, for offering to help.

Have a great weekend everyone, Jed.


On Jan 23, 2015, at 1:03 PM, Amit Kapadia <amit at mapbox.com> wrote:

Hi Frank,

Thanks for all the hard work!

We do have most of December and January scenes available on S3. They're in a private bucket, so we'll have to figure out the best way to transfer these data. One option is that I can run a copy from our bucket into the public bucket, and you can take it from there. I'm open to other ideas. Our scenes expire after 50 days, so the December stash dwindles a little bit each day.
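For what it's worth, a bucket-to-bucket transfer like that can be done with server-side copies, something along these lines (a rough sketch: the client object, function name, and bucket names are placeholders, not the actual setup):

```python
def copy_scenes(s3, src_bucket, dst_bucket, keys):
    """Server-side copy each scene key from src_bucket to dst_bucket.

    `s3` is any client exposing a boto3-style copy_object(); the copy
    happens inside S3, so no scene data passes through the machine
    running this loop.
    """
    copied = 0
    for key in keys:
        s3.copy_object(Bucket=dst_bucket, Key=key,
                       CopySource={"Bucket": src_bucket, "Key": key})
        copied += 1
    return copied
```

The keys themselves would come from listing the private bucket before the 50-day expiry removes them.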

Cheers,
Amit


On Thu, Jan 22, 2015 at 1:23 PM, Chris Holmes <cholmes at planet.com> wrote:
Nice work Frank!

Markdown in the github repo sounds ideal to me.

On Thu, Jan 22, 2015 at 12:20 PM, Frank Warmerdam <warmerdam at pobox.com> wrote:
Folks,

OK, I now have the pull-from-USGS-and-queue-in-S3 job running every two hours (for two hours) on our job system. At the beginning of each job it processes the previous run's queued scenes in parallel (usually 200-400), then builds index files, then pulls scenes from USGS until the job times out (2 hour limit).

So we should see the catalog fill out automatically, though the slowness and 503s of the USGS download service are pretty frustrating to me. Incidentally, we try to run up to 6 of the puller jobs. They also now do 4 retries with exponential backoff on 503s.
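The retry behavior described above (4 retries with exponential backoff on 503s) amounts to something like this sketch; the function name, the fetch callable, and the base delay are illustrative, not the actual puller code:

```python
import time

def fetch_with_backoff(fetch, max_retries=4, base_delay=1.0):
    """Call fetch() until it returns a non-503 status, retrying up to
    max_retries times and doubling the sleep between attempts."""
    for attempt in range(max_retries + 1):
        status, body = fetch()
        if status != 503:
            return status, body
        if attempt < max_retries:
            # Exponential backoff: base_delay, 2x, 4x, 8x, ...
            time.sleep(base_delay * (2 ** attempt))
    return status, body  # still 503 after all retries
```

With 4 retries and a 1-second base delay, a persistently failing download gives up after roughly 1 + 2 + 4 + 8 = 15 seconds of waiting.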

If MapBox has the scene .gz files still sitting around for some of January, I'd be interested in us side-loading missing ones into the input queue. If Amit or someone is interested in that, we could do a little push on it in person now that we (Planet Labs) are just down the street.

Also, I added a small job (tree_index_maker.py) to build very simple index files so we can more easily walk the tree in the web browser.  If anyone knows a way of auto-indexing instead of rebuilding these every two hours, I'd be interested in hearing.

See:

  https://s3-us-west-2.amazonaws.com/landsat-pds/L8/index.html
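A minimal sketch of what a tree_index_maker-style pass could look like, run over a local mirror of the tree (the real job works against the S3 key listing; the function name and HTML layout here are illustrative):

```python
import os

def write_indexes(root):
    """Walk `root` and drop a bare-bones index.html in each directory,
    linking to its subdirectories and files so the tree can be browsed."""
    for dirpath, dirnames, filenames in os.walk(root):
        entries = sorted(dirnames) + sorted(
            f for f in filenames if f != "index.html")
        links = "".join(
            '<a href="%s">%s</a><br>\n' % (e, e) for e in entries)
        with open(os.path.join(dirpath, "index.html"), "w") as out:
            out.write("<html><body>\n" + links + "</body></html>\n")
```

Since S3 has no directory listings of its own, rebuilding these static pages after each ingest run is the straightforward (if clunky) way to keep the tree walkable in a browser.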

Also, thanks to Charlie for the improved thumbnail generation PR; it has been merged and deployed, though old thumbnails aren't regenerated automatically.

Lastly, I'd like us to set up a wiki or markdown-based web site for our effort where I can start documenting things and we can describe the effort. Should I just start writing markdown docs in the git repo for the ingestor? Or what?

Best regards,
--
---------------------------------------+--------------------------------------
I set the clouds in motion - turn up   | Frank Warmerdam, warmerdam at pobox.com
light and sound - activate the windows | http://pobox.com/~warmerdam
and watch the world go round - Rush    | Geospatial Software Developer

_______________________________________________
Landsat-pds mailing list
Landsat-pds at lists.osgeo.org
http://lists.osgeo.org/cgi-bin/mailman/listinfo/landsat-pds





