[Benchmarking] Context for Benchmarks
Schlagel, Joel D IWR
Joel.D.Schlagel at usace.army.mil
Thu Jan 13 22:54:11 EST 2011
Daniel, thanks - I agree I got a bit off the mark on part 1, and something
needs to change to bring it back to WMS...
-joel
----- Original Message -----
From: benchmarking-bounces at lists.osgeo.org
<benchmarking-bounces at lists.osgeo.org>
To: benchmarking at lists.osgeo.org <benchmarking at lists.osgeo.org>
Sent: Thu Jan 13 21:48:00 2011
Subject: Re: [Benchmarking] Context for Benchmarks
Hi Joel,
I like this idea of building the benchmarks around a specific scenario.
The scenario may need a bit of work, but this looks like a great
starting point to me. (For instance, in part 1 you mention WMS+SLD but
then the benchmark is about building a tile cache which kind of
contradicts the use of WMS+SLD.)
Daniel
On 11-01-13 03:34 PM, Schlagel, Joel D IWR wrote:
>
> All - here is a rough idea of how we might put the OGC benchmarking in
> context, and maybe simplify the presentation
>
> The storyboard is interesting to me, and may help reduce the number of
> variables that need to be tested, because in the end the main benchmark is
> just on best effort.
>
> Fine grain benchmarks that will be of interest to developers in improving
> software will be generated along the way, but I think workflow and best
> effort will be an easier story to tell, and may even allow an undisputed
> champion to reign for a year.
>
> This is just an idea, so no problem if the consensus is to keep
> benchmarking as a purely technical exercise.
>
>
>
> OSGEO WMS Shoot-out Scenario: putting the benchmarks in context
> 2011 Scenario: Colorado County Government Response to a Disaster
>
> Part 1 - Routine Operations
>
> Colorado County operates a web-based GIS. The server caches tiles from
> level 1 to N for the base map. Their imagery is stored on the file system
> and updated on schedule X, while their vector layers are stored in a
> relational database and updated on schedule Y. They provide an OGC WMS
> interface to the server, including SLD, etc.
>
> This benchmark tests the capability & efficiency of software to generate
> map tiles for routine web mapping operations.
>
> Generate an image tile cache from a large satellite image and N tiles of
> aerial imagery
>
> Generate a vector overlay tile cache
>
> - roads
> - rivers
> - parcels
> - parks
> - utilities
>
> The benchmark is time to complete the workflow (i.e. generate all tiles)
> using "best effort." Teams choose the OS, backend database (must be an
> RDBMS), and mapping server. Parallel vs. sequential requests are not a
> concern - total time to complete the workflow is the benchmark. Tests will
> use complex symbology and SLD.
>
>
> Sub-benchmarks can test client vs. backend database combinations (e.g.
> MapServer on Oracle vs. PostGIS) and connection strategies. The overall
> measure is "best effort" time to complete the workflow.
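[Editorial sketch] The "best effort" measure described above could be timed with something like the following minimal Python sketch. The square XYZ pyramid, the sequential loop, and the `render` callback (which would be a styled WMS GetMap call against the server under test in a real run) are illustrative assumptions, not part of the scenario:

```python
import time

def seed_cache(render, max_level):
    """Best-effort cache seeding: render every tile from level 1 to
    max_level of a square XYZ pyramid, sequentially, and return the
    tile count and elapsed wall-clock seconds."""
    start = time.perf_counter()
    count = 0
    for z in range(1, max_level + 1):
        n = 2 ** z                    # tiles per axis at this level
        for x in range(n):
            for y in range(n):
                render(z, x, y)       # a styled WMS GetMap call in a real run
                count += 1
    return count, time.perf_counter() - start

# A no-op renderer stands in for the map server under test.
count, elapsed = seed_cache(lambda z, x, y: None, max_level=4)
```

Since only total wall-clock time matters here, teams are free to parallelize the inner loops; the harness above deliberately leaves that choice open.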
>
>
> Part 2 - Response
>
> A large fire breaks out in Colorado County Forest. New aerial imagery and
> incident data such as the current fire area, weather, forecast direction,
> and response points arrive in the form of shapefiles. These need to be
> served right away to the interested public.
>
> In this benchmark, file-system shapefiles and imagery are served under
> load. A random set of requests is created beforehand for various scales and
> areas, and the same random requests are used for each software package.
> This leaves out all database interaction and deals only with efficient
> random file access. The benchmark is WMS requests per second: throughput
> under maximum load.
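[Editorial sketch] Pre-generating the fixed random request set could look like the following Python sketch. The layer name, extent, window sizes, and WMS 1.1.1 parameters are hypothetical placeholders; the key point is the fixed seed, which guarantees every server replays an identical workload:

```python
import random
from urllib.parse import urlencode

def random_getmap_requests(seed, count, extent, scales):
    """Pre-generate a fixed, reproducible list of GetMap query strings
    over random areas and scales, so each server is benchmarked against
    identical requests."""
    rng = random.Random(seed)         # fixed seed -> same requests every run
    minx, miny, maxx, maxy = extent
    requests = []
    for _ in range(count):
        span = rng.choice(scales)     # map units covered by the request window
        x = rng.uniform(minx, maxx - span)
        y = rng.uniform(miny, maxy - span)
        params = {
            "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
            "LAYERS": "fire_area", "SRS": "EPSG:4326",   # hypothetical layer
            "BBOX": f"{x},{y},{x + span},{y + span}",
            "WIDTH": 256, "HEIGHT": 256, "FORMAT": "image/png",
        }
        requests.append(urlencode(params))
    return requests

# Illustrative extent roughly covering Colorado, three window sizes.
reqs = random_getmap_requests(seed=42, count=100,
                              extent=(-109.1, 36.9, -102.0, 41.0),
                              scales=[0.01, 0.1, 1.0])
```

The load driver would then replay `reqs` against each server and report requests per second at maximum sustained load.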
>
>
> _______________________________________________
> Benchmarking mailing list
> Benchmarking at lists.osgeo.org
> http://lists.osgeo.org/mailman/listinfo/benchmarking
--
Daniel Morissette
http://www.mapgears.com/