[Qgis-psc] Performance tests QGIS Server

Paul Blottiere paul.blottiere at oslandia.com
Mon Jun 18 01:22:00 PDT 2018


Dear PSC,


As explained previously by Régis, we spent some time working on a
dedicated platform to benchmark performance across several QGIS Server
versions (currently 2.14, 2.18, 3.0 and master), and it is now time to
release it.

Many users are asking for information about performance differences
between QGIS Server 2.X and 3.X. However, they often want to test
performance with their own dataset, not only with some generic data.
In this particular case, we realized that there wasn't any simple and
convenient way to concretely measure performance and generate a report.
To meet these specific needs, we worked on two dedicated projects:
Graffiti [1] and QGIS-Server-Perfsuite [2].


      Graffiti

Graffiti is a simple Python tool that generates an HTML report from a
test scenario described in a YAML file. This way, anyone with available
QGIS Server instances may use this tool to generate a performance
report with custom data, a custom .qgs project and custom WMS
parameters. This tool addresses the initial requests.
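
To give a concrete idea of what the tool automates, here is a minimal
sketch (this is not Graffiti's code or YAML schema; the endpoint URLs,
.qgs path and layer name are placeholders you would replace with your
own) of the kind of per-request WMS timing behind such a report:

# Minimal sketch, not Graffiti itself: time one WMS GetMap request
# against several QGIS Server instances and print the comparison.
# All URLs, the MAP path and the layer name are placeholders.
import time
import requests

SERVERS = {
    "2.18": "http://localhost:8081/qgisserver",
    "3.0": "http://localhost:8082/qgisserver",
    "master": "http://localhost:8083/qgisserver",
}

PARAMS = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "MAP": "/data/project.qgs",   # placeholder .qgs project
    "LAYERS": "polygons",         # placeholder layer name
    "CRS": "EPSG:4326",
    "BBOX": "-90,-180,90,180",
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}

for version, url in SERVERS.items():
    start = time.monotonic()
    response = requests.get(url, params=PARAMS, timeout=60)
    elapsed_ms = (time.monotonic() - start) * 1000
    print(f"QGIS Server {version}: HTTP {response.status_code}, "
          f"{elapsed_ms:.1f} ms")

Graffiti goes further, of course, by driving such requests from the YAML
scenario and turning the timings into the HTML report.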


      QGIS Server Perfsuite

We need such reports to be generated on a daily basis so that we can
track regressions or improvements. QGIS Server Perfsuite makes it easy
to deploy a whole platform: Dockerfiles to build and run QGIS Server,
Ansible scripts for remote deployment, and default configuration files
for Graffiti. The result is the report previously mentioned [0].


      First results


These preliminary results are not very good for lines and polygons in
3.X, but all this requires more investigation, because internal tests
for one customer sometimes show different tendencies. We already
tackled one performance issue with parallel line labelling, where,
unlike in QGIS 2.18, PAL label candidates were no longer taken into
account. Once the issue was located and the bugfix merged in master,
performance is good again, and this scenario is clearly visible in the
report [3].

All this to say that performance regressions can happen suddenly, and
developing without keeping an eye on performance may be dangerous,
especially from a server point of view. The PerfSuite contains only a
few scenarios and datasets, and we aim to add new ones progressively.
Of course, we have to be realistic and keep in mind that, given the
number of options in QGIS, server tuning and data providers, we cannot
test everything...


      How does this platform play with MSPerf from C2C?

It should be made clear that the C2C performance platform does not have
the same purpose as what we are working on. It's not better, it's not
worse, it's just a different tool. Firstly, in our case, we don't want
to compare QGIS Server with other map servers. The underlying aim is
clearly to provide a convenient tool for developers and users to
measure the response time of the server on specific data with a
particular configuration (by the way, it seems that the QGIS Docker
images are private at C2C, which makes their setup a bit difficult to
customize without asking them [5]). Secondly, Graffiti measures the
unit response time per request, not the behaviour under a given number
of concurrent users. Once again, that kind of measurement is a very
good thing to have to see the big picture! However, at this stage, it
seems more suitable to check the unit rendering time, without messing
with web server configurations. Moreover, it allows us to observe the
caching time (as with the trust option [6]).
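
To illustrate why per-request timing is handy here, a minimal sketch
(placeholder URL and WMS parameters, not PerfSuite code): the first,
cold request on a project pays the project-loading cost, while the
following ones are served from the cached project, so comparing them
makes the caching behaviour visible.

# Minimal sketch with placeholder URL and parameters: compare the first
# (cold) request, which includes project loading, with the following
# (warm) requests served from the cached project.
import statistics
import time
import requests

URL = "http://localhost:8083/qgisserver"   # placeholder endpoint
PARAMS = {
    "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
    "MAP": "/data/project.qgs", "LAYERS": "polygons",  # placeholders
    "CRS": "EPSG:4326", "BBOX": "-90,-180,90,180",
    "WIDTH": "800", "HEIGHT": "600", "FORMAT": "image/png",
}

def timed_getmap():
    """Return the response time of one GetMap request, in milliseconds."""
    start = time.monotonic()
    requests.get(URL, params=PARAMS, timeout=60)
    return (time.monotonic() - start) * 1000

durations = [timed_getmap() for _ in range(6)]
print(f"first (cold) request:   {durations[0]:.1f} ms")
print(f"warm requests (median): {statistics.median(durations[1:]):.1f} ms")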


The last step, from a system point of view, is to run these tests in a
continuous integration environment. Is the PSC interested in helping us
in that direction?

Of course, we look forward to hearing the comments and criticisms :).


Sorry for the long message, and have a good day!


All the best.

Paul / the Oslandia Team


[0] http://37.187.164.233/qgis-server-perfsuite-report/graffiti/report.html

[1] https://github.com/pblottiere/graffiti

[2] https://github.com/Oslandia/QGIS-Server-PerfSuite

[3]
http://37.187.164.233/qgis-server-perfsuite-report/graffiti/report.html#a98e80fea3074fe19f037adb8e86d35c


[4] https://github.com/KDAB/hotspot

[5] https://github.com/camptocamp/ms_perfs/pull/27

[6]
http://37.187.164.233/qgis-server-perfsuite-report/graffiti/report.html#0e8047f450854e73a08851336b21a1d7




On 09/06/18 16:07, Régis Haubourg wrote:
> Hi PSC ,
>
> Yes, we have been aware of the MS perf platform since the 2016 QGIS
> Server code sprint.
>
> We have discussed coding with performance in mind quite a few times
> and waited for some investment from C2C during 2017. We
> are very happy they now publish a daily report, however publishing
> static tests on static datasets is not what we seek. Yves showed some
> results at the QGIS Fr user days two years back, and I saw the
> expectations, frustrations and debates it raised at the time.
> Our goal is NOT to compare QGIS server with Mapserver or GeoServer. We
> don't want to fall into that kind of debate, which in my opinion is
> dangerous for everyone, because it's almost impossible to reach
> absolute measurements for tools that run in different environments and
> render data differently.
>
> We are talking about a semi-scientific approach here, and being able
> to reproduce and confirm issues with different tools is probably a
> very good thing, if we can just take the time to analyse things.
>
> Our task was to build a very light platform designed to be integrated
> into continuous integration, and that is really easy to enrich with
> new tests. So we built a comparison framework between the LTR, release
> and dev versions that can be run in many contexts (web server type,
> multithread and multiprocess options, rendering options, data
> providers, etc.).
>
> We also want to keep a history of the development version's
> performance for each test.
>
> As you see, this can lead to a massive amount of computing and logs,
> so we really need something easy to set up and as light as possible.
> Moreover, measuring performance needs light tools to avoid influencing
> the measurements themselves.
>
> We believe that QGIS renders data extremely fast, but it has some
> glitches due to its desktop design origins that can be tackled if we
> have permanent feedback.
>
> Paul just finished the platform this week, and we are now dedicating
> efforts to running it on a dedicated server, to be certain of not
> being influenced by any external load.
>
> I hope next week we'll publish it with the first reports.
>
> In the end, the question of hosting the platform on a dedicated server
> still remains, as does administering it correctly so that no other
> tool runs in parallel at the same time, at the risk of altering the
> measurements. I hope you are still OK with the fact that we need this
> kind of tool.
>
> Best regards,
> Régis
>
> 2018-06-09 15:10 GMT+02:00 Richard 🌍 Duivenvoorde <richard at duif.net>:
>
>     Hi PSC,
>
>     On my TODO list was to ask Yves/Camptocamp about their QGIS Server
>     tests and to make sure Oslandia was aware of them.
>
>     See below:
>
>     - results are available at
>     https://gmf-test.sig.cloud.camptocamp.net/ms_perfs/
>     and apparently updated daily?
>     Yves promises to do some cleanup and upload to test.qgis.org
>
>     - he thinks Oslandia is aware of this work (to be sure, I bcc
>     Regis :-)
>
>     Regards,
>
>     Richard Duivenvoorde
>
>
>     -------- Forwarded Message --------
>     Subject:        Re: Performance tests QGIS Server
>     Date:   Wed, 6 Jun 2018 10:30:39 +0200
>
>     On 04/06/2018 at 20:44, Richard 🌍 Duivenvoorde wrote:
>     > Hi Yves,
>     >
>     > During the PSC meeting we were talking about some QGIS-Server OWS
>     > performance-tests/service that Oslandia is doing currently.
>     I heard something about this indeed.
>     > Andreas mentioned that CampToCamp also did something (for
>     > Andreas) last year. And that you asked me to put it on the website.
>     > So Question: did you ask me? And did I answer? :-)
>     Camptocamp did something and we discussed how to improve it and
>     share it with the community. The topic was "QGIS benchmark"
>     (07/11/2017). See
>     https://gmf-test.sig.cloud.camptocamp.net/ms_perfs/. Here is a
>     quote of your answer:
>
>     """
>     What is the idea? That we (as qgis.org) ourselves run the benchmark
>     (docker?) every now and then?
>
>     If NOT, then it is easiest when I give you credentials on the
>     test.qgis.org (virtual) webserver, so you can rsync/scp the result
>     to a directory 'benchmarks' at:
>
>     http://test.qgis.org/
>
>     As you can see Paul (of Oslandia) also pushes the QGISServer CITE test
>     results there into a directory 'ogc_cite':
>
>     http://test.qgis.org/ogc_cite/
>
>     The idea is to either add an index.html to test.qgis.org which then
>     sends you to individual test directories (or the latest in that
>     directory), OR we create a (translatable) page on the qgis.org
>     website with some description of the different tests, and then
>     links to the html-output pages on test.qgis.org.
>     I think there should really be some explanation of the performance
>     tests...
>
>     Both are possible, we just need somebody/time to do it :-)
>     """
>     I am OK with an rsync to the QGIS server. The results need some
>     love to improve some graphics (a null value gives no graph at all).
>
>     And I should add a picture of the layer to illustrate the layer
>     complexity.
>
>     > Andreas was also wondering how much of the Oslandia work is
>     > actually already done by C2C and if it is still useful to
>     > communicate about this.
>     I don't know, I have no idea what Oslandia is working on, well not
>     more than what they shared on the QGIS mailing list. They are aware
>     of our project for sure, as we discussed this at the QGIS Server
>     hackfest, and 3Liz pushed a pull request.
>
>     Y.
>
>
>
>
