[mapserver-commits] r8188 - in trunk/docs: cgi development development/tests input/vector

svn at osgeo.org
Wed Dec 3 11:55:58 EST 2008


Author: hobu
Date: 2008-12-03 11:55:58 -0500 (Wed, 03 Dec 2008)
New Revision: 8188

Added:
   trunk/docs/development/tests/
   trunk/docs/development/tests/autotest.txt
   trunk/docs/development/tests/index.txt
   trunk/docs/development/tests/mapscript.txt
Modified:
   trunk/docs/cgi/controls.txt
   trunk/docs/cgi/index.txt
   trunk/docs/cgi/introduction.txt
   trunk/docs/cgi/mapcontext.txt
   trunk/docs/cgi/mapserv.txt
   trunk/docs/development/index.txt
   trunk/docs/development/svn.txt
   trunk/docs/input/vector/ogr.txt
Log:
add testing section to development section

Modified: trunk/docs/cgi/controls.txt
===================================================================
--- trunk/docs/cgi/controls.txt	2008-12-03 16:21:46 UTC (rev 8187)
+++ trunk/docs/cgi/controls.txt	2008-12-03 16:55:58 UTC (rev 8188)
@@ -1,3 +1,10 @@
+.. _cgi_controls:
+
+*****************************************************************************
+ MapServer CGI Controls
+*****************************************************************************
+
+
 Variables
 =========
 

Modified: trunk/docs/cgi/index.txt
===================================================================
--- trunk/docs/cgi/index.txt	2008-12-03 16:21:46 UTC (rev 8187)
+++ trunk/docs/cgi/index.txt	2008-12-03 16:55:58 UTC (rev 8188)
@@ -12,7 +12,7 @@
 :Author: Frank Koormann
 
 .. toctree::
-   :maxdepth: 2
+   :maxdepth: 1
 
    introduction
    mapserv

Modified: trunk/docs/cgi/introduction.txt
===================================================================
--- trunk/docs/cgi/introduction.txt	2008-12-03 16:21:46 UTC (rev 8187)
+++ trunk/docs/cgi/introduction.txt	2008-12-03 16:55:58 UTC (rev 8188)
@@ -1,3 +1,10 @@
+.. _cgi_introduction:
+
+*****************************************************************************
+ MapServer CGI Introduction
+*****************************************************************************
+
+
 Notes
 =====
 

Modified: trunk/docs/cgi/mapcontext.txt
===================================================================
--- trunk/docs/cgi/mapcontext.txt	2008-12-03 16:21:46 UTC (rev 8187)
+++ trunk/docs/cgi/mapcontext.txt	2008-12-03 16:55:58 UTC (rev 8188)
@@ -1,3 +1,9 @@
+.. _mapcontext_cgi:
+
+*****************************************************************************
+ Map Context Files
+*****************************************************************************
+
 Support for Local Map Context Files
 -----------------------------------
 

Modified: trunk/docs/cgi/mapserv.txt
===================================================================
--- trunk/docs/cgi/mapserv.txt	2008-12-03 16:21:46 UTC (rev 8187)
+++ trunk/docs/cgi/mapserv.txt	2008-12-03 16:55:58 UTC (rev 8188)
@@ -4,7 +4,8 @@
  mapserv
 *****************************************************************************
 
-The CGI interface can be tested at the commandline by using the "QUERY_STRING" switch, such as:
+The CGI interface can be tested at the commandline by using the "QUERY_STRING"
+switch, such as:
 
 ::
 

Modified: trunk/docs/development/index.txt
===================================================================
--- trunk/docs/development/index.txt	2008-12-03 16:21:46 UTC (rev 8187)
+++ trunk/docs/development/index.txt	2008-12-03 16:55:58 UTC (rev 8188)
@@ -10,4 +10,6 @@
 
    sponsors
    bugs
+   svn
+   tests/index
    rfc/index

Modified: trunk/docs/development/svn.txt
===================================================================
--- trunk/docs/development/svn.txt	2008-12-03 16:21:46 UTC (rev 8187)
+++ trunk/docs/development/svn.txt	2008-12-03 16:55:58 UTC (rev 8188)
@@ -46,7 +46,7 @@
   (dumping code into the project and providing no way to maintain it is almost
   as bad as having no code at all)
 - be active instead of casual about the project.
-- election of Subversion committers is covered in :ref:`rfc7`
+- election of Subversion committers is covered in :ref:`rfc7.1`
 
 Subversion Web View
 ===================

Added: trunk/docs/development/tests/autotest.txt
===================================================================
--- trunk/docs/development/tests/autotest.txt	                        (rev 0)
+++ trunk/docs/development/tests/autotest.txt	2008-12-03 16:55:58 UTC (rev 8188)
@@ -0,0 +1,295 @@
+.. _autotest:
+
+*****************************************************************************
+ Regression Testing 
+*****************************************************************************
+
+:Author:        Frank Warmerdam
+:Contact:       warmerdam at pobox.com
+:Revision: $Revision$
+:Date: $Date$
+:Last Updated: 2007/8/31
+
+.. contents:: Table of Contents
+    :depth: 2
+    :backlinks: top
+
+
+msautotest is a suite of test maps, data files, expected result images, and
+test scripts intended to make it easy to run a set of automated regression
+tests on MapServer.
+
+Getting msautotest
+------------------------------------------------------------------------------
+
+The autotest is available from SVN. On Unix it can be fetched like this:
+
+::
+
+    % svn checkout http://svn.osgeo.org/mapserver/trunk/msautotest
+
+This creates an msautotest subdirectory wherever you are. I normally put
+the autotest within my MapServer directory.
+
+Running msautotest
+------------------------------------------------------------------------------
+
+The autotest requires Python (but not Python MapScript), so if you don't have
+Python on your system, get and install it. More information on Python is
+available at http://www.python.org. Most Linux systems have some version
+already installed.
+
+The autotest also requires that the executables built with MapServer, notably
+:ref:`shp2img`, :ref:`legend <legend_utility>`, :ref:`mapserv` and
+:ref:`scalebar <scalebar_utility>`, are available in the path. I generally
+accomplish this by adding the MapServer build directory to my path.
+
+csh::
+
+ % setenv PATH $HOME/mapserver:$PATH
+
+bash/sh::
+
+ % PATH=$HOME/mapserver:$PATH
+
+Verify that the executables can be found by typing 'shp2img -v' in the
+autotest directory::
+
+     warmerda at gdal2200[152]% shp2img -v
+     MapServer version 3.7 (development) OUTPUT=PNG OUTPUT=JPEG OUTPUT=WBMP 
+     SUPPORTS=PROJ SUPPORTS=TTF SUPPORTS=WMS_SERVER SUPPORTS=GD2_RGB 
+     INPUT=TIFF INPUT=EPPL7 INPUT=JPEG INPUT=OGR INPUT=GDAL INPUT=SHAPEFILE
+
+Now you are ready to run the tests. The tests are subdivided into categories,
+currently just "gdal", "misc", and "wxs", each in its own subdirectory. To run
+the "gdal" tests, cd into the gdal directory and run the run_test.py script.
+
+Unix::
+
+  ./run_test.py 
+
+Windows::
+
+  python.exe run_test.py
+
+The results in the misc directory might look something like this::
+
+ warmerda at gdal2200[164]% run_test.py
+ version = MapServer version 3.7 (development) OUTPUT=PNG OUTPUT=JPEG OUTPUT=WBMP
+ SUPPORTS=PROJ SUPPORTS=TTF SUPPORTS=WMS_SERVER SUPPORTS=GD2_RGB INPUT=TIFF 
+ INPUT=EPPL7 INPUT=JPEG INPUT=OGR INPUT=GDAL INPUT=SHAPEFILE
+
+ Processing: rgba_scalebar.map
+    results match.
+ Processing: tr_scalebar.map
+    results match.
+ Processing: tr_label_rgb.map
+    results match.
+ Processing: ogr_direct.map
+    results match.
+ Processing: ogr_select.map
+    results match.
+ Processing: ogr_join.map
+    results match.
+ Test done:
+    0 tested skipped
+    6 tests succeeded
+    0 tests failed
+    0 test results initialized
+
+In general you are hoping to see that no tests failed.
+
+Checking Failures
+------------------------------------------------------------------------------
+
+Because most msautotest tests are comparing generated images to expected
+images, the tests are very sensitive to subtle rounding differences on
+different systems, and subtle rendering changes in libraries like freetype
+and gd. So it is quite common to see some failures.
+
+These failures then need to be reviewed manually to see if the differences
+are acceptable and just indicating differences in rounding/rendering or
+whether they are "real" bugs. This is normally accomplished by visually
+comparing files in the "result" directory with the corresponding file in
+the "expected" directory. It is best if this can be done in an application
+that allows images to be layered and toggled on and off to visually
+highlight what is changing. OpenEV can be used for this.
+
+PerceptualDiff
+------------------------------------------------------------------------------
+
+If you install the PerceptualDiff program (http://pdiff.sourceforge.net/) and
+it is in the path, then the autotest will attempt to use it as a last fallback
+when comparing images. If images are found to be "perceptually" the same the
+test will pass with the message "result images perceptually match, though
+files differ." This can dramatically cut down the number of apparent failures
+that on close inspection are for all intents and purposes identical. Building
+PerceptualDiff is a bit of a hassle.
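
The fallback behaviour can be sketched as follows. This is a simplified
illustration, not the actual autotest code; the `images_match` helper is an
invented name.

```python
import shutil
import subprocess

def images_match(result, expected):
    """Binary comparison first; if that fails and the perceptualdiff
    program is on the path, fall back to a perceptual comparison."""
    with open(result, "rb") as a, open(expected, "rb") as b:
        if a.read() == b.read():
            return "files match"
    exe = shutil.which("perceptualdiff")
    if exe is not None:
        status = subprocess.run([exe, result, expected],
                                stdout=subprocess.DEVNULL,
                                stderr=subprocess.DEVNULL).returncode
        if status == 0:
            return "result images perceptually match, though files differ"
    return "results differ"
```

The perceptual check only runs when the byte-for-byte comparison fails, so
installing PerceptualDiff never changes the outcome for identical files.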
+
+Background
+------------------------------------------------------------------------------
+
+The msautotest suite was initially developed by Frank Warmerdam
+(warmerdam at pobox.com), who can be contacted with questions about it.
+
+The msautotest suite is organized as a series of .map files. The python
+scripts basically scan the directory in which they are run for files ending in
+.map. They are then "run" with the result dumped into a file in the result
+directory. A binary comparison is then done to the corresponding file in the
+expected directory and differences are reported. The general principles for
+the test suite are that:
+
+- The test data should be small so it can be easily stored and checked out of
+  svn without big files needing to be downloaded.
+
+- The test data should be completely contained within the test suite ... no
+  dependencies on external datasets, or databases that require additional
+  configuration. PostGIS and Oracle will require separate testing mechanisms.
+
+- The tests should be able to run without a significant amount of user
+  interaction. This is as distinct from the DNR test suite described in
+  FunctionalityDemo.
+
+- The testing mechanism should be suitable to test many detailed functions in
+  relative isolation.
+
+- The test suite is not dependent on any of the MapScript environments, though
+  I think it would be valuable to extend the test suite with some
+  MapScript-dependent components in the future (there is a start on this in
+  the mspython directory).
+
+Test Script Internal Functioning
+..............................................................................
+
+Because MapServer can be built with many possible extensions, such as support
+for OGR, GDAL, and PROJ.4, it is desirable to have the test suite automatically
+detect which tests should be run based on the configuration of MapServer. This
+is accomplished by capturing the version output of "shp2img -v" and using the
+various keys in that to decide which tests can be run. A directory can have a
+file called "all_require.txt" with a "REQUIRES:" line indicating components
+required for all tests in the directory. If any of these requirements are not
+met, no tests at all will be run in this directory. For instance, the
+gdal/all_require.txt lists:
+
+::
+
+    REQUIRES: INPUT=GDAL OUTPUT=PNG
+
+In addition, individual .map files can have additional requirements expressed
+as a REQUIRES: comment in the mapfile. If the requirements are not met, the
+map will be skipped (and listed in the summary as a skipped test). For example,
+gdal/256_overlay_res.map has the following line to indicate it requires
+projection support (in addition to the INPUT=GDAL and OUTPUT=PNG required by
+all files in the directory):
+
+::
+
+    REQUIRES: SUPPORTS=PROJ
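
The requirement check amounts to simple keyword matching against the version
output. A minimal sketch follows; `version_keywords` and `requires_met` are
invented names for illustration, not part of msautotest.

```python
import re

def version_keywords(version_output):
    """Extract capability tokens such as INPUT=GDAL or SUPPORTS=PROJ
    from the output of "shp2img -v"."""
    return set(re.findall(r"\S+=\S+", version_output))

def requires_met(text, keywords):
    """Check every REQUIRES: line in a mapfile (or all_require.txt)
    against the available capability keywords."""
    needed = []
    for line in text.splitlines():
        m = re.search(r"REQUIRES:\s*(.*)", line)
        if m:
            needed.extend(m.group(1).split())
    return all(tok in keywords for tok in needed)

version = ("MapServer version 3.7 (development) OUTPUT=PNG OUTPUT=JPEG "
           "SUPPORTS=PROJ INPUT=GDAL INPUT=SHAPEFILE")
available = version_keywords(version)
```

A mapfile with no REQUIRES: comment trivially passes the check.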
+
+The output file generated by processing a map is put in the file
+results/<mapfilebasename>.png (regardless of whether it is PNG or not). So
+when gdal/256_overlay_res.map is processed, the output file is written to
+gdal/results/256_overlay_res.png. This is compared to
+gdal/expected/256_overlay_res.png. If they differ the test fails, and the
+"wrong" result file is left behind for investigation. If they match the result
+file is deleted. If there is no corresponding expected file the results file
+is moved to the expected directory (and reported as an "initialized" test)
+ready to be committed to SVN.
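
The compare/initialize behaviour just described can be sketched as follows.
This is a simplified stand-in for the harness logic, not the real code;
`check_result` is an invented name.

```python
import filecmp
import os
import shutil

def check_result(result_path, expected_path):
    """Binary-compare a generated result against the expected file.
    Returns 'init' (no expected file yet), 'match' or 'fail'."""
    if not os.path.exists(expected_path):
        # No expected file: initialize it from the result.
        os.makedirs(os.path.dirname(expected_path), exist_ok=True)
        shutil.move(result_path, expected_path)
        return "init"
    if filecmp.cmp(result_path, expected_path, shallow=False):
        os.remove(result_path)   # matching results are discarded
        return "match"
    return "fail"                # wrong result left behind for review
```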
+
+There is also a RUN_PARMS keyword that may be placed in map files to override
+a bunch of behaviour. The default behaviour is to process map files with
+shp2img, but other programs such as mapserv or scalebar can be requested, and
+various commandline arguments altered as well as the name of the output file.
+For instance, the following line in misc/tr_scalebar.map indicates that the
+output file should be called tr_scalebar.png and that the commandline should
+look like "[SCALEBAR] [MAPFILE] [RESULT]" instead of the default
+"[SHP2IMG] -m [MAPFILE] -o [RESULT]".
+ 
+::
+
+    RUN_PARMS: tr_scalebar.png [SCALEBAR] [MAPFILE] [RESULT]
+
+For testing things as they would work from an HTTP request, use the RUN_PARMS
+with the program [MAPSERV] and the QUERY_STRING argument, with results
+redirected to a file.
+
+::
+
+    # RUN_PARMS: wcs_cap.xml [MAPSERV] QUERY_STRING='map=[MAPFILE]&SERVICE=WCS&VERSION=1.0.0&REQUEST=GetCapabilities' > [RESULT]
+
+For web services that generate images that would normally be prefixed with the
+Content-type header, use [RESULT_DEMIME] to instruct the test harness to
+strip off any HTTP headers before doing the comparison.
+
+::
+
+    # Generate simple PNG.
+    # RUN_PARMS: wcs_simple.png [MAPSERV] QUERY_STRING='map=[MAPFILE]&SERVICE=WCS&VERSION=1.0.0&REQUEST=GetCoverage&WIDTH=120&HEIGHT=90&FORMAT=GDPNG&BBOX=0,0,400,300&COVERAGE=grey&CRS=EPSG:32611' > [RESULT_DEMIME]
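
Stripping the headers is straightforward: drop everything up to and including
the first blank line. A sketch (the `demime` function name is illustrative,
not the harness's actual implementation):

```python
def demime(raw):
    """Strip HTTP headers (everything up to and including the first
    blank line) from a raw mapserv response, leaving only the body."""
    for sep in (b"\r\n\r\n", b"\n\n"):
        if sep in raw:
            return raw.split(sep, 1)[1]
    return raw   # no headers found; return unchanged

response = b"Content-Type: image/png\r\n\r\n\x89PNG\r\n..."
body = demime(response)
```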
+
+What If A Test Fails?
+..............................................................................
+
+When running the test suite, it is common for some tests to fail depending on
+vagaries of floating point on the platform, or harmless changes in
+MapServer. To identify these compare the results in result with the file in
+expected and determine what the differences are. If there is just a slight
+shift in text or other features it is likely due to floating point differences
+on different platforms. These can be ignored. If something has gone seriously
+wrong, then track down the problem!
+
+It is also prudent to avoid using output image formats that are platform
+specific. For instance, if you produce TIFF it will generate big endian on big
+endian systems and therefore be different at the binary level from what was
+expected. PNG should be pretty safe.
+
+TODO
+------------------------------------------------------------------------------
+
+- Add many more tests! Very little vector testing has been done yet.
+
+- Add a high level script in the msautotest directory that runs the subscripts
+  in all the subdirectories and produces a summary report.
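
Such a driver script could look something like this. It is a hypothetical
sketch of the TODO item, not existing code.

```python
import os
import subprocess
import sys

def run_all(root):
    """Run run_test.py in every subdirectory that has one, collecting
    (directory, exit status) pairs for a summary report."""
    summary = []
    for name in sorted(os.listdir(root)):
        sub = os.path.join(root, name)
        if os.path.isfile(os.path.join(sub, "run_test.py")):
            proc = subprocess.run([sys.executable, "run_test.py"], cwd=sub)
            summary.append((name, proc.returncode))
    return summary
```

A summary report is then a matter of printing each directory name alongside
"OK" or "FAILED" depending on the exit status.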
+
+Adding New Tests
+------------------------------------------------------------------------------
+
+- Pick an appropriate directory to put the test in. Feel free to start a new
+  one for new families of testing functionality.
+
+- Create a minimal map file to test a particular issue. I would discourage
+  starting from a "real" mapfile and cutting down as it is hard to reduce this
+  to the minimum.
+
+- Give the new mapfile a name that hints at what it is testing without making
+  the name too long. For instance "ogr_join.map" tests OGR joins.
+  "rgb_overlay_res_to8bit.map" tests RGB overlay layers with resampling and
+  converting to 8bit output.
+
+- Put any MapServer functionality options in a # REQUIRES: item in the header
+  as described in the internal functioning topic above.
+
+- Write some comments at the top of the .map file on what this test is
+  intended to check.
+
+- Add any required datasets within the data directory beneath the test
+  directory. These test datasets should be as small as possible! Reuse
+  existing datasets if at all possible.
+
+- Run the "run_test.py" script.
+
+- Verify that the newly created expected/<testname>.png file produces the
+  results you expect. If not, revise the map and rerun the test, now checking
+  the results/<testname>.png file. Move the results/<testname>.png file into
+  the expected directory when you are happy with it.
+
+- Add the .map file and the expected/<testname>.png file to SVN when you are
+  happy with them. Make sure the .png file (and any supporting data files) are
+  marked as binary files. For example:
+
+::
+
+    % svn add mynewtest.map expected/mynewtest.png
+    % svn commit -m "new" mynewtest.map expected/mynewtest.png
+ 
+You're done!
+
+
+


Property changes on: trunk/docs/development/tests/autotest.txt
___________________________________________________________________
Name: svn:mime-type
   + text/x-rst
Name: svn:keywords
   + Id Rev Author Date

Added: trunk/docs/development/tests/index.txt
===================================================================
--- trunk/docs/development/tests/index.txt	                        (rev 0)
+++ trunk/docs/development/tests/index.txt	2008-12-03 16:55:58 UTC (rev 8188)
@@ -0,0 +1,13 @@
+.. _testing:
+
+*****************************************************************************
+ Testing
+*****************************************************************************
+
+
+.. toctree::
+   :maxdepth: 2
+
+   autotest
+   mapscript
+   
\ No newline at end of file


Property changes on: trunk/docs/development/tests/index.txt
___________________________________________________________________
Name: svn:mime-type
   + text/x-rst
Name: svn:keywords
   + Id Rev Author Date

Added: trunk/docs/development/tests/mapscript.txt
===================================================================
--- trunk/docs/development/tests/mapscript.txt	                        (rev 0)
+++ trunk/docs/development/tests/mapscript.txt	2008-12-03 16:55:58 UTC (rev 8188)
@@ -0,0 +1,86 @@
+.. _mapscript_tests:
+
+*****************************************************************************
+ MapScript Unit Testing
+*****************************************************************************
+
+:Date: 2005/11/20
+:Author: Sean Gillies
+
+Test Driven Development
+------------------------------------------------------------------------------
+
+In 2003, I began to commit to test driven development of the mapscript module.
+TDD simply means development through repetition of two activities:
+
+  1) add a test, cause failure, and write code to pass the test
+  
+  2) remove duplication
+
+Test Driven Development is also a book by Kent Beck.
+
+New features that I develop for MapServer begin as test expressions. There are
+a bazillion good reasons for working this way. The most obvious are
+
+  1) accumulation of automated unit tests
+
+  2) accumulation of excellent usage examples
+
+  3) that I'm prevented from starting work on flaky ideas that can't be tested
+
+About the tests
+------------------------------------------------------------------------------
+
+Tests are committed to the MapServer SVN under mapscript/python/tests. They
+are written in Python using the JUnit inspired unittest module. A good
+introduction to unit testing with Python is found at
+http://diveintopython.org/unit_testing/index.html.
+
+The test framework imports mapscript from python/tests/cases/testing.py. This
+allows us to test the module before installation.
+
+::
+
+    [sean at lenny python]$ python setup.py build
+    [sean at lenny python]$ python tests/runtests.py -v
+
+Test cases are implemented as Python classes, and individual tests as class
+methods whose names begin with "test". The special setUp() and tearDown()
+methods are for test fixtures and are called before and after every
+individual test.
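
The fixture lifecycle can be seen in a small self-contained example. A plain
dict stands in for mapscript.mapObj here, purely so the sketch runs without
the mapscript module.

```python
import unittest

class FixtureDemoTestCase(unittest.TestCase):
    """Stand-in for MapTestCase: a dict plays the role of mapscript.mapObj."""
    def setUp(self):
        self.map = {"numsymbols": 2}   # fresh fixture before every test
    def tearDown(self):
        self.map = None                # torn down after every test
    def testGetNumSymbols(self):
        self.assertEqual(self.map["numsymbols"], 2)

suite = unittest.TestLoader().loadTestsFromTestCase(FixtureDemoTestCase)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```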
+
+Since version 4.2, MapServer includes a very lightweight testing dataset under
+mapserver/tests. The set consists of symbols, fonts, three single-feature
+shapefiles, and a test.map mapfile. This is the only data used by the unit
+tests.
+
+Many tests that require a mapObj derive from testing.MapTestCase::
+
+    class MapTestCase(MapPrimitivesTestCase):
+        """Base class for testing with a map fixture"""
+        def setUp(self):
+            self.map = mapscript.mapObj(TESTMAPFILE)
+        def tearDown(self):
+            self.map = None
+
+One example is the MapSymbolSetTestCase, the test case I used for development
+of the expanded symbolset functionality present in the 4.2 release::
+
+    class MapSymbolSetTestCase(MapTestCase):
+        def testGetNumSymbols(self):
+            """expect getNumSymbols == 2 from test fixture test.map"""
+            num = self.map.getNumSymbols()
+            assert num == 2, num
+        
+        ...
+
+
+Status
+------------------------------------------------------------------------------
+
+This unit testing framework only covers functionality that is exposed to the
+Python mapscript module. It can help to check on pieces of the core MapServer
+code, but is no guarantor of the :ref:`mapserv` program or of the :ref:`PHP
+MapScript <php>` module. As of this writing, there are 159 tests in the suite.
+These are tests of features added since mid-2003. Much of MapServer's older
+stuff remains untested and it is doubtful that we'll make the time to go back
+and fill in.


Property changes on: trunk/docs/development/tests/mapscript.txt
___________________________________________________________________
Name: svn:mime-type
   + text/x-rst
Name: svn:keywords
   + Id Rev Author Date

Modified: trunk/docs/input/vector/ogr.txt
===================================================================
--- trunk/docs/input/vector/ogr.txt	2008-12-03 16:21:46 UTC (rev 8187)
+++ trunk/docs/input/vector/ogr.txt	2008-12-03 16:55:58 UTC (rev 8188)
@@ -320,8 +320,6 @@
     TIGER/Line files, datasource_name is the directory containing the TIGER 
     files (eg. ogrinfo TGR25001) 
 
-Examples of "OGRINFO" in use:
-*****************************
 
 **Example 5. To get the list of layers in a file:**
 


