[gdal-dev] Testing the driver

Abel Pau a.pau at creaf.uab.cat
Tue Mar 5 13:28:21 PST 2024


Hi again,
after solving some issues, I used WSL (Windows Subsystem for Linux) to create an environment where I am able to run tests.

I ran cmake inside the build folder in that environment. It's slow, but it finally finished. After cmake --build . --target install, everything is ready to be tested.

I created a simple test, ogr_miramon_vector.py (see the code below), to prove that it's reliable.

I run:
pytest autotest/ogr/ogr_miramon_vector.py
and get:

apau at ABEL2:/mnt/d/GitHub-repository/gdal/build$ pytest autotest/ogr/ogr_miramon_vector.py
Test session starts (platform: linux, Python 3.8.10, pytest 8.0.2, pytest-sugar 1.0.0)
benchmark: 4.0.0 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
GDAL Build Info:
  PAM_ENABLED: YES
  OGR_ENABLED: YES
  CURL_ENABLED: YES
  CURL_VERSION: 7.68.0
  GEOS_ENABLED: YES
  GEOS_VERSION: 3.8.0-CAPI-1.13.1
  PROJ_BUILD_VERSION: 6.3.1
  PROJ_RUNTIME_VERSION: 6.3.1
  COMPILER: GCC 9.4.0
GDAL_DOWNLOAD_TEST_DATA: undefined (tests relying on downloaded data may be skipped)
GDAL_RUN_SLOW_TESTS: undefined (tests marked as "slow" will be skipped)
rootdir: /mnt/d/GitHub-repository/gdal/build/autotest
configfile: pytest.ini
plugins: benchmark-4.0.0, sugar-1.0.0, env-1.1.3
collected 0 items

My question is: why does it seem not to be working (0 items collected)?
Thanks!

The test:
-------------
import os

import gdaltest
import ogrtest
import pytest

from osgeo import gdal, ogr, osr

pytestmark = pytest.mark.require_driver("MiraMonVector")

###############################################################################
@pytest.fixture(scope="module", autouse=True)
def init():
    with gdaltest.config_option("CPL_DEBUG", "ON"):
        yield


###############################################################################
# basic test

def test_ogr_miramon_vector_1():
    # open the sample MiraMon point file and check its layer, geometry and fields
    ds = gdal.OpenEx("data/miramon/Points/SimplePoints/SimplePointsFile.pnt")
    assert ds is not None, "Failed to open datasource"

    lyr = ds.GetLayer(0)
    assert lyr is not None, "Failed to get layer"

    assert lyr.GetFeatureCount() == 3
    assert lyr.GetGeomType() == ogr.wkbPoint

    f = lyr.GetNextFeature()
    assert f.GetFID() == 0
    assert f.GetGeometryRef().ExportToWkt() == "POINT (513.49 848.81)"
    assert f.GetField("ID_GRAFIC") == "0"

    f = lyr.GetNextFeature()
    assert f.GetField("ID_GRAFIC") == "1"

    f = lyr.GetNextFeature()
    assert f.GetField("ID_GRAFIC") == "2"

    ds = None




From: Even Rouault <even.rouault at spatialys.com>
Sent: Friday, 9 February 2024 11:48
To: Abel Pau <a.pau at creaf.uab.cat>; gdal-dev at lists.osgeo.org
Subject: Re: [gdal-dev] Testing the driver


Abel,
On 09/02/2024 at 10:55, Abel Pau via gdal-dev wrote:
Hi,
I am at the last steps before opening a pull request for the MiraMon driver.
I need to write some documentation and formalize the tests.
After that, I'll open the pull request on GitHub.
I'd suggest, before issuing the pull request, that you first push to your fork on GitHub and look at the Actions tab. That will allow you to fix a lot of things on your side before issuing the PR itself.


I am a little confused about the testing. I can use pytest or ctest, right? Which one is preferred? Are there any changes from the official documentation?

ctest is just the CMake way of launching the test suite. It will execute the C++ tests of autotest/cpp directly and, for tests written in Python, will launch "pytest autotest/XXXXX" for each directory.

"ctest --test-dir $build_dir -R autotest_ogr -V" will just run all the autotest/ogr tests, which can already take quite a while.

To test your own development, you may have a more pleasant experience by directly running just the tests for your driver with something like "pytest autotest/ogr/ogr_miramon.py". (Be careful on Windows: the content of $build_dir/autotest is copied from $source_dir/autotest each time "cmake" is run, so if you edit your test .py file directly in the build directory, be super careful not to accidentally lose your work, and make sure to copy its content to the source directory first. That's admittedly an annoying point of the current test setup on Windows, compared to Unix where we use symbolic links.)

after setting the environment to have PYTHONPATH point to something like $build_dir/swig/python/Release or $build_dir/swig/python/Debug (I believe you're on Windows?). If you look at the first lines output by the above "ctest --test-dir $build_dir -R autotest_ogr -V" invocation, you'll actually see the PYTHONPATH value to specify.
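
For example, a quick way to check that PYTHONPATH points at the freshly built bindings rather than a system-wide GDAL (a hypothetical snippet, not part of the test suite):

# sanity check that the freshly built bindings are the ones being imported
from osgeo import gdal
print(gdal.__file__)                     # should live under $build_dir/swig/python
print(gdal.VersionInfo("RELEASE_NAME"))  # should match the version you just built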

You also need to first install pytest and other testing dependencies with: python -m pip install -r autotest/requirements.txt
Is there a minimal test to create?
A maximal test suite, you mean ;-) You should aim for a "reasonable" coverage of the code you wrote. Aiming to test the nominal code paths of your driver is desirable (testing the error cases generally requires a lot more effort).

Can you recommend some drivers that test things like:

1. Read a point/arc/polygon layer from some format (GML, KML, GPKG, ...) and assert the number of objects read.

2. Read a point layer and assert some points (3D included) and some of the field values.

3. The same with arcs and polygons.

4. Convert a layer from our own format to other formats and compare the results with known "good" results.

5. Create multiple layers from another format (like GPX) and verify the names of the created files...

You don't necessarily need to use other formats. It is actually better if the tests of a format don't depend too much on other formats, to keep things isolated.

To test the read part of your driver, add an autotest/ogr/data/miramon directory with *small* test files, ideally at most a few KB each, to keep the size of the GDAL repository reasonable. A few features in each file, with different types of geometries and attributes, is often enough for unit testing. Then use the OGR Python API to open the file and iterate over its layers and features to check their content. Those files should ideally have been produced by the MiraMon software itself and not by the writing side of your driver, to check the interoperability of your driver with a "reference" software.
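
For instance, a read-side test could iterate over every layer and feature of one of those small committed files. A minimal sketch, reusing the imports of the test above and assuming every feature of the sample file carries a geometry (the path is the one from your test):

def test_ogr_miramon_read_all_features():
    ds = gdal.OpenEx("data/miramon/Points/SimplePoints/SimplePointsFile.pnt", gdal.OF_VECTOR)
    assert ds is not None, "Failed to open datasource"
    for i in range(ds.GetLayerCount()):
        lyr = ds.GetLayer(i)
        count = 0
        for f in lyr:
            # every feature is expected to expose a geometry
            assert f.GetGeometryRef() is not None
            count += 1
        # iterating must agree with the reported feature count
        assert count == lyr.GetFeatureCount()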

For the write side of the driver, you can for example run gdal.VectorTranslate(dest, source) on those files, and reuse the read tests to validate that the read side of your driver likes what the write side has produced. An alternative is to do a binary comparison of the file generated by your driver with a reference file stored in, for example, autotest/ogr/data/miramon/ref_output. But this can sometimes be a fragile approach if the output of your driver changes in the future (it would require regenerating the reference test files).
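
A round-trip sketch along those lines, assuming the driver short name is "MiraMonVector", that a ".pnt" output path is acceptable to the writer, and using pytest's tmp_path fixture:

def test_ogr_miramon_write_roundtrip(tmp_path):
    src = "data/miramon/Points/SimplePoints/SimplePointsFile.pnt"
    out = str(tmp_path / "out.pnt")
    # write with the MiraMon driver ...
    gdal.VectorTranslate(out, src, format="MiraMonVector")
    # ... then check that the read side likes what the write side produced
    ds = ogr.Open(out)
    assert ds is not None, "Failed to reopen the translated file"
    assert ds.GetLayer(0).GetFeatureCount() == ogr.Open(src).GetLayer(0).GetFeatureCount()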

I'd suggest your test suite also have a test that runs the "test_ogrsf" command line utility, which is a kind of compliance test suite that checks a number of expectations for a driver, e.g. that GetFeatureCount() returns the same number as iterating with GetNextFeature(), etc.
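
Following the pattern used by other drivers, such a test can look roughly like this (test_cli_utilities and gdaltest.runexternal are existing autotest helpers; the data path is the one from your test):

def test_ogr_miramon_test_ogrsf():
    import test_cli_utilities

    if test_cli_utilities.get_test_ogrsf_path() is None:
        pytest.skip("test_ogrsf not available")
    ret = gdaltest.runexternal(
        test_cli_utilities.get_test_ogrsf_path()
        + " -ro data/miramon/Points/SimplePoints/SimplePointsFile.pnt"
    )
    # test_ogrsf reports INFO lines on success and ERROR lines on failures
    assert "INFO" in ret
    assert "ERROR" not in ret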

It is difficult to point at a "reference" test suite, as all drivers have their particularities and may need specific tests. Potential sources of inspiration:

- autotest/ogr/ogr_gtfs.py: shows very simple testing of the read side of a driver, and includes a test_ogrsf test

- autotest/ogr/ogr_csv.py: has examples where the writing side of the driver is checked by opening the output file and checking that some strings are present in it (only easily doable with text-based formats; see the small sketch after this list)

- autotest/ogr/ogr_openfilegdb_write.py: extensive testing of the writing side of a driver. A lot of it will be specific to the format and irrelevant to your concern, but you should at least find all possible aspects of how to test the write side of a driver.
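
The string-check approach mentioned for ogr_csv.py can be as simple as the following sketch (the path and expected keyword are placeholders):

def check_output_contains(path, expected):
    # read the generated file and make sure the expected keyword was written
    with open(path, "rb") as f:
        data = f.read()
    assert expected in data
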
Even

--

http://www.spatialys.com

My software is free, but my time generally not.