[OSGeoLive] #2489: python3 datacube-core
OSGeoLive
trac_osgeolive at osgeo.org
Mon Oct 28 17:18:40 PDT 2024
#2489: python3 datacube-core
-------------------------+----------------------------
Reporter: darkblueb | Owner: osgeolive@…
Type: enhancement | Status: new
Priority: major | Milestone: OSGeoLive17.0
Component: OSGeoLive | Resolution:
Keywords: python |
-------------------------+----------------------------
Comment (by darkblueb):
## test datacube-core
`$ sudo apt install --yes python3-hypothesis`
`$ pip3 install --break-system-packages moto`
`... Successfully installed moto-5.0.18 responses-0.25.3`
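The moto 5.x wheel installed above no longer ships the per-service decorators: `mock_s3` and `mock_iam` were folded into a single `mock_aws` entry point in moto 5.0, which is what trips the test failures below. A minimal sketch of a compatibility shim (the `s3_mock` helper name is hypothetical, not part of datacube-core or moto; stand-in namespaces are used so the sketch runs without moto installed):

```python
from types import SimpleNamespace

def s3_mock(moto_module):
    """Return the S3 mock entry point for either moto 4.x (mock_s3)
    or moto 5.x, where all per-service mocks merged into mock_aws."""
    return getattr(moto_module, "mock_s3", None) or moto_module.mock_aws

# Stand-in modules illustrating the two API generations:
moto4 = SimpleNamespace(mock_s3="per-service mock")   # moto < 5
moto5 = SimpleNamespace(mock_aws="unified mock")      # moto >= 5, no mock_s3
```

With a shim like this, `s3_mock(moto4)` resolves to the old per-service mock and `s3_mock(moto5)` falls through to `mock_aws`, so the same test code could run against either moto generation.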
## run tests; 7 FAIL, 460 PASS, 2 ERROR
{{{
user at osgeolive:~/Downloads$ pytest-3 datacube-core-1.8.19/tests/
========================================= test session starts =========================================
platform linux -- Python 3.12.3, pytest-7.4.4, pluggy-1.4.0
rootdir: /home/user/Downloads/datacube-core-1.8.19
configfile: pytest.ini
plugins: anyio-4.2.0, hypothesis-6.98.15
collected 471 items
datacube-core-1.8.19/tests/test_3d.py .. [ 0%]
datacube-core-1.8.19/tests/test_concurrent_executor.py .. [ 0%]
datacube-core-1.8.19/tests/test_config.py ....... [ 2%]
datacube-core-1.8.19/tests/test_driver.py .......... [ 4%]
datacube-core-1.8.19/tests/test_dynamic_db_passwd.py . [ 4%]
datacube-core-1.8.19/tests/test_eo3.py ........... [ 7%]
datacube-core-1.8.19/tests/test_gbox_ops.py .. [ 7%]
datacube-core-1.8.19/tests/test_geometry.py ................................................... [ 18%]
.................x [ 22%]
datacube-core-1.8.19/tests/test_load_data.py ........ [ 23%]
datacube-core-1.8.19/tests/test_metadata_fields.py ..... [ 24%]
datacube-core-1.8.19/tests/test_model.py ............... [ 28%]
datacube-core-1.8.19/tests/test_testutils.py .... [ 28%]
datacube-core-1.8.19/tests/test_utils_aws.py ........F..F [ 31%]
datacube-core-1.8.19/tests/test_utils_changes.py .. [ 31%]
datacube-core-1.8.19/tests/test_utils_cog.py ............. [ 34%]
datacube-core-1.8.19/tests/test_utils_dask.py ........FFFF. [ 37%]
datacube-core-1.8.19/tests/test_utils_dates.py .... [ 38%]
datacube-core-1.8.19/tests/test_utils_docs.py ..........................................FE..... [ 48%]
.x......... [ 50%]
datacube-core-1.8.19/tests/test_utils_generic.py ... [ 51%]
datacube-core-1.8.19/tests/test_utils_other.py .............................................. [ 61%]
datacube-core-1.8.19/tests/test_utils_rio.py ...... [ 62%]
datacube-core-1.8.19/tests/test_warp.py ... [ 63%]
datacube-core-1.8.19/tests/test_xarray_extension.py ......... [ 65%]
datacube-core-1.8.19/tests/api/test_core.py .... [ 66%]
datacube-core-1.8.19/tests/api/test_grid_workflow.py ... [ 66%]
datacube-core-1.8.19/tests/api/test_masking.py ......... [ 68%]
datacube-core-1.8.19/tests/api/test_query.py ................................ [ 75%]
datacube-core-1.8.19/tests/api/test_virtual.py ............... [ 78%]
datacube-core-1.8.19/tests/drivers/test_rio_reader.py ........ [ 80%]
datacube-core-1.8.19/tests/index/test_api_index_dataset.py ... [ 80%]
datacube-core-1.8.19/tests/index/test_fields.py .... [ 81%]
datacube-core-1.8.19/tests/index/test_hl_index.py . [ 81%]
datacube-core-1.8.19/tests/index/test_postgis_fields.py .... [ 82%]
datacube-core-1.8.19/tests/index/test_query.py . [ 83%]
datacube-core-1.8.19/tests/index/test_validate_dataset_type.py .................... [ 87%]
datacube-core-1.8.19/tests/scripts/test_search_tool.py .. [ 87%]
datacube-core-1.8.19/tests/storage/test_base.py ... [ 88%]
datacube-core-1.8.19/tests/storage/test_netcdfwriter.py ........... [ 90%]
datacube-core-1.8.19/tests/storage/test_storage.py ................. [ 94%]
datacube-core-1.8.19/tests/storage/test_storage_load.py .. [ 94%]
datacube-core-1.8.19/tests/storage/test_storage_read.py ...... [ 95%]
datacube-core-1.8.19/tests/ui/test_common.py ..E [ 96%]
datacube-core-1.8.19/tests/ui/test_expression_parsing.py ......... [ 98%]
datacube-core-1.8.19/tests/ui/test_task_app.py ....... [100%]
=============================================== ERRORS ================================================
_____________________________ ERROR at setup of test_read_docs_from_http ______________________________
file /home/user/Downloads/datacube-core-1.8.19/tests/test_utils_docs.py, line 200
def test_read_docs_from_http(sample_document_files, httpserver):
E fixture 'httpserver' not found
> available fixtures: anyio_backend, anyio_backend_name,
anyio_backend_options, cache, capfd, capfdbinary, caplog, capsys,
capsysbinary, dask_client, data_folder, doctest_namespace,
eo3_dataset_doc, eo3_dataset_file, eo3_dataset_s2, eo3_metadata,
eo3_metadata_file, eo_dataset_doc, eo_dataset_file, example_gdal_path,
example_netcdf_path, monkeypatch, no_crs_gdal_path, non_geo_dataset_doc,
non_geo_dataset_file, odc_style_xr_dataset, pytestconfig, record_property,
record_testsuite_property, record_xml_attribute, recwarn,
sample_document_files, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory,
tmpnetcdf_filename, without_aws_env, workdir
> use 'pytest --fixtures [testpath]' for help on them.
/home/user/Downloads/datacube-core-1.8.19/tests/test_utils_docs.py:200
______________________________ ERROR at setup of test_ui_path_doc_stream ______________________________
file /home/user/Downloads/datacube-core-1.8.19/tests/ui/test_common.py, line 90
def test_ui_path_doc_stream(httpserver):
E fixture 'httpserver' not found
> available fixtures: anyio_backend, anyio_backend_name,
anyio_backend_options, cache, capfd, capfdbinary, caplog, capsys,
capsysbinary, dask_client, data_folder, doctest_namespace,
eo3_dataset_doc, eo3_dataset_file, eo3_dataset_s2, eo3_metadata,
eo3_metadata_file, eo_dataset_doc, eo_dataset_file, example_gdal_path,
example_netcdf_path, monkeypatch, no_crs_gdal_path, non_geo_dataset_doc,
non_geo_dataset_file, odc_style_xr_dataset, pytestconfig, record_property,
record_testsuite_property, record_xml_attribute, recwarn, tmp_path,
tmp_path_factory, tmpdir, tmpdir_factory, tmpnetcdf_filename,
without_aws_env, workdir
> use 'pytest --fixtures [testpath]' for help on them.
/home/user/Downloads/datacube-core-1.8.19/tests/ui/test_common.py:90
============================================== FAILURES
===============================================
_____________________________________________ test_s3_io
______________________________________________
monkeypatch = <_pytest.monkeypatch.MonkeyPatch object at 0x711338600ad0>,
without_aws_env = None
def test_s3_io(monkeypatch, without_aws_env):
import moto
from numpy import s_
url = "s3://bucket/file.txt"
bucket, _ = s3_url_parse(url)
monkeypatch.setenv("AWS_ACCESS_KEY_ID", "fake-key-id")
monkeypatch.setenv("AWS_SECRET_ACCESS_KEY", "fake-secret")
> with moto.mock_s3():
E AttributeError: module 'moto' has no attribute 'mock_s3'. Did you mean: 'mock_aws'?
datacube-core-1.8.19/tests/test_utils_aws.py:204: AttributeError
______________________________________ test_obtain_new_iam_token ______________________________________
monkeypatch = <_pytest.monkeypatch.MonkeyPatch object at 0x71131a104320>, without_aws_env = None
def test_obtain_new_iam_token(monkeypatch, without_aws_env):
import moto
from sqlalchemy.engine.url import URL
url = URL.create(
'postgresql',
host="fakehost", database="fake_db", port=5432,
username="fakeuser", password="definitely_a_fake_password",
)
monkeypatch.setenv("AWS_ACCESS_KEY_ID", "fake-key-id")
monkeypatch.setenv("AWS_SECRET_ACCESS_KEY", "fake-secret")
> with moto.mock_iam():
E AttributeError: module 'moto' has no attribute 'mock_iam'. Did you mean: 'mock_aws'?
datacube-core-1.8.19/tests/test_utils_aws.py:272: AttributeError
_____________________________ test_save_blob_s3_direct[some utf8 string] ______________________________
blob = 'some utf8 string', monkeypatch = <_pytest.monkeypatch.MonkeyPatch object at 0x711330268bc0>
@pytest.mark.parametrize("blob", [
"some utf8 string",
b"raw bytes",
])
def test_save_blob_s3_direct(blob, monkeypatch):
region_name = "us-west-2"
blob2 = blob + blob
url = "s3://bucket/file.txt"
url2 = "s3://bucket/file-2.txt"
bucket, _ = s3_url_parse(url)
monkeypatch.setenv("AWS_ACCESS_KEY_ID", "fake-key-id")
monkeypatch.setenv("AWS_SECRET_ACCESS_KEY", "fake-secret")
> with moto.mock_s3():
E AttributeError: module 'moto' has no attribute 'mock_s3'. Did you mean: 'mock_aws'?
datacube-core-1.8.19/tests/test_utils_dask.py:132: AttributeError
_________________________________ test_save_blob_s3_direct[raw bytes] _________________________________
blob = b'raw bytes', monkeypatch = <_pytest.monkeypatch.MonkeyPatch object at 0x71133020a9f0>
@pytest.mark.parametrize("blob", [
"some utf8 string",
b"raw bytes",
])
def test_save_blob_s3_direct(blob, monkeypatch):
region_name = "us-west-2"
blob2 = blob + blob
url = "s3://bucket/file.txt"
url2 = "s3://bucket/file-2.txt"
bucket, _ = s3_url_parse(url)
monkeypatch.setenv("AWS_ACCESS_KEY_ID", "fake-key-id")
monkeypatch.setenv("AWS_SECRET_ACCESS_KEY", "fake-secret")
> with moto.mock_s3():
E AttributeError: module 'moto' has no attribute 'mock_s3'. Did you mean: 'mock_aws'?
datacube-core-1.8.19/tests/test_utils_dask.py:132: AttributeError
_________________________________ test_save_blob_s3[some utf8 string] _________________________________
blob = 'some utf8 string', monkeypatch = <_pytest.monkeypatch.MonkeyPatch object at 0x7113302bcd40>
dask_client = <Client: 'inproc://192.168.122.226/51286/1' processes=1 threads=1, memory=23.00 GiB>
@pytest.mark.parametrize("blob", [
"some utf8 string",
b"raw bytes",
])
def test_save_blob_s3(blob, monkeypatch, dask_client):
region_name = "us-west-2"
blob2 = blob + blob
dask_blob = dask.delayed(blob)
dask_blob2 = dask.delayed(blob2)
url = "s3://bucket/file.txt"
url2 = "s3://bucket/file-2.txt"
bucket, _ = s3_url_parse(url)
monkeypatch.setenv("AWS_ACCESS_KEY_ID", "fake-key-id")
monkeypatch.setenv("AWS_SECRET_ACCESS_KEY", "fake-secret")
> with moto.mock_s3():
E AttributeError: module 'moto' has no attribute 'mock_s3'. Did you mean: 'mock_aws'?
datacube-core-1.8.19/tests/test_utils_dask.py:170: AttributeError
____________________________________ test_save_blob_s3[raw bytes] _____________________________________
blob = b'raw bytes', monkeypatch = <_pytest.monkeypatch.MonkeyPatch object at 0x7113302bda30>
dask_client = <Client: 'inproc://192.168.122.226/51286/1' processes=1 threads=1, memory=23.00 GiB>
@pytest.mark.parametrize("blob", [
"some utf8 string",
b"raw bytes",
])
def test_save_blob_s3(blob, monkeypatch, dask_client):
region_name = "us-west-2"
blob2 = blob + blob
dask_blob = dask.delayed(blob)
dask_blob2 = dask.delayed(blob2)
url = "s3://bucket/file.txt"
url2 = "s3://bucket/file-2.txt"
bucket, _ = s3_url_parse(url)
monkeypatch.setenv("AWS_ACCESS_KEY_ID", "fake-key-id")
monkeypatch.setenv("AWS_SECRET_ACCESS_KEY", "fake-secret")
> with moto.mock_s3():
E AttributeError: module 'moto' has no attribute 'mock_s3'. Did you mean: 'mock_aws'?
datacube-core-1.8.19/tests/test_utils_dask.py:170: AttributeError
_______________________________________ test_read_docs_from_s3 ________________________________________
sample_document_files = [('/home/user/Downloads/datacube-core-1.8.19/tests/data/multi_doc.yml', 3), ('/home/user/Downloads/datacube-core-1.8.1...e-core-1.8.19/tests/data/single_doc.yaml', 1), ('/home/user/Downloads/datacube-core-1.8.19/tests/data/sample.json', 1)]
monkeypatch = <_pytest.monkeypatch.MonkeyPatch object at 0x711338773dd0>
def test_read_docs_from_s3(sample_document_files, monkeypatch):
"""
Use a mocked S3 bucket to test reading documents from S3
"""
boto3 = pytest.importorskip('boto3')
moto = pytest.importorskip('moto')
monkeypatch.setenv('AWS_ACCESS_KEY_ID', 'fake')
monkeypatch.setenv('AWS_SECRET_ACCESS_KEY', 'fake')
> with moto.mock_s3():
E AttributeError: module 'moto' has no attribute 'mock_s3'. Did you mean: 'mock_aws'?
datacube-core-1.8.19/tests/test_utils_docs.py:179: AttributeError
========================================== warnings summary ===========================================
<frozen importlib._bootstrap>:488
<frozen importlib._bootstrap>:488: RuntimeWarning: numpy.ndarray size changed, may indicate binary incompatibility. Expected 16 from C header, got 96 from PyObject
tests/test_concurrent_executor.py::test_concurrent_executor
/home/user/Downloads/datacube-core-1.8.19/tests/test_concurrent_executor.py:79: DeprecationWarning: Call to deprecated function (or staticmethod) get_executor. (Executors have been deprecated and will be removed in v1.9)
-- Deprecated since version 1.8.14.
    executor = get_executor(None, 2)
tests/test_concurrent_executor.py::test_concurrent_executor
tests/test_concurrent_executor.py::test_concurrent_executor
/home/user/Downloads/datacube-core-1.8.19/datacube/executor.py:245: DeprecationWarning: Call to deprecated function (or staticmethod) _get_concurrent_executor. (Executors have been deprecated and will be removed in v1.9)
-- Deprecated since version 1.8.14.
    concurrent_exec = _get_concurrent_executor(workers, use_cloud_pickle=use_cloud_pickle)
tests/test_concurrent_executor.py::test_concurrent_executor
/home/user/Downloads/datacube-core-1.8.19/tests/test_concurrent_executor.py:83: DeprecationWarning: Call to deprecated function (or staticmethod) get_executor. (Executors have been deprecated and will be removed in v1.9)
-- Deprecated since version 1.8.14.
    executor = get_executor(None, 2, use_cloud_pickle=False)
tests/test_concurrent_executor.py::test_concurrent_executor
tests/test_concurrent_executor.py::test_concurrent_executor
/usr/lib/python3.12/multiprocessing/popen_fork.py:66: DeprecationWarning: This process (pid=51286) is multi-threaded, use of fork() may lead to deadlocks in the child.
    self.pid = os.fork()
tests/test_concurrent_executor.py::test_fallback_executor
/home/user/Downloads/datacube-core-1.8.19/tests/test_concurrent_executor.py:89: DeprecationWarning: Call to deprecated function (or staticmethod) get_executor. (Executors have been deprecated and will be removed in v1.9)
-- Deprecated since version 1.8.14.
    executor = get_executor(None, None)
tests/test_concurrent_executor.py::test_fallback_executor
/home/user/Downloads/datacube-core-1.8.19/datacube/executor.py:238: DeprecationWarning: Call to deprecated class SerialExecutor. (Executors have been deprecated and will be removed in v1.9)
-- Deprecated since version 1.8.14.
    return SerialExecutor()
tests/test_testutils.py: 10 warnings
tests/test_utils_dates.py: 6 warnings
tests/test_xarray_extension.py: 28 warnings
tests/storage/test_netcdfwriter.py: 2 warnings
/home/user/Downloads/datacube-core-1.8.19/datacube/utils/dates.py:103: UserWarning: Converting non-nanosecond precision datetime values to nanosecond precision. This behavior can eventually be relaxed in xarray, as it is an artifact from pandas which is now beginning to support non-nanosecond precision values. This warning is caused by passing non-nanosecond np.datetime64 or np.timedelta64 values to the DataArray or Variable constructor; it can be silenced by converting the values to nanosecond precision ahead of time.
    return xr.DataArray(data,
tests/test_utils_other.py::test_num2numpy
/home/user/Downloads/datacube-core-1.8.19/datacube/utils/math.py:194: DeprecationWarning: NumPy will stop allowing conversion of out-of-bound Python integers to integer arrays. The conversion of -1 to uint8 will fail in the future.
For the old behavior, usually:
    np.array(value).astype(dtype)
will give the desired result (the cast overflows).
    return dtype.type(x)
tests/drivers/test_rio_reader.py::test_rd_internals_bidx
/home/user/Downloads/datacube-core-1.8.19/tests/drivers/test_rio_reader.py:79: DeprecationWarning: datetime.datetime.utcfromtimestamp() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.datetime.fromtimestamp(timestamp, datetime.UTC).
    timestamp=datetime.utcfromtimestamp(1),
tests/storage/test_netcdfwriter.py: 107 warnings
tests/storage/test_storage.py: 2 warnings
/home/user/Downloads/datacube-core-1.8.19/datacube/drivers/netcdf/writer.py:68: DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.datetime.now(datetime.UTC).
    .format(__version__, datetime.utcnow()))
tests/ui/test_task_app.py::test_run_tasks
/home/user/Downloads/datacube-core-1.8.19/tests/ui/test_task_app.py:165: DeprecationWarning: Call to deprecated class SerialExecutor. (Executors have been deprecated and will be removed in v1.9)
-- Deprecated since version 1.8.14.
    executor = datacube.executor.SerialExecutor()
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
======================================= short test summary info =======================================
FAILED datacube-core-1.8.19/tests/test_utils_aws.py::test_s3_io - AttributeError: module 'moto' has no attribute 'mock_s3'. Did you mean: 'mock_aws'?
FAILED datacube-core-1.8.19/tests/test_utils_aws.py::test_obtain_new_iam_token - AttributeError: module 'moto' has no attribute 'mock_iam'. Did you mean: 'mock_aws'?
FAILED datacube-core-1.8.19/tests/test_utils_dask.py::test_save_blob_s3_direct[some utf8 string] - AttributeError: module 'moto' has no attribute 'mock_s3'. Did you mean: 'mock_aws'?
FAILED datacube-core-1.8.19/tests/test_utils_dask.py::test_save_blob_s3_direct[raw bytes] - AttributeError: module 'moto' has no attribute 'mock_s3'. Did you mean: 'mock_aws'?
FAILED datacube-core-1.8.19/tests/test_utils_dask.py::test_save_blob_s3[some utf8 string] - AttributeError: module 'moto' has no attribute 'mock_s3'. Did you mean: 'mock_aws'?
FAILED datacube-core-1.8.19/tests/test_utils_dask.py::test_save_blob_s3[raw bytes] - AttributeError: module 'moto' has no attribute 'mock_s3'. Did you mean: 'mock_aws'?
FAILED datacube-core-1.8.19/tests/test_utils_docs.py::test_read_docs_from_s3 - AttributeError: module 'moto' has no attribute 'mock_s3'. Did you mean: 'mock_aws'?
ERROR datacube-core-1.8.19/tests/test_utils_docs.py::test_read_docs_from_http
ERROR datacube-core-1.8.19/tests/ui/test_common.py::test_ui_path_doc_stream
================== 7 failed, 460 passed, 2 xfailed, 167 warnings, 2 errors in 31.34s ==================
}}}
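All seven FAILED tests trip over the same moto 5.0 rename. As a sketch of a workaround for the unpacked test tree (the `migrate` helper is hypothetical, written for this ticket, not part of datacube-core), a one-off rewrite of the removed entry points:

```python
import pathlib
import re

# moto 5.0 merged the per-service mocks into a single entry point,
# so moto.mock_s3() / moto.mock_iam() both become moto.mock_aws().
PATTERN = re.compile(r"\bmoto\.mock_(?:s3|iam)\(\)")

def migrate(root):
    """Rewrite moto.mock_s3()/moto.mock_iam() calls in every *.py file
    under root; return the number of files changed."""
    changed = 0
    for path in pathlib.Path(root).rglob("*.py"):
        text = path.read_text()
        new = PATTERN.sub("moto.mock_aws()", text)
        if new != text:
            path.write_text(new)
            changed += 1
    return changed
```

Running `migrate("datacube-core-1.8.19/tests")` and re-running `pytest-3` should clear the seven AttributeErrors. The two ERRORs are a separate issue: the `httpserver` fixture comes from the pytest-httpserver plugin, which is evidently not installed here; installing it (e.g. the `python3-pytest-httpserver` package, if it is in the Ubuntu repos) should restore those two tests.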
--
Ticket URL: <https://trac.osgeo.org/osgeolive/ticket/2489#comment:1>
OSGeoLive <https://live.osgeo.org/>
self-contained bootable DVD, USB thumb drive or Virtual Machine based on Lubuntu, that allows you to try a wide variety of open source geospatial software without installing anything.