[gdal-dev] GDAL_CACHEMAX on a docker container

Even Rouault even.rouault at spatialys.com
Wed May 23 03:48:33 PDT 2018


On Wednesday, 23 May 2018 13:30:30 CEST Guy Doulberg wrote:
> Hi guys
> 
> I wanted to share with you a gotcha I struggled with over the past few
> days; maybe it will be helpful to others.
> 
> I am using GDAL on a docker cluster. Since there are more services running
> on the host, I limit the container to use only 1GB of RAM.
> 
> The default behavior of GDAL_CACHEMAX is to use 5% of the available memory,
> but apparently when running in the above setup it sees the system memory,
> which is 16GB, not the container's 1GB.
> 
> On that container I ran 3 GDAL processes, which means they calculated they
> could use approximately 2.4 GB in total (3 x 5% of 16GB), but the container
> limit was only 1GB.
> 
> The result was that the operating system's OOM killer killed those
> processes because they exceeded the memory limit.
> 
> To solve the issue (to control the memory each process is going to use), I
> explicitly set GDAL_CACHEMAX to an absolute value.
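The workaround above can be sketched as follows. This is a minimal illustration, not GDAL's own code; the value "256" is an arbitrary example chosen to stay well under a 1GB container limit (GDAL interprets small GDAL_CACHEMAX values as megabytes):

```python
import os

# Cap GDAL's raster block cache to an absolute value before GDAL reads
# its configuration. A bare small number is interpreted as megabytes.
os.environ["GDAL_CACHEMAX"] = "256"  # 256 MB, example value

print(os.environ["GDAL_CACHEMAX"])
```

The same effect can be achieved by passing `-e GDAL_CACHEMAX=256` to `docker run`, so every process in the container inherits the setting.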

Guy,

Can you create an issue about that at https://github.com/OSGeo/gdal/issues ?

There are some hints on how this could be done:
https://github.com/dotnet/coreclr/blob/master/src/gc/unix/cgroup.cpp
https://stackoverflow.com/questions/42187085/check-mem-limit-within-a-docker-container
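Following those hints, a detection helper could look something like the sketch below. This is an assumption about how GDAL might implement it, not existing GDAL code; it reads the cgroup v1 file discussed in this thread (note that cgroup v2 hosts expose `/sys/fs/cgroup/memory.max` instead, which contains the string "max" when unlimited):

```python
def container_memory_limit(path="/sys/fs/cgroup/memory/memory.limit_in_bytes"):
    """Return the cgroup v1 memory limit in bytes, or None if unavailable.

    Returns None when the file is missing (not in a container, or cgroup v2),
    unreadable, or set to the "unlimited" sentinel value.
    """
    try:
        with open(path) as f:
            value = f.read().strip()
    except OSError:
        return None
    if not value.isdigit():
        return None
    limit = int(value)
    # cgroup v1 reports a huge sentinel (close to 2**63) when no limit is set
    if limit >= 1 << 60:
        return None
    return limit
```

A cache-sizing routine could then take 5% of `min(system_memory, container_limit)` instead of the host total.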

I just tried

$ docker run --memory 512m --rm -it ubuntu bash

root at 0ae673d96789:/# cat /sys/fs/cgroup/memory/memory.limit_in_bytes
536870912

Can you check in your docker container that
cat /sys/fs/cgroup/memory/memory.limit_in_bytes
returns the memory limit you specified?

Even

-- 
Spatialys - Geospatial professional services
http://www.spatialys.com
