[gdal-dev] Using GPU for GDAL Python
Paulo Petersen
paulo.au15 at gmail.com
Thu Oct 14 18:07:08 PDT 2021
Hi Folks, good morning!
I'm using GDAL Python for pre-processing some satellite images, currently
using CPU instances in AWS.
I'm struggling to find a way to implement it using GPUs, more specifically,
the NVIDIA V100 GPU.
Has anyone achieved that, and could you share how you've done it?
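For what it's worth, GDAL's core itself does not use the GPU, so the usual pattern is to read bands with GDAL on the CPU and hand the arrays to a GPU array library such as CuPy for the heavy math. Below is a minimal, hypothetical sketch of that pattern (the NDVI formula and the synthetic 2x2 bands are just for illustration; it falls back to NumPy when no CUDA device is available):

```python
import numpy as np

# Hypothetical CPU/GPU dispatch: use CuPy on a CUDA device (e.g. a V100)
# when available, otherwise fall back to NumPy. GDAL is not involved in
# the GPU work -- it only supplies the arrays.
try:
    import cupy as xp  # GPU-backed arrays with a NumPy-compatible API
    on_gpu = True
except ImportError:
    xp = np
    on_gpu = False

def ndvi(red, nir):
    """Compute NDVI = (NIR - red) / (NIR + red) on whichever backend is active."""
    red = xp.asarray(red, dtype=xp.float32)
    nir = xp.asarray(nir, dtype=xp.float32)
    out = (nir - red) / (nir + red)
    # Copy the result back to host memory if it lives on the GPU.
    return out.get() if on_gpu else out

# Synthetic 2x2 bands standing in for rasters read via
# gdal.Open(path).ReadAsArray() in a real pipeline.
red = np.array([[0.1, 0.2], [0.3, 0.4]], dtype=np.float32)
nir = np.array([[0.5, 0.6], [0.7, 0.8]], dtype=np.float32)
print(ndvi(red, nir))
```

The same swap-in-CuPy approach applies to most per-pixel pre-processing (scaling, masking, band math); anything that is pure array arithmetic tends to port with little code change.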
Thanks,
Paulo Petersen.