[Qgis-user] QGIS really slow with .gpkg layers
Francesco Pelullo
f.pelullo at gmail.com
Thu Dec 10 14:36:30 PST 2020
On Thu, 10 Dec 2020 at 16:50, Nicolas Cadieux <njacadieux.gitlab at gmail.com>
wrote:
> Hi,
> I also used txt files to load LiDAR points in QGIS because .shp was not
> good. This was before .gpkg came along, before I learned Python or
> CloudCompare. I am just starting to familiarize myself with the
> GeoPackage format, so I am no .gpkg expert, but this is what I would do or
> ask myself:
>
> How are the files stored? Server, USB stick, hard drive, SSD?
>
They are usually stored on an SMB server, accessible via VPN because the
company has multiple offices. However, due to the really slow data access,
projects and data are moved to a local SSD before accessing/editing them.
> Are the files in the same CRS as the project? If not, everything may be
> read into cache and reprojected even before you start...
>
Yes, projects and data layers are created with the same CRS.
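For what it's worth, this is roughly how I double-check it from the QGIS
Python console (a quick sketch; it just compares each layer CRS with the
project CRS):

    from qgis.core import QgsProject

    # List any layer whose CRS differs from the project CRS.
    project_crs = QgsProject.instance().crs()
    for layer in QgsProject.instance().mapLayers().values():
        if layer.crs() != project_crs:
            print(layer.name(), layer.crs().authid(), "!=", project_crs.authid())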
> What happens when the files are simplified? Try more files with fewer
> layers, or try more points in the same files.
>
This would be easy to do; I will check on the next job.
> Try a fresh project and open the layers one by one. Is one more problematic
> than the others?
>
No, it seems to me that GeoPackages become slow after some days. E.g., if
I create a new project and new GeoPackages, QGIS has no problem opening
them in a decent time (one or two minutes). But after some hours, if I
close QGIS and restart it with the same project/data, access becomes really
slow (15 minutes or more).
> Enough memory?
>
Win10, i5, 16 GB RAM, 1 TB SSD, dual screen.
> Disable all plugins.
>
Ok.
> Caching features: in your QGIS options, you can change the number of
> features that are cached when you open a vector layer. Try caching more
> features or far fewer features (like 1). What happens? If you have a max
> of 16000 features per layer, chances are you are caching everything, so
> everything is being loaded into memory.
>
Ok, I will try.
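If the same option can also be set from the Python console, I could script
it while testing. The settings key below is only my guess at the attribute
table row cache option, so it may be wrong:

    from qgis.core import QgsSettings

    # Guessed key for the attribute table row cache option
    # (Settings > Options > Data Sources); needs verification.
    settings = QgsSettings()
    print(settings.value("qgis/attributeTableRowCache"))
    settings.setValue("qgis/attributeTableRowCache", 1)  # try a very small cache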
> File creation: how are the files created? Look at the options in GDAL if
> the files are created in QGIS. https://gdal.org/drivers/vector/gpkg.html Try
> creating the files with GDAL translate (Vector menu) instead of just
> “export as” or save as in QGIS, and use the extra option to force index creation.
>
They are imported into QGIS as CSV (point geometries with a single
attribute of type REAL) and then exported to GPKG format.
In the export options, spatial index creation is enabled by default.
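As a test, I could also create the GeoPackage directly with GDAL from
Python instead of "export as". An untested sketch, where the file names,
the X/Y column names and the EPSG code are placeholders for my real data:

    from osgeo import gdal

    # Read the CSV as point geometries and write a GeoPackage,
    # forcing spatial index creation at write time.
    src = gdal.OpenEx(
        "points.csv",
        gdal.OF_VECTOR,
        open_options=["X_POSSIBLE_NAMES=x", "Y_POSSIBLE_NAMES=y"],
    )
    gdal.VectorTranslate(
        "points.gpkg",
        src,
        options=gdal.VectorTranslateOptions(
            format="GPKG",
            layerName="points",
            srcSRS="EPSG:32633",
            dstSRS="EPSG:32633",
            layerCreationOptions=["SPATIAL_INDEX=YES"],
        ),
    )
    src = None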
> Do you have a spatial index built in? In theory .gpkg comes with a spatial
> index, but perhaps you can rebuild one? Or make sure you build one from the
> start when you create the file. (See above.)
>
Sometimes I run VACUUM from the QGIS DB Manager. I also sometimes force a
new spatial index creation from the layer properties, but with no
appreciable difference.
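Next time I will also try rebuilding the index and vacuuming outside QGIS.
A rough sketch with GDAL/OGR and sqlite3 (if I read the GeoPackage driver
docs right, CreateSpatialIndex is exposed as an SQL function; file, table
and geometry column names are placeholders):

    import sqlite3
    from osgeo import ogr

    # Create the RTree spatial index via the GPKG driver's SQL function
    # (drop the old one first with DisableSpatialIndex if one already exists),
    # then VACUUM the file to compact it.
    ds = ogr.Open("points.gpkg", update=1)
    ds.ExecuteSQL("SELECT CreateSpatialIndex('points', 'geom')")
    ds = None  # close the dataset before vacuuming

    conn = sqlite3.connect("points.gpkg")
    conn.execute("VACUUM")
    conn.close()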
> I remember that when I created a .gpkg in Python and used it in ArcGIS,
> the file would take a long time to open because it was reading every point
> of the file just to get the file's extent. Maybe that is going
> on? I think you can specify the file's extent in the file metadata. (I
> would need to check my code, but I don't see my solution here. I normally
> post them when I find them...)
> https://gis.stackexchange.com/questions/374408/using-geopandas-generated-gpkg-in-arcmap
>
>
> Just a bunch of ideas...
>
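The extent idea is interesting: if the extent is already written in
gpkg_contents, readers should not have to scan every feature just to find
it. A hedged sketch of what I might try (file name, table name and extent
values are placeholders for my data):

    import sqlite3

    # Store the layer extent in gpkg_contents so clients can read it
    # directly instead of scanning all the points.
    conn = sqlite3.connect("points.gpkg")
    conn.execute(
        "UPDATE gpkg_contents "
        "SET min_x = ?, min_y = ?, max_x = ?, max_y = ? "
        "WHERE table_name = ?",
        (350000.0, 4500000.0, 360000.0, 4510000.0, "points"),
    )
    conn.commit()
    conn.close()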
Thank you for your suggestions.