<div dir="ltr"><div dir="ltr">Hi Ana,<div><br></div><div>I've never handled files that large, but I would opt for making them available on the internal filesystem of the geoserver container, loading them through the GeoServer GUI into the geonode workspace, and finally running the updatelayers command [1] on the django container.</div><div><br></div><div>I don't know whether Alessio, Paolo, or someone else can add something more appropriate.</div><div><br></div><div>Hope this helps.</div><div><br></div><div>Francesco</div><div><br></div><div>[1] <a href="http://docs.geonode.org/en/master/tutorials/admin/admin_mgmt_commands/#updatelayers">http://docs.geonode.org/en/master/tutorials/admin/admin_mgmt_commands/#updatelayers</a><br><br><div class="gmail_quote"><div dir="ltr">On Tue, 20 Nov 2018 at 12:28, Ana Silva <<a href="mailto:anasilva.ifpb@gmail.com">anasilva.ifpb@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div>Hi everyone, I have a question. In the project I'm working on, we deal with a gigantic dataset (raster files larger than 18 GB each, in the best case) and a total dataset size of more than 3 TB.</div><div>I'm having a problem uploading this data: the uploader always shows me an "entity too large" message when I try to upload any of it.</div><div>I'm using GeoNode 2.10.x, installed via Docker.</div><div>Does anyone have a solution to this problem?</div></div>
_______________________________________________<br>
geonode-users mailing list<br>
<a href="mailto:geonode-users@lists.osgeo.org" target="_blank">geonode-users@lists.osgeo.org</a><br>
<a href="https://lists.osgeo.org/mailman/listinfo/geonode-users" rel="noreferrer" target="_blank">https://lists.osgeo.org/mailman/listinfo/geonode-users</a><br>
</blockquote></div></div></div></div>
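The three steps Francesco describes could look roughly like this on the host. This is only a sketch: the service names (`geoserver`, `django`) and paths follow a typical GeoNode docker-compose setup and may differ in your deployment, and the raster filename is purely illustrative.

```shell
# 1. Make the raster available inside the geoserver container's filesystem.
#    (For a 3 TB dataset, mounting the host directory as a volume in
#    docker-compose.yml is preferable to copying; docker cp shown for brevity.)
docker cp /data/huge_raster.tif geoserver:/geoserver_data/data/huge_raster.tif

# 2. Publish it manually from the GeoServer web GUI:
#    Stores -> Add new Store -> GeoTIFF, in the "geonode" workspace,
#    pointing at file:/geoserver_data/data/huge_raster.tif
#    (done in the browser, not on the command line).

# 3. Run updatelayers in the django container so GeoNode registers the
#    new GeoServer layer (see [1] for the command's options).
docker-compose exec django python manage.py updatelayers
```

Since this path bypasses the GeoNode uploader entirely, the "entity too large" limit on HTTP uploads never comes into play.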