[GRASS-user] i.segment.stats memory usage error
James Duffy
james.philip.duffy at gmail.com
Fri Nov 4 14:08:47 PDT 2016
Sorry to post again on this long thread...
I have just managed to get my hands on a 64-bit OSGeoLive install with 12 GB
of RAM available, and the aforementioned module worked. To reiterate, this was
i.segment.stats run on gp_seg_optimum_clump (the output of i.segment, run
through r.clump). I haven't created the vector map output yet, but the CSV
seems to have been created correctly.
I don't know whether it was the 64-bit system or the extra RAM that made the
difference here.
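For reference, here is roughly what the workflow looks like as a GRASS Python
script. This is a hand-written sketch, not the exact script I ran: the band
list, threshold and intermediate map name are placeholders, and the
i.segment.stats options are written from memory of the addon manual, so please
check them against your version.

    # Sketch of the workflow discussed in this thread (GRASS 7, Python API).
    # Assumes an imagery group 'gp' has already been created with i.group.
    import grass.script as gscript

    # 1. Segment the imagery group.
    gscript.run_command('i.segment', group='gp',
                        output='gp_seg_optimum', threshold=0.05)

    # 2. Run the segments through r.clump so that every segment gets a
    #    unique category value.
    gscript.run_command('r.clump', input='gp_seg_optimum',
                        output='gp_seg_optimum_clump')

    # 3. Per-segment statistics; this is the step that failed with a memory
    #    error on the smaller machine but worked with 64-bit and 12 GB RAM.
    gscript.run_command('i.segment.stats', map='gp_seg_optimum_clump',
                        rasters='band1,band2,band3',
                        csvfile='segment_stats.csv',
                        vectormap='gp_segments_stats')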
James
On 4 November 2016 at 15:34, Markus Metz <markus.metz.giswork at gmail.com>
wrote:
> On Fri, Nov 4, 2016 at 3:51 PM, Moritz Lennert
> <mlennert at club.worldonline.be> wrote:
> > On 04/11/16 14:29, Markus Metz wrote:
> >>
> >> On Wed, Nov 2, 2016 at 5:15 PM, Markus Metz
> >> <markus.metz.giswork at gmail.com> wrote:
> >>>
> >>> On Wed, Nov 2, 2016 at 3:47 PM, Moritz Lennert
> >>> <mlennert at club.worldonline.be> wrote:
> >>>>
> >>>> On 02/11/16 11:05, James Duffy wrote:
> >>>>>
> >>>>>
> >>>>> I can't register with osgeo to submit a bug as no-one has replied
> >>>>> with a 'mantra' for me to do so... what a convoluted bug reporting
> >>>>> system!
> >>>>>
> >>>>> And in your opinion Moritz, does it look like my workflow will not be
> >>>>> possible unless I find a 64bit machine with a decent amount of RAM?
> >>>>
> >>>>
> >>>>
> >>>> I'm not sure about the necessity of 64bit, but more RAM probably.
> >>>>
> >>>> IIUC, all the stats of all the zones are put into memory during the
> >>>> run of the module. Maybe a file-based store of the information might
> >>>> be possible which would avoid this memory usage, but I'm not sure.
> >>>>
> >>>> @MarkusM: could you have a look at this? See
> >>>> https://lists.osgeo.org/pipermail/grass-user/2016-October/075493.html
> >>>> for the beginning of the thread. In short: James seems to be hitting
> >>>> memory limits of r.univar and I'm wondering out loud whether there is
> >>>> something that could be done about this...
> >>>
> >>>
> >>> As Anna wrote, the -e flag increases memory consumption of r.univar
> >>> considerably. A file-based approach for extended stats of r.univar is
> >>> difficult because the number of values per segment is initially
> >>> unknown to r.univar, thus the amount of disk space needed for extended
> >>> stats for each segment is unknown.
> >>>
> >>> I would rather use r.univar for standard stats and r.stats.quantile
> >>> for extended stats.
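(For anyone coming to this thread later: a minimal sketch of the standard-stats
side of that split, with placeholder map names and the option names as I
remember them from the 7.x manual, would be something like the following.)

    # Standard per-segment statistics with r.univar, deliberately without
    # the -e flag: the extended stats (-e) are what require keeping all
    # cell values of every zone in memory.
    import grass.script as gscript

    gscript.run_command('r.univar', flags='t',          # -t: table output
                        map='band1',
                        zones='gp_seg_optimum_clump',
                        separator='comma',
                        output='band1_univar.csv')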
> >>
> >>
> >> There were some bugs in r.stats.quantile, fixed in r69770-2 in all G7
> >> branches.
> >
> >
> > Great, thanks !
> >
> >>
> >> About the limit to MAX_CATS categories in the base map (ticket #3198),
> >> there is not really a reason for it. I guess it is meant to protect
> >> against out-of-memory errors.
> >
> >
> > Does this mean we can just take out the lines referencing it?
>
> Maybe replace the error with a warning. I tested with a base map with
> 1.4 million categories (output of i.segment) and it works, but I
> needed to reduce the number of bins from 1000 to 100 to avoid OOM.
> With a low number of bins, the memory consumption of r.stats.quantile
> is much less than what r.univar requires, but a low number of bins can
> in some cases (e.g. when the histogram of the cell values has a peak)
> lead to higher memory consumption.
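(Again as a note for the archive: a corresponding r.stats.quantile call with
the reduced number of bins would look roughly like this; option names are as
I remember them from the 7.x manual, and the percentile values and map names
are just examples.)

    # Extended per-segment statistics (percentiles) with r.stats.quantile.
    # Instead of storing every cell value per segment, it first bins the
    # cover values per base category, so the memory for the bins scales with
    # the number of categories times the number of bins; reducing bins from
    # the default 1000 to 100 therefore lowers memory use.
    import grass.script as gscript

    gscript.run_command('r.stats.quantile',
                        base='gp_seg_optimum_clump',
                        cover='band1',
                        percentiles='10,50,90',
                        bins=100,
                        output='band1_p10,band1_p50,band1_p90')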
>
> r.univar does memory reallocation on demand, which means that for a
> short time double the amount of memory is needed. r.univar2 makes a
> guess about how much memory is required and then allocates it all at
> once. Still, the binning approach of r.(stats.)quantile usually needs
> less memory.
>
> Markus M
>