[GRASS-dev] graph function input limitations

Paulo van Breugel p.vanbreugel at gmail.com
Wed Nov 21 13:41:02 PST 2012


Hi Glynn,

Your suggestion to use r.recode was spot on. It not only avoids the
limitations mentioned, it is also faster (and much simpler).

I am still wondering about these apparent memory limitations in the
r.mapcalc graph function, but that is just curiosity; with your
alternative solution, it is no longer an issue for me.

Thanks

Paulo



On Wed, Nov 21, 2012 at 4:28 PM, Paulo van Breugel
<p.vanbreugel at gmail.com> wrote:

>
>
> On Wed 21 Nov 2012 03:39:44 PM CET, Glynn Clements wrote:
>
>>
>> Paulo van Breugel wrote:
>>
>>> I am working with the r.mapcalc graph function. When the input (number
>>> of xy pairs) is very high, I am getting an error message:
>>>
>>> "memory exhausted
>>> Parse error
>>> ERROR: parse error"
>>>
>>> The largest possible number of xy pairs seems to be somewhere between
>>> 2400 and 2500. It runs with 2400 xy pairs, and pretty fast for that
>>> matter. Memory usage doesn't go beyond 1.5GB with 2400 xy pairs, while I
>>> have 12GB RAM (on a 64-bit Linux computer), so I guess RAM isn't the
>>> limitation here.
>>>
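
For context, a graph() call has the shape sketched below: it
interpolates linearly through the listed x,y pairs, so 2400 pairs mean
4800 numeric literals in a single expression (the map name and values
are placeholders, not the actual command):

    r.mapcalc "outmap = graph(inmap, 0.0,0.0, 0.5,0.25, 1.0,1.0)"
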
>>
>> Each "term" in an r.mapcalc expression requires a row buffer, i.e. one
>> int/float/double value for each column of the map. A numeric literal
>> results in a row buffer filled with that value.
>>
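
(To put rough numbers on this: with double-precision buffers each term
costs n_cols x 8 bytes, so the 4800 literals of a 2400-pair graph()
call on a region of roughly 40,000 columns would come to about

    4800 x 40000 x 8 bytes ~= 1.5 GB

The 40,000-column figure is an assumption, chosen only to show how the
reported 1.5GB could arise; the actual region size was not given.)
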
>> Depending upon what you're trying to do, it might be possible to use a
>> combination of r.mapcalc -> r.recode -> r.mapcalc. r.recode would use
>> significantly less memory than r.mapcalc's graph() function.
>>
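
A minimal sketch of that pipeline, with invented map names, breakpoints,
and pre/post steps (GRASS 6.4's r.recode reads its rules from stdin;
each low:high:new_low:new_high rule is interpolated linearly, i.e. one
graph() segment):

    # 1. any preparatory algebra, kept free of graph()
    r.mapcalc "idx = (band1 + band2) / 2"
    # 2. the piecewise-linear lookup, one rule per segment;
    #    -d forces double-precision output
    r.recode -d input=idx output=scaled << EOF
    0:100:0.0:0.5
    100:200:0.5:0.9
    200:300:0.9:1.0
    EOF
    # 3. any follow-up algebra on the recoded map
    r.mapcalc "result = scaled * 100.0"
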
>
> Thanks for the suggestion; I will see if I can find a way to use r.recode.
>
>
>
>>> In the help page of r.series, there is a mention of a soft limit of 1024
>>> files. Does something similar apply here (but with different numbers)?
>>>
>>> I can actually work around it, using an approximation of the graph
>>> function, but would like to know at what number of xy pairs I should
>>> apply the workaround. Particularly important: is the limit the same for
>>> all users, and if not, what would be a safe limit? Or is there a way to
>>> set this limit from within a script?
>>>
>>
>> The limit would depend upon the number of columns in the current
>> region and the amount of memory available.
>>
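
One way to handle the "safe limit" question from a script is to probe
it empirically for the machine and region at hand; a rough sketch
(assumes an existing raster named inmap; overwrite and cleanup
behaviour may need adjusting for your GRASS version):

    n=100
    while true; do
        # build the pair list "1,1,2,2,...,n,n"
        pairs=$(seq 1 $n | awk '{printf "%d,%d,", $1, $1}' | sed 's/,$//')
        if r.mapcalc "probe = graph(inmap,$pairs)" 2>/dev/null; then
            echo "parses with $n pairs"
            n=$((n + 100))
        else
            echo "fails at $n pairs"
            break
        fi
    done
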
>
> OK, but what I find strange is that using 2400 terms uses less than 2.5
> GB; I would not expect 2500 terms to use more than 12 GB (the amount of
> RAM available) in that case.
> I also tried to run an r.mapcalc with 2600 terms for a map of 15 x 19
> cells. Here, too, I got the message "memory exhausted", which can hardly
> be related to a lack of memory?