[GRASS5] [bug #3514] (grass) r.surf.contour - dozens of warnings
Hamish
hamish_nospam at yahoo.com
Wed Mar 1 02:20:06 EST 2006
> > this bug's URL: http://intevation.de/rt/webrt?serial_num=3514
>
> > Subject: r.surf.contour - dozens of warnings
>
> > GRASS 6.1.cvs (caves_utm33):~ > r.surf.contour
> > input=melio_poz_clear_rast output=melio_poz_clear_rast_10_dem2
> >
> > WARNING: segment_pagein: No such file or directory
>
> > Although it completes, it looks scary. What is it?
>
>     if (read (SEG->fd, SEG->scb[cur].buf, SEG->size) != SEG->size)
>     {
>         G_warning ("segment_pagein: %s\n", strerror(errno));
>         return -1;
>     }
>
> The warning occurs if read() returns a short count (e.g. a read past
> the end of the file) or if it returns an error.
>
> The actual error message is bogus; read() only sets errno if it
> returns a negative result, not if it returns a short count.
I've updated the warning in lib/segment/pagein.c, and added a level 2
debug message stating how much was read & how much was asked for.
This shows that the error that pops up in r.cost (at least for me) is a
short count:
G61> r.cost -vk in= out= start_points=`v.to.points -v coastline` \
percent_memory=50
Null cells excluded from cost evaluation.
Source map is: Integer cell type
20800 rows, 12100 cols.
Creating some temporary files ...
Reading southland_regional_bound ...
D2/2: segment_pagein: read_result=524268 SEG->size=524288
WARNING: segment_pagein: short count during read()
100%
Initializing output
D2/2: segment_pagein: read_result=524268 SEG->size=524288
WARNING: segment_pagein: short count during read()
100%
WARNING: Adapted sites library used for vector points (module should be
updated to GRASS 6 vector library).
[...]
Finding cost path
7%
[...?]
Hamish
ps - r.cost can make lots of big temp files. We should catch ^C etc. &
remove them. In about 10 minutes I just made 12 GB worth....
"top" reports r.cost is using 1948 MB of RAM for a region this big, with
seg=50% and 90k start points. :/ Going into swap makes me sad.
If I try seg=100% then G_malloc() gives an out of memory error right
at the start, which is nice.