[GRASS-dev] Right way to use G_getl2
wenzeslaus at gmail.com
Thu Sep 12 19:58:53 PDT 2013
this relates to the recent fix for r.in.ascii (r57649). The skip parameter
required one extra line because reading and counting lines with the
G_getl2 function produced one line more than expected.
The documentation of the function says that it reads at most n-1 bytes. In
other words, n is the size of the buffer, so it can hold n-1 useful
characters plus the string-terminating character '\0'. Quite standard.
The problem in v.in.ascii was that the caller passed the previously
determined maximum number of characters on a line in the file, plus one
(space for the terminating character). This number is the same for all
lines, and it works for every line except the longest one.
The reason is that, for the longest line, G_getl2 fills the buffer before
reaching the end-of-line character and leaves that character in the stream.
So, in fact, the longest line is in the buffer and can be processed;
however, the next "line" is then only the leftover end-of-line character.
This is not what the caller expected, and there is no way to find out that
the line was not read completely.
So, finally: what is the right usage of G_getl2? Should the caller use a
buffer size equal to the maximum number of expected characters on a line,
plus the end-of-line character (plus 2 on MS Windows), plus the terminating
character? Or should the caller just pass the n parameter increased by one
(two on MS Windows), since in fact nothing extra will be stored in the
buffer? Or should the caller just allocate a really huge buffer and then
read characters from it if he wants to store
Note 1: Don't be confused by the documentation, since the n parameter
description for G_getl2 is wrong.
Note 2: In the description above I was talking only about Unix line
endings. On Windows we need two more bytes to store the line ending, as
noted in the questions. The case of old Mac OS line endings is not covered
because it is even more tricky in G_getl2.