Subject: Re: Buffer contents well-defined after fgets() reaches EOF ?
From: 643-408-1753 (at) *nospam* kylheku.com (Kaz Kylheku)
Newsgroups: comp.lang.c
Date: 14 Feb 2025, 20:34:59
Organization: A noiseless patient Spider
Message-ID: <20250214112225.480@kylheku.com>
User-Agent: slrn/pre1.0.4-9 (Linux)
On 2025-02-14, Keith Thompson <Keith.S.Thompson+u@gmail.com> wrote:
> Kaz Kylheku <643-408-1753@kylheku.com> writes:
> [...]
>> It would be good if fgets nuked the terminating newline.
>>
>> Many uses of fgets, after every operation, look for the newline
>> and nuke it, before doing anything else.
>>
>> There is a nice idiom for that, by the way, which avoids a
>> temporary variable and if test:
>>
>>   line[strcspn(line, "\n")] = 0;
> [...]
>
> Then how do you detect a partial line? That can occur either if
> the last line doesn't have a terminating newline (on systems that
> permit it) or a line that's too long to fit in the array.
I've seen many programs like this that simply don't care. They have
some 'char buf[4096]' and that's that.
In a program that is not required or designed to handle arbitrarily
long lines, you can do something very simple (before applying the
above line[strcspn(line, "\n")] = 0 statement):
- zero-initialize the buffer.
- after every call to fgets, inspect the value of the second-to-last
  array element. If the value is neither zero nor '\n', then somehow
  diagnose that a too-long line has been presented to the program,
  contrary to its documented limitations.
This check will yield a false positive on an unterminated last line.
That issue can be documented as a limitation, or else the buffer can
be sized one element larger than what the documented line length limit
requires, so that the program allows inner lines to be one character
longer than the documented limit, but is strict with regard to an
unterminated last line.
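
Here is a rough sketch of how the check might look, assuming a
hypothetical 4096-byte buffer, input from stdin, and a program that
simply bails out when it sees an over-long line:

  #include <stdio.h>
  #include <string.h>

  int main(void)
  {
      /* static storage => the buffer starts out zero-filled;
         4096 stands in for whatever limit the program documents. */
      static char line[4096];

      while (fgets(line, sizeof line, stdin) != NULL) {
          /* Second-to-last element: if it is neither 0 (untouched, or
             the null that fgets stored) nor '\n', then fgets filled
             the whole buffer without finding a newline, i.e. the line
             is longer than this program supports. */
          if (line[sizeof line - 2] != 0 && line[sizeof line - 2] != '\n') {
              fprintf(stderr, "input line too long\n");
              return 1;
          }

          line[strcspn(line, "\n")] = 0;   /* nuke the newline, if any */

          /* ... process line ... */
      }

      return 0;
  }

The check itself stays the same if the buffer is made one element
larger than the limit strictly requires, as described above.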
--
TXR Programming Language: http://nongnu.org/txr
Cygnal: Cygwin Native Application Library: http://kylheku.com/cygnal
Mastodon: @Kazinator@mstdn.ca