Sujet : File processing (was Event loop and http::geturl)
De : nospam.nurdglaw (at) *nospam* gmail.com (Alan Grunwald)
Groupes : comp.lang.tcl
Date : 26 Jun 2025, 09:30:03
Organisation : A noiseless patient Spider
Message-ID : <103j0gi$3bup4$1@dont-email.me>
References : 1 2 3 4 5 6 7 8 9
User-Agent : Mozilla Thunderbird
On 26/06/2025 02:52, et99 wrote:
On 6/25/2025 2:32 PM, Rich wrote:
Jonathan Kelly <jonkelly@fastmail.fm> wrote:
proc queue {} {
    set ::input [open "|cat test.txt" r]
    fconfigure $::input -blocking 0 -buffering line
    fileevent $::input readable [list check $::input]
}
Curious why you are opening a pipe to cat, having cat read and print
the contents, and then consuming that, when you can just open test.txt
directly:
set ::input [open test.txt r]
And achieve the same result.
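A minimal sketch of that direct-open version, keeping the fileevent wiring from the original post. The body of check here is only a placeholder assumption, since the original check proc wasn't shown:

```tcl
# Placeholder check proc: gets a line when one is available and
# closes the channel at end of file. The real work would replace
# the puts.
proc check {chan} {
    if {[gets $chan line] >= 0} {
        puts "got: $line"        ;# stand-in for the real processing
    } elseif {[eof $chan]} {
        close $chan
        set ::done 1             ;# let a waiting vwait return
    }
}

proc queue {} {
    set ::input [open test.txt r]
    fconfigure $::input -blocking 0 -buffering line
    fileevent $::input readable [list check $::input]
}
```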
I was also curious about this. But I'm also wondering why this is event-driven at all. Why not simply, in pseudo code:
while 1 {
    read a line
    if end of file, break
    geturl
    do something with the url results
}
If there's also a GUI that the OP wants to keep alive, it should not be starved: the synchronous form of geturl calls vwait internally, which allows GUI events to be processed while waiting for the url request to complete.
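The synchronous loop above might look like this in Tcl. One URL per line in test.txt is an assumed file format, and the puts is a stand-in for "do something with the url results":

```tcl
package require http

# Sketch of the synchronous version: read URLs line by line and
# fetch each one with the blocking form of http::geturl.
proc fetch_all {path} {
    set f [open $path r]
    while {[gets $f url] >= 0} {
        if {$url eq ""} continue        ;# skip blank lines
        set tok [http::geturl $url]     ;# synchronous form; calls vwait
                                        ;# internally, so GUI events are
                                        ;# still serviced while waiting
        puts "[http::status $tok] $url" ;# stand-in for the real work
        http::cleanup $tok
    }
    close $f
}
```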
-e
My pseudo code is generally
while !eof {
    read a line
    if line is not empty {
        do stuff
    }
}
I used to be puzzled about why I needed the test for non-emptiness. I never worked out why; nowadays I simply accept that that's the way of things and do it.
It might be that I come from a VMS background and assume that there's a newline at the end of each "record" in the file (which I find convenient because without one, the shell prompt gets appended to the last line when using cat at the command line). Have I missed something? Is there something inherently daft about my preconceptions?
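For what it's worth, the extra test seems to fall out of how [eof] behaves: the flag only becomes true after a read attempt actually hits end of file, so a file that ends in a newline yields one final empty "line" from gets before the while test sees eof. A small demonstration (demo.txt is just a throwaway example file):

```tcl
# Write a two-line file ending in a newline, then read it back
# with eof as the loop condition.
set f [open demo.txt w]
puts $f "first"
puts $f "second"
close $f

set f [open demo.txt r]
set lines {}
while {![eof $f]} {
    gets $f line
    lappend lines $line      ;# the last element will be an empty string
}
close $f
puts $lines                  ;# first second {}
```

Using the return value of gets as the loop condition, `while {[gets $f line] >= 0}`, avoids the spurious empty line, because gets returns -1 exactly when end of file is reached.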