On 17/03/2024 19:28, Malcolm McLean wrote:
In this case it's very short code and easy to see that it is right, so a win for recursion. Except that it is only right if the stack is bigger than N/2 calls deep, where N is the number of pixels in the image. Now a 100x100 worked fine on my machine - I just checked the main stack, and it's 8MB by default. But of course the bigger the machine, the bigger the image the user might want to load.

On 17/03/2024 16:45, David Brown wrote:
Unfortunately, I am still fallible - /easier/ does not mean I'll get it right :-( And I apologise for unhelpfully rushing that and getting it wrong.

On 16/03/2024 16:09, Malcolm McLean wrote:
Except it is not. You didn't give the right answer for the space requirements.

>The OP's code is simple and obvious, as is its correctness (assuming reasonable definitions of the pixel access and setting functions) and its time and space requirements. Yours is not.
However, I stand by my claim that the recursive version is much easier to analyse.
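For concreteness, here is a minimal sketch of the kind of recursive 4-connected fill being discussed - the Image type and helper names are my own illustration, not the OP's actual code:

#include <stddef.h>

/* Hypothetical image type: one byte per pixel, row-major. */
typedef struct {
    unsigned char *pixels;
    int width;
    int height;
} Image;

static int getpixel(const Image *img, int x, int y)
{
    return img->pixels[(size_t)y * img->width + x];
}

static void setpixel(Image *img, int x, int y, int colour)
{
    img->pixels[(size_t)y * img->width + x] = (unsigned char)colour;
}

/* Recursive 4-connected flood fill.  On a pathological region (e.g. a
   serpentine path covering the whole image) the call chain grows in
   proportion to the pixel count, which is where the N/2 stack-depth
   worry and the 8MB default stack come in. */
static void floodfill4(Image *img, int x, int y, int target, int fill)
{
    if (x < 0 || x >= img->width || y < 0 || y >= img->height)
        return;
    if (getpixel(img, x, y) != target || target == fill)
        return;

    setpixel(img, x, y, fill);
    floodfill4(img, x - 1, y, target, fill);
    floodfill4(img, x + 1, y, target, fill);
    floodfill4(img, x, y - 1, target, fill);
    floodfill4(img, x, y + 1, target, fill);
}

The correctness argument is short precisely because each call only looks at one pixel and its four neighbours; the cost is that the space requirement is hidden in the call stack.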
Exactly. If a routine is a leaf, you can cut and paste it and use it where you will. If you have to take subroutines, you've got to explore the code to understand what you need to take, then you have to put them somewhere. So it's better to keep routines leaf if possible and fold a few trivial operations into the code body, even if ideally they would be subroutines. And I understand these trade-offs.

>It's better to have one function. Subroutines have a way of getting lost.

>Seriously? "Subroutines get lost"? So your answer is to put all your ideas in a mixer and scrunch them up until any semblance of logic and structure is lost, and the code works (if it does) by trial and error? And then the whole mess is cut-and-paste duplicated - along with any subtle bugs it might have - for the 8-connected version. And that's better, in your eyes, than re-using code?
I have been most interested in being able to be sure the algorithm is correct, rather than considering its absolute (rather than "big O") efficiency in different systems. It is certainly the case that cache considerations are more relevant now than they used to be on many systems. And for working on PCs, you would likely dispense with your growing stack entirely and simply allocate a queue big enough for every pixel in the image.

That is an idea. But a bit extravagant. I'd like to try to work out how much queue is actually used in typical as well as worst case.
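One way to answer the "how much queue is actually used" question is to instrument an explicit-queue version: allocate width * height slots up front, as suggested, and record the high-water mark. A sketch, assuming the same Image / getpixel / setpixel as in the recursive sketch above:

#include <stdlib.h>

typedef struct { int x, y; } Point;

/* Iterative 4-connected flood fill with a queue big enough for every
   pixel.  Returns the peak number of queued points, so typical versus
   worst-case usage can be measured on real images. */
static size_t floodfill4_queue(Image *img, int x, int y, int target, int fill)
{
    static const int dx[4] = { -1, 1,  0, 0 };
    static const int dy[4] = {  0, 0, -1, 1 };
    size_t capacity = (size_t)img->width * img->height;
    Point *queue = malloc(capacity * sizeof *queue);
    size_t head = 0, tail = 0, peak = 0;

    if (queue == NULL || target == fill ||
        x < 0 || x >= img->width || y < 0 || y >= img->height ||
        getpixel(img, x, y) != target) {
        free(queue);
        return 0;
    }

    setpixel(img, x, y, fill);        /* mark on enqueue, so no pixel enters twice */
    queue[tail++] = (Point){ x, y };

    while (head < tail) {
        if (tail - head > peak)
            peak = tail - head;       /* high-water mark of the queue */
        Point p = queue[head++];
        for (int i = 0; i < 4; i++) {
            int nx = p.x + dx[i], ny = p.y + dy[i];
            if (nx >= 0 && nx < img->width && ny >= 0 && ny < img->height &&
                getpixel(img, nx, ny) == target) {
                setpixel(img, nx, ny, fill);
                queue[tail++] = (Point){ nx, ny };
            }
        }
    }

    free(queue);
    return peak;
}

Because each pixel is marked when it is enqueued, no pixel can be queued twice, so the width * height allocation is a hard upper bound and the non-circular head/tail indexing never overflows.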
I suggested separating the code into functions - that is /definitely/ constructive. I suggested using sensible names for parameters and variables (well, the suggestion was implied by my criticism).

And of course the entire binary image library has a consistent style. And we don't want the user messing about with writing his own getpixel / setpixel functions, though there would be a case for that for a general purpose flood fill.
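That case for a general purpose flood fill might look something like this: the caller supplies the accessors, so the fill routine knows nothing about the image representation. The function-pointer signatures here are purely illustrative, not the binary image library's API:

/* Caller-supplied pixel accessors with an opaque context pointer. */
typedef int  (*GetPixelFn)(void *ctx, int x, int y);
typedef void (*SetPixelFn)(void *ctx, int x, int y, int colour);

/* Same recursive scheme as above, kept only to show the interface
   shape; the same stack-depth caveat applies, and the explicit-queue
   version could be parameterised in exactly the same way. */
static void generic_floodfill4(void *ctx, int width, int height,
                               int x, int y, int target, int fill,
                               GetPixelFn get, SetPixelFn set)
{
    if (x < 0 || x >= width || y < 0 || y >= height)
        return;
    if (get(ctx, x, y) != target || target == fill)
        return;

    set(ctx, x, y, fill);
    generic_floodfill4(ctx, width, height, x - 1, y, target, fill, get, set);
    generic_floodfill4(ctx, width, height, x + 1, y, target, fill, get, set);
    generic_floodfill4(ctx, width, height, x, y - 1, target, fill, get, set);
    generic_floodfill4(ctx, width, height, x, y + 1, target, fill, get, set);
}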
And I am also suggesting now that you allocate a queue that is big enough for every pixel in the image. Much of that space that you never touch will probably never be physically allocated by the OS, depending on page sizes and free memory.
And I would also suggest you drop the requirement for coding in an ancient tongue, and instead switch to reasonably modern C. Make abstractions for the types and the access functions - it will make the code far easier to follow, easier to show correct, and easier to modify and reuse, without affecting efficiency at run-time.
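Something along these lines, perhaps; the type and function names are mine, not the library's:

#include <assert.h>
#include <stdbool.h>
#include <stddef.h>

/* Abstractions of the sort being suggested: a named pixel type, an
   image type, and small inline accessors that centralise the
   indexing and bounds logic. */
typedef bool Pixel;                 /* binary image: set or clear */

typedef struct {
    Pixel *data;                    /* row-major, width * height pixels */
    int width;
    int height;
} BinaryImage;

static inline bool in_bounds(const BinaryImage *img, int x, int y)
{
    return x >= 0 && x < img->width && y >= 0 && y < img->height;
}

static inline Pixel get_pixel(const BinaryImage *img, int x, int y)
{
    assert(in_bounds(img, x, y));
    return img->data[(size_t)y * img->width + x];
}

static inline void set_pixel(BinaryImage *img, int x, int y, Pixel value)
{
    assert(in_bounds(img, x, y));
    img->data[(size_t)y * img->width + x] = value;
}

With the accessors inline and the indexing in one place, the fill routine itself stays as short as the recursive version but reads in terms of the problem rather than raw pointer arithmetic.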