Subject: Re: Incorrect requirements --- Computing the mapping from the input to HHH(DD)
From: news.x.richarddamon (at) *nospam* xoxy.net (Richard Damon)
Newsgroups: comp.theory
Date: 09 May 2025, 18:59:05
Organization: A noiseless patient Spider
Message-ID: <vvlhrl$2vfvk$1@dont-email.me>
User-Agent: Mozilla Thunderbird
On 5/8/25 10:23 PM, Keith Thompson wrote:
Richard Damon <richard@damon-family.org> writes:
On 5/8/25 7:53 PM, olcott wrote:
[...]
void DDD()
{
  HHH(DDD);
  return;
}
We don't need to look at any of my code for me to totally prove my point. For example, when the above DDD is correctly simulated by HHH, this simulated DDD cannot possibly reach its own "return" instruction.
>
And thus not correctly simulated.
>
Sorry, there is no "OS Exemption" to correct simulation.
Perhaps I've missed something. I don't see anything in the above that
implies that HHH does not correctly simulate DDD. Richard, you've read
far more of olcott's posts than I have, so perhaps you can clarify.
If we assume that HHH correctly simulates DDD, then the above code is
equivalent to:
void DDD()
{
  DDD();
  return;
}
which is a trivial case of infinite recursion. As far as I can tell,
assuming that DDD() is actually called at some point, neither the
outer execution of DDD nor the nested (simulated) execution of DDD
can reach the return statement. Infinite recursion might either
cause a stack overflow and a probable program crash, or an unending
loop if the compiler implements tail call optimization.
I see no contradiction, just an uninteresting case of infinite
recursion, something that's well understood by anyone with a
reasonable level of programming experience. (And it has nothing to
do with the halting problem as far as I can tell, though of course
olcott has discussed the halting problem elsewhere.)
Richard, what am I missing?
You are missing the equivocation he is using on what "DDD()" is.
First, he tries to define it as just the code of the function itself, not including any of the code that it calls. He does this so that all the various HHHs he talks about are given the "same" input.
Then he also tries to say that when those functions look at DDD, they can follow the memory chain to the functions that it calls, which weren't actually part of the input.
This means the "behavior" of his input isn't actually defined by the input.
He has also, to get around other objections about what he is doing, stipulated that his functions must be pure functions, and thus dependent only on their direct input; otherwise we could add the following code to the beginning of HHH to make his statement false:
int HHH(void (*p)()) {
  static int flag = 0;
  if (flag) return 0;
  flag = 1;
  /* then the rest of the code that he uses to simulate the input */
}
Such an HHH obviously can "correctly simulate" a call to itself in the same memory space all the way to the return of DDD.
But, since HHH has been stipulated to be a pure function, it can't access memory that wasn't part of its input, so his first definition can't be used; and when we use the second definition, it is clear that each of his HHHs gets a different input. His ploy is to claim that the HHH which does correctly emulate its input, and thereby shows that "the" DDD is non-halting, also shows that the HHH which does abort, and was given the "same" input, can reach that same conclusion.
Of course, different inputs can behave differently, and thus he is just projecting: his HHH(DDD) is like a sum(2,3) that returned 7 as the sum of 2 + 5, because it "changed" the input into something it wasn't.
His goal is to get this "fact" agreed to in the abstract case, so he can claim it must also be true in his concrete case.