Subject : Re: DDD correctly emulated by HHH is correctly rejected as non-halting.
From : polcott333 (at) *nospam* gmail.com (olcott)
Groups : comp.theory
Date : 11. Jul 2024, 22:56:09
Organisation : A noiseless patient Spider
Message-ID : <v6pgt9$2kc07$1@dont-email.me>
References : 1 2 3 4 5 6 7 8
User-Agent : Mozilla Thunderbird
On 7/11/2024 3:19 PM, joes wrote:
On Thu, 11 Jul 2024 10:05:58 -0500, olcott wrote:
On 7/11/2024 9:25 AM, joes wrote:
On Thu, 11 Jul 2024 09:10:24 -0500, olcott wrote:
On 7/11/2024 1:25 AM, Mikko wrote:
On 2024-07-10 17:53:38 +0000, olcott said:
On 7/10/2024 12:45 PM, Fred. Zwarts wrote:
On 10 Jul 2024 at 17:03, olcott wrote:
Unneeded complexity. It is equivalent to:
int main()
{
    return HHH(main);
}
Every time any HHH correctly emulates DDD, it calls the x86utm
operating system to create a separate process context with its own
memory, virtual registers, and stack; thus each recursively emulated
DDD is a different instance.
However, each of those instances has the same sequence of
instructions, to which the x86 language assigns the same operational
meaning.
*That is counter-factual*
Contradicting yourself? "Counterfactual" usually means "if it were
different".
When DDD is correctly emulated by HHH according to the semantics of
the x86 programming language, HHH must abort its emulation of DDD, or
both HHH and DDD never halt.
If the recursive call to HHH from DDD halts, the outer HHH doesn't need
to abort.
Sure and when squares are round you can measure the radius of a square.
Do you mean that HHH doesn't halt?
DDD depends totally on HHH; it halts exactly when HHH does.
Which it does, because it aborts.
Halting means reaching its own last instruction and terminating
normally.
What does HHH do after it aborts?
When DDD is correctly emulated by HHH1 according to the semantics of
the x86 programming language, HHH1 need not abort its emulation of DDD
because HHH has already done this.
Where does HHH figure into this? It is not the simulator here.
The behavior of DDD emulated by HHH1 is identical to the behavior of
the directly executed DDD().
At last!
HHH must abort its simulation. HHH1 does not need to do that because HHH
has already done this.
No, HHH1 doesn't need to because DDD is just a regular program to it,
not constructed to be unsimulatable.
DDD correctly simulated by HHH has provably different behavior than DDD
correctly simulated by HHH1.
Which means that HHH is not doing the simulation correctly.
When HHH simulates DDD according to the semantics of
the x86 language, then HHH is simulating correctly.
When people disagree with the semantics of the x86
language THEY ARE WRONG !!!
--
Copyright 2024 Olcott
"Talent hits a target no one else can hit; Genius hits a target no one else can see." Arthur Schopenhauer