Science is not prepared for the AI Revolution (Was: Prolog totally missed the AI Boom)

Subject : Science is not prepared for the AI Revolution (Was: Prolog totally missed the AI Boom)
From : janburse (at) *nospam* fastmail.fm (Mild Shock)
Newsgroups : comp.lang.prolog
Date : 29. Jun 2025, 15:35:37
Message-ID : <103rivn$1frqg$1@solani.org>
References : 1
User-Agent : Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:128.0) Gecko/20100101 Firefox/128.0 SeaMonkey/2.53.21
Hi,
How it started, total humbug:
Quantentheorie und "Ich" (Quantum Theory and the "I") - Gary B. Schmid (2017)
https://cropfm.at/archive/show/quantenich
How it's going, a little better:
Determined: A Science of Life Without Free Will - Robert M. Sapolsky (2023)
https://www.amazon.de/dp/0525560971
Bye
Mild Shock wrote:
 Inductive logic programming at 30
https://arxiv.org/abs/2102.10556
 The paper contains not a single reference to autoencoders!
Still they show this example:
 Fig. 1 ILP systems struggle with structured examples that
exhibit observational noise. All three examples clearly
spell the word "ILP", with some alterations: 3 noisy pixels,
shifted and elongated letters. If we were to learn a
program that simply draws "ILP" in the middle of the picture,
without noisy pixels and elongated letters, that would
be a correct program.
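 To make the caption concrete, here is a minimal sketch of what
such a "correct program" could look like, assuming SWI-Prolog's
library(lists) for nth0/3 and member/2; the grid layout, the glyph
shapes and the predicate names glyph/2 and pixel/2 are invented
for illustration:

% Glyph shapes as 3x5 dot matrices, given as DX-DY offsets.
glyph(i, [0-0,1-0,2-0, 1-1, 1-2, 1-3, 0-4,1-4,2-4]).
glyph(l, [0-0, 0-1, 0-2, 0-3, 0-4,1-4,2-4]).
glyph(p, [0-0,1-0,2-0, 0-1,2-1, 0-2,1-2,2-2, 0-3, 0-4]).

% pixel(X, Y) holds exactly for the pixels of "ILP" drawn at
% column 2, row 1, with 4 columns per letter. Noisy pixels are
% not covered by any clause, so they are never entailed.
pixel(X, Y) :-
    nth0(I, [i, l, p], G),      % I-th letter of the word
    glyph(G, Dots),
    member(DX-DY, Dots),
    X is 2 + 4*I + DX,
    Y is 1 + DY.

Querying ?- pixel(X, Y). enumerates the clean picture; a query
for one of the 3 noisy pixels simply fails, which is the sense in
which such a program is "correct" despite the noisy examples.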
 I guess ILP is 30 years behind the AI boom. An early autoencoder
turned into a transformer was already reported here (*):
 SERIAL ORDER, Michael I. Jordan - May 1986
https://cseweb.ucsd.edu/~gary/PAPER-SUGGESTIONS/Jordan-TR-8604-OCRed.pdf
 Well, ILP might have its merits; maybe we should not ask
for a marriage of LLMs and Prolog, but of autoencoders and ILP.
But it's tricky: I am still trying to decode the da Vinci code of
things like stacked tensors. Are they related to k-literal clauses?
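 As a purely speculative illustration of that question (nothing
from the paper): a clause body with k literals can at least be
flattened into a k x |vocabulary| matrix of stacked one-hot rows,
which is one plausible reading of "stacked tensors". The
vocabulary and all predicate names below are invented for this
sketch:

% Encode a k-literal clause body as k stacked one-hot rows
% over a fixed predicate vocabulary (invented for illustration).
vocabulary([parent, male, female, sibling]).

% one_hot(+Sym, +Vocab, -Row): 1 at Sym's position, 0 elsewhere.
one_hot(_, [], []).
one_hot(Sym, [V|Vs], [B|Bs]) :-
    ( Sym == V -> B = 1 ; B = 0 ),
    one_hot(Sym, Vs, Bs).

% clause_tensor(+Body, -Tensor): one row per body literal.
clause_tensor(Body, Tensor) :-
    vocabulary(Vocab),
    clause_rows(Body, Vocab, Tensor).

clause_rows([], _, []).
clause_rows([Lit|Lits], Vocab, [Row|Rows]) :-
    functor(Lit, F, _),
    one_hot(F, Vocab, Row),
    clause_rows(Lits, Vocab, Rows).

For example ?- clause_tensor([parent(X,Y), male(X)], T). gives
T = [[1,0,0,0],[0,1,0,0]], a 2-literal body as a 2x4 "tensor".
Whether this toy flattening has anything to do with the stacked
tensors inside real autoencoders is exactly the open question.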
The paper I referenced can be found in this excellent video:
 The Making of ChatGPT (35 Year History)
https://www.youtube.com/watch?v=OFS90-FX6pg
 

Date           #  Author      Subject
22 Feb 25     28  Mild Shock  * Prolog totally missed the AI Boom
22 Feb 25      3  Mild Shock  +* Auto-Encoders as Prolog Fact Stores (Was: Prolog totally missed the AI Boom)
23 Feb 25      1  Mild Shock  i+- Ignorance in ILP circles confirmed (Was: Auto-Encoders as Prolog Fact Stores)
19 Mar 25      1  Mild Shock  i`- Neuro infused logic programming [NILP] (Was: Auto-Encoders as Prolog Fact Stores)
7 Mar 25       1  Mild Shock  +- Last Exit Analogical Resoning (Was: Prolog totally missed the AI Boom)
25 Mar 25      3  Mild Shock  +* A software engineering analyis why Prolog fails (Was: Prolog totally missed the AI Boom)
27 Mar 25      2  Mild Shock  i`* Lets re-iterate software engineering first! (Was: A software engineering analyis why Prolog fails)
27 Mar 25      1  Mild Shock  i `- Re: Lets re-iterate software engineering first! (Was: A software engineering analyis why Prolog fails)
23 Jun 25     12  Mild Shock  +* No Coders completely Brain Dead (Was: Prolog totally missed the AI Boom)
23 Jun 25     11  Mild Shock  i`* Unicode and atom length=1 (Was: No Coders completely Brain Dead)
23 Jun 25     10  Mild Shock  i `* Most radical approach is Novacore from Dogelog Player (Was: Unicode and atom length=1)
23 Jun 25      6  Mild Shock  i  +* SWI-Prolog master not wide awake, doing day-sleeping (Was: Most radical approach is Novacore from Dogelog Player)
23 Jun 25      1  Mild Shock  i  i+- Re: SWI-Prolog master not wide awake, doing day-sleeping (Was: Most radical approach is Novacore from Dogelog Player)
23 Jun 25      1  Mild Shock  i  i+- The beauty of a double hook (Was: SWI-Prolog master not wide awake, doing day-sleeping)
23 Jun 25      3  Mild Shock  i  i`* The beauty of a dual use hook (Was: SWI-Prolog master not wide awake, doing day-sleeping)
23 Jun 25      2  Mild Shock  i  i `* maplist(char_code, Chars, Codes) is bidirectional (Was: The beauty of a dual use hook)
23 Jun 25      1  Mild Shock  i  i  `- I really have lost all hope and given up (Was: maplist(char_code, Chars, Codes) is bidirectional)
27 Jun 12:21   3  Mild Shock  i  `* Do Prologers know the Unicode Range? (Was: Most radical approach is Novacore from Dogelog Player)
27 Jun 12:22   2  Mild Shock  i   `* Can Prologers produce 100% Prolog Code? (Was: Do Prologers know the Unicode Range?)
27 Jun 12:36   1  Mild Shock  i    `- Attention: Python versus Java (Was: Can Prologers produce 100% Prolog Code?)
23 Jun 25      5  Mild Shock  +* Do not give dogs what is holy [Matthew 7:6] (Was: Prolog totally missed the AI Boom)
23 Jun 25      4  Mild Shock  i`* Typo:: Do not give dogs what is holy [Matthew 7:6] (Was: Prolog totally missed the AI Boom)
23 Jun 25      3  Mild Shock  i `* What WG17 could do to prevent segregation [DEC-10 Prolog (10 November 1982)] (Was: Typo:: Do not give dogs what is holy)
23 Jun 25      2  Mild Shock  i  `* Avoid the cheap tricks by Scryer Prolog (Was: What WG17 could do to prevent segregation [DEC-10 Prolog (10 November 1982)])
23 Jun 25      1  Mild Shock  i   `- Why tuck the tail in front of a false Messias (Was: Avoid the cheap tricks by Scryer Prolog)
29 Jun 12:32   2  Mild Shock  +* Missed the AI Boom because missed the Emojis (Was: Prolog totally missed the AI Boom)
29 Jun 12:36   1  Mild Shock  i`- Bonus in Trealla Prolog, different Tokenizer (Was: Missed the AI Boom because missed the Emojis)
29 Jun 15:35   1  Mild Shock  `- Science is not prepared for the AI Revolution (Was: Prolog totally missed the AI Boom)
