Dead horse or wake up call? (Was: Spring 2025 Challenge: TicTacToe Transformer)

Subject: Dead horse or wake up call? (Was: Spring 2025 Challenge: TicTacToe Transformer)
From: janburse (at) *nospam* fastmail.fm (Mild Shock)
Newsgroups: comp.lang.prolog
Date: 25 Feb 2025, 09:09:04
Message-ID: <vpjtqv$nli9$1@solani.org>
References: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:128.0) Gecko/20100101 Firefox/128.0 SeaMonkey/2.53.20

Prologers are still on the path of Don Quixote:
 > extremely restrictive setting and the only reason
 > it’s worked so well over the years is that people
 > have persisted at flogging it like the deadest
 > of dead horses
For some it's a dead horse; for others, by way of the
two Nobel Prizes, one for Geoffrey Hinton in Physics and
one for Demis Hassabis in Chemistry, both in 2024,
it's rather a wake-up call.

The current state of affairs in Prolog is that autoencoders
and transformers are not available via ILP; it lacks the
conceptual setting, because it's based on a model of
belief congruence, trying to avoid cognitive dissonance.
Basically ILP adopts Abduction as already conceived by
Charles Sanders Peirce.
He is also the originator of the Existential Graphs on
which Conceptual Graphs are based. The abduction problem
is posed for some background knowledge B and some
observation E, in that the idea is to find a hypothesis
H such that:
Consistency:  B, H |/- f   /* no absurdity */
Completeness: B, H |- E
There is also a refinement with positive and negative
observations E+ and E-, where the hypothesis has to cover
the E+ and must not cover the E-.
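Just to illustrate the setting, here is a minimal sketch in
Prolog. The bird example, the candidate/1 hypothesis space and
the prove/2 meta-interpreter are all made up for this post, and
consistency is approximated by requiring that no negative
observation becomes derivable from B together with H:

:- use_module(library(lists)).

% Background knowledge B (toy example).
background((flies(X) :- bird(X))).
background(bird(tweety)).

% Candidate hypotheses the search may abduce.
candidate(bird(polly)).
candidate(penguin(polly)).

positive(flies(polly)).        % E+
negative(flies(rock)).         % E-

% prove(Goal, H): derive Goal from background B together with the
% hypothesis facts H, via a small meta-interpreter.
prove(true, _) :- !.
prove((A, B), H) :- !, prove(A, H), prove(B, H).
prove(G, H) :- member(G, H).
prove(G, _) :- background(G).
prove(G, H) :- background((G :- Body)), prove(Body, H).

% abduce(H): completeness - all positives derivable; consistency is
% approximated by requiring that no negative becomes derivable.
abduce([H]) :-
    candidate(H),
    forall(positive(E), prove(E, [H])),
    \+ (negative(E), prove(E, [H])).

% ?- abduce(H).
% H = [bird(polly)].
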
The challenge I am positing is to get some hands-on
experience and see what the merits of autoencoders and
transformers are, and maybe to see whether there is a possible
marriage of autoencoders and transformers with ILP. The
challenge here is that autoencoders and transformers have
no concept of absurdity. The main features of extrapolation
in autoencoders and transformers are the following (a toy
sketch follows the list):
- Inferencing:
   The autoencoder might also tolerate deviations in
   the input that are not in the training data, giving
   it some inferential capability.
- Generation:
   And then it can choose an output that is, again, not in
   the training data, giving it some generative capability.
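As a toy analogue of these two points, and emphatically not a
real autoencoder: the sketch below memorises two prototype bit
vectors as its "training data", reconstructs an unseen noisy
input as the nearest prototype (inferencing), and then perturbs
that reconstruction into a vector that is not in the training
data either (generation). All predicate names are invented:

:- use_module(library(lists)).

% "Training data": two memorised prototype bit vectors.
prototype([1,1,1,0,0,0]).
prototype([0,0,0,1,1,1]).

% Hamming distance between two equal-length bit vectors.
hamming([], [], 0).
hamming([X|Xs], [Y|Ys], D) :-
    hamming(Xs, Ys, D0),
    ( X == Y -> D = D0 ; D is D0 + 1 ).

% "Inferencing": an input that deviates from the training data
% is still reconstructed, namely as the closest prototype.
reconstruct(Input, Proto) :-
    findall(D-P, (prototype(P), hamming(Input, P, D)), Pairs),
    keysort(Pairs, [_-Proto|_]).

% "Generation": flip one bit of the reconstruction, giving an
% output that does not occur in the training data either.
flip_one([B|Bs], [F|Bs]) :- F is 1 - B.
flip_one([B|Bs], [B|Fs]) :- flip_one(Bs, Fs).

generate(Input, Output) :-
    reconstruct(Input, Proto),
    flip_one(Proto, Output).

% ?- reconstruct([1,1,0,0,0,1], P).
% P = [1,1,1,0,0,0].
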
There is no measurement against absurdity in the
inferencing and no measurement against absurdity in
the generation. This is also seen in practice: when you
interact with ChatGPT, it can hallucinate unicorns, and
it can even make mistakes within the hallucination, like
believing there are white chestnut unicorns.
So the following is possible:
   There are unicorns
   There are white chestnut unicorns
I see it as a chance that absurdity is possible in
autoencoders and transformers, for many reasons,
especially from my interest in paraconsistent logics.
You can already not assume that training data is
consistent. That there is no ex falso explosion in this
type of autoencoder and transformer machine learning
is rather a benefit than a curse, and somehow gives a
neat solution to many problems where ILP might
fail by design because it is too strict.
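To make the contrast concrete, here is a hedged sketch, again
with invented predicates and not taken from any ILP system: a
strict acceptance test in the ILP style fails outright on
contradictory examples, while a soft score merely ranks
hypotheses lower when they cover a contradiction:

:- use_module(library(lists)).
:- use_module(library(aggregate)).

% Contradictory "training data": one example is both positive
% and negative.
positive(colour(u1, white)).
positive(colour(u2, chestnut)).
negative(colour(u1, white)).

% Candidate hypotheses, given here simply as sets of ground facts.
candidate([colour(u1, white), colour(u2, chestnut)]).
candidate([colour(u1, white)]).

% Strict acceptance: cover every positive, cover no negative.
% On the inconsistent data above this fails for every candidate.
strict(H) :-
    candidate(H),
    forall(positive(E), member(E, H)),
    \+ (negative(E), member(E, H)).

% Soft score: positives covered minus negatives covered.
% Contradictions only lower the score, nothing "explodes".
score(H, S) :-
    candidate(H),
    aggregate_all(count, (positive(E), member(E, H)), Pos),
    aggregate_all(count, (negative(E), member(E, H)), Neg),
    S is Pos - Neg.

% ?- strict(H).    fails.
% ?- score(H, S).  H = [colour(u1,white), colour(u2,chestnut)], S = 1 ;
%                  H = [colour(u1,white)], S = 0.
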
See also:
https://de.wikipedia.org/wiki/Geoffrey_Hinton
https://de.wikipedia.org/wiki/Demis_Hassabis
https://en.wikipedia.org/wiki/Abductive_reasoning#Abduction
Mild Shock wrote:
 > Very simple challenge conceptually: develop the idea
 > of Centipawn towards TicTacToe and implement the
 > game based on learning / training a transformer, and
 > then executing it. All written in Prolog itself! Optional
 > bonus exercise: make the execution NNUE style, i.e.
 > incremental evaluation of the transformer.
 >
 > Centipawn - Chess Wiki
 > https://chess.fandom.com/wiki/Centipawn
 >
 > NNUE - Chess Programming Wiki
 > https://www.chessprogramming.org/NNUE

