Oops, it should read:
0'0 =< [C], [C] =< 0'9, Digit is [C]-0'0.
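For context, a minimal sketch of how that corrected goal could be
packaged (digit_value/2 is a hypothetical name, and it relies on the
dual-use [C] evaluation quoted below):

digit_value(C, Digit) :-
   0'0 =< [C], [C] =< 0'9,
   Digit is [C]-0'0.

?- digit_value(0'7, D).
D = 7.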
Mild Shock wrote: What is holy is only for Dogelog Player!
>
Do not give dogs what is holy, and do not
throw your pearls before pigs, lest they
trample them underfoot and turn to attack you.
-- Matthew 7:6
https://www.biblegateway.com/passage/?search=Matthew%207%3A6
>
I have deleted my posts and the swi2.pl.log proposal:
>
between(C, 0'0, 0'9), Digit is C-0'0.
>
Just rewrite it to:
>
0'0 =< [Digit], [Digit] =< 0'9, [Digit] is C-0'0.
>
The [X] in an evaluation is dual use again:
>
?- X is [a].
X = 97.
>
?- X is [0'a].
X = 97.
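Assuming this dual-use evaluation, the rewritten guard accepts a
character atom as well as a character code, which the deleted
between/3 version would not (between/3 requires integers), e.g. with
the hypothetical digit_value/2 sketch from above:

?- digit_value('7', D).
D = 7.

?- digit_value(0'7, D).
D = 7.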
>
>
Mild Shock wrote:
Inductive logic programming at 30
https://arxiv.org/abs/2102.10556
>
The paper contains not a single reference to autoencoders!
Still, they show this example:
>
Fig. 1 ILP systems struggle with structured examples that
exhibit observational noise. All three examples clearly
spell the word "ILP", with some alterations: 3 noisy pixels,
shifted and elongated letters. If we were to learn a
program that simply draws "ILP" in the middle of the picture,
without noisy pixels and elongated letters, that would
be a correct program.
>
I guess ILP is 30 years behind the AI boom. An early autoencoder
turned into a transformer was already reported here (*):
>
SERIAL ORDER, Michael I. Jordan - May 1986
https://cseweb.ucsd.edu/~gary/PAPER-SUGGESTIONS/Jordan-TR-8604-OCRed.pdf
>
Well, ILP might have its merits; maybe we should not ask
for a marriage of LLMs and Prolog, but of autoencoders and ILP.
But it's tricky: I am still trying to decode the da Vinci code of
things like stacked tensors. Are they related to k-literal clauses?
(*) The paper I referenced is found in this excellent video:
>
The Making of ChatGPT (35 Year History)
https://www.youtube.com/watch?v=OFS90-FX6pg
>