Hi,
What could WG17 do to prevent segregation?
It could specify:
- The back_quotes flag. Not really something
new; most Prolog systems have it already.
- The [X] evaluable function. Not really something
new; most Prolog systems have it already. For
example DEC-10 Prolog (10 November 1982) had it.
The new thing for some Prolog systems would be
its non-strict evaluation strategy and the dual
use (see the sketch after this list):
[X] (a list of just one element) evaluates to X if X is an
integer. Since a quoted string is just a list of integers,
this allows a quoted character to be used in place of its
ASCII code; e.g. "A" behaves within arithmetic expressions
as the integer 65.
https://userweb.fct.unl.pt/~lmp/publications/online-papers/DECsystem-10%20PROLOG%20USER%27S%20MANUAL.pdf
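
As a concrete illustration of both points, a minimal
sketch in SWI-Prolog syntax (the flag value codes and
the one-element list evaluation are as found in
SWI-Prolog; details vary across systems):

% Make back-quoted text read as a list of character codes:
:- set_prolog_flag(back_quotes, codes).

?- `abc` == [0'a, 0'b, 0'c].
true.

% The [X] evaluable function: a one-element list evaluates
% to its element, so a quoted character can be used in
% arithmetic, as in the DEC-10 manual example:
?- N is [0'A].
N = 65.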
Instead, what is WG17 doing?
- Introducing a notation for open strings:
[a, b, c|X] = "abc" || X
With a new separator ||, possibly giving Prolog
system implementors much more headache than a flag
and an evaluable function (a rough equivalent with
standard predicates is sketched below).
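
For comparison, what "abc" || X would denote is just
the partial list [a, b, c|X], which standard append/3
can already build today (the || syntax itself is only
the proposal, not existing practice):

?- append([a, b, c], X, S).
S = [a, b, c|X].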
Bye
Mild Shock wrote:
Oops, should read:

0'0 =< [C], [C] =< 0'9, Digit is [C]-0'0.
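Wrapped into a predicate, the corrected goal would read
as below; a sketch that assumes a system with the [X]
evaluable function, and the predicate name digit_value
is made up here:

% Succeeds if C is a decimal digit, giving its value;
% needs the [X] evaluable function discussed above:
digit_value(C, Digit) :-
    0'0 =< [C], [C] =< 0'9,
    Digit is [C] - 0'0.

% e.g. ?- digit_value(0'7, Digit). gives Digit = 7,
% and with the dual use also ?- digit_value('7', Digit).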
Mild Shock wrote:
What is holy is only for Dogelog Player!

Do not give dogs what is holy, and do not
throw your pearls before pigs, lest they
trample them underfoot and turn to attack you.
-- Matthew 7:6
https://www.biblegateway.com/passage/?search=Matthew%207%3A6

I have deleted my posts and the swi2.pl.log proposal:

between(0'0, 0'9, C), Digit is C-0'0.

Just rewrite it to:

0'0 =< [Digit], [Digit] =< 0'9, [Digit] is C-0'0.

The [X] in an evaluation is dual use again:

?- X is [a].
X = 97.

?- X is [0'a].
X = 97.

Mild Shock wrote:
Inductive logic programming at 30
https://arxiv.org/abs/2102.10556

The paper contains not a single reference to autoencoders!
Still they show this example:

Fig. 1: ILP systems struggle with structured examples that
exhibit observational noise. All three examples clearly
spell the word "ILP", with some alterations: 3 noisy pixels,
shifted and elongated letters. If we were to learn a
program that simply draws "ILP" in the middle of the picture,
without noisy pixels and elongated letters, that would
be a correct program.

I guess ILP is 30 years behind the AI boom. An early autoencoder
turned into a transformer was already reported here (*):

SERIAL ORDER, Michael I. Jordan - May 1986
https://cseweb.ucsd.edu/~gary/PAPER-SUGGESTIONS/Jordan-TR-8604-OCRed.pdf

Well, ILP might have its merits; maybe we should not ask
for a marriage of LLMs and Prolog, but of autoencoders and ILP.
But it's tricky: I am still trying to decode the da Vinci code of
things like stacked tensors; are they related to k-literal clauses?
The paper I referenced is found in this excellent video:

The Making of ChatGPT (35 Year History)
https://www.youtube.com/watch?v=OFS90-FX6pg