> Those that use a large part pay a pretty
> high price in terms of memory and currently
> also time for code points > 0xffff
Emojis are typically above 0xffff. And from this
announcement, it seems emojis are a big part of
keeping up with the AI boom:
> :rocket: Call for Papers: Integrating Logical
> Reasoning & Large Language Models (LLMs) :brain:
>
> https://swi-prolog.discourse.group/t/9065
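The point that emojis sit above 0xffff is easy to check. Here is a minimal Python sketch (Python used purely for illustration, not part of any Prolog system), showing that the rocket and brain emojis are outside the Basic Multilingual Plane and therefore need 4 bytes in UTF-8 and a surrogate pair in UTF-16:

```python
# Emoji code points lie above the Basic Multilingual Plane (BMP),
# i.e. above 0xffff, so a fixed 16-bit cell cannot hold them directly.
rocket = '\U0001F680'  # 🚀
brain = '\U0001F9E0'   # 🧠

for ch in (rocket, brain):
    # Both code points exceed 0xffff.
    print(hex(ord(ch)), ord(ch) > 0xffff)

print(len(rocket.encode('utf-8')))           # 4 bytes in UTF-8
print(len(rocket.encode('utf-16-le')) // 2)  # 2 UTF-16 code units (surrogate pair)
```

This is the memory/time trade-off mentioned above: any string representation built on 16-bit units has to special-case these code points.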
But it would cost you nothing to support this here in library(portray_text):
/* SWI-Prolog 9.3.24 */
?- X = [a,b,c].
X = `abc`
It is extremely trivial to implement; it's not really
rocket science and doesn't need much brains. And
it also works for emojis:
/* Scryer Prolog 0.9.4-411 */
?- X = [a,b,c].
X = "abc".
?- X = ['🚀', a, '🧠', b, c].
X = "🚀a🧠bc".
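The rendering rule both systems apply is simple enough to sketch in a few lines. This is a hypothetical Python model of the idea (not the actual library(portray_text) code), assuming a list "portrays" as a quoted string exactly when every element is a single character:

```python
def portray_text(items):
    # Sketch of the idea: if every element of the list is a single
    # character (one code point), print the list as a double-quoted
    # string, Scryer-style; otherwise fall back to plain list notation.
    if items and all(isinstance(x, str) and len(x) == 1 for x in items):
        return '"' + ''.join(items) + '"'
    return '[' + ','.join(map(str, items)) + ']'

print(portray_text(['a', 'b', 'c']))              # "abc"
print(portray_text(['🚀', 'a', '🧠', 'b', 'c']))  # "🚀a🧠bc"
```

Note that Python counts '🚀' as length 1 (one code point), so characters above 0xffff need no special case here; a Prolog implementation over char atoms gets the same property for free.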
In Scryer Prolog it shows double quotes and not
back quotes, because of the different default settings
of the Prolog flags double_quotes and back_quotes.
Mild Shock wrote:
Inductive logic programming at 30
https://arxiv.org/abs/2102.10556
>
The paper contains not a single reference to autoencoders!
Still, they show this example:
>
Fig. 1 ILP systems struggle with structured examples that
exhibit observational noise. All three examples clearly
spell the word "ILP", with some alterations: 3 noisy pixels,
shifted and elongated letters. If we were to learn a
program that simply draws "ILP" in the middle of the picture,
without noisy pixels and elongated letters, that would
be a correct program.
>
I guess ILP is 30 years behind the AI boom. An early autoencoder
turned into transformer was already reported here (*):
>
SERIAL ORDER, Michael I. Jordan - May 1986
https://cseweb.ucsd.edu/~gary/PAPER-SUGGESTIONS/Jordan-TR-8604-OCRed.pdf
>
Well, ILP might have its merits; maybe we should not ask
for a marriage of LLMs and Prolog, but of autoencoders and ILP.
But it's tricky. I am still trying to decode the da Vinci code of
>
things like stacked tensors: are they related to k-literal clauses?
(*) The paper I referenced is found in this excellent video:
>
The Making of ChatGPT (35 Year History)
https://www.youtube.com/watch?v=OFS90-FX6pg
>