Subject: Will a decoder-only transformer also work? (Was: Spring 2025 Challenge: TicTacToe Transformer)
From: janburse (at) *nospam* fastmail.fm (Mild Shock)
Newsgroups: comp.lang.prolog
Date: 02. Mar 2025, 03:49:32
Message-ID: <vq0gvr$tuur$1@solani.org>
References: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:128.0) Gecko/20100101 Firefox/128.0 SeaMonkey/2.53.20
Ok, my bad. You can of course also try a decoder-only transformer,
just like in this Python code example:
> **Simple PyTorch Implementation of “Grokking”**
> We trained a standard decoder-only transformer (Vaswani et al., 2017)
>
https://github.com/teddykoker/grokking

The transformer need not necessarily have an encoder and
a latent space. It can also be decoder-only.
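For concreteness, here is a minimal sketch (my own, not the code from the
teddykoker/grokking repo) of what decoder-only means in PyTorch: token and
position embeddings, a stack of causally masked self-attention blocks, and a
linear output head, with no separate encoder and no cross-attention. The
class name and all hyperparameters are made up for illustration.

```python
import torch
import torch.nn as nn

class DecoderOnlyLM(nn.Module):
    def __init__(self, vocab_size=12, d_model=128, nhead=4,
                 num_layers=2, max_len=16):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        # "decoder-only": causally masked self-attention blocks only,
        # no separate encoder stack and no cross-attention
        self.blocks = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, idx):
        _, t = idx.shape
        x = self.tok(idx) + self.pos(torch.arange(t, device=idx.device))
        # upper-triangular -inf mask so each position only attends backwards
        causal = torch.triu(torch.full((t, t), float('-inf'),
                                       device=idx.device), diagonal=1)
        x = self.blocks(x, mask=causal)
        return self.head(x)            # next-token logits at every position

model = DecoderOnlyLM()
logits = model(torch.randint(0, 12, (1, 9)))   # e.g. a sequence of 9 move tokens
print(logits.shape)                            # torch.Size([1, 9, 12])
```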
Mild Shock wrote:
Very simple challenge conceptually: develop the idea
of Centipawn towards TicTacToe and implement the
game based on learning / training a transformer, and
then execute it. All written in Prolog itself! Optional
bonus exercise: make the execution NNUE style, i.e.
incremental evaluation of the transformer.
Centipawn - Chess Wiki
https://chess.fandom.com/wiki/Centipawn
NNUE - Chess Programming Wiki
https://www.chessprogramming.org/NNUE
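
One possible reading of "Centipawn towards TicTacToe" from the quoted
challenge (my own interpretation, not part of the original post) is a static
score in hundredths of a win, which a transformer could then be trained to
predict. A minimal Python sketch, using exhaustive minimax as the oracle:

```python
# Hypothetical centipawn-like score for TicTacToe, from x's point of view:
# +100 = win for x, 0 = draw, -100 = win for o (assumed scale, not from the post).
from functools import lru_cache

LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
         (0, 3, 6), (1, 4, 7), (2, 5, 8),
         (0, 4, 8), (2, 4, 6)]

def winner(board):
    for a, b, c in LINES:
        if board[a] != '.' and board[a] == board[b] == board[c]:
            return board[a]
    return None

@lru_cache(maxsize=None)
def centi_score(board, turn):
    """Exhaustive minimax value of a 9-char board string in 'centi-wins'."""
    w = winner(board)
    if w == 'x':
        return 100
    if w == 'o':
        return -100
    if '.' not in board:
        return 0
    nxt = 'o' if turn == 'x' else 'x'
    scores = [centi_score(board[:i] + turn + board[i + 1:], nxt)
              for i, c in enumerate(board) if c == '.']
    return max(scores) if turn == 'x' else min(scores)

print(centi_score('.........', 'x'))   # 0: perfect play is a draw
print(centi_score('xx.oo....', 'x'))   # 100: x wins by completing the top row
```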