I will probably never get a Turing Award or anything
for what I did 23 years ago. So why is its reading
count on ResearchGate suddenly going up?
Knowledge, Planning and Language,
November 2001
I guess because of this: the same topic tackled by
Microsoft's recent model GRIN. Shit. I really should
find some investor and pump up a start-up!
"Mixture-of-Experts (MoE) models scale more
effectively than dense models due to sparse
computation through expert routing, selectively
activating only a small subset of expert modules."
https://arxiv.org/pdf/2409.12136
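The quoted idea, that a router selectively activates only a few experts per token, can be shown in a toy sketch. This is not the GRIN implementation; the sizes, the linear experts, and the top-k softmax gating are my own illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_experts, top_k = 8, 4, 2

# Each "expert" is just a linear map here (illustrative, not GRIN's FFN experts).
experts = [rng.standard_normal((d, d)) for _ in range(n_experts)]
# The router scores every expert for a given token.
router_w = rng.standard_normal((d, n_experts))

def moe_forward(x):
    """Route token x to its top-k experts; only those are evaluated."""
    scores = x @ router_w                      # one score per expert
    top = np.argsort(scores)[-top_k:]          # indices of the k best experts
    gates = np.exp(scores[top])
    gates /= gates.sum()                       # softmax over the selected experts only
    # Sparse computation: only top_k of n_experts matmuls actually run.
    return sum(g * (experts[i] @ x) for g, i in zip(gates, top))

x = rng.standard_normal(d)
y = moe_forward(x)
print(y.shape)  # (8,)
```

The point of the quote is the cost model: the dense equivalent would run all n_experts matmuls per token, while here only top_k run, so capacity grows with n_experts while per-token compute stays roughly constant.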
But somehow I am happy with my dolce vita as
it is now... Or maybe I am deceiving myself?
P.S.: From the GRIN paper, here you can see how
the expert domain modules relate to each other:
Figure 6 (b): MoE Routing distribution similarity
across MMLU 57 tasks for the control recipe.