On Mon, 3/24/2025 2:21 AM, rbowman wrote:
On Mon, 24 Mar 2025 00:36:17 -0000 (UTC), Chris wrote:
Paul <nospam@needed.invalid> wrote:
On Sun, 3/23/2025 7:17 AM, Joel wrote:
It's clear why Microsoft would use x86 emulation with ARM, countless
reasons, but who cares about their Copilot bullshit, put Linux for ARM
on that mother fucker.
Some day, you'll be able to run an AI locally.
You can. Have a look at Ollama. Totally local and open source. Works
well too!
Training and inference are two different things. Other than on toy
datasets, I doubt much training will happen locally.
Realistically, I think it's going to be quite a while,
if ever, before we can put together a decent box for inference.
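For reference, Chris's Ollama suggestion boils down to running `ollama serve` and talking to its local HTTP API. A minimal sketch in Python, using only the standard library (the endpoint and payload shape are Ollama's documented defaults; the model name `llama3` is just an example and must already be pulled):

```python
import json
import urllib.request

# Ollama's default local endpoint for non-streaming generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming /api/generate request for a local Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def generate(model: str, prompt: str) -> str:
    """Send the prompt and return the model's response text.

    Requires `ollama serve` to be running locally with the model pulled.
    """
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]

# Example usage (only works with a running Ollama server):
#   print(generate("llama3", "Explain x86 emulation on ARM in one sentence."))
```

This is all inference, of course, which is exactly rbowman's point: serving a pre-trained model locally is easy; training one is a different matter.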