Subject: Re: Windows-on-ARM Laptop Is A “Frequently-Returned Item” On Amazon
From: nospam (at) *nospam* needed.invalid (Paul)
Newsgroups: alt.comp.os.windows-11, comp.os.linux.advocacy
Date: 24 Mar 2025, 16:15:30
Organisation : A noiseless patient Spider
Message-ID : <vrrsuk$15shc$1@dont-email.me>
References : 1 2 3 4 5 6 7
User-Agent : Ratcatcher/2.0.0.25 (Windows/20130802)
On Mon, 3/24/2025 2:21 AM, rbowman wrote:
> On Mon, 24 Mar 2025 00:36:17 -0000 (UTC), Chris wrote:
>> Paul <nospam@needed.invalid> wrote:
>>> On Sun, 3/23/2025 7:17 AM, Joel wrote:
>>>>
>>>> It's clear why Microsoft would use x86 emulation with ARM, countless
>>>> reasons, but who cares about their Copilot bullshit, put Linux for ARM
>>>> on that mother fucker.
>>>>
>>> Some day, you'll be able to run an AI locally.
>>>
>> You can. Have a look at Ollama. Totally local and open source. Works
>> well too!
> Training and inference are two different things. Other than toy datasets I
> doubt much training will happen locally.
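
[The Ollama suggestion quoted above amounts to a couple of commands. A sketch only: it assumes the `ollama` CLI is installed and a daemon is running, and the model name `llama3` is an example that may need updating.]

```shell
# Pull a model once (a multi-gigabyte download), then chat with it locally.
ollama pull llama3
ollama run llama3 "Why would Microsoft ship x86 emulation on ARM?"

# Ollama also serves a local HTTP API on port 11434 for scripting:
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Hello", "stream": false}'
```

Inference happens entirely on the local machine; no cloud service is involved.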
Realistically, I think it's going to be quite a while,
if ever, before we can put together a decent box for inference.
In this gold rush, all the excess profit is in the "mules and shovels".
One mule I was looking at today was priced at $8500 or so.
That kind of pricing is hardly encouraging.
It would be cheaper to build a wooden box,
put a midget inside, and have it answer
questions. Can anyone give me a price on a
PhD-class midget?
Paul