Subject: Re: Remember "Bit-Slice" Chips ?
From: bowman (at) *nospam* montana.com (rbowman)
Newsgroups: comp.os.linux.misc
Date: 20 Dec 2024, 18:19:52
Message-ID: <lsln9nFbe1iU1@mid.individual.net>
User-Agent: Pan/0.149 (Bellevue; 4c157ba)
On Fri, 20 Dec 2024 01:31:30 -0500, 186282@ud0s4.net wrote:

> It's the 'hand-wave' thing that sunk the first AI paradigm.
> Marv Minsky (who posted on usenet for a while) and friends saw how
> easily 'decisions' could be done with a transistor or two and assumed
> it would thus be easy to build an AI. A. C. Clarke used the Minsky
> optimism when fashioning the idea of "HAL".
Minsky threw a wrench in the works with his 1969 'Perceptrons'. He had
tried to implement B. F. Skinner's operant conditioning with an analog
lashup that sort of worked if the vacuum tubes didn't burn out. Rosenblatt
had built a 'Perceptron', and Minsky pointed out that the original design
couldn't handle an XOR. That sent research down another rabbit hole.
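Minsky's point is easy to reproduce: a single threshold unit draws one straight line through the input plane, and no line separates XOR's classes. A toy sketch using Rosenblatt's perceptron learning rule (my own numpy lashup, nothing from the era):

```python
import numpy as np

# XOR truth table: inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])

w = np.zeros(2)  # weights
b = 0.0          # bias

# Perceptron rule: nudge the weights toward each misclassified point
for _ in range(100):
    for xi, yi in zip(X, y):
        pred = int(xi @ w + b > 0)
        w += (yi - pred) * xi
        b += (yi - pred)

correct = sum(int(xi @ w + b > 0) == yi for xi, yi in zip(X, y))
print(correct)  # never 4 of 4: XOR is not linearly separable
```

No matter how long the loop runs, the best a single linear threshold unit can do on XOR is 3 of the 4 cases.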
By the '80s the original perceptron had evolved into a multilayer network
trained by backpropagation. When I played around with it, 'Parallel
Distributed Processing' by Rumelhart and McClelland was THE book.
https://direct.mit.edu/books/monograph/4424/Parallel-Distributed-Processing-Volume
The ideas were fascinating but the computing power wasn't there. Most of
what I learned then is still relevant to TensorFlow and the other neural
network approaches except now there are the $30,000 Nvidia GPUs to do the
heavy lifting.
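The multilayer fix is just as small to sketch. One hidden layer of sigmoid units trained by backpropagation lets the network carve the nonlinear XOR boundary. This is a hand-rolled numpy toy, not TensorFlow; the 4-unit hidden layer, learning rate, and epoch count are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 sigmoid units feeding one sigmoid output
W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: chain rule, output layer then hidden layer
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

pred = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(pred.ravel())
```

The same gradient machinery, scaled up a few orders of magnitude, is what the $30,000 GPUs are grinding through today.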
The '80s neural networks weren't practical so the focus shifted to expert
systems until they petered out. The boom and bust cycles led to the term
'AI Winter'.

https://www.techtarget.com/searchenterpriseai/definition/AI-winter

I think something worthwhile will come from this cycle but ultimately it
won't be the LLMs that are getting all the hype.