Subject : Re: Remember "Bit-Slice" Chips ?
From : bowman (at) *nospam* montana.com (rbowman)
Newsgroups : comp.os.linux.misc
Date : 15. Dec 2024, 07:46:09
Message-ID : <ls7c9hF1af8U1@mid.individual.net>
User-Agent : Pan/0.149 (Bellevue; 4c157ba)
On Sat, 14 Dec 2024 23:57:52 -0500,
186282@ud0s4.net wrote:
"Stochastic" basically means "guessing".
Pretty much. For example, if you're building a classifier you split your
labeled data into two chunks, one for training and one for testing. Train
on one, score on the other, rinse and repeat until the output is good
enough. If your image classifier mistakes a Black person for a gorilla
there will be hell to pay.
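The split looks roughly like this, as a sketch with scikit-learn (my
example, not anything from upthread; the digits dataset just stands in
for real labeled data):

# Minimal train/test split sketch using scikit-learn.
# The dataset and model are placeholders.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)

# Hold out 20% of the labeled data purely for testing.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)

# Score on data the model never saw during training.
print(clf.score(X_test, y_test))

If the test score is lousy you go back, tweak, split, and train again.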
In the training process some randomness is often introduced on purpose.
The problem is local maxima (or minima, depending on how you prefer to
think). If you picture a three dimensional surface with mountains and
valleys, gradient descent tends to get stuck.
For example, assume you're hiking in mountainous terrain and your
algorithm is to always head uphill. Sooner or later you'll find yourself
at a place where all choices are downhill, but it isn't the highest hill
around. You need to roll the dice to get off it.
https://en.wikipedia.org/wiki/Simulated_annealing

There's a nice little animation if you scroll down a bit.
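The dice-rolling looks roughly like this, a bare-bones annealing sketch
on a made-up one-dimensional "terrain" (everything here is invented for
illustration):

# Bare-bones simulated annealing on a toy 1-D terrain.
import math
import random

def terrain(x):
    # Bumpy landscape; the global peak is near x = 0.5.
    return math.sin(3 * x) + 2 * math.exp(-x * x)

x = 4.0        # start near a minor hill, far from the big peak
temp = 2.0     # "temperature": willingness to head downhill
while temp > 0.01:
    candidate = x + random.uniform(-0.5, 0.5)
    delta = terrain(candidate) - terrain(x)
    # Always take uphill moves; take downhill moves with a
    # probability that shrinks as the temperature drops.
    if delta > 0 or random.random() < math.exp(delta / temp):
        x = candidate
    temp *= 0.999  # cool off slowly

print(x, terrain(x))

A pure always-head-uphill climber started at x = 4.0 parks itself on the
nearest bump; the occasional accepted downhill step is what lets this one
wander over to the big hill.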
After training you really hope that the result will be deterministic. You
don't want the cat to be alive on one run and dead on the next. Or a dog.
Differentiating cats and dogs is one of the 'hello world' projects in ML.
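Getting there usually means nailing down the randomness during training.
A sketch of the usual seed-pinning ritual (plain Python/NumPy here;
frameworks have their own knobs, e.g. torch.manual_seed in PyTorch):

# Pin the random seeds so a training run is repeatable.
import random
import numpy as np

SEED = 0
random.seed(SEED)      # Python's own RNG
np.random.seed(SEED)   # NumPy's global RNG
# Frameworks add their own, e.g. torch.manual_seed(SEED).

Once the weights are frozen, plain feed-forward inference is
deterministic anyway; it's the training runs that wander.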
Sometimes the results aren't what you hoped for. In one sample set the
dogs tended to be photographed outdoors and the cats indoors. Whatever
magic went on in training, the resulting 'intelligence' was really
good at sorting outdoor images from indoor ones. It didn't know jack about
dogs and cats.