Subject : Re: Neural Networks (MNIST inference) on the “3-cent” Microcontroller
From : d (at) *nospam* ray (D. Ray)
Groups : comp.arch.embedded
Date : 28. Oct 2024, 17:42:42
Organisation : Usenet.Farm
Message-ID : <cpbKuNbyJaIKqGhWVrwdXAVlyKMyriQk@news.usenet.farm>
References : 1 2
User-Agent : NewsTap/5.5 (iPhone/iPod Touch)
George Neuner <gneuner2@comcast.net> wrote:
> Depends on whether you mean
Perhaps you misunderstood me. I’m not the author; I just posted the beginning
of a blog post and provided the link to the rest of it because it seemed
interesting. The reason I didn’t post the whole thing is that there are
quite a few illustrations.
The blog post ends with:
“It is indeed possible to implement MNIST inference with good accuracy
using one of the cheapest and simplest microcontrollers on the market. A
lot of memory footprint and processing overhead is usually spent on
implementing flexible inference engines that can accommodate a wide range
of operators and model structures. Cutting this overhead away and reducing
the functionality to its core allows for astonishing simplification at this
very low end.
This hack demonstrates that there truly is no fundamental lower limit to
applying machine learning and edge inference. However, the feasibility of
implementing useful applications at this level is somewhat doubtful.”
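For what it’s worth, here is a minimal sketch (not from the blog post) of what
“reducing the functionality to its core” can look like: a single hardcoded
fully connected layer with weights baked into flash and an argmax over the ten
digit classes, with no operator dispatch or model parser at runtime. The 8x8
input size, the 8-bit weight format, and the names (classify, weights, biases)
are illustrative assumptions only, not the blog’s actual implementation.

/*
 * Illustrative fixed-function inference sketch (assumptions, not the
 * blog's code): one fully connected layer over a downscaled 8x8 input,
 * 8-bit weights hardcoded at build time, argmax over 10 classes.
 */
#include <stdint.h>

#define IN_PIXELS  64   /* assumed 8x8 downscaled MNIST digit */
#define N_CLASSES  10

/* Weights and biases would be generated offline and baked into flash;
   zero-filled here only so the sketch compiles stand-alone. */
static const int8_t  weights[N_CLASSES][IN_PIXELS] = { { 0 } };
static const int16_t biases[N_CLASSES] = { 0 };

/* Return the index of the highest-scoring class (argmax). */
uint8_t classify(const uint8_t pixels[IN_PIXELS])
{
    int32_t best_score = INT32_MIN;
    uint8_t best_class = 0;

    for (uint8_t c = 0; c < N_CLASSES; c++) {
        int32_t acc = biases[c];
        for (uint8_t i = 0; i < IN_PIXELS; i++)
            acc += (int32_t)weights[c][i] * pixels[i];
        if (acc > best_score) {
            best_score = acc;
            best_class = c;
        }
    }
    return best_class;
}

Everything that a generic inference engine would do at runtime (parsing the
model, dispatching operators, managing buffers) is decided at compile time
here, which is presumably where the memory and cycle savings the author
describes come from.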