• Re: Neural Networks (MNIST inference) on the 3-cent microcontroller

    From George Neuner@21:1/5 to D. Ray on Tue Oct 22 15:39:42 2024
    On Mon, 21 Oct 24 20:06:28 UTC, D. Ray <d@ray> wrote:

    Buoyed by the surprisingly good performance of neural networks with
    quantization-aware training on the CH32V003, I wondered how far this can
    be pushed. How much can we compress a neural network while still
    achieving good test accuracy on the MNIST dataset? When it comes to
    absolutely low-end microcontrollers, there is hardly a more compelling
    target than the Padauk 8-bit microcontrollers. These are microcontrollers
    optimized for the simplest and lowest-cost applications there are. The
    smallest device of the portfolio, the PMS150C, sports 1024 13-bit words
    of one-time-programmable memory and 64 bytes of RAM, more than an order
    of magnitude smaller than the CH32V003. In addition, it has a proprietary
    accumulator-based 8-bit architecture, as opposed to a much more powerful
    RISC-V instruction set.

    Is it possible to implement an MNIST inference engine, which can classify
    handwritten numbers, on a PMS150C as well?


    <https://cpldcpu.wordpress.com/2024/05/02/machine-learning-mnist-inference-on-the-3-cent-microcontroller/>

    <https://archive.md/DzqzL>


    Depends on whether you mean implementing /their/ recognizer, or just implementing a recognizer that could be trained using their data set.

    Any 8-bitter can easily handle the computations ... FP is not required;
    fixed-point fractions will do fine. The issue is how much memory is
    needed and what your target chip brings to the party.
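
    To make the fixed-point remark concrete, here is a minimal sketch of a
    single-neuron dot product in Q7 fractions (value = raw/128). The format,
    the 8x8 input size and the names are assumptions for illustration, not
    anything taken from the blog post:

      /* Q7 fixed-point dot product for one output neuron.
         All names and sizes here are illustrative assumptions. */
      #include <stdint.h>

      #define N_INPUTS 64                      /* e.g. an 8x8 downscaled digit */

      int8_t neuron_q7(const int8_t in[N_INPUTS], const int8_t w[N_INPUTS])
      {
          int32_t acc = 0;                     /* widened accumulator          */
          for (uint8_t i = 0; i < N_INPUTS; i++)
              acc += (int16_t)in[i] * w[i];    /* Q7 * Q7 -> Q14 products      */
          acc >>= 7;                           /* rescale back to Q7           */
          if (acc >  127) acc =  127;          /* saturate: no FP, no headroom */
          if (acc < -128) acc = -128;
          return (int8_t)acc;
      }

    On an accumulator-based 8-bit core the compiler (or a small library
    routine) synthesizes the 16/32-bit arithmetic from byte operations, so
    nothing beyond integer add, multiply and shift is needed.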

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From George Neuner@21:1/5 to olcott on Sun Oct 27 16:41:31 2024
    On Sat, 26 Oct 2024 20:43:01 -0500, olcott <NoOne@NoWhere.com> wrote:


    Test to see if this posts, or whether I should dump this paid provider.


    Eternal September is a good, no-cost Usenet provider.

    http://www.eternal-september.org/

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From D. Ray@21:1/5 to George Neuner on Mon Oct 28 15:42:42 2024
    George Neuner <gneuner2@comcast.net> wrote:

    Depends on whether you mean

    Perhaps you misunderstood me. I’m not the author; I just posted the
    beginning of a blog post and provided the link to the rest of it because
    it seemed interesting. The reason I didn’t post the whole thing is that it
    contains quite a few illustrations.

    Blog post ends with:

    “It is indeed possible to implement MNIST inference with good accuracy
    using one of the cheapest and simplest microcontrollers on the market. A
    lot of memory footprint and processing overhead is usually spent on
    implementing flexible inference engines that can accommodate a wide range
    of operators and model structures. Cutting this overhead away and reducing
    the functionality to its core allows for astonishing simplification at
    this very low end.

    This hack demonstrates that there truly is no fundamental lower limit to
    applying machine learning and edge inference. However, the feasibility of
    implementing useful applications at this level is somewhat doubtful.”
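
    To illustrate the “cutting this overhead away” point: instead of a
    generic engine that parses a model file and dispatches operators, the
    whole network can be hard-coded, with the weights baked into program
    memory at build time. The layer sizes, shift amounts and placeholder
    weights below are assumptions made up for the sketch, not the blog’s
    actual model:

      /* Hard-coded two-layer classifier, no runtime model parsing.
         Sizes, scaling and weights are illustrative assumptions. */
      #include <stdint.h>

      #define IN  64                 /* 8x8 input image (assumed) */
      #define HID 16                 /* hidden neurons (assumed)  */
      #define OUT 10                 /* digit classes 0..9        */

      /* Weights would be generated offline from the trained, quantized
         model and placed in OTP/flash; the zeros are placeholders.    */
      static const int8_t w1[HID * IN]  = {0};   /* input->hidden  */
      static const int8_t w2[OUT * HID] = {0};   /* hidden->output */

      static void fc(const int8_t *in, int8_t *out, const int8_t *w,
                     uint8_t n_in, uint8_t n_out)
      {
          for (uint8_t o = 0; o < n_out; o++) {
              int32_t acc = 0;
              for (uint8_t i = 0; i < n_in; i++)
                  acc += in[i] * w[o * n_in + i];
              if (acc < 0)   acc = 0;          /* ReLU                         */
              acc >>= 5;                       /* rescale (shift is arbitrary) */
              if (acc > 127) acc = 127;        /* saturate to int8             */
              out[o] = (int8_t)acc;
          }
      }

      uint8_t classify(const int8_t img[IN])
      {
          int8_t h[HID], y[OUT];
          fc(img, h, w1, IN, HID);             /* input  -> hidden */
          fc(h,   y, w2, HID, OUT);            /* hidden -> output */

          uint8_t best = 0;                    /* argmax = predicted digit */
          for (uint8_t k = 1; k < OUT; k++)
              if (y[k] > y[best]) best = k;
          return best;
      }

    Everything a flexible engine would decide at run time (shapes, operator
    dispatch, buffer management) is fixed at compile time here, which is what
    makes a 64-byte-RAM, 1 kword-OTP target plausible at all.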

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)