Do 8-bit Microcontrollers Have a Future in ML & DL?

Thread Starter

POU_117

Joined Oct 25, 2025
1
Hi everyone!

I’m currently conducting an academic research project on the potential role of 8-bit microcontrollers in lightweight AI applications.

As we know, Artificial Intelligence has been evolving rapidly and increasingly demands more computational resources. While 8-bit microcontrollers are limited in terms of processing power and memory, my research aims to assess whether these architectures can still play a useful role in lightweight AI applications, or if they have become entirely obsolete in this field.

I would really appreciate your insights on the following questions:

  1. Are 8-bit microcontroller architectures still used in industrial applications nowadays? If so, could you please specify which models are still in use?
  2. What software frameworks or tools are currently available for AI development on 32-bit microcontroller architectures?
  3. Have you seen any applications developed using Machine Learning or Deep Learning on 8-bit microcontrollers?
  4. Do you think it is viable to develop AI applications (ML & DL) on 8-bit microcontrollers, or is it better to move on from them for such tasks?
  5. Are there any strategies or techniques you know of to optimize AI models and adapt them to such limited devices?
  6. Do you think 8-bit microcontrollers are still a good learning tool for engineering or electronics students?
  7. How do you envision the future of 8-bit microcontrollers in a context where AI is becoming increasingly relevant?

Thank you very much for your time and expertise!
 

Papabravo

Joined Feb 24, 2006
22,065
To answer #1: industrial networks on the factory floor make extensive use of 8-bit micros to implement I/O devices with digital inputs and outputs along with analog inputs and outputs. A network is used to reduce the amount of "home-run" wiring that would have been needed with traditional, relay-based ladder logic. I expect this will continue to be the case as PLCs (Programmable Logic Controllers) are implemented with increasingly capable and sophisticated processors. I can imagine scan cycles on the order of 500 microseconds or less with top-end GPUs. In terms of data transfer volume, that is what drinking from a fire hose might be like.

BTW, what do the initials ML and DL stand for?
 
Last edited:

nsaspook

Joined Aug 27, 2009
16,273
Last edited:

BobTPH

Joined Jun 5, 2013
11,487
But can they be used to actually implement machine learning type AI? No. What fraction of applications currently using 8-bit micros need machine learning, or would benefit from it? 0.
 

nsaspook

Joined Aug 27, 2009
16,273
But can they be used to actually implement machine learning type AI? No. What fraction of applications currently using 8-bit micros need machine learning, or would benefit from it? 0.
https://ai.engineering.columbia.edu/ai-vs-machine-learning/
Artificial Intelligence (AI) vs. Machine Learning
What Is Artificial Intelligence?
Artificial Intelligence is the field of developing computers and robots that are capable of behaving in ways that both mimic and go beyond human capabilities. AI-enabled programs can analyze and contextualize data to provide information or automatically trigger actions without human interference.

Today, artificial intelligence is at the heart of many technologies we use, including smart devices and voice assistants such as Siri on Apple devices. Companies are incorporating techniques such as natural language processing and computer vision — the ability for computers to use human language and interpret images — to automate tasks, accelerate decision making, and enable customer conversations with chatbots.

What Is Machine Learning?
Machine learning is a pathway to artificial intelligence. This subcategory of AI uses algorithms to automatically learn insights and recognize patterns from data, applying that learning to make increasingly better decisions.

By studying and experimenting with machine learning, programmers test the limits of how much they can improve the perception, cognition, and action of a computer system.

Deep learning, an advanced method of machine learning, goes a step further. Deep learning models use large neural networks — networks that function like a human brain to logically analyze data — to learn complex patterns and make predictions independent of human input.
We don't need lots of things related to 'AI' (a very wide field that includes the LLM types), but there are ML applications that run on 8-bit controllers, because the definition of machine learning sets no minimum on processing power, only on ML functionality.
https://medium.com/data-science/ultra-tinyml-machine-learning-for-8-bit-microcontroller-9ec8f7c8dd12

https://arxiv.org/pdf/2403.19076
Tiny Machine Learning: Progress and Futures

You generate the training dataset with the 8-bit machine, process that dataset on something powerful, and then download the trained model to the 8-bit controller, which runs the 'X' ML binary classification function on the sensor data.
 
Last edited:

WBahn

Joined Mar 31, 2012
32,760
Hi everyone!

I’m currently conducting an academic research project on the potential role of 8-bit microcontrollers in lightweight AI applications. [...]
What is your "research" for?

Your list of questions seems like it came out of a course assignment. Is this some kind of homework?
 

simozz

Joined Jul 23, 2017
170
Do 8-bit Microcontrollers Have a Future in ML & DL?
Short answer: IMO no, but in general it depends on the MCU hardware's performance in terms of inference speed and memory (size and access speed, etc.). 8-bit parts are not the top of the market, so...

You may struggle (in the sense of inference latency) to compute and handle 32-bit data on an 8-bit MCU (with 8 bits you get what you pay for), so I wouldn't use 8-bit CPUs for ML and/or DL applications, which usually demand high computing performance and speed.
 