Source code for a basic preemptive RTOS

BR-549

Joined Sep 22, 2013
4,928
I'm sure that's the case. I think that 5G will bring about a huge programming demand for the small stuff. I don't think this demand will be met by professional programmers. With the high speed and large memory, "function routines" will be installed with visual programming tools that the common man can use. Every household (or person) will have its own part of the IoT network.

Just like tubes are now, there will always be a custom 8-bit niche.

I'm sure there will always be a demand for professional programmers....at all levels.

But I also believe that processor and memory overkill will become standard. It allows everyone to participate.
 

BR-549

Joined Sep 22, 2013
4,928
Of course, 5G might cause a delayed incurable cancer too. Who knows? Some think even a few harmful health effects would be worth it.

We still drive cars.

The demand for fast, high-volume wireless data transfer will win. We won't quit now.
 

Thread Starter

bug13

Joined Feb 13, 2012
2,002
Once you get over the learning curve you won't go back. And there are better development tools.
That's encouraging. I think I need to buy an STM32 dev board to play with.

My current job is an in-house development of a dynamic balancing machine for small jet turbines. The hardware has a lot of stuff to do, and the UI is BT'ed to a PC or Android app. I started with an 8-bit CPU, but there was too much maths and too many tasks to sensibly keep track of what was going on. The extra maths meant that the Android had to do it, which was asking a lot of it (glorified telephone), and the comms bandwidth went too high as well. Now it is an STM32 and FreeRTOS. All the tasks are now under control and easy to manage. The maths is done too, blindingly fast, so only the answers need to be sent to the host. The PC and the Android are just a GUI and don't have to do much thinking. All this for the same hardware cost as an 8-bit solution.
I am very interested in why you used FreeRTOS instead of CHIBIOS in this case. I can see from the FreeRTOS website that it has more libraries, e.g. a TCP/IP stack, a FAT file system, WolfSSL, etc. It even supports AWS as well.

With CHIBIOS I don't see as much; their main emphasis is speed and a small footprint.
 

jfitter

Joined Mar 14, 2019
14
That's encouraging. I think I need to buy an STM32 dev board to play with.



I am very interested in why you used FreeRTOS instead of CHIBIOS in this case. I can see from the FreeRTOS website that it has more libraries, e.g. a TCP/IP stack, a FAT file system, WolfSSL, etc. It even supports AWS as well.

With CHIBIOS I don't see as much; their main emphasis is speed and a small footprint.
FreeRTOS is my original RTOS training. I am comfortable with it and it is well supported by the STM tools.
CHIBIOS came about because I needed to develop a motor controller and the model I learned from used CHIBIOS. I rather like CHIBIOS now. It is clean and easy to use and fast too.
CHIBIOS or FreeRTOS - right now it depends on which side of the bed I get up on.....
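For what it's worth, the flavour difference shows up even in something as small as spinning up a blink thread. A rough sketch of each (stack sizes and priorities are placeholders, and ChibiOS also wants halInit()/chSysInit() run first):

    /* --- FreeRTOS: create a task, then start the scheduler --- */
    #include "FreeRTOS.h"
    #include "task.h"

    static void blink_task(void *arg)
    {
        (void)arg;
        for (;;) {
            /* toggle an LED here */
            vTaskDelay(pdMS_TO_TICKS(500));
        }
    }

    int main(void)
    {
        xTaskCreate(blink_task, "blink", configMINIMAL_STACK_SIZE,
                    NULL, tskIDLE_PRIORITY + 1, NULL);
        vTaskStartScheduler();
        for (;;) {}   /* never reached */
    }

    /* --- ChibiOS: statically allocated working area, same idea --- */
    #include "ch.h"

    static THD_WORKING_AREA(wa_blink, 128);
    static THD_FUNCTION(blink_thread, arg)
    {
        (void)arg;
        while (true) {
            /* toggle an LED here */
            chThdSleepMilliseconds(500);
        }
    }

    /* ...after halInit() and chSysInit() in main(): */
    /* chThdCreateStatic(wa_blink, sizeof(wa_blink), NORMALPRIO,
                         blink_thread, NULL); */

Both get you a preemptive thread in a dozen lines; ChibiOS just makes the memory for it explicit.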
 

jfitter

Joined Mar 14, 2019
14
...Just like tubes are now, there will always be a custom 8-bit niche.....also believe that processor and memory overkill will become standard. It allows everyone to participate.
This goes back to what I wrote earlier. If you want to make a million widgets then processor cost overrides development cost.
I haven't made a million of anything yet, despite trying hard, so development cost is key to me. Since most of any development cost is debugging, anything that helps produce bug-free code is on my list. RTOSes encourage bug-free embedded code (except for the bugs in the RTOS, but hey, that's another guy's problem - choose carefully).

It's called "redundant engineering". We have so many resources available in a cheap, common package that we use it for any task and just ignore the resources we don't need. This is actually GOOD engineering because it reduces manufacturing and parts costs.

PC programmers have had it this way for years. They assume infinite resources. Hear them scream and hide behind their screens when the embedded guys talk about clock cycles and stack overflows :(
 

Ya’akov

Joined Jan 27, 2019
9,152
You might be surprised to hear that 8-bit micros are still selling like cold beer in hell. All of them have a particular range of utility.

https://www.electronicdesign.com/microcontrollers/11-myths-about-8-bit-microcontrollers
There has been a trend, fueled by the effects of Moore’s Law, to ignore the relative cost of throwing CPU and memory at problems and see only the absolute, immediate cheapness of having so much to waste.

But in fact, this is not always cost free once the systems involved become practical and are in production. There will always be a trend back to simplicity after complexity, which seemed “easy”, turns out to be complex after all.

I’ve seen it many times. It won’t be true for every use case, but there will be quite a few where the “innovation” will be a return to simplicity, possibly with a new name, and greybeards will sigh and say, “we did that 20 years ago...”
 

jfitter

Joined Mar 14, 2019
14
There has been a trend, fueled by the effects of Moore’s Law, to ignore the relative cost of throwing CPU and memory at problems and see only the absolute, immediate cheapness of having so much to waste.

But in fact, this is not always cost free once the systems involved become practical and are in production. There will always be a trend back to simplicity after complexity, which seemed “easy”, turns out to be complex after all.

I’ve seen it many times. It won’t be true for every use case, but there will be quite a few where the “innovation” will be a return to simplicity, possibly with a new name, and greybeards will sigh and say, “we did that 20 years ago...”
I am one of those "grey beards", and for me 20 years ago was actually over 45 years ago, writing in assembler and Fortran and creating stuff out of wires and solder.
Innovation, simplicity, and elegance are nice to have and make you feel good, but costs are driven by development, not hardware, and redundant engineering lowers development costs.
There is no concept of waste on a chip. The actual material cost is very small. The product cost is in the IP. It is geometry. You are paying for a pattern, so if you don't use part of the pattern then there is nothing to lose. What you gain is that one pattern, one IP overhead, one tooling overhead, solves a multitude of different applications.
So I must respectfully disagree with the author that a return to simplicity is in our future. Where does it end? Discrete transistor logic....
Of course this is just one man's opinion :)
 

Ya’akov

Joined Jan 27, 2019
9,152
You misunderstand.

There is a tendency to adopt new technology on promise, and then to learn what the actual tradeoffs are, and to optimize based on experience.

It's not that we regress to less functional solutions, it's that we learn where previous technologies were more efficient and apply them in those cases. There are many facets to the cost of development, including what knowledge people have to work with. Sometimes, an excellent and effective solution is lost because those who knew how to apply it age out.

The newer engineers and technicians don't know about the older ideas, which may well be very applicable, until someone spots the potential of the "old way" of doing things and reintroduces it, fully aware of what the newer technology can do.

This is far more clear at the system level, but happens up and down the technology spectrum, including tools and techniques. We are flooded with new things, and new people learn them. The old things, the ones worth preserving (not everything, clearly), fade out as the people who use them disappear from the scene. But, those that had merit reappear, usually as "innovation", sometimes as rediscovery.

So, no, I am not saying what you suggest in your last paragraph, so you may still disagree, but that wasn't a disagreement with what I meant, and your reductio ad absurdum missed the target.
 

bogosort

Joined Sep 24, 2011
696
There is a tendency to adopt new technology on promise, and then to learn what the actual tradeoffs are, and to optimize based on experience.
But 32-bit architectures are not new technology. From a business perspective, it can be argued that total development costs are generally less for 32-bit applications than 8-bit, and -- as jfitter pointed out -- dev costs swamp hardware costs for most low-volume businesses. As a developer, I much prefer working with 32-bit platforms and toolchains, and -- yes, please -- a hardware FPU.

Obviously, there are specific cases where an 8-bit MCU is optimal, especially if power budget is the primary concern. But I'd argue, contrary to what I think your point is, that 32-bit MCUs have less overall development complexity than 8-bit MCUs, as there are typically more hardware-specific details to worry about on 8-bit architectures. And if you have a team of developers and engineers, maintaining several different products, there's enormous benefit to homogenizing the architecture and toolchain across all products.
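One concrete example of the kind of detail I mean, sketched for AVR on the 8-bit side and Cortex-M on the 32-bit side (two targets, shown side by side):

    #include <stdint.h>
    #include <avr/interrupt.h>      /* cli()/sei() on AVR */

    /* Shared with a timer ISR. On an 8-bit AVR a 16-bit read takes
     * more than one instruction, so it must be guarded: */
    volatile uint16_t ticks;

    uint16_t read_ticks_8bit(void)
    {
        cli();                      /* mask interrupts */
        uint16_t t = ticks;
        sei();                      /* unmask          */
        return t;
    }

    /* On a 32-bit Cortex-M an aligned word load is one instruction,
     * so the equivalent read needs no guard at all: */
    volatile uint32_t ticks32;

    uint32_t read_ticks_32bit(void)
    {
        return ticks32;             /* atomic 32-bit bus access */
    }

Forget the guard on the 8-bit part and you get a bug that shows up once a week; the 32-bit part simply doesn't have the trap.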
 

jfitter

Joined Mar 14, 2019
14
Yaakov, I agree with everything you are saying. I think we are looking at the same issue from two different perspectives.

I can remember working in an era of minimalistic design because there was no option. Resources were either unavailable or prohibitively expensive, and hardware design was horribly complex. The simplest widget had parallel memory buses. If you wanted a UART you bought a chip for it, then you had to interface it to your circuit and write the driver code. The same for parallel I/O. Clock speeds were pedestrian, so you counted clock cycles. One interrupt source and software timers. The list goes on. Everything was a grinding development exercise, and it was easy to get so totally lost in the details of making all the innards work that one lost focus on the overall application. And that was in 1980. I was making stuff in 1972 that was truly primitive....

Back in the very early 80's I designed a commercial in-motion weighing system for a client; everything - hardware, software, packaging, etc. To ensure performance I had to resort to all manner of clever tricks and shortcuts, both in the hardware and the software, and this took a lot of thinking and hair pulling, in addition to much coffee, pizza, and all-nighters. It was very satisfying to finish it, a true accomplishment, but as I got older and more experienced in business I realized that satisfaction had to be secondary to securing a robust and functional product.

Today the weigher job is almost a "no-brainer". An ARM, a smart bridge amp with A/D, and an RTOS, and I can have a prototype in under a month, and with proper care I can be reasonably certain of less than 1 bug / 100 LOC on the first pass. No way could I do that in 1980.
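To make that concrete, the shape of it under FreeRTOS is roughly this (a sketch only; the task names, rates, queue depth, and the maths placeholder are all hypothetical):

    #include "FreeRTOS.h"
    #include "task.h"
    #include "queue.h"

    static QueueHandle_t result_q;      /* carries finished answers only */

    /* Sample the bridge amp's A/D at a fixed rate; maths stays on the MCU. */
    static void sample_task(void *arg)
    {
        (void)arg;
        TickType_t last = xTaskGetTickCount();
        for (;;) {
            float reading = 0.0f;       /* a real adc_read() would go here */
            float answer  = reading;    /* filtering/maths done locally    */
            xQueueSend(result_q, &answer, 0);
            vTaskDelayUntil(&last, pdMS_TO_TICKS(10));   /* e.g. 100 Hz */
        }
    }

    /* Ship only the computed answers to the host GUI (PC or Android). */
    static void report_task(void *arg)
    {
        (void)arg;
        float answer;
        for (;;) {
            if (xQueueReceive(result_q, &answer, portMAX_DELAY) == pdTRUE) {
                /* uart_send(&answer, sizeof answer); */
            }
        }
    }

    int main(void)
    {
        result_q = xQueueCreate(8, sizeof(float));
        xTaskCreate(sample_task, "sample", 256, NULL, 2, NULL);
        xTaskCreate(report_task, "report", 256, NULL, 1, NULL);
        vTaskStartScheduler();
        for (;;) {}
    }

Each concern is one task, the hand-off is one queue, and the comms link only ever sees answers.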

Now I buy a chip that does all of the stuff I want to do, and more, for peanuts. All the stuff works because it's designed by people who specialize in that stuff. Same with the software; well, at least the RTOS. I can totally focus on the functionality of the application. The biggest danger I face these days is "scope bloat", i.e. I now have so much functionality and so many resources that I am tempted to use them all. It is not such a bad problem to have.

I am not suggesting for a moment, however, that the "old ways" should be forgotten. I teach at the local Uni and I am a committed advocate of undergrads knowing the basics, the "old ways". You cannot write embedded software effectively if you don't know the basics. The STM32 documentation is 4500 pages - I expect my students to read it and know it, or pick a simpler processor. Binary algebra reduction techniques are a must even though today we automate them. CAD techniques, including manual drawing constructions. Manufacturing processes, and so forth.

We have a long way to go with software development and techniques. Software, and debugging in particular, is still the most time-consuming and costly part of product development. For short product runs the focus should be on robust software at the expense of hardware. This is boring work and you don't get any accolades for cleverness. The payoff is in ten years' time when your product is still running V1.0 of the firmware.
For high-volume products, of course, the engineer can bask in as much glory as he wants and show off all his tricks.
 

Ya’akov

Joined Jan 27, 2019
9,152
But 32-bit architectures are not new technology. From a business perspective, it can be argued that total development costs are generally less for 32-bit applications than 8-bit, and -- as jfitter pointed out -- dev costs swamp hardware costs for most low-volume businesses. As a developer, I much prefer working with 32-bit platforms and toolchains, and -- yes, please -- a hardware FPU.

Obviously, there are specific cases where an 8-bit MCU is optimal, especially if power budget is the primary concern. But I'd argue, contrary to what I think your point is, that 32-bit MCUs have less overall development complexity than 8-bit MCUs, as there are typically more hardware-specific details to worry about on 8-bit architectures. And if you have a team of developers and engineers, maintaining several different products, there's enormous benefit to homogenizing the architecture and toolchain across all products.
Nowhere am I arguing that any particular technology is better than any other; instead I am saying it is often the case that people with a particular toolset will choose brute force where that is not the best choice, not realizing there are other options. The power budget you mention is an example worth considering.

With the increase in, for example, the use of much higher-capacity batteries, people will ignore power budget problems by throwing battery at the problem. Sometimes this is fine; sometimes it just isn't.

My point is about optimization, the real thing, not premature or unnecessary optimization but good engineering by people or teams that can accommodate the whole picture from hardware to software.

I am not arguing that you should only use 8-bit options unless 32-bit options are required, but that you shouldn't use 32-bit options when 8-bit is the correct engineering choice.

Since it is often the case that brute force in some aspect of a design is employed from ignorance, I am suggesting we need well-informed, interdisciplinary engineering to get optimal solutions.
 

Ya’akov

Joined Jan 27, 2019
9,152
Yaakov, I agree with everything you are saying. I think we are looking at the same issue from two different perspectives.
I think you are correct. I am talking about real optimization, not because we are starved of resources but because there is an apparent glut. You have an unusual perspective; you, personally, are unlikely to make errors of brute force easily.

When a brute force design fails to meet specs, there is sometimes an older, lighter solution that someone in the past would be able to identify instantly but is unknown to the current crop of designers. That's when things are rediscovered, or possibly reinvented from ignorance.
 

Ya’akov

Joined Jan 27, 2019
9,152
I'll add that there are many, many things designed by engineers who aren't necessarily well rounded, and these things are actually in production use and suffer from gaps of knowledge.

This is far less likely in the case of very large manufacturers or design houses working for them.
 

bogosort

Joined Sep 24, 2011
696
I am not arguing that you should only use 8-bit options unless 32-bit options are required, but that you shouldn't use 32-bit options when 8-bit is the correct engineering choice.
I don't think anyone is arguing against correct engineering choices. The distinction is how we define correct. In a business context, engineering decisions will invariably be constrained by business needs. While an 8-bit MCU may be sufficient for a particular application, it may not be correct when all associated costs are considered.

As a simple analogy, suppose that you calculate the value of a resistor in a new product line to be 9.2 kΩ. While it's easy enough to source 9.1k resistors, you don't stock them and you don't expect to be selling enough of the boards to justify the inventory expense. But if you keep a huge stock of 10k resistors, and if 10 kΩ keeps the results in spec, the correct decision may well be to add a 10k resistor to the BOM.

Since it is often the case that brute force in some aspect of a design is employed from ignorance, I am suggesting we need well-informed, interdisciplinary engineering to get optimal solutions.
Most embedded systems engineers that I know are well familiar with 8-bit development; ignorance is not what drives us to 32-bit platforms.
 

Ya’akov

Joined Jan 27, 2019
9,152
I don't think anyone is arguing against correct engineering choices. The distinction is how we define correct. In a business context, engineering decisions will invariably be constrained by business needs. While an 8-bit MCU may be sufficient for a particular application, it may not be correct when all associated costs are considered.
I don't feel that we have a meeting of minds here. I am not advocating for or against any processor architecture. I don't disagree with the points you are making, just the applicability to the point I am trying to make.

I'll leave it alone at this point, but repeat that I am not disagreeing with your points in general.
 

nsaspook

Joined Aug 27, 2009
13,272
But 32-bit architectures are not new technology. From a business perspective, it can be argued that total development costs are generally less for 32-bit applications than 8-bit, and -- as jfitter pointed out -- dev costs swamp hardware costs for most low-volume businesses. As a developer, I much prefer working with 32-bit platforms and toolchains, and -- yes, please -- a hardware FPU.

Obviously, there are specific cases where an 8-bit MCU is optimal, especially if power budget is the primary concern. But I'd argue, contrary to what I think your point is, that 32-bit MCUs have less overall development complexity than 8-bit MCUs, as there are typically more hardware-specific details to worry about on 8-bit architectures. And if you have a team of developers and engineers, maintaining several different products, there's enormous benefit to homogenizing the architecture and toolchain across all products.
I've heard the refrain for the last 15 years that 8-bit controllers are dead or dying.

I think what's overlooked is that many 8-bit embedded design-ins are made by, and primarily programmed by, hardware engineers (who also program) as time-constrained I/O, power-control, and HID I/O engines, etc., for 32-bit controller back-ends. For an experienced hardware engineer the 8-bit hardware-specific details are easy when compared to the complexity of the complete system design. These hardware engineers don't much care about the latest in software design or toolchains; they want dependable, easy-to-understand devices and toolchains that will work today and ten years from now. Companies design, build, and sell to high-volume customers' needs, and those products can be built for profit in fully-depreciated "antique" fabs at geometries that are almost laughable today.
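The typical shape of one of those I/O engines is a tight superloop shipping compact frames up to the 32-bit host. A sketch (the register names here are hypothetical; the real ones come from the device header):

    #include <stdint.h>

    /* Hypothetical memory-mapped registers for a generic 8-bit part. */
    extern volatile uint8_t PORT_IN;    /* digital inputs             */
    extern volatile uint8_t UART_DATA;  /* UART transmit register     */
    extern volatile uint8_t UART_BUSY;  /* nonzero while shifting out */

    static void uart_put(uint8_t b)
    {
        while (UART_BUSY) { }           /* wait for the shifter */
        UART_DATA = b;
    }

    /* The classic I/O-engine superloop: sample, frame, ship to the host. */
    int main(void)
    {
        for (;;) {
            uint8_t inputs = PORT_IN;
            uart_put(0xA5u);                     /* frame sync byte  */
            uart_put(inputs);                    /* payload          */
            uart_put((uint8_t)(0xA5u ^ inputs)); /* simple XOR check */
        }
    }

No OS, no toolchain churn, completely deterministic timing - which is exactly what those designers are after.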
 

bogosort

Joined Sep 24, 2011
696
I've heard the refrain for the last 15 years that 8-bit controllers are dead or dying.
Nah, clearly 8-bit MCUs have their place. As do ASICs. But the combination of performance, price, and versatility of modern 32-bit MCUs and FPGAs -- including both on the same chip -- has changed the game in a lot of markets. It's an exciting time.
 

Thread Starter

bug13

Joined Feb 13, 2012
2,002
That's a helpful discussion. Good to see both sides' arguments.

For my own purposes: I recently saw a job I like, and they need someone with RTOS experience. I'm happy to invest in an STM32 dev board that is suitable for both FreeRTOS and CHIBIOS, so I can play with both.

Any recommendations for a suitable STM32 dev board?
 

nsaspook

Joined Aug 27, 2009
13,272
There is also RIOT. I have a local PIC32MZEF port of the OS to a Microchip dev board.
https://github.com/nsaspook/RIOT/tree/PIC32MZEF/boards/pic32-cpicmzef

BLE application using RIOT
https://github.com/nsaspook/RIOT/tree/PIC32MZEF/examples/rn4020_riot/test.X

[Photo: MCP3208 (8 12-bit channels) and ADS1220 (4 24-bit channels, single/diff, unipolar/bipolar inputs) ADC interface vector board in a mikro socket for driver development, with an RN4020 BLE module on another vector-board socket.]
I'm mainly using the ADS1220 temp sensor and my hot finger to simulate heartbeat changes in an Android app until a DIY body sensor is made for the 24-bit ADC.
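For anyone wanting to poke at the MCP3208 side, a conversion is a three-byte SPI exchange. A minimal sketch against RIOT's periph/spi API (the bus, chip-select, and clock settings are placeholders; match them to the board):

    #include <stdint.h>
    #include "periph/spi.h"

    /* Read one single-ended MCP3208 channel (0-7), per the
     * datasheet's three-byte framing. */
    uint16_t mcp3208_read(spi_t bus, spi_cs_t cs, uint8_t ch)
    {
        /* start bit, SGL/DIFF = 1, then the three channel bits */
        uint8_t out[3] = { (uint8_t)(0x06 | (ch >> 2)),
                           (uint8_t)((ch & 0x03) << 6),
                           0x00 };
        uint8_t in[3] = { 0 };

        spi_acquire(bus, cs, SPI_MODE_0, SPI_CLK_1MHZ);
        spi_transfer_bytes(bus, cs, false, out, in, sizeof(out));
        spi_release(bus);

        /* 12-bit result: low nibble of byte 1, all of byte 2 */
        return (uint16_t)(((in[1] & 0x0F) << 8) | in[2]);
    }

The ADS1220 is more involved (commands, config registers, DRDY), but the same acquire/transfer/release pattern applies.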
 