Rules of the C language are very confusing

WBahn

Joined Mar 31, 2012
29,976
Why is microcontroller programming mostly done in low-level languages like assembly and C? Why is there no high-level language like Java or .NET?

Do most compilers only support C and C++, and is that why those languages are used, or are there other reasons?
It wasn't all that long ago that nearly all MCU programming was done directly in assembly. The reason is that, historically, MCUs were extremely resource-starved. For instance, the PIC16C55 had 24 bytes of RAM and 512 words of ROM for the program.

Using very tightly targeted versions of C only became feasible once MCUs with significantly more resources became available.

Today the resources available on some MCUs are astonishing and there ARE alternatives other than assembly or C, but most MCUs still suffer from resource and clock-speed limitations, which means you need to get the most bang out of every instruction in the program -- and that still means assembly or C.
 

ApacheKid

Joined Jan 12, 2015
1,533
Why is microcontroller programming mostly done in low-level languages like assembly and C? Why is there no high-level language like Java or .NET?

Do most compilers only support C and C++, and is that why those languages are used, or are there other reasons?
There's a lot of interest and talk about .NET, particularly now that .NET Core has matured and brings huge improvements, simplifications and greater portability.

But .NET isn't small: it has a runtime component that includes the garbage collector, and the GC is big and very complex despite having been rewritten with contributions from many experts in the open-source world.

There is .NET nano - an open-source project (with a Microsoft manager overseeing it) - and there's TinyCLR from GHI Electronics, but in each case there are burdensome limitations, for example no generics support or async/await.

The footprint is just too much for many small embedded devices.

I have been pondering recently the possibility of a brand-new language for embedded applications: something designed specifically and primarily for embedded devices, something that does what C does but does it much better.

I'm no stranger to serious compiler development, so I feel confident about designing it, but it's still a lot of work and I'm really pretty new to embedded programming, so this isn't something I'd undertake hastily. Taking a stab at defining the feature set and overall goals would be an interesting exercise, though.

A core goal - speaking personally - is the abandonment of the antiquated and troublesome C grammar, bringing a much cleaner, easily extensible syntax into the mix.

Other goals might be native language support for common things like SPI and so on, rather than relying on external libraries. Since things like SPI are (often) standardized, that kind of stability means it could be part of the language in principle.
 

crutschow

Joined Mar 14, 2008
34,280
I'm certainly not a programmer, but I had to learn a little C to modify a test-gear program at work (which I managed to do successfully with the help of the C Handbook, along with some serious head scratching), and I often wondered why a computer language with such an arcane syntax had become so popular.
After reading the posts here, I now better understand how and why that happened.
But for me, Basic is about all I can handle.

(My favorite story about UNIX and C is that it was developed as an April Fools' joke, and that when it successfully compiled (;P("\n"),R--;P("|"))for(e=C;e--;P("_"+(*u++/8)%2))P("| "+(*u/4)%2); they considered it complete.)
 

WBahn

Joined Mar 31, 2012
29,976
Seems there is something called "embedded C" which I've never heard of before:

https://standards.iso.org/ittf/PubliclyAvailableStandards/c051126_ISO_IEC_TR_18037_2008.zip

There's an emphasis on fixed-point arithmetic; I didn't know that was a big deal in embedded work.
Floating point arithmetic is very expensive in most embedded processors. Remember, it was enough of a bottleneck that up until the 80486 there were separate (and quite expensive) chips called "math co-processors" whose primary purpose was to perform floating point arithmetic in hardware.

Since many MCUs can't even multiply and divide, let alone do floating point operations, this has to be done in firmware -- often on a platform with a slow clock speed and very limited RAM/ROM resources.
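
For a rough sense of what the fixed-point alternative looks like, here is a minimal sketch in C (the Q16.16 format and the helper name are just illustrative choices, not tied to any particular library or part):

Code:
#include <stdint.h>
#include <stdio.h>

/* Q16.16 fixed point: 16 integer bits and 16 fractional bits in an int32_t. */
typedef int32_t q16_16;

#define Q16_ONE (1 << 16)

/* Multiply two Q16.16 values using only integer arithmetic. */
static q16_16 q16_mul(q16_16 a, q16_16 b)
{
    return (q16_16)(((int64_t)a * b) >> 16);
}

int main(void)
{
    q16_16 x = 3 * Q16_ONE / 2;   /* 1.5   */
    q16_16 y = Q16_ONE / 4;       /* 0.25  */
    q16_16 z = q16_mul(x, y);     /* 0.375 */

    /* Print integer and fractional parts without touching floating point. */
    printf("%ld.%04ld\n", (long)(z >> 16),
           (long)(((int64_t)(z & 0xFFFF) * 10000) >> 16));
    return 0;
}

Even the widening multiply in there gets expanded into several instructions on a small 8-bit part, but it is still far cheaper than pulling in a software floating-point library.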
 

ApacheKid

Joined Jan 12, 2015
1,533
Floating point arithmetic is very expensive in most embedded processors. Remember, it was enough of a bottleneck that up until the 80486 there were separate (and quite expensive) chips called "math co-processors" whose primary purpose was to perform floating point arithmetic in hardware.

Since many MCUs can't even multiply and divide, let alone do floating point operations, this has to be done in firmware -- often on a platform with a slow clock speed and very limited RAM/ROM resources.
Yes, I recall those; I have an old one in my little historic-chips collection here! When I was working on another compiler many years ago, I remember looking at licensing a floating-point library that some firm produced for the Intel 386 family, to support floating-point math without a coprocessor.

So anyway, fixed-point arithmetic is important in the MCU world; I simply never really thought about it.
 

ApacheKid

Joined Jan 12, 2015
1,533
Two very progressive and innovative languages are Forth and APL.
APL stands as one of my favorite languages.
I love APL too, a very underappreciated language. Ken Iverson was primarily interested in "notation as a tool of thought" and won the Turing Award for this, with a paper of the same title.

I was seeking to have the F# language expand its allowed alphabet so we could create functions named after the APL functions (using the very same symbols - they are all part of Unicode); you can read a bit about that here.

I just didn't have the time to push this much, so it's kind of fallen by the wayside. The core point for me was that many APL operators are truly powerful and don't really exist in either the imperative or functional worlds.

The lex tools they used were quite difficult for me to fathom too.
 

nsaspook

Joined Aug 27, 2009
13,079
...
A core goal - speaking personally - is the abandonment of the antiquated and troublesome C grammar, bringing a much cleaner, easily extensible syntax into the mix.

Other goals might be native language support for common things like SPI and so on, rather than relying on external libraries. Since things like SPI are (often) standardized, that kind of stability means it could be part of the language in principle.
SPI is dead simple in basic principle, but there are a thousand variations on how it's implemented on each processor (for max efficiency, highest speed and lowest CPU usage) and on the method of the data interface (software, hardware polled, interrupt, DMA).
https://ww1.microchip.com/downloads/en/DeviceDoc/61106G.pdf
Native language support is a pipe dream for sure.
Some SPI driver C code for RIOT OS with DMA and FIFO support:
https://github.com/nsaspook/RIOT/blob/PIC32MZEF/cpu/mips_pic32_common/periph/spi.c
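
To illustrate the "dead simple in basic principle" part, a bare-bones polled, bit-banged transfer can be sketched in a few lines of C (the pin macros below are hypothetical placeholders, not from any real header):

Code:
#include <stdint.h>

/* Hypothetical GPIO macros -- on a real part these would map to port registers. */
#define SCK_HIGH()      /* set the clock pin */
#define SCK_LOW()       /* clear the clock pin */
#define MOSI_WRITE(b)   /* drive the data-out pin to bit b */
#define MISO_READ()  0  /* sample the data-in pin (stubbed out here) */

/* Software SPI, mode 0, MSB first: shift one byte out while clocking one byte in. */
uint8_t spi_transfer(uint8_t out)
{
    uint8_t in = 0;

    for (uint8_t i = 0; i < 8; i++) {
        MOSI_WRITE((out >> 7) & 1u);  /* present the next bit while SCK is low */
        out = (uint8_t)(out << 1);
        SCK_HIGH();                   /* the slave samples on the rising edge */
        in = (uint8_t)((in << 1) | (MISO_READ() & 1u));
        SCK_LOW();
    }
    return in;
}

Everything past that -- chip selects, clock phase/polarity options, FIFOs, interrupts, DMA -- is where the thousand per-chip variations come in.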

Long-term stability is a big problem, and one that C is really good at handling.

Just today, a 30-year-old CRT touchscreen system that I converted to an ELO IntelliTouch LCD 15 years ago (writing new controller firmware to translate the new serial touch protocol to the ancient binary serial touch protocol) was replaced (by the parts managers, because of shortages) with a new version of the ELO AccuTouch LCD screen that changed the SmartSet status codes for touch responses.

A one-liner in the C code fixed the problem.
https://myelo.elotouch.com/support/s/article/SmartSet-Data-Protocol
** 01, 02 or 04 for AccuTouch controllers; 81, 82 or 84 for IntelliTouch controllers
Now touch works, but the XY screen origin is reversed - another few lines to fix for this variation of touchscreen.
Handling the infinite variations of standard protocols is what drives you mad, not the language grammar.
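
As a purely hypothetical reconstruction of the kind of one-liner involved (not the actual production code): since the two controller families appear to differ only in the high bit of the status byte, the packet handler can simply mask that bit off before comparing.

Code:
#include <stdint.h>

/* Hypothetical sketch: normalize the SmartSet status byte so AccuTouch
 * (0x01/0x02/0x04) and IntelliTouch (0x81/0x82/0x84) controllers take
 * the same code path. */
uint8_t touch_status(uint8_t raw_status)
{
    return (uint8_t)(raw_status & 0x7Fu);  /* drop the controller-family bit */
}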
 

MrChips

Joined Oct 2, 2009
30,706
I love APL too, a very underappreciated language. Ken Iverson was primarily interested in "notation as a tool of thought" and won the Turing Award for this, with a paper of the same title.

I was seeking to have the F# language expand its allowed alphabet so we could create functions named after the APL functions (using the very same symbols - they are all part of Unicode); you can read a bit about that here.

I just didn't have the time to push this much, so it's kind of fallen by the wayside. The core point for me was that many APL operators are truly powerful and don't really exist in either the imperative or functional worlds.

The lex tools they used were quite difficult for me to fathom too.
I still have a manual on how to implement APL on a microcomputer. I never got around to doing so.
I did have APL running on a Z80 CP/M system, which I used quite extensively in its time.
 

ApacheKid

Joined Jan 12, 2015
1,533
SPI is dead simple in basic principle, but there are a thousand variations on how it's implemented on each processor (for max efficiency, highest speed and lowest CPU usage) and on the method of the data interface (software, hardware polled, interrupt, DMA).
https://ww1.microchip.com/downloads/en/DeviceDoc/61106G.pdf
Native language support is a pipe dream for sure.
Some SPI driver C code for RIOT OS with DMA and FIFO support:
https://github.com/nsaspook/RIOT/blob/PIC32MZEF/cpu/mips_pic32_common/periph/spi.c

Long-term stability is a big problem, and one that C is really good at handling.

Just today, a 30-year-old CRT touchscreen system that I converted to an ELO IntelliTouch LCD 15 years ago (writing new controller firmware to translate the new serial touch protocol to the ancient binary serial touch protocol) was replaced (by the parts managers, because of shortages) with a new version of the ELO AccuTouch LCD screen that changed the SmartSet status codes for touch responses.

A one-liner in the C code fixed the problem.
https://myelo.elotouch.com/support/s/article/SmartSet-Data-Protocol


Now touch works, but the XY screen origin is reversed - another few lines to fix for this variation of touchscreen.
Handling the infinite variations of standard protocols is what drives you mad, not the language grammar.
A very good way to evaluate an idea like a new embedded language is to - initially - generate C code from the input source. This means one can get a basic implementation and run something actually built from the new language.

A later step is to replace the C code generator with a native one for, say, ARM, and then use the C-generating version as a test reference for the native code generator.

Writing a language scanner/parser is much easier these days with a powerful language like C#. In fact, I experimented with a new grammar for C# a while back and made some good progress, but it was never a truly serious goal to deliver anything - just a way to explore improved grammars.

I blogged about it here and the C# code can be found here FYI.
 

nsaspook

Joined Aug 27, 2009
13,079
The problem IMO with embedded computer programming is not languages, parsers or grammars; it's programmers not really understanding the domain they are programming in (doubly true with low-level hardware embedded programming) and how to express that internally in the AN/MK1 brain.

C is perfectly logical, with a proper grammar, if you understand the intricacies of low-level memory operations that interface with the internals of processors and associated I/O modules. C makes the memory mechanics explicit and visible. That's exactly what you need in programming domains where machine-oriented and memory-mapped operations are the prime focus, as they are in most microcontrollers.
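
As a concrete example of what "explicit and visible memory mechanics" means, a peripheral register in C is just a volatile access to a fixed address (the addresses, register names and bit position below are made up for illustration):

Code:
#include <stdint.h>

/* Hypothetical memory-mapped peripheral registers at fixed addresses.
 * 'volatile' tells the compiler every access really touches the hardware. */
#define UART_STATUS  (*(volatile uint32_t *)0x40001000u)
#define UART_DATA    (*(volatile uint32_t *)0x40001004u)
#define TX_READY     (1u << 5)   /* made-up status bit */

void uart_putc(char c)
{
    while ((UART_STATUS & TX_READY) == 0)
        ;                        /* spin until the transmitter is free */
    UART_DATA = (uint32_t)c;     /* writing the register starts the transfer */
}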

https://www.cs.utexas.edu/users/EWD/transcriptions/EWD04xx/EWD498.html
 

ApacheKid

Joined Jan 12, 2015
1,533
Well, the grammar can and should strive to make it easier to work in a given domain. For example, one can do OO in C if one wants, but languages like Java and C# make it much less effort because they fit the OO domain more closely.

You speak of memory-mapping details; I'm not familiar enough with these chips yet to appreciate what you mean, but if this were inherently represented in a language, then surely it could help developers understand the domain better?

A huge weakness in the C grammar (and all its derivatives) is the inability to add new keywords without breaking backward compatibility. Of course this is a general issue, not domain-specific, but being able to add keywords would allow brand-new functionality to be added with no backward-compatibility issues, so I place a lot of emphasis on it - though as I said, it's not peculiar to MCU code.
 

nsaspook

Joined Aug 27, 2009
13,079
Well, the grammar can and should strive to make it easier to work in a given domain. For example, one can do OO in C if one wants, but languages like Java and C# make it much less effort because they fit the OO domain more closely.

You speak of memory-mapping details; I'm not familiar enough with these chips yet to appreciate what you mean, but if this were inherently represented in a language, then surely it could help developers understand the domain better?

A huge weakness in the C grammar (and all its derivatives) is the inability to add new keywords without breaking backward compatibility. Of course this is a general issue, not domain-specific, but being able to add keywords would allow brand-new functionality to be added with no backward-compatibility issues, so I place a lot of emphasis on it - though as I said, it's not peculiar to MCU code.
I see the C language restriction on adding new keywords as a long-term language-stability feature. There are plenty of ways to 'add functionality' within the existing C framework using the C preprocessor.

Code:
#define two 2
#define times *

int four = two times two;
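
In the same spirit, here is a slightly less tongue-in-cheek sketch of getting keyword-like constructs out of the preprocessor (the macro names are just made up for illustration):

Code:
#include <stdio.h>

/* Keyword-like constructs built entirely out of the existing C grammar. */
#define forever    for (;;)
#define REPEAT     do {
#define UNTIL(cnd) } while (!(cnd))

int main(void)
{
    int i = 0;

    REPEAT
        printf("pass %d\n", i);
        i++;
    UNTIL(i == 3);

    return 0;
}
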
The embedded programming domain is still filled with people who have a hardware-engineer mentality and program to make things work. That means most of us see software problems as mechanical, with mechanical-physics solutions (engineering schools drill this into your brain as a firmware engineer), instead of as solutions built on a hierarchy of artistic prose with things like OOP. The HW-engineering brains I work with see the current push of OOP languages as a waste of good space in your head, but we also spend the time to be familiar with the hype, just in case there might be a nugget or two buried in the heap of fantastic virtues expounded for those principles.

There are plenty of formal languages, like HDLs, that map hardware requirements, but most are very domain-specific. The more you know, the less you would think that adding things like native SPI support to a general-purpose computer language is a good idea.
 

ApacheKid

Joined Jan 12, 2015
1,533
I see the C language restriction on adding new keywords as a long-term language-stability feature. There are plenty of ways to 'add functionality' within the existing C framework using the C preprocessor.
Adding new keywords does not introduce "instability" unless the grammar has reserved words.

Code:
#define two 2
#define times *

int four = two times two;
That's not adding a new keyword. A new keyword might be something like:

Code:
if (a > 100)
activate
{
    // stuff pertaining to "activation", some new, hitherto-unknown language keyword
}

That can't be done today, because any user-created identifier named "activate" would break the compile. The keywords of C are a small, fixed set that cannot be expanded.

The embedded programming domain is still filled with people who have a hardware-engineer mentality and program to make things work. That means most of us see software problems as mechanical, with mechanical-physics solutions (engineering schools drill this into your brain as a firmware engineer), instead of as solutions built on a hierarchy of artistic prose with things like OOP. The HW-engineering brains I work with see the current push of OOP languages as a waste of good space in your head, but we also spend the time to be familiar with the hype, just in case there might be a nugget or two buried in the heap of fantastic virtues expounded for those principles.
Right, but I have not been, and am not, advocating OOP - simply arguing that a new language specifically addressing the problem domain of MCUs might have its merits, and explaining why I find C past its use-by date in some respects.

There are plenty of formal languages, like HDLs, that map hardware requirements, but most are very domain-specific. The more you know, the less you would think that adding things like native SPI support to a general-purpose computer language is a good idea.
You might well be right; I am just speculating for the sake of argument. Remember, my core complaint is that C is a poor language, with no scope for new keywords and so on. For example, adding keywords like "async" or "await" is impossible because the grammar has reserved words.
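
To make that concrete with a hypothetical example (not from any real code base): if a later C standard simply reserved "async", previously legal programs would stop compiling.

Code:
/* Perfectly legal C today -- 'async' is just an ordinary identifier.
 * If 'async' ever became a reserved keyword, this line would no longer compile. */
int async = 0;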
 

nsaspook

Joined Aug 27, 2009
13,079
Adding new keywords does not introduce "instability" unless the grammar has reserved words.
And adding new keywords to C, or the existence of reserved words, is a problem that doesn't really exist for the vast majority of embedded programmers. A few new reserved words have been added over the years in C99 and C11 (restrict and _Bool in C99, for example, and _Static_assert and _Atomic in C11).
 