The Imperium Programming Language - IPL

dcbingaman

Joined Jun 30, 2021
1,065
A namespace is nothing more than a means of defining a naming hierarchy.

Without namespaces, a function called reset_device must be referred to literally as reset_device, and a second function that resets a different kind of device cannot also be called reset_device; it has to be given a different name, like reset_other_device.

With namespaces you can give these functions a prefix, and as long as the prefixes differ the functions can in fact have identically spelled names. This shows the principle:

Code:
namespace Hardware
   namespace ADC
      procedure reset_device (device_ptr);
         // code
      end;
   end;

   namespace GPS
      procedure reset_device (device_ptr);
         // code
      end;
   end;
end;

procedure main(device_ptrs);

   dcl device_ptrs(2) pointer;

   call Hardware.ADC.reset_device(device_ptrs(ADC_DEVICE));

   call Hardware.GPS.reset_device(device_ptrs(GPS_DEVICE));

end;
There's really nothing more to them than that: rather simple, but hugely helpful. C has no namespace capability, and although one can "simulate" it, doing so requires all kinds of boilerplate code and structuring; having namespaces designed into the language is far better, simpler and cheaper.
I have run into this myself multiple times when dealing with vendor 'libraries': name clashes during compilation that are very difficult to troubleshoot and/or correct. It would be nice to see namespaces added to the ANSI C standard. Most of the time I end up using very specific function names in libraries just to avoid this problem:

void EEPROM2237_Init(void);
void EEPROM2237_Write(...);

etc.

Very annoying.
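The heavier workaround, beyond plain prefixes, is to fake a namespace with a struct of function pointers. A rough sketch of the kind of boilerplate involved (all names here are made up purely for illustration):

C:
/* Fake "namespaces" via structs of function pointers.
   Names are hypothetical, for illustration only. */
static void adc_reset(void *dev) { (void)dev; /* ADC-specific reset */ }
static void gps_reset(void *dev) { (void)dev; /* GPS-specific reset */ }

struct reset_ns {
    void (*reset_device)(void *dev);
};

static const struct reset_ns ADC = { adc_reset };
static const struct reset_ns GPS = { gps_reset };

/* Call sites then read almost like the namespaced form:
      ADC.reset_device(adc_ptr);
      GPS.reset_device(gps_ptr);                          */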
 

Thread Starter

ApacheKid

Joined Jan 12, 2015
1,533
I have run into this myself multiple times when dealing with vendor 'libraries': name clashes during compilation that are very difficult to troubleshoot and/or correct. It would be nice to see namespaces added to the ANSI C standard. Most of the time I end up using very specific function names in libraries just to avoid this problem:

void EEPROM2237_Init(void);
void EEPROM2237_Write(...);

etc.

Very annoying.
Yes, adding this to C would have been one of the more truly helpful changes. I suspect the grammar itself is the roadblock: C's grammar is hard to parse (it is full of ambiguities, and C++ is among the hardest of all languages to parse), so adding a new keyword like "namespace" is very hard to do without potentially breaking backward compatibility.

IMHO C is a dead end language, hard if not impossible to truly enhance and enrich.
 

dcbingaman

Joined Jun 30, 2021
1,065
Yes, adding this to C would have been one of the more truly helpful changes. I suspect the grammar itself is the roadblock: C's grammar is hard to parse (it is full of ambiguities, and C++ is among the hardest of all languages to parse), so adding a new keyword like "namespace" is very hard to do without potentially breaking backward compatibility.

IMHO C is a dead end language, hard if not impossible to truly enhance and enrich.
True, C is definitely not the easiest to work with. The nice thing about it is that it is well documented and standardized, a lot of people know it (at least somewhat), and a lot of free compilers, especially for microcontrollers, support it. If I want to give someone a general algorithm, for example, I typically use C; not because it is superior in any way, but because it is the language common to a lot of engineers. I think C++ is a little overkill for simple microcontrollers and simple microcontroller applications. I developed PC applications with C++ until C# and .NET came along, which make those applications much easier to develop. C# is an insanely powerful language but of course not meant for small microcontrollers. At my last job we worked with more powerful processors running VxWorks, and most of that development was C++. Originally (at the beginning of my career) I used assembly language for programming microcontrollers, but I prefer C on microcontrollers as the code can be ported to other platforms and it gives me enough control at the register level to get done what I need.
 

Thread Starter

ApacheKid

Joined Jan 12, 2015
1,533
True, C is definitely not the easiest to work with. The nice thing about it is that it is well documented and standardized, a lot of people know it (at least somewhat), and a lot of free compilers, especially for microcontrollers, support it. If I want to give someone a general algorithm, for example, I typically use C; not because it is superior in any way, but because it is the language common to a lot of engineers. I think C++ is a little overkill for simple microcontrollers and simple microcontroller applications. I developed PC applications with C++ until C# and .NET came along, which make those applications much easier to develop. C# is an insanely powerful language but of course not meant for small microcontrollers. At my last job we worked with more powerful processors running VxWorks, and most of that development was C++. Originally (at the beginning of my career) I used assembly language for programming microcontrollers, but I prefer C on microcontrollers as the code can be ported to other platforms and it gives me enough control at the register level to get done what I need.
Well, I've used C a great deal in the past; only recently have I used it with microcontrollers (using Visual Studio 2022 and VisualGDB).

It was while working on a small exploratory project (the readme details how I was actively organizing the code; much of the code discussed there exists precisely because there's no namespace in C) that I began to experience increasing frustration with the language. I was spending too much time and energy keeping the code base tidy and well organized. This is true of C more than almost any other language: one has to keep it on a tight leash and actively wrestle with it, otherwise it has a tendency - a natural tendency - to become disorganized and unclear.

This frustration is why this thread exists, I wanted to see if I could do better.
 

Thread Starter

ApacheKid

Joined Jan 12, 2015
1,533
On a separate note, I scoured the standard operators used by various languages, including Verilog. I've settled on these as the core bitwise operators, given that this is intended to be an MCU-friendly language:

Code:
|        OR
&        AND
^        XOR
~|       NOR
~&       NAND
~^       XNOR
The tilde ~ is obviously the bitwise NOT operator. The precedence rules for these are pretty much as seen in most other languages.
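For comparison, C gives you only the first three as single operators; NOR, NAND and XNOR have to be composed, which is a small but recurring annoyance. A quick sketch (the helper names are mine, just for illustration):

C:
#include <stdint.h>

/* C only has |, & and ^ directly; the negated forms must be composed. */
static uint8_t bit_nor (uint8_t a, uint8_t b) { return (uint8_t)~(a | b); }
static uint8_t bit_nand(uint8_t a, uint8_t b) { return (uint8_t)~(a & b); }
static uint8_t bit_xnor(uint8_t a, uint8_t b) { return (uint8_t)~(a ^ b); }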

Verilog has an interesting class of operators I'd never seen before, the "reduction" operators. These are, it seems, prefix operators: the same symbols used for the bitwise operators, but used in a prefix context.

I don't know enough about Verilog or its grammar to know exactly how ambiguities are avoided, given that it uses the same symbols for bitwise as it does for reduction. I'm tempted to think an alternative set of symbols would be better, but I need to read up.

I also have no idea how, or whether, having these reduction operators in an MCU programming language might help....
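From what I've read so far, a reduction operator simply folds all the bits of one operand down to a single bit. In C terms it would look something like this (a rough sketch, assuming 32-bit operands; the helper names are made up):

C:
#include <stdint.h>

/* Rough C equivalents of Verilog-style reduction operators on a 32-bit
   word: each one folds the whole operand down to a single bit. */
static uint32_t reduce_or (uint32_t x) { return x != 0u; }         /* 1 if any bit is set   */
static uint32_t reduce_and(uint32_t x) { return x == UINT32_MAX; } /* 1 if all bits are set */
static uint32_t reduce_xor(uint32_t x)                             /* 1 if an odd number of bits are set */
{
    uint32_t parity = 0u;
    while (x) { parity ^= x & 1u; x >>= 1; }
    return parity;
}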
 
Last edited:

Thread Starter

ApacheKid

Joined Jan 12, 2015
1,533
The grammar is now rather stable; the vast bulk of it is done and looking solid. So I am now just starting to examine the parse tree generated by ANTLR, to get a feel for how that code will look.

This is the very beginning of the semantic analysis phase of this project, and this code is experimental.

This is the very first diagnostic message, reporting a problem in a 2D array that has both an upper and a lower bound specified:

1675117609373.png

This is the code it is reporting on:

1675117675196.png

also:

1675122591436.png

It's trivial to specify an option to alert the user that a keyword has been used as an identifier. This is 100% legal and has no impact on the code's compilation or execution, but it might be undesirable, done unwittingly, or the result of using a new version of the language that has introduced keywords that were already used as identifiers in this source code; these are "I" - informational - messages.
 
Last edited:

Thread Starter

ApacheKid

Joined Jan 12, 2015
1,533
Our old friend the "ternary" operator came up next; this is something I find valuable, yet PL/I never had any special syntax for it.

After thinking about it, I came up with two distinct grammatical forms:

1675436152144.png

The grammar rules for this are easy and present no conflict with existing rules. The first is general, simply mapping an expression to some other expression with a trailing default case; the second is specifically for the boolean case, akin to the C/C++ '?' operator.

The symbol ⇒ is an example of a Unicode alternate. I've allowed some symbols to have this alternate form when there's an obvious close match to the ASCII form. The ASCII for that symbol is '=>'; either can be used, and the choice has no impact on parsing whatsoever.
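For reference, the C/C++ form that the second case corresponds to is the conditional operator, which also chains to give a trailing default. A trivial sketch (the names are invented):

C:
enum { SLOW, FAST, STOPPED };

/* The C conditional operator, chained, with a trailing default case. */
static int classify(int rate)
{
    return (rate == 100) ? SLOW
         : (rate == 200) ? FAST
         :                 STOPPED;
}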
 

Thread Starter

ApacheKid

Joined Jan 12, 2015
1,533
These expressions (because that's all they are) can be nested too of course:

Code:
a = expr/36 + 17 ⇒ (expr_1 ⇒ rate ⇒ (100 ⇒ slow)(200 ⇒ fast)(stopped))(expr_2 ⇒ result_2)(result_3);
Equivalent, more or less, to:

Code:
temp = expr/36 + 17;

if temp = expr_1 then
   if rate = 100 then
      return (slow);
   elif rate = 200 then
      return (fast);
   else
      return (stopped);
   end;
elif temp = expr_2 then
   return (result_2);
else
   return (result_3);
end;
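For anyone more at home in C, the same nested selection can also be rendered with the conditional operator. A hedged sketch, with the free variables from the example passed in as parameters just to keep it self-contained:

C:
/* A C rendering of the nested selection above using the conditional
   operator; the names are taken from the example. */
static int select_result(int expr, int expr_1, int expr_2, int rate,
                         int slow, int fast, int stopped,
                         int result_2, int result_3)
{
    int temp = expr / 36 + 17;

    return (temp == expr_1) ? ((rate == 100) ? slow
                             : (rate == 200) ? fast
                             :                 stopped)
         : (temp == expr_2) ? result_2
         :                    result_3;
}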
 

nsaspook

Joined Aug 27, 2009
13,079
Well, I've used C a great deal in the past; only recently have I used it with microcontrollers (using Visual Studio 2022 and VisualGDB).

It was while working on a small exploratory project (the readme details how I was actively organizing the code; much of the code discussed there exists precisely because there's no namespace in C) that I began to experience increasing frustration with the language. I was spending too much time and energy keeping the code base tidy and well organized. This is true of C more than almost any other language: one has to keep it on a tight leash and actively wrestle with it, otherwise it has a tendency - a natural tendency - to become disorganized and unclear.

This frustration is why this thread exists, I wanted to see if I could do better.
I think you can easily do a lot better than C, but that in itself isn't the cure that moves micro-controller embedded programming away from C. It must provide value to the end user, who doesn't care about pedantic programming styles. They want a tool that provides very transparent abstraction to hardware, to ease hardware debugging when the software is 100% correct, because the job task is not writing software, it's making hardware work correctly with software. It's often necessary to modify (at times using 'ad-hoc' methods) 100% correct software to work around errata-type hardware bugs and other real-world interface issues. C provides this value down to the smallest micro-controller, at the cost of being a dead-end language like Latin. ;)
Both will likely never become "extinct languages", though, because of their heavy influence in science, engineering and technology.
 

Thread Starter

ApacheKid

Joined Jan 12, 2015
1,533
I think you can easily do a lot better than C, but that in itself isn't the cure that moves micro-controller embedded programming away from C. It must provide value to the end user, who doesn't care about pedantic programming styles. They want a tool that provides very transparent abstraction to hardware, to ease hardware debugging when the software is 100% correct, because the job task is not writing software, it's making hardware work correctly with software. It's often necessary to modify (at times using 'ad-hoc' methods) 100% correct software to work around errata-type hardware bugs and other real-world interface issues. C provides this value down to the smallest micro-controller, at the cost of being a dead-end language like Latin. ;)
Both will likely never become "extinct languages", though, because of their heavy influence in science, engineering and technology.
You raise several interesting points here.

I am certainly not seeking to see things move away from C; that's outside of my control and is not a goal of mine. I do want to develop a programming language that is preferable to C, though: a language one would choose given the freedom to pick, a better language overall.

Now, I assume that when you say "provides very transparent abstraction to hardware" you're referring to MCU peripherals? Because the other stuff is already abstracted: the stack is abstracted, memory is abstracted, machine instructions are abstracted. So is that right, you mean peripherals too should be abstracted?

This is thought-provoking too: "the job task is not writing software, it's making hardware work correctly with software". But that sounds almost paradoxical to me. An MCU is both hardware and software; the hardware depends on the software and the software depends on the hardware. It is impossible to have a useful MCU without software.

The end product requires that both the hardware and the software work; they are mutually dependent on each other.

All the software is, anyway, is a large collection of preset bit patterns; the software is just data in memory. The challenge is that that data is the end result of human intelligence and creativity; there is no other way to get it if it doesn't already exist.

The same is true of hardware: human intelligence is necessary to produce it, only instead of being an arrangement of information it is a topological arrangement of material bits and pieces.

Finally, I agree, C does provide value; it does work and can be used to great effect. It just isn't very good: the time and intellectual effort needed to get a result is higher than I think necessary. If one could produce MCU software with less effort, in less time and with less debugging, would that not be better?

In another discussion I had recently, the subject of being able to include raw binary data in a compilation came up; people - C users - were asking why the compiler does not support that via some option. This is one of umpteen things it could do but doesn't (yet) do.

Here's a great article by a guy on the C23 committee who fought tooth and nail to get this into the new standard; it's worth a read because it really captures much of what I find disappointing about C.

#embed is in C23.

I quote:

You would think that we would have figured out a decent, cross-platform way to put data in executables after 60 years of filesystem research and advances and 40 years of C and C++ implementations.

You would unfortunately be thinking wrong.
So much energy went into convincing people that yes, we should have this, no, we can’t “just parse better” our way out of it, no, the smartest people in the last 30 years across all major compilers were literally not capable of doing a better job at this!
It's a very good idea and I'm adding it to IPL. It's a no-brainer that nobody in this forum even thought of - why? Not because they're dumb or hate programming or any of that; it's mainly because the tools we use sometimes dictate the way we think. As we use tools over and over we accommodate their shortcomings; we don't overcome them, we just adapt to them, but that does not mean they are not real shortcomings.
 

Thread Starter

ApacheKid

Joined Jan 12, 2015
1,533
Speaking of hardware, here are some of the operators I'm putting into IPL. PL/I never had shifts or anything like that, but they are important and C and other more recent languages have them, so here's what I have so far:

Here's an example:

Code:
status ⇐ xfer_reg ⧀ 3 & saved_reg ⧁ 5 ⊕ control_mask;
This rotates the bits in 'xfer_reg' left by three, ANDs that with 'saved_reg' rotated right by five, and then exclusive-ORs the result of all that with 'control_mask'.
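In C the same statement needs helper functions, since there are no rotate operators. A sketch assuming 32-bit registers (the function names are mine):

C:
#include <stdint.h>

/* Rotate helpers (valid here for shift counts 1..31). */
static uint32_t rotl32(uint32_t x, unsigned n) { return (x << n) | (x >> (32u - n)); }
static uint32_t rotr32(uint32_t x, unsigned n) { return (x >> n) | (x << (32u - n)); }

static uint32_t update_status(uint32_t xfer_reg, uint32_t saved_reg, uint32_t control_mask)
{
    /* rotate xfer_reg left 3, AND with saved_reg rotated right 5, XOR with control_mask */
    return (rotl32(xfer_reg, 3) & rotr32(saved_reg, 5)) ^ control_mask;
}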

Now, before people start screaming, I want to mention that these are what I term Unicode alternates. They are optional and equivalent to their ASCII counterparts; the parser is effectively unaware of them.

Here's the set so far:

1675458218900.png

So one can write either '=' or '⇐' in an assignment (they are distinct tokens because '=' is also used for comparison, but in an assignment they are interchangeable).

Look at XOR: it can be '^' or, if you want, '⊕' - cool, yes? Again, there is no need to use these Unicode symbols; they are there if desired.

You have rotates, shifts, AND and OR, and even the Verilog-style reduction AND and OR:

Code:
      tally ⇐ &(input) & !(previous);
That parses absolutely fine: it ANDs together all the bits in 'input' and then ANDs that single bit with the OR of all the bits in 'previous'.
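In plain C that statement would come out roughly as follows (a sketch assuming 32-bit values, reading '&(x)' as the reduction AND and '!(x)' as the reduction OR described above):

C:
#include <stdint.h>

/* tally = (reduction AND of input) & (reduction OR of previous) */
static uint32_t tally_from(uint32_t input, uint32_t previous)
{
    uint32_t all_set = (input == UINT32_MAX); /* 1 if every bit of input is set   */
    uint32_t any_set = (previous != 0u);      /* 1 if any bit of previous is set  */
    return all_set & any_set;
}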
 
Last edited:

nsaspook

Joined Aug 27, 2009
13,079
There is an advantage, in an embedded MCU language, to keeping language constructs and support structures as lightweight as possible, as it expands what is possible on hardware with limited memory and processing power. It also increases the probability that the embedded hardware engineer who also needs to write code will quickly learn the language, because it makes practical sense to an EE who loves complex physical hardware tasks.

"Provides very transparent abstraction to hardware" "the job task is not writing software, it's making hardware work correctly with software" means the ability to precisely think how you're molding hardware using software that likely controls other hardware that often obeys physical laws with hard time and energy constraints. MCU hardware/software are mutually dependent but typically IMO not mutually important to machine operation as I've built various software system functionalities for the same MCU based hardware system that MUST be physically compatible in the hardware domain or the software will never work. Usually in the highly optimized for cost for function MCU world, the hardware design is costly prime wood and the software is much cheaper glue.
 

nsaspook

Joined Aug 27, 2009
13,079
In another discussion I had recently, the subject of being able to include raw binary data in a compilation came up; people - C users - were asking why the compiler does not support that via some option. This is one of umpteen things it could do but doesn't (yet) do.

Here's a great article by a guy on the C23 committee who fought tooth and nail to get this into the new standard; it's worth a read because it really captures much of what I find disappointing about C.

#embed is in C23.

I quote:

It's a very good idea and I'm adding it to IPL. It's a no-brainer that nobody in this forum even thought of - why? Not because they're dumb or hate programming or any of that; it's mainly because the tools we use sometimes dictate the way we think. As we use tools over and over we accommodate their shortcomings; we don't overcome them, we just adapt to them, but that does not mean they are not real shortcomings.
On a typical small MCU system without a file system or OS, fixed raw binary data is usually put in a const array, as hex values, in a separate include file. Automating that on the compile-host side would be nice in general for PC targets, but for small MCU systems it's usually a limited-size fixed binary that only needs to be converted once, outside the compile cycle, from some (image, etc.) format to a compatible binary.

Featuritis
1675465746174.png

C:
#include "lcd_drv.h"

/*
* image2cpp
* created using https://javl.github.io/image2cpp/
* plain bytes, Vertical - 1 bit per pixel
*/

const uint8_t foo_map[] = {
// 'foo', 100x100px
0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01,
// data deleted from post
0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08,
0x08, 0x08, 0x08, 0x08
};
Boot Image.
1675465568385.png
 

Thread Starter

ApacheKid

Joined Jan 12, 2015
1,533
There is an advantage, in an embedded MCU language, to keeping language constructs and support structures as lightweight as possible, as it expands what is possible on hardware with limited memory and processing power. It also increases the probability that the embedded hardware engineer who also needs to write code will quickly learn the language, because it makes practical sense to an EE who loves complex physical hardware tasks.

Well, generating efficient, compact code is really the goal there. A "language construct" is an abstraction intended for human reasoning; how it gets translated to machine code is important, but there's no fixed correspondence between a "language construct" and the performance of the generated code.

Many C toolchains today get much of their optimization not from the language front end but from a back end such as LLVM, which has a huge set of optimizations available: various forms of peephole optimization and more. Clang, for example, generates LLVM IR instructions rather than CPU-specific code.

"Provides very transparent abstraction to hardware" "the job task is not writing software, it's making hardware work correctly with software" means the ability to precisely think how you're molding hardware using software that likely controls other hardware that often obeys physical laws with hard time and energy constraints. MCU hardware/software are mutually dependent but typically IMO not mutually important to machine operation as I've built various software system functionalities for the same MCU based hardware system that MUST be physically compatible in the hardware domain or the software will never work. Usually in the highly optimized for cost for function MCU world, the hardware design is costly prime wood and the software is much cheaper glue.
Well, I'd need some specific examples of a "transparent abstraction" of hardware to comment much more on that; do you have some?

I'm mainly working on a systems programming language; that's the general way I'd characterize this work: a language that would be a better choice than, say, C or C++ for writing an operating system. If a language can achieve that, then I'd contend it likely has much to offer an MCU programmer.
 

Thread Starter

ApacheKid

Joined Jan 12, 2015
1,533
On a typical small MCU system without a file system or OS, fixed raw binary data is usually put in a const array, as hex values, in a separate include file. Automating that on the compile-host side would be nice in general for PC targets, but for small MCU systems it's usually a limited-size fixed binary that only needs to be converted once, outside the compile cycle, from some (image, etc.) format to a compatible binary.

Featuritis
View attachment 286746

C:
#include "lcd_drv.h"

/*
* image2cpp
* created using https://javl.github.io/image2cpp/
* plain bytes, Vertical - 1 bit per pixel
*/

const uint8_t foo_map[] = {
// 'foo', 100x100px
0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01,
// data deleted from post
0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08,
0x08, 0x08, 0x08, 0x08
};
Boot Image.
View attachment 286745
Well, the goal there is for the runtime code to have 'foo_map' pre-populated. Generating initializer text in the required syntax, with '0x' and ',' all over the place, is one way of doing it: a means to an end.

A better way to do it (as the article explains, and as will be available in C23) is to use #embed <binary-filename>; the raw binary data gets pulled in as the code is compiled, with the same end result but no need for the intermediate text form. Are you really saying that this is not an attractive feature?
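Roughly, it looks like this in C23 (a sketch; the pre-converted 1-bit bitmap file name here is made up):

C:
#include <stdint.h>

/* C23 sketch: #embed expands to a comma-separated list of the file's
   byte values, so the hand-generated hex table is no longer needed.
   The file name is hypothetical. */
const uint8_t foo_map[] = {
#embed "foo_1bpp.bin"
};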

This is a superb example of a hardware-developer-friendly feature.
 

nsaspook

Joined Aug 27, 2009
13,079
Well, the goal there is for the runtime code to have 'foo_map' pre-populated. Generating initializer text in the required syntax, with '0x' and ',' all over the place, is one way of doing it: a means to an end.

A better way to do it (as the article explains, and as will be available in C23) is to use #embed <binary-filename>; the raw binary data gets pulled in as the code is compiled, with the same end result but no need for the intermediate text form. Are you really saying that this is not an attractive feature?

This is a superb example of a hardware-developer-friendly feature.
I can tell you don't have much embedded experience with limited MCU hardware.

Attractive, yes, in a general way; necessary for most MCU embedded tasks, no, as the devil is in the details. It's not always the case (and rarely when you need to optimize for space and compatibility) that the original binary format of the file is compatible with the MCU application. For my boot image the original binary was a JPG. That's a binary for sure, but it's the wrong format of binary: I needed a 1-bit bitmap format compatible with the display I use on this embedded project, so I can DMA the image memory buffer directly to the display without expensive format transformations. Unless you also include a universal format converter like image2cpp, which converts, resizes and changes coordinate systems directly into a C-array-compatible form, it's not a useful addition if it's strictly limited to byte conversion.

https://github.com/nsaspook/wfi32/blob/nhost/firmware/lcd_drv/foo.jpeg

This is a superb example of hardware-developer-friendly features in a program.
https://javl.github.io/image2cpp/
 
Last edited:

Thread Starter

ApacheKid

Joined Jan 12, 2015
1,533
I can tell you don't have much embedded experience with limited MCU hardware.

Attractive, yes, in a general way; necessary for most MCU embedded tasks, no, as the devil is in the details. It's not always the case (and rarely when you need to optimize for space and compatibility) that the original binary format of the file is compatible with the MCU application. For my boot image the original binary was a JPG. That's a binary for sure, but it's the wrong format of binary: I needed a 1-bit bitmap format compatible with the display I use on this embedded project, so I can DMA the image memory buffer directly to the display without expensive format transformations. Unless you also include a universal format converter like image2cpp, which converts, resizes and changes coordinate systems directly into a C-array-compatible form, it's not a useful addition if it's strictly limited to byte conversion.

https://github.com/nsaspook/wfi32/blob/nhost/firmware/lcd_drv/foo.jpeg

This is a superb example of hardware-developer-friendly features in a program.
https://javl.github.io/image2cpp/
I can tell you don't have much experience with designing and implementing programming languages.

There's more to #embed; you should read up on it. Are you arguing it's better not to have #embed? Initializing static memory from a binary file at compile time, without the manual step of creating parsable source text, is a no-brainer. A file is just a stream of bytes; if you actually want to load a different file then create it and embed that one.

I'm quite sure there are many use cases for it; as with anything, one can always find exceptions. The point you're missing is that #embed is portable.

We're at an impasse. It really seems that you are of the view that nothing can be done to improve the programming experience or costs when the targets are MCUs: C isn't perfect, but don't dare suggest we replace it. So it's simple: I disagree with that position. I too am an engineer and I like to improve things rather than resignedly accept the status quo.

Incidentally, I openly admit I'm not experienced at MCU development, but that's not necessarily a bad thing. Edison wasn't a trained electrical engineer, and Einstein was a patent clerk rather than an academic physicist when he wrote his 1905 papers; sometimes being an outsider gives one an advantage: a fresh perspective, an ability to see the big picture that those embroiled in details cannot see.
 
Last edited:

nsaspook

Joined Aug 27, 2009
13,079
I can tell you don't have much experience with designing and implementing programming languages.

There's more to #embed; you should read up on it. Are you arguing it's better not to have #embed? Initializing static memory from a binary file at compile time, without the manual step of creating parsable source text, is a no-brainer. A file is just a stream of bytes; if you actually want to load a different file then create it and embed that one.

I'm quite sure there are many use cases for it; as with anything, one can always find exceptions. The point you're missing is that #embed is portable.

We're at an impasse. It really seems that you are of the view that nothing can be done to improve the programming experience or costs when the targets are MCUs: C isn't perfect, but don't dare suggest we replace it. So it's simple: I disagree with that position. I too am an engineer and I like to improve things rather than resignedly accept the status quo.

Incidentally, I openly admit I'm not experienced at MCU development, but that's not necessarily a bad thing. Edison wasn't a trained electrical engineer, and Einstein was a patent clerk rather than an academic physicist when he wrote his 1905 papers; sometimes being an outsider gives one an advantage: a fresh perspective, an ability to see the big picture that those embroiled in details cannot see.
No, I don't. I'm a language tool user, not a language tool maker, but I do know how to use several of them in this domain, and I have the experience to tell you what actually works and what doesn't on actual MCU systems I created that have been working 24/7 for decades.

I did read the blog of the #embed feature submitter for C23 and its long-winded, sad history. Good stuff, and useful for several things like selecting a connected device's firmware binary at compile time, but it's rare for micro-controllers to need sizable raw, unconverted binaries to upload to another machine. It's usually the other way around: a machine with an OS downloads to the MCU via a communication-port boot-loader on the micro-controller.
Yes, it's byte-format portable (usually a trivial problem), but that's not really sufficient in the many cases where a file you didn't originally create has to be converted from one type of binary to another.

"A file is just a stream of bytes" It's a meaningless statement for what it represents as a source of information that must be processed by some MCU.

"if you actually want to load a different file then create it and embed that one"
Why would I add a compile-time #embed step when the external format conversion already does the embedding for me in one step?

I'm saying it's better to spend time on things of more MPL value to the embedded programmer targeted (I hope that's the target) by your efforts. That way it's more likely to actually be used by people like me, who don't even think about C as a programming language on a conscious level when programming. The solutions to the hardware issues the controllers are used for are worked out almost subconsciously, as C code, as an expression of the function needed. This ability to not think about programming while programming is the reason C is still so popular today with people who are not software programmers by profession.

It's like an experienced car driver: you only really think about driving during exceptions, and after a while most exceptions are handled automatically.
 
Last edited:

nsaspook

Joined Aug 27, 2009
13,079
Here was a possible use case for #embed on a microcontroller, but the OEM already provides a C-compatible binary include file in their device driver/demo, which I just copied into my program for the "any/no motion interrupts" feature of the chip's internal feature engine. I don't actually use that functionality on the vibration sensor network as it's mainly for wearable applications.

https://github.com/boschsensortec/BMA490L-Sensor-API/blob/master/bma490l.c
They are now obsolete, replaced by the bma400 or bma456.

https://github.com/boschsensortec/BMA456-Sensor-API/blob/master/bma456h.c
https://github.com/boschsensortec/BMA456MM-Sensor-API/blob/master/bma456mm.c

The selected replacement, the bma400, doesn't need a config file downloaded from the controller, which is much better for reducing memory use and code complexity.
https://community.bosch-sensortec.com/t5/Knowledge-base/BMA400-accelerometer-design-guide/ta-p/7397
https://www.mouser.com/datasheet/2/783/BST_BMA400_DS000-1509606.pdf

I've found that for most OEM devices that need a binary, the vendor already provides it as a C file (because they know their typical product consumer) rather than as a raw binary that would need to be converted if the target is a small embedded system.
 
Last edited:

Thread Starter

ApacheKid

Joined Jan 12, 2015
1,533
No, I don't. I'm a language tool user, not a language tool maker, but I do know how to use several of them in this domain, and I have the experience to tell you what actually works and what doesn't on actual MCU systems I created that have been working 24/7 for decades.

I did read the blog of the #embed feature submitter for C23 and its long-winded, sad history. Good stuff, and useful for several things like selecting a connected device's firmware binary at compile time, but it's rare for micro-controllers to need sizable raw, unconverted binaries to upload to another machine. It's usually the other way around: a machine with an OS downloads to the MCU via a communication-port boot-loader on the micro-controller.
Yes, it's byte-format portable (usually a trivial problem), but that's not really sufficient in the many cases where a file you didn't originally create has to be converted from one type of binary to another.

"A file is just a stream of bytes" It's a meaningless statement for what it represents as a source of information that must be processed by some MCU.
Yet a file is just a stream of bytes. Initializing static data at compile time from a file is a useful option, a useful additional way to achieve it. Granted, there are different use cases and not all of them are addressed by the feature, but having it might well allow certain workflows to be improved for certain users.

"if you actually want to load a different file then create it and embed that one"
Why would I add a compile-time #embed step when the external format conversion already does the embedding for me in one step?
I only meant that if you have a file with data in format X, but the runtime code is written to expect format Y, then one option is to store the file in format Y in the first place. The conversion is only happening because the data is not in the format the code was written to process.

I'm saying it's better to spend time on things of more MPL value to the embedded programmer targeted (I hope that's the target) by your efforts. That way it's more likely to actually be used by people like me, who don't even think about C as a programming language on a conscious level when programming. The solutions to the hardware issues the controllers are used for are worked out almost subconsciously, as C code, as an expression of the function needed. This ability to not think about programming while programming is the reason C is still so popular today with people who are not software programmers by profession.

It's like an experienced car driver: you only really think about driving during exceptions, and after a while most exceptions are handled automatically.
I don't think you've ever answered my question - what exactly do you think would be useful, helpful features above and beyond what C provides, in the context of writing code for MCUs?

Do you think:

1. There are no such features possible; C cannot be improved upon.
2. I don't care, C is sufficient for me and I do not care if it can or cannot be improved upon.
3. Yes I can think of the following that I would like to see or be able to express in C:
...
 