Linking in C embedded project

THE_RB

Joined Feb 11, 2008
5,438
More twilight zone?

"Dead code optimisation" is an elite feature, my MikroC is the real old version (long before MikroC "pro") and does not have that feature. Dead code refers to not compiling sections within code that cannot be reached.

What I'm talking about is normal pre-processor activity that should occur in all C compilers: checking which functions are used and then not compiling functions that are never called. The pre-processor in any embedded compiler needs to check all functions, their sizes and the frequency of their calls, for the purposes of ordering/fitting them neatly in ROM banks etc. So the preprocessor always does a lot of work on functions before compiling anyway, and it's absolutely standard to NOT compile functions that are never called. It's incredibly simple for any preprocessor to search the code for calls to a function.

This is totally different to dead code!
Dead code: any code areas that cannot be reached due to the decisions of the code are eliminated, and function calls within that code are eliminated (which might also eliminate some functions). This requires elite-level source code decision/branching analysis.

Normal preprocessor: checks variables and every place they are used, and eliminates unused variables from compilation and RAM. Checks function calls and functions; if any function is never called it is eliminated from compilation and ROM.
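To make the distinction concrete, here is a minimal, hypothetical sketch in plain C (the names and the toolchain behaviour described in the comments are illustrative, not taken from any particular compiler's documentation):

```c
#include <stdio.h>

/* Case 1: a function that is never called. A toolchain that drops
   un-called functions simply never emits code for this. (It is static
   here so a standard desktop compiler can also see that nothing else
   references it; a whole-program embedded compiler can do the same
   for non-static functions, since it sees every call site.) */
static void never_called(void)
{
    printf("no caller anywhere in the program\n");
}

/* Case 2: dead code *inside* a function that IS called. Stripping the
   unreachable branch requires the compiler to track the value of
   'mode' through the code - the branch analysis being called an
   "elite" optimisation above. */
static void always_called(void)
{
    const int mode = 0;              /* fixed at compile time */

    if (mode == 1) {
        printf("dead code - this branch can never run\n");
    } else {
        printf("live code\n");
    }
}

int main(void)
{
    always_called();                 /* never_called() is never referenced */
    return 0;
}
```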

Across many compilers and platforms for many years, I have left unused functions in my code and they have never been compiled.

Maybe the way you are declaring functions in your source or header is forcing the compiler to compile them? Do you have a real-world example?
 

ErnieM

Joined Apr 24, 2011
8,377
I believe you need to do some research on these things and stop letting your imagination get the best of you.

The "normal pre-processor activity" is not a function of the preprocessor, but the compiler itself.
 

THE_RB

Joined Feb 11, 2008
5,438
Where does the imagination come in? You're the one saying the compiler will compile un-called functions. Still no proof coming from you either, I noticed. ;)

It's the pre-processor's job to count labels, vars, functions etc, before the compilation begins. From my memory it's the pre-processor that finds dud labels and eliminates them from compilation, but maybe things have changed in the C pre-processor world?

After the pre-processor has checked/counted the functions and their declarations and calls it's of little relevance to our discussion whether the pre-processor OR the compiler makes that decision to eliminate the un-called functions.

The relevant point is whether an embedded C compiler will compile un-called functions or not.
 

ErnieM

Joined Apr 24, 2011
8,377
THE_RB said:
Where does the imagination come in? You're the one saying the compiler will compile un-called functions. Still no proof coming from you either, I noticed. ;)

It's the pre-processor's job to count labels, vars, functions etc, before the compilation begins. From my memory it's the pre-processor that finds dud labels and eliminates them from compilation, but maybe things have changed in the C pre-processor world?

After the pre-processor has checked/counted the functions and their declarations and calls it's of little relevance to our discussion whether the pre-processor OR the compiler makes that decision to eliminate the un-called functions.

The relevant point is whether an embedded C compiler will compile un-called functions or not.
Nothing has changed. You are wrong again.

I do hope no one follows your advice to #include dot c files in other dot c files. That can easily lead to new work not getting compiled unless you religiously clean before building, something the OP here was attempting to avoid.
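To illustrate the pitfall with a deliberately simplified, hypothetical two-file layout (helpers.c, main.c and add_one() are invented names, not anything from the OP's project):

```c
/* helpers.c -- in this (bad) pattern it is pulled in as text below,
   and must NOT also be compiled as its own translation unit */
int add_one(int x)
{
    return x + 1;
}
```

```c
/* main.c */
#include <stdio.h>
#include "helpers.c"   /* textual include: helpers.c is pasted in here */

int main(void)
{
    printf("%d\n", add_one(41));
    return 0;
}
```

Because helpers.c is pulled in textually, the build system only has main.c on its list of files to compile; unless its dependency tracking also happens to record helpers.c, an edit to helpers.c will not trigger a rebuild of main's object, and the stale code sits in the image until you do a full clean. Compiling helpers.c as its own object and exposing add_one() through a helpers.h avoids that.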
 

joeyd999

Joined Jun 6, 2011
5,287
I've been watching this thread, and haven't commented simply because I don't use C for embedded code.

But I do code C for PC applications. And here is my understanding in that realm:

If you are using pre-compiled code, say a library file, the compiler, at the time of library compilation, has no idea as to whether you will need any or all functions in that file in the future. Therefore, it has no choice but to compile all the code and include it in the resulting object file. No optimization can take place at that time.

Only the linker, at the time of compiling the final application, knows which functions are needed and which are not. It is only then that unused code can be stripped during the linking process.

Obviously, if you are compiling your main app code along with the libraries, the compiler could potentially strip the appropriate code, either before object code generation or after. But, the OP was specifically referring to pre-compiled libraries. So, code optimization, IMHO, must take place at link time, not compile time.

Am I wrong?
 

THE_RB

Joined Feb 11, 2008
5,438
I'm not sure, but it is not what we were discussing. :)

Don't sweat it, my discussions with ErnieM often go this way. He makes a statement, I politely call him out on it and ask for proof, and he totally fails to provide proof and instead starts personal insults like "imagining" etc. Just another thread gone down the same path.
 

ErnieM

Joined Apr 24, 2011
8,377
joeyd999 said:
I've been watching this thread, and haven't commented simply because I don't use C for embedded code.

But I do code C for PC applications. And here is my understanding in that realm:
Minor point: I don't know of any major (or minor really) differences in the sequence, procedure, or any material process when building an application written in C that depends on the target platform, excluding the underlying generated machine code of course.

AFAIK the term "embedded C" is just something someone made up, not a true variation to the C standard.

joeyd999 said:
If you are using pre-compiled code, say a library file, the compiler, at the time of library compilation, has no idea as to whether you will need any or all functions in that file in the future. Therefore, it has no choice but to compile all the code and include it in the resulting object file. No optimization can take place at that time.
Again, a minor point: The compiler is not completely clueless when *creating* a library object. Say a library contains exported functions A(), B() and C(), plus a function D(), where A, B and C all call D: all of these get compiled into the object. But if you now add a function Z() that is NOT referenced by the code or in the dot h, then the compiler may see it as dead code and thus delete it from the object.
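One way that example might look in source form (everything here is hypothetical, and lower-case only so it compiles as shown):

```c
/* mylib.c -- sketch of the library source described above */

/* d() is not exported, but a(), b() and c() all call it, so its code
   has to end up in the object alongside theirs. */
static int d(int x)
{
    return x * 2;
}

/* The exported functions, assumed to be declared in a mylib.h */
int a(int x) { return d(x) + 1; }
int b(int x) { return d(x) + 2; }
int c(int x) { return d(x) + 3; }

/* z() is referenced nowhere in this file and is not declared in the
   header. Because it is static, the compiler is free to treat it as
   dead code and leave it out of the object (usually with a warning).
   A non-static z() would normally be kept, since some other
   translation unit could still call it. */
static int z(int x)
{
    return x - 1;
}
```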

And any compiler (even the microC compiler) will create code for any and all functions, even those without a reference if you simply turn the optimizations off.

joeyd999 said:
Only the linker, at the time of compiling the final application, knows which functions are needed and which are not. It is only then that unused code can be stripped during the linking process.
The linker does not strip code per se, it *adds* code by checking the list of unfulfilled symbols (functions and variables) and scanning any library files it sees for matching symbols it needs. When a match is found the *entire* object is imported; anything less is not possible, as the linker only has a list of symbols, with no source to check whether something inside the object never gets used.

What you can do is build a collection of small atomic functions, taking advantage of the fact that one dot-C file makes one dot-O object file. Thus you build a library from many small dot-C files to get good granularity.
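A hypothetical sketch of that one-function-per-file layout (the file and function names are invented; in a real library a shared header, say my_i2c.h, would declare both functions):

```c
/* i2c_open.c -- one function, so it becomes its own object file */
void my_i2c_open(void)
{
    /* configure the peripheral here */
}
```

```c
/* i2c_write.c -- kept in a separate file, and therefore a separate object */
void my_i2c_write(unsigned char data)
{
    (void)data;        /* clock the byte out here */
}
```

When both objects are archived into one library, an application that only calls my_i2c_open() pulls in just i2c_open.o at link time; i2c_write.o never makes it into the image, because no symbol in it was ever on the linker's wanted list.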

If you use the XC8 compiler you can see an example of this granularity for I2C functions in this folder:
...\Program Files (x86)\Microchip\xc8\v1.20\sources\pic18\plib\i2c

(I would provide a library source folder for microC, but they seem not to have exposed any of their library source code. The microC compiler has several other weird behaviors, such as using an automatic (and thus hidden) link to the device file of the part being compiled, and making the device file a dot-C instead of a dot-h (it should not contain code). Generally microC works so hard to make the compilation process transparent (a curious term meaning invisible) that some users are left to guess at how it is actually working, especially those with little experience of other compilers.)

joeyd999 said:
Obviously, if you are compiling your main app code along with the libraries, the compiler could potentially strip the appropriate code, either before object code generation or after. But, the OP was specifically referring to pre-compiled libraries. So, code optimization, IMHO, must take place at link time, not compile time.
Am I wrong?
Nope, correct, providing you accept "pasting together a minimal set of objects" to mean "optimization." The library objects are complete by the link stage; you can't shave one down without recompiling it.

And as a final statement about those who make rude comments, obscure claims out of thin air, never provide any of their own proofs, and are incapable of correcting their own work: do not expect me to feed the trolls.
 