Hardcoded pins within software

Thread Starter

Jswale

Joined Jun 30, 2015
121
Hi all,

I have been working on a project and I have confused myself by over thinking.

I have a 128 pin MCU and I am downgrading to a 100 pin compatible MCU. The firmware is designed for both models; the latter just has fewer I/Os.

My question is, the pin lists on the two MCUs are different... e.g. XIN is on pin 1 on the former and on pin 27 on the latter.

I know these will be hardcoded in the firmware but will I need to change some code somewhere or will I just need to connect up the relevant hardware and the firmware should do the rest, as it does for the 128 pin MCU?

Cheers
JSwale
 

joeyd999

Joined Jun 6, 2011
5,283
I know these will be hardcoded in the firmware but will I need to change some code somewhere or will I just need to connect up the relevant hardware and the firmware should do the rest, as it does for the 128 pin MCU?
Unless the code performs some kind of device discovery (which I doubt) -and- there is a 1-to-1 correspondence of I/O pins on the 100 pin part to the 128 pin part and to the attached devices (which I also doubt), you will need to modify the code.

There are two approaches you could take. The first is to use compiler directives to create a conditional build based upon the target chip. This way, you can build for either part in the future, but the code will be different for each part. The second is to somehow dynamically determine the part on which the code is being executed, and remap the I/O upon initialization. This is more complicated, but the same code will run on both parts interchangeably.

You could also just hack up the old code for the new chip -- but you lose backward compatibility.
 

atferrari

Joined Jan 6, 2004
4,768
The second is to somehow dynamically determine the part on which the code is being executed, and remap the I/O upon initialization.
Hola Joey

Could you elaborate briefly on that? It means you need to include some info (data) beforehand for the software to select from, doesn't it?
 

ErnieM

Joined Apr 24, 2011
8,377
My input would be "it depends." I have seen some families of micros where several devices share a common set of pins and functions such that yes indeed, the same firmware can be built to run on a smaller or larger device since it only uses the features that are common to all sizes.

So if, for example, you set pin X on Port Y it matters not if that pin is wired to any particular physical location: the same code will wiggle it here or there.

The complication comes into play when you try to program the device. Many devices (including Microchip's, which I am most familiar with) include a device ID in the code image the compiler produces, and the programmer will read this ID from the device before programming it. If a match is not made, programming is halted.

If you have the source code available then you can just rebuild it targeting the smaller device, and assuming you get an output with no errors, all is fine and your work is done.

If you do get errors... they need to be fixed, because the parts are not as similar as the programmer assumed.
 

joeyd999

Joined Jun 6, 2011
5,283
Hola Joey

Could you elaborate briefly on that? It means that you need to include some info (data) beforehand for the software to select, isn't it?
Obviously, there'd need to be I/O code for each configuration. The I/O code that is executed would be dependent upon a flag bit (say, 0=128 pin part, 1=100 pin part).

The trick would be setting the flag upon power-on. If the processor has a built-in, firmware-accessible ID code, then you could query that and set the flag appropriately. Or, a port pin could be scanned -- the presence/absence of a particular signal could indirectly indicate which MCU is installed.

I didn't say it was easy. I prefer conditional builds.
 

JohnInTX

Joined Jun 26, 2012
4,787
I didn't say it was easy. I prefer conditional builds.
Me too.
I put all of my IO definitions in a single file (prjIODEF.h or .inc) that is tied to the chip, PCB revision - anything that may affect the IO map or operation. For each port, data directions, initial values, etc. are specified (#define TRISAinit 0bxxxxxxxx etc). ALL IO functions (RELAY_ON, LED_GREEN etc) are macros that access the port bits. There is never any direct IO allowed in the main code, i.e. RELAY_ON is OK; RELAY=1 or PORTB.1=1 is not.

A code file xxxIO.c has the initIO function. It looks like PORTA=PORTAinit, TRISA=TRISAinit etc. It also has any helper routines for more complex IO.
Each different chip (80 pins, 100pins etc) has its own prjIODEF.h file. So does a PCB rev that affects the IO (maybe REV2 has an inverting driver for the relay etc). A project may have one or many prjIODEF files (prjIODEF_REV1.h, prjIODEF_REV2.h, prjIODEF_PIC18F4520.h etc etc). What results is an abstraction layer that provides some separation between the code's logic and how it interfaces to the real world.

Each project has a CONFIG.h /.inc file where all of the different combinations can be specified. That's where the conditionals are that cause the various IODEF file(s) to be included. At the top (or with a command line parameter), I can specify what the build is for, click GO and get a codeset for that configuration. In CONFIG.h and each prjIODEFxxxx file, I put messages that appear in the build output window to verify what was built for example:
IODEF_PIC4520.h would have a line:
MESSG "IODEF_PIC4520_REV2.h: CODE BUILT FOR PIC18F4520 - REV2 PCB" etc. // shows up in the build output window

It's more involved, but once done it makes updates manageable and releases much less stressful.

On very rare occasions, I've had to do some dynamic assignments but that was usually due to some customer's weird concept, not mine.

Just my .03
 

joeyd999

Joined Jun 6, 2011
5,283
Me too.
I put all of my IO definitions in a single file (prjIODEF.h or .inc) that is tied to the chip, PCB revision - anything that may affect the IO map or operation.
Yes! And the last thing I do is to use Git to not only keep track of my revisions, but also maintain branches for the various different builds and releases. A quick checkout loads up a whole new configuration. And I can easily merge changes into each of the various builds*.

*Why? Because I may not have time to test for regressions for each configuration and each change. I can merge the changes into each different configuration as I have time (or the requirement) to test.
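The branch-per-build workflow looks roughly like this. Repository, branch, and commit names are all illustrative; the script runs in a throwaway directory so it can be tried safely.

```shell
set -e
# Minimal sketch: one branch per build configuration, shared fixes
# merged into each branch as testing time allows.
dir=$(mktemp -d); cd "$dir"
git init -q -b main .
git config user.email demo@example.com
git config user.name  demo
git commit -q --allow-empty -m "shared firmware base"
git branch build-128pin             # one branch per configuration
git branch build-100pin

git commit -q --allow-empty -m "fix: shared driver bug"   # fix lands on main

git checkout -q build-100pin        # load the 100-pin configuration...
git merge  -q --no-edit main        # ...and merge the shared fix into it
git log --format=%s -n 1            # confirm the fix is now on this branch
```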
 