Model SW/HW cost changes: Need sample embedded systems

Thread Starter

lukmax

Joined Jul 24, 2014
6
Hello,

I want to build a quantitative model to illustrate the monetary relationship between HW and SW in embedded systems. How do the SW and HW costs change if I implement a function in SW or in HW?

To express a SW function in HW, I want to determine a graph with Source Lines of Code on the x-axis and transistor count on the y-axis. What I can get is the SLOC for each function of an embedded system and the chip size (from which I can calculate the transistor count).

To model the graph I need to look at some specific embedded systems with information on both the chip hardware and the software code (ideally SLOC). Do you know any source where I can find this information?

Thanks in advance for your help! :)
Luk
 

Papabravo

Joined Feb 24, 2006
21,158
I suspect you will have difficulty with this project, since it is rare to have more than one implementation of a given system. Getting the kind of information you want on any non-trivial system is also unlikely, since that information is generally not revealed by the manufacturer.
 

Thread Starter

lukmax

Joined Jul 24, 2014
6
How does your model take into account the number of units being produced?
As a variable: SW development cost will be divided by the production quantity (except maintenance...).
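Roughly, in C (a sketch with made-up numbers, not a finished model):

Code:
#include <stdio.h>

/* Sketch of the amortization idea above: SW development is a one-time
   cost spread over the production run, HW is a recurring per-unit cost.
   All names and figures here are illustrative, not from a real project. */
double per_unit_cost(double sw_dev_cost, double hw_unit_cost, long quantity)
{
    return sw_dev_cost / (double)quantity + hw_unit_cost;
}

int main(void)
{
    printf("1000 units:   %.2f per unit\n", per_unit_cost(50000.0, 3.50, 1000));
    printf("100000 units: %.2f per unit\n", per_unit_cost(50000.0, 3.50, 100000));
    return 0;
}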

I suspect you will have difficulty with this project, since it is rare to have more than one implementation of a given system. Getting the kind of information you want on any non-trivial system is also unlikely, since that information is generally not revealed by the manufacturer.
I don't need many implementations of one system; I will just look at different systems, for example a parking system, engine control, airbag control, etc.

Trivial systems will be fine for a start. Do you know where I can get such information? Any open source systems?
 

GopherT

Joined Nov 23, 2012
8,009
I think you would be better off looking at some microcontroller manufacturers' sales brochures that show examples.

In any case, there are so many hardware options and software options to solve any specific task (and so many different types of functions) that a graph with any real correlation will not be possible.
 

Papabravo

Joined Feb 24, 2006
21,158
I disagree. To calibrate the model you need a hardware-intensive implementation and a software-intensive implementation of the same system. Otherwise the whole project is just an amusing speculation on your part.

For a non-trivial project, look at the HPSDR project.
http://opensdr.org/
 

sirch2

Joined Jan 21, 2013
1,037
Lines of code are not a good measure of software effort. There has been a huge amount of work done on software metrics (cyclomatic complexity is one that comes to mind), but none give a good indicator of how long a piece of code will take to write and therefore how much it will cost. Studies have also found a 10-fold difference between the productivity of a good coder and a not-so-good one.
 

Papabravo

Joined Feb 24, 2006
21,158
Lines of code are not a good measure of software effort. There has been a huge amount of work done on software metrics (cyclomatic complexity is one that comes to mind), but none give a good indicator of how long a piece of code will take to write and therefore how much it will cost. Studies have also found a 10-fold difference between the productivity of a good coder and a not-so-good one.
This is especially true of reusable design modules. I can spend days on a few lines of something I've never tried before, and crank out several hundred lines of things I use repeatedly in all projects. The same goes for hardware: I can lay down several thousand gates with a couple dozen lines of Verilog.

Any economic model should focus on how long it takes to clearly define the project requirements. That is the prime determinant of meeting cost and schedule goals. Trust me on this.
 

djsfantasi

Joined Apr 11, 2010
9,156
Software development costs will vary wildly depending on who is doing the coding. I have seen hundreds of lines of code replaced with a handful by an experienced coder. And by experienced, I include detailed knowledge of a complex system. By measuring with lines of code, you are rewarding the mistakes of rookies. Also, experience reduces costs in the testing and debugging phases. I have seen a multi-week effort totally scrapped when testing showed that the code was non-functional.
If all these situations occurred together (and they often do), the software cost will be misleadingly high.
 

sirch2

Joined Jan 21, 2013
1,037
As Papabravo says, requirements definition is a significant factor.

The other thing that needs to be considered is in-house skill set. It is generally going to be quicker and cheaper if you go with the skills you have, i.e. let hardware guys do it in hardware and let software guys do it in software, assuming other criteria (performance, component cost, etc.) are similar.
 

Thread Starter

lukmax

Joined Jul 24, 2014
6
I think you would be better off looking at some microcontroller manufacturers' sales brochures that show examples.

In any case, there are so many hardware options and software options to solve any specific task (and so many different types of functions) that a graph with any real correlation will not be possible.
Do you know whether those include not only information about the built-in microcontroller but also the specific code that runs on the processor?

Lines of code are not a good measure of software effort. There has been a huge amount of work done on software metrics (cyclomatic complexity is one that comes to mind), but none give a good indicator of how long a piece of code will take to write and therefore how much it will cost. Studies have also found a 10-fold difference between the productivity of a good coder and a not-so-good one.
I'm aware of the disadvantages of LOC. Software costs will be measured using COCOMO; it includes all the attributes you mentioned.

Basically I'm searching for a "What-If" quantitative model that uses LOC as a basis for software.
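For reference, Basic COCOMO for the "embedded" project mode is Effort = 3.6 * KLOC^1.20 person-months and Schedule = 2.5 * Effort^0.32 months (Boehm, 1981). A minimal sketch in C, leaving out the Intermediate COCOMO cost drivers:

Code:
#include <stdio.h>
#include <math.h>

/* Basic COCOMO, "embedded" mode coefficients (Boehm, 1981). The
   Intermediate model would multiply the effort by 15 cost-driver
   factors; they are omitted here. */
#define COCOMO_A 3.6
#define COCOMO_B 1.20
#define COCOMO_C 2.5
#define COCOMO_D 0.32

int main(void)
{
    double kloc   = 10.0;  /* illustrative size: 10,000 SLOC */
    double effort = COCOMO_A * pow(kloc, COCOMO_B);    /* person-months */
    double months = COCOMO_C * pow(effort, COCOMO_D);  /* calendar time */
    printf("%.0f KLOC -> %.1f person-months over %.1f months\n",
           kloc, effort, months);
    return 0;
}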
 

sirch2

Joined Jan 21, 2013
1,037
I am sat here looking at Pressman's book in the stack on my desk. COCOMO is driven by LOC but the fundamental problem with LOC is you don't know how many lines it will take until you have written the code. Even if you have metrics built on experience (and I have personal metrics for the last several decades) you can be more than 100% out.
 

Thread Starter

lukmax

Joined Jul 24, 2014
6
I am sat here looking at Pressman's book in the stack on my desk. COCOMO is driven by LOC but the fundamental problem with LOC is you don't know how many lines it will take until you have written the code. Even if you have metrics built on experience (and I have personal metrics for the last several decades) you can be more than 100% out.
"The LOC measure is a terrible way to measure software size, except that all the other ways to measure size are worse" McConnell (2006)
 

sirch2

Joined Jan 21, 2013
1,037
And the clue there is "measure", i.e. it is after the code is written. So how many lines of code would it take to produce a real-time clock in software vs. using a hardware RTC?

Or another scenario: in a recent thread on here I asked about a circuit that caused an LED to flash faster as the input voltage increased. This could be done in hardware or software; how many LOC in software?
 

Thread Starter

lukmax

Joined Jul 24, 2014
6
This is exactly the question I'm trying to answer: if this function needs this kind of hardware, what would it cost to implement it in software? As this is not too easy to answer and I need a quantitative model, I have to start simple. The model will be adapted over time and to new requirements.
 

sirch2

Joined Jan 21, 2013
1,037
I wouldn't be surprised if using, say, an I2C RTC module required more LOC than implementing a basic RTC on the MCU itself. However, I may still choose to use the external module...
 

GopherT

Joined Nov 23, 2012
8,009
What is the goal? A correlation? LOC vs. number of components, or board space, or cost of components, or assembly cost, or inventory cost, or risk of specifying obsolete (or soon-to-be-obsolete) parts, or...

There are so many ways to build an LED flasher without a microcontroller that any method you pick will have trade-offs. Even picking a capacitor/resistor pair to set your time constant can change the board space and component cost without changing the number of components.

This is an interesting discussion but will not likely yield a beneficial output unless you tighten up your definitions and what you hope to correlate (and why). Even then, I am not so sure of the quality or strength of your final conclusions.

Finally,
LOC is a bad indicator, since a microcontroller couldn't care less about how many lines you wrote. I think compiled size is more important. And even that is a bad argument, because the cost of the micro will be a step function of available firmware space (if you select cost as the correlation, which is still not clear).
 

sirch2

Joined Jan 21, 2013
1,037
Good post GopherT, but is that an optimizing compiler? And if I am trying to figure out whether to go hardware or software, how do I estimate compiled size before I write the code? (Factor in that I may choose any of many MCUs.)

Sorry, I really need to stop chewing on this particular bone...
 

GopherT

Joined Nov 23, 2012
8,009
Good post GopherT, but is that an optimizing compiler? And if I am trying to figure out whether to go hardware or software, how do I estimate compiled size before I write the code? (Factor in that I may choose any of many MCUs.)

Sorry, I really need to stop chewing on this particular bone...
Yes, all intended to be open questions. I just got tired of listing reasons this question is unanswerable.
 

NorthGuy

Joined Jun 28, 2014
611
And the clue there is "measure", i.e. it is after the code is written. So how many lines of code would it take to produce a real-time clock in software vs. using a hardware RTC?
That I can tell you. I have written software that communicates with the PIC RTCC module, and I have written software that implements a real-time clock without it. The PIC-based module is 99 lines. The software module is 375 lines, although it does lots of extra stuff, such as calculating sunset and sunrise times and performing conversions to/from strings. Without these extra capabilities, it would probably be about the same size as the RTCC one.
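The core of the software version is just a cascaded counter driven by a 1 Hz timer interrupt; a minimal sketch (not my actual module, which adds the calendar and string handling):

Code:
#include <stdint.h>

/* Cascaded seconds/minutes/hours counter; a 1 Hz timer ISR calls
   rtc_tick_1hz(). Date handling and leap years are what push the
   real module's line count up. */
static volatile struct { uint8_t sec, min, hour; } clk;

void rtc_tick_1hz(void)
{
    if (++clk.sec < 60) return;
    clk.sec = 0;
    if (++clk.min < 60) return;
    clk.min = 0;
    if (++clk.hour >= 24) clk.hour = 0;
}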

Or another scenario: in a recent thread on here I asked about a circuit that caused an LED to flash faster as the input voltage increased. This could be done in hardware or software; how many LOC in software?
Depends on the CPU and the nature of the relationship between voltage and frequency. 15 to 20 for something simple.
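Something like this sketch, where adc_read(), led_toggle() and delay_ms() stand in for whatever the chosen MCU's peripherals provide (assumed names, not real APIs):

Code:
#include <stdint.h>

/* Flash faster as the measured voltage rises: map a 10-bit ADC
   reading (0..1023) onto a half-period of roughly 1000 ms down
   to 40 ms. */
extern uint16_t adc_read(void);
extern void     led_toggle(void);
extern void     delay_ms(uint16_t ms);

void flasher_loop(void)
{
    for (;;) {
        uint16_t v = adc_read();
        uint16_t half_period =
            (uint16_t)(1000u - ((uint32_t)v * 960u) / 1023u);
        led_toggle();
        delay_ms(half_period);
    }
}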
 