Interfacing the 8051 with the 8255 in C

Thread Starter

mahesh H.B

Joined Nov 8, 2005
2
Hello everybody. I am new to this field and I am doing a project using the 8051 microcontroller together with the 8255. I am writing the program in C and then translating it to 8051 assembly with a cross compiler. The description is: the program has to accept a 12-bit signal coming from an external chip, and the programmer should not need to know the architecture of any particular microcontroller. The program should accept the 12-bit data and then send 12-bit data back out. If anybody can send a sample program, it would be very helpful for my further progress.
Thanking you,
Mahesh H.B
 

sci-3d

Joined Aug 22, 2006
51
Does your 12-bit signal come from an analog-to-digital converter chip?

I use the ADS7841 to interface sensors and detect various kinds of signals.
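
Roughly, reading it from an 8051 looks like the sketch below: bit-bang a command byte out, then clock the 12-bit result back in. The port pins, the control-byte value, and the clocking details here are only assumptions; check them against the TI ADS7841 datasheet before using any of it.

Code:
/*
 * Rough bit-banged read of a 12-bit serial ADC such as the ADS7841
 * from 8051 port pins. All pin assignments are hypothetical, and the
 * control byte and clock framing must be verified against the datasheet.
 */
#include <reg51.h>

sbit ADC_CS   = P1^0;   /* chip select, active low (assumed wiring) */
sbit ADC_CLK  = P1^1;   /* serial clock                             */
sbit ADC_DIN  = P1^2;   /* data into the ADC (command bits)         */
sbit ADC_DOUT = P1^3;   /* data out of the ADC (result bits)        */

#define ADC_CMD_CH0 0x97  /* placeholder control byte - take the real
                             value from the channel-select table in
                             the datasheet                          */

unsigned int adc_read12(unsigned char cmd)
{
    unsigned char i;
    unsigned int result = 0;

    ADC_CS = 0;                          /* select the ADC            */

    for (i = 0; i < 8; i++) {            /* shift the command out,    */
        ADC_DIN = (cmd & 0x80) ? 1 : 0;  /* MSB first                 */
        ADC_CLK = 1;
        ADC_CLK = 0;
        cmd <<= 1;
    }

    ADC_CLK = 1;                         /* one extra clock while the */
    ADC_CLK = 0;                         /* converter is busy (assumed) */

    for (i = 0; i < 12; i++) {           /* shift the 12-bit result   */
        ADC_CLK = 1;                     /* in, MSB first             */
        result = (result << 1) | (ADC_DOUT ? 1 : 0);
        ADC_CLK = 0;
    }

    ADC_CS = 1;                          /* deselect                  */
    return result;
}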
 

Papabravo

Joined Feb 24, 2006
21,227
I believe your requirements are inconsistent with the problem you are trying to solve. The C language is built on the paradigm of a large linear address space, and in this model hardware registers are treated the same as memory. For the compiler to generate correct code, it has to know where the 12 bits of data are coming from or going to, and that still does not account for the numerous hardware variations in how the data is sent and received. In short, it might be a nice goal in theory, but it has very little practical applicability. Things can't always be boiled down to the level of an underpaid Indian coder working in a software development sweatshop.
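
To make that concrete: with the Keil C51 tools, "knowing where the data comes from" means hard-coding the 8255's decoded address and port layout, something like the sketch below. Everything in it is an assumption about one particular board: the base address 0x8000, mode 0, the 12-bit input on port A plus the lower half of port C, and the 12-bit output on port B plus the upper half of port C.

Code:
/*
 * Minimal Keil C51 sketch: 8051 driving an 8255 in mode 0.
 * Assumptions (adjust for the real hardware):
 *   - the 8255 is mapped into external data memory at base 0x8000
 *     (CS from an address decoder, 8255 A1:A0 on the bus A1:A0)
 *   - 12-bit input  = port A (low 8 bits) + port C lower (high 4 bits)
 *   - 12-bit output = port B (low 8 bits) + port C upper (high 4 bits)
 */
#include <absacc.h>            /* XBYTE macro for XDATA access */

#define PPI_BASE  0x8000       /* hypothetical decoded address */
#define PPI_A     XBYTE[PPI_BASE + 0]
#define PPI_B     XBYTE[PPI_BASE + 1]
#define PPI_C     XBYTE[PPI_BASE + 2]
#define PPI_CTRL  XBYTE[PPI_BASE + 3]

void ppi_init(void)
{
    /* mode 0: PA in, PC lower in, PB out, PC upper out -> 0x91 */
    PPI_CTRL = 0x91;
}

unsigned int read12(void)
{
    unsigned int v = PPI_A;                   /* low 8 bits         */
    v |= ((unsigned int)(PPI_C & 0x0F)) << 8; /* high 4 bits        */
    return v;
}

void write12(unsigned int v)
{
    PPI_B = v & 0xFF;                         /* low 8 bits         */
    PPI_C = (unsigned char)((v >> 8) << 4);   /* high 4 on PC4-PC7  */
}

void main(void)
{
    ppi_init();
    for (;;) {
        write12(read12());   /* echo the 12-bit input back out */
    }
}

Change the address decoding or the port usage and every one of those constants changes with it, which is exactly the hardware knowledge the requirement pretends to avoid.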
 

beenthere

Joined Apr 20, 2004
15,819
Hi,

I would guess your requirement came from somebody with a degree in computer science. That training tends to ignore hardware entirely and treat it as a somehow uniform resource.

What you find is that there are many times when the hardware dictates the coding. You can't write code without completely understanding the interface requirements of the processor and the external device.

Some time ago, the US military paid a lot of money to develop a universal computer language. It was called Ada. What happened was that many compilers had to be written, one for each computer system involved, all due to differing hardware requirements.

Now, languages tend to ignore stuff like that and insist that all attached devices either function alike or sit behind a support driver that handles the interfacing. For your scheme to work, a driver would have to be written for each of those data sources and integrated into the stdio library so that C could send and receive data from those "chips".
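
As a rough sketch of what such a driver could look like with the Keil C51 library, putchar() can be replaced so that printf() output goes out through the 8255 instead of the UART. The register addresses and the use of PC0 as a "data ready" strobe are purely hypothetical wiring assumptions.

Code:
/*
 * Sketch of the "driver" idea: replacing the Keil C51 library putchar()
 * so that printf() output is strobed out through an 8255.
 * Assumes the 8255 sits at a hypothetical base of 0x8000 and has already
 * been initialized elsewhere with port B and PC0 as outputs.
 */
#include <stdio.h>
#include <absacc.h>

#define PPI_BASE  0x8000
#define PPI_B     XBYTE[PPI_BASE + 1]
#define PPI_CTRL  XBYTE[PPI_BASE + 3]

char putchar(char c)          /* replaces the library version */
{
    PPI_B = c;                /* put the byte on port B                  */
    PPI_CTRL = 0x01;          /* bit set/reset: raise PC0 as a strobe... */
    PPI_CTRL = 0x00;          /* ...then drop it again                   */
    return c;
}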
 

Papabravo

Joined Feb 24, 2006
21,227
Sure, you might be right. It's not their fault that jobs have been outsourced by rapacious and evil corporations. I actually feel for the Indian coders, but there is a price to pay for that corporate greed, and this question exposes the lunacy of thinking that coding can be done by anyone, anywhere.

You can't be upset with me for besmirching the reputations of our altruistic and benevolent multinational corporations, can you?
 