8051 Instruction set

takao21203

Joined Apr 28, 2012
3,702
Maybe you should study politics, society and humanities.

There is a cartoon from the 19th century:

Imagine two poor street musicians, performing, begging, and crying that they are so poor and about to die.

A rich snob passes by. He gives them two 1000-Mark bank notes. 2000 Marks at that time was a very large amount, perhaps comparable to $15,000 nowadays, or more.

A few days later, they have bought a large pipe organ. Still the same facial expressions, still the same whining. Much louder music; if anything, it has become worse.

It's just a short comic strip.

So I have to think about this story. What if someone literally learns all about the 8051 and, after a while, is indeed able to replicate it:

-Thinking it's important to follow every detail
-Believing it must be done exactly that way, down to the number of bits
-Believing it's important to know the actual implementation
-And so forth.

Then we would just end up with a much larger 8051-like machine, with all its drawbacks?

This technology is evolving so much; there are so many techniques to improve performance. Innovation never stops. If you learn a roughly 20-year-old design by rote, you should know it's already deprecated and outdated.

Why not the TMS1000 or the 4004? Research and understand how the 4004 evolved, and research the parallel developments at that time. Even the 4004 schematics are already quite difficult to understand.

Looking at them with the mindset "it must be done that way for some mystic reason" isn't a very good idea.

If you really wanted to learn how a CPU is designed, after a few weeks you'd be able to build a small sequencer from simple TTL gates.

Take a "Message in space". You can build one with a small RAM, NE555 ICs, and a few registers and counters.

All you need then is branching and a primitive ALU. That's pretty much it.
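To give a feel for how little that really is, here is a toy Python model of such a sequencer: a program counter stepping through a small "RAM", a primitive 8-bit ALU, and one conditional branch. The instruction names and format are invented for illustration; they are not taken from the 8051 or any real chip.

```python
# A toy program in "RAM": count down from 5 and print each value.
RAM = [
    ("LOAD", 5),     # acc = 5
    ("SUB",  1),     # acc -= 1          (primitive ALU)
    ("OUT",  None),  # print acc
    ("BNZ",  1),     # jump to address 1 if acc != 0  (branching unit)
    ("HALT", None),
]

def run(program):
    pc, acc = 0, 0
    while True:
        op, arg = program[pc]
        pc += 1                       # default: step to the next word
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc = (acc + arg) & 0xFF  # 8-bit ALU wraps around
        elif op == "SUB":
            acc = (acc - arg) & 0xFF
        elif op == "OUT":
            print(acc)
        elif op == "BNZ":             # conditional branch: override pc
            if acc != 0:
                pc = arg
        elif op == "HALT":
            return acc

run(RAM)  # prints 4, 3, 2, 1, 0
```

In TTL hardware, the dictionary-style dispatch above would simply be a decoder driving a few enable lines, and the `pc` would be a counter with a parallel-load input for the branch.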

If you wanted to, you could learn and understand it in a few weeks, then probably leave it behind you and do something else you'd prefer to do for a longer time. Maybe you'd also find:

-You aren't brilliant at it
-You lack the resources to do it properly and the way you wanted to
-It has already been done years ago

For the understanding, it's enough to make a few primitive sequencers, play with FLASH chips and small RAMs, and make a "Message in space". Maybe design a primitive branching unit and add an ALU.

I did, and decided I understood all I wanted to understand.

Bit width? How the opcodes are encoded and recognized? I could do it any way I wanted to.

If you ever have a serious interview, the HR person, if they are good, will ask you ad hoc to present some of your own ideas about humanities and society, or maybe bait you with something like "Why do you think racism exists? Explain the reasons for it and give your own opinion" about some topic of that kind.

They don't just want you to repeat Wikipedia knowledge. They will present you with questions you can't prepare for, and will evaluate your statements about them, even though you have probably never considered the issue seriously.

It could be something like that. If you are clever, you can, within a few sentences, draw on some loosely related example you are familiar with.

They could be examining whether you take things literally, or whether you follow instructions easily; you'd never know.

http://en.wikipedia.org/wiki/Wilhelm_Busch

Studying doesn't mean you become an expert in something; it just means you dig into it, think about it, research some aspects of it, and consider what use it may have.

For instance, if you are brilliant in psychology, you could briefly point out that
http://en.wikipedia.org/wiki/Hans_Asperger

is controversial; among other things, the prescribed "roles" are kind of a manifesto for a self-fulfilling prophecy. The literature describes it in a certain way, and a child who just did something wrong one day is "expected" to "have" such traits waiting to break out.

Asperger is indeed highly controversial. That is what makes intellectual brilliance: being able to criticize and cast doubt on issues and topics, not just repeat what you read somewhere.

And believe me, Shima, the engineer who designed the 4004 and also helped with the 8080, was intellectually brilliant. There was no real CPU before him to "understand" or to learn from.
 

absf

Joined Dec 29, 2010
1,968
Is it possible? Can I use 3 decoders? I have a doubt: I think the 8051 uses only an 8-to-256 decoder. If this is possible, tell me and I will create a logic table
Does it really matter whether the instruction register decoder uses one big decoder or three smaller ones? As long as all the instructions are decoded properly and you understand how it works, who cares? BTW, Hexreader also agreed that your doubts are just doubts...
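To illustrate the point (a Python sketch, not the actual 8051 circuitry): a single 8-to-256 decoder can be replaced by two 4-to-16 decoders whose output lines are ANDed pairwise. Both structures produce exactly the same one-hot result for every opcode, so the choice is a hardware trade-off, not a correctness question.

```python
def decode_256(opcode):
    """One big 8-to-256 decoder: a single one-hot line out of 256."""
    return [1 if i == opcode else 0 for i in range(256)]

def decode_split(opcode):
    """Same function from two 4-to-16 decoders: the high nibble
    drives one decoder, the low nibble the other, and each of the
    256 outputs is the AND of one 'row' line and one 'column' line."""
    hi = [1 if i == (opcode >> 4)   else 0 for i in range(16)]
    lo = [1 if i == (opcode & 0x0F) else 0 for i in range(16)]
    return [hi[i >> 4] & lo[i & 0x0F] for i in range(256)]

# Every opcode decodes identically with either structure.
assert all(decode_256(op) == decode_split(op) for op in range(256))
```

The split version is how large decoders are commonly built from smaller TTL parts anyway, since a monolithic 256-output decoder chip is impractical.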

Intel has never actually revealed its 8051 core design to the public (correct me if I'm wrong). So unless you were one of the engineers on the design team, no one is going to show you the original design of the MCU core. There are perhaps hundreds, if not thousands, of ways to design an 8051 core that would work like the 8051. That's why there are AMD processors that work like Intel Pentiums in PCs. Don't think Intel is so kind as to give away its core design to its rivals....

What we are doing here is called reverse engineering: creating an MCU core based on the 8051 block diagram, clock, and instruction set. There's no way we can tell if we are right or wrong. If you have a better idea, show us how you think it should work .........

Allen
 

absf

Joined Dec 29, 2010
1,968
And believe me, Shima, the engineer who designed the 4004 and also helped with the 8080, was intellectually brilliant. There was no real CPU before him to "understand" or to learn from.
Really? I thought 4004 was designed by Faggin and Hoff.

"The chief designers of the chip were Federico Faggin who created the design methodology and the silicon-based chip design, Ted Hoff who formulated the architecture, both of Intel, and Masatoshi Shima of Busicom who assisted in the development." quoted Wikipedia

Allen
 

takao21203

Joined Apr 28, 2012
3,702
Really? I thought 4004 was designed by Faggin and Hoff.

"The chief designers of the chip were Federico Faggin who created the design methodology and the silicon-based chip design, Ted Hoff who formulated the architecture, both of Intel, and Masatoshi Shima of Busicom who assisted in the development." quoted Wikipedia

Allen
Intel made the chips for Busicom
 