digital circuit

Thread Starter

vead

Joined Nov 24, 2011
629
Hi everyone, I am confused: what is an analog circuit and what is a digital circuit?
Analog circuit: analog components such as transistors, diodes, capacitors, inductors, and resistors are used to create an analog circuit.

Digital circuit: gates are made of analog components such as diodes and transistors, for example:

Resistor Transistor Logic.
Diode Transistor Logic.
Transistor Transistor Logic.
In both kinds of circuit the same components are used.
Can someone explain to me the difference between analog and digital circuits?

thatoneguy

Joined Feb 19, 2009
6,359
Digital circuits, no matter how the gates are constructed, operate using True or False: there are only two results for an operation (0 or 1, on or off, etc.). Discrete voltage ranges define which voltages qualify as which level.

Analog signals can take any value. Some digital theory shows up in analog designs, such as mixers, but nearly all digital components are built from parts that could equally be used in an analog circuit.

Analog covers amplifiers, and essentially any circuit that doesn't produce a true or false output.

There are many "hybrid" circuits, such as analog to digital converters and digital to analog converters.

thatoneguy

Joined Feb 19, 2009
6,359
If, when the voltage of an output falls in a certain range, that value is considered true, and when it falls below that range it is treated as false, it is a digital circuit.

An analog circuit of the same sort would be a comparator. If the voltage value is above a certain point, the output is set to a value, and if it goes below that point, the output goes to a different value.

The difference is that in the digital realm the voltages are globally defined: +5 V/0 V is standard (or 3.3 V in newer devices). Comparators, by contrast, may operate at any voltage, and the exact output value matters less than the reference voltage the input is being compared to, which can be anywhere from 0 V to 100 V or more, with the output being any value.

Analog doesn't tend towards "true" and "false" as much; mostly it is amplification and mixing of voltages. Another signature of analog is the ability to work with AC voltages, whereas digital is limited to an overall set of defined "voltage ranges" that indicate true or false, and the output of such a circuit is the highest or lowest digital voltage rather than anything in between.
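The contrast above can be sketched in a few lines of Python. The thresholds below are illustrative TTL-style values (the exact numbers vary by logic family), and the comparator's reference and output levels are arbitrary assumptions, not figures from this thread:

```python
# Hedged sketch: fixed, globally defined digital thresholds vs. an analog
# comparator whose reference and output levels can be anything.
V_IL = 0.8   # voltages at or below this read as logic low (illustrative)
V_IH = 2.0   # voltages at or above this read as logic high (illustrative)

def digital_read(voltage):
    """Interpret a voltage as a digital level using fixed thresholds."""
    if voltage >= V_IH:
        return 1      # true / high
    if voltage <= V_IL:
        return 0      # false / low
    return None       # undefined region between the thresholds

def comparator(v_in, v_ref, v_out_high=12.0, v_out_low=0.0):
    """Analog comparator: reference and output levels are arbitrary."""
    return v_out_high if v_in > v_ref else v_out_low

print(digital_read(3.3))     # 1
print(digital_read(0.2))     # 0
print(comparator(7.5, 5.0))  # 12.0
```

The point is that `digital_read` only ever answers with one of two defined levels (or "undefined"), while the comparator's reference and output can be any voltage the designer chooses.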

MrChips

Joined Oct 2, 2009
30,824
Digital simply means taking the analog world and describing it as having only two states, such as hot and cold. When do we decide something is hot or cold? Or light and heavy, tall and short, big and small.

"There are only 10 types of people in this world, those who understand binary (digital) and those who don't."

crutschow

Joined Mar 14, 2008
34,470
The distinction between analog and digital is how data is represented.

Digital uses binary numbers (or words) to represent data as discrete values. Each bit is represented as a one or a zero (high or low voltage) in the circuit. The more bits in a digital word, the more precisely you represent the data: for example, 8 bits gives 256 levels of precision and 16 bits gives 65,536 levels. This data can be transmitted or stored indefinitely without error.
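The bit counts above follow directly from the rule that an n-bit word has 2**n distinct values, which a quick check confirms (the 5 V range used for the step size is just an assumed example):

```python
# Number of discrete levels an n-bit word can represent.
def levels(bits):
    return 2 ** bits

print(levels(8))          # 256
print(levels(16))         # 65536

# Resolution when digitizing an assumed 5 V full-scale range:
print(5.0 / levels(8))    # about 0.0195 V per step
```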

Analog represents data as a continuous voltage. There is no discreteness designed into the data, the limit is just the noise level of the system (and ultimately the charge on an electron). It is difficult to store analog data for any significant time (although it can be done with media such as magnetic tape or phonograph records) or send it significant distances without deterioration of the signal. In modern electronics any analog signals (audio, video, sensor data, etc.) are typically converted by an A/D converter to digital form as soon as possible so that it's easier to manipulate and store the data. Thus devices like cell phones and HDTV use digital to process and send the signals even though the originating signals are analog.
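A minimal sketch of what the A/D (and D/A) conversion described above does: sample a continuous voltage and round it to the nearest of 2**n discrete codes. The full-scale voltage and bit width here are assumptions for illustration, not values from the thread:

```python
# Hedged sketch of A/D and D/A conversion with assumed parameters.
def adc(voltage, full_scale=5.0, bits=8):
    """Quantize a voltage to an integer code in [0, 2**bits - 1]."""
    codes = 2 ** bits
    step = full_scale / codes          # volts per code
    code = int(voltage / step)
    return min(max(code, 0), codes - 1)  # clamp to the valid range

def dac(code, full_scale=5.0, bits=8):
    """Convert a code back to a voltage (within one step of the original)."""
    return code * full_scale / (2 ** bits)

v = 1.234
code = adc(v)
print(code)        # the digital representation
print(dac(code))   # reconstructed voltage, within one quantization step of v
```

The reconstruction error is at most one step (full_scale / 2**bits), which is exactly the "levels of precision" trade-off described in the post above.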