I have a problem understanding how the voltage drop across a resistor in a series circuit is proportional to the ohmic value of the resistor. Can you please help me understand this?
Kirchhoff's Current Law tells us the current through a series circuit is the same at every point. Each resistance in series carries the same current.
Ohm's Law lets us calculate the unknown variable from the two known ones: E = I*R.
The rest is algebra...
Since the same current I flows through both resistors, the drop across R1 is I*R1 and the total drop is I*(R1+R2). Their ratio is R1/(R1+R2), with I cancelling out. So R1's share of the total voltage is exactly its share of the total resistance: the drops divide in proportion to the resistances.
Try some example problems with series circuits, the concept should "sink in" then.
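For instance, here is a minimal numerical check of the ratio argument, using made-up values (a 9 V supply with R1 = 1 kΩ and R2 = 2 kΩ in series):

```python
# Two assumed resistors in series across an assumed 9 V supply.
R1, R2 = 1000.0, 2000.0   # ohms (illustrative values)
V = 9.0                   # supply voltage, volts

I = V / (R1 + R2)         # same current everywhere in the loop
V1 = I * R1               # Ohm's Law: drop across R1
V2 = I * R2               # Ohm's Law: drop across R2

print(I)        # 0.003 A
print(V1)       # 3.0 V -- R1 is 1/3 of the resistance, so it gets 1/3 of the voltage
print(V2)       # 6.0 V
print(V1 + V2)  # 9.0 V -- the drops add back up to the supply
```

Change R1 or R2 and rerun it: the current changes, but each resistor's drop stays pinned to its fraction of the total resistance.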