H(jw) = 20 (0.1jw + 1) / [jw (jw + 1)(0.01jw + 1)]

where w stands for ω (angular frequency).

We can rewrite this as:

H(s) = 200 (s + 10) / [s (s + 1)(s + 100)]
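Just to convince myself the rewrite is right, here is a quick numerical sketch (my own check, names are made up) comparing the two forms on the jw axis:

```python
import cmath

def H_original(w):
    """First form: 20(0.1jw + 1) / [jw (jw + 1)(0.01jw + 1)]."""
    jw = 1j * w
    return 20 * (0.1 * jw + 1) / (jw * (jw + 1) * (0.01 * jw + 1))

def H_factored(w):
    """Second form: 200(s + 10) / [s (s + 1)(s + 100)] with s = jw."""
    s = 1j * w
    return 200 * (s + 10) / (s * (s + 1) * (s + 100))

# The two forms agree at any nonzero frequency.
for w in [0.01, 0.1, 1.0, 10.0, 100.0, 1000.0]:
    assert abs(H_original(w) - H_factored(w)) < 1e-9 * abs(H_original(w))
```

The factor of 200 comes from pulling 0.1 out of (0.1jw + 1) and 0.01 out of (0.01jw + 1): 20 × 0.1 / 0.01 = 200.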

There are break points at w = 0, 1, 10, and 100 rad/s, and I understand how to work out the slopes.

**But what confuses me is finding the initial gain value at w = 0!**

If my understanding is correct, normally one can simply plug in w = 0, or a value of w smaller than the smallest break point, to find the initial gain (and then take 20·log10 of that magnitude to get decibels). But if the plot starts at w = 0, how can we find the initial value? If one plugs in zero, the denominator becomes zero, and the result is undefined, or infinite, basically...
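To show what I mean numerically, here is a small sketch (my own, using the transfer function above) that evaluates the gain in dB at frequencies below the lowest nonzero break point:

```python
import math

def gain_db(w):
    """Magnitude of H(jw) in decibels for the transfer function above."""
    jw = 1j * w
    H = 20 * (0.1 * jw + 1) / (jw * (jw + 1) * (0.01 * jw + 1))
    return 20 * math.log10(abs(H))

# Below the first nonzero break point the 1/jw term dominates, so the
# gain keeps growing by about 20 dB for every decade w decreases --
# there is no finite value to read off at w = 0.
for w in [1.0, 0.1, 0.01, 0.001]:
    print(f"w = {w:>6}: {gain_db(w):6.1f} dB")
```

So the "initial value" just keeps climbing as w shrinks, which is exactly what confuses me.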

Edit: This is just a linear plot, by the way, or, I suppose, what's called an uncorrected (asymptotic) Bode plot.