Calculate (i) the gain in dB at 10 kHz and (ii) the cut-off frequency for R = 100 Ω and C = 1 μF, if the input voltage = 10 V. Any help is appreciated.
can you get the gain itself? because the gain in dB is simply 20log |G|, where |G| is the modulus of the gain, and that's log to the base 10.
what you have there is a potential divider. let X denote the impedance of the capacitor, so that X = 1/(jωC). the output voltage would therefore be V_out = V_in[X / (X + R)]. from there, you can divide by V_in to get the gain V_out/V_in, plug in your numbers, take the modulus and 20log it.
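a quick sketch of that calculation in Python, using the values from the question (reading the "100W" as 100 Ω):

```python
import math

R = 100.0   # ohms (assuming the "100W" in the question means 100 Ω)
C = 1e-6    # farads (1 μF)
f = 10e3    # 10 kHz

w = 2 * math.pi * f
X = 1 / (1j * w * C)   # capacitor impedance, X = 1/(jωC)
G = X / (X + R)        # potential divider: G = V_out/V_in = X/(X + R)

gain_db = 20 * math.log10(abs(G))   # 20 log10 of the modulus
print(abs(G), gain_db)              # |G| ≈ 0.157, ≈ -16 dB
```

note the input voltage (10 V) drops out entirely: the gain is a ratio, so V_in never appears in the answer.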
past papers are a great way to prepare for your REAL exams. in fact, it's the only way i prepare for them.
@ 10kHz ≈ -16dB (I believe "100W" was a typo for 100 Ω) -3dB / 45° cutoff ≈ 1.6kHz Filter is a 1st order low pass filter, attenuation -6dB/octave (-20dB/decade) above cutoff. Always check calculations to see if they "make sense", i.e. 10kHz is about 2.6 octaves above 1.6kHz, so at -6dB/octave you'd expect roughly -16dB, which agrees with the direct calculation.
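those numbers are easy to verify with a short script (a sketch, using the standard first-order RC formulas f_c = 1/(2πRC) and |G| = 1/√(1 + (f/f_c)²)):

```python
import math

R, C = 100.0, 1e-6
fc = 1 / (2 * math.pi * R * C)   # cutoff ≈ 1592 Hz ≈ 1.6 kHz

f = 10e3
# gain in dB: 20 log10(1/sqrt(1 + (f/fc)^2)) = -10 log10(1 + (f/fc)^2)
gain_db = -10 * math.log10(1 + (f / fc) ** 2)   # ≈ -16 dB

# sanity check: how many octaves above cutoff is 10 kHz?
octaves = math.log2(f / fc)                 # ≈ 2.65 octaves
print(fc, gain_db, -6.02 * octaves)         # -6 dB/octave estimate ≈ -16 dB too
```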