The microcontroller generates the clock pulses that are sent to the slave device. Most interfaces provide some implementation choices as to clock frequency, but I do not think I have ever seen an I2C or SPI hardware interface where you could mistake the clock generator for an arbitrary waveform generator with respect to frequency and duty cycle: you typically pick from a small menu of prescaler settings. In the days when these interfaces were bit-banged, there was much finer control over the clock waveform, at the expense of speed and consistency. In short, you have enough choices to do pretty much anything your mind can imagine.
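To make that concrete, here is a minimal sketch of the kind of choice a hardware interface gives you, assuming an ATmega328P clocked at 16 MHz (the part and clock are my assumptions, not from the thread). The SPI master clock is the system oscillator divided by one of four fixed prescalers, selected with two register bits; there is no duty-cycle knob at all.

#include <avr/io.h>

/* SPI master setup on an ATmega328P (assumed part). With F_CPU = 16 MHz,
 * SPR1:SPR0 select SCK = F_CPU/4, /16, /64, or /128 -- that short table
 * is the entire menu of clock frequencies the hardware offers. */
void spi_master_init(void)
{
    DDRB |= (1 << DDB2) | (1 << DDB3) | (1 << DDB5); /* SS, MOSI, SCK as outputs */
    SPCR  = (1 << SPE) | (1 << MSTR) | (1 << SPR0);  /* enable, master, SCK = F_CPU/16 = 1 MHz */
    /* SPSR |= (1 << SPI2X);  setting SPI2X would double any of the above rates */
}

So the clock is "programmable" in the sense of choosing a divider, not in the sense of shaping an arbitrary waveform.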
Note the disclaimer in post #2 that the hardware is not compatible with the specification in some respects. This is a case of hardware vendors innovating faster than the specification writers. Is this a good thing or a bad thing? I don't know the answer, but Microchip makes more hardware than the specification writers do, so maybe the hardware becomes the new de facto standard. It would not be the first time that has happened.
@mukesh1 wrote: "Mainly, any micro like an AVR or PIC has an SCL pin. I think this pin supplies clock pulses to I2C and SPI devices. The micro receives its clock pulses from an oscillator. Is the clock sent to the slave device programmable?"

You'll get a variety of responses, but in the end I think you need a starting point, and that starting point is: what is a clock? Typically, a clock is a means of keeping regular time, slicing time into regular increments so it can be measured. In the world of processors, however, it can be taken that way plus an addition: it can be thought of more like a "trigger."
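That "trigger" view is easiest to see in a bit-banged clock, where the firmware raises and lowers the clock pin itself. A minimal sketch, again assuming an ATmega328P; the pin choice (PC5, which doubles as the hardware SCL pin) and the 5 us half-period (roughly 100 kHz) are assumptions, and real I2C drives SCL open-drain rather than push-pull, which this sketch ignores for simplicity:

#define F_CPU 16000000UL
#include <avr/io.h>
#include <util/delay.h>

static void scl_init(void)
{
    DDRC |= (1 << DDC5);       /* drive the clock pin as an output */
}

static void scl_pulse(void)
{
    PORTC |= (1 << PC5);       /* rising edge: the "trigger" the slave acts on */
    _delay_us(5);              /* high half-period */
    PORTC &= ~(1 << PC5);      /* falling edge */
    _delay_us(5);              /* low half-period */
}

Each call to scl_pulse() emits one clock cycle, so the slave only ever sees time advance when the master decides it should.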