Is the clock sent to the slave device programmable?

Thread Starter

mukesh1

Joined Mar 22, 2020
68
Most microcontrollers, like AVR and PIC parts, have an SCL/SCK pin. I think this pin supplies clock pulses to I2C and SPI devices, while the micro itself receives its clock from an oscillator.

Is the clock sent to the slave device programmable?
 

Papabravo

Joined Feb 24, 2006
18,432
Most interfaces provide some implementation choices as to clock frequency. I don't think I've ever seen an I2C or SPI hardware interface where you would mistake the clock generator for an arbitrary waveform generator with respect to frequency and duty cycle. In the days when these interfaces were done by bit banging, there was much greater control over the clock waveform, at the expense of speed and consistency. In short, you have enough choices to do pretty much anything you can imagine.

Note the disclaimer in post #2 that the hardware is not compatible with the specification in some respects. This is a case of hardware vendors innovating faster than the specification writers. Is this a good thing or a bad thing? I don't know, but Microchip makes more hardware than the specification writers do, so maybe the hardware becomes the new "de facto" standard. It would not be the first time this has happened.
 

Thread Starter

mukesh1

Joined Mar 22, 2020
68
Most interfaces provide some implementation choices as to clock frequency. I don't think I've ever seen an I2C or SPI hardware interface where you would mistake the clock generator for an arbitrary waveform generator with respect to frequency and duty cycle. In the days when these interfaces were done by bit banging, there was much greater control over the clock waveform, at the expense of speed and consistency. In short, you have enough choices to do pretty much anything you can imagine.

Note the disclaimer in post #2 that the hardware is not compatible with the specification in some respects. This is a case of hardware vendors innovating faster than the specification writers. Is this a good thing or a bad thing? I don't know, but Microchip makes more hardware than the specification writers do, so maybe the hardware becomes the new "de facto" standard. It would not be the first time this has happened.
The microcontroller generates the clock pulses that are sent to the slave device.

Are these clock pulses controlled by the program?

Does it depend on the programmer, i.e. can the program send any number of pulses, like 4, 8, or 12?
 

Papabravo

Joined Feb 24, 2006
18,432
The microcontroller generates the clock pulses that are sent to the slave device.

Yes, I know this, and there is no use repeating it.

Are these clock pulses controlled by the program?

If you have implemented an SPI or I2C interface in code, then yes. If you have a hardware interface, then you configure the clock parameters once, and after that the clock is controlled by the hardware.

Does it depend on the programmer, i.e. can the program send any number of pulses, like 4, 8, or 12?

For the hardware SPI interface you always get 8 clock pulses per byte. For a software SPI interface you can do whatever you want. I've never seen an I2C interface that worked on anything except 8 clocks per data byte plus a ninth clock for the ACK.

Why do you ask?
 
Last edited:

BobaMosfet

Joined Jul 1, 2009
1,979
Mainly any micro like avr, pic has SCL pin. I think this pin supplies clock pulses to i2c and spi devices. micro receives clock pulses from oscillator.

Is the clock sent to the slave device programmable?
@mukesh1 You'll get a variety of responses, but in the end, I think you need a starting point. And that starting point is: what is a clock? Typically a clock is a means of keeping regular time, slicing time into equal increments so it can be measured. In the world of processors, however, it carries that meaning plus an additional one: it can also be thought of as a 'trigger'.

Thus, when we talk about I2C/TWI, the 'clock' is both a means of keeping time and a trigger that marks when communication is occurring. Here is a scope shot of an I2C transfer, where the UPPER trace is SDA and the LOWER trace is SCL:

[Scope capture: SDA (upper trace) and SCL (lower trace) during an I2C transfer]

You can see, per the I2C protocol, that the clock is pulsed for each 9-bit sequence, but only when data is being sent. So it isn't like the oscillator that provides the heartbeat clock to the MCU, which pulses constantly and never stops. The I2C clock is intermittent, running only while data is being transferred. How else would the receiver know whether SDA holds a 1 or a 0, unless SCL triggered it to sample SDA on each clock cycle?

Lastly, the SCL clock pulses are generated by the MCU's built-in peripheral, if it has one (such as the TWI module on an ATmega32). If the MCU has no built-in I2C/TWI peripheral, you have to bit-bang the protocol yourself, toggling SCL and SDA on ordinary GPIO pins in software.
 