STM32 Timers

Thread Starter

ApacheKid

Joined Jan 12, 2015
1,609
You don't use SW loops for timing functions. You let the timer do it for you.
As I said, I use timers to generate VGA VSYNC, HSYNC, and VIDEO straight from hardware without SW intervention.
I can digitize 50MHz ADC continuously straight into memory without SW and interrupts.
Yes, of course. I'm just playing around and slowly getting more insight; next I want to use a timer to drive a GPIO pin.
 

402DF855

Joined Feb 9, 2013
271
Just as the article shows, you can examine the emitted assembly language code and perhaps clip a bit more time off, if you are interested in that.

And without knowledge of that particular part, I agree with the mention of using a PWM to drive a pin without toggling manually. I'd expect the part to have such a feature. Again repeating, look at the datasheet.

Using a similar canned HAL from Atmel, last week I measured a GPIO toggle routine at about 40 instructions. That's not horrible, but the same should be doable in maybe 10 or fewer. Much of the supplied code is not tuned for ultra-high performance. It can be challenging to explain to management why we aren't achieving theoretical max throughput. "I'm givin' her all she's got Cap'n!"
 

Thread Starter

ApacheKid

Joined Jan 12, 2015
1,609
Just as the article shows, you can examine the emitted assembly language code and perhaps clip a bit more time off, if you are interested in that.

And without knowledge of that particular part, I agree with the mention of using a PWM to drive a pin without toggling manually. I'd expect the part to have such a feature. Again repeating, look at the datasheet.

Using a similar canned HAL from Atmel, last week I measured a GPIO toggle routine at about 40 instructions. That's not horrible, but the same should be doable in maybe 10 or fewer. Much of the supplied code is not tuned for ultra-high performance. It can be challenging to explain to management why we aren't achieving theoretical max throughput. "I'm givin' her all she's got Cap'n!"
Ha, "she cannie tek much more cap'n"
 

Analog Ground

Joined Apr 24, 2019
460
Microcontrollers often have special functions for GPIO outputs which allow for setting, resetting or toggling the output level in a very fast way. The major advantage of these special functions is that a read-modify-write sequence is not required to change the level of a bit in the GPIO register. Only a write is needed. This saves lots of instructions. I see a function called "I/O data bitwise handling" in an STM32 reference manual which I happen to have on my laptop. This seems to have at least the bit set and reset functions. Check this out to squeeze more speed out of software-controlled output bit twiddling.
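
For example (a minimal sketch, assuming PB15; in GPIOx_BSRR the low 16 bits set pins and the high 16 bits reset them, with no read-modify-write):

GPIOB->BSRR = 1u << 15;         // set PB15 high, other pins untouched
GPIOB->BSRR = 1u << (15 + 16);  // reset PB15 low, other pins untouched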

Edit: This technique is exactly what is going on with Post #19 by the TS. Two writes to the GPIOx_BSRR register set and reset the output. Sorry, I did not read the previous post carefully enough. The next step to try for faster execution is to change the compiler optimization levels and look for more efficient code. Warning: at the highest level, you may lose the ability to use the debugger, since the context it requires may be wiped out by the compiler. I might expect a speed-up of 2x or more at max speed optimization.
 
Last edited:

MrChips

Joined Oct 2, 2009
30,795
If you want to control a GPIO from software, try writing directly to the output register ODR.
For example:

GPIOB->ODR = 0x8000;   // bit 15 high, all other port B pins low
GPIOB->ODR = 0;        // whole port low

will send a 30ns high pulse on PB15, assuming you have configured PB15 correctly.
That's 5 clock cycles at 6ns per cycle @ 168MHz.
 

MrChips

Joined Oct 2, 2009
30,795
And if you really want to go bare metal you can write in assembler.

STR R0, [R2]   ; R2 = &GPIOB->ODR, R0 = preset 'high' pattern
STR R1, [R2]   ; R1 = preset 'low' pattern

where the three registers were preset before entering the loop,
and this will give a 6ns high pulse, i.e. one clock cycle.
The loop runs in 5 cycles giving a 33MHz signal.
 

Thread Starter

ApacheKid

Joined Jan 12, 2015
1,609
If you want to control a GPIO from software, try writing directly to the output register ODR.
For example:

GPIOB->ODR = 0x8000;   // bit 15 high, all other port B pins low
GPIOB->ODR = 0;        // whole port low

will send a 30ns high pulse on PB15, assuming you have configured PB15 correctly.
That's 5 clock cycles at 6ns per cycle @ 168MHz.
This brings up a question I asked earlier: how does one get the CPU clock to go from 16 MHz (the default, it seems) up to 180 MHz?

I've read many articles about this subject and still can't get a good clear understanding.

I have no idea if the Discovery boards I have even allow the CPU to run faster. Yes, I know these devices can be clocked by an external PLL-based clock, but do these boards have that?

All quite confusing - for a newcomer anyway...
 

MrChips

Joined Oct 2, 2009
30,795
The MCU has its own internal oscillator and PLL. The internal oscillator runs at 16MHz. The PLL can be set by system SW to the desired frequency.

The 16MHz oscillator may not be stable enough for many applications. An 8MHz crystal is often used.
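
The switch-over is all done through the RCC registers. A minimal sketch, assuming an 8MHz HSE crystal (as fitted on the F4 Discovery boards), a 168MHz part, and the register/bit names from the CMSIS device header (stm32f4xx.h); the 180MHz parts use the same sequence with different PLL numbers:

#include "stm32f4xx.h"

/* Sketch only: bring SYSCLK from the 16MHz HSI up to 168MHz via the PLL. */
void clock_to_168mhz(void)
{
    RCC->CR |= RCC_CR_HSEON;                      /* start the crystal oscillator */
    while (!(RCC->CR & RCC_CR_HSERDY)) {}         /* wait until it is stable */

    /* PLL: 8MHz /M(8) = 1MHz, *N(336) = 336MHz VCO, /P(2) = 168MHz SYSCLK */
    RCC->PLLCFGR = (8u << RCC_PLLCFGR_PLLM_Pos)
                 | (336u << RCC_PLLCFGR_PLLN_Pos)
                 | (0u << RCC_PLLCFGR_PLLP_Pos)   /* 00 means divide by 2 */
                 | (7u << RCC_PLLCFGR_PLLQ_Pos)   /* 48MHz for USB */
                 | RCC_PLLCFGR_PLLSRC_HSE;
    RCC->CR |= RCC_CR_PLLON;                      /* start the PLL */
    while (!(RCC->CR & RCC_CR_PLLRDY)) {}

    /* Flash wait states and bus prescalers must suit the new speed */
    FLASH->ACR = FLASH_ACR_LATENCY_5WS | FLASH_ACR_PRFTEN;
    RCC->CFGR = RCC_CFGR_HPRE_DIV1 | RCC_CFGR_PPRE1_DIV4 | RCC_CFGR_PPRE2_DIV2;

    RCC->CFGR |= RCC_CFGR_SW_PLL;                 /* switch SYSCLK over */
    while ((RCC->CFGR & RCC_CFGR_SWS) != RCC_CFGR_SWS_PLL) {}
}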
 

Stuntman

Joined Mar 28, 2011
222
I looked through the comments so far and I think you are getting the point:

Yes, you can have a GPIO toggle based on a timer peripheral (i.e., have TIM1 directly toggle TIM1_CH1). And, as you suggest in the original post, this lets the toggling happen accurately and independently of the main program.
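
For reference, the bare-register version of that is small. A minimal sketch, assuming an STM32F4 with TIM1_CH1 on PA8 (AF1), a 168MHz timer clock, and the CMSIS header names:

#include "stm32f4xx.h"

/* Sketch only: TIM1 channel 1 toggles its pin in hardware
   (output-compare "toggle" mode), no software in the loop. */
void tim1_toggle_pa8(void)
{
    RCC->AHB1ENR |= RCC_AHB1ENR_GPIOAEN;       /* clock the GPIO port */
    RCC->APB2ENR |= RCC_APB2ENR_TIM1EN;        /* clock the timer */

    GPIOA->MODER  |= 2u << (8 * 2);            /* PA8: alternate function */
    GPIOA->AFR[1] |= 1u << 0;                  /* AF1 = TIM1_CH1 */

    TIM1->PSC   = 168 - 1;                     /* 168MHz / 168 = 1MHz tick */
    TIM1->ARR   = 1000 - 1;                    /* match every 1ms -> 500Hz out */
    TIM1->CCR1  = 0;                           /* compare point within the period */
    TIM1->CCMR1 = TIM_CCMR1_OC1M_0 | TIM_CCMR1_OC1M_1;  /* OC1M = 011: toggle */
    TIM1->CCER  = TIM_CCER_CC1E;               /* enable the channel output */
    TIM1->BDTR  = TIM_BDTR_MOE;                /* advanced timers need MOE set */
    TIM1->CR1   = TIM_CR1_CEN;                 /* start counting */
}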

I will, however, suggest that if you want to simply get some of these features working, a great place to start is to download ST's CubeMX and use it to set up your controller. Although I can attest to its wastefulness, it will shed a lot of light on how to set up the timer-controlled GPIO, as well as the internal/external oscillators.

As to your frustration with clock setup, know this: before CubeMX, it was either register-level programming or ST's Standard Peripheral Library (which was usable). Even with these tools, the setup of the oscillator and system clocks became so cumbersome that there were Excel-based GUIs you could download for setting them up. You filled out a number of fields/drop-down menus in Excel, then generated a library file to include with your project.

Save yourself a headache and hit up Cube.
 