I am busy programming PIC controllers to decode simple .mid files to send as raw MIDI data to synth, but despite much googling I am stuck on understanding what a "tick" is.
Google hits reveal plenty about how many ticks make up a beat and all about delta-time, but nothing about how much time a tick actually takes.
I downloaded a simple silent_night_easy.mid file and understand pretty much every byte of the file, but without knowing how long a tick takes, the timing eludes me.
I can attach the .mid file and my decode of it if requested, but without an understanding of a MIDI "tick" I suspect they wouldn't add anything.
To be clear - I want to know how to work out the length of a "tick" in seconds/milliseconds/picoseconds. NOT as a fraction of something else undefined.
Can anybody help please?
Many thanks