Finding The Right Bitrate for Packet Transmission With H264 Encoder

Hi everyone,

I have the following question. I have to encode a video file and transmit it over an IP-based packet-switched
network. The available bitrate is 100 kbps, and 10 in 100 packets are corrupted due to errors or loss. Assume that all
packets have the same length and that corrupted packets have to be retransmitted. To encode the file, I am using an H.264 encoder
and varying the QP and the FrameSkip to vary the frame rate.

I thought I had to keep the encoder bitrate below 90% of the available bandwidth, since (1 - 10/100) * 100 = 90%, so
0.9 * 100 kbps = 90 kbps. But supposedly I am not taking the retransmission of lost packets into account, so my value of 90 kbps is not
right. Could someone explain to me what the true bitrate should be? Thanks in advance.
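To sanity-check my own reasoning, here is a quick script I put together. It assumes every transmission, including retransmissions, is corrupted independently with probability p = 0.1, so the expected number of transmissions per packet is the geometric series 1 + p + p^2 + ... = 1/(1-p). The variable names are just my own:

```python
# Sanity check: effective goodput when retransmissions can also be lost.
# Assumption: each transmission fails independently with probability p_loss.
p_loss = 0.1          # 10 in 100 packets corrupted
link_rate_kbps = 100.0

# Expected transmissions per delivered packet: 1 + p + p^2 + ... = 1/(1 - p)
expected_tx = 1.0 / (1.0 - p_loss)

# The encoder output has to fit under this effective rate
goodput_kbps = link_rate_kbps / expected_tx

print(f"expected transmissions per packet: {expected_tx:.3f}")
print(f"usable bitrate: {goodput_kbps:.1f} kbps")
```

Interestingly, under this independence assumption the geometric-series calculation lands on the same 90 kbps I got, which is why I'm confused about what I'm supposedly missing.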