Merge sort algorithm

Thread Starter

Ryan$

Joined Dec 14, 2018
178
Hi guys, I have written merge sort like this (in terms of the algorithm):
Merge_Sort(A, p, r)
{
    if (p < r)
    {
        q = Lower_Bound((p + r) / 2);   // floor: e.g. 2.5 => q = 2
        Merge_Sort(A, p, q);
        Merge_Sort(A, q + 1, r);
        Merge(A, p, q, r);
    }
}
That's fine, but here's my question: the prof said it doesn't matter whether you use Lower_Bound or Upper_Bound for the median. But if I use Upper_Bound (meaning 2.5 => 3), we get an infinite loop of recursive calls (try Merge_Sort(A, 4, 5)), and that's incorrect! Can anyone help me see how the prof can claim that the upper or lower bound doesn't matter? Thanks.
 

WBahn

Joined Mar 31, 2012
26,398
Please format your code properly. There is no way for us to tell which statements are controlled by the if() structure. Don't make us guess what you mean -- engineering is not about guessing.

Using CODE tags will help.

The problem, however, is that you recurse if the first argument is less than the second argument.

So what if r = p + 1? Your value for q is (2p + 1)/2 = p + 0.5, and if you take the upper bound then you get p + 1, which is the original value of r.

You need to look very carefully at how all of your values interact to be sure that your bounding cases (all the possible paths to the base case) are consistent.
 

Thread Starter

Ryan$

Joined Dec 14, 2018
178
WBahn said:
> Please format your code properly. There is no way for us to tell which statements are controlled by the if() structure. […]
But then why, when I analyze the time complexity of this code, do we take T(n) = 2T(n/2) + c (because the median q = (p+r)/2 is almost n/2) and not T(n) = 2T(ceil(n/2)) + c or T(n) = 2T(floor(n/2)) + c? I'm still really confused!
You may say the upper/lower bound of n/2 is approximately n/2, but then why don't we account for it in the time complexity? When do we neglect these terms and when don't we?

thank you very much!
 

WBahn

Joined Mar 31, 2012
26,398
Ryan$ said:
> But why when I want to analyze the time complexity of this code, we take T(n)=2T(n/2) +c […]
Because (n/2) and (n/2) + 1 are both O(n).

Plus, remember that you are splitting n things into two groups. If n is even, then the split is equal. If n is odd, then one group is 0.5 more than exactly half and the other is 0.5 less than exactly half. All that the upper/lower bound choice does (if implemented properly) is determine whether the first call or the second call gets the larger of the two.
 

Thread Starter

Ryan$

Joined Dec 14, 2018
178
WBahn said:
> Because (n/2) and (n/2) + 1 are both O(n). […]
But why are you comparing them to O(n) and not, for instance, to O(n^2)? And why did you choose Big O specifically rather than Omega (Ω)? That's the trap I always fall into!
Neglecting terms once I know what I'm comparing against is easy for me, but again: how do I determine what to compare against? For instance, you just compared against O(n).
 

Thread Starter

Ryan$

Joined Dec 14, 2018
178
Also, does evaluating an argument of Merge_Sort() like "(A, q+1, r)", where q+1 has to be computed, also take time? Or does it take time that we are just neglecting?
 

WBahn

Joined Mar 31, 2012
26,398
Ryan$ said:
> Also, does the argument of Merge_sort() like "(A, q+1, r)" which doing q+1 take also a time? […]
Yes, it takes time. But it is incorporated into the time it takes to call Merge_Sort(), which includes evaluating the arguments, saving the necessary state information, performing the jump to the subroutine code, restoring the state information at the end, and returning from the subroutine.

This takes constant time -- which, remember, does NOT mean that it always takes exactly the same amount of time -- evaluating the arguments takes a slightly different time when you have (A, q, r) than when you have (A, q+1, r). By "constant time" we merely mean that the time does not depend on the size of the problem. As long as there are any terms that DO grow as the size of the problem grows, these constant-time components are driven to insignificance.
 

Thread Starter

Ryan$

Joined Dec 14, 2018
178
WBahn said:
> Yes, it takes time. But it is incorporated into the time it takes to call Merge_Sort() […]
Thanks a lot; I understand it very well now.
 