Data structure about the Best Case (OMEGA)

Thread Starter

Ryan$

Joined Dec 14, 2018
178
Hi guys, there's something about time complexity that I can't understand, and I hope you can help me.
Firstly, I understand the definition of Omega and the definition of Big O, but here's what seems weird to me:
My lecturer said that if we want to bound the best case, it's enough to find one case and say:
best case ≤ T(n)
I asked him why it's enough to find one case to say that T(n) is actually bigger than it; couldn't there be an even more suitable best case? He explained it like this: let y = the max grade among all the students in the class, and assume one of them says his grade is 40. Then y ≥ 40, and that single grade is of course enough to prove that y ≥ 40. He said the same applies to the best case of time complexity: it's enough to find one case to say that T(n) ≥ the case that I found. But why? I'm still confused about this point.
For the worst case, by contrast, we don't simply take any one case and say T(n) ≤ that case.

Can anyone help me understand why, for the best case, it suffices to find just "one" case (maybe there are many better cases, but we still need only one!) and that's enough to prove that T(n) ≥ that case?
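The lecturer's grade analogy can be made concrete with a small Python sketch (the grade values here are made up purely for illustration): observing any single grade gives a lower bound on the class maximum, and by the same logic, timing any single input of size n gives a lower bound on the worst-case time T(n).

```python
# Hypothetical grades, invented just to illustrate the lecturer's analogy.
grades = [72, 40, 88, 95, 61]

y = max(grades)  # y = max grade among all students in the class

# One student reporting a grade of 40 is enough to prove y >= 40,
# because the maximum is >= every individual value by definition:
assert y >= 40
assert all(y >= g for g in grades)

# Same idea for T(n): since T(n) is the MAXIMUM running time over all
# inputs of size n, the time of any single input is a lower bound on T(n).
```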
 

WBahn

Joined Mar 31, 2012
30,071
This doesn't make a whole lot of sense at all.

We agreed in another thread that T(n) for a given algorithm is the MAXIMUM time that it takes to solve ANY problem of size n using that algorithm (and/or that implementation of that algorithm).

So, by definition, NO case is going to take MORE than T(n), but at least one case will take exactly T(n).

But that does NOT guarantee that there are any cases that take less than T(n). What if every case takes exactly T(n)? You can't rule that out and, in fact, there are plenty of algorithms in which the time complexity is completely data-independent so that all runs of a given size take the same amount of time (barring nondeterministic effects that are outside the scope and control of the algorithm).
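To illustrate that last point, here is a sketch in Python (the step-counting helpers are my own illustration, not from the thread) contrasting a data-independent algorithm, where every input of size n costs exactly the same, with a data-dependent one, where the cost varies with the input:

```python
def count_sum_steps(data):
    """Summing a list always touches every element: the step count
    is completely data-independent."""
    steps = 0
    total = 0
    for x in data:
        total += x
        steps += 1
    return steps

def count_search_steps(data, target):
    """Linear search stops as soon as it finds the target: the step
    count depends on where (or whether) the target appears."""
    steps = 0
    for x in data:
        steps += 1
        if x == target:
            break
    return steps

n = 8
data = list(range(n))
# Summation: EVERY input of size n takes exactly n steps,
# so best case = worst case = n. No case takes less than T(n).
assert count_sum_steps(data) == n
# Linear search: the cost ranges from 1 step (target first) ...
assert count_search_steps(data, 0) == 1
# ... to n steps (target absent), so best and worst cases differ.
assert count_search_steps(data, 99) == n
```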
 

Thread Starter

Ryan$

Joined Dec 14, 2018
178
WOW! I got you exactly.
Here is what I conclude from your explanation:
I could guess at some "best case" time, and it might even happen to work, but it wouldn't necessarily be the tight one that's equal to or smaller than T(n). Of course, maybe it would be the tight one, but who knows? That "maybe" makes all the difference, and in algorithms we reason logically. What does that mean? We must cover all the circumstances and remove all doubt. As a result, we need to do a general analysis, step by step, to rule out the "maybe it's like that, maybe it's not", and to solve the problem the appropriate way!

Thanks!
 