# Best-case complexity

### From Wikipedia

When dealing with computer algorithms it is useful to know not only the worst-case complexity but also the best case for an algorithm. This complexity can be split into two fields: time complexity and space complexity. The definition using Ω(*g*(*n*)), read "big Omega of *g*", applies to both time and space, so only one definition of best-case complexity is required. Informally, the best-case complexity is the largest *g*(*n*) such that *T*(*n*) = Ω(*g*(*n*)) (*T*(*n*) is in Ω(*g*(*n*))).

**Best-case complexity** *b* = inf(*S*), where *S* = { *f* | there exist *c*, *B* > 0 such that for all *n* ≥ *B*, *g*(*n*) ≥ *c*·*f*(*n*) } and *g*(*n*) is the running time of the algorithm.

In plain English this means that the best-case complexity is the tight bound, inf(*S*), over the set of all functions *f* such that *g* is in Ω(*f*(*n*)) for all *n* greater than *B*, the *breaking point*.

Best-case complexity is not always the best measure of how well an algorithm performs. More often computer scientists care about the worst-case complexity and, after that, how well an algorithm runs on average. An algorithm with a best case of Ω(1) is of little comfort if its worst case is O(2^{n}).


## Time Complexity

Best-case time complexity is the least amount of time it would take an algorithm to execute. Small differences such as built-in commands and variable accesses are ignored, but loops and recursion have a much more profound effect on runtime. A trivial example is this Java program, which prints the numbers 0 through *n* in order:

```java
public static void main(String[] args) {
    int n = Integer.parseInt(args[0]); // first command-line argument
    for (int i = 0; i <= n; i++) {
        System.out.println(i);
    }
}
```

The worst-case complexity for this algorithm is the same as its best case: the algorithm is in Ω(*n*). In many cases, however, the best case does not equal the worst case. In these instances it is quite easy to find a *g*(*n*) such that *T*(*n*) = Ω(*g*(*n*)); finding the tight bound, i.e. inf(*S*), requires looking at the input sequences that yield the best results and proving there is no sequence that yields a smaller *g*(*n*).

Another, less trivial, example is linear search in Java:

```java
public int search(int[] A, int x) {
    boolean found = false;
    int c = 0;
    int index = -1;
    while (c < A.length && !found) {
        if (A[c] == x) {
            found = true;
            index = c;
        }
        c++; // advance, otherwise the loop never terminates
    }
    return index;
}
```

In this case the worst case is clearly O(*n*): it occurs when the entire array must be traversed. However, with A[0] = x the algorithm only needs to check the first element, so at best the runtime is constant, no matter the size of the input. Since no input could let the algorithm do better than a single check, Ω(1) is the best-case complexity.
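To make the gap between the two cases concrete, here is a self-contained sketch of linear search instrumented with a comparison counter (the class name `BestCaseDemo` and the `comparisons` field are illustrative additions, not part of the original example):

```java
public class BestCaseDemo {
    // Number of elements examined by the most recent call to search.
    static int comparisons;

    static int search(int[] a, int x) {
        comparisons = 0;
        for (int c = 0; c < a.length; c++) {
            comparisons++;
            if (a[c] == x) {
                return c;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] a = {7, 3, 9, 1, 5};
        search(a, 7);                        // best case: x is at index 0
        System.out.println(comparisons);     // 1 comparison — constant time
        search(a, 5);                        // worst case: x is the last element
        System.out.println(comparisons);     // 5 comparisons — linear time
    }
}
```

Running it shows one comparison in the best case and *n* comparisons in the worst, regardless of how large the array grows.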

For example, for a bubble sort modified to check for a sorted list, the worst case is O(*n*^{2}), while a list already in sorted order yields a best case of Ω(*n*). Even though it is a trivial case, it must still be considered.
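One way to sketch such a modified bubble sort (the class name and exact loop structure here are illustrative) is to track whether a pass made any swap and stop early when it did not; on an already-sorted list the first pass makes no swaps, so only about *n* comparisons are performed:

```java
public class BubbleSortEarlyExit {
    // Bubble sort that stops as soon as a full pass makes no swap.
    // Best case (sorted input): one pass, Omega(n).
    // Worst case (reversed input): n-1 passes, O(n^2).
    static int[] sort(int[] a) {
        boolean swapped = true;
        for (int pass = 0; swapped && pass < a.length - 1; pass++) {
            swapped = false;
            for (int i = 0; i < a.length - 1 - pass; i++) {
                if (a[i] > a[i + 1]) {
                    int tmp = a[i];
                    a[i] = a[i + 1];
                    a[i + 1] = tmp;
                    swapped = true;
                }
            }
        }
        return a;
    }

    public static void main(String[] args) {
        // Sorted input: the first pass finds nothing to swap and exits.
        System.out.println(java.util.Arrays.toString(sort(new int[]{1, 2, 3})));
        // Reversed input: every pass swaps, the full quadratic work is done.
        System.out.println(java.util.Arrays.toString(sort(new int[]{3, 2, 1})));
    }
}
```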

## Space complexity

Space complexity is the measure of the amount of space, excluding the original input data, that it takes an algorithm to execute on an input. Any algorithm that must always store all of the data at once runs in Ω(*n*) space. If, however, an algorithm only needs to store a counter of some kind, its space complexity is log(*n*), where *n* is the final value of the counter. This is because it only takes log_{2}(*n*) bits to store the number *n*.
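A small sketch of this distinction (the class name `SpaceDemo` and the choice of a stream iterator are illustrative assumptions): counting the elements of a stream needs only a single counter, whose value *n* fits in about log₂(*n*) bits, rather than storing the whole input:

```java
import java.util.Iterator;
import java.util.List;

public class SpaceDemo {
    // Counts the elements of a stream using a single counter.
    // The only extra storage is `n` itself, which takes about
    // log2(n) bits — O(log n) space, not O(n).
    static long count(Iterator<?> stream) {
        long n = 0; // the only extra storage used
        while (stream.hasNext()) {
            stream.next();
            n++;
        }
        return n;
    }

    public static void main(String[] args) {
        System.out.println(count(List.of("a", "b", "c").iterator())); // 3
        // Bits needed to store the counter value 3: ceil(log2(4)) = 2.
        System.out.println(64 - Long.numberOfLeadingZeros(3)); // 2
    }
}
```

By contrast, an algorithm that buffered every element before counting them would use Ω(*n*) space for the same task.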


## References


- Katoen, Joost-Pieter. *Introduction to Algorithm Analysis*, lecture notes, 2002.

## Further reading

- Paul E. Black, "Ω", in *Dictionary of Algorithms and Data Structures* [online], Paul E. Black, ed., U.S. National Institute of Standards and Technology, 14 August 2008. Retrieved 20 February 2009. Available from: http://www.itl.nist.gov/div897/sqg/dads/HTML/omegaCapital.html
- Michael Sipser (1997). *Introduction to the Theory of Computation*. PWS Publishing.
- Joachim Ziegler, *The LEDA Tutorial*, 2004, section 2.2.3.