Common Big-Oh Running Times
- O(1) or constant time: The holy grail of algorithms
is a program that takes the same amount of time regardless
of the problem size.
An example of O(1) time is table lookup,
where the table is represented by an array.
Any problem with a finite number of possible inputs can be solved
in O(1) time by simply looking up the answer in a pre-computed
table (a lookup sketch appears after this list).
- O(log n) or logarithmic time:
A divide-and-conquer technique that repeatedly cuts the problem size
in half results in O(log n) running time.
Binary search is O(log n),
where n is the size of the sorted lookup table
(see the binary search sketch after this list).
- O(n) or linear time: Since most algorithms must look
at the entire input, they require at least linear time,
so a linear-time algorithm
is usually considered a terrific solution to a problem
(see the linear scan sketch after this list).
- O(n log n) or n log n time:
A divide-and-conquer algorithm that does a linear amount of work
at each of its O(log n) levels of recursion
runs in O(n log n) time.
Efficient sorting algorithms (e.g. Heapsort and Mergesort) are
O(n log n) worst-case; Quicksort is O(n log n) average-case
(see the Mergesort sketch after this list).
- O(n^2) or quadratic time: Doubly nested loops typically
result in programs that require quadratic time.
Several (inefficient)
sorting algorithms, such as Bubblesort, require O(n^2) time
(see the Bubblesort sketch after this list).
- O(n^3) or cubic time: Triply nested loops typically produce
programs that require cubic time
(see the matrix multiplication sketch after this list).
- O(2^n) or exponential time: Algorithms that require exponential time
are essentially impractical except for the smallest inputs.
Some problems require exponential time (i.e., there is
no efficient program for these problems), and thus are
not generally solvable by computer in any practical amount of time
(see the subset-listing sketch after this list).
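
As an O(1) example, here is a minimal Python sketch of table lookup
in a pre-computed array. The particular task (counting the 1-bits in
a 4-bit value) and the names POPCOUNT and popcount_4bit are ours,
chosen purely for illustration; only the lookup itself is O(1), the
table being built once ahead of time.

    # Built once, ahead of time: the number of 1-bits in every
    # 4-bit value (0..15).
    POPCOUNT = [bin(v).count("1") for v in range(16)]

    def popcount_4bit(x):
        # A single array index: the same amount of work for every
        # valid input, no matter which of the 16 inputs it is.
        return POPCOUNT[x]

    print(popcount_4bit(13))  # prints 3 (binary 1101 has three 1-bits)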
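
The binary search below is one standard way to realize the halving
idea in Python; it assumes the table is a sorted list, and returning
-1 for a missing target is our convention, not a requirement.

    def binary_search(table, target):
        # Each iteration halves the remaining range, so a table of
        # size n is exhausted after about log2(n) iterations.
        lo, hi = 0, len(table) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if table[mid] == target:
                return mid
            elif table[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1  # target is not in the table

    print(binary_search([2, 3, 5, 7, 11, 13], 11))  # prints 4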
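
A representative linear-time scan, sketched in Python: finding the
maximum of an unsorted list must look at every element once. The
example task is ours; any single pass over the input behaves the
same way.

    def maximum(values):
        # One comparison per element after the first: n - 1 steps
        # for an input of size n, i.e. linear time.
        best = values[0]
        for v in values[1:]:
            if v > best:
                best = v
        return best

    print(maximum([4, 1, 9, 7]))  # prints 9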
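
Mergesort is the easiest O(n log n) sort to sketch (Heapsort,
mentioned above, has the same worst-case bound but needs more
machinery); this is a plain recursive version, not a tuned
implementation.

    def merge_sort(values):
        # About log2(n) levels of halving, with a linear amount of
        # merging work at each level: O(n log n) overall.
        if len(values) <= 1:
            return values
        mid = len(values) // 2
        left = merge_sort(values[:mid])
        right = merge_sort(values[mid:])
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        # One side may still have leftovers; both slices below are
        # sorted, and at most one is non-empty.
        return merged + left[i:] + right[j:]

    print(merge_sort([5, 2, 4, 6, 1, 3]))  # prints [1, 2, 3, 4, 5, 6]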
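
Bubblesort, named above, shows the doubly nested loop pattern
directly; sorting a copy rather than sorting in place is our choice
for the sketch.

    def bubble_sort(values):
        # Two nested loops over the input: roughly n * n comparisons.
        values = list(values)  # sort a copy, leave the input alone
        n = len(values)
        for i in range(n):
            for j in range(n - 1 - i):
                if values[j] > values[j + 1]:
                    values[j], values[j + 1] = values[j + 1], values[j]
        return values

    print(bubble_sort([5, 2, 4, 6, 1, 3]))  # prints [1, 2, 3, 4, 5, 6]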
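
A classic triply nested loop is schoolbook matrix multiplication;
the example below is our choice, not one named in the text above.
Multiplying two n-by-n matrices this way takes n^3 multiply-add
steps.

    def matrix_multiply(a, b):
        # Three nested loops of n iterations each: n^3 steps
        # for two n-by-n matrices.
        n = len(a)
        c = [[0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                for k in range(n):
                    c[i][j] += a[i][k] * b[k][j]
        return c

    print(matrix_multiply([[1, 2], [3, 4]], [[5, 6], [7, 8]]))
    # prints [[19, 22], [43, 50]]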
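
Finally, an illustrative exponential-time task (our example, not one
from the text above): listing every subset of an n-element set.
Since a set of n items has 2^n subsets, merely writing them all down
takes exponential time no matter how the code is organized.

    def all_subsets(items):
        # Every subset either omits the first item or includes it,
        # so the output doubles with each added item: 2^n subsets.
        if not items:
            return [[]]
        rest = all_subsets(items[1:])
        return rest + [[items[0]] + s for s in rest]

    print(len(all_subsets(list(range(10)))))  # prints 1024, i.e. 2^10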