This material is only intended as a brief review of the important concepts covered in CSC 172.
When solving a problem on a computer, we typically seek an algorithm that makes efficient use of the computer resources (especially time and space).
An important criterion for evaluating an algorithm is how long it takes (or how much memory it takes) to solve problems of various sizes.
The analysis of algorithms is a mathematical approach to describing the time and space needs of an algorithm in terms of the problem size.
Big-Oh notation is used to hide constant factors (details) that are not part of the big picture. If the running time of a program is O(n), that means the running time is at most c*n time units for some constant c, once the input size n is large enough.
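As an illustration (a minimal sketch; Java is assumed here as the course language, and the class and method names are hypothetical), the loop below visits each of the n array elements exactly once, so its running time is bounded by c*n for some machine-dependent constant c, i.e., it is O(n):

```java
public class LinearScan {
    // Sums the elements of an array of length n.
    // The loop body executes exactly n times, so the total running time
    // is at most c*n for some machine-dependent constant c: O(n).
    public static int sum(int[] a) {
        int total = 0;
        for (int i = 0; i < a.length; i++) {
            total += a[i];
        }
        return total;
    }

    public static void main(String[] args) {
        int[] a = {3, 1, 4, 1, 5, 9};
        System.out.println(sum(a));   // prints 23
    }
}
```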
T(n) (the running time of a program on an input of size n) is said to be O(f(n)) if there exist positive constants c and n0 such that T(n) <= c*f(n) for all n >= n0. If T(n) is O(f(n)), where f(n) = n^2, this means that in all but a finite number of cases (namely those where the input size is less than n0), the program requires at most n^2 steps, to within the constant factor c.
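For example, suppose T(n) = 3n^2 + 5n. Taking c = 4 and n0 = 5, we have 3n^2 + 5n <= 4n^2 whenever n >= 5 (since 5n <= n^2 once n >= 5), so T(n) is O(n^2).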