Instead of real running time, we need to develop a notion of abstract time. To compute the abstract time, we count the number of abstract steps performed in an execution of the algorithm in question, or count the number of significant operations performed, such as comparisons, multiplications, and copies. This eliminates the dependence on technology and on implementation.

We can express abstract time as a function of the size of the input.
Suppose that *A* is an algorithm and Input represents an input to *A*. Let |*Input*| = *n* be the size of Input. Then the number of steps performed while executing *A* given *Input* is *A*(*Input*).

Using this measure of complexity, let's look at an example. Let's say we
have an algorithm that takes an array as input,
and for each element
in the array, it compares the element to every other element in the array.
Furthermore, let's say we give the algorithm an array of 100 elements. It
starts on the first element, and then looks at all the other 99 elements.
Then it goes to the second element, and looks at all the other 99 elements.
And so on. With our current metric, *A*(100) = 100 × 99 = 9,900.
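A minimal sketch of this counting idea, assuming the algorithm above (each element compared against every other element, with each comparison counted as one abstract step):

```python
def count_comparisons(arr):
    """Compare each element to every other element,
    counting each comparison as one abstract step."""
    steps = 0
    n = len(arr)
    for i in range(n):
        for j in range(n):
            if i != j:
                _ = arr[i] == arr[j]  # one significant operation
                steps += 1
    return steps

# For an array of 100 elements, each of the 100 elements is
# compared to the other 99, giving 100 * 99 = 9,900 steps.
print(count_comparisons(list(range(100))))  # → 9900
```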

We don't really want to compute *A*(*Input*) exactly. We want the dominant behavior of *A*(*Input*) for large inputs. We also want to ignore constant factors, since they are the least significant part of measuring resource consumption and are very sensitive to exactly how we count steps in the algorithm. In other words, we want the order of magnitude of the time complexity function: the largest-order term of the expression that describes the running time of the algorithm. For example, 5*n*^{2} + 12*n* − 3 would be expressed as *n*^{2}, since *n*^{2} is the dominant term of the expression. As *n* grows very large, the rate of growth of the function depends on *n*^{2} more than on any of the other terms, so that is all we care about. This kind of simplification is the result of asymptotic analysis.
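A quick numeric check can make this concrete. For the example function 5*n*^{2} + 12*n* − 3, the ratio *f*(*n*)/*n*^{2} approaches the constant 5 as *n* grows, showing that the lower-order terms (and the constant factor itself) become negligible relative to *n*^{2}:

```python
def f(n):
    """The example running-time function 5n^2 + 12n - 3."""
    return 5 * n**2 + 12 * n - 3

# As n grows, f(n) / n^2 approaches the constant factor 5:
# the 12n and -3 terms contribute less and less.
for n in (10, 1_000, 1_000_000):
    print(n, f(n) / n**2)
```

Running this shows the ratio settling toward 5, which is why asymptotic analysis keeps only the *n*^{2} term.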