Title: Data Structure and Algorithms Using C++
Author: Sachi Nandan Mohanty
Publisher: John Wiley & Sons Limited
Genre: Programming
ISBN: 9781119752035
1.5 Efficiency of an Algorithm
The efficiency of an algorithm can be determined by measuring the time, space, and other resources it uses in executing the program. The amount of time taken by an algorithm can be calculated by counting the number of steps the algorithm executes, while the space refers to the number of memory units it requires for storage.
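As a small illustrative sketch (not from the text), the comments below count the steps a simple summing loop performs, which is how its time requirement can be estimated; the extra storage it needs (one accumulator and one counter) measures its space requirement.

#include <iostream>

int main() {
    const int n = 5;
    int a[n] = {4, 8, 15, 16, 23};

    int sum = 0;                   // 1 assignment; 1 extra unit of storage
    for (int i = 0; i < n; ++i) {  // n + 1 tests and n increments
        sum += a[i];               // n additions and n assignments
    }
    // Roughly 3n plus a constant number of steps, using constant extra space.
    std::cout << sum << std::endl;
    return 0;
}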
1.6 Asymptotic Notations
Asymptotic notations are the symbols used to describe and compare the running time of algorithms, that is, how their resource requirements grow with the input size. The notations are:
Big Oh Notation (O)
Little Oh Notation (o)
Omega Notation (Ω)
Theta Notation (θ)
Big Oh (O) Notation
This notation gives the upper bound for a function to within a constant factor. We write f(n) = O(g(n)) if there are positive constants n0 and C such that, to the right of n0, the value of f(n) always lies on or below C·g(n).
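For example (a worked illustration, not from the text), f(n) = 3n + 2 is O(n): taking C = 4 and n0 = 2, we have 3n + 2 ≤ 4n for every n ≥ 2, so to the right of n0 the value of f(n) lies on or below C·g(n) with g(n) = n.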
Omega Notation (Ω)
This notation gives a lower bound for a function to within a constant factor. We write f(n) = Ω(g(n)) if there are positive constants n0 and C such that, to the right of n0, the value of f(n) always lies on or above C·g(n).
Theta Notation (θ)
This notation bounds the function to within constant factors. We say f(n) = θ(g(n)) if there exist positive constants n0, C1, and C2 such that, to the right of n0, the value of f(n) always lies between C1·g(n) and C2·g(n) inclusive.
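For example (again an illustration, not from the text), f(n) = 3n^2 + 2n is θ(n^2): with C1 = 3, C2 = 4, and n0 = 2, we have 3n^2 ≤ 3n^2 + 2n ≤ 4n^2 for every n ≥ 2, so f(n) lies between C1·g(n) and C2·g(n) with g(n) = n^2.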
Little Oh Notation (o)
This notation gives an upper bound that is not asymptotically tight. We write f(n) = o(g(n)) if, for every positive constant C, there is a positive constant n0 such that, to the right of n0, the value of f(n) always lies strictly below C·g(n).
Introduction
An important question is: How efficient is an algorithm or piece of code? Efficiency covers lots of resources, including:
CPU (time) usage
Memory usage
Disk usage
Network usage
All are important, but we will mostly talk about CPU time.
Be careful to differentiate between:
Performance: how much time/memory/disk/... is actually used when a program is running. This depends on the machine, compiler, etc., as well as the code.
Complexity: how do the resource requirements of a program or algorithm scale, i.e., what happens as the size of the problem being solved gets larger? Complexity affects performance but not the other way around.
The time required by a method is proportional to the number of “basic operations” that it performs. Here are some examples of basic operations:
one arithmetic operation (e.g., +, *)
one assignment
one test (e.g., x == 0)
one read
one write (of a primitive type)
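As a small sketch (hypothetical statements, not from the text), the comments below count the basic operations in a few simple C++ statements:

int x = 0;          // one assignment
int y = x + 5 * 2;  // two arithmetic operations, one assignment
if (x == 0) {       // one test
    y = y + 1;      // one arithmetic operation, one assignment
}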
Note: As an example,
O(1) refers to constant time.
O(n) indicates linear time;
O(n^k) (k fixed) refers to polynomial time;
O(log n) is called logarithmic time;
O(2^n) refers to exponential time, etc.
n^2 + 3n + 4 is O(n^2), since n^2 + 3n + 4 < 2n^2 for all n > 10. Strictly speaking, 3n + 4 is O(n^2), too, but big-O notation is often misused to mean equal to rather than less than.
1.7 How to Determine Complexities
In general, how can you determine the running time of a piece of code? The answer is that it depends on what kinds of statements are used.
1. Sequence of statements
statement 1;
statement 2;
...
statement k;
(Note: this is code that really is exactly k statements; this is not an unrolled loop like the N calls to addBefore shown above.) The total time is found by adding the times for all statements:
total time = time(statement 1) + time(statement 2) + ... + time(statement k)
If each statement is “simple” (only involves basic operations) then the time for each statement is constant and the total time is also constant: O(1). In the following examples, assume the statements are simple unless noted otherwise.
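For instance, a minimal sketch (hypothetical statements, not from the text): each statement below is simple, so each is O(1), and the total for the fixed-length sequence is still O(1).

int a = 10;            // O(1)
int b = a * 2;         // O(1)
int c = a + b;         // O(1)
bool large = c > 25;   // O(1)
// total time = O(1) + O(1) + O(1) + O(1) = O(1)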
2. if-then-else statements
if (cond) {
    sequence of statements 1
} else {
    sequence of statements 2
}
Here, either sequence 1 will execute, or sequence 2 will execute. Therefore, the worst-case time is the slower of the two possibilities: max(time(sequence 1), time(sequence 2)). For example, if sequence 1 is O(N) and sequence 2 is O(1), the worst-case time for the whole if-then-else statement would be O(N).
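A minimal sketch of this case (the names data, N, and wantSum are assumptions for illustration): the first branch scans the whole array and is O(N), the second branch is a single simple statement and is O(1), so the whole if-then-else is O(N) in the worst case.

int sumOrZero(const int data[], int N, bool wantSum) {
    // worst case = max(O(N), O(1)) = O(N)
    if (wantSum) {
        int total = 0;
        for (int i = 0; i < N; i++)   // sequence of statements 1: O(N)
            total += data[i];
        return total;
    } else {
        return 0;                     // sequence of statements 2: O(1)
    }
}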
3. for loops
for (i = 0; i < N; i++) {
    sequence of statements
}
The loop executes N times, so the sequence of statements also executes N times. Since we assume the statements are O(1), the total time for the for loop is N * O(1), which is O(N) overall.
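A minimal sketch (the names data and N are assumptions for illustration): the loop body is a simple O(1) statement executed N times, so the function runs in O(N) time.

int countZeros(const int data[], int N) {
    int zeros = 0;
    for (int i = 0; i < N; i++) {   // executes N times
        if (data[i] == 0)           // O(1) body
            zeros++;
    }
    return zeros;                   // total: N * O(1) = O(N)
}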
4. Nested loops
for (i = 0; i < N; i++) {
    for (j = 0; j < M; j++) {
        sequence of statements
    }
}
The outer loop executes N times. Every time the outer loop executes, the inner loop executes M times. As a result, the statements in the inner loop execute a total of N * M times. Thus, the complexity is O(N * M). In a common special case where the stopping condition of the inner loop is j < N instead of j < M (i.e., the inner loop also executes N times), the total complexity for the two loops is O(N^2).
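A minimal sketch (the names a, b, N, and M are assumptions for illustration): the O(1) body runs N * M times, so the nested loops take O(N * M) time, which becomes O(N^2) when both loops run N times.

int countMatches(const int a[], int N, const int b[], int M) {
    int matches = 0;
    for (int i = 0; i < N; i++) {       // outer loop: N iterations
        for (int j = 0; j < M; j++) {   // inner loop: M iterations per outer iteration
            if (a[i] == b[j])           // O(1) body, runs N * M times
                matches++;
        }
    }
    return matches;                     // total: O(N * M)
}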
5. Statements with method calls:
When a statement involves a method call, the complexity of the statement includes the complexity of the method call. Assume that you know that method f takes constant time, and that method g takes time proportional to (linear in) the value of its parameter k. Then the statements below have the time complexities indicated.
f(k);  // O(1)
g(k);  // O(k)
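The bodies of f and g are not shown in this excerpt; a plausible sketch consistent with the description (f does a fixed amount of work, g does work proportional to the value of its parameter k) might be:

int f(int k) {          // O(1): a fixed amount of work, regardless of k
    return k * 2;
}

long g(int k) {         // O(k): the loop body runs k times
    long sum = 0;
    for (int i = 0; i < k; i++)
        sum += i;
    return sum;
}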
When a loop is involved, the same rule applies. For example:
for (j = 0; j <
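The statement is truncated at this point in the excerpt; a minimal sketch of a loop of this kind, assuming it runs N times and calls the linear-time method g with the value N on each iteration, would be:

for (int j = 0; j < N; j++)
    g(N);   // g(N) is O(N) and is called N times, so the loop is N * O(N) = O(N^2)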