---
layout: post
title: 2.1 Algorithm analysis
---

Algorithms can be understood and studied in a language- and machine-independent manner. We want to compare the efficiency of algorithms without implementing them.
Two tools for this are the **RAM model of computation** and **asymptotic analysis of worst-case complexity (big Oh)**.

## The RAM model of computation

* each simple operation (+, \*, -, =, if, call) takes exactly one time step
* loops and subroutines are the composition of many single-step operations (the number of time steps depends on the number of iterations or the nature of the subroutine)
* each memory access takes exactly one time step

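As an illustrative sketch (not part of the original notes), step counting under the RAM model can be made concrete by instrumenting a simple linear search; the function and its accounting scheme here are hypothetical, charging one step per comparison and one per return:

```python
def linear_search_steps(items, target):
    """Count RAM-model steps taken by a linear search (illustrative sketch).

    Each comparison and each return is charged exactly one time step.
    """
    steps = 0
    for x in items:
        steps += 1          # one step: the comparison x == target
        if x == target:
            steps += 1      # one step: the return
            return steps
    steps += 1              # one step: the return when target is absent
    return steps
```

For example, `linear_search_steps([1, 2, 3], 1)` charges one comparison plus one return, while a missing target costs one comparison per element plus the final return.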
| 15 | + |
## Best, Worst, and Average-Case Complexity

To analyze the complexity of an algorithm we must know how it behaves over **all** instances.

* The worst-case complexity of the algorithm is the function defined by the maximum number of steps taken on any instance of size n. This is generally the most important.

* The best-case complexity of the algorithm is the function defined by the minimum number of steps taken on any instance of size n.

* The average-case complexity of the algorithm is the function defined by the average number of steps over all instances of size n.

These time complexities define a numerical function, representing time versus problem size. These functions are typically so complex that we need to simplify them.
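To make the three cases concrete, here is a small sketch (names and setup are illustrative, not from the notes): it counts the comparisons linear search performs for every possible target position in a list of size n, so the best, worst, and average case can be read off directly.

```python
def search_comparisons(items, target):
    """Number of comparisons linear search makes before finding target."""
    for count, x in enumerate(items, start=1):
        if x == target:
            return count
    return len(items)   # unsuccessful search compares every element

n = 5
items = list(range(n))
# one instance per possible target position
counts = [search_comparisons(items, t) for t in items]
best = min(counts)          # 1: target is the first element
worst = max(counts)         # n: target is the last element
average = sum(counts) / n   # (n + 1) / 2
```

For linear search these functions are simple (1, n, and (n + 1)/2), but for most algorithms the exact step counts are messy, which is why the next step is to simplify them asymptotically.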