Textbook, page 22, Exercise 2.1-3
Textbook, page 29, Exercise 2.2-3
For the searching problem in the last two questions, if we know that the data is sorted, then we can use the binary search algorithm that you learned about in CSC 230 and/or CSC 330. Write out a recursive version of this algorithm in pseudocode (it must be recursive, not iterative with a loop!), and then derive a recurrence that describes the worst-case running time of this algorithm. You do not need to solve the recurrence.
The recursive binary search in the previous question should have passed the array of values into the recursive call. Suppose instead that you are using a language in which arrays are passed by value, meaning that a copy of the array is made before the called function executes. Derive a recurrence for the worst-case running time of the algorithm in this case. You do not need to solve the recurrence.
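Since writing the pseudocode is the exercise itself, the following Python sketch is offered only as a shape check after you have attempted it. It illustrates one possible recursive binary search that passes index bounds into the recursive call rather than copying the array (the function name and signature are my own, not part of the assignment):

```python
def binary_search(a, target, lo, hi):
    """Recursively search the sorted list a[lo..hi] for target.

    Returns the index of target, or -1 if it is not present.
    """
    if lo > hi:
        # Empty range: target is not in the array.
        return -1
    mid = (lo + hi) // 2
    if a[mid] == target:
        return mid
    if a[mid] < target:
        # Target can only be in the upper half.
        return binary_search(a, target, mid + 1, hi)
    # Target can only be in the lower half.
    return binary_search(a, target, lo, mid - 1)
```

Note the design choice that matters for the pass-by-value follow-up question: because only two indices are passed, no copy of the array is made at each call.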
Consider algorithms with the following time complexities:
Algorithm 1: \(7n\lg n\)
Algorithm 2: \(n!\)
Algorithm 3: \(2^n\)
Algorithm 4: \(100n\)
Algorithm 5: \(n^3\)
Algorithm 6: \(2n^2\)
Assuming that the above formulas give the running times of these algorithms in microseconds, make a table showing the running time of each algorithm for input sizes of \(n=10\), \(n=100\), and \(n=1000\). If a running time turns out to be more than \(10^{50}\) seconds, you can simply write ``too large'' in the table.
List the algorithms in order from slowest to fastest when \(n=10\).
List the algorithms in order of asymptotic running time, from slowest to fastest.
For what values of \(n\) is Algorithm 4 faster than Algorithm 1?
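If you want to sanity-check your table entries, a short script along these lines can evaluate the six formulas (the names, formatting, and cutoff handling are my own; the microsecond units and the \(10^{50}\)-second cutoff follow the problem statement):

```python
import math

# Running-time formulas from the problem, in microseconds.
ALGORITHMS = {
    "Algorithm 1": lambda n: 7 * n * math.log2(n),
    "Algorithm 2": lambda n: math.factorial(n),
    "Algorithm 3": lambda n: 2 ** n,
    "Algorithm 4": lambda n: 100 * n,
    "Algorithm 5": lambda n: n ** 3,
    "Algorithm 6": lambda n: 2 * n ** 2,
}

# 10^50 seconds expressed in microseconds; compared as an int so that
# huge values like 1000! never have to be converted to a float.
CUTOFF_MICROS = 10 ** 56

def entry(micros):
    """Format a running time (given in microseconds) as seconds."""
    if micros > CUTOFF_MICROS:
        return "too large"
    return f"{micros / 1e6:.6g} s"

for n in (10, 100, 1000):
    for name, formula in ALGORITHMS.items():
        print(f"{name}, n={n}: {entry(formula(n))}")
```

Comparing the huge integer against an integer cutoff before dividing avoids an `OverflowError` when a value such as \(1000!\) is too large to represent as a float.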
Prove or disprove each of the following conjectures using just the definitions of asymptotic notation and basic algebra:
For any positive functions \(f(n)\) and \(g(n)\), if \(f(n)\) is \(O(g(n))\) then \(g(n)\) is not \(O(f(n))\).
For any positive functions \(f(n)\) and \(g(n)\), \(f(n)+g(n)=\Theta(\max(f(n),g(n)))\).
\((3n)^n = O(n^{n+1})\)
\((3n)^n = O(n^{2n})\)
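For reference, the definitions to argue from (stated as in the textbook; each proof or disproof should produce, or show the impossibility of, the constants below):
\[
f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 > 0 \text{ such that } 0 \le f(n) \le c\,g(n) \text{ for all } n \ge n_0,
\]
\[
f(n) = \Theta(g(n)) \iff \exists\, c_1, c_2 > 0,\ n_0 > 0 \text{ such that } 0 \le c_1 g(n) \le f(n) \le c_2 g(n) \text{ for all } n \ge n_0.
\]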