$\textbf{4.1-2}$:
$\textbf{4.1-3}$: the brute-force algorithm runs in $\Theta(n^2)$ time, while the divide-and-conquer algorithm runs in $\Theta(n\log n)$ time.
$\textbf{4.1-4}$: allow the empty subarray: run the original algorithm, and if the best sum it finds is negative, return the empty subarray with sum 0 instead.
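The brute-force scan from 4.1-3, with the 4.1-4 modification allowing the empty subarray, can be sketched as follows (function name is mine):

```python
def max_subarray_sum(a):
    """Brute-force maximum-subarray sum; the empty subarray (sum 0) is allowed."""
    best = 0  # sum of the empty subarray (Exercise 4.1-4 modification)
    for i in range(len(a)):          # Theta(n^2): try every start index i...
        total = 0
        for j in range(i, len(a)):   # ...and every end index j >= i
            total += a[j]
            best = max(best, total)
    return best
```

On an all-negative input the function returns 0, corresponding to the empty subarray.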
$\textbf{4.1-5}$:
$\textbf{4.2-1}$: omitted.
$\textbf{4.2-2}$: see the textbook for the pseudocode.
$\textbf{4.2-3}$: pad the matrices with extra rows and columns of 0s so that $n$ becomes an exact power of 2; this at most doubles $n$, so the $\Theta(n^{\lg 7})$ running time is unchanged.
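A minimal sketch of the padding step (function name is mine):

```python
def pad_to_power_of_two(M):
    """Pad a square matrix with zero rows/columns up to the next power of 2."""
    n = len(M)
    size = 1
    while size < n:        # smallest power of 2 that is >= n
        size *= 2
    padded = [row + [0] * (size - n) for row in M]     # extend each row
    padded += [[0] * size for _ in range(size - n)]    # append zero rows
    return padded
```

The product of the original matrices then sits in the top-left $n \times n$ corner of the product of the padded matrices.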
$\textbf{4.2-4}$: $T(n)=kT(n/3)+\Theta(n^2)$; by the master theorem we compare $n^{\log_3 k}$ with $n^2$ to decide the running time. The algorithm runs in $o(n^{\lg 7})$ time when $\log_3 k < \lg 7$, i.e. $k < 3^{\lg 7} \approx 21.85$, so the largest such $k$ is $21$, with running time $\Theta(n^{\log_3 21})$.
$\textbf{4.2-5}$: assume the submatrices are $m \times m$ and each such multiplication uses $k$ scalar multiplications. As in 4.2-4, we get $T(n)=kT(n/m)+\Theta(n^2)$, and by the master theorem we compare $\log_m k$ with $2$ to decide the running time.
For the given methods we have:
$\log_{68} 132464 \approx 2.7951$
$\log_{70} 143640 \approx 2.7951$
$\log_{72} 155424 \approx 2.7951$
The $70 \times 70$ method yields the smallest exponent (barely), and all three beat Strassen's exponent $\lg 7 \approx 2.8074$.
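These exponents are easy to check numerically (a quick sanity check, not part of the exercise; names are mine):

```python
import math

# scalar multiplications reported for each method: m x m matrices in k multiplications
methods = {68: 132464, 70: 143640, 72: 155424}
exponents = {m: math.log(k, m) for m, k in methods.items()}  # log_m k for each method

strassen = math.log2(7)  # Strassen's exponent, about 2.8074
```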
$\textbf{4.2-6}$:
Assume $A$ is $kn \times n$ and $B$ is $n \times kn$. For $AB$: view $A$ as a $k \times 1$ block matrix and $B$ as a $1 \times k$ block matrix, each block $n \times n$; the product consists of $k^2$ block multiplications, each done with Strassen's algorithm as a subroutine, for a total of $\Theta(k^2n^{\lg7})$. For $BA$: the product is the $n \times n$ sum of $k$ block products, giving $\Theta(kn^{\lg7})$.
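A sketch of the block decomposition in pure Python, with ordinary matrix multiplication standing in for the Strassen subroutine (all function names are mine): $AB$ performs $k^2$ products of $n \times n$ blocks, $BA$ only $k$.

```python
def matmul(X, Y):
    """Ordinary matrix product; stands in for the Strassen subroutine here."""
    return [[sum(X[i][t] * Y[t][j] for t in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def mult_AB(A, B, n):
    """A is kn x n, B is n x kn: k^2 block products assembled into a kn x kn result."""
    k = len(A) // n
    Ab = [A[i * n:(i + 1) * n] for i in range(k)]                    # k row blocks
    Bb = [[row[j * n:(j + 1) * n] for row in B] for j in range(k)]   # k column blocks
    P = [[matmul(Ab[i], Bb[j]) for j in range(k)] for i in range(k)]  # k^2 products
    return [sum((P[i][j][r] for j in range(k)), [])   # concatenate block rows
            for i in range(k) for r in range(n)]

def mult_BA(B, A, n):
    """B*A is the n x n sum of k block products."""
    k = len(A) // n
    Ab = [A[i * n:(i + 1) * n] for i in range(k)]
    Bb = [[row[j * n:(j + 1) * n] for row in B] for j in range(k)]
    P = [matmul(Bb[j], Ab[j]) for j in range(k)]      # only k products
    return [[sum(P[j][r][c] for j in range(k)) for c in range(n)]
            for r in range(n)]
```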
$\textbf{4.2-7}$:
$(a+bi)(c+di)=(ac-bd)+(ad+bc)i$. Compute the three real products $ac$, $bd$, and $(a+b)(c+d)$; then $ad+bc = (a+b)(c+d)-ac-bd$, so three real multiplications suffice.
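As a sketch (function name is mine):

```python
def mul3(a, b, c, d):
    """(a+bi)(c+di) using three real multiplications instead of four."""
    p1 = a * c
    p2 = b * d
    p3 = (a + b) * (c + d)
    return p1 - p2, p3 - p1 - p2   # (real part, imaginary part)
```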
$\textbf{4.3-1}$:
Expand the recurrence and verify the guess by substitution.
$\textbf{4.3-2}$:
Same method as 4.3-1.
$\textbf{4.3-3}$:
Deduce it directly from the definition.
$\textbf{4.3-4}$:
tbd.
$\textbf{4.3-5}$:
tbd.
$\textbf{4.3-6}$:
tbd.
$\textbf{4.3-7}$:
tbd.
$\textbf{4.3-8}$:
tbd.
$\textbf{4.3-9}$:
tbd.
$\textbf{4.4-1}$:
$\Theta(n^{\log_2 3})$
$\textbf{4.4-2}$:
$\Theta(n^2\log n)$
$\textbf{4.4-3}$:
$\Theta(n^2)$
$\textbf{4.4-4}$:
$\Theta(2^n)$
$\textbf{4.4-5}$:
$\Theta(n^2)$
$\textbf{4.4-6}$:
$\Theta(n\log n)$
$\textbf{4.4-7}$:
$\Theta(n^2)$
$\textbf{4.4-8}$:
$\Theta(n^2)$
$\textbf{4.4-9}$:
$\Theta(n\log n)$
$\textbf{4.5-1}$:
a) $\Theta(\sqrt{n})$
b) $\Theta(\sqrt{n}\log n)$
c) $\Theta(n)$
d) $\Theta(n^2)$
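The case analysis behind these answers can be mechanized for driving functions of the form $f(n)=\Theta(n^k)$ (a sketch; the function name is mine):

```python
import math

def master_case(a, b, k, eps=1e-9):
    """Master theorem case for T(n) = a*T(n/b) + Theta(n^k); since f is a
    polynomial, the case-3 regularity condition holds automatically."""
    e = math.log(a, b)       # critical exponent log_b a
    if k < e - eps:
        return 1             # T(n) = Theta(n^{log_b a})
    if k > e + eps:
        return 3             # T(n) = Theta(n^k)
    return 2                 # T(n) = Theta(n^k * lg n)
```

For 4.5-1, $a=2$ and $b=4$ give critical exponent $1/2$, so parts a)-d) land in cases 1, 2, 3, 3 respectively.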
$\textbf{4.5-2}$:
For Professor Caesar's algorithm to be asymptotically faster than Strassen's, we need
$$n^{\log_{4}a}<n^{\log_{2}7},$$
i.e. $\log_4 a < \log_2 7$, which gives $a < 4^{\log_2 7} = 7^2 = 49$. The largest integer value is $a=48$.
$\textbf{4.5-3}$:
Here $a=1$ and $b=2$, so $n^{\log_b a}=n^0=1$. Since $f(n)=\Theta(1)=\Theta(n^{\log_b a})$, case 2 of the master theorem gives $T(n)=\Theta(\lg n)$.
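Exercise 4.5-3 is the binary-search recurrence; the halving behavior behind the $\Theta(\lg n)$ bound can be sketched as follows (names are mine):

```python
def binary_search(a, x):
    """Return (index of x in sorted list a, or -1, and the iteration count)."""
    lo, hi, steps = 0, len(a), 0
    while lo < hi:                 # each iteration: T(n) = T(n/2) + Theta(1)
        steps += 1
        mid = (lo + hi) // 2
        if a[mid] < x:
            lo = mid + 1
        elif a[mid] > x:
            hi = mid
        else:
            return mid, steps
    return -1, steps
```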
$\textbf{4.5-4}$:
No: $f(n)=n^2\lg n$ grows faster than $n^{\log_2 4}=n^2$, but not polynomially faster, so no case of the master theorem applies. Using the recursion-tree method, level $i$ costs $4^i\,(n/2^i)^2\lg(n/2^i)=n^2\lg(n/2^i)$, and summing over the $\lg n$ levels gives the bound $O(n^2\lg^2 n)$.
$\textbf{4.5-5}$:
$T(n)=T(\frac{n}{2})+n(2-\cos n)$: here $f(n)=\Theta(n)=\Omega(n^{\log_2 1+\epsilon})$, so the growth condition of case 3 holds, but the regularity condition fails.
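A sketch of why regularity fails (with $a=1$, $b=2$): along $n = 2\pi j$ with $j$ odd we have $\cos n = 1$ and $\cos(n/2) = -1$, so
$$f(n) = n(2-\cos n) = n, \qquad a\,f(n/b) = \frac{n}{2}\Bigl(2-\cos\frac{n}{2}\Bigr) = \frac{3n}{2},$$
and $a\,f(n/b) \le c\,f(n)$ would force $c \ge 3/2 > 1$, so no constant $c < 1$ works for all sufficiently large $n$.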
$\textbf{4.1}$:
a. $\Theta(n^4)$
b. $\Theta(n)$
c. $\Theta(n^2\log n)$
d. $\Theta(n^{2})$
e. $\Theta(n^{\log_2 7})$
f. $\Theta(\sqrt{n}\log n)$
g. $\Theta(n^3)$
$\textbf{4.2}$:
a. binary search:
1. $T(n)=T(\frac{n}{2})+\Theta(1) \Rightarrow \Theta(\lg n)$
2. $T(n)=T(\frac{n}{2})+\Theta(N) \Rightarrow \Theta(N\lg n)$
3. $T(n)=T(\frac{n}{2})+\Theta(n) \Rightarrow \Theta(n)$
b. merge sort:
1. $T(n)=2T(\frac{n}{2})+\Theta(n) \Rightarrow \Theta(n\lg n)$
2. $T(n)=2T(\frac{n}{2})+\Theta(n+N)$; level $i$ of the recursion tree costs $\Theta(n+2^iN)$, so summing over the $\lg n$ levels gives $\Theta(n\lg n + nN)=\Theta(nN)$, since $N \ge n$.
3. $T(n)=2T(\frac{n}{2})+\Theta(n) \Rightarrow \Theta(n\lg n)$
$\textbf{4.3}$:
a. $T(n)=\Theta(n^{\log_3 4})$
b. $T(n)=\Theta(n\log\log n)$; use a recursion tree, bounding the per-level sum $\sum 1/i$ by the integral of $\frac{1}{x}$
c. $T(n)=\Theta(n^2\sqrt{n})$
d. $T(n)=\Theta(n\log n)$
e. same method as b.
f. $T(n)=\Theta(n)$
g. $T(n)=\Theta(\log n)$
h. $T(n)=\Theta(n\log n)$
i. $T(n)=\Theta(n/\log n)$
j. $T(n)=\Theta(n\log\log n)$
$\textbf{4.4}$:
a. substitute the recurrence for $F_i$ into the definition of $F(z)$ and simplify.
b. solve the equation from part a (partial fractions).
c. use the Taylor expansion of $\frac{1}{1-x}$.
d. note that $|\hat{\phi}|<1$, so $|\hat{\phi}^i|\le 1<\sqrt{5}/2$ and hence $|\hat{\phi}^i/\sqrt{5}|<\frac{1}{2}$. Writing $F_i=\frac{\phi^i}{\sqrt{5}}-\frac{\hat{\phi}^i}{\sqrt{5}}$ and noting that $F_i$ is an integer, $F_i$ must equal $\frac{\phi^i}{\sqrt{5}}$ rounded to the nearest integer.
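Part d's rounding formula is easy to check numerically (a sketch; floating point is accurate enough for small $i$):

```python
def fib_round(i):
    """F_i computed as phi^i / sqrt(5) rounded to the nearest integer (part d)."""
    phi = (1 + 5 ** 0.5) / 2      # golden ratio
    return round(phi ** i / 5 ** 0.5)
```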
$\textbf{4.5}$:
a. if more than $n/2$ chips are bad, the bad chips can conspire to answer exactly like a set of good chips, so no strategy based on pairwise tests can distinguish the genuinely good chips from the bad majority.
b. proceed as follows:
1) divide the current set $S$ into two equal halves $A$ and $B$.
2) pair each element of $A$ with the corresponding element of $B$ and perform a pairwise test; if the result is $\langle$good, good$\rangle$, put one element of the pair into a new set $T$, otherwise discard both elements.
3) if $T$ contains at most one element, stop; otherwise return to step 2 with $T$ as the new set.
c. each round uses at most $n/2$ tests and at least halves the set, so finding one good chip takes $\Theta(n)$ tests; once we have a good chip, we classify every other chip by testing it against the known good chip, for $\Theta(n)$ tests in total.
$\textbf{4.6}$:
a. mathematical induction.
b. change the 22 in the first row to 24.
c. proof by contradiction.
d. by part c, the column of the leftmost minimum is nondecreasing in the row index, so for each odd-numbered row we only scan the columns between the leftmost minima of the two neighboring even-numbered rows; over all odd rows this totals $O(m+n)$ work.
e. $T(m,n)=T(\frac{m}{2},n)+O(m+n)$, which solves to $O(m+n\log m)$.
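Parts d and e can be sketched as follows, recursing on the even-indexed rows (0-based); the function name is mine and the input is assumed to be a valid Monge array:

```python
def leftmost_minima(A):
    """Column index of the leftmost minimum of each row of a Monge array A.
    T(m, n) = T(m/2, n) + O(m + n), i.e. O(m + n log m) overall."""
    m, n = len(A), len(A[0])
    if m == 1:
        return [A[0].index(min(A[0]))]    # leftmost minimum of a single row
    sub = leftmost_minima(A[::2])         # solve for the even-indexed rows
    pos = [0] * m
    for k, i in enumerate(range(0, m, 2)):
        pos[i] = sub[k]
    for i in range(1, m, 2):              # each odd row: scan only between
        lo = pos[i - 1]                   # its neighbors' minima (part d)
        hi = pos[i + 1] if i + 1 < m else n - 1
        best = lo
        for j in range(lo + 1, hi + 1):   # strict < keeps the leftmost index
            if A[i][j] < A[i][best]:
                best = j
        pos[i] = best
    return pos
```

`A[i][j] = (i - j)**2` is a convenient Monge array for testing, since its leftmost row minima are easy to read off.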