
I am learning algorithm complexity. So far it has been an interesting ride; there is a lot going on behind the scenes that I need to understand. In particular, I find it difficult to analyze the complexity of recursive functions.

my_func takes an array parameter A of length $n$. The runtime of some_func() is constant.

def my_func(A):
    n = len(A)
    if n < 4:
        some_func()                  # O(1) time
    else:
        G1, G2, G3, G4 = split(A)    # split A into 4 disjoint subarrays of size n/4 each
        my_func(G1 + G3)             # recurses on size n/2
        my_func(G1 + G4)             # recurses on size n/2
        my_func(G2 + G3)             # recurses on size n/2
        my_func(G2 + G4)             # recurses on size n/2
        some_other_func()            # split() and some_other_func() take O(n) time

Questions

  1. Can I say the asymptotic runtime of my_func is
    $$T(n) = 4T(n/2) + O(n) \text{ with } T(1) = O(1),$$ because my_func is called recursively 4ドル$ times on inputs of size $n/2$, and split and some_other_func each take $O(n)$ time. The base case gives $T(1) = O(1)$.

  2. What is the total number of steps performed by my_func(A)?
    I know that with nested for loops you simply multiply, but how do I calculate it in this case? A Google search pointed me toward $\Omega(n^3)$. Is that correct? (I tried to sanity-check this empirically; see the sketch below.)
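
A minimal sketch of that check, assuming $n$ is a power of 2ドル$ and a unit-cost base case (the function T below is my own stand-in for the recurrence, not code from the question):

# Iterate T(n) = 4*T(n/2) + n with T(n) = 1 for n < 4 and compare the
# result against candidate growth rates: if T(n) were really Theta(n^3),
# then T(n)/n^3 should level off at a positive constant.
def T(n):
    if n < 4:
        return 1
    return 4 * T(n // 2) + n

for k in range(2, 16):
    n = 2 ** k
    t = T(n)
    print(f"n={n:6d}  T(n)/n^2={t / n**2:.4f}  T(n)/n^3={t / n**3:.6f}")

Here $T(n)/n^2$ settles near a constant (about 0.75ドル$ with this seeding) while $T(n)/n^3$ shrinks toward zero, which speaks against the $\Omega(n^3)$ guess.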

Now, what if I rewrite this function as:

def new_func(A):                     # A is an array of length n
    n = len(A)
    if n < 4:
        some_func()                  # O(1) time
    else:
        G1, G2, G3, G4 = split(A)    # split A into 4 disjoint subarrays of size n/4 each
        new_func(G1 + G2)            # recurses on size n/2
        new_func(G2 + G3)            # recurses on size n/2
        new_func(G3 + G4)            # recurses on size n/2
        some_other_func()            # split() and some_other_func() take O(n) time

Questions

  1. What is the number of steps now?
    I guess it will be $\Omega(n^3)$

  2. Is new_func faster than $n \log n$?
    I think not, because merge sort satisfies $T(n) = 2T(n/2) + n$ while new_func satisfies $T(n) = 3T(n/2) + n$, so new_func does strictly more work per level.

asked Sep 27, 2021 at 2:16
  • There's something mixed up in your write-up: the second version with new_func() is the same as the first, except that the recursive call is made 3 times instead of 4. In general, you can substitute $T(n) = aT(n/b) + cn = a\,[\,aT(n/b^2) + c(n/b)\,] + cn$, and keep expanding $\log_b n$ times until you reach $T(1)$ (since $n$ divided by $b^{\log_b n}$ equals 1ドル$). Take care that for my_func we have $a = b^2$, so $a^{\log_b n} = n^2$. Commented Sep 27, 2021 at 5:58
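
Spelling out that substitution (a sketch, assuming $n$ is a power of $b$ and a unit-cost base case):

$$\begin{aligned} T(n) &= aT(n/b) + cn \\ &= a^2T(n/b^2) + cn\left(1 + \frac{a}{b}\right) \\ &\;\,\vdots \\ &= a^{\log_b n}\,T(1) + cn\sum_{i=0}^{\log_b n - 1}\left(\frac{a}{b}\right)^i \end{aligned}$$

For my_func, $a = 4$ and $b = 2$: $a^{\log_2 n} = n^2$, and the remaining sum $cn\sum_i 2^i = cn(n-1)$ is also $\Theta(n^2)$, so $T(n) = \Theta(n^2)$.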

1 Answer

For my_func(A), the recurrence relation is

$T(n) = \begin{cases} 4T\bigg(\frac{n}{2}\bigg)+{n} & \quad \text{if } n \geq 4\\ 1 & \quad \text{if } n <4 \end{cases} $

For new_func(A), the recurrence relation is

$T(n) = \begin{cases} 3T\bigg(\frac{n}{2}\bigg)+{n} & \quad \text{if } n \geq 4\\ 1 & \quad \text{if } n <4 \end{cases} $

You can solve both of these recurrence relations using the Master Theorem.

The time complexity of my_func(A) is $\Theta(n^2)$, since $\log_2 4 = 2$.

The time complexity of new_func(A) is $\Theta(n^{\log_2 3}) \approx \Theta(n^{1.585})$, since $\log_2 3 \approx 1.585$.
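
For reference, here is how the Master Theorem parameters line up in each case (a sketch using the standard case analysis, with $f(n)$ denoting the non-recursive work):

  • my_func: $a = 4$, $b = 2$, $f(n) = n$, so $n^{\log_b a} = n^{\log_2 4} = n^2$. Since $f(n) = O(n^{2-\epsilon})$ with $\epsilon = 1$, case 1 gives $T(n) = \Theta(n^2)$.
  • new_func: $a = 3$, $b = 2$, $f(n) = n$, so $n^{\log_b a} = n^{\log_2 3}$. Since $f(n) = O(n^{\log_2 3 - \epsilon})$ for any $\epsilon \le \log_2 3 - 1$, case 1 gives $T(n) = \Theta(n^{\log_2 3})$.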

You can also solve both of these recurrences by the substitution method, which is more time-consuming; a quick numerical check along the same lines is sketched below.
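
A minimal numerical sketch of that check, assuming $n$ is a power of 2ドル$ and a unit-cost base case (T_new is my own name for new_func's recurrence, not code from the question):

import math

# Iterate T(n) = 3*T(n/2) + n with T(n) = 1 for n < 4 and compare against
# n^(log2 3) and n*log2(n): the first ratio should flatten out, while the
# second keeps growing if T(n) really is asymptotically above n log n.
def T_new(n):
    if n < 4:
        return 1
    return 3 * T_new(n // 2) + n

p = math.log2(3)  # ~1.585
for k in range(2, 20):
    n = 2 ** k
    t = T_new(n)
    print(f"n={n:7d}  T(n)/n^1.585={t / n**p:7.3f}  T(n)/(n log n)={t / (n * math.log2(n)):8.3f}")

The flat first column and growing second column are consistent with $\Theta(n^{\log_2 3})$: above $n \log n$, below $n^2$.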

Asymptotically, new_func(A) is therefore slower than merge sort and faster than my_func(A).

answered Sep 29, 2021 at 8:48
