# Algorithms problem 2

Here is the code again for reference.

```
function find_closest(input_array, target)
    sorted_array = cocktailsort(input_array)
    return find_closest_sorted(sorted_array, target)

function find_closest_sorted(sorted_array, target)
    length = get_length(sorted_array)   // takes constant time
    if length = 0
        return NULL
    else if length = 1
        if sorted_array[0] >= target
            return sorted_array[0]
        else return NULL
    else
        mid = floor(length/2)
        if (sorted_array[mid-1] < target)
            half_array = sorted_array(mid,...,length-1)  // array containing second half of elements
        else
            half_array = sorted_array(0,...,mid-1)       // array containing first half of elements
        return find_closest_sorted(half_array, target)
```

Also remember that
cocktailsort was described as taking \(\Theta(n^2)\) time.
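The pseudocode is close to runnable, so here is a Python sketch for reference. Python's built-in `sorted` stands in for cocktailsort (so this version sorts in \(\Theta(n\log n)\) rather than \(\Theta(n^2)\)), and note that Python slicing copies the half-array, so this sketch does not literally achieve the constant-time-per-level assumption in the analysis:

```python
def find_closest_sorted(sorted_array, target):
    # Returns the smallest element >= target, or None if there is none.
    length = len(sorted_array)            # constant time, like get_length
    if length == 0:
        return None
    if length == 1:
        return sorted_array[0] if sorted_array[0] >= target else None
    mid = length // 2
    if sorted_array[mid - 1] < target:
        half_array = sorted_array[mid:]   # second half of elements
    else:
        half_array = sorted_array[:mid]   # first half of elements
    return find_closest_sorted(half_array, target)

def find_closest(input_array, target):
    # sorted() stands in for cocktailsort here.
    return find_closest_sorted(sorted(input_array), target)
```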

### Solution

(a) A recursive definition for the running time of find_closest_sorted
on an input of length n, where c, d, and p are constants:

\(
\begin{eqnarray*}
R(0) &=& c \\
R(1) &=& d \\
R(n) &=& R(n/2) + p
\end{eqnarray*}
\)
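Unrolling the recurrence makes the log visible: each level of recursion contributes a constant p, and repeatedly halving n reaches the base case after about \(\log_2 n\) steps:

\(
\begin{eqnarray*}
R(n) &=& p + R(n/2) = 2p + R(n/4) = \cdots \\
     &=& p\log_2 n + R(1) = p\log_2 n + d
\end{eqnarray*}
\)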

(b) The recursion tree for R(n) has height \(\log_2 n\), since the input length halves at each level.

(c) The recursion tree for R(n) has one leaf, because each call makes at most one recursive call, so the tree is a single path.

(d) The big-Theta running time of
find_closest_sorted is \(\Theta(\log n)\).
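One way to sanity-check the \(\Theta(\log n)\) claim is to instrument the recursion. The helper below (`closest_sorted_depth`, a hypothetical variant not in the original code) mirrors the halving structure of find_closest_sorted but returns the recursion depth instead of an element; for powers of two the depth comes out to exactly \(\log_2 n + 1\):

```python
def closest_sorted_depth(sorted_array, target, depth=1):
    # Mirrors find_closest_sorted's halving, but returns recursion depth.
    length = len(sorted_array)
    if length <= 1:
        return depth
    mid = length // 2
    if sorted_array[mid - 1] < target:
        half_array = sorted_array[mid:]   # second half
    else:
        half_array = sorted_array[:mid]   # first half
    return closest_sorted_depth(half_array, target, depth + 1)

for n in (8, 64, 1024):
    print(n, closest_sorted_depth(list(range(n)), n))
```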

(e) Notice that the first step in find_closest is to call
cocktailsort on the input array.
Cocktailsort takes \(\Theta(n^2)\) time, which dominates the
\(\Theta(\log n)\) search, so the big-Theta running time of
find_closest is \(\Theta(n^2)\).
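In symbols, the total cost is the sort plus the search, and the larger term wins:

\(
\Theta(n^2) + \Theta(\log n) = \Theta(n^2)
\)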

### Comments

You aren't expected to have any idea how cocktailsort works or
why it takes \(\Theta(n^2)\). That's a separate project.
Clearly it's not the world's best implementation of sorting,
since mergesort only takes \(\Theta(n\log n) \) time.
This problem is just about using that information to figure out
how fast find_closest is.

### Self-check

The three constants c, d, p should be different. (Though it's a
common place for reasonable people to get sloppy.) You could also
write the recursive definition using O(1):

\(
\begin{eqnarray*}
R(0) &=& O(1) \\
R(1) &=& O(1) \\
R(n) &=& R(n/2) + O(1)
\end{eqnarray*}
\)

The case of n=0 is a bit odd. You can't actually get to n=0 from the
recursive calls, so the leaf level of the recursion tree is n=1. The
n=0 case only runs when the original input had zero length. So it's
really an initial error check rather than part of the recursive structure.