Chapter 23 Self-Assessment Quiz: Searching and Sorting


Section 1: Multiple Choice

Q1. Binary search on a sorted array of 1,000,000 elements requires at most how many comparisons?

(a) 1,000,000 (b) 1,000 (c) 20 (d) 10

Answer (c) Binary search requires at most ceil(log₂(1,000,000)) ≈ 20 comparisons. Each comparison halves the search space: after 20 halvings, 1,000,000 / 2²⁰ ≈ 1.

Q2. Which sorting algorithm has O(n) best-case performance?

(a) Selection sort (b) Insertion sort (c) Merge sort (d) Quicksort

Answer (b) Insertion sort has O(n) best case — when the array is already sorted, the inner loop never executes. Selection sort is always O(n²). Merge sort and quicksort are always at least O(n log n) in their best cases.

Q3. Merge sort's primary disadvantage compared to quicksort is:

(a) Worse average-case time complexity (b) O(n) extra memory requirement (c) Not recursive (d) Unstable sorting

Answer (b) Merge sort requires O(n) auxiliary space for the temporary merge array, while a well-implemented quicksort sorts in place using only O(log n) stack space. The other options name advantages, not disadvantages: merge sort is recursive, it is stable (quicksort typically is not), and its O(n log n) running time is guaranteed, matching or beating quicksort's.


Q4. What is the worst-case time complexity of quicksort with last-element pivot selection on already-sorted input?

(a) O(n) (b) O(n log n) (c) O(n²) (d) O(2ⁿ)

Answer (c) On sorted input with last-element pivot, the pivot is always the maximum. The partition produces one sub-array of size n-1 and one of size 0 — no real splitting. This gives n + (n-1) + (n-2) + ... + 1 = O(n²) comparisons.

Q5. A sorting algorithm is "stable" if:

(a) It never crashes on invalid input (b) It uses O(1) extra memory (c) It preserves the relative order of equal elements (d) It always runs in O(n log n) time

Answer (c) Stability means that if two elements are equal according to the comparison, they appear in the output in the same relative order as they appeared in the input. Merge sort and insertion sort are stable; quicksort and selection sort are typically not.

Q6. Which of the following is the best algorithm for sorting a nearly-sorted array of 10,000 elements?

(a) Selection sort (b) Insertion sort (c) Merge sort (d) Random quicksort

Answer (b) Insertion sort is adaptive — it runs in nearly O(n) time on nearly-sorted data because the inner loop rarely shifts elements. The other algorithms do not exploit existing order: selection sort is always O(n²), and merge sort and quicksort are always at least O(n log n).

Q7. In the comparison function pattern, a comparison function that returns a negative value means:

(a) The elements are equal (b) The first element should come before the second (c) The first element should come after the second (d) An error occurred

Answer (b) By convention, a comparison function returns negative if A < B (A should come first), zero if A = B, and positive if A > B (B should come first). This convention is used by Pascal's TFPGList.Sort, C's qsort, and most other standard library sorts.

Q8. To sort expenses first by category, then by amount within each category, you should:

(a) Sort by amount, then sort by category with a stable sort (b) Sort by category, then sort by amount with any sort (c) Write a single comparison function that compares category first, then amount as tiebreaker (d) Both (a) and (c) are correct

Answer (d) Both approaches work. Option (a) sorts by the secondary key first, then by the primary key with a stable sort; the stable sort preserves the secondary ordering within equal-category groups. Option (c) handles both criteria in a single comparison function, so only one sort is needed. Option (c) is generally more efficient (one sort instead of two) and does not depend on the sort being stable.

Section 2: True or False

Q9. True or False: Binary search can be used on a sorted linked list with the same O(log n) performance as on a sorted array.

Answer False. Finding the middle element of a linked list requires linear traversal: the traversals alone cost O(n) in total (or O(n log n) if each midpoint is located from the head), so binary search on a linked list is no faster than a simple linear search. Binary search's O(log n) performance depends on O(1) random access, which arrays provide and linked lists do not.

Q10. True or False: Merge sort always makes exactly the same number of comparisons regardless of the input ordering.

Answer False. The number of comparisons in the merge step depends on the input. If one half's elements are all smaller than the other half's, the merge finishes early. Merge sort is O(n log n) in all cases, but the exact number of comparisons varies.

Q11. True or False: Selection sort makes fewer swaps than insertion sort in the worst case.

Answer True. Selection sort makes at most n-1 swaps (at most one per pass), regardless of input order. Insertion sort's worst case (reverse-sorted input) makes O(n²) shifts (element moves). If swapping or moving elements is expensive, for example with large records, selection sort's minimal swap count is an advantage.

Section 3: Short Answer

Q12. You have an array of 50 elements. You need to sort it once and then search it 1,000 times. What is the total time complexity if you use (a) no sorting + linear search each time, vs. (b) merge sort once + binary search each time?

Answer (a) No sort + 1,000 linear searches: 1,000 × 50 = 50,000 comparisons in the worst case. (b) Merge sort once (about 50 × log₂ 50 ≈ 300 comparisons) plus 1,000 binary searches (about 1,000 × 6 = 6,000 comparisons), roughly 6,300 total. Sorting once and searching many times is dramatically faster, about 8× in this case.

Q13. Explain the "divide and conquer" strategy as used in merge sort. What are the three steps?

Answer (1) Divide: Split the array into two halves. (2) Conquer: Recursively sort each half. (3) Combine: Merge the two sorted halves into a single sorted array. The key insight is that merging two sorted arrays is O(n), and the divide step creates O(log n) levels, giving O(n log n) total.

Q14. Why does the median-of-three pivot selection strategy reduce quicksort's chance of hitting O(n²)?

Answer The O(n²) worst case occurs when the pivot is consistently the minimum or maximum, creating maximally unbalanced partitions. Median-of-three chooses the median of the first, middle, and last elements, which can never be the smallest or largest of those three and is therefore unlikely to be an extreme of the whole partition. This makes reasonably balanced partitions likely on all but adversarially constructed inputs.

Q15. Name three practical reasons why you might implement your own sorting algorithm instead of using a library sort.

Answer (any three of the following) (1) You need a specific guarantee the library sort does not provide (e.g., stability, adaptivity, deterministic behavior). (2) You are sorting a non-standard data structure (e.g., a linked list, a file on disk). (3) You need fine-grained control (e.g., partial sort to find top-K elements, external sort for data that does not fit in memory). (4) Learning and understanding — you cannot debug what you do not understand.