Problem Description
Given a sorted (non‐decreasing) integer array and an integer k, determine whether you can partition all the elements into one or more disjoint subsequences such that each subsequence is strictly increasing and has length at least k.
Key Insights
- The array is already sorted, so any subsequence that takes at most one element from each run of equal values is automatically strictly increasing; order never needs to be rearranged.
- Duplicate values cannot be in the same increasing subsequence; they force you to “branch” into separate subsequences.
- The minimum number of subsequences needed equals the maximum frequency of any value: c equal copies must land in c different subsequences.
- Fixing the number of subsequences at that maximum frequency and greedily extending the shortest one keeps all lengths balanced, giving every subsequence the best chance of reaching length k.
- In the end, every subsequence must have reached length k; if any is shorter, the partition is invalid.
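The insights above combine into a closed-form test: a valid partition exists exactly when n is at least k times the maximum frequency (at least maxFreq subsequences are forced, and distributing the sorted array round-robin over that many subsequences makes each strictly increasing with length at least ⌊n / maxFreq⌋). A minimal sketch of that counting check (the function name is ours):

```python
from collections import Counter
from typing import List

def can_divide_by_counting(nums: List[int], k: int) -> bool:
    """Partition exists iff n >= k * (maximum frequency of any value)."""
    if not nums:
        return False
    max_freq = max(Counter(nums).values())
    return len(nums) >= k * max_freq
```

This runs in O(n) time and is a useful cross-check for the heap simulation described below.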
Space and Time Complexity
Time Complexity: O(n log n) – each of the n elements is involved in a constant number of heap operations.
Space Complexity: O(n) – the heap stores one entry per subsequence; in fact it holds at most maxFreq entries, which is bounded by n.
Solution
We simulate the partition with a min-heap (or priority queue) that tracks the length of every subsequence. The maximum frequency m of any value (see Key Insights) fixes the number of subsequences: we must use at least m, and using exactly m keeps each one as long as possible, so we start with m empty subsequences. We then scan the array one run of equal values at a time; a run of c copies (c ≤ m) pops the c shortest subsequences from the heap and pushes each back with its length increased by one. Because a whole run is placed at once, no subsequence ever receives the same value twice, and since the array is sorted every subsequence stays strictly increasing. Always extending the shortest subsequences keeps the lengths balanced – they never differ by more than one. Finally, we check the heap minimum: if the shortest subsequence has length at least k, the partition is valid; otherwise it is not. Note that the number of subsequences must be fixed up front: a greedy that simply extends any subsequence whose last element is smaller than the current number fails on [1,2,3,3] with k = 2, building [1,2,3] and a stranded [3] even though [1,3] and [2,3] is a valid partition.
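A sketch of a heap-based simulation along these lines – fixing the number of subsequences at the maximum frequency up front and always extending the shortest ones (the function name is ours):

```python
import heapq
from collections import Counter
from itertools import groupby
from typing import List

def can_divide_with_heap(nums: List[int], k: int) -> bool:
    if not nums:
        return False
    # The maximum frequency fixes how many subsequences we must (and should) use.
    m = max(Counter(nums).values())
    heap = [0] * m  # lengths of m initially empty subsequences (all-zero list is a valid heap)
    for _, run in groupby(nums):
        count = sum(1 for _ in run)  # size of this run of equal values (count <= m)
        # Place the whole run at once: each copy extends a different,
        # currently shortest subsequence, keeping the lengths balanced.
        shortest = [heapq.heappop(heap) for _ in range(count)]
        for length in shortest:
            heapq.heappush(heap, length + 1)
    # The heap minimum is the shortest subsequence; all must reach length k.
    return heap[0] >= k
```

For example, on [1,2,3,3] with k = 2 the two 3s are placed together into the two balanced subsequences, so the check succeeds where an extend-when-possible greedy would not.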