Problem Description
We have a pyramid of glasses where the i-th row (0-indexed) contains i + 1 glasses. When champagne is poured into the top glass, any overflow (the amount beyond 1 cup in a glass) spills equally into the two glasses immediately below it. Given the number of cups poured and a target glass identified by its row and its position within that row (both 0-indexed), compute how full that glass will be.
Key Insights
- Use dynamic programming to simulate the flow of champagne row-by-row.
- Each glass can hold at most 1 cup; any excess over 1 is equally distributed to the two glasses directly below.
- Only the rows from 0 up to query_row need to be simulated, and the answer is the minimum of 1 and the computed value for the target glass.
- The simulation process is efficient since query_row is limited to less than 100.
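To make the overflow rule concrete, here is a minimal Python trace under assumed example values (4 cups poured, simulating only the first three rows); the values printed match the rule stated above:

```python
# Illustration of the overflow rule: simulate the first few rows for a
# hypothetical pour of 4 cups and print the clamped amount in each glass.
poured = 4          # assumed example value
num_rows = 3        # simulate rows 0..2 only, for illustration

# rows[i][j] = champagne that reaches glass j of row i (may exceed 1 before clamping)
rows = [[0.0] * (i + 1) for i in range(num_rows)]
rows[0][0] = float(poured)

for i in range(num_rows - 1):
    for j, amount in enumerate(rows[i]):
        overflow = max(0.0, amount - 1.0)     # anything beyond 1 cup spills over
        rows[i + 1][j] += overflow / 2.0      # half goes to the glass below-left
        rows[i + 1][j + 1] += overflow / 2.0  # half goes to the glass below-right

for i, row in enumerate(rows):
    print(f"row {i}: {[min(1.0, x) for x in row]}")
# row 0: [1.0]
# row 1: [1.0, 1.0]
# row 2: [0.25, 0.5, 0.25]
```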
Time and Space Complexity
Time Complexity: O(query_row²)
Space Complexity: O(query_row²)
Solution
We solve the problem with dynamic programming by constructing a 2D array dp, where dp[i][j] holds the amount of champagne that flows into the j-th glass of row i (before clamping to 1 cup).
- Initialize dp[0][0] with the number of cups poured.
- Iterate over rows 0 through query_row - 1. For each glass in the current row, if its content exceeds 1 cup, compute the overflow (amount - 1) and distribute it equally to the two glasses in the next row (i.e., add half of the overflow to dp[i+1][j] and dp[i+1][j+1]).
- After processing these rows, the answer for the target glass is min(1, dp[query_row][query_glass]), since a glass can hold at most 1 cup.
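A sketch of this approach in Python; the function name champagne_tower and its signature are illustrative choices, not taken from the problem statement:

```python
def champagne_tower(poured: int, query_row: int, query_glass: int) -> float:
    """Return how full the glass at (query_row, query_glass) is, in cups.

    dp[i][j] is the amount of champagne that flows into glass j of row i;
    it may exceed 1 before the final clamp.
    """
    # Row i holds i + 1 glasses; we only need rows 0..query_row.
    dp = [[0.0] * (i + 1) for i in range(query_row + 1)]
    dp[0][0] = float(poured)

    # Push each glass's overflow down to the two glasses below it.
    for i in range(query_row):
        for j in range(i + 1):
            overflow = max(0.0, dp[i][j] - 1.0)
            dp[i + 1][j] += overflow / 2.0
            dp[i + 1][j + 1] += overflow / 2.0

    # A glass can hold at most 1 cup.
    return min(1.0, dp[query_row][query_glass])


if __name__ == "__main__":
    # With 2 cups poured, each glass in row 1 ends up half full.
    print(champagne_tower(2, 1, 1))  # 0.5
```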