Problem 279 Perfect Squares
Problem Statement
Link - Problem 279
Question
Given an integer `n`, return the least number of perfect square numbers that sum to `n`.
A perfect square is an integer that is the square of an integer; in other words, it is the product of some integer with itself. For example, 1, 4, 9, and 16 are perfect squares while 3 and 11 are not.
Example 1:
Input: n = 12
Output: 3
Explanation: 12 = 4 + 4 + 4.
Example 2:
Input: n = 13
Output: 2
Explanation: 13 = 4 + 9.
Constraints:
1 <= n <= 10^4
Solution
class Solution {
public:
    vector<int> dp;

    int numSquares(int n) {
        // Initialize the dp array with INT_MAX except for dp[0], which is 0
        dp.resize(n + 1, INT_MAX);
        dp[0] = 0; // Set dp[0] to 0
        // Call the solve function to recursively calculate the minimum number of squares
        return solve(n);
    }

    int solve(int n) {
        // If n is 0, it can be formed using 0 squares
        if (n == 0)
            return 0;
        // If n is negative, it cannot be formed; return INT_MAX
        if (n < 0)
            return INT_MAX;
        // If the result for n is already calculated, return it
        if (dp[n] != INT_MAX)
            return dp[n];
        int ans = INT_MAX;
        // Iterate through all numbers from 1 to sqrt(n)
        for (int i = 1; i * i <= n; i++) {
            // Recursively calculate the minimum number of squares needed for n - i*i
            int subproblem = solve(n - i * i);
            // Update the answer with the minimum of the current answer and subproblem + 1
            ans = min(ans, subproblem + 1);
        }
        // Memoize the result for n
        dp[n] = ans;
        // Return the minimum number of squares needed for n
        return ans;
    }
};
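As a quick sanity check against the two examples above, a minimal driver along the following lines can be used. The `main` function, the includes, and the printed output are illustrative scaffolding around the `Solution` class shown above, not part of a LeetCode submission:

```cpp
#include <algorithm>
#include <climits>
#include <iostream>
#include <vector>
using namespace std;

// (Paste the Solution class from above here.)

int main() {
    cout << Solution().numSquares(12) << "\n"; // expected 3, since 12 = 4 + 4 + 4
    cout << Solution().numSquares(13) << "\n"; // expected 2, since 13 = 4 + 9
    return 0;
}
```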
Complexity Analysis
| Algorithm | Time Complexity | Space Complexity |
| ---------------------------- | --------------- | ---------------- |
| Memoized Dynamic Programming | O(n sqrt(n)) | O(n) |
Explanation
Initial Thoughts
This problem involves breaking a number down into a sum of perfect squares, a classic concept in number theory. My initial thought is to relate it to known results and patterns: the sum-of-squares function, Lagrange's Four-Square Theorem, and Legendre's Three-Square Theorem. Another option is a dynamic programming approach that builds the solution up iteratively.
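For reference, those theorems pin the answer to one of {1, 2, 3, 4}: every positive integer is a sum of at most four squares, and exactly four are needed only for numbers of the form 4^a(8b + 7). Below is a minimal sketch of that number-theoretic shortcut; it is not the approach taken in the solution above, and the names `numSquaresByTheorem` and `isSquare` are illustrative.

```cpp
#include <cmath>

// Helper (illustrative): true if x is a perfect square.
static bool isSquare(long long x) {
    long long r = static_cast<long long>(std::sqrt(static_cast<double>(x)));
    // Adjust for floating-point rounding around the true root.
    while (r > 0 && r * r > x) r--;
    while ((r + 1) * (r + 1) <= x) r++;
    return r * r == x;
}

// Sketch of the answer via Lagrange's and Legendre's theorems.
int numSquaresByTheorem(int n) {
    if (isSquare(n)) return 1;            // n is itself a perfect square
    int m = n;
    while (m % 4 == 0) m /= 4;            // strip factors of 4
    if (m % 8 == 7) return 4;             // Legendre: 4^a(8b+7) needs four squares
    for (int a = 1; a * a < n; a++)       // look for a two-square decomposition
        if (isSquare(n - a * a)) return 2;
    return 3;                             // otherwise three squares suffice
}
```

The memoized dynamic programming solution presented above does not rely on these theorems, but they make a handy cross-check.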
Intuitive Analysis
To solve this problem intuitively, we can treat it as a dynamic programming problem in which each number n is broken down into 'components' that are perfect squares. We iteratively build up the solution by trying every perfect square that can be subtracted from n. As in many dynamic programming problems, memoization is key to preventing redundant calculations and speeding up the solution. The key insight is that trying every possible split of n would be inefficient; we only need to branch on subtracting a perfect square.
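For example, with n = 12 the only squares worth subtracting are 1, 4, and 9, so solve(12) = 1 + min(solve(11), solve(8), solve(3)); the solve(8) branch resolves to 2 because 8 = 4 + 4, giving solve(12) = 3, which matches Example 1.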
1. Intuition
- The problem can be solved using dynamic programming by breaking it down into smaller subproblems.
- The key insight is to find the minimum number of perfect squares that sum up to the given number.
- This can be achieved by iterating through all perfect squares less than or equal to the given number.
- For each perfect square, we recursively calculate the minimum number of squares needed for the remaining amount.
- We use memoization to store the results of subproblems to avoid redundant calculations.
- The final answer is the minimum number of squares needed for the given number.
- This approach ensures that we consider all possible combinations of perfect squares; an equivalent bottom-up formulation is sketched after this list.
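As a sketch of that same recurrence written bottom-up (an equivalent alternative to the memoized recursion in the solution above; `numSquaresBottomUp` is an illustrative name):

```cpp
#include <algorithm>
#include <climits>
#include <vector>

// Bottom-up form of the recurrence:
// dp[x] = 1 + min over all squares i*i <= x of dp[x - i*i].
int numSquaresBottomUp(int n) {
    std::vector<int> dp(n + 1, INT_MAX);
    dp[0] = 0;                                      // zero needs zero squares
    for (int x = 1; x <= n; x++)
        for (int i = 1; i * i <= x; i++)
            dp[x] = std::min(dp[x], dp[x - i * i] + 1);
    return dp[n];
}
```

Both forms do O(sqrt(x)) work per value x, so they share the O(n sqrt(n)) bound analysed below.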
2. Implementation
- We initialize a dynamic programming array `dp` of size `n + 1` and set `dp[0]` to 0, as the minimum number of squares needed for 0 is 0.
- We define a recursive function `solve` that takes an integer `n` as input and returns the minimum number of squares needed.
- In the `solve` function, we check if `n` is 0, in which case we return 0, or if `n` is negative, in which case we return `INT_MAX`.
- We also check if the result for `n` is already calculated and stored in `dp[n]`, in which case we return `dp[n]`.
- We iterate through all numbers from 1 to `sqrt(n)` and recursively calculate the minimum number of squares needed for `n - i*i`.
- We update the answer with the minimum of the current answer and the result of the subproblem plus 1.
- Finally, we memoize the result for `n` by storing it in `dp[n]` and return the minimum number of squares needed (a worked example of the resulting memo table follows this list).
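For instance, running solve(12) fills the memo table as dp = [0, 1, 2, 3, 1, 2, 3, 4, 2, 1, 2, 3, 3] for indices 0 through 12, and the final answer dp[12] = 3 matches Example 1.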
Complexity Analysis
Time Complexity:
- The recursion explores one subproblem per value from 0 to n, and memoization ensures each subproblem is solved only once; solving a subproblem requires a loop of up to sqrt(n) iterations to try every perfect square not exceeding the current value.
- The dominant cost is therefore the loop work across all memoized subproblems; repeated recursive calls for already-solved values return in constant time.
- Multiplying roughly n subproblems by O(sqrt(n)) work each gives a time complexity of O(n sqrt(n)); a concrete estimate for the maximum input follows.
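Concretely, at the upper constraint n = 10^4 this amounts to roughly 10^4 * sqrt(10^4) = 10^4 * 100 = 10^6 basic operations, which is well within typical time limits.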
Space Complexity:
- The code uses a `dp` array of size n + 1 to store the results of subproblems, so the memory used grows linearly with the input value n.
- The recursion stack can also grow to O(n) frames in the worst case, which does not change the asymptotic bound.
- The space complexity is therefore classified as O(n), where n is the input number.
Footnote
This question is rated as Medium difficulty.
Similar Questions:
| Title | URL | Difficulty |
| --- | --- | --- |
| Count Primes | https://leetcode.com/problems/count-primes | Medium |
| Ugly Number II | https://leetcode.com/problems/ugly-number-ii | Medium |
| Ways to Express an Integer as Sum of Powers | https://leetcode.com/problems/ways-to-express-an-integer-as-sum-of-powers | Medium |