Problem 80 Remove Duplicates from Sorted Array II
Table of Contents
Problem Statement
Link - Problem 80
Question
Given an integer array `nums` sorted in non-decreasing order, remove some duplicates in-place such that each unique element appears at most twice. The relative order of the elements should be kept the same.

Since it is impossible to change the length of the array in some languages, you must instead have the result placed in the first part of the array `nums`. More formally, if there are `k` elements after removing the duplicates, then the first `k` elements of `nums` should hold the final result. It does not matter what you leave beyond the first `k` elements.

Return `k` after placing the final result in the first `k` slots of `nums`.
Do not allocate extra space for another array. You must do this by modifying the input array in-place with O(1) extra memory.
Custom Judge:
The judge will test your solution with the following code:
```java
int[] nums = [...]; // Input array
int[] expectedNums = [...]; // The expected answer with correct length

int k = removeDuplicates(nums); // Calls your implementation

assert k == expectedNums.length;
for (int i = 0; i < k; i++) {
    assert nums[i] == expectedNums[i];
}
```
If all assertions pass, then your solution will be accepted.
Example 1:
```
Input: nums = [1,1,1,2,2,3]
Output: 5, nums = [1,1,2,2,3,_]
Explanation: Your function should return k = 5, with the first five elements of nums being 1, 1, 2, 2 and 3 respectively. It does not matter what you leave beyond the returned k (hence they are underscores).
```
Example 2:
```
Input: nums = [0,0,1,1,1,1,2,3,3]
Output: 7, nums = [0,0,1,1,2,3,3,_,_]
Explanation: Your function should return k = 7, with the first seven elements of nums being 0, 0, 1, 1, 2, 3 and 3 respectively. It does not matter what you leave beyond the returned k (hence they are underscores).
```
Constraints:
- `1 <= nums.length <= 3 * 10^4`
- `-10^4 <= nums[i] <= 10^4`
- `nums` is sorted in non-decreasing order.
Solution
```cpp
class Solution {
public:
    int removeDuplicates(vector<int>& nums) {
        int count = 1; // run length of the current value
        for (int i = 1; i < (int)nums.size(); i++) {
            if (nums[i] == nums[i - 1])
                count++;
            else
                count = 1;
            if (count > 2) {
                // Third (or later) copy: erase it and re-check this index.
                nums.erase(nums.begin() + i);
                i--;
            }
        }
        return nums.size();
    }
};
```
Complexity Analysis
| Algorithm | Time Complexity | Space Complexity |
| ------------------------------ | --------------- | ---------------- |
| Erase-based traversal (above) | O(n^2) | O(1) |
| Two-pointer traversal | O(n) | O(1) |
Explanation
Initial Thoughts
To approach this problem, think of it like organizing a shelf with books of different titles. Each title can have at most two copies. Start by looking at the books one by one. If a book is the same as the previous one, increment a counter. If a book is different, reset the counter. This process helps identify which books to keep and which to remove. Consider how modifying the shelf (or array) in-place affects the overall organization. It’s also important to keep track of the number of books (or elements) that are kept after the duplicates are removed.
Intuitive Analysis
Imagine walking through a sorted list of numbers, similar to walking through a library with books arranged alphabetically. The goal is to leave behind a list where each unique number appears no more than twice. Think of using pointers or markers to keep track of positions in the list where changes need to be made: advance a write pointer when a number can be kept, and use the count of each number to decide whether to include it. The key intuition is to maintain a pointer that always points to the next position to write to, effectively overwriting redundant elements. This method ensures that the relative order is preserved and that each unique number appears at most twice.
1. Intuition
- The problem requires removing duplicates from a sorted array such that each unique element appears at most twice, while maintaining the relative order of elements.
- To solve this problem, we iterate through the array and keep track of the count of each element.
- If the count of an element exceeds 2, we remove the extra occurrences of that element.
- We can use a simple counter variable `count` to keep track of the occurrences of each element.
- When the count exceeds 2, we can remove the extra element by using the `erase` function in C++.
- However, using `erase` in a loop can be inefficient because it shifts all the elements after the erased element, resulting in a time complexity of O(n^2).
- A more efficient approach would be to use two pointers, one for reading and one for writing, to avoid shifting elements.
2. Implementation
- We initialize a counter variable `count` to 1 and iterate through the array starting from the second element.
- For each element, we check if it is equal to the previous element. If it is, we increment the `count` variable.
- If the element is not equal to the previous element, we reset the `count` variable to 1.
- If the `count` variable exceeds 2, we remove the extra element using the `erase` function in C++ and decrement the loop counter `i` to avoid skipping elements.
- However, a more efficient implementation would use two pointers, `i` and `j`, where `i` is the reading pointer and `j` is the writing pointer.
- We can then keep `nums[i]` (writing it to position `j` and incrementing `j`) only when `j < 2 || nums[i] != nums[j - 2]`, i.e. only when keeping it would not create a third consecutive copy.
- Finally, we return the value of `j`, which represents the length of the final result in the array.
Complexity Analysis
Time Complexity:
- The `for` loop iterates over the input vector `nums` once; in the worst case every element is visited, so the loop itself contributes O(n) iterations, where `n` is the size of `nums`. The `count` variable and conditional statements take constant time and do not affect the overall complexity.
- However, each `erase` call shifts all elements after the erased position, which costs O(n) in the worst case. Since `erase` can be invoked up to O(n) times (for example, when one value repeats many times), the erase-based solution shown above is O(n^2) in the worst case, not O(n).
- The two-pointer variant avoids `erase` entirely: each element is read once and written at most once, so it achieves the O(n) time bound listed in the complexity table.
Space Complexity:
- The space complexity is O(1) because the algorithm only uses a constant amount of space to store the `count` variable and loop indices. The input vector `nums` is modified in-place, avoiding the need for additional data structures.
- The `erase` operation modifies the input vector but does not require any additional space that scales with the input size. The space used by the input vector itself is not included in the space complexity analysis.
- The algorithm's space usage does not grow with the input size, making it a constant-space solution.
Footnote
This question is rated as Medium difficulty.
Similar Questions:
| Title | URL | Difficulty |
| --- | --- | --- |
| Remove Duplicates from Sorted Array | https://leetcode.com/problems/remove-duplicates-from-sorted-array | Easy |