How to Remove Duplicates from an Array of Objects in JavaScript
To remove duplicates from an array of objects in JavaScript, use the filter() method combined with findIndex():
let arr = [
  { id: 1, name: "Alice" },
  { id: 2, name: "Bob" },
  { id: 1, name: "Alice" },
  { id: 3, name: "Charlie" }
];

let unique = arr.filter((value, index, self) =>
  index === self.findIndex((t) => (
    t.id === value.id
  ))
);

console.log(unique);
// Output:
// [
//   { id: 1, name: "Alice" },
//   { id: 2, name: "Bob" },
//   { id: 3, name: "Charlie" }
// ]
This method ensures data uniqueness by comparing each object’s id property, keeping only the first occurrence.
Managing arrays of objects in JavaScript often involves ensuring data uniqueness, especially when dealing with large datasets. Removing duplicates from an array of objects is a common yet crucial task. This article explores various methods to remove duplicates from an array of objects in JavaScript, providing detailed explanations and code examples for each method.
Methods to Remove Duplicates from an Array of Objects in JavaScript
Using the filter() Method with findIndex()
The filter() method combined with findIndex() is an efficient way to remove duplicates by keeping only the first object with each unique property value.
Example 1: Removing Duplicates Based on a Single Property
let arr = [
  { id: 1, name: "Alice" },
  { id: 2, name: "Bob" },
  { id: 1, name: "Alice" },
  { id: 3, name: "Charlie" }
];

let unique = arr.filter((value, index, self) =>
  index === self.findIndex((t) => (
    t.id === value.id
  ))
);

console.log(unique);
// Output:
// [
//   { id: 1, name: "Alice" },
//   { id: 2, name: "Bob" },
//   { id: 3, name: "Charlie" }
// ]
Explanation:
- let arr = […]
  This initializes the array of objects, which contains some duplicate objects based on the id property.
- let unique = arr.filter((value, index, self) => …)
  The filter() method creates a new array with all elements that pass the test implemented by the provided function.
- index === self.findIndex((t) => (t.id === value.id))
  - For each element in the array, findIndex is used to find the first occurrence of an object with the same id.
  - The current element passes the filter if its index matches the index returned by findIndex, ensuring that only the first occurrence is kept.
- console.log(unique);
  This logs the unique array of objects, showing duplicates removed based on the id property.
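The same filter() and findIndex() pattern generalizes to any property name. Below is a minimal sketch that wraps it in a reusable helper; the name uniqueByKey and its (array, key) signature are illustrative choices, not part of the original example.
// Sketch: a reusable wrapper around filter() + findIndex().
// The name `uniqueByKey` and its (array, key) signature are assumptions for illustration.
function uniqueByKey(array, key) {
  return array.filter((value, index, self) =>
    index === self.findIndex((t) => t[key] === value[key])
  );
}

let people = [
  { id: 1, name: "Alice" },
  { id: 2, name: "Bob" },
  { id: 1, name: "Alice" }
];

console.log(uniqueByKey(people, "id"));
// [ { id: 1, name: "Alice" }, { id: 2, name: "Bob" } ]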
Using the reduce() Method
The reduce() method is another powerful way to remove duplicates by accumulating results in an object or array.
Example 2: Using reduce() to Remove Duplicates
let arr = [
  { id: 1, name: "Alice" },
  { id: 2, name: "Bob" },
  { id: 1, name: "Alice" },
  { id: 3, name: "Charlie" }
];

let unique = arr.reduce((acc, current) => {
  const x = acc.find(item => item.id === current.id);
  if (!x) {
    acc.push(current);
  }
  return acc;
}, []);

console.log(unique);
// Output:
// [
//   { id: 1, name: "Alice" },
//   { id: 2, name: "Bob" },
//   { id: 3, name: "Charlie" }
// ]
Explanation:
- let arr = […]
  This initializes the array of objects with duplicates.
- let unique = arr.reduce((acc, current) => {…}, []);
  The reduce() method processes each element of the array (here, current) and accumulates the results in an array (acc).
- const x = acc.find(item => item.id === current.id);
  For each object in the array, find checks whether an object with the same id already exists in the accumulator.
- if (!x) { acc.push(current); }
  If no object with the same id is found, the current object is added to the accumulator.
- console.log(unique);
  This logs the unique array of objects, showing duplicates removed.
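One caveat: acc.find() rescans the accumulator for every element, so this approach is roughly quadratic on large inputs. A common refinement, sketched below as an assumption rather than part of the original example, tracks the ids already seen in a Set so each check takes constant time.
// Sketch: reduce() with a Set of seen ids; the `seen` Set is an illustrative addition.
let arr = [
  { id: 1, name: "Alice" },
  { id: 2, name: "Bob" },
  { id: 1, name: "Alice" },
  { id: 3, name: "Charlie" }
];

const seen = new Set();
let unique = arr.reduce((acc, current) => {
  if (!seen.has(current.id)) {
    seen.add(current.id);   // remember this id
    acc.push(current);      // keep the first occurrence
  }
  return acc;
}, []);

console.log(unique);
// [ { id: 1, name: "Alice" }, { id: 2, name: "Bob" }, { id: 3, name: "Charlie" } ]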
Using a Set and map
Using a Set with map is a modern and concise way to handle duplicates.
Example 3: Using a Set and map()
let arr = [
  { id: 1, name: "Alice" },
  { id: 2, name: "Bob" },
  { id: 1, name: "Alice" },
  { id: 3, name: "Charlie" }
];

let unique = Array.from(new Set(arr.map(a => a.id)))
  .map(id => arr.find(a => a.id === id));

console.log(unique);
// Output:
// [
//   { id: 1, name: "Alice" },
//   { id: 2, name: "Bob" },
//   { id: 3, name: "Charlie" }
// ]
Explanation:
- let arr = […]
  This initializes the array of objects with duplicates.
- let unique = Array.from(new Set(arr.map(a => a.id)))
  - arr.map(a => a.id) creates an array of ids.
  - new Set(…) creates a Set object that removes duplicate ids.
  - Array.from(…) converts the Set back into an array of unique ids.
- .map(id => arr.find(a => a.id === id));
  Maps each unique id back to the corresponding object from the original array using find.
- console.log(unique);
  This logs the unique array of objects, showing duplicates removed.
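Note that this version keeps the first object for each id, because find() returns the first match. A closely related one-pass variant, shown here only as a sketch, keys a Map by id and takes its values; be aware that it keeps the last object seen for each id rather than the first.
// Sketch: deduplicate by building a Map keyed by id (keeps the *last* entry per id).
let arr = [
  { id: 1, name: "Alice" },
  { id: 2, name: "Bob" },
  { id: 1, name: "Alice" },
  { id: 3, name: "Charlie" }
];

let unique = Array.from(new Map(arr.map(a => [a.id, a])).values());

console.log(unique);
// [ { id: 1, name: "Alice" }, { id: 2, name: "Bob" }, { id: 3, name: "Charlie" } ]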
Using Lodash Library
Lodash is a popular utility library that simplifies array manipulation, including removing duplicates.
Example 4: Using Lodash’s uniqBy
First, include Lodash in your project:
<script src="https://cdn.jsdelivr.net/npm/lodash/lodash.min.js"></script>
Then use the uniqBy method:
let arr = [
  { id: 1, name: "Alice" },
  { id: 2, name: "Bob" },
  { id: 1, name: "Alice" },
  { id: 3, name: "Charlie" }
];

let unique = _.uniqBy(arr, 'id');

console.log(unique);
// Output:
// [
//   { id: 1, name: "Alice" },
//   { id: 2, name: "Bob" },
//   { id: 3, name: "Charlie" }
// ]
Explanation:
- let arr = […]
  This initializes the array of objects with duplicates.
- let unique = _.uniqBy(arr, 'id');
  The uniqBy method from Lodash removes duplicates based on the id property, keeping the first occurrence of each.
- console.log(unique);
  This logs the unique array of objects, showing duplicates removed.
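If your project installs Lodash through npm instead of the CDN script tag, the same call works with an import; uniqBy also accepts an iteratee function when the deduplication key has to be computed. A short sketch, assuming a standard `npm install lodash` setup:
// Assumes lodash was installed with `npm install lodash`.
import uniqBy from "lodash/uniqBy";

let arr = [
  { id: 1, name: "Alice" },
  { id: 2, name: "Bob" },
  { id: 1, name: "Alice" }
];

// Property shorthand, as above...
console.log(uniqBy(arr, "id"));

// ...or an iteratee function for a derived key.
console.log(uniqBy(arr, (o) => o.name.toLowerCase()));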
Conclusion
Removing duplicates from an array of objects in JavaScript can be accomplished using various methods, each suitable for different scenarios. Whether using the filter() and findIndex() combination, reduce(), a Set with map(), or the Lodash library, JavaScript provides robust tools for handling data uniqueness effectively.