I have an object that contains an array of objects.
obj = {};
obj.arr = new Array();
obj.arr.push({place: "here", name: "stuff"});
obj.arr.push({place: "there", name: "morestuff"});
obj.arr.push({place: "there", name: "morestuff"});
What is the best method to remove duplicate objects from an array? So, for example, obj.arr would become...
{place: "here", name: "stuff"},
{place: "there", name: "morestuff"}
Use some ES6 magic:
obj.arr = obj.arr.filter((value, index, self) =>
index === self.findIndex((t) => (
t.place === value.place && t.name === value.name
))
)
A more generic solution would be:
const uniqueArray = obj.arr.filter((value, index) => {
const _value = JSON.stringify(value);
return index === obj.arr.findIndex(obj => {
return JSON.stringify(obj) === _value;
});
});
Using the above property strategy instead of JSON.stringify:
const isPropValuesEqual = (subject, target, propNames) =>
propNames.every(propName => subject[propName] === target[propName]);
const getUniqueItemsByProperties = (items, propNames) =>
items.filter((item, index, array) =>
index === array.findIndex(foundItem => isPropValuesEqual(foundItem, item, propNames))
);
You can add a wrapper if you want the propNames property to be either an array or a value:
const getUniqueItemsByProperties = (items, propNames) => {
// Accept either a single property name or an array of names
const propNamesArray = Array.isArray(propNames) ? propNames : [propNames];
return items.filter((item, index, array) =>
index === array.findIndex(foundItem => isPropValuesEqual(foundItem, item, propNamesArray))
);
};
allowing both getUniqueItemsByProperties('a') and getUniqueItemsByProperties(['a']);
Explanation
- Start by understanding the two methods used: Array.prototype.filter() and Array.prototype.findIndex().
- Next, take your idea of what makes two of your objects equal and keep that in mind.
- An item is a duplicate if it satisfies that equality criterion but its index is not the index of the first item satisfying it.
- Therefore, keeping only the items whose own index equals the index returned by findIndex removes the duplicates, as the short trace below shows.
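For example, tracing the criterion over the question's data (a minimal sketch):
const arr = [
{place: "here", name: "stuff"},      // index 0: first match is at index 0 -> kept
{place: "there", name: "morestuff"}, // index 1: first match is at index 1 -> kept
{place: "there", name: "morestuff"}  // index 2: first match is at index 1, and 1 !== 2 -> dropped
];
arr.filter((value, index, self) =>
index === self.findIndex(t => t.place === value.place && t.name === value.name)
);
// => [{place: "here", name: "stuff"}, {place: "there", name: "morestuff"}]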
One-liners with filter() (Preserves order)
If you have some identifier in the objects which signifies uniqueness (e.g., id), then we can use filter() with findIndex() to work through the list and verify that the index of each object is the first index at which that id appears. That keeps only the first occurrence of each id and drops the duplicates.
myArr.filter((obj1, i, arr) =>
arr.findIndex(obj2 => (obj2.id === obj1.id)) === i
)
(Note that this solution keeps the first instance of detected duplicates in the result. You can instead take the last instance by replacing findIndex with findLastIndex in the above.)
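For example, a sketch of the last-instance variant (this assumes an environment with Array.prototype.findLastIndex, ES2023+):
myArr.filter((obj1, i, arr) =>
arr.findLastIndex(obj2 => obj2.id === obj1.id) === i
)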
If the order is not important, then the Map solutions will be faster: see "One-liners with Map" below.
The above format can be applied to other cases by altering how we check for duplicates (i.e., replacing obj2.id === obj1.id with something else).
Unique by multiple properties (e.g., place and name, as in the question)
myArr.filter((obj1, i, arr) =>
arr.findIndex(obj2 =>
['place', 'name'].every(key => obj2[key] === obj1[key])
) === i
)
Unique by all properties
myArr.filter((obj1, i, arr) =>
arr.findIndex(obj2 =>
JSON.stringify(obj2) === JSON.stringify(obj1)
) === i
)
Caveats:
- This may get slow, depending on object and array sizes
- JSON.stringify() key order is generally consistent, but it is only guaranteed in ES2015 and later.
- This means that your mileage may vary, and you may want to prefer something more robust, like comparing specific keys; see the example below.
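For instance, two objects that differ only in key order produce different strings and are not detected as duplicates:
JSON.stringify({id: 1, name: "one"})  // '{"id":1,"name":"one"}'
JSON.stringify({name: "one", id: 1})  // '{"name":"one","id":1}' – different string, so not caught as a duplicate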
Using ES6 or later in a single line, you can get a unique list of objects by key:
const key = 'place';
const unique = [...new Map(arr.map(item => [item[key], item])).values()]
It can be put into a function:
function getUniqueListBy(arr, key) {
return [...new Map(arr.map(item => [item[key], item])).values()]
}
Here is a working example:
const arr = [
{place: "here", name: "x", other: "other stuff1" },
{place: "there", name: "x", other: "other stuff2" },
{place: "here", name: "y", other: "other stuff4" },
{place: "here", name: "z", other: "other stuff5" }
]
function getUniqueListBy(arr, key) {
return [...new Map(arr.map(item => [item[key], item])).values()]
}
const arr1 = getUniqueListBy(arr, 'place')
console.log("Unique by place")
console.log(JSON.stringify(arr1))
console.log("\nUnique by name")
const arr2 = getUniqueListBy(arr, 'name')
console.log(JSON.stringify(arr2))
How does it work
First, the array is remapped so that it can be used as input for a Map.
arr.map(item => [item[key], item]);
This means each item of the array is transformed into another array with two elements: the selected key as the first element and the entire initial item as the second. Such a pair is called an entry (for example, array entries, map entries). The official documentation has an example showing how to pass array entries to the Map constructor.
An example when the key is place:
[["here", {place: "here", name: "x", other: "other stuff1" }], ...]
Second, we pass this mapped array to the Map constructor, and here the magic happens: Map eliminates duplicate keys, keeping only the last inserted value for each key. Note: Map keeps the order of insertion (check the difference between Map and object).
new Map(entry array just mapped above)
Third, we use the map values to retrieve the original items, but this time without duplicates.
new Map(mappedArr).values()
And the last step is to spread those values into a fresh new array, so the result looks like the initial structure, and return that:
return [...new Map(mappedArr).values()]
Simple and performant solution with a better runtime than the 70+ answers that already exist:
const ids = arr.map(({ id }) => id);
const filtered = arr.filter(({ id }, index) => !ids.includes(id, index + 1));
Example:
const arr = [{
id: 1,
name: 'one'
}, {
id: 2,
name: 'two'
}, {
id: 1,
name: 'one'
}];
const ids = arr.map(({ id }) => id);
const filtered = arr.filter(({ id }, index) => !ids.includes(id, index + 1));
console.log(filtered);
How it works:
Array.filter() removes all duplicate objects by checking if the previously mapped id-array includes the current id ({ id } destructures the object into only its id). To only filter out actual duplicates, it uses Array.includes()'s second parameter fromIndex with index + 1, which ignores the current object and all previous ones.
Since every iteration of the filter callback only searches the array beginning at the current index + 1, this also dramatically reduces the runtime, because only objects that come later get checked.
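To illustrate, tracing the example above where ids is [1, 2, 1]:
ids.includes(1, 1) // true  -> a later duplicate exists, so the object at index 0 is filtered out
ids.includes(2, 2) // false -> no later duplicate, so {id: 2} is kept
ids.includes(1, 3) // false -> nothing after the last index, so the last {id: 1} is kept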
What if you don't have a single unique identifier like id?
Just create a temporary one:
const objToId = ({ name, city, birthyear }) => `${name}-${city}-${birthyear}`;
const ids = arr.map(objToId);
const filtered = arr.filter((item, index) => !ids.includes(objToId(item), index + 1));
A primitive method would be:
const obj = {};
for (let i = 0, len = things.thing.length; i < len; i++) {
obj[things.thing[i]['place']] = things.thing[i];
}
things.thing = new Array();
for (const key in obj) {
things.thing.push(obj[key]);
}
If you can use JavaScript libraries, such as Underscore.js or Lodash, I recommend having a look at the _.uniq function in their libraries. From Lodash:
_.uniq(array, [isSorted=false], [callback=_.identity], [thisArg])
Basically, you pass in the array (here an array of object literals) and the attribute on which you want to de-duplicate the original data array, like this:
var data = [{'name': 'Amir', 'surname': 'Rahnama'}, {'name': 'Amir', 'surname': 'Stevens'}];
var non_duplicated_data = _.uniq(data, 'name');
Note that newer Lodash versions (4.x) have replaced this usage with _.uniqBy.
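For instance, a sketch of the equivalent call with Lodash 4 (keeps the first occurrence per name):
var non_duplicated_data = _.uniqBy(data, 'name');
// => [{'name': 'Amir', 'surname': 'Rahnama'}]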
I had this exact same requirement, to remove duplicate objects in an array, based on duplicates on a single field. I found the code here: JavaScript: Remove Duplicates from Array of Objects
So in my example, I'm removing any object from the array that has a duplicate licenseNum string value.
var arrayWithDuplicates = [
{"type":"LICENSE", "licenseNum": "12345", state:"NV"},
{"type":"LICENSE", "licenseNum": "A7846", state:"CA"},
{"type":"LICENSE", "licenseNum": "12345", state:"OR"},
{"type":"LICENSE", "licenseNum": "10849", state:"CA"},
{"type":"LICENSE", "licenseNum": "B7037", state:"WA"},
{"type":"LICENSE", "licenseNum": "12345", state:"NM"}
];
function removeDuplicates(originalArray, prop) {
var newArray = [];
var lookupObject = {};
for(var i in originalArray) {
lookupObject[originalArray[i][prop]] = originalArray[i];
}
for(i in lookupObject) {
newArray.push(lookupObject[i]);
}
return newArray;
}
var uniqueArray = removeDuplicates(arrayWithDuplicates, "licenseNum");
console.log("uniqueArray is: " + JSON.stringify(uniqueArray));
The results:
uniqueArray is:
[{"type":"LICENSE","licenseNum":"10849","state":"CA"},
{"type":"LICENSE","licenseNum":"12345","state":"NM"},
{"type":"LICENSE","licenseNum":"A7846","state":"CA"},
{"type":"LICENSE","licenseNum":"B7037","state":"WA"}]
A one-liner using Set:
var things = new Object();
things.thing = new Array();
things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});
// Assign things.thing to myData for brevity
var myData = things.thing;
things.thing = Array.from(new Set(myData.map(JSON.stringify))).map(JSON.parse);
console.log(things.thing)
Explanation:
- new Set(myData.map(JSON.stringify)) creates a Set object using the stringified myData elements.
- The Set object will ensure that every element is unique.
- Then I create an array based on the elements of the created set using Array.from.
- Finally, I use JSON.parse to convert the stringified element back to an object.
An ES6 one-liner is here:
let arr = [
{id:1,name:"sravan ganji"},
{id:2,name:"pinky"},
{id:4,name:"mammu"},
{id:3,name:"avy"},
{id:3,name:"rashni"},
];
console.log(Object.values(arr.reduce((acc,cur)=>Object.assign(acc,{[cur.id]:cur}),{})))
To remove all duplicates from an array of objects, the simplest way is to use filter:
var uniq = {};
var arr = [{"id":"1"},{"id":"1"},{"id":"2"}];
var arrFiltered = arr.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true));
console.log('arrFiltered', arrFiltered);
One-liners with Map (High performance, does not preserve order)
Find unique id's in array arr.
const arrUniq = [...new Map(arr.map(v => [v.id, v])).values()]
If the order is important, check out the "One-liners with filter()" solution above.
Unique by multiple properties ( place and name ) in array arr
const arrUniq = [...new Map(arr.map(v => [JSON.stringify([v.place,v.name]), v])).values()]
Unique by all properties in array arr
const arrUniq = [...new Map(arr.map(v => [JSON.stringify(v), v])).values()]
Keep the first occurrence in array arr
const arrUniq = [...new Map(arr.slice().reverse().map(v => [v.id, v])).values()].reverse()
Here's another option to do it using Array iterating methods if you need comparison only by one field of an object:
function uniq(a, param){
return a.filter(function(item, pos, array){
return array.map(function(mapItem){ return mapItem[param]; }).indexOf(item[param]) === pos;
})
}
uniq(things.thing, 'place');
This is a generic way of doing this: you pass in a function that tests whether two elements of an array are considered equal. In this case, it compares the values of the name and place properties of the two objects being compared.
For ES5
function removeDuplicates(arr, equals) {
var originalArr = arr.slice(0);
var i, len, val;
arr.length = 0;
for (i = 0, len = originalArr.length; i < len; ++i) {
val = originalArr[i];
if (!arr.some(function(item) { return equals(item, val); })) {
arr.push(val);
}
}
}
function thingsEqual(thing1, thing2) {
return thing1.place === thing2.place
&& thing1.name === thing2.name;
}
var things = [
{place: "here", name: "stuff"},
{place: "there",name: "morestuff"},
{place: "there",name: "morestuff"}
];
removeDuplicates(things, thingsEqual);
console.log(things);
For ES3
function arrayContains(arr, val, equals) {
var i = arr.length;
while (i--) {
if ( equals(arr[i], val) ) {
return true;
}
}
return false;
}
function removeDuplicates(arr, equals) {
var originalArr = arr.slice(0);
var i, len, j, val;
arr.length = 0;
for (i = 0, len = originalArr.length; i < len; ++i) {
val = originalArr[i];
if (!arrayContains(arr, val, equals)) {
arr.push(val);
}
}
}
function thingsEqual(thing1, thing2) {
return thing1.place === thing2.place
&& thing1.name === thing2.name;
}
removeDuplicates(things.thing, thingsEqual);
If you can wait to eliminate the duplicates until after all the additions, the typical approach is to first sort the array and then eliminate duplicates. The sorting avoids the N * N approach of scanning the array for each element as you walk through them.
The "eliminate duplicates" function is usually called unique or uniq. Some existing implementations may combine the two steps, e.g., prototype's uniq.
This post has a few ideas to try (and some to avoid :-) ) if your library doesn't already have one! Personally, I find this one the most straightforward:
function unique(a){
a.sort();
for(var i = 1; i < a.length; ){
if(a[i-1] == a[i]){
a.splice(i, 1);
} else {
i++;
}
}
return a;
}
// Provide your own comparison
function unique(a, compareFunc){
a.sort( compareFunc );
for(var i = 1; i < a.length; ){
if( compareFunc(a[i-1], a[i]) === 0){
a.splice(i, 1);
} else {
i++;
}
}
return a;
}
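For example, a usage sketch against the question's data with a hypothetical comparator (the comparator must return 0 for equal items and impose a consistent ordering, otherwise the sort step breaks down):
function compareThings(a, b) {
// Hypothetical helper: build a composite key so equal items sort next to each other
var keyA = a.place + '|' + a.name;
var keyB = b.place + '|' + b.name;
return keyA < keyB ? -1 : (keyA > keyB ? 1 : 0);
}
unique(obj.arr, compareThings);
// => [{place: "here", name: "stuff"}, {place: "there", name: "morestuff"}]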
I think the best approach is using reduce and the Map object. This is a single-line solution.
const data = [
{id: 1, name: 'David'},
{id: 2, name: 'Mark'},
{id: 2, name: 'Lora'},
{id: 4, name: 'Tyler'},
{id: 4, name: 'Donald'},
{id: 5, name: 'Adrian'},
{id: 6, name: 'Michael'}
]
const uniqueData = [...data.reduce((map, obj) => map.set(obj.id, obj), new Map()).values()];
console.log(uniqueData)
/*
in `map.set(obj.id, obj)`
'obj.id' is key. (don't worry. we'll get only values using the .values() method)
'obj' is whole object.
*/
Considering lodash.uniqWith
const objects = [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }, { 'x': 1, 'y': 2 }];
_.uniqWith(objects, _.isEqual);
// => [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }]
Using ES6 and Array.reduce with Array.find.
In this example, filtering objects based on a guid property.
let filtered = array.reduce((accumulator, current) => {
if (! accumulator.find(({guid}) => guid === current.guid)) {
accumulator.push(current);
}
return accumulator;
}, []);
Extending this one to allow selection of a property and compress it into a one-liner:
const uniqify = (array, key) => array.reduce((prev, curr) => prev.find(a => a[key] === curr[key]) ? prev : prev.push(curr) && prev, []);
To use it, pass an array of objects and the name of the key you wish to deduplicate on as a string value:
const result = uniqify(myArrayOfObjects, 'guid')
You could also use a Map:
const dedupThings = Array.from(things.thing.reduce((m, t) => m.set(t.place, t), new Map()).values());
Full sample:
const things = new Object();
things.thing = new Array();
things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});
const dedupThings = Array.from(things.thing.reduce((m, t) => m.set(t.place, t), new Map()).values());
console.log(JSON.stringify(dedupThings, null, 4));
Result:
[
{
"place": "here",
"name": "stuff"
},
{
"place": "there",
"name": "morestuff"
}
]
Dang, kids, let's crush this thing down, why don't we?
let uniqIds = {}, source = [{id:'a'},{id:'b'},{id:'c'},{id:'b'},{id:'a'},{id:'d'}];
let filtered = source.filter(obj => !uniqIds[obj.id] && (uniqIds[obj.id] = true));
console.log(filtered);
// EXPECTED: [{id:'a'},{id:'b'},{id:'c'},{id:'d'}];
Use:
let myData = [{place:"here",name:"stuff"},
{place:"there",name:"morestuff"},
{place:"there",name:"morestuff"}];
let q = [...new Map(myData.map(obj => [JSON.stringify(obj), obj])).values()];
console.log(q)
One-liner using ES6 and new Map().
// Assign things.thing to myData
let myData = things.thing;
[...new Map(myData.map(obj => [JSON.stringify(obj), obj])).values()];
Details:
- Doing .map() on the data list converts each individual object into a [key, value] pair array (length = 2): the first element (key) is the stringified version of the object and the second (value) is the object itself.
- Adding the above-created array of entries to new Map() makes the key the stringified object; adding the same key again overrides the already-existing entry.
- Using .values() gives a MapIterator with all values in the Map (the original objects in our case).
- Finally, the spread operator ... produces a new Array with the values from the above step.
A TypeScript solution
This will remove duplicate objects and also preserve the types of the objects.
function removeDuplicateObjects(array: any[]) {
return [...new Set(array.map(s => JSON.stringify(s)))]
.map(s => JSON.parse(s));
}
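Since the any[] signature discards the element type, a generic variant (a sketch, not part of the original answer) keeps it:
// Hypothetical generic version; the type parameter T is preserved in the return type
function removeDuplicateObjectsTyped<T>(array: T[]): T[] {
return [...new Set(array.map(item => JSON.stringify(item)))]
.map(s => JSON.parse(s) as T);
}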
If the array contains objects, you can use this to remove duplicates:
const persons= [
{ id: 1, name: 'John',phone:'23' },
{ id: 2, name: 'Jane',phone:'23'},
{ id: 1, name: 'Johnny',phone:'56' },
{ id: 4, name: 'Alice',phone:'67' },
];
const unique = [...new Map(persons.map((m) => [m.id, m])).values()];
To remove duplicates on the basis of phone instead, just replace m.id with m.phone:
const unique = [...new Map(persons.map((m) => [m.phone, m])).values()];
Use:
const things = [
{place: "here", name: "stuff"},
{place: "there", name: "morestuff"},
{place: "there", name: "morestuff"}
];
const filteredArr = things.reduce((thing, current) => {
const x = thing.find(item => item.place === current.place);
if (!x) {
return thing.concat([current]);
} else {
return thing;
}
}, []);
console.log(filteredArr)
Solution Via Set Object | According to the data type
const seen = new Set();
const things = [
{place: "here", name: "stuff"},
{place: "there", name: "morestuff"},
{place: "there", name: "morestuff"}
];
const filteredArr = things.filter(el => {
const duplicate = seen.has(el.place);
seen.add(el.place);
return !duplicate;
});
console.log(filteredArr)
Set Object Feature
Each value in a Set object has to be unique; value equality is checked.
The purpose of the Set object is to store unique values, whether primitive values or object references, according to the data type. It has four very useful instance methods: add, clear, has, and delete.
Unique & data type features:
add method
Pushes unique data into the collection and preserves the data type; in other words, it prevents pushing duplicate items into the collection.
has method
Checks whether a given item already exists in the collection; handy for checking for a unique id or item.
delete method
Removes a specific item from the collection.
clear method
Removes all items from the collection, leaving it empty.
The Set object also has iteration methods and more features.
Read more here: Set - JavaScript | MDN
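A quick sketch of those four instance methods with primitive values:
const seen = new Set();
seen.add("here");     // Set(1) { "here" }
seen.add("here");     // still Set(1) { "here" } – duplicates are ignored
seen.has("here");     // true
seen.has("there");    // false
seen.delete("here");  // true – the entry is removed
seen.clear();         // the collection is now empty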
removeDuplicates() takes in an array of objects and returns a new array without any duplicate objects (based on the id property).
const allTests = [
{name: 'Test1', id: '1'},
{name: 'Test3', id: '3'},
{name: 'Test2', id: '2'},
{name: 'Test2', id: '2'},
{name: 'Test3', id: '3'}
];
function removeDuplicates(array) {
let uniq = {};
return array.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true))
}
removeDuplicates(allTests);
Expected outcome:
[
{name: 'Test1', id: '1'},
{name: 'Test3', id: '3'},
{name: 'Test2', id: '2'}
];
First, we set the value of variable uniq to an empty object.
Next, we filter through the array of objects. Filter creates a new array with all elements that pass the test implemented by the provided function.
return array.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true));
Above, we use the short-circuiting functionality of &&. If the left side of the && evaluates to true, then it returns the value on the right of the &&. If the left side is false, it returns what is on the left side of the &&.
For each object (obj), we check uniq for a property named with the value of obj.id (in this case, on the first iteration it would check for the property '1').
We want the opposite of what it returns (either true or false) which is why we use the ! in !uniq[obj.id]. If uniq has the id property already, it returns true which evaluates to false (!) telling the filter function not to add that obj.
However, if it does not find the obj.id property, it returns false which then evaluates to true (!) and returns everything to the right of the &&, or (uniq[obj.id] = true). This is a truthy value, telling the filter method to add that obj to the returned array, and it also adds the property {1: true} to uniq.
This ensures that any other obj instance with that same id will not be added again.
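Tracing the first iterations over allTests above, for illustration:
// {id: '1'}: uniq = {}                     -> !uniq['1'] is true  -> uniq['1'] = true -> kept
// {id: '3'}: uniq = {'1': true}            -> !uniq['3'] is true  -> uniq['3'] = true -> kept
// {id: '2'}: uniq = {'1': true, '3': true} -> !uniq['2'] is true  -> uniq['2'] = true -> kept
// second {id: '2'}: !uniq['2'] is false    -> && short-circuits   -> skipped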
Fast (less runtime) and type-safe answer for lazy TypeScript developers:
export const uniqueBy = <T>( uniqueKey: keyof T, objects: T[]): T[] => {
const ids = objects.map(object => object[uniqueKey]);
return objects.filter((object, index) => !ids.includes(object[uniqueKey], index + 1));
}
This way works well for me:
function arrayUnique(arr, uniqueKey) {
const flagList = new Set()
return arr.filter(function(item) {
if (!flagList.has(item[uniqueKey])) {
flagList.add(item[uniqueKey])
return true
}
})
}
const data = [
{
name: 'Kyle',
occupation: 'Fashion Designer'
},
{
name: 'Kyle',
occupation: 'Fashion Designer'
},
{
name: 'Emily',
occupation: 'Web Designer'
},
{
name: 'Melissa',
occupation: 'Fashion Designer'
},
{
name: 'Tom',
occupation: 'Web Developer'
},
{
name: 'Tom',
occupation: 'Web Developer'
}
]
console.table(arrayUnique(data, 'name')) // works well
Printout
┌─────────┬───────────┬────────────────────┐
│ (index) │ name │ occupation │
├─────────┼───────────┼────────────────────┤
│ 0 │ 'Kyle' │ 'Fashion Designer' │
│ 1 │ 'Emily' │ 'Web Designer' │
│ 2 │ 'Melissa' │ 'Fashion Designer' │
│ 3 │ 'Tom' │ 'Web Developer' │
└─────────┴───────────┴────────────────────┘
ES5:
function arrayUnique(arr, uniqueKey) {
const flagList = []
return arr.filter(function(item) {
if (flagList.indexOf(item[uniqueKey]) === -1) {
flagList.push(item[uniqueKey])
return true
}
})
}
These two ways are simpler and more understandable.
Here is a solution for ES6 where you only want to keep the last item. This solution is functional and Airbnb style compliant.
const things = {
thing: [
{ place: 'here', name: 'stuff' },
{ place: 'there', name: 'morestuff1' },
{ place: 'there', name: 'morestuff2' },
],
};
const removeDuplicates = (array, key) => {
return array.reduce((arr, item) => {
const removed = arr.filter(i => i[key] !== item[key]);
return [...removed, item];
}, []);
};
console.log(removeDuplicates(things.thing, 'place'));
// > [{ place: 'here', name: 'stuff' }, { place: 'there', name: 'morestuff2' }]
Some of the objects in your array may have additional properties that you are not interested in, or you simply want to find the unique objects considering only a subset of the properties.
Consider the array below. Say you want to find the unique objects in this array considering only propOne and propTwo, and ignore any other properties that may be there.
The expected result should include only the first and last objects. So here goes the code:
const array = [{
propOne: 'a',
propTwo: 'b',
propThree: 'I have no part in this...'
},
{
propOne: 'a',
propTwo: 'b',
someOtherProperty: 'no one cares about this...'
},
{
propOne: 'x',
propTwo: 'y',
yetAnotherJunk: 'I am valueless really',
noOneHasThis: 'I have something no one has'
}];
const uniques = [...new Set(
array.map(x => JSON.stringify(((o) => ({
propOne: o.propOne,
propTwo: o.propTwo
}))(x))))
].map(JSON.parse);
console.log(uniques);
Another option would be to create a custom indexOf function, which compares the values of your chosen property for each object and wrap this in a reduce function.
var uniq = redundant_array.reduce(function(a, b) {
function indexOfProperty (a, b) {
for (var i=0; i<a.length; i++) {
if(a[i].property == b.property) {
return i;
}
}
return -1;
}
if (indexOfProperty(a, b) < 0)
a.push(b);
return a;
}, []);
This solution worked best for me, using the Array.from method. It's also shorter and more readable.
let person = [
{name: "john"},
{name: "jane"},
{name: "imelda"},
{name: "john"},
{name: "jane"}
];
const data = Array.from(new Set(person.map(JSON.stringify))).map(JSON.parse);
console.log(data);