The task is to write a function that takes an array, removes the elements that occur more than once, and returns an array containing only unique elements.
Here's my solution:
// Duplicates: 4 (3 times), 5 (2 times), 9 (3 times) => 5 extra elements
let withDuplicates = [1, 2, 3, 4, 4,
                      4, 5, 5, 6, 7,
                      8, 9, 9, 9, 10
];
let withoutDuplicates = [];
// -- The actual function ------------------
function removeDuplicates(arr) {
    return arr.reduce((unique, current) => {
        if (!unique.includes(current)) {
            unique.push(current);
        }
        return unique;
    }, []);
}
// ----------------------------------------
withoutDuplicates = removeDuplicates(withDuplicates);
console.log(withDuplicates);
console.log('Length:', withDuplicates.length);
console.log(withoutDuplicates);
console.log('Length:', withoutDuplicates.length);
Using reduce() was what came to mind for solving the task, and the code is already quite compact.
Nevertheless: Is there a better solution?
1 Answer
Is there a better solution?
Yes, always... biting my tongue.
So first let's look at the function you have.
The JS array functions that take a callback as an argument, like reduce, are slow in comparison to standard loops, so at the cost of a little more code you can get a performance increase by implementing only what you need in a plain loop. But we can avoid that as well.
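As a sketch of that idea (the name removeDuplicatesLoop is mine, not from the question), here is the same deduplication written as a plain indexed loop, avoiding the per-element callback invocations that reduce makes:

```javascript
// Plain-loop variant: same result as the reduce version,
// but without a callback call for every element.
function removeDuplicatesLoop(arr) {
    const unique = [];
    for (let i = 0; i < arr.length; i++) {
        if (!unique.includes(arr[i])) {
            unique.push(arr[i]);
        }
    }
    return unique;
}

// Returns [1, 2, 3, 4, 5, 6]
console.log(removeDuplicatesLoop([1, 2, 3, 4, 4, 4, 5, 5, 6]));
```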
Using a Set
The includes function is also a little inefficient: it has to iterate over every item already in the new array and test it against the current item. This can be improved by using a hash table. JS has two objects that use hash-table lookups, Set and Map. For this case Set will speed up the search for duplicates and can also serve to hold the unique items while processing.
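To sketch that (the name removeDuplicatesSet is my own), here is a single pass where Set.has, a hash lookup, replaces the linear includes scan, and the Set tracks what has been seen so far:

```javascript
// Single pass: Set.has() is a hash-table membership test,
// replacing the linear scan that Array.prototype.includes() performs.
function removeDuplicatesSet(arr) {
    const seen = new Set();
    const unique = [];
    for (const item of arr) {
        if (!seen.has(item)) { // O(1) lookup instead of an O(n) scan
            seen.add(item);
            unique.push(item);
        }
    }
    return unique;
}

// Returns [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
console.log(removeDuplicatesSet([1, 2, 3, 4, 4, 4, 5, 5, 6, 7, 8, 9, 9, 9, 10]));
```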
If we are going to use a Set, we can take advantage of its constructor, which creates a set from any iterable object. All we need to do is create the set from the array, then convert the set back to an array; the duplicates will already have been removed.
Thus
const withDuplicates = [1, 2, 3, 4, 4, 4, 5, 5, 6, 7, 8, 9, 9, 9, 10];
function removeDuplicates(arr) {
    return [...(new Set(arr)).values()];
}
console.log(removeDuplicates(withDuplicates));
I think that is what you may consider better; however, it does involve an extra iteration when converting the set back to an array. To improve on that we would need to implement our own set with a hash-table lookup, and doing that in JS will never be as fast as the native code, so we can accept the extra pass in exchange for the native speed.
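As a rough illustration of that trade-off, here is a small comparison harness (timings vary by engine and input; the point is the harness, not any particular numbers):

```javascript
// Rough comparison harness; absolute timings vary by engine and input.
const data = Array.from({ length: 50000 }, () => Math.floor(Math.random() * 500));

function dedupIncludes(arr) {
    return arr.reduce((unique, current) => {
        if (!unique.includes(current)) unique.push(current);
        return unique;
    }, []);
}

function dedupSet(arr) {
    return [...new Set(arr)];
}

let t = Date.now();
const a = dedupIncludes(data);
console.log('includes-based:', Date.now() - t, 'ms');

t = Date.now();
const b = dedupSet(data);
console.log('Set-based:', Date.now() - t, 'ms');

// Both preserve first-occurrence order, so the results match.
console.log('same result:', JSON.stringify(a) === JSON.stringify(b));
```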
That's 100% what I was looking for. Awesome. :) Thanks a bunch. – michael.zech, Nov 25, 2017 at 15:32
Array.from(new Set(arr)) or [...new Set(arr)]