The spread operator: proceed with caution

JavaScript’s spread syntax is super useful. This tiny sugary extension has made cloning, merging, and expanding objects, arrays, and other iterables so simple, and it conveniently replaces a number of type-specific methods. For example, when shallow cloning arrays and objects, the spread syntax looks so much better than calling specific methods:

// Without spread:
const newArr = oldArr.slice(0);
const newObj = Object.assign({}, oldObj);

// With spread:
const newArr = [...oldArr];
const newObj = { ...oldObj };

I’ve seen a very strong uptake of this feature in the codebases I’ve worked on recently; the syntax has superseded function calls almost entirely. But this powerful little operator has a cost, and using it too much or unnecessarily will negatively impact performance.
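
The examples that follow all operate on a countries array. Its exact shape doesn’t matter much, but for illustration assume something like this (the data here is made up):

const countries = [
  { name: 'France', continent: 'europe' },
  { name: 'Japan', continent: 'asia' },
  { name: 'Spain', continent: 'europe' },
  { name: 'Kenya', continent: 'africa' },
  // ...and so on
];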

One of the places I regularly see the spread operator used is in combination with the array reduce() method. Reducers enable us to iterate over an array and produce a single output of any type, and that flexibility means we can often avoid iterating multiple times. For example, both functions below filter an array of objects and then pluck out the values for a property:

// 1.
const result = countries
  .filter(({ continent }) => continent === 'europe')
  .map(({ name }) => name);

// 2.
const result = countries.reduce((result, { name, continent }) => {
  if (continent === 'europe') {
    result.push(name);
  }

  return result;
}, []);

The second example using reduce() avoids iterating over the array of countries twice and is therefore faster than the first, but a lot of people would say this version is too long and can be “code golfed” down to a smaller size using spread syntax:

const result = countries.reduce((acc, { name, continent }) =>
  continent === "europe"
    ? [...acc, name]
    : acc
, []);

But oh dear - we’ve just trodden on a performance foot gun 😱

The problem is that using the spread operator to construct a new array creates a secondary loop, so the function gets slower as the accumulator grows. In other words, it’s an algorithm with O(n²), or quadratic, time complexity, and it’s easy to see why by expanding the syntax again:

const result = countries.reduce((acc, { name, continent }) =>
  continent === "europe"
    ? acc.map((item) => item).concat(name),
    : acc
, []);
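
You can see the quadratic behaviour for yourself with a rough console.time() sketch. The array sizes here are arbitrary, and the absolute numbers will vary by machine and engine:

const sizes = [1000, 5000, 10000];

for (const n of sizes) {
  const input = Array.from({ length: n }, (_, i) => i);

  // Copy the accumulator on every iteration
  console.time(`spread, n=${n}`);
  input.reduce((acc, x) => [...acc, x], []);
  console.timeEnd(`spread, n=${n}`);

  // Mutate the accumulator in place
  console.time(`push, n=${n}`);
  input.reduce((acc, x) => {
    acc.push(x);
    return acc;
  }, []);
  console.timeEnd(`push, n=${n}`);
}

If the spread version were linear, doubling n would roughly double its timing; instead it roughly quadruples, while the push version stays close to linear.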

Let’s take a look at another example, this time iterating over the array of countries and grouping them by continent:

const result = {};

countries.forEach(({ name, continent }) => {
  if (Array.isArray(result[continent])) {
    result[continent].push(name);
  } else {
    result[continent] = [name];
  }
});

The code above gets the job done, but I’ve seen the same algorithm written using reduce() and multiple spread operators instead, like this:

const result = countries.reduce((acc, { name, continent }) => ({
  ...acc,
  [continent]: [...(acc[continent] ?? []), name],
}), {});
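
If you like the shape of reduce() but want to avoid all that copying, a minimal sketch that mutates the accumulator instead, using logical nullish assignment (??=, ES2021):

const result = countries.reduce((acc, { name, continent }) => {
  // Create each group's array once, then push into it on later iterations
  (acc[continent] ??= []).push(name);
  return acc;
}, {});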

Setting comprehensibility aside for a moment, how bad is the performance penalty of the spread version? Let’s see with a benchmark:

Algorithm        Result
Without spread   222,843 ops/sec
With spread      20,606 ops/sec

It’s bad: more than 10 times slower in this very simple example. I think the only other way to waste this much time and energy on nothing is to start mining crypto 🔥
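
Your numbers will differ by machine and engine, but you can reproduce a comparison like this with a crude harness. The sample data and iteration count below are arbitrary (and the input is larger than the earlier snippet, so the gap is visible):

const continents = ['europe', 'asia', 'africa', 'americas', 'oceania'];
const countries = Array.from({ length: 250 }, (_, i) => ({
  name: `country-${i}`,
  continent: continents[i % continents.length],
}));

const withoutSpread = () => {
  const result = {};

  countries.forEach(({ name, continent }) => {
    if (Array.isArray(result[continent])) {
      result[continent].push(name);
    } else {
      result[continent] = [name];
    }
  });

  return result;
};

const withSpread = () =>
  countries.reduce((acc, { name, continent }) => ({
    ...acc,
    [continent]: [...(acc[continent] ?? []), name],
  }), {});

for (const [label, run] of [['without spread', withoutSpread], ['with spread', withSpread]]) {
  console.time(label);
  for (let i = 0; i < 1000; i++) run();
  console.timeEnd(label);
}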

In summary, I’d recommend avoiding the use of the spread operator inside a loop, and I’d generally encourage everyone to be more mindful when using it. It’s a little operator with big power, and it should be used intentionally. To be clear, I’m far from the first person to write about this issue, but I felt it was a drum worth beating again 🥁