Optimising JavaScript Array Searches: Benchmarked

I’ve seen a number of articles and benchmarks on the fastest way to loop through JavaScript String Arrays, and they typically all reach the same conclusions. I haven’t seen any that actually try to improve on the looping methods themselves, and that challenged me. I wanted to see whether we could improve on the array iterators to find data stored in them more quickly.
Let’s take a look at the different ways to loop through arrays in JavaScript.
There’s the traditional for loop:
for (let i = 0; i < array.length; i++) {
  let item = array[i];
}
Then there’s forEach:
array.forEach(val => {
  // Do something with val
});
Finally, there’s for…in, which is rarely used on arrays because it iterates over the indices (as strings) rather than the values:
for (let index in array) {
  // array[index] is the value
}
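Since for…in yields the indices rather than the values, modern code usually reaches for for…of (added in ES2015) when the values themselves are what you want. A quick side-by-side (the sample array is made up for illustration):

```javascript
const fruits = ["apple", "banana", "cherry"];

// for...in yields the index keys, as strings: "0", "1", "2"
for (const key in fruits) {
  console.log(key, fruits[key]);
}

// for...of yields the values directly
for (const val of fruits) {
  console.log(val);
}
```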
Anyone familiar with JavaScript will also know the built-in indexOf method:
array.indexOf(stringToFind);
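indexOf returns the position of the first match, or -1 when the string isn’t present, so a membership check usually ends with a comparison against -1 (sample data is made up for illustration):

```javascript
const words = ["alpha", "beta", "gamma"];

// indexOf returns the position of the match, or -1 if absent
console.log(words.indexOf("beta"));  // 1
console.log(words.indexOf("delta")); // -1

// A common found / not-found check
const found = words.indexOf("beta") !== -1;
```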
Knowing all these functions and when to use them is fantastic, but when your String Array has been pushed to its limits, why not optimise where you can?
I decided to see if there was a way to optimise them using a few different methods, but most of these relied on luck.
For example, looping from the end:
for (let i = array.length - 1; i >= 0; i--) {
  // Do something with array[i]
}
The above only works if you know your value is somewhere near the end.
I then came up with the stabInTheDark method:
function stabInTheDark(array, stringToFind) {
  while (true) {
    // Pick a random index; Math.random() returns a value in [0, 1)
    let index = Math.floor(Math.random() * array.length);
    if (array[index] === stringToFind) {
      return true;
    }
  }
}
The problem with this method is that it picks indices at random, and since it doesn’t track what it has already tried (which would itself cost time), it can repeat the same comparisons. It also never terminates if the value isn’t in the array at all.
It then occurred to me that the solution has been used for decades: index your data.
If databases index their data to avoid scanning, why don’t we use it in programming languages? Maybe because we might only loop through a set of data once and then dump it, but what if you have to re-use the same data multiple times? I gave it a go, and here are the results.
I came up with the following function along with the index builder:
function searchInIndexed(indexedObj, stringToFind) {
  let firstChar = stringToFind[0];
  // Guard against a first character that was never indexed
  if (!indexedObj[firstChar]) {
    return false;
  }
  let n = indexedObj[firstChar].length;
  for (let i = 0; i < n; i++) {
    if (indexedObj[firstChar][i] === stringToFind) {
      return true;
    }
  }
  return false;
}

function createIndexArr(array) {
  // Our indexed object
  let newObj = {};
  for (let i = 0; i < array.length; i++) {
    // We'll use the first character of the string as the index key.
    // It could be anything in the range [a-z, 0-9].
    let firstChar = array[i][0];
    if (newObj[firstChar]) {
      newObj[firstChar].push(array[i]);
    } else {
      newObj[firstChar] = [array[i]];
    }
  }
  return newObj;
}
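To show how the two functions fit together, here’s a self-contained sketch. The sample words are made up, and the functions are trimmed-down versions of the ones above (with a guard for unindexed first characters):

```javascript
// Trimmed-down versions of the two functions, repeated so this
// snippet runs on its own.
function createIndexArr(array) {
  const newObj = {};
  for (let i = 0; i < array.length; i++) {
    const firstChar = array[i][0];
    (newObj[firstChar] = newObj[firstChar] || []).push(array[i]);
  }
  return newObj;
}

function searchInIndexed(indexedObj, stringToFind) {
  const bucket = indexedObj[stringToFind[0]];
  if (!bucket) return false; // no words start with that character
  for (let i = 0; i < bucket.length; i++) {
    if (bucket[i] === stringToFind) return true;
  }
  return false;
}

const words = ["apple", "avocado", "banana", "cherry"];
const index = createIndexArr(words);

console.log(searchInIndexed(index, "banana")); // true
console.log(searchInIndexed(index, "durian")); // false
```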
Here’s what the indexed data looks like now:
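(The original article showed a screenshot here. As a stand-in, this hand-written sample, using hypothetical words, shows the shape createIndexArr produces:)

```javascript
// Hypothetical input
const words = ["apple", "avocado", "banana", "blueberry", "cherry"];

// createIndexArr groups the words by their first character,
// producing an object shaped like this:
const indexed = {
  a: ["apple", "avocado"],
  b: ["banana", "blueberry"],
  c: ["cherry"]
};
```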

The rules for the tests were as follows:
- The searches must use the same String Array data
- Each method must run its search 1,000 times
- The results must be consistent to be valid
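The article doesn’t show its timing harness, but a minimal sketch might look like the following. The benchmark helper, sample data, and loop count here are assumptions for illustration, not the article’s actual code:

```javascript
// Minimal benchmark harness sketch (names and loop count are
// assumptions, not the article's actual code).
function benchmark(label, iterations, fn) {
  const start = Date.now();
  for (let i = 0; i < iterations; i++) {
    fn();
  }
  const seconds = (Date.now() - start) / 1000;
  console.log(`${label}: ${seconds}s`);
  return seconds;
}

const words = ["apple", "banana", "cherry"];
benchmark("indexOf", 1000, () => words.indexOf("cherry"));
```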
It took 25.661 seconds to build the word array and then 0.591 seconds to build the index.
Here are the method times, sorted fastest first, in seconds:
- Index Search: 23.794
- indexOf: 40.92
- forLoop: 62.869
- forEach: 178.479

If the search were only going to run once, then indexOf would be the fastest method, since the index build time would be added to the search time.

Summary: Based on these results, we can see that there is almost always room to optimise your app, especially when you know what your data looks like and how it will be used in the real world. If you have a String Array with several hundred thousand strings that you search repeatedly, then indexing that data was the fastest approach tested here.
Even if it’s not a String Array, when you loop through your data, always try to find the fastest path to the data you want. It can make a huge difference.
Here’s the complete running code for those who are interested: https://github.com/CyberCyclone/optimising-string-arrays