
Parallel processing gotcha in JavaScript

There is a little gotcha that I found while flexing my parallel-processing skills, and that is exactly what we are gonna discuss today.

Filed under Others

It was the morning of 24 May 2025 when I stumbled on this site https://not-the-norm.onrender.com by Basit. As Basit puts it, it is “a space for controversial takes, hot opinions, and real conversation”. I thought, what else is there to do on a weekend than to surprise him with a couple hundred thousand requests?

I jumped into my editor, literally created a file basit.js, and wrote a program to do just that. What was meant to be a small program turned into a nice lesson.

Note

By the time you are reading this, it is quite possible that the endpoint used here is no longer available. That depends on Basit: he may shut the service down, or leave it up for learning but add pagination (as of now, there is no pagination).
Even if this very service is not available, there is a lot to take away from this article.

Initial Attempt

My initial attempt involved looking at how the site works and collecting all the cookies necessary to make a POST request, which ended up with something like this.

code
try {
  // a single POST to the site's API; the token value is redacted
  const res = await fetch('https://not-the-norm.onrender.com/api/posts', {
    method: "POST",
    body: JSON.stringify({
      text: 'somebody pocking around'
    }),
    headers: {
      'Content-Type': 'application/json',
      'Cookie': 'token=/----MyToken----/' // placeholder for the real cookie
    },
    credentials: 'include'
  })
 
  const data = await res.json();
  console.info('data', data)
} catch(err) {
  console.error('Error fetching:', err.message)
}

Which, when I ran it, worked like a charm.

Leveling up

Now you know the obvious game:

  • wrap this in a function.
  • run a veeeryyy long for loop till the author goes bankrupt.

But running a for loop is like putting one cookie in the oven at a time, waiting for it to bake, and only then moving on to the next one. In other words, “veeeryyy time consuming”.

awaited execution context: one-by-one cookie baking
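
For concreteness, here is what that one-by-one version looks like, as a minimal sketch (it assumes we have already wrapped the request above into a makePost function, which we do later):

code
// naive sequential version: each request must fully finish
// before the next one even starts
for (let i = 0; i < 1000; i++) {
  await makePost(); // roughly one full network round-trip per iteration
}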

It so happened that I was forging this program on my “Apple Silicon M2”, which has made me less prone to waiting. So I thought, if we are baking cookies, why not bake them like bakers? Bakers don’t do this one-by-one cookie making; what they do is batch the cookies in the oven, like 100 at a time. Time to be bakers xD.

What?

For this implementation I thought of something like this:

  • make an array of makePost function calls.
  • use Promise.all() to do batch processing.

What I ended up doing was something like this:
code
const batch = new Array(100).fill(makePost())
Promise.all(batch)

When we call the makePost function, it returns a Promise object, which you only care about when you want to do something in your current Execution Context that depends on its result.

⭐ One might say that if we end up waiting for the async task anyway, then what is the fun of using an async function in the first place? The answer to that is exactly why async exists. So always remember:

  • awaiting an asynchronous task only blocks the current “Execution Context”; async tasks are non-blocking by themselves until awaited on.
  • running a synchronous task stops the entire program, or in other words keeps the main thread busy (and JS is single-threaded).
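
To see the difference in action, here is a minimal sketch (sleep and task are made-up helpers, not part of our program) showing that an async task starts the moment it is called and only blocks once awaited:

code
function sleep(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

async function task(name) {
  await sleep(1000);
  console.log(name, 'finished');
}

const p = task('background'); // starts running immediately, does not block
console.log('this logs first'); // proof that the current context kept going

await p; // only here do we pause this execution context for the result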

Two things might be on your mind

  1. What exactly is an “Execution Context”?
  2. If we don’t await an async function, when does it actually run?

At this point the article is diving into something I should write a book on. I will make sure my upcoming posts cover these topics in more depth.

Now we have deviated from what we were doing, so let’s refresh with a quick video.

After you are done watching it, I would highly recommend pondering for 10 to 15 minutes about what exactly is happening.

Digging a bit

The two lines of interest are these (with a console.log thrown in so we can inspect the batch):

code
const batch = new Array(100).fill(makePost());
console.log(batch)
Promise.all(batch);

Starting with makePost: this is an asynchronous function, and it starts executing at the very moment we call it as makePost(). What the call returns is a Promise, and a Promise is an object. Look what MDN has got for us:

The first parameter to pass to Array.prototype.fill is value:
Value to fill the array with. Note all elements in the array will be this exact value: if value is an object, each slot in the array will reference that object.
~ reference

So here lies our gotcha. When doing const batch = new Array(100).fill(makePost());, we are just copying a reference to the same Promise object 100 times. Our makePost is called just once, and that explains why we got only one network request. WoW.
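
You can reproduce the same reference-sharing with any plain object; a tiny sketch, nothing to do with our API:

code
const obj = { n: 0 };
const arr = new Array(3).fill(obj); // every slot gets the SAME reference

arr[0].n = 42;
console.log(arr[1].n, arr[2].n); // 42 42 (all three slots point at one object)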

Coming to Promise.all: it has nothing to do with calling anything. It takes the already-created promises and returns a new promise that settles once every promise in the batch has left the pending state (fulfilling when all fulfill, rejecting as soon as one rejects). That also explains why merely running new Array(100).fill(makePost()); was enough to fire the network request, in a non-blocking fashion.
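
As a quick sanity check (again, just a sketch), the work starts at call time, not inside Promise.all:

code
const p = makePost(); // the fetch is fired right here, on the call

// Promise.all only observes the promises it is given; it starts nothing new
await Promise.all([p, p, p]); // still just ONE request in flight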

That might seem like a lot to digest (maybe), but for a refresher, watch this.

Solution

So the fix in our case is to still have an array, but one of different Promises, each representing a separate asynchronous network request.

We can do something like this:

code
const batch = Array.from({ length: 100 }, makePost)

makePost here is a mapFunction, which means it will literally be called once for each slot in the array. That is exactly what we want in order to generate a new Promise for each slot.
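
The contrast with fill is easy to see with something as simple as Math.random (a throwaway sketch):

code
const copies = new Array(3).fill(Math.random()); // evaluated once, result copied
const fresh = Array.from({ length: 3 }, Math.random); // called once per slot

console.log(copies); // e.g. [0.12, 0.12, 0.12] (one shared value)
console.log(fresh);  // e.g. [0.55, 0.91, 0.07] (three different values)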

Final blow

I almost forgot what I was supposed to do here 🤦 “make cookies like bakers”. So let’s continue, with our final implementation being something like this:

code
const totalRequests = 3000; // we can do 30000000
const batchOf = 150;
let count = 0;
console.time('total execution')
 
async function makePost() {
  try {
    count+=1;
    const res = await fetch('https://not-the-norm.onrender.com/api/posts', {
      method: "POST",
      body: JSON.stringify({
        text: "The concept of the 'American Dream' is nothing more than a myth. It's an ideal created to sell the idea of upward mobility while ignoring the systemic inequalities that prevent the majority of people from achieving it. People are told that anyone can succeed if they work hard enough, but the reality is that race, class, gender, and even where you are born play a massive role in determining your success. The system is rigged, and the idea of the 'American Dream' only serves to perpetuate it."
      }),
      headers: {
        'Content-Type': 'application/json',
        'Cookie': 'token=/----MyToken----/'
      },
      credentials: 'include'
    })
    
    // const data = await res.json();
    // console.info('Data:', data)
  } catch(err) {
    console.error('Error fetching:', err.message)
  }
}
 
/** Let's do some parallel stuff */
 
const numOfBatches = Math.floor(totalRequests/batchOf);
for (let batchNum = 1; batchNum <= numOfBatches; batchNum++) {
  console.log('Doing batch', batchNum);
  console.time(`timeTaken-${batchNum}`);
  
  await Promise.all(
    Array.from({ length: batchOf }, makePost)
  );
  
  console.timeEnd(`timeTaken-${batchNum}`);
}
 
// handling the remaining requests
const remaining = totalRequests % batchOf;
console.log('Doing remaining', remaining)
if (remaining > 0) {
 
  console.time('timeTaken-remaining')
  await Promise.all(
    Array.from({ length: remaining }, makePost)
  ); 
  console.timeEnd('timeTaken-remaining')
 
}
 
console.log('\n')
console.timeEnd('total execution')
console.log('Total done', count)

Results

The results demonstrate the power of parallel fetching pretty neatly.

Note: a batch size of 1 just means running the good old for loop, one request at a time; no parallel stuff.

Total Requests | Batch size | Time taken (s)
300            | 1          | 235.695
300            | 10         | 31.232
300            | 50         | 19.229
300            | 150        | 16.293
300            | 300        | 12.963

The reason I took 300 as the max batch size is that beyond it, the API service we are using here literally crashes 💔.

That concludes our post with the hope that Basit will not mind whatever we just did xP.