Parallel processing gotcha in JavaScript
There is a little gotcha that I found while flexing my parallel-processing skills, and that is exactly what we are gonna discuss today.
It was the morning of 24 May 2025 when I stumbled on this site https://not-the-norm.onrender.com by Basit. As Basit puts it, it is “a space for controversial takes, hot opinions, and real conversation”. I thought, what better to do on a weekend than to surprise him with a couple hundred thousand requests?
I jumped in, literally created a file `basit.js`, and wrote a program to do just that. What was meant to be a small program turned into a nice learning lesson.
Note
By the time you are reading this, it is quite possible that the endpoint used here is no longer available. That depends on Basit: he may shut the service down, or leave it up for learning but add pagination (as of now, there is none).
Even if this particular service is not available, there is a lot to take away from this article.
Initial Attempt
My initial attempt involved looking at how the site works and collecting all the cookies necessary to make a POST request, which ended up looking something like this.
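The original snippet isn't shown here, so below is a minimal sketch of what such a `makePost` might look like. The endpoint path, headers, and body shape are all assumptions, not the site's real API:

```javascript
// Hypothetical reconstruction — the endpoint path, headers, and body shape
// are guesses; the real request carried whatever cookies/fields the site expects.
const ENDPOINT = "https://not-the-norm.onrender.com/api/posts"; // assumed path

async function makePost() {
  // fire one POST request with the headers/body the site expects
  const res = await fetch(ENDPOINT, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Cookie: "session=...", // the cookies collected from the site would go here
    },
    body: JSON.stringify({ content: "hello from basit.js" }), // assumed body shape
  });
  return res.status;
}

// makePost(); // one call → one POST request
```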
When I ran it, it worked like a charm.
Leveling up
Now you know the obvious game:

- wrap this in a function, and
- run a veeeryyy long `for` loop till the author goes bankrupt.
But running a `for` loop is like putting one cookie at a time in the oven, waiting for it to bake, and only then moving on to the next one. In other words: “veeeryyy time consuming”.

awaited execution context: one-by-one cookie baking
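The one-by-one approach, sketched in code. `makePost` is stubbed with a 50 ms timer here so the timing behavior is visible without hitting a real endpoint:

```javascript
// One-by-one "cookie baking": each request must finish before the next starts.
// makePost is stubbed with a 50 ms timer standing in for a real network call.
const makePost = () => new Promise((resolve) => setTimeout(resolve, 50));

async function runSequential(total) {
  for (let i = 0; i < total; i++) {
    await makePost(); // the loop pauses here until this "request" settles
  }
}

runSequential(4); // 4 requests × ~50 ms ≈ 200 ms, because nothing overlaps
```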
It happened that I was forging this program on my “Apple Silicon M2”, which has made me less prone to waiting. So I thought: if we are baking cookies, why not bake them like bakers do? Bakers don’t make cookies one by one; they batch them in the oven, say 100 at a time. Time to be bakers xD.
What?
For this implementation I thought of something like this:
- make an array of `makePost` function calls, and
- use `Promise.all()` to do batch processing.

What I ended up doing was something like this:
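A sketch of that first batch attempt. `makePost` is stubbed with a timer and a counter here, so we can see how many requests actually start:

```javascript
// The first batch attempt — makePost is stubbed so we can count calls.
let calls = 0;
const makePost = () => {
  calls += 1; // one increment per request that actually starts
  return new Promise((resolve) => setTimeout(resolve, 50));
};

async function runBatch() {
  const batch = new Array(100).fill(makePost()); // looks like 100 calls…
  await Promise.all(batch);                      // …but watch the counter
  console.log(`requests started: ${calls}`);     // prints "requests started: 1"
}

runBatch();
```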
When we call the `makePost` function, it returns a `Promise` object, which you only care about when something in your current execution context depends on its result.
⭐ One might say: if we are going to wait for an async task anyway, what is the point of using an async function in the first place? The answer is exactly why async exists. So always remember:
- awaiting an asynchronous task only blocks the current “execution context”; async tasks are non-blocking in themselves until awaited on.
- running a synchronous task stops the entire program, in other words, it keeps the main thread busy (and JS is single-threaded).
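A small demo of the first point: an async function runs synchronously up to its first `await`, and the caller carries on without blocking. Watch the order the entries land in:

```javascript
// An async function runs synchronously up to its first await;
// the caller is never blocked.
const log = [];

async function task() {
  log.push("task started");                                // runs at call time
  await new Promise((resolve) => setTimeout(resolve, 10)); // yields here
  log.push("task finished");                               // resumes later
}

const pending = task();  // starts immediately and returns a Promise
log.push("meanwhile");   // runs before the awaited timer resumes

pending.then(() => console.log(log.join(" → ")));
// prints: task started → meanwhile → task finished
```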
Two things might be on your mind:
- What is an “Execution Context”?
- If we don’t await an async function, when is it called?
At this point the article is diving into something I should write a book on. I will make sure my upcoming posts go into these topics in more depth.
Now we have deviated from what we were doing, so let’s refresh with a quick video.
After you are done watching, I would highly recommend pondering for 10 to 15 minutes about what exactly is happening.
Digging a bit
The two lines of interest are the `new Array(100).fill(makePost())` line and the `await Promise.all(batch)` line.
Starting with `makePost`: it is an asynchronous function, and it starts running at the very moment we call it with `makePost()`. What the call returns is a Promise, and a Promise is an object. Look what MDN has got for us:
The first parameter to pass to `Array.prototype.fill` is `value`:
Value to fill the array with. Note all elements in the array will be this exact value: if value is an object, each slot in the array will reference that object.
~ reference
So here lies our gotcha. When doing `const batch = new Array(100).fill(makePost());`, we are just copying a reference to the same Promise object 100 times. `makePost` is called exactly once, and that explains why we see only one network request. Wow.
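The MDN note in practice, in a tiny standalone demo:

```javascript
// fill() writes the same reference into every slot:
// one object, many references.
const p = Promise.resolve("done");
const slots = new Array(3).fill(p);

console.log(slots[0] === slots[2]); // prints true — one Promise, three slots
```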
Coming to `Promise.all`: it has nothing to do with calling anything. It just suspends the current execution context until all the batch promises have fulfilled (or one of them rejects). That also explains why merely running `new Array(100).fill(makePost());` already made the network request, in a non-blocking fashion.
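A quick illustration of that: `Promise.all` doesn't start anything, it only waits on promises that are already running, and it settles once the slowest of them does:

```javascript
// Promise.all starts nothing — the promises below are already running
// by the time we hand them over.
async function demo() {
  const slow = new Promise((resolve) => setTimeout(() => resolve("slow"), 50));
  const fast = Promise.resolve("fast");
  const results = await Promise.all([fast, slow]); // waits for the slowest
  console.log(results); // prints [ 'fast', 'slow' ]
  return results;
}

demo();
```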
That might seem a lot to digest (maybe), but for a refresher, watch this.
Solution
So the fix in our case is to build an array of different promises, each representing a different asynchronous network request. We can do something like this:
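A sketch of the fix using `Array.from` with `makePost` as the map function; `makePost` is stubbed again so the call count is visible:

```javascript
// The fix, sketched: Array.from calls its map function once per slot,
// so makePost really runs 100 times. Stubbed so we can count the calls.
let calls = 0;
const makePost = () => {
  calls += 1;
  return new Promise((resolve) => setTimeout(resolve, 50));
};

async function runBatch() {
  const batch = Array.from({ length: 100 }, makePost); // 100 distinct Promises
  await Promise.all(batch); // settles once all 100 have settled
  console.log(`requests started: ${calls}`); // prints "requests started: 100"
}

runBatch();
```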
`makePost` here is a map function, which means it will literally be called once for each slot in the array. That is exactly what we want in order to generate a new Promise in each slot.
Final blow
I almost forgot what I was supposed to do here 🤦: “make cookies like bakers”. So let’s continue, with our final implementation being something like this:
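A sketch of what that final batched loop might look like; `makePost` is stubbed, and the name `bombard` and its parameters are my own, not from the original post:

```javascript
// Final sketch: fire `total` requests, `batchSize` at a time.
// makePost is a timer stub standing in for the real request function.
const makePost = () => new Promise((resolve) => setTimeout(resolve, 50));

async function bombard(total, batchSize) {
  for (let i = 0; i < total; i += batchSize) {
    const size = Math.min(batchSize, total - i);          // last batch may be smaller
    const batch = Array.from({ length: size }, makePost); // one tray of cookies
    await Promise.all(batch); // finish this tray before starting the next
  }
}

// bombard(300, 100); // 300 requests, 100 at a time
```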
Results
Now the results are pretty neat, and they demonstrate the power of parallel fetching.
Note: a batch size of 1 just means running the good old `for` loop for each request; no parallel stuff.
| Total Requests | Batch size | Time taken (s) |
|---|---|---|
| 300 | 1 | 235.695 |
| 300 | 10 | 31.232 |
| 300 | 50 | 19.229 |
| 300 | 150 | 16.293 |
| 300 | 300 | 12.963 |
The reason I took 300 as the max batch size is that beyond it, the API service we are using here literally crashes 💔.
That concludes our post with the hope that Basit will not mind whatever we just did xP.