Amdahl’s law and Gustafson’s law
I've been learning more about parallel computing. Amdahl's law and Gustafson's law give us ways to evaluate the speedup we can expect from the resources available.
First let’s dive into Amdahl’s law.
The ratio of how much of our problem is parallel to how much is sequential dictates the limits of speedup. This relation is called Amdahl's Law, named after the computer scientist Gene Amdahl.
From the diagram we see that even if 95% of the problem is parallel, the speedup is capped at 20×. Past a certain number of processors, the speedup plateaus.
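In formula form: if p is the parallel fraction of the work and n the number of workers, Amdahl's law gives a speedup of S(n) = 1 / ((1 − p) + p/n), bounded above by 1/(1 − p). A minimal sketch in Python (the function name is my own):

```python
def amdahl_speedup(p, n):
    """Speedup for parallel fraction p of the work on n workers (Amdahl's law)."""
    return 1.0 / ((1.0 - p) + p / n)

# With p = 0.95 the speedup can never exceed 1 / (1 - 0.95) = 20x,
# no matter how many workers we add.
for n in (1, 10, 100, 1000):
    print(n, round(amdahl_speedup(0.95, n), 2))
```

Running this shows the plateau: the jump from 10 to 100 workers helps a lot, while the jump from 100 to 1000 barely moves the needle.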
For a real-life example, let's say you have invited your three closest friends over for a dumpling party. You have prepared the dough and fillings. The fillings are in a container with two spoons for scooping. You have one rolling pin and one rolling board. Your friends graciously start making dumplings while you, the host, prepare the dipping sauces and put the finished dumplings on to cook.
What would have taken you 4 hours alone takes only 1.5 hours with your 3 friends helping.
Let's say your housemate hears the chattering and enters the kitchen. Dumplings being one of their favorite dishes, your housemate joins in to help by making small rounds of dough, so that the person rolling out dumpling skins with the rolling pin works faster. With the housemate's help, the 1.5 hours drops to 1.2 hours.
It being the weekend, your housemate's friend shows up too. Since all the dumpling-making tasks are taken, the housemate's friend sits idle.
Here we sped up the dumpling making from 4 hours to 1.2 hours by having four additional people working in parallel. However, adding one more person did not increase the speed, since all the tasks were already occupied.
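We can plug the dumpling numbers into Amdahl's law to estimate how parallel the work actually was. Inverting S = 1/((1 − p) + p/n) for the parallel fraction p is a small exercise (the function name is my own, and the dumpling party is of course only a loose analogy):

```python
def implied_parallel_fraction(speedup, n):
    """Invert Amdahl's law: the parallel fraction p implied by a
    measured speedup on n workers.

    speedup = 1 / ((1 - p) + p / n)  =>  p = (1 - 1/speedup) * n / (n - 1)
    """
    return (1.0 - 1.0 / speedup) * n / (n - 1.0)

# 4 people finished in 1.5 hours what took 4 hours alone: speedup = 4 / 1.5.
p = implied_parallel_fraction(4.0 / 1.5, 4)
print(round(p, 2))            # roughly 0.83: about 83% of the work was parallel
print(round(1 / (1 - p), 1))  # so at most ~6x speedup, however many helpers join
```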
We could have sped up the process further had we divided the task of making the dough and fillings among your friends too. Instead you, the host, prepared them on your own, sequentially.
Now let's dive into Gustafson's law.
Gustafson's law (also called the Gustafson–Barsis law) addresses a shortcoming of Amdahl's law: the assumption that the problem size is fixed, so that adding resources does not grow the workload. It is named after the computer scientists John L. Gustafson and Edwin H. Barsis.
Gustafson's law states that by increasing the problem size, we can increase scalability and make use of all the resources we have.
By increasing both resources and problem size, the speedup is theoretically linear. The speedup we get is still determined by the ratio of how much of our problem is parallelized to how much is sequential.
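Gustafson's law measures this scaled speedup: if s = 1 − p is the sequential fraction of the scaled workload, then S(n) = s + p·n, which grows linearly with n. A sketch (the function name is my own):

```python
def gustafson_speedup(p, n):
    """Scaled speedup when the parallel part of the workload grows with n
    (Gustafson's law): S(n) = (1 - p) + p * n."""
    return (1.0 - p) + p * n

# With p = 0.95 the speedup keeps growing almost linearly with the workers,
# instead of plateauing at 20x as under Amdahl's fixed-size assumption.
for n in (10, 100, 1000):
    print(n, gustafson_speedup(0.95, n))
```

The contrast with Amdahl's law is the point: same 95% parallel fraction, but because the problem grows with the machine, more workers keep paying off.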
Going back to the dumpling example, let's say each of your three closest friends wants to bring a friend of their own, since they have praised what a good host you are 😊. So the three friends each bring a friend, for a total of 6 friends. Your housemate, having known the plan in advance, invites his friend as well. In total you are hosting 8 people.
Being a problem solver, you borrow one more rolling pin and rolling board from a neighbor. You prepare the dough and fillings and divide them into two containers, adding 2 spoons to each fillings container. You assign a rolling pin and rolling board to each of the two dumpling-making parties.
You also borrow a dumpling steamer from your neighbor, so that the finished dumplings from both parties can be cooked as they are prepared.
Your friends divide themselves into two groups of 4 and split the tasks among themselves. You, the host, prepare the dipping sauces and start cooking the dumplings.
To feed 9 people, it took a similar 1.2 hours. Could we have been more efficient? Hmm... we could. Two of your friends could have arrived early and helped you prepare the dough and fillings!
In summary, how much of a problem can be parallelized, and how much cannot, determines how fast we can execute it. By increasing the problem size along with the parallel portion, we can make use of all the resources we have, which also helps with scaling.
Here’s dumplings ready to enjoy! 🤤
That’s all for the article. Thank you for reading!
You can support me on Patreon.