Multi-core processing with Cluster
16 May 2015

Node.js runs your JavaScript in a single thread. To get your program to take advantage of all of the cores in your machine, you'll need some extra help. Today's post is about the cluster module, which was created to solve this very problem.
What is it?
The cluster module gives your Node.js application the ability to create new processes that run your code. Any child processes you spawn share your server ports, which makes cluster an excellent tool for process resilience in network applications.
Using the cluster module, you're given a master and worker relationship in your code. The master can spawn new workers, and from there you can use message passing, IPC, the network, etc. to communicate between the workers and the master.
A simple example
In the following sample, I'll put together an HTTP server that lets a user kill and create worker processes. You'll also see requests handled in a round-robin fashion, with each of the workers sharing the same port.
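Here's a minimal sketch of that idea; the /kill and /create routes and the message format are illustrative stand-ins rather than a fixed API:

```js
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  // One worker per CPU core.
  const nWorkers = os.cpus().length;

  const forkWorker = () => {
    const worker = cluster.fork();
    // Workers report back over the IPC channel; the master reacts.
    worker.on('message', (msg) => {
      if (msg.cmd === 'kill') {
        worker.kill();
      } else if (msg.cmd === 'create') {
        forkWorker();
      }
    });
  };

  for (let i = 0; i < nWorkers; i++) {
    forkWorker();
  }
} else {
  // Every worker listens on the same port; the master hands out
  // incoming connections round-robin by default.
  http.createServer((req, res) => {
    if (req.url === '/kill') {
      process.send({ cmd: 'kill' });
    } else if (req.url === '/create') {
      process.send({ cmd: 'create' });
    }
    res.end('Handled by worker ' + process.pid + '\n');
  }).listen(3000);
}
```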
Checking isMaster (and, in other cases, isWorker) lets you place code for both sides of the process in the same file, much like the traditional Unix fork model. We count the number of CPU cores and store that in nWorkers; this is how many workers we'll create. Messages are delivered from a worker using the send function, and they are then caught and interpreted by the master via the message event.
The master will hand requests to the workers in a round-robin fashion (by default), and all of the workers listen on port 3000.
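As an aside, the scheduling policy behind that round-robin behaviour can be inspected and overridden; a small sketch, assuming you set it before the first fork:

```js
const cluster = require('cluster');

// SCHED_RR (round-robin) is the default on every platform except Windows;
// SCHED_NONE leaves connection distribution to the operating system.
// The policy must be set before the first call to cluster.fork().
cluster.schedulingPolicy = cluster.SCHED_RR;

console.log(cluster.schedulingPolicy === cluster.SCHED_RR); // true
```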
There is plenty more to this API than what’s in this example. Check out the documentation for more information.