To make this happen, I created a simple server that takes a value "n" and adds up all of the numbers between 1 and n. This, by the way, is a purely CPU-bound operation that should use only two registers. It can't get hung up by waiting for RAM or the file system. The server was just as fast as before. I had to feed my underpowered desktop (1.83GHz Intel Core Duo) numbers like n=90000000 before it seemed to pause at all. That's a 9 with seven 0s after it. The answer had 16 digits in it.
When I fed the server a fat number, I found that all of the other requests would get in line behind it. When the workload is short, Node.js seems to be multitasking because it gets done with everything so quickly. But if one request weighs down the server, it can lock up everything in the queue behind it.
Fear not. If this happens, Node.js lovers will blame you, not the machine. Your job as a programmer is to anticipate any delays, such as a request for a distant Web service. Then you break your code into two functions, just as AJAX programmers often do on the client. When the data is returned, Node.js will invoke the callback function. In the meantime, it will handle other requests.
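A hedged sketch of that two-function split (the names and the 100 ms delay are illustrative; `setTimeout` stands in for a request to a distant Web service or database):

```javascript
// Part one: start the slow operation and return immediately,
// freeing the event loop to handle other requests.
function fetchRemoteTotal(callback) {
  setTimeout(() => {
    // Part two: Node.js invokes the callback only when the
    // "remote" data finally arrives.
    callback(null, 42);
  }, 100);
}

fetchRemoteTotal((err, total) => {
  if (err) throw err;
  console.log('remote total:', total);
});
// During the 100 ms wait, Node.js keeps serving other requests.
```

The first function ends where the delay begins; the second picks up when the data returns, just as AJAX callbacks do in the browser.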
In the right hands connected to a right-thinking mind, the results can be staggeringly efficient. When the programmer spends a few minutes separating the work done before a delay from the work done after it, there's no need for the machine to tie up RAM holding a thread's state just so it will be ready when the data finally shows up from the database, the distant server, or even the file system. It was undeniably easier for the programmer to let the computer keep all of the state, but it's much more efficient this way. Just as businesses try desperately to avoid tying up capital in inventory, a programmer's job is to think like a factory boss, treat RAM as capital, and avoid consuming any of it.