
RESTFul APIs: Servers & Node.js

This is the third article in a five-part series on RESTful APIs.

What is Node.js?

Node.js is an open-source runtime for JavaScript. It was created by Ryan Dahl in 2009, out of his criticism of the limited concurrency of the Apache HTTP Server and of the sequential programming style it encouraged. Node.js is cross-platform; it can run on Unix, Windows, and macOS operating systems and on 32-bit, 64-bit, or ARM architectures.

The core proposition of Node.js is the JavaScript language combined with a non-blocking runtime environment. The non-blocking model makes Node.js a good choice for I/O-intensive applications.

With the help of Node.js, JavaScript developers can now create server-side applications.

A full-stack application can be built with a team of JavaScript developers. The JSON (JavaScript Object Notation) format is widely used, especially in single-page applications.

Some NoSQL databases, like MongoDB, also use JSON and JavaScript. All of these technologies have good synergy together because they share a common data representation: JavaScript objects and the JSON format.

A strong feature of Node.js is its non-blocking nature. Asynchronous programming practices like callbacks, promises, and async/await are first-class citizens in Node.js.

The possibility of sharing code between the server and the client is also a plus.

Using JavaScript and its ecosystem, a broad spectrum of applications can be built:

  • Single Page Applications, with frameworks like Angular, React, Svelte, or Vue
  • Server applications or APIs based on HTTP/S or WebSockets
  • Microservices
  • Desktop Applications, with Electron
  • Hybrid Mobile applications, with Cordova or Ionic
  • Native mobile applications, with NativeScript or React Native
  • Machine Learning powered applications, server or client-side, with Tensorflow.js
  • Robotics and IoT, with johnny-five
  • Performant and user-friendly CLI applications


Node.js has a big, vibrant community that has created a lot of modules anyone can use. To manage them, the NPM project was created.

NPM is an acronym for Node Package Manager and acts as the package manager for Node.js projects. NPM is also a CLI tool that handles installation. All modules used in a project are declared in the package.json file and are installed into the node_modules folder. Because the dependencies are declared in package.json, committing the node_modules folder is bad practice; the folder should be added to .gitignore. Users working on the same project can then install all dependencies by running the 'npm install' command in the terminal.

Choosing a good package for your problem means checking how often the package is updated, how long bugs and issues remain unaddressed, and how many people are using it. This helps ensure it will support your requirements now and in the future.

You can check this article for a more detailed approach and more tips: NPM guide.



Node.js is a JavaScript runtime, but what does that mean? It means that it can interpret JavaScript, just like a browser. Node.js shares the V8 JavaScript engine with Chromium-based browsers (Chrome, Opera, Edge, Brave, etc.). The V8 engine is single-threaded, but asynchronous work is still possible thanks to a concept called the event loop.

Event Loop

The event loop is the architecture that makes this possible. Its event-driven design promotes loosely coupled systems, while its queue system makes concurrency possible on a single thread.

 // Conceptual sketch of the event loop; not actual Node.js source
 while (queue.waitForNextMessage()) {
   const message = queue.dequeue();
   message.handler();
 }

How does it do that? By delegating the actual work to the operating system or the network internals and moving on to the next message.

Let’s say, for example, that we send a request to another server. There is no point in blocking the thread while the data is transferred by the operating system and the network, processed by the other server, and sent back.

Another example might be reading a file from the operating system. By delegating the task to the operating system thread, the main thread can continue doing other things.

When the operating system has data, an event is sent to the queue, the loop sends it to the main thread and it handles the rest.

The work is done by operating system threads, routers, or another computer, so our thread is not blocked. In the meantime, we can process other messages. This is why asynchronicity works on a single thread. The problem appears when a task is CPU-intensive: there is no work the main thread can delegate to the OS or to other computers, so it has to do it itself.


Having just the JavaScript engine is not enough. In browsers, we don’t have direct access to operating system resources, for security reasons. So an API for accessing low-level resources of the operating system was needed, and Node.js provides such an API. Accessing resources like files, threads, or the network is possible from JavaScript.

Programming style

Async Programming

The core of the Node.js programming style lies in its event-driven architecture, built on the event loop and handlers.


What are handlers? Handlers are functions, often anonymous, that are passed as arguments to other functions. Usually, the receiving functions are event-handling declarations.

We refer to a handler as a callback when an actual asynchronous operation is involved. All callbacks are handlers, but only handlers involved in async operations are callbacks.

The anonymous function passed as the second argument to the readFile function below is a callback. It will be called back when the file has been read; reading the file is an async operation.

 const fs = require('fs');
 fs.readFile('example.txt', function (err, content) {
   if (err) return console.error(err);
   console.log(content.toString());
 });
We can see in the example above the common error-handling pattern for callbacks. The pattern says that the callback’s first parameter should be null if there is no error, or an error object if something went wrong. The rest of the parameters can be used freely.


What are promises? Built on top of callbacks, promises are instances of the Promise class.

Promises are JavaScript objects that internally represent the state of an asynchronous operation. When the operation completes, its value is contained in the Promise instance. A promise’s internal state can be one of the following: pending, fulfilled, or rejected.

Pending means that the asynchronous operation is in progress. Fulfilled means that the operation completed successfully and that we can access the value. Rejected means that the operation failed.

A promise that is pending can be completed with one of the two possible states, fulfilled or rejected.

Promises can be chained using the then() method and can also be passed as arguments.

Promise errors can be caught with the catch() method.
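A small sketch of chaining: each then() receives the value produced by the previous step, and a single catch() at the end handles a rejection from any step:

```javascript
Promise.resolve(2)
  .then(function (n) { return n * 3; })   // the next then() receives 6
  .then(function (n) { console.log(n); }) // prints 6
  .catch(function (err) { console.error(err); });
```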

The Promise constructor receives as an argument a callback (a function or arrow function) with two parameters: one to be called when the operation succeeds and one when it fails.

 const fs = require('fs');
 const readFilePromisified = new Promise(function (resolve, reject) {
   fs.readFile('example.txt', function (err, content) {
     if (err) return reject(err);
     resolve(content.toString());
   });
 });
 readFilePromisified
   .then(function (data) { console.log(data); })
   .catch(function (err) { console.error(err); });

Async/await functions are built on top of promises. To use await, you need a special kind of function or arrow function marked with the keyword async. The marking states that the function will return a promise no matter what.

Every function that returns a promise is awaitable.

 async function getCar(name) {
   if (name !== 'Fiat') throw new Error('We do not have this car');
   return 'Fiat 500';
 }

 async function main() {
   try {
     const car = await getCar('Fiat');
     console.log(car);
   } catch (err) {
     console.error(err);
   }
 }
 main();

Fitting everything together

We mentioned that everything is based on the event-loop architecture. Callbacks are made possible by events, being handlers for them. Promises are built on top of callbacks; they are just proxies for a value that will be available in the future. Another layer of abstraction is brought by the async/await feature, which is built on top of promises.

Every callback or handler is placed in a queue and waits for the event loop to call it when the necessary state is reached (file read, network request completed, etc.).

A handler queue is a data structure that Node.js uses internally to organize async operations. A callback is added to the call stack when it is about to be executed. The event loop continually checks whether the call stack is empty, so that it can pick a callback from the queue and add it to the call stack.

There are several types of callback queues that handle different types of operations: the IO queue, Timer queue, Microtask queue, Check/Immediate queue, and Close queue. It’s important to note that the event loop checks and executes the microtask queue before the other queues. The queue order is: microtask, timer, IO, check and, lastly, close. Please refer to this article for more details: Deep dive into queues @ LogRocket.
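The priority of the microtask queue is easy to observe. In the sketch below, the promise callback (a microtask) runs before the zero-delay timer, even though both are scheduled before the synchronous log:

```javascript
setTimeout(function () { console.log('timer queue'); }, 0);
Promise.resolve().then(function () { console.log('microtask queue'); });
console.log('synchronous code');
// Output order: 'synchronous code', 'microtask queue', 'timer queue'
```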


Promoting low coupling, many objects in Node.js and most of its core APIs emit events. Anybody can create custom events.

 const EventEmitter = require('events').EventEmitter;

 const myEventEmitter = new EventEmitter();
 const eventHandler = () => { console.log('Handled event'); };
 myEventEmitter.on('myCustomEvent', eventHandler);
 myEventEmitter.emit('myCustomEvent');

As we can see from the example above, no async operation is involved, but the event-loop queue is still used.

A common error that causes memory leaks is not removing an event handler once it stops being useful. This is particularly bad if the event handler is registered in a loop. We need to keep a reference to the handler so that the function that removes it knows which one to remove.

 const eventHandler = () => { console.log('Handled event'); };
 myEventEmitter.on('myCustomEvent', eventHandler);
 myEventEmitter.removeListener('myCustomEvent', eventHandler);
Buffers & Streams
What are buffers?

Buffers are instances of the Buffer class and are used to manipulate raw data (octets in memory).

JavaScript handles Unicode strings well but, sometimes, you need to process binary data. The Buffer class is here to help in processing octet streams. The output of the readFile function is a buffer.

We have more details in the unicode article.

What are streams?

Using streams is a faster way of processing data. If the data can be processed in parts, streams give you a very good chance of speeding up the operation.

Think of it this way: during read and write operations (I/O), the writing counterpart does no work until the reading of the file is done. Reading and writing at the same time should double the speed, but, in real life, you can gain an order of magnitude.

 const fs = require('fs');
 const readStream = fs.createReadStream('source.txt');
 const writeStream = fs.createWriteStream('destination.txt');

 // copy the file chunk by chunk, without loading it fully into memory
 readStream.pipe(writeStream);

That’s all for now. Next up: Persistency & MongoDB.

About us

Bytes Route is a digital adoption tool designed for non-technical people to create no-code product tours for web applications.
