Concept of Callbacks and Streams in NodeJS

Last updated on Jan 18 2023
Prabhas Ramanathan


Node.js – Callbacks Concept

What is Callback?

A callback is a function passed to another function, to be invoked when a given task completes; it is Node's asynchronous equivalent of returning a value. Node.js makes heavy use of callbacks, and all of its asynchronous APIs are written to support them.
For example, a function that reads a file may start the read and return control to the execution environment immediately, so that the next instruction can be executed. Once the file I/O is complete, it invokes the callback function, passing it the contents of the file as a parameter. There is no blocking or waiting on file I/O, which makes Node.js highly scalable: it can process a large number of requests without waiting for any function to return its result.
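
To make the convention concrete, here is a minimal sketch (not part of the original examples) of the error-first callback pattern that Node APIs follow: the callback's first argument is an error (or null), and the remaining arguments carry the result. The function name readConfig is purely illustrative.

var fs = require("fs");

// Illustrative helper: wraps fs.readFile behind an error-first callback.
function readConfig(path, callback) {
   fs.readFile(path, function (err, data) {
      if (err) return callback(err);        // report the failure first
      callback(null, data.toString());      // then deliver the result
   });
}

// The caller decides what to do once the file contents are available.
readConfig('input.txt', function (err, text) {
   if (err) return console.error(err);
   console.log(text);
});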

Blocking Code Example

Create a text file named input.txt with the following content −
Tutorials Point is giving self learning content
to teach the world in simple and easy way!!!!!
Create a js file named main.js with the following code −

var fs = require("fs");

// readFileSync is blocking: execution pauses here until the whole file is read.
var data = fs.readFileSync('input.txt');

console.log(data.toString());
console.log("Program Ended");

Now run main.js to see the result −
$ node main.js
Verify the Output.
Tutorials Point is giving self learning content
to teach the world in simple and easy way!!!!!
Program Ended

Non-Blocking Code Example

Keep the same input.txt file and update main.js to have the following code −

var fs = require("fs");

// readFile is non-blocking: it returns immediately and invokes the callback
// once the file has been read (or an error has occurred).
fs.readFile('input.txt', function (err, data) {
   if (err) return console.error(err);
   console.log(data.toString());
});

console.log("Program Ended");


Now run main.js to see the result −
$ node main.js
Verify the Output.
Program Ended
Tutorials Point is giving self learning content
to teach the world in simple and easy way!!!!!
These two examples illustrate the difference between blocking and non-blocking calls.
• The first example shows that the program blocks until it has read the file, and only then proceeds to end the program.
• The second example shows that the program does not wait for the file read: it prints “Program Ended” immediately while the file continues to be read without blocking.
Thus, a blocking program executes very much in sequence, which makes the logic easier to implement; a non-blocking program does not execute in sequence. If a program needs to process the data it reads, that processing must be placed inside the callback so that it runs only after the data is available, as the sketch below shows.
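
Here is a minimal sketch of that point, assuming the same input.txt file: the line count is computed inside the callback because it depends on the data, while the final console.log runs before the read completes.

var fs = require("fs");

fs.readFile('input.txt', function (err, data) {
   if (err) return console.error(err);
   // This work depends on the file contents, so it belongs inside the callback.
   var lineCount = data.toString().split('\n').length;
   console.log("Lines read: " + lineCount);
});

// This line runs before the file has been read.
console.log("Reading in the background...");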

Node.js – Streams

What are Streams?

Streams are objects that let you read data from a source or write data to a destination in a continuous fashion. In Node.js, there are four types of streams −
• Readable − Stream which is used for read operations.
• Writable − Stream which is used for write operations.
• Duplex − Stream which can be used for both read and write operations.
• Transform − A type of duplex stream where the output is computed based on the input.
Each type of stream is an EventEmitter instance and emits several events at different points in time. Some of the commonly used events are −
• data − This event is fired when data is available to read.
• end − This event is fired when there is no more data to read.
• error − This event is fired when there is any error receiving or writing data.
• finish − This event is fired when all the data has been flushed to the underlying system.
This tutorial provides a basic understanding of the commonly used operations on Streams.
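
The sections below demonstrate Readable and Writable streams; the Transform type is not covered there, so here is a hedged sketch of one. It upper-cases every chunk that passes through it, using the simplified-construction API of the stream module; piping it between stdin and stdout is just one illustrative way to exercise it.

var stream = require('stream');

// A Transform stream that upper-cases every chunk passing through it.
var upperCase = new stream.Transform({
   transform: function (chunk, encoding, callback) {
      callback(null, chunk.toString().toUpperCase());
   }
});

// Pipe standard input through the transform to standard output.
process.stdin.pipe(upperCase).pipe(process.stdout);

Running it as echo "hello" | node transform.js would print HELLO (the file name transform.js is only an assumption for this example).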

Reading from a Stream

Create a text file named input.txt having the following content −
Tutorials Point is giving self learning content
to teach the world in simple and easy way!!!!!
Create a js file named main.js with the following code −

var fs = require("fs");
var data = '';

// Create a readable stream
var readerStream = fs.createReadStream('input.txt');

// Set the encoding to be utf8. 
readerStream.setEncoding('UTF8');

// Handle stream events --> data, end, and error
readerStream.on('data', function(chunk) {
   data += chunk;
});

readerStream.on('end', function() {
   console.log(data);
});

readerStream.on('error', function(err) {
   console.log(err.stack);
});

console.log("Program Ended");

Now run main.js to see the result −
$ node main.js
Verify the Output.
Program Ended
Tutorials Point is giving self learning content
to teach the world in simple and easy way!!!!!
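
For contrast, the same file can be read in paused (non-flowing) mode: instead of subscribing to 'data' events, a 'readable' handler pulls chunks explicitly with read(). This is a hedged alternative sketch, not part of the original example.

var fs = require("fs");

var readerStream = fs.createReadStream('input.txt');
readerStream.setEncoding('UTF8');

// In paused mode we pull chunks ourselves whenever data is ready.
readerStream.on('readable', function() {
   var chunk;
   while ((chunk = readerStream.read()) !== null) {
      process.stdout.write(chunk);
   }
});

readerStream.on('end', function() {
   console.log("\nRead completed.");
});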

Writing to a Stream

Create a js file named main.js with the following code −


var fs = require("fs");
var data = 'Simply Easy Learning';

// Create a writable stream
var writerStream = fs.createWriteStream('output.txt');

// Write the data to stream with encoding to be utf8
writerStream.write(data,'UTF8');

// Mark the end of file
writerStream.end();

// Handle stream events --> finish, and error
writerStream.on('finish', function() {
   console.log("Write completed.");
});

writerStream.on('error', function(err) {
   console.log(err.stack);
});

Now run main.js to see the result −
$ node main.js
Verify the Output.
Write completed.
Now open output.txt created in your current directory; it should contain the following −
Simply Easy Learning
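
One practical detail worth adding, as a hedged sketch rather than part of the original example: write() returns false when the stream's internal buffer is full, and the 'drain' event signals that it is safe to write again. Respecting this avoids buffering an unbounded amount of data in memory. The file name and chunk count below are illustrative only.

var fs = require("fs");

var writerStream = fs.createWriteStream('output.txt');
var i = 0;
var total = 1000000;

function writeChunks() {
   var ok = true;
   while (i < total && ok) {
      // write() returns false once the internal buffer is full.
      ok = writerStream.write("line " + i + "\n", 'UTF8');
      i++;
   }
   if (i < total) {
      // Resume writing only after the buffer has drained.
      writerStream.once('drain', writeChunks);
   } else {
      writerStream.end();
   }
}

writeChunks();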

Piping the Streams

Piping is a mechanism where the output of one stream is provided as the input to another stream. It is normally used to take data from one stream and pass it on to another, and there is no limit on how many piping operations can be chained. The example below pipes data from one file into another.
Create a js file named main.js with the following code −


var fs = require("fs");

// Create a readable stream
var readerStream = fs.createReadStream('input.txt');

// Create a writable stream
var writerStream = fs.createWriteStream('output.txt');

// Pipe the read and write operations
// read input.txt and write data to output.txt
readerStream.pipe(writerStream);

console.log("Program Ended");

Now run main.js to see the result −
$ node main.js
Verify the Output.
Program Ended
Open output.txt created in your current directory; it should contain the following −
Tutorials Point is giving self learning content
to teach the world in simple and easy way!!!!!

Chaining the Streams

Chaining is a mechanism to connect the output of one stream to another stream and so create a chain of multiple stream operations. It is normally used together with piping. Below we use piping and chaining to first compress a file and then decompress it again.
Create a js file named main.js with the following code −

var fs = require("fs");
var zlib = require('zlib');

// Compress the file input.txt to input.txt.gz
fs.createReadStream('input.txt')
   .pipe(zlib.createGzip())
   .pipe(fs.createWriteStream('input.txt.gz'));

console.log("File Compressed.");


Now run main.js to see the result −
$ node main.js

Verify the Output.
File Compressed.
You will find that input.txt has been compressed and a new file, input.txt.gz, has been created in the current directory. Now let’s decompress the same file using the following code −

var fs = require("fs");
var zlib = require('zlib');

// Decompress the file input.txt.gz to input.txt
fs.createReadStream('input.txt.gz')
   .pipe(zlib.createGunzip())
   .pipe(fs.createWriteStream('input.txt'));

console.log("File Decompressed.");


Now run main.js to see the result −
$ node main.js
Verify the Output.
File Decompressed.
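
A hedged note on error handling: plain .pipe() chains do not forward errors from one stream to the next, so a failure in the gzip step would go unnoticed by the write stream. Since Node.js 10 the stream module provides pipeline(), which wires the same chain together and reports any error to a single callback. A minimal sketch of the compression step rewritten this way:

var fs = require("fs");
var zlib = require('zlib');
var stream = require('stream');

// pipeline() connects the streams and funnels errors from any of them
// into the final callback.
stream.pipeline(
   fs.createReadStream('input.txt'),
   zlib.createGzip(),
   fs.createWriteStream('input.txt.gz'),
   function (err) {
      if (err) return console.error("Pipeline failed:", err);
      console.log("File Compressed.");
   }
);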
So, this brings us to the end of the blog. This Tecklearn ‘Concept of Callbacks and Streams in NodeJS’ blog helps you with commonly asked questions if you are looking for a job in NodeJS programming. If you wish to learn NodeJS and build a career in the NodeJS programming domain, then check out our interactive Node.js Training, which comes with 24*7 support to guide you throughout your learning period.

Node.js Training

About the Course

Tecklearn’s Node.js certification training course familiarizes you with the fundamental concepts of Node.js and provides hands-on experience in building applications efficiently using JavaScript. It helps you learn how to develop scalable web applications using the Express Framework and deploy them using Nginx. You will learn how to build applications backed by MongoDB, gain in-depth knowledge of REST APIs, implement testing, build applications using a microservices architecture and write a real-time chat application using Socket.IO. Accelerate your career as a Node.js developer by enrolling in this Node.js training.

Why Should you take Node.js Training?

• As per Indeed.com data, the average salary of a Node.js developer is about $115,000 per annum.
• IBM, LinkedIn, Microsoft, GoDaddy, Groupon, Netflix, PayPal, SAP have adopted Node.js – ITJungle.com
• There are numerous job opportunities for Node.js developers worldwide, and the job market and popularity of Node.js have been growing constantly over the past few years.

What you will Learn in this Course?

Introduction to Node.js

• What is Node.js?
• Why Node.js?
• Installing NodeJS
• Node in-built packages (buffer, fs, http, os, path, util, url)
• Node.js Modules
• Import your own Package
• Node Package Manager (NPM)
• Local and Global Packages

File System Module and Express.js

• File System Module
• Operations associated with File System Module
• JSON Data
• Http Server and Client
• Sending and receiving events with Event Emitters
• Express Framework
• Run a Web Server using Express Framework
• Routes
• Deploy application using PM2 and Nginx

Work with shrink-wrap to lock the node module versions

• What is shrink-wrap
• Working with npmvet
• Working with outdated command
• Install NPM Shrinkwrap

Learn asynchronous programming

• Asynchronous basics
• Call-back functions
• Working with Promises
• Advance promises
• Using Request module to make api calls
• Asynchronous Commands

Integration with MongoDB and Email Servers

• Introduction to NoSQL Databases and MongoDB
• Installation of MongoDB on Windows
• Installation of Database GUI Viewer
• Inserting Documents
• Querying, Updating and Deleting Documents
• Connect MongoDB and Node.js Application
• Exploring SendGrid
• Sending emails through Node.js application using SendGrid

REST APIs and GraphQL

• REST API
• REST API in Express
• Postman
• MongoDB Driver API
• Express Router
• Mongoose API
• GraphQL
• GraphQL Playground

Building Node.js Applications using ES6

• ES6 variables
• Functions with ES6
• Import and Export with ES6
• Async/Await
• Introduction to Babel
• Rest API with ES6
• Browsing HTTP Requests with Fetch
• Processing Query String
• Creating API using ES6
• Building Dashboard API
• Creating dashboard UI with EJS
• ES6 Aside: Default Function Parameters
• Data Validation and Sanitization

User Authentication and Application Security

• Authentication
• Types of Authentication
• Session Vs Tokens
• Bcrypt
• Node-local storage

Understand Buffers, Streams, and Events

• Using buffers for binary data
• Flowing vs. non-flowing streams
• Streaming I/O from files and other sources
• Processing streams asynchronously
• File System and Security

Build chat application using Socket.io

• Getting Started
• Adding Socket.io To Your App
• Exploring the Front-end
• Sending Live Data Back & Forth
• Creating the Front-end UI
• Showing Messages In App
• Working with Time
• Timestamps
• Show Message Time In Chat App
• Chat application Project

Microservices Application

• Why Microservices?
• What are Microservices?
• Why Docker?
• What is Docker?
• Terminologies in Docker
• Child Processes
• Types of child process

