Node.js streams can be a challenge for even the most experienced developers. While working with them in the development of TailFile, I found that the key to understanding streams is recognizing that there are many different components at play. The Node.js documentation on streams is extensive, but it can be difficult to locate all of the important details in one place.

Streams are also very stateful, so how they function can depend on the mode they’re in. Hopefully, in this article, I can clear up some of the confusion surrounding streams, focusing specifically on implementing read streams. It’s important to note that writable streams and filesystem streams may implement similar concepts differently.

What’s a Stream Implementation?


What is a stream implementation, and how does it fit into Node.js streams?

A readable implementation is a piece of code that extends Readable, which is the Node.js base class for reading streams. It can also be a simple call to the new Readable() constructor, if you want a custom stream without defining your own class. I’m sure plenty of you have used streams from the likes of HTTP res handlers to fs.createReadStream file streams. An implementation, however, needs to respect the rules for streams, namely that certain functions are overridden when the system calls them for stream flow situations. Let’s talk about what some of this looks like.

The takeaways:

  • Of course, call super(opts) or nothing will work.
  • _read is required and is called automatically when new data is wanted.
  • Calling push(<some data>) will cause the data to go into an internal buffer, and it will be consumed when something, like a piped writable stream, wants it.
  • push(null) is required to end the read stream properly.
    • An 'end' event will be emitted after this.
    • A 'close' event will also be emitted unless emitClose: false was set in the constructor.
  • _destroy is optional for cleanup things. Never override destroy; always use the underscored method for this and for _read.

For such a simple implementation, there’s no need for the class. A class is more appropriate for things that are more complicated in terms of their underlying data resources, such as TailFile. This particular example can also be accomplished by constructing a Readable inline:

However, there’s one major problem with this code. If the data set were larger, from a file stream, for example, then this code repeats a widespread mistake with Node streams:

This doesn’t respect backpressure.

What’s Backpressure?

Remember the internal buffer that I mentioned above? This is an in-memory data structure that holds the streaming chunks of data — objects, strings or buffers. Its size is controlled by the highWaterMark property, and the default is 16KB of byte data, or 16 objects if the stream is in object mode. When data is pushed through the readable stream, the push method may return false. If so, that means that the highWaterMark is close to, or has been, exceeded, and that is called backpressure.

If that happens, it’s up to the implementation to stop pushing data and wait for the _read call to come, signifying that the consumer is ready for more data, so push calls can resume. This is where a lot of folks fail to implement streams properly. Here are a couple of tips about pushing data through read streams:

  • It’s not necessary to wait for _read to be called to push data as long as backpressure is respected. Data can continually be pushed until backpressure is reached. If the data size isn’t very large, it’s possible that backpressure will never be reached.
  • The data from the buffer will not be consumed until the stream is in reading mode. If data is being pushed, but there are no 'data' events and no pipe, then backpressure will certainly be reached if the data size exceeds the default buffer size.

TailFile follows this pattern: it reads chunks from the underlying resource until backpressure is reached or all the data is read. Upon backpressure, pushing stops, and reading resumes when _read is called.

Streams-powered Node APIs

Due to their advantages, many Node.js core modules provide native stream handling capabilities, most notably:

  • net.Socket is the main Node.js API that streams are based on; it underlies most of the following APIs
  • process.stdin returns a stream connected to stdin
  • process.stdout returns a stream connected to stdout
  • process.stderr returns a stream connected to stderr
  • fs.createReadStream() creates a readable stream to a file
  • fs.createWriteStream() creates a writable stream to a file
  • net.connect() initiates a stream-based connection
  • http.request() returns an instance of the http.ClientRequest class, which is a writable stream
  • zlib.createGzip() compresses data into a stream using gzip (a compression algorithm)
  • zlib.createGunzip() decompresses a gzip stream
  • zlib.createDeflate() compresses data into a stream using deflate (a compression algorithm)
  • zlib.createInflate() decompresses a deflate stream

Closing thoughts


There’s much more to understand about Node.js streams, especially when it comes to writable streams, but the core concepts remain the same. Since the information about streams is scattered, it’s challenging to consolidate it all in one place.

As I write this, I cannot find the place where I learned that push can be called continuously, but trust me, it’s a thing, even though the backpressure documentation always recommends waiting for _read. The fact is, depending on what you’re trying to implement, the code becomes less clear-cut, but as long as backpressure rules are followed and methods are overridden as required, you’re on the right track!
