Welcome to this section on Docker for Java developers. If you are landing here directly, it is recommended that you first get a basic idea of Docker by visiting this page.

What we have seen so far is that the docker-engine sits on top of the host machine and is responsible for spawning multiple containers. In this blog, we shall create a Java-based project and demonstrate the “Build-Ship-Run” philosophy. So let’s take a look at how this can be done.

Step #1 :- We have set up the GitHub Desktop application in order to push/pull the…


If you are landing here directly, it is highly recommended that you first read through this link.

What is KSQL? KSQL is the streaming SQL engine for Apache Kafka. This blog will step through some practical examples of how to use KSQL to build powerful stream-processing applications:

  • Filtering streams of data.
  • Joining live streams of events with reference data (e.g. from a database).
  • Continuous, stateful aggregations.

Introduction to the use-case :- We have an airline for which we need to perform real-time analysis of the reviews received in-flight :-


Welcome to this second section on Docker for Web developers. If you are landing here directly, it is recommended to first visit this page.

So, what we have seen so far is that the docker-engine sits on top of the host machine and is responsible for spawning multiple containers.

  • The first container runs on port 80, hosting Tomcat 7 on JDK 1.7.
  • The second container runs on port 80, hosting Tomcat 8 on JDK 1.8.

Remember, each of these containers behaves like an isolated machine of its own (as good as a PC running inside a PC), although, unlike virtual machines, containers share the host's kernel rather than running a full guest OS.

In this blog, we…


Welcome to this first section on Docker for Web developers.

Story so far :- Let’s take a look at a real-world use case. Say we have two teams, Team A and Team B. As of today, the teams share the same infrastructure. In the past, their requirements allowed them to use the same versions of the JDK and Tomcat, so they were able to expose different ports and run their applications in production on the same infrastructure. They never had any issues.

Current twist in the story :- Now the problem is…


Real-time data pipelines with Kafka :- Let’s first see how Kafka is used to set up real-time data pipelines.

  • First, we have multiple data-sources, and we want to onboard them to Kafka. We have 2 options to go with here, i.e. either Kafka producers or Kafka source-connectors.
  • Next, we might have to perform some processing on the data received in Kafka topics. We could use a stream-processing application, which consumes from Kafka topics, performs the processing, and puts the data back into Kafka topics.
  • Next, after processing, we might want to dump the processed data back onto…
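The three stages above (onboard, process, sink) can be sketched conceptually in plain Python, with in-memory queues standing in for Kafka topics; the topic names and the sample processing step below are purely illustrative, not real Kafka APIs.

```python
from collections import deque

# In-memory stand-ins for Kafka topics (illustrative only; a real
# pipeline would use Kafka producers/source-connectors and consumers).
raw_topic = deque()        # stage 1 target: data onboarded from sources
processed_topic = deque()  # stage 2 target: output of the stream processor

# Stage 1: "producers" onboard source records into a topic.
for record in [{"user": "a", "clicks": 3}, {"user": "b", "clicks": 7}]:
    raw_topic.append(record)

# Stage 2: a stream processor consumes, transforms, and writes back.
while raw_topic:
    rec = raw_topic.popleft()
    rec["high_activity"] = rec["clicks"] > 5   # sample processing step
    processed_topic.append(rec)

# Stage 3: a "sink" dumps the processed records to a downstream store.
sink = list(processed_topic)
print(sink)
```

A real deployment would swap each `deque` for a Kafka topic and each stage for a producer, a streams application, and a sink connector respectively.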


Welcome readers. In case you have landed here directly, I strongly suggest you go back and read through this link first.

Introduction to the problem :- In this blog, I would like to help you build a Machine Learning model based on the Decision Tree algorithm. Here, we shall work with a smaller dataset (taken from the archive). We shall first train our model using the given data and then perform multi-class classification using the built model.

Let’s begin by exploring the dataset first. Please note that there are 4 independent variables and…
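Although the excerpt cuts off here, the train-then-classify flow it describes can be sketched with scikit-learn (assuming that library is available). Since the blog's actual dataset isn't reproduced here, the well-known Iris dataset, which also has 4 independent variables and 3 target classes, stands in for it.

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

# Iris: 4 independent variables, 3 target classes (multi-class).
X, y = load_iris(return_X_y=True)

# Train a decision tree on the data, then use it to classify.
model = DecisionTreeClassifier(random_state=42)
model.fit(X, y)

# An unconstrained tree separates its own training data almost perfectly.
train_accuracy = model.score(X, y)
print(round(train_accuracy, 2))
```

In practice you would hold out a test split rather than scoring on the training data, but this shows the fit/score cycle end to end.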


Welcome readers.

Introduction to the problem :- In this blog, I would like to help you build a Machine Learning model based on the Decision Tree algorithm. Here, we shall work with a smaller dataset of diabetic people. We shall first train our model using the given data and then perform binary classification using the built model.

Fundamentals :- Here, our main agenda is to identify which attribute becomes the root node and what our splitting criterion will be for the nodes at further levels. We can then use the resulting Decision Tree (an if-else based rule engine) in…
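The root-node choice described above is typically made by comparing impurity: the attribute whose split produces the lowest weighted impurity wins. Below is a minimal pure-Python sketch using Gini impurity on a tiny, made-up diabetic-style table (not the blog's real dataset; the attribute names are invented for illustration).

```python
def gini(labels):
    """Gini impurity of a list of class labels: 1 - sum(p_i^2)."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for lbl in labels:
        counts[lbl] = counts.get(lbl, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def weighted_gini_after_split(rows, attr):
    """Weighted Gini impurity of the groups produced by splitting on attr."""
    groups = {}
    for row in rows:
        groups.setdefault(row[attr], []).append(row["diabetic"])
    n = len(rows)
    return sum(len(g) / n * gini(g) for g in groups.values())

# Toy, hypothetical records (invented for illustration).
rows = [
    {"glucose": "high", "active": "no",  "diabetic": 1},
    {"glucose": "high", "active": "no",  "diabetic": 1},
    {"glucose": "low",  "active": "no",  "diabetic": 0},
    {"glucose": "low",  "active": "yes", "diabetic": 0},
    {"glucose": "high", "active": "yes", "diabetic": 1},
    {"glucose": "low",  "active": "yes", "diabetic": 1},
]

# The attribute with the lowest weighted impurity becomes the root node.
scores = {a: weighted_gini_after_split(rows, a) for a in ("glucose", "active")}
root = min(scores, key=scores.get)
print(root, scores)
```

On this toy data, splitting on `glucose` yields the lower weighted impurity, so it would be chosen as the root node; the same comparison is then repeated within each branch for the further-level nodes.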


Introduction to the problem :-

In this blog, we shall work with one of the popular datasets, i.e. that of LendingClub. It’s a US peer-to-peer lending company, headquartered in San Francisco, California. It was the first peer-to-peer lender to register its offerings as securities with the Securities and Exchange Commission (SEC), and to offer loan trading on a secondary market. LendingClub is the world’s largest peer-to-peer lending platform.

Objective of the Blog :-

In this blog, given historical data on loans given out, along with information on whether or not the borrower defaulted (charged off), we shall build a model that can…
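The excerpt stops short of the model itself, but the charge-off prediction flow it describes can be sketched with scikit-learn on a few made-up loan rows (assuming that library is available); the feature names and values below are invented for illustration and are not real LendingClub data.

```python
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic, made-up loan rows (NOT real LendingClub data):
# features = [loan_amount, interest_rate, annual_income]
X = [
    [5000, 7.0, 90000], [30000, 24.0, 30000], [8000, 9.5, 75000],
    [25000, 22.0, 28000], [6000, 8.0, 85000], [28000, 23.5, 32000],
    [7000, 7.5, 95000], [27000, 21.0, 27000],
]
y = [0, 1, 0, 1, 0, 1, 0, 1]   # 1 = borrower charged off (defaulted)

# Hold out a test split, train on the rest, then predict defaults.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y)

model = DecisionTreeClassifier(random_state=42)
model.fit(X_train, y_train)
print(model.predict(X_test))
```

With the real dataset, the same pipeline would be preceded by substantial feature engineering (encoding loan grade, term, employment length, and so on).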


If you are landing here directly, it is recommended to visit this page first. In this section, we shall take a deep dive into AWS S3.

Amazon S3 is “infinitely scaling” storage, and therefore we don’t need to plan its capacity in advance. It is one of the building blocks of the AWS cloud. Many websites use Amazon S3 as their backbone, and many other AWS services use Amazon S3 as an integration component as well. Amazon S3 allows people to store objects (files) in S3 buckets (directories). Buckets must have globally unique names (unique across all accounts across the globe), but…
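Those globally unique bucket names also have to follow S3's naming rules. A small sketch of the core constraints (this is my own helper for illustration, not an AWS API; see the official AWS documentation for the full, current rule set):

```python
import re

def is_valid_bucket_name(name: str) -> bool:
    """Check the core S3 bucket-naming rules (simplified sketch)."""
    # 3-63 characters: lowercase letters, digits, dots and hyphens,
    # starting and ending with a letter or digit.
    if not re.fullmatch(r"[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]", name):
        return False
    # Must not be formatted like an IP address (e.g. 192.168.5.4).
    if re.fullmatch(r"(\d{1,3}\.){3}\d{1,3}", name):
        return False
    return True

print(is_valid_bucket_name("my-unique-bucket"))   # True
print(is_valid_bucket_name("Invalid_Bucket"))     # False: uppercase, underscore
```

Passing this check only means the name is well-formed; whether it is actually available still depends on no other account, anywhere in the world, having claimed it.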

aditya goel
