
Kafka

About Kafka

This Kafka training from Online Training equips you with the skills needed to become an Apache Kafka professional. Kafka is a real-time message broker that lets you publish and subscribe to streams of messages. Topics covered in this online training course include the Kafka API, creating Kafka clusters, and integrating Kafka with the Big Data Hadoop ecosystem, along with Spark, Storm and Maven integration.


What will you learn in this Kafka course?

  1. Kafka characteristics and salient features
  2. Kafka cluster deployment on Hadoop and YARN
  3. Understanding real-time Kafka streaming
  4. Introduction to the Kafka API
  5. Storing records with Kafka in a fault-tolerant way
  6. Producing and consuming messages from feeds like Twitter
  7. Solving Big Data problems in messaging systems
  8. Kafka's high throughput, scalability, durability and fault tolerance
  9. Deploying Kafka in real-world business scenarios


Who should take this Kafka course?

  • Big Data Hadoop Developers, Architects and other professionals
  • Testing Professionals, Project Managers, and Messaging and Queuing System professionals


What are the prerequisites for taking this Kafka course?

Anybody can take this training course, although a background in Java is beneficial.

Kafka Course Outline

What is Kafka – An Introduction

Understanding what Apache Kafka is, the various components and use cases of Kafka, and implementing Kafka on a single node.
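
To give a feel for the single-node setup, here is a minimal sketch that publishes one message and reads it back using the PyKafka client introduced later in the course; the broker address localhost:9092 and the topic name "test" are assumptions for illustration.

```python
from pykafka import KafkaClient

# Connect to a single Kafka broker (assumed to listen on localhost:9092).
client = KafkaClient(hosts="localhost:9092")
topic = client.topics[b"test"]  # assumed topic name for this sketch

# Publish one message synchronously.
with topic.get_sync_producer() as producer:
    producer.produce(b"hello from a single-node Kafka setup")

# Read messages back from the topic; stop after 2 seconds of inactivity.
consumer = topic.get_simple_consumer(consumer_timeout_ms=2000)
for message in consumer:
    if message is not None:
        print(message.offset, message.value)
```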

Multi-Broker Kafka Implementation

Learning Kafka terminology, deploying single-node Kafka with an independent ZooKeeper, adding replication in Kafka, working with partitioning and brokers, understanding Kafka consumers, the Kafka writes terminology, and various failure-handling scenarios in Kafka.
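
As a rough illustration of replication and partitioning, the sketch below uses PyKafka to print each partition's leader, replica set and in-sync replicas; it assumes two local brokers on ports 9092 and 9093 and a topic named "replicated-topic" created beforehand with a replication factor of 2.

```python
from pykafka import KafkaClient

# Assumed: two brokers on localhost:9092 and localhost:9093, and a topic
# "replicated-topic" created with replication factor 2.
client = KafkaClient(hosts="localhost:9092,localhost:9093")
topic = client.topics[b"replicated-topic"]

# Inspect per-partition metadata: which broker leads each partition,
# which brokers hold replicas, and which replicas are in sync.
for pid, partition in topic.partitions.items():
    print("partition", pid,
          "leader:", partition.leader.id,
          "replicas:", [b.id for b in partition.replicas],
          "isr:", [b.id for b in partition.isr])
```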

Multi-Node Cluster Setup

Introduction to multi-node cluster setup in Kafka, the various administration commands, leadership balancing and partition rebalancing, graceful shutdown of Kafka brokers and tasks, working with the Partition Reassignment Tool, cluster expansion, assigning custom partitions, removing a broker, and increasing the replication factor of partitions.

Integrating Flume with Kafka

Understanding the need for Kafka integration, successfully integrating Kafka with Apache Flume, and the steps to integrate Flume with Kafka as a source.

Kafka API

Detailed understanding of Kafka and Flume integration, deploying Kafka as a sink and as a channel, an introduction to the PyKafka API, and setting up the PyKafka environment.
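
After installing PyKafka (for example with pip install pykafka), a quick connectivity check along the lines of the sketch below can confirm that the environment is set up; the broker address is an assumption.

```python
from pykafka import KafkaClient

# Assumed broker address; adjust to match your environment.
client = KafkaClient(hosts="localhost:9092")

# If the connection succeeds, the client exposes the brokers and topics
# discovered from the cluster metadata.
for broker_id, broker in client.brokers.items():
    print("broker", broker_id, broker.host, broker.port)
print("known topics:", list(client.topics.keys()))
```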

Producers & Consumers

Connecting to Kafka using PyKafka, writing your own Kafka producers and consumers, writing a random JSON producer, writing a consumer to read messages from a topic, writing and working with a file-reader producer, and writing a consumer to store topic data in a file.
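
As a rough sketch of the exercises in this module, a random JSON producer and a consumer that appends topic data to a file might look like this with PyKafka; the broker address, topic name and output file are assumptions.

```python
import json
import random

from pykafka import KafkaClient

client = KafkaClient(hosts="localhost:9092")  # assumed broker address
topic = client.topics[b"json-events"]         # assumed topic name

# Random JSON producer: publish a handful of small JSON documents.
with topic.get_sync_producer() as producer:
    for i in range(10):
        record = {"id": i, "value": random.randint(0, 100)}
        producer.produce(json.dumps(record).encode("utf-8"))

# Consumer that stores topic data in a file (assumed output path),
# stopping after 2 seconds with no new messages.
consumer = topic.get_simple_consumer(consumer_timeout_ms=2000)
with open("json-events.txt", "a") as out:
    for message in consumer:
        if message is not None:
            out.write(message.value.decode("utf-8") + "\n")
```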
