This Kafka training from Online Training equips you with the skills needed to become an Apache Kafka professional. Kafka is a distributed, real-time message broker that lets you publish and subscribe to streams of records. Topics covered in this online training course include the Kafka API, creating Kafka clusters, and integrating Kafka with the Big Data Hadoop ecosystem as well as with Spark, Storm and Maven.
What will you learn in this Kafka course?
- Kafka characteristics and salient features
- Kafka cluster deployment on Hadoop and YARN
- Understanding real-time Kafka streaming
- Introduction to the Kafka API
- Storing records with Kafka in a fault-tolerant way
- Producing and consuming messages from feeds like Twitter
- Solving Big Data problems in messaging systems
- Kafka's high throughput, scalability, durability and fault tolerance
- Deploying Kafka in real-world business scenarios
Who should take this Kafka course?
- Big Data Hadoop Developers, Architects and other professionals
- Testing Professionals, Project Managers, and Messaging and Queuing System professionals
What are the prerequisites for taking this Kafka course?
Kafka Course Outline
What is Kafka – An Introduction
Understanding what Apache Kafka is, the various components and use cases of Kafka, and implementing Kafka on a single node.
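As a taste of the single-node implementation covered here, the standard quickstart looks roughly like this (assuming a Kafka distribution unpacked locally with its bundled ZooKeeper; the topic name `test` is just an example, and these commands only work against that local installation):

```shell
# Start ZooKeeper, then a single Kafka broker (each in its own terminal)
bin/zookeeper-server-start.sh config/zookeeper.properties
bin/kafka-server-start.sh config/server.properties

# Create a topic with one partition and no replication
bin/kafka-topics.sh --create --zookeeper localhost:2181 \
  --replication-factor 1 --partitions 1 --topic test

# Publish and read a few messages from the console
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning
```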
Multi Broker Kafka Implementation
Learning the Kafka terminology, deploying single-node Kafka with an independent ZooKeeper, adding replication in Kafka, working with Partitioning and Brokers, understanding Kafka Consumers, the Kafka Writes terminology, and various failure-handling scenarios in Kafka.
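Running multiple brokers on one machine is mostly a matter of per-broker configuration; a sketch of a second broker's properties file (the file name, port and log path are illustrative):

```properties
# config/server-1.properties -- second broker on the same host
broker.id=1
listeners=PLAINTEXT://:9093
log.dirs=/tmp/kafka-logs-1
zookeeper.connect=localhost:2181
```

A topic created with `--replication-factor 2` will then keep a replica on each broker, which is what makes the failure-handling scenarios above survivable.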
Multi Node Cluster Setup
Introduction to multi-node cluster setup in Kafka, the various administration commands, leadership balancing and partition rebalancing, graceful shutdown of Kafka Brokers and tasks, working with the Partition Reassignment Tool, cluster expansion, assigning Custom Partitions, removing a Broker, and increasing the Replication Factor of Partitions.
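The Partition Reassignment Tool mentioned above is driven by a JSON plan; a minimal one (the topic name, partition number and broker ids are illustrative) might look like:

```json
{
  "version": 1,
  "partitions": [
    {"topic": "test", "partition": 0, "replicas": [1, 2]}
  ]
}
```

Passing such a file to `kafka-reassign-partitions.sh` with `--execute` moves that partition's replicas onto brokers 1 and 2 (which is also how the replication factor of an existing partition is increased), and `--verify` reports when the move has completed.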
Integrate Flume with Kafka
Understanding the need for Kafka integration, successfully integrating it with Apache Flume, and the steps for integrating Flume with Kafka as a Source.
Detailed understanding of the Kafka and Flume integration, deploying Kafka as a Sink and as a Channel, introduction to the PyKafka API, and setting up the PyKafka environment.
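As an illustration of deploying Kafka as a Flume Sink, a minimal agent configuration might look like the following (the agent name `a1`, the netcat test source and the topic name are assumptions; the `kafka.*` property names are those used by Flume 1.7+):

```properties
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# A simple netcat source to feed test events into the agent
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444
a1.sources.r1.channels = c1

a1.channels.c1.type = memory

# Kafka sink: every Flume event becomes a message on the topic
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.kafka.bootstrap.servers = localhost:9092
a1.sinks.k1.kafka.topic = flume-events
a1.sinks.k1.channel = c1
```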
Producers & Consumers
Connecting to Kafka using PyKafka, writing your own Kafka Producers and Consumers, writing a random JSON Producer, writing a Consumer to read messages from a topic, writing and working with a File Reader Producer, and writing a Consumer to store topic data in a file.
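A sketch of the random JSON Producer and matching Consumer described above, using PyKafka. The host, topic name and record fields are illustrative, and the broker-facing calls are shown as comments because they need a live cluster:

```python
import json
import random
import uuid

def make_random_record():
    """Build one random JSON message, encoded to bytes as Kafka expects."""
    record = {"id": str(uuid.uuid4()), "temperature": random.randint(-10, 40)}
    return json.dumps(record).encode("utf-8")

def parse_record(raw):
    """Decode a message fetched from a topic back into a dict."""
    return json.loads(raw.decode("utf-8"))

# Producing and consuming require a running broker, so that part is a sketch
# (the host and topic name are hypothetical):
#
# from pykafka import KafkaClient
# client = KafkaClient(hosts="127.0.0.1:9092")
# topic = client.topics[b"test"]
# with topic.get_sync_producer() as producer:
#     producer.produce(make_random_record())
# for message in topic.get_simple_consumer():
#     print(parse_record(message.value))
```

The JSON round-trip is kept in its own small functions so the same `parse_record` can serve both the topic-reading Consumer and the file-storing Consumer from the outline.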