Apache Flume Tutorial


What do you understand by the term Apache Flume?

Apache Flume is a tool used for data ingestion, that is, for moving data from various data producers (such as web servers) into Hadoop. It is a standard, simple, robust, flexible, and extensible tool. More precisely, Apache Flume is a distributed, reliable, and available service for efficiently collecting, aggregating, and moving large amounts of streaming data into the Hadoop Distributed File System (HDFS).
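In practice, a Flume agent is described in a simple properties file that names its sources, channels, and sinks and wires them together. The sketch below is only an illustrative, minimal configuration (the agent name a1, the component names r1/c1/k1, the netcat source on port 44444, and the HDFS path are placeholders you would replace with values for your own cluster):

# Minimal single-agent sketch with one source, one channel, one sink
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# Source: listens for newline-separated text on a local port (placeholder port)
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# Channel: buffers events in memory between the source and the sink
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000

# Sink: writes events into HDFS (replace the path with your own namenode and directory)
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://namenode:8020/flume/events
a1.sinks.k1.hdfs.fileType = DataStream

# Wire the source and the sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1

With Flume installed, an agent defined this way is typically started with a command along the lines of: flume-ng agent --conf-file example.conf --name a1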

The tutorial explains the basics of Apache Flume and how to use it in practice.

What are the prerequisites, and who is the intended audience for learning Apache Flume?

A good working knowledge of Hadoop and HDFS commands is required to follow the concepts of Apache Flume.

This tutorial is mainly intended for professionals who would like to learn how to use Apache Flume to transfer log and streaming data from various web servers to HDFS or HBase.

Apache Flume Tutorial: List of Topics
