The Hadoop Big Data Training course helps you learn the core techniques and concepts of Big Data and the Hadoop ecosystem. It equips you with in-depth knowledge of writing code using the MapReduce framework and managing large data sets with HBase. The topics covered in this course mainly include Hive, Pig, and the setup of a Hadoop cluster.
|Start Date|Timing|Class Days|Seats Left|
|---|---|---|---|
|September 27th, 2014|09:30 AM - 12:30 PM / 7:00 PM - 10:00 PM|Sat & Sun|15|
|October 06th/07th, 2014|10:00 PM - 11:30 PM / 7:30 AM - 09:00 AM|Mon-Fri / Tue-Sat|15|
|October 18th, 2014|09:30 AM - 12:30 PM / 7:00 PM - 10:00 PM|Sat & Sun|15|
- Understand Big Data and Hadoop ecosystem
- Work with Hadoop Distributed File System (HDFS)
- Write MapReduce programs and implement HBase
- Write Hive and Pig scripts
- Knowledge of programming in C++, Java, or another object-oriented programming language is preferred. If you lack this background, you can enroll in our Java course free of cost to acquire the skills needed to learn Hadoop.
- A 64-bit or 64-bit-ready PC/laptop (Intel Core 2 Duo or above)
- 8 GB RAM
- 80 GB HDD
What is the Big Data problem?
Big Data refers to sets of structured and unstructured data that are complex in nature and growing exponentially with each passing day. Organizations face a major challenge in storing and utilizing this enormous volume of data, and the challenge is compounded worldwide by a serious dearth of skilled programmers.
"The United States alone faces a shortage of 140,000 to 190,000 people with analytical expertise and 1.5 million managers and analysts with the skills to understand and make decisions based on the analysis of big data."
Here’s the Holy Grail
Hadoop is a game changer for companies working with Big Data: it brings together large pools of data and stores and analyzes them at scale. Big enterprises like Amazon and IBM have embraced the technology to make more accurate analyses and better decisions.
Grab the opportunity
Learning Hadoop gives you the opportunity to build your career in the field of Big Data, either as a Hadoop Administrator or a Hadoop Developer.
Hurry up to build a rewarding career in the world’s most powerful business tool!
VirtualBox/VMware
Basics, Installations, Backups, Snapshots
Why Hadoop, Scaling, Distributed framework, Hadoop vs. RDBMS, Brief history of Hadoop, Problems with traditional large-scale systems, Requirements for a new approach, Anatomy of a Hadoop cluster, Other Hadoop ecosystem components
Pseudo-distributed mode, Cluster mode, Installation of Java and Hadoop, Hadoop configuration, Hadoop processes (NameNode, Secondary NameNode, JobTracker, DataNode, TaskTracker), Temporary directory, Web UI, Common errors when running a Hadoop cluster and their solutions
HDFS - Hadoop Distributed File System
HDFS design and architecture, HDFS concepts, Interacting with HDFS using the command line, Data flow, Blocks, Replication
NameNode, Secondary NameNode, JobTracker, TaskTracker, DataNode
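Alongside shell commands such as `hadoop fs -put` and `hadoop fs -cat`, HDFS can also be driven from Java through the `FileSystem` API. A minimal sketch, assuming a configured cluster; the path `/user/demo/hello.txt` is purely illustrative:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsExample {
  public static void main(String[] args) throws Exception {
    // Picks up the NameNode address from core-site.xml on the classpath
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);

    // Write a small file; HDFS splits files into blocks and replicates each block
    Path file = new Path("/user/demo/hello.txt"); // illustrative path
    FSDataOutputStream out = fs.create(file, true); // true = overwrite if present
    out.writeBytes("Hello HDFS\n");
    out.close();

    // Read the file back through the same API
    BufferedReader in = new BufferedReader(new InputStreamReader(fs.open(file)));
    System.out.println(in.readLine());
    in.close();
  }
}
```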
Developing MapReduce applications, Phases in the MapReduce framework, MapReduce input and output formats, Advanced concepts, Sample applications, Combiner
Writing a MapReduce Program
The MapReduce flow, Examining a sample MapReduce program, Basic MapReduce API concepts, Driver code, Mapper, Reducer, Hadoop’s streaming API, Using Eclipse for rapid development, Hands-on exercise, New MapReduce API
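To make the driver, Mapper, and Reducer roles concrete, here is the classic WordCount program written against the new (`org.apache.hadoop.mapreduce`) API. It is the standard illustration, not necessarily the exact program used in class:

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emit (word, 1) for every token in the input line
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer: sum the counts for each word
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  // Driver: configure and submit the job
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = new Job(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // combiner reuses the reducer
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```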
Common MapReduce Algorithms
Sorting and Searching, Indexing, Term Frequency – Inverse Document Frequency, Word Co-occurrence, Hands-on exercise
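As one illustration, the "pairs" formulation of word co-occurrence needs only a different mapper; the emitted pairs can then be summed by the same kind of counting reducer used in WordCount. The class name here is hypothetical:

```java
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// "Pairs" approach: emit ((word, neighbour), 1) for every ordered pair of
// words that appear together in the same line; a sum reducer then counts
// how often each pair co-occurs.
public class CooccurrenceMapper
    extends Mapper<Object, Text, Text, IntWritable> {
  private static final IntWritable ONE = new IntWritable(1);
  private final Text pair = new Text();

  @Override
  protected void map(Object key, Text value, Context context)
      throws IOException, InterruptedException {
    String[] words = value.toString().split("\\s+");
    for (int i = 0; i < words.length; i++) {
      for (int j = 0; j < words.length; j++) {
        if (i != j && !words[i].isEmpty() && !words[j].isEmpty()) {
          pair.set(words[i] + "," + words[j]);
          context.write(pair, ONE);
        }
      }
    }
  }
}
```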
Writing advanced MapReduce programs
Building multi-value Writable data, Accessing and using counters, Partitioner (HashPartitioner), Hands-on exercises
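For example, a custom `Partitioner` can replace the default `HashPartitioner` to control which reducer receives which keys; the driver registers it with `job.setPartitionerClass(...)`. A minimal sketch, where the partition-by-first-letter scheme is just an illustration:

```java
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Partitioner;

// Routes each key to a reducer based on its first letter, instead of the
// default HashPartitioner's hash(key) mod numReducers scheme.
public class FirstLetterPartitioner extends Partitioner<Text, IntWritable> {
  @Override
  public int getPartition(Text key, IntWritable value, int numPartitions) {
    if (key.getLength() == 0) {
      return 0; // empty keys all go to the first reducer
    }
    int first = Character.toLowerCase(key.toString().charAt(0));
    return (first % numPartitions + numPartitions) % numPartitions; // non-negative
  }
}
```

Counters, for their part, are incremented through the task context, e.g. `context.getCounter("Quality", "BAD_RECORDS").increment(1)` inside a map() or reduce() method; Hadoop aggregates the values across all tasks and reports them in the job summary.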
Hadoop Programming Languages
HIVE: Introduction, Installation, Configuration, Interacting with HDFS using HIVE, MapReduce programs through HIVE, HIVE commands, Loading, Filtering, Grouping, Data types, Operators, Joins, Groups, Sample programs in HIVE
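One common way to run HIVE commands programmatically is over JDBC against HiveServer2. A minimal sketch, assuming HiveServer2 listens on localhost:10000 and that the hypothetical local input file /tmp/pokes.txt exists:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcExample {
  public static void main(String[] args) throws Exception {
    // HiveServer2 JDBC driver; server address is an assumption
    Class.forName("org.apache.hive.jdbc.HiveDriver");
    Connection con = DriverManager.getConnection(
        "jdbc:hive2://localhost:10000/default", "", "");
    Statement stmt = con.createStatement();

    // HiveQL statements are compiled into MapReduce jobs behind the scenes
    stmt.execute("CREATE TABLE IF NOT EXISTS pokes (foo INT, bar STRING)");
    stmt.execute("LOAD DATA LOCAL INPATH '/tmp/pokes.txt' "
        + "OVERWRITE INTO TABLE pokes");

    ResultSet rs = stmt.executeQuery(
        "SELECT bar, COUNT(*) FROM pokes GROUP BY bar");
    while (rs.next()) {
      System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
    }
    con.close();
  }
}
```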
PIG: Basics, Configuration, Commands, Loading, Filtering, Grouping, Data types, Operators, Joins, Groups, Sample programs in PIG
What is HBase, HBase architecture, HBase API, Managing large data sets with HBase, Using HBase in Hadoop applications.
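A minimal sketch of the classic HBase Java client API from this era, assuming a table `users` with a column family `info` has already been created (for instance via the HBase shell):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseExample {
  public static void main(String[] args) throws Exception {
    // Reads the ZooKeeper quorum etc. from hbase-site.xml on the classpath
    Configuration conf = HBaseConfiguration.create();
    HTable table = new HTable(conf, "users"); // table name is an assumption

    // Write one cell: row key -> column family:qualifier -> value
    Put put = new Put(Bytes.toBytes("row1"));
    put.add(Bytes.toBytes("info"), Bytes.toBytes("name"), Bytes.toBytes("Alice"));
    table.put(put);

    // Read the cell back by row key
    Get get = new Get(Bytes.toBytes("row1"));
    Result result = table.get(get);
    byte[] value = result.getValue(Bytes.toBytes("info"), Bytes.toBytes("name"));
    System.out.println(Bytes.toString(value));

    table.close();
  }
}
```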
Integrating Hadoop into the Enterprise Workflow
Integrating Hadoop into an Existing Enterprise, Loading Data from an RDBMS into HDFS by Using Sqoop, Managing Real-Time Data Using Flume.
Ques 1. What if I miss the Hadoop class?
Ans. All classes are recorded automatically. You can access class recordings in your WizIQ account as many times as you want.
Ques 2. Does this online Hadoop course include hands-on training?
Ans. The tutor will provide regular hands-on practice assignments for gaining practical exposure.
Ques 3. I am not a programmer but still want to learn Hadoop. How can I gain the necessary knowledge of OOP?
Ans. You can enroll for our Java course free of cost to acquire the necessary skills to learn Hadoop.
Ques 4. What is the minimum internet speed required to attend the Hadoop live classes?
Ans. An internet speed of 1 Mbps is recommended for attending the Hadoop live classes. However, students can attend classes on a slower connection too, though performance can't be guaranteed.
Ques 5. What should I do if I encounter any platform problems during the online course?
Ans. We have a 24x7 support team to assist you with any platform-related issues. We also conduct a live technical demo before the course starts to familiarize you with the WizIQ Virtual Platform and to check that your audio and video devices work.
Ques 6. For how long is access to the class recordings available?
Ans. You can access the online class recordings for 6 months, reviewing and revising any number of times.
Ques 7. What are the payment options?
Ans. You can make the payment through Debit Card, Credit Card, Netbanking or PayPal account.