Developer needs to work with Hadoop distribution platforms such as Hortonworks, Cloudera, MapR, and others
Take end-to-end responsibility for the Hadoop life cycle in the organization
Act as the bridge between data scientists, engineers, and organizational needs
Perform in-depth requirement analysis and select the appropriate work platform
Setup and development of Hadoop architecture and HDFS is a must
Take ownership of MapReduce, Spark, Scala, Storm, and Java-related jobs
Knowledge of the MapReduce framework, Storm, and Kafka with Java
Working knowledge of the Linux operating system (Red Hat, CentOS 7)
Must have worked with an ecosystem manager such as Hortonworks Ambari
Knowledge of Spark with Scala will be an added advantage
Job Classification
Industry: IT Services & Consulting
Functional Area: IT Software - Other
Role Category: Programming & Design
Role: Programming & Design
Employment Type: Full time
Education
Under Graduation: Any Graduate
Post Graduation: Any Postgraduate