Desired Candidate Profile
Role & Responsibilities:
Solution Development
Learn and develop solutions on the latest open-source and enterprise-level big data technologies and platforms
Hands-on development of solutions on the big data stack (e.g., Hadoop, Spark, EMR)
Expertise and significant hands-on (design and development) experience in at least three big data technologies
Experience in managing and building solutions on cloud environments (AWS, Azure, etc.)
Project scoping, analysis, installation, testing and support of various big data platforms
Design, develop, and implement analytics algorithms, workflows, and software products for both streaming and batch data on a big data / cloud-based platform
Integration of diverse data sources on the big data platform
Troubleshoot and resolve technical issues
Demonstrate technical leadership
Delivery
Ensure delivery of technical engagements that exceeds customer expectations.
Provide expert consultation and best practices to customers and internal stakeholders.
Prepare estimates and pricing for customer and internal projects
Lead small to medium-sized projects and manage their execution by planning project activities, assigning tasks to the team, tracking and monitoring those tasks, and working in close coordination with the project manager.
Training
Provide training and support for internal employees and customers
Business Development
Actively contribute to pre-sales and business development by designing custom solutions per customer requirements
Author solutions for current and potential customer concerns, including responding to RFPs and developing white papers.
Specifications
Requirements
8 to 15 years of overall experience in technical leadership across business technology and big data / cloud technology areas. Must have designed and implemented big data solutions as lead architect for at least two end-to-end, large-scale systems.
Should have successfully completed 1-2 end-to-end implementations using big data technologies
Experience in architecting, optimizing, & maintaining large enterprise systems
Experience with the Hadoop ecosystem (e.g., Sqoop, Flume, ZooKeeper, Oozie, Mahout, HBase, Sentry, HCatalog, Hue, Drill, Impala, Tez, Ambari, Chukwa)
Expert knowledge of SQL and PL/SQL
Expert knowledge of at least a couple of programming languages (e.g., Java, Python, PHP)
Knowledge of data analysis and logical data modelling
Business analysis capability to work with HLDD, LLDD, and other design documents
Experience in writing MapReduce programs, UDFs, Hive queries, and Pig scripts
Should be able to create POCs for each problem, scenario, or case
Good Linux and shell scripting background.
Experience with AWS cloud.
Experience with at least one NoSQL technology (e.g., MongoDB, Cassandra, HBase, Couchbase)
Good understanding of Java (basic is a must; advanced is preferable)
Good understanding of Hadoop design principles such as YARN, and of the factors that affect distributed system performance, including hardware and network considerations
Experience in providing infrastructure recommendations, performing capacity planning, and developing utilities to better monitor clusters
Experience managing large clusters with huge volumes of data
Experience with cluster maintenance tasks such as adding and removing nodes, and cluster monitoring
Experience with cluster troubleshooting, including managing and reviewing Hadoop log files
Plus/Preferable
Experience with Storm and its ecosystem (e.g., Kafka, Scribe)
Experience with Spark and its ecosystem (Spark Streaming, Spark SQL, MLlib, GraphX)
Understands how security models based on Kerberos and enterprise LDAP products work, and can help implement them
Knowledge of configuration management / deployment tools such as Puppet or Chef, and of setting up cluster monitoring and alerting tools such as Ganglia and Nagios
Experience with any Graph Database Engine.
Behavioral Competency
Proven ability to handle multiple projects while meeting deadlines and documenting progress towards those deadlines.
Excellent communication skills (must be able to interface with both technical and business leaders in the organization)
Self-starter with the ability to provide leadership
Strong analytical skills to solve and model complex business requirements are a plus
Qualifications
Bachelor's degree in computer science.
Experience working on Pharma/Life Sciences projects will be a plus
A minimum of 5 years of relevant technology experience in designing and developing solutions.
Work Location: Gurgaon/Noida, India
Education:
UG: B.Tech/B.E. - Any Specialization
PG: Other
Contact Details:
Company: Axtria India Private Limited
Website: https://www.axtria.com/
Reference Id: Big data Developers
Keyskills:
hive
mapreduce
cassandra
couchbase
sqoop
impala
aws
mongodb
nosql
pig