Desired Candidate Profile
Responsibilities:
Identify the key areas/use cases where Paxata can offer real business value and work closely with the business and technology teams to enable them
Engage with IT and Business executives for in-depth discussions about:
o Paxata's compatibility with any new or existing platforms
o How Paxata works with their enterprise Big Data strategies (including classic relational databases and DWH)
Implement large projects end to end on complex Big Data ecosystems, using Paxata to refine and transform data
Co-conduct end-user training for Paxata and work closely with client stakeholders to develop a Paxata community
Participate in consulting and architecting how Paxata fits into the wider database ecosystem
Work closely with Paxata in customising the tool using the Java SDK to achieve integrations and address client-specific scenarios
Performance tuning and troubleshooting
Skills And Requirements
8-10 years of data engineering/ETL background, with hands-on experience in Apache Hadoop (including HDFS, Hive and Spark) integrated with Paxata as the data preparation tool
Minimum 5 years of experience working with Core Java
Strong data analysis skills to support data quality analysis and management
A background in the banking and financial industry is a plus
Good oral and written communication skills, including the ability to lead customer discussions
Expertise in one or more of the following: Business Intelligence, Information Management, Data Governance and/or Master Data Management
Collegial, can-do and fun attitude, because having a good time is as important as understanding the use cases
Willing to learn and get trained in Paxata, a data preparation tool
Work Location - Chennai & Bangalore
Mode of Employment - Permanent
Contact Details:
Keyskills:
Data Governance
Spark
Hadoop
Data Quality
Hive
HDFS
ETL
Big Data
Business Intelligence
Database
Informatica
Core Java
Java