Interview Mode: 1 Virtual & 1 Face-to-Face (Mandatory)
Role & responsibilities
Support the design, development, and maintenance of data pipelines using Azure Data Factory (ADF).
Collaborate with senior data engineers and architects to learn and contribute to the design and implementation of scalable, performant, and secure data integration solutions.
Optimize data pipelines for performance, scalability, and cost efficiency under the guidance of senior team members.
Develop and maintain technical documentation, including pipeline diagrams, data flow diagrams, and code documentation.
Participate in data-related aspects of pre-sales activities, including solution design, proposal development, and customer presentations.
Stay current with emerging technologies and industry trends related to data integration and management in Azure, and contribute to improving existing solutions and implementing new ones.
Preferred candidate profile
Bachelor's or master's degree in computer science, engineering, or a related field.
3-5 years of experience with Azure Data Factory, including data ingestion, transformation, and orchestration.
Experience with data integration and ETL processes, including writing SQL queries and scripts.
Good problem-solving and analytical skills, with the ability to identify and resolve technical issues under the guidance of senior team members.
Good communication and collaboration skills, with the ability to work effectively in a team environment.
Willingness to learn and adapt to new technologies and tools.
Key skills: SQL queries, Azure Data Lake, Azure Data Factory, ETL
Hexaware BPS is a unit of Hexaware Technologies Ltd. We are currently staffed at 2000+ people across Navi Mumbai (Mahape), Chennai, Nagpur, and the US. Ranked 15th in the NASSCOM Top 20 IT Software & Services Exporters from India, we also rank among the Top 20 Best IT Employers in India by DQ-IDC fo...