Skill: Snowflake DBT
Experience: 4 to 16 years
Location: Hyderabad
Walk-in date: 8th Feb 25

Job Description:
- Design and implement robust data pipelines using DBT and Snowflake.
- Collaborate with analytics and data science teams to understand data requirements.
- Develop and manage ETL processes to extract, transform, and load data efficiently.
- Maintain and optimize existing data models and pipelines for performance and reliability.
- Ensure data quality by implementing verification and validation checks (see the illustrative sketch after this list).
- Perform data transformations and aggregations using SQL and other tools.
- Work on performance tuning for data processing jobs.
- Document data engineering practices, procedures, and workflows for team reference.
- Participate in code reviews to ensure best practices are followed.
- Monitor data pipeline performance and troubleshoot issues promptly.
- Stay updated on industry trends and emerging technologies related to data engineering.
- Design and implement data security measures to protect sensitive information.
- Assist in migration efforts from on-premises to cloud-based data solutions.
- Contribute to data governance initiatives and policies.
- Support business intelligence solutions through effective data modeling.
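As a minimal sketch of the kind of verification and validation check mentioned above, the following Python snippet runs a duplicate-key check against a Snowflake table using the official snowflake-connector-python package. The table, column, warehouse, and database names are hypothetical placeholders, not part of the posting; dbt schema tests would typically automate checks like this inside the pipeline itself.

```python
# Illustrative sketch only: a simple post-load data quality check against Snowflake.
# All object names (COMPUTE_WH, ANALYTICS, ORDERS, ORDER_ID) are hypothetical.
import os
import snowflake.connector


def check_no_duplicate_keys(conn, table: str, key_column: str) -> None:
    """Raise an error if the key column contains duplicate values."""
    sql = f"SELECT COUNT(*) - COUNT(DISTINCT {key_column}) FROM {table}"
    duplicates = conn.cursor().execute(sql).fetchone()[0]
    if duplicates:
        raise ValueError(f"{table}.{key_column} has {duplicates} duplicate value(s)")


if __name__ == "__main__":
    # Credentials are read from environment variables; adjust to your own setup.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="COMPUTE_WH",   # hypothetical warehouse
        database="ANALYTICS",     # hypothetical database
        schema="PUBLIC",
    )
    try:
        check_no_duplicate_keys(conn, "ORDERS", "ORDER_ID")
        print("Data quality check passed.")
    finally:
        conn.close()
```

In practice, a check like this would run as a step after each load (or be expressed as a dbt test) so that duplicate or missing keys fail the pipeline before downstream models consume the data.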
Employment Category:
Employment Type: Full time
Industry: IT Services & Consulting
Role Category: Not Specified
Functional Area: Not Specified
Role/Responsibilities: Snowflake DBT - Hyderabad Job in Cognizant