Hi there,
We have an opening for a PySpark Developer with 6 to 9 years of total experience at our Pune location.
Job Responsibilities
1. 2+ years of experience developing distributed Big Data applications using Spark and Elasticsearch on MapR Hadoop
2. Strong work experience with the Hadoop distributed computing framework (including Apache Spark)
3. Very strong command of Python programming and scripting
Contact: Varun (va***************a@ca******i.com)
Capgemini Technology Services India Limited: A global leader in consulting, technology services and digital transformation, Capgemini is at the forefront of innovation, addressing the entire breadth of clients' opportunities in the evolving world of cloud, digital and platforms. Building on its str...