Average salary: Rs 1,318,570 per year
- ...Key Skills & Responsibilities: Strong expertise in PySpark and Apache Spark for batch and real-time data processing. Experience in designing and implementing ETL pipelines, including data ingestion, transformation, and validation. Proficiency in Python for scripting...
- ...Skill: PySpark Developer/Consultant. Job Locations: Chennai, Pune. Notice Period: Any. Experience: 3-12 years. Job Description: PySpark Developer. Mandatory Skills: Apache Spark, Big Data Hadoop Ecosystem, SparkSQL, Python. A good professional...
- About Client: Our client is a French multinational information technology (IT) services and consulting company, headquartered in Paris, France. Founded in 1967, it has been a leader in business transformation for over 50 years, leveraging technology to address a wide range... (Contract work, Hybrid work, Immediate start)
- ...Join us as an Application Engineer - PySpark Developer at Barclays, where you'll take part in the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer... (Immediate start)
- ...Join us as a Data Engineer - PySpark Developer at Barclays, where you'll take part in the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences... (Permanent employment, Immediate start)
- ...Job Type: Full-Time. Experience: 4–7 Years. Key Responsibilities: Design, develop, and maintain scalable data pipelines and ETL workflows. Work with large datasets using PySpark, Python, and SQL to ensure efficient data transformation and integration... (Full time)
- ...gaps and disruptions of the future. We are looking forward to hiring PySpark professionals in the following areas. Job Description: Role: Data Engineer. Roles and Responsibilities: Design, develop, and maintain scalable data pipelines using Spark (PySpark or Spark... (Flexible hours)
- ...Primary skill required for the data engineer role is hands-on experience with Spark, particularly PySpark. Candidates must have experience in coding data pipelines in Spark and should be comfortable with Python for automation. Other skills like Azure Airflow and Databricks... (Full time)
- ...Sr. Azure Data Engineer (Must: Databricks + PySpark). Experience: 5-8 years. Shift: 11:00 am to 8 pm (candidate has to be flexible; if required, the resource should be flexible to overlap with US business hours). Remote Contract: 6 months. Budget: 120 LPM. Advance... (Remote job, Contract work, US shift, Flexible hours, Shift work)
- Description: Job Title: PySpark/Scala Developer. Functional Skills: Experience in the Credit Risk/Regulatory Risk domain. Technical Skills: ... ...Learning Techniques. Job Description: - 5+ years of experience with developing/fine-tuning and implementing programs/applications - Using Python... (Work at office)
- ...solutions across 10 countries and 12 industry verticals in the last four years. About the Role: We are hiring a Senior Developer with strong expertise in PySpark, Machine Learning, Generative AI, SQL, Data Warehousing, and Microsoft Fabric. If you're passionate about building... (Remote job)
- ...with 4 to 7 years of experience, particularly strong in Python and PySpark/Spark. The ideal candidate will have hands-on expertise in ETL... ...insights to support business decisions. Responsibilities: - Develop and maintain scalable ETL processes for data extraction, transformation... (Permanent employment, Full time, Work at office)
- ...products space; curiosity about the bigger picture of building a company, product development and its people. Roles and Responsibilities: - Develop and manage robust ETL pipelines using Apache Spark (Scala). - Understand Spark concepts, performance optimization techniques and...
- ...Ability to write clear and concise technical documents. - Banking domain experience will be an advantage. - Ability to analyze data and develop strategies for populating data lakes. Responsibility of / Expectations from the Role: 1. Understand business requirement & banking domain...
- ...warehousing, and data analytics. Work with AWS and Databricks to design, develop, and maintain data pipelines and data platforms. Build... ...Responsibilities: - Work extensively on Databricks and its modules using PySpark for data processing. - Design, develop, and optimize scalable... (Full time, Shift work)
- ...Databricks. - Experience in building and optimizing data pipelines, architectures, and data sets. - Excellent experience in Scala or Python, PySpark, and SQL. - Ability to troubleshoot and optimize complex queries on the Spark platform. - Knowledgeable on structured and unstructured...
- ...should be well versed in coding, Spark Core, and data ingestion using Azure. Experience with core Azure DE skills and coding skills (PySpark, Python, and SQL). Key Responsibilities: - Create and maintain data storage solutions including Azure SQL Database, Azure Data Lake, and...
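Most of the listings above ask for the same core work: batch ETL pipelines in PySpark covering ingestion, transformation, and validation, with the output landed in a data lake or warehouse. For readers new to the stack, here is a minimal sketch of that kind of pipeline; the paths, column names, and validation rule are hypothetical placeholders, not taken from any posting.

# Minimal PySpark batch ETL sketch: ingest -> transform -> validate -> load.
# All paths and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

# Ingest: read raw CSV data (placeholder path).
raw = spark.read.option("header", True).csv("/data/raw/transactions.csv")

# Transform: cast types and derive a partition column.
cleaned = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("event_date", F.to_date("event_ts"))
)

# Validate: keep rows with required fields populated; count rejects for monitoring.
valid = cleaned.filter(F.col("amount").isNotNull() & F.col("event_date").isNotNull())
print(f"rejected rows: {cleaned.count() - valid.count()}")

# Load: write partitioned Parquet to the curated zone (placeholder path).
valid.write.mode("overwrite").partitionBy("event_date").parquet("/data/curated/transactions")

spark.stop()

Roles that mention Databricks or Azure typically run this same pattern on managed clusters, swapping the file paths for Delta tables or ADLS locations.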
