Average salary: Rs 1,389,886 per year
- ...in Confluent Kafka. 1+ years of experience in API development; 2+ years in an Agile framework; 4+ years of strong experience in Python and PySpark development; 4+ years of shell scripting to develop ad hoc jobs for data importing/exporting. Dataform, Dataproc, PySpark, Python...
- ...Description: We are seeking an experienced PySpark Developer to join our dynamic team in India. The ideal candidate will have a strong background in data engineering and a passion for building scalable data solutions. Responsibilities: develop and maintain data processing...
- ...Key responsibilities: Design, develop, and maintain ETL pipelines using Python, PySpark, and SQL on distributed data platforms. Write clean, efficient, and scalable PySpark code for big data transformation and processing. Develop reusable scripts and tools...
- ...and as such all normal working days must be carried out in India. Job description: Join us as a Software Engineer, Tableau and PySpark. This is an opportunity for a driven Software Engineer to take on an exciting new career challenge. Day-to-day, you'll build a wide... (Permanent employment, Full time, Hybrid work, Flexible hours)
- ...with data integration teams. 3+ years of in-depth experience developing data pipelines within an Apache Spark environment (preferably Databricks)... ...of data warehouse modelling techniques. Strong knowledge of PySpark, Python, SQL, and distributed computing principles. Strong...
- Join us as a Software Engineer, PySpark and AWS. This is an opportunity for a driven Software Engineer to take on an exciting new career challenge. Day-to-day, you'll be engineering and maintaining innovative, customer-centric, high-performance, secure, and robust solutions... (Permanent employment, Full time)
- ...JavaScript, Scala. Job description: 2-3 years of relevant work experience. To work as a hands-on AWS Databricks, Python, and PySpark developer. Work with stakeholders for regular updates, requirement understanding, and design discussions. Hands-on experience in designing...
- ...Engineer with a strong background in Java and hands-on expertise in Python/PySpark, Big Data technologies, and Google Cloud Platform (GCP). The ideal candidate should be capable of designing, developing, and maintaining large-scale data processing systems, ensuring data...
- ...skills. ~ Degree in Computer Science or similar. ~ 3-5 years of experience in data pipeline development. ~ 3-5 years of experience in PySpark/Databricks. ~ 3-5 years of experience in Python/Airflow. ~ Knowledge of OOP and design patterns. ~ 3-5 years of server-side... (Full time, Flexible hours)
- ..., and transformation across both batch and streaming datasets. Develop and optimize end-to-end data pipelines to process structured and... ...strong working knowledge of the Databricks ecosystem, including PySpark, Notebooks, Structured Streaming, Unity Catalog, Delta Live... (Immediate start)
