Average salary: Rs 1,641,358 per year

  • Rs 3 - 10 lakhs p.a.

     ...Engineers. Experience: 3 to 8 years. Location: Chennai / Pune / Mumbai / Bangalore / Hyderabad. Mandatory Skills: Big Data | Hadoop | Java | Spark | SparkSQL | Hive. Qualification: B.Tech / B.E / MCA / Computer Science background, any specialization... 

    Sightspectrum

    Pune
    a month ago
  •  ...Job Description Purpose of the role: to design, develop and improve software, utilising various engineering methodologies, that provides...  ...and Drive – the operating manual for how we behave. Join us as a Spark Java Developer at Barclays, helping us build, maintain... 

    Women in Technology (WIT)

    Pune
    more than 2 months ago
  •  ...Position Overview Job Title: Spark/Python/Pentaho Developer Location: Pune, India Role Description Spark/Python/Pentaho Developer. Need to work on Data Integration project. Mostly batch oriented using Python/Pyspark/Pentaho. What we’ll offer you As part... 
    Flexible hours
    Pune
    14 hours ago
  • Rs 5 - 7 lakhs p.a.

     ...Job Description Role: Spark & Scala Developer. Experience: 5+ years. We expect candidates to have strong DataFrame and programming skills, experience with complex objects and Scala or Datasets, and decent communication skills to express their views... 

    Ifintalent Global Private Limited

    Pune
    7 days ago
  • Hi Jobseeker, we are hiring a Python Spark Developer for our MNC client. Location: Pune, Hyderabad. Interview Mode: Virtual. Experience: 4 to 9 years. Notice Period: immediate to 15 days only. We are looking for a Data Engineer with experience in Python, Spark... 
    Immediate start

    Natobotics

    Pimpri
    2 days ago
  •  ...Technical Expertise: Hands-on experience with SQL, Databricks, PySpark, Python, Azure Cloud, and (good to have) Power BI. Design, develop, and optimize PySpark workloads. Ability to write scalable, modular and reusable code in SQL, Python and PySpark. Ability to communicate... 

    Netscribes

    Pune
    a month ago
  •  ...transformation, and loading of data from a wide variety of data sources using Spark, EMR, Snowpark, Kafka and other big data technologies. Work...  ..., and experience has an opportunity to be hired, belong, and develop at TripleLift. Through our People, Culture, and Community... 
    Remote job

    TripleLift

    Pune
    more than 2 months ago
  •  ...build, and maintain robust ETL/ELT processes to ensure smooth and reliable data flow across systems. Data Modeling & Architecture: Develop scalable, efficient data models and architect storage solutions that support analytical and operational needs. Data Integration:... 
    Full time

    Liebherr CMCtec India Private Limited

    Pune
    20 days ago
  • Description: Role Type: Full-time. About UsefulBI: UsefulBI is a leading AI-driven data solutions provider specializing in data engineering, cloud transformations, and AI-powered analytics for Fortune 500 companies. We help businesses turn complex data into actionable insights...
    Full time

    Useful BI Corporation

    Pune
    20 days ago
  •  ...ensuring unparalleled customer experiences. As part of a team of developers, you will deliver the technology stack, using strong analytical and...  ...HDFS, Hive, Yarn, MapReduce basics. Optional/Good to Have: Spark (PySpark/Scala), HBase, Kafka streaming interfaces. Strong... 
    Permanent employment
    Immediate start

    Barclays

    Pune
    12 days ago
  •  ...streaming pipelines using Google Cloud Dataflow, Google Cloud Datastream, Airbyte, and orchestration tools (Airflow/Prefect/Dagster). # Develop and optimize ETL/ELT processes across AWS Postgres, Google FHIR Store, and Google BigQuery. # Build and maintain unified data... 
    Full time

    FOLD HEALTH INDIA PRIVATE LIMITED

    Haveli
    more than 2 months ago
  •  ...junior team members, and collaborate with cross-functional teams to deliver high-quality data solutions. Key Responsibilities: Design, develop, and maintain scalable and efficient data pipelines and ETL/ELT workflows using GCP services. Architect and implement data warehouse... 
    Hybrid work

    Cantik Technologies Pvt. Ltd.

    Pune
    9 days ago
  •  ...strategic hubs: Spain, Brazil, the UK, Germany. The Telefónica Tech UK&I hub has an end-to-end portfolio of market-leading services and develops integrated technology solutions to accelerate digital transformation through: Cloud, Data & AI (Adatis), Enterprise Applications (... 
    Full time
    Remote job

    Telefonica Tech

    Pune
    8 days ago
  •  ...the Role? Full-time job to work with Exusia's clients to design, develop and maintain large-scale data engineering solutions. The right...  ...IT, MetadataHub, Databricks, and should be fluent with PySpark & Spark SQL. Experience working with multiple databases like Oracle/SQL... 
    Full time

    Exusia

    Pune
    2 days ago
  •  ...data engineers, data scientists and research scientists to design, develop, and maintain data pipelines and infrastructure to support...  ...with big data technologies and distributed processing such as Spark , Hadoop ecosystem, Kafka etc. ~ Experience in designing and maintaining... 

    Cognyte

    Pune
    1 day ago
  •  ...to join our high performing team. We aim to attract and further develop the best Data Science & Supply Chain talent. The role and its...  ...Preferred Qualifications Big data frameworks: Apache Hadoop, Apache Spark, RapidMiner, Cloudera Experience in supply chain Familiar... 

    AkzoNobel

    Pune
    9 days ago
  •  ...requirements. Apply Data Modeling Expertise: Utilize expertise in data modeling tools and techniques, including ERWin, MDM, and/or ETL, to develop logical and physical data models. Translate Business Needs: Translate business needs into logical and physical data models,... 
    Long term contract
    Hybrid work
    Worldwide

    IBM

    Pune
    9 days ago
  •  ...solutions.   To be successful as a Senior Data Engineer, you should have experience with: Key Responsibilities: Design, develop, and optimize end-to-end data pipelines using Ab Initio and SQL Build scalable batch and/or streaming data processing solutions... 
    Immediate start

    Barclays

    Pune
    3 days ago
  • Rs 5 - 8 lakhs p.a.

     ...methodologies Strong analytical and problem-solving skills Effective communication and teamwork abilities Responsibilities Develop and maintain data pipelines and ETL processes to manage large scale datasets Collaborate to design test data architectures to... 

    Sightspectrum

    Pune
    a month ago
  •  ...Senior Data Engineer with strong expertise in SQL, DBT, Python, and modern cloud-based data ecosystems. The ideal candidate will design, develop, and maintain scalable data pipelines while ensuring data quality, reliability, and accessibility for analytics and business teams.... 
    Full time
    Hybrid work
    Work at office
    Flexible hours

    Persistent Systems

    Pune
    2 days ago
  •  ...Set up and test AS2/SFTP connectivity with the Trading Partners in Informatica MFT. Qualifications: Should have proficiency in developing Informatica PowerCenter mappings, especially with the Unstructured Data Transformation. Should have experience in onboarding Partners in Informatica... 

    Zensar Technologies

    Pune
    2 days ago
  •  ...Role Description: Our Data Strategy team, which is part of DWS Global Technology, is looking for an experienced Data Engineer to further develop the data strategy program. This program is a strategic program for DWS, where all of the core data domains of Asset Management are... 
    Flexible hours

    Deutsche Bank

    Pune
    a month ago
  •  ...Contract Notice Period: Immediate Mandatory Skill: Perl, Bash in Linux and SQL Job Description: Perl Scripting : Develop and maintain Perl scripts for data loading, extraction, and archiving. Create tools to generate feeds and graphical reports for regulatory... 
    Contract work
    Immediate start

    People Prime Worldwide

    Pune
    3 days ago
  •  ...have experience with: Hands-on experience in PySpark and strong knowledge of DataFrames, RDDs and SparkSQL. Hands-on experience in developing, testing and maintaining applications on AWS Cloud. Strong command of the AWS Data Analytics technology stack (Glue, S3, Lambda, Lake... 

    Barclays

    Pune
    3 days ago
  •  ...Location Name: Pune Corporate Office - Mantri Job Purpose To effectively design, develop, and manage data solutions using ETL technologies such as Azure Databricks (ADB) and Azure Data Factory (ADF), work with NoSQL databases like Cosmos DB, and apply object-oriented... 
    Long term contract
    Fixed term contract
    Work at office

    Bajaj Finserv

    Pune
    2 days ago
  • About Company: Our client is a multinational IT services and consulting company headquartered in the USA, with revenues of 19.7 billion USD and a global workforce of 350,000, listed on NASDAQ. It is one of the leading IT services firms globally, known for its work in digital...
    Contract work
    Hybrid work
    Immediate start
    Remote job

    People Prime Worldwide

    Pune
    3 days ago
  •  ...About Position: A highly skilled Data Engineering professional with 6 to 8 years of experience in designing, developing, and maintaining robust data solutions. Proven ability to work across the full software development lifecycle, from requirement gathering to deployment... 
    Full time
    Hybrid work
    Work at office
    Flexible hours

    Persistent Systems

    Pune
    9 days ago
  •  ...Pune. Work Mode: Onsite during the initial training period, transitioning to Hybrid post-training. What You Will Do: Design, develop, and maintain highly scalable data pipelines and applications using Python and PySpark. Build end-to-end data solutions from... 
    Hybrid work

    Sourcebae

    Pune
    2 days ago
  • Rs 4 - 6 lakhs p.a.

    Job Responsibilities: To transition legacy Rules from Python using the Polars library to SparkSQL. To create new Rules using SparkSQL based on written requirements. Must Have Skills: Understanding of the Polars library. Understanding of SparkSQL (this is more important...

    Iospl Technology Services

    Pune
    a month ago
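    The role above describes the most concrete technical task on this page: re-expressing row-wise Python/Polars rules as declarative SQL. As a rough, self-contained sketch of that migration pattern (not Spark or Polars themselves — the standard-library sqlite3 module stands in for the SQL engine, and the table, columns, and threshold are hypothetical), it might look like:

    ```python
    # Sketch of the "rule migration" pattern: the same rule implemented
    # imperatively in plain Python and declaratively in SQL.
    # sqlite3 (stdlib) is a stand-in for SparkSQL; names are hypothetical.
    import sqlite3

    rows = [
        {"txn_id": 1, "amount": 120.0},
        {"txn_id": 2, "amount": 45.0},
        {"txn_id": 3, "amount": 300.0},
    ]

    def legacy_rule(records, threshold=100.0):
        # Legacy-style rule: imperative filtering over in-memory records.
        return sorted(r["txn_id"] for r in records if r["amount"] > threshold)

    def sql_rule(records, threshold=100.0):
        # Migrated rule: the same logic expressed as one SQL statement.
        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE txns (txn_id INTEGER, amount REAL)")
        con.executemany("INSERT INTO txns VALUES (:txn_id, :amount)", records)
        cur = con.execute(
            "SELECT txn_id FROM txns WHERE amount > ? ORDER BY txn_id",
            (threshold,),
        )
        return [row[0] for row in cur.fetchall()]

    assert legacy_rule(rows) == sql_rule(rows) == [1, 3]
    ```

    The point of such a migration is that once a rule is a declarative query, the engine (SparkSQL in the role's case) can plan and distribute it, rather than the rule looping over records in application code.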
  • Overview Connecting clients to markets – and talent to opportunity With 4,300 employees and over 400,000 retail and institutional clients from more than 80 offices spread across five continents, we're a Fortune-100, Nasdaq-listed provider, connecting clients to the global...

    StoneX Group Inc.

    Pune
    5 days ago