Average salary: Rs 3,940,357 per year

  •  ...back-office applications, including regulatory reporting, settlements, and reconciliation. We are seeking an experienced ETL/Java Developer to join a mixed Luxoft/client team, focusing on new functionality development. This role provides an excellent opportunity... 
    Remote job
    Flexible hours
    work from home
    29 days ago
  •  ...Responsibilities: Design, develop, and implement ETL processes using Teradata tools such as BTEQ and TPT Utility. Optimize and enhance existing ETL workflows for increased performance and reliability. Collaborate with cross-functional teams to gather data requirements... 
    Full time

    Appsierra Group

    work from home
    a month ago
  •  ...As a Senior ETL Developer at CodeNinja, you will play a critical role in transforming raw data into actionable insights. You'll design, develop, and maintain ETL processes to ensure the seamless flow of data between various systems. Your expertise in ETL tools and data integration... 
    Remote job
    Full time

    Codeninja

    work from home
    14 days ago
  •  ...: We are seeking a talented Informatica BDM (Big Data Management) Developer to design, develop, and optimize data integration workflows on modern...  ...platforms. The ideal candidate will have hands-on experience in ETL development, data pipelines, and data lake integrations using... 

    Awign Enterprise Pvt ltd

    work from home
    14 days ago
  •  ...Summary: We are seeking a highly skilled and detail-oriented Cognos Developer to join our Business Intelligence and Data Analytics team. The...  ...Experience working with relational databases, data modeling, and ETL processes. - Proficient in Report Studio, Framework Manager, Query... 

    THOTNR CONSULTING PRIVATE LIMITED

    work from home
    14 days ago
  •  ...Kubernetes, and Jenkins, with a passion for working with large datasets and cloud-native technologies. Key Responsibilities: - Develop, optimize, and maintain ETL/ELT data pipelines using Python and PySpark. - Work extensively on Databricks for data processing, notebook development,... 

    questhiring

    work from home
    6 days ago
  •  ...house environment. You will leverage your 10+ years of expertise to develop complex data pipelines, ensure data quality, and drive innovation...  ...and reliability. ~ Develop and maintain complex ETL/ELT processes for large-scale data ingestion and transformation.... 
    work from home
    29 days ago
  • Description: Key Responsibilities: - Design, build, and optimize data pipelines and ETL processes using AWS Glue, Lambda, RDS, and S3. - Develop scalable batch and streaming data processing workflows using Apache Spark, Spark Streaming, and Kafka. - Work with SQL and NoSQL databases... 

    Catalyst IQ

    work from home
    8 days ago
  •  ...similar roles - Strong foundation in cloud-native data infrastructure and scalable architecture design - Build and maintain reliable, scalable ETL/ELT pipelines using modern cloud-based tools - Design and optimize Data Lakes and Data Warehouses for real-time and batch processing -... 

    People Connect Consultants

    work from home
    14 days ago
  •  ...Responsibilities: Data Pipeline Architecture: Design, develop, and optimize end-to-end data pipelines to extract, transform, and load (ETL) data from various sources into our data warehouse. Ensure data quality, reliability, and performance throughout the pipeline. Data... 
    work from home
    a month ago
  •  ...Job Responsibilities: Develop and maintain data pipelines for large-scale data processing. Work with streaming data technologies, including...  ...implement near real-time data streaming solutions. Optimize ETL processes for performance, scalability, and reliability.... 
    work from home
    29 days ago
  •  ...multiple products and features we have. 7+ years of experience in developing highly scalable Big Data pipelines. In-depth understanding of...  ...Hadoop, and the file types they deal with. Experience with ETL and data pipeline tools such as Apache NiFi, Airflow, etc. Excellent... 
    work from home
    a month ago
  •  ...framework (Airflow, Prefect, Dagster) - Strong SQL skills; comfortable writing, tuning, and debugging complex queries - Experience building ELT/ETL workflows and integrating APIs/webhooks - Deep familiarity with a modern cloud data warehouse (Snowflake, BigQuery, or Redshift) -... 
    Remote job

    Savantis Solutions India Pvt.Ltd.

    work from home
    2 days ago
  •  ...GIS Data Engineer / FME ETL Engineer. Location: Remote. Experience: 5-10 yrs. Budget: 20-40 LPA. Job Summary: We are looking...  ...: # Data Engineering & Integration Design and develop ETL workflows using FME Desktop / FME Server for large-scale Electric... 
    Full time
    Remote job

    Venpa Global Technologies

    work from home
    3 days ago
  •  ...Description We are seeking an experienced Talend Developer to join our team in India. The ideal candidate will have a strong background in ETL processes and data integration, with a proven track record of delivering high-quality data solutions. Responsibilities Design... 
    work from home
    4 days ago
  •  ...Description We are seeking a skilled Snowflake Developer / Data Engineer to join our team in India. The ideal candidate will be responsible...  ...architecture, data loading, and SQL queries. ~ Experience with ETL tools and data integration techniques. ~ Strong knowledge of data... 
    work from home
    23 days ago
  •  ...candidate will have 10+ years of hands-on experience in designing, developing, and implementing robust data solutions on the Snowflake platform...  ...and transformation into Snowflake. Data Integration & ETL/ELT: ~ Design and implement ETL/ELT processes using various tools... 
    Immediate start
    work from home
    29 days ago
  •  ...security, and performance optimization. - Design, implement, and manage scalable data pipelines and ETL/ELT processes using Snowflake and related technologies. - Architect, develop, review, and optimize complex SQL queries and data models to support business intelligence and... 
    Long term contract

    Duck Creek Technologies India LLP

    work from home
    11 days ago
  •  ...and quality of data pipelines across the company. This is a hands-on position that requires deep expertise in SQL, AWS infrastructure, ETL design, and BI tooling. Key Responsibilities: - Lead and mentor a small team of data engineers, promoting technical excellence and best... 
    Remote job

    Aidewiser Soltek

    work from home
    11 days ago
  •  ...designing and implementing data pipelines to extract, transform, and load (ETL) Salesforce data. You will collaborate closely with clients to understand their data needs and requirements, and develop customized solutions to normalize, cleanse, and fine-tune their Salesforce... 
    work from home
    9 days ago
  •  ...engineers while establishing engineering best practices, coding standards, and governance models. - Design and implement high-performance ETL/ELT pipelines using modern Big Data technologies for diverse internal and external data sources. - Drive modernization initiatives... 
    Permanent employment
    Full time
    Remote job

    Hunarstreet Technologies Pvt Ltd

    work from home
    14 days ago
  •  ...Snowflake, and Airflow. Key Responsibilities: Data Integration & ETL: Design, implement, and manage ETL pipelines using Airbyte and...  ...and scalability. Data Modeling & Transformation: Develop and maintain data models using dbt/dbt Cloud. Transform raw data... 
    work from home
    2 days ago
  • Description: Position Overview: We are seeking an experienced Incorta Developer with a strong background in SQL and a secondary skill set in...  ...optimized SQL queries for data extraction, transformation, and loading (ETL). - Utilize PySpark for big data processing and integration with... 

    Bellfast Management Private Limited

    work from home
    14 days ago
  •  ...Senior Software Development Engineer in Test (SDET) specializing in ETL and data automation to join our team remotely. The ideal...  ...quality and automation coverage. Key Responsibilities: - Design, develop, and maintain scalable automation frameworks tailored for ETL pipeline... 
    Remote job

    Digivance Solution

    work from home
    14 days ago
  •  ...efficient data integration, and enable advanced analytics and reporting across the organization. Key Responsibilities: - Design, develop, and optimize ETL/ELT pipelines using Python, PySpark, and AWS Glue. - Implement data ingestion, transformation, and integration from diverse... 
    Remote job

    Intraedge Technologies Ltd.

    work from home
    6 days ago
  •  ...Type: Full-time. About the Role: We are seeking a Senior Palantir Developer to join our data engineering and analytics team. The ideal...  ...in SQL, Python, and data transformation logic. - Experience with ETL design, data modeling, and ontology development in Palantir. - Strong... 
    Full time
    Remote job

    QBrainX

    work from home
    14 days ago
  •  ...years of relevant experience and a strong background in designing, developing, and optimizing large-scale data pipelines and data warehouse...  ...replication, backup strategies, and ensuring data integrity and security. - ETL/ELT Development: Develop, construct, test, and maintain data... 

    Digihelic Solutions Private Limited

    work from home
    14 days ago
  • Description: Role: Lead Alteryx Developer. Job Summary: We are seeking a seasoned Lead Alteryx Developer with strong expertise in data automation...  ...relational databases such as SQL Server, Oracle, or Snowflake. - ETL Tools: Familiarity with other ETL platforms (e.g., Informatica,... 

    Capco Technologies Pvt Ltd

    work from home
    1 day ago
  •  ...Senior Data Engineer with strong hands-on experience in SQL, PySpark, ETL processes, Data Lakes, and the Azure data ecosystem. The ideal...  ...performance of data workflows. Key Responsibilities: - Design, develop, and maintain scalable data pipelines using Python and PySpark. -... 

    Inxite Out

    work from home
    14 days ago
  •  ...restore procedures for business continuity. Data Integration & ETL Support: Support and optimize ETL pipelines between MongoDB,...  ...MongoDB, S3, ADLS) Collaboration & Documentation: Collaborate with developers, data scientists, and DevOps engineers. Maintain up-to-date... 
    work from home
    8 days ago