  •  ...Our technology services client is seeking multiple GCP Data Developers (Big Data, ETL) to join their team on a full-time basis. Below are further details about the role: Role: GCP Data Developer - Big Data, ETL. Experience: 5-12 Years. Location: Bangalore, Gurugram... 
    Big Data
    Full time
    Immediate start
    Gurgaon
    1 day ago
  • Looking for a Senior Big Data Developer to design, build, and optimize data pipelines and analytics solutions. Candidates with prior exposure to...  ...data pipelines using Hadoop, Spark, Kudu, and HBase.- Implement ETL/ELT workflows and ensure data integrity and scalability.- Work... 
    Big Data
    Chennai
    1 day ago
  •  ...Responsibilities Work closely with our data science team to help build...  ...data from disparate sources Develop models that can be used to make...  ...tools ~ Familiarity with Big Data & Google Cloud ~ Solid work...  ...in building or maintaining ETL processes ~ Professional certification... 
    Big Data
    Chennai
    1 day ago
  •  ...Job Description: Must have: Big Data, GCP (BigQuery, Dataproc) We are looking for energetic, high-performing and highly skilled data engineers to help shape our technology and product...  ...Hive, Kafka & Java. Focus: Designs, develops, solves problems, debugs, evaluates,... 
    Big Data
    Bangalore
    2 days ago
  •  ...GENERAL FUNCTION: The data engineer designs and builds platforms, tools, and solutions...  ...solutions in any of the following domains ETL, business intelligence, analytics, persistence...  ..., and related topics. Work with developers to Build CI/CD pipelines, Self-service Build... 
    Big Data
    Chennai
    18 days ago
  •  ...Role : Senior Data Engineer Location : Pune Onsite Full time permanent position with Vivid...  ..., and enhancement of Data Pipelines (ETL/ELT) and processes with thorough knowledge of star/snowflake schemas. Developing complex SQL queries and SQL optimization.... 
    Big Data
    Permanent employment
    Full time
    Pune
    1 day ago
  •  ...online merchants to accept, process, and disburse payments through developer-friendly APIs. We are focused on building plug-and-play...  ...Bengaluru, and Gurugram. Job Description Experience with ETL, Data Modeling, and Data Architecture. Design, build and operationalize... 
    Big Data
    Full time
    Pune
    3 days ago
  •  .... Since our inception in 2019, we have developed and operate a proprietary loyalty solution...  ...operational management, App Development, Data analysis, Quality Assurance, and Solutioning...  ...implement data models, algorithms, and ETL processes to support data integration and... 
    Big Data
    Hybrid work
    Work at office
    Flexible hours
    Bangalore
    1 day ago
  • Description: Job Title: Big Data Developer. Location: Bangalore (Work from Office...  ....Key Responsibilities: - Design and develop scalable data pipelines and ETL processes. - Work with large datasets...  ...), Azure (Data Lake, Databricks), or GCP (BigQuery, Dataflow). - Familiarity with... 
    Big Data
    Work at office
    Immediate start
    Infobell IT
    Bangalore
    2 days ago
  •  ...We are seeking an experienced GCP Big Data Engineer with 8–10 years of expertise in designing, developing, and optimizing large-scale data processing solutions. The ideal...  ...mentor teams in designing scalable and efficient ETL pipelines on Google Cloud Platform (GCP) .... 
    Big Data
    Bangalore
    1 day ago
  •  ...Insight Global is looking for a GCP Data Engineer to work remotely in Gurgaon, India. GCP Data Engineer Location: Remote - candidates...  ...in Big Query Experience with Pyspark, Dataflows and Dataproc, ETL to process and transform data Experience with Apache Airflow... 
    Big Data
    Contract work
    Local area
    Remote job
    Gurgaon
    2 days ago
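The listing above names PySpark, Dataproc, and Apache Airflow together; purely as an illustration of that stack, the sketch below shows a minimal Airflow 2.x DAG that submits a PySpark job to Dataproc through the gcloud CLI. The project, bucket, cluster, and region names are hypothetical placeholders, not values taken from the posting.

# Minimal illustrative Airflow DAG: one task that submits a PySpark job to
# Dataproc via the gcloud CLI. All resource names below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_pyspark_etl",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",  # one run per day
    catchup=False,
) as dag:
    submit_dataproc_job = BashOperator(
        task_id="submit_dataproc_job",
        bash_command=(
            "gcloud dataproc jobs submit pyspark "
            "gs://example-bucket/jobs/transform.py "  # hypothetical job script
            "--cluster=example-etl-cluster "          # hypothetical cluster
            "--region=asia-south1 "
            "--project=example-project"
        ),
    )

In a real deployment the Google provider's Dataproc operators would typically replace the shell call; the CLI form simply keeps the sketch short.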
  •  ...Shift 1pm to 10pm/3 pm to 12 am Key Responsibilities Design, develop, and optimize data pipelines and ETL workflows using Databricks. Implement scalable data integration solutions for large datasets across diverse data sources. Build and maintain data architectures... 
    Big Data
    Shift work
    Pune
    1 day ago
  •  ...Job Title: GCP Data Engineer Location: Pune, India Experience: 4 to 7 Years Job Type: Full-Time Job Summary: We...  ...pipelines on GCP, leveraging Dataproc, BigQuery, Cloud Storage. Develop and manage ETL/ELT workflows using Apache Spark, SQL, and Python.... 
    Big Data
    Full time
    Pune
    1 day ago
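As a rough illustration of the Dataproc, BigQuery, and Cloud Storage workflow this posting describes, here is a minimal PySpark ETL sketch: read raw CSV from a Cloud Storage bucket, aggregate it, and write the result to BigQuery with the spark-bigquery connector. The bucket, dataset, and column names are assumptions made for the example.

# Minimal illustrative PySpark ETL job: GCS CSV -> aggregate -> BigQuery.
# Requires the spark-bigquery connector on the cluster; all names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: raw CSV files landed in a Cloud Storage bucket (placeholder path)
orders = spark.read.option("header", True).csv("gs://example-bucket/raw/orders/*.csv")

# Transform: type casting plus a simple daily aggregate
daily_totals = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
)

# Load: write to a BigQuery table (indirect write via a temporary GCS bucket)
(
    daily_totals.write.format("bigquery")
    .option("table", "example_dataset.daily_order_totals")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("overwrite")
    .save()
)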
  •  ...Role:- GCP Data Engineer/GCP Architect Experience:-6+ Years Location :- Pan India Role & responsibilities Cloud Skills: Strong...  ...Data Engineering: Experience in designing and implementing ETL/ELT workflows and data pipelines. Familiarity with data modeling... 
    Big Data
    Bangalore
    11 days ago
  •  ...Job Title: Quality Engineer (Data) Interview Details Interview Date: 21st Aug 2025...  ...NoSQL) and data formats (CSV, JSON, XML). ETL Testing – Perform ETL validations across...  ...Strong communication skills to work with developers, QA engineers, and stakeholders, including... 
    Big Data
    Work at office
    Bangalore
    2 days ago
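The QA listing above centres on ETL validation; the sketch below is one small, generic example of such a check, reconciling row counts and a column total between a source and a target table. SQLite stands in for the real databases, and all table and column names are hypothetical.

# Minimal illustrative ETL reconciliation check: compare row count and an
# amount total between a source table and a target table. Names are placeholders.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stand-in source and target tables with identical sample data
cur.execute("CREATE TABLE src_orders (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])

def table_stats(table: str):
    """Return (row_count, amount_total) for a table (trusted name in this sketch)."""
    row = cur.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}"
    ).fetchone()
    return row[0], row[1]

src_rows, src_sum = table_stats("src_orders")
tgt_rows, tgt_sum = table_stats("tgt_orders")
assert src_rows == tgt_rows, f"row count mismatch: {src_rows} vs {tgt_rows}"
assert abs(src_sum - tgt_sum) < 1e-6, f"amount total mismatch: {src_sum} vs {tgt_sum}"
print(f"reconciled: {src_rows} rows, total {src_sum}")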
  •  ...client attention to serve them in over 50 Countries. Position: GCP Data Engineer Location: Pune Work Type: Hybrid Job Type: Full...  ...using PySpark for efficient and scalable data processing. ETL Workflow Development ~ Building and maintaining ETL workflows for... 
    Big Data
    Long term contract
    Full time
    Hybrid work
    Local area
    Pune
    1 day ago
  •  ...Our technology services client is seeking multiple GCP Data Engineers to join their team on a contract basis. These positions offer a strong potential for conversion to full-time employment upon completion of the initial contract period. Below are further details about the... 
    Big Data
    Full time
    Contract work
    Immediate start
    Chennai
    1 day ago
  •  ...Profile : AWS Data Engineer Mandate Skills : AWS + Databricks + Pyspark + SQL role Location : Bangalore /Pune /Hyderabad...  ...for performance, scalability, and cost-efficiency Develop and manage ETL/ELT processes with schema transformations and data slicing/dicing... 
    Big Data
    Immediate start
    Remote job
    Gurgaon
    2 days ago
  •  ...Position Summary – Big Data GCP Developer: GCP + Big Data (any programming) Location: Gurgaon Experience: 3.5-6.5 yrs Work Mode: Hybrid Interview Mode: Virtual Joining: Immediate joiners preferred Mandatory Skills: • Strong experience in Big Data, PySpark... 
    Big Data
    Full time
    Hybrid work
    Immediate start
    Boolean Staffing Recruitment Solutions Pvt Ltd
    Gurgaon
    a month ago
  •  ...more than a year. Role: Sr. Data Engineer (Cloud Big Data Engineer...  ..., Python, Spark, SQL, GCP/AWS Duration of career break...  ...JD ~4-7 years of experience developing, delivering, and/or supporting...  ...~ Experienced in developing ETL/ELT processes using Apache NiFi... 
    Big Data
    Relocation
    Secunderabad
    18 days ago
  •  ...for a highly experienced Senior Data Engineer with deep expertise in...  ...have a proven track record in ETL processes, cloud-based data architecture...  ...Responsibilities Design and develop robust, scalable, and secure...  ...with Google Cloud Platform (GCP) or other cloud providers.... 
    Big Data
    Pune
    1 day ago
  •  ..., agencies, system integrators, solution providers, and other data sources to integrate relevant datasets into the data exchange...  ...Responsibilities : Evaluate APIs and datasets, create data models, develop software ETL modules, perform unit testing, and deploy them in cloud... 
    Big Data
    Jalgaon
    1 day ago
  •  ...We are looking for an experienced GCP Data Engineer with 5–10 years of experience in Google Cloud Platform (GCP) services and Big Data Analytics solutions . This is an exciting opportunity for professionals passionate about designing and implementing scalable data engineering... 
    Big Data
    Bangalore
    1 day ago
  •  ...and contribute to our healthcare data engineering initiatives. This...  ...Responsibilities Design and develop scalable real-time data streaming...  .... Architect and implement ETL/ELT pipelines using Azure Databricks...  ...a preference for Azure (AWS/GCP is a plus). Strong knowledge... 
    Big Data
    Noida
    1 day ago
  •  ...growth. Founded in 1991, we are cloud and data engineering leaders providing solutions to...  ...knowledge of cloud computing platforms (GCP, especially Big Query, Dataflow, Dataproc,...  ...technologies and software paradigms Developing and implementing an overall organizational... 
    Big Data
    Bangalore
    2 days ago
  •  ...Software Engineer || Technical Analyst Skill - ETL || SQL || DWH || SSIS || Insurance-P&C...  ...Bangalore Hands-on Experience in ETL, Data Warehousing Concepts, Tools, Software...  ...Knowledge and experience in writing and developing well-designed, testable, reusable, efficient... 
    Immediate start
    Pune
    11 days ago
  •  ...Engineering team.- Designing and building data pipelines from data ingestion to consumption...  ...NoSQL, SQL, etc.- Responsible for designing and developing distributed, high volume, high velocity multi...  ...should have data processing ability (ETL techniques) and scripting experience... 
    Big Data
    Hybrid work
    Bangalore
    1 day ago
  •  ...Position Overview Job Title: Data Engineer (ETL, Big Data, Hadoop, Spark, GCP) , Assistant Vice President Location: Pune, India Role Description Senior engineer is responsible for developing and delivering elements of engineering solutions to accomplish business... 
    Big Data
    Flexible hours
    Pune
    1 day ago
  •  ...Role: Sr GCP Data Engineer (Google Cloud Platform) Location: Remote - India Job Description Develop, construct, test and maintain data acquisition pipelines for large volumes of structured and unstructured data. This includes batch and real-time processing (in google... 
    Big Data
    Remote job
    Chennai
    3 days ago
  •  ...About The Role We are actively hiring a Data Engineer (SDE2 level) with strong expertise in Core Python, PySpark...  ...Responsibilities Data Engineering & Development : Design, develop, and maintain scalable and efficient ETL pipelines using Core Python and PySpark. Work with... 
    Big Data
    Hybrid work
    Bangalore
    1 day ago