Average salary: Rs 6,066,186 / year
- ...Required Skills and Experience: Hands-on experience in GCP data components: BigQuery, Data Fusion, Cloud SQL. Data Management: Understanding of Data Lake and Data Warehouse concepts. Experience in data architecture design and implementation on GCP. Project...
- ...Responsibilities: GCP Solution Architecture & Implementation: Implement and architect data solutions on Google Cloud Platform (GCP), leveraging its various components... ...and Python. Required Skills: GCP Data Engineering Expertise: Strong experience with GCP Data...
- ...We are actively seeking a highly skilled and experienced GCP Data Engineer to join our client's team through Acme Services. This pivotal role requires very strong hands-on experience as a GCP Data Engineer with exceptional proficiency in SQL and PySpark. The ideal... [Immediate start]
- ...Primary Skills: GCP - BigQuery, Pub/Sub, GCS, Dataflow, SQL, Python. Experience: 3 to 8 years. Notice Period: Immediate to 30 days... ...in software design and development; experience in the data engineering field is preferred. Hands-on experience in GCP cloud data implementation... [Immediate start]
- Description: Role: GCP Data Engineer. Experience: 5+ years. Preferred: Data Engineering background. Location: Bangalore, Chennai, Pune, Kolkata, Hyderabad. Required Skills: GCP DE experience, BigQuery, SQL, Cloud Composer/Python, Cloud Functions, Dataproc + PySpark, Python injection...
- ...Title: Data Engineer - PySpark. About CentricaSoft: CentricaSoft is a data-driven technology partner delivering end-to-end data solutions... ...are looking for a PySpark Developer with solid AWS or Azure or GCP experience and in-depth knowledge of CI/CD pipelines with Parquet... [Hybrid work, Remote job, Flexible hours]
- ...and creating more compassionate and connected communities. Role Description: This is a full-time on-site role for a Senior Data Engineer with 3 to 5 years of experience, and it is located in Chennai. The Senior Data Engineer will be responsible for developing and... [Full time]
- ...Description We are looking for a skilled Data Engineer to join our team in India. The ideal candidate will play a crucial role in building and... ...Scala. ~ Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their data services. ~ Understanding of data modeling, data...
- ...About the role : We are looking for a Data Engineer with strong Databricks expertise to join our team. This role will be pivotal in building robust ingestion pipelines, creating curated data views, and normalizing legacy datasets from sources like SharePoint, Excel, and other...
- ...Senior Data Engineer - 5+ Years (Snowflake, AWS, Azure). Availability: Immediate. Time zone: CET. We are seeking a highly skilled and motivated... ...Qualifications: Experience with other cloud platforms (Azure, GCP) is a plus. Knowledge of data governance frameworks and tools... [Immediate start]
- ...strategies, digital product and innovation, marketing, commerce, sales, and service. We are a team of strategists, data scientists, operators, creatives, designers, engineers, and architects, balancing business strategy, technology, creativity, and ongoing managed services to help...
- We are looking for an experienced data engineer to join our team. You will use various methods to transform raw data into useful data systems. For example, you'll create algorithms and conduct statistical analysis. Overall, you'll strive for efficiency by aligning data systems...
- GCP Vertex AI Engineer. Notice Period: Immediate or Serving NP (< 30 days). Primary Skills: Hands-on experience in AI/ML + GCP + Vertex AI + Python. Job... ...16 years of hands-on experience with Generative AI, AI/ML and Data Science technologies. - Extensive hands-on experience in... [Immediate start]
- ...level BI solutions to support strategic decision-making along with data democratization by enabling self-service analytics for non-... ...Tableau, Power BI, Looker). Expertise in AWS (Athena, Redshift), GCP (BigQuery), or Snowflake. Experience with data governance, lineage... [Freelance]
- ...Key Responsibilities: Data Architecture: Design and implement scalable storage solutions using Hadoop, NoSQL databases, HBase... ...: Work with data scientists and stakeholders; mentor junior engineers and provide technical guidance. Documentation: Document and communicate...
- ...Key Deliverables: Design and implement scalable, secure enterprise-grade data platforms. Build real-time and batch data pipelines to support ML and analytics use cases. Optimize cloud-based data infrastructure for performance, scalability, and compliance. Enforce robust... [Freelance]
- ...Key Responsibilities: Design, develop, and maintain scalable and reliable ETL/ELT pipelines using GCP tools. Build and manage data warehouses and data lakes using BigQuery and Cloud Storage. Integrate data from multiple sources, ensuring high data quality and integrity...
- ...seeking a highly skilled Machine Learning Engineer with expertise in Python and a strong understanding of Hadoop and other Big Data technologies. You will be instrumental in developing... ...in model deployment using cloud platforms (e.g., AWS, GCP, Azure) or MLOps tools....
- ...~ We are seeking a highly skilled and experienced Senior Staff Data Engineer to join our dynamic team ~ The right candidate shall lead... ...workflows ~ Adeptness with cloud platforms (AWS / Azure / GCP) and utilization of cloud-native services for crafting robust data...
- ...seeking a motivated ETL Informatica Developer to join our dynamic data engineering team. The ideal candidate will be responsible for designing,... ...Have :- Exposure to cloud-based data platforms (e.g., AWS, Azure, GCP).- Knowledge of Informatica Administration, workflow scheduling...
- ...Bachelor's or master's degree in computer science, engineering, or a related field. ~5+ years of hands-on experience as a Data Engineer, preferably in a global organization.... ...Experience working with cloud platforms such as Azure or GCP, leveraging their data services. ~ Strong...
- ...What You'll Do Design, develop, and code Hadoop applications to process and analyze large data collections. Create and maintain scalable data processing frameworks for various business needs. Extract, transform, and load (ETL) data and isolate data clusters for analysis...
- Description: Job: ETL Informatica PowerCenter Developer. Location: Noida/Kolkata/Bengaluru. Type: Long Term Contract. Primary Skill (must have): ETL Developer (Informatica PowerCenter). Role & Responsibilities (what exactly the resource will be doing in the project): Resource... [Long term contract, Permanent employment, Shift work]
- ...EY and help to build a better working world. Career Family: Data Engineer. The opportunity: We are the only professional services organization... ...Platforms: Familiarity with cloud data platforms (AWS, Azure, GCP). CI/CD Practices: Exposure to CI/CD practices for data... [Full time]
- ...Job Title: Analytics Engineer. Language: Excellent English proficiency (both written and verbal)... ...of experience in building and optimizing data models, transforming raw data into... ...Experience with cloud platforms such as AWS, GCP, or Azure. Exposure to CI/CD pipelines,... [Remote job, Flexible hours]
- Description: Job Title: Senior Data Engineer. Location: Kolkata (Work from Office). Employment Type: Full-Time. Experience Required: 3-5... ...human progress. Position Overview: We are seeking a Senior Data Engineer to join our growing team in Kolkata. The ideal candidate will have... [Full time, Work at office]
- ...Role: We are seeking a Senior Palantir Developer to join our data engineering and analytics team. The ideal candidate will have hands-on experience... ...data systems, APIs, or cloud platforms (AWS, Azure, or GCP). Excellent problem-solving, debugging, and communication skills... [Full time, Remote job]
- ...ArcGIS Desktop and FME Workbench / FME Server. The ideal candidate will be responsible for designing, developing, and automating spatial data workflows to support enterprise GIS initiatives. Key Responsibilities: Develop, configure, and optimize FME ETL workflows for spatial... [Hybrid work]
- ...Experience: Python, PySpark, Amazon Redshift, PostgreSQL. About the Role: We are looking for an experienced PySpark Developer with strong data engineering capabilities to design, develop, and optimize scalable data pipelines for large-scale data processing. The ideal candidate must... [Immediate start, Working Monday to Friday]
- ...PDL, Continuous Flows and Continuous Components, ACE/BRE, ICFF, Generic graphs, Parallelism, multiple file system (mfs), XML Data Handling, Vector Data handling, Unix, Oracle, Scheduling Tool, Performance Tuning & Generic process development, Concepts of...
