- ...About Us: CLOUDSUFI is a Data Science and Product Engineering organization building Products and Solutions for Technology and Enterprise... ...self-starter, go-getter, and team player. Should have experience working under stringent deadlines in a matrix organization structure. [Data]
- ...Key Desirables: 6+ years of experience in data modeling, data warehousing, and building ETL pipelines. Strong experience in ETL platforms like Informatica ETL, PowerCenter, IICS, or similar tools. Deep knowledge of various big data technologies like BigQuery... [Data]
- ...Qualifications: Independently complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse,... ...goals. Ability to lead others without direct authority in a matrixed environment. Differentiating Competencies Required... [Data, For contractors, Local area, Remote job, Flexible hours]
- ...customers each day. At Circle K, we are building a best-in-class global data engineering practice to support intelligent business decision-... ...control. Proven ability to mentor and grow engineers in a matrixed or global environment. Strong verbal and written communication... [Data]
- ...TELUS, our multi-billion-dollar telecommunications parent. Required Skills: 4+ years of industry experience in Data Engineering; proficient in Google Cloud Platform (GCP) services such as Dataflow, BigQuery, Cloud Storage, and Pub/Sub; strong understanding... [Data, Shift work]
- ...Roles and Responsibilities: Design, develop, and maintain large-scale data pipelines using Azure Data Factory (ADF) to extract, transform, and load data from various sources into Azure Databricks. Develop complex ETL processes using Python scripts and SQL queries to process... [Data]
- Key deliverables: enhance and maintain the MDM platform; monitor and optimize system performance; troubleshoot and resolve technical issues; support production incident resolution. Role responsibilities: develop and refactor Python and SQL code; integrate... [Data]
- ...Key Deliverables: Build and optimize scalable data pipelines and infrastructure. Automate data ingestion, transformation, and integration processes. Collaborate with AI/ML teams to ensure quality data access. Enhance performance and cost-efficiency in data... [Data]
- ...Job Summary: Designing Data Vault models (hubs, links, and satellites) for Enterprise and Operational Information. Data Mapping: mapping data from various source systems to the Data Vault model. Understanding of Data Vault modeling tools like Coalesce and dbt. Data Quality... [Data]
- ...Mandatory Skills: Design and develop data-ingestion frameworks, real-time processing solutions, and data processing and transformation frameworks leveraging open-source tools. Hands-on experience with technologies such as Apache NiFi, PostgreSQL, dbt... [Data]
- ...Open Location: Gurgaon and Bangalore. Job Description: 4-7 years' experience working on data engineering, ETL/ELT processes, data warehousing, and data lake implementation with cloud services. Hands-on experience in designing and implementing solutions like creating... [Data]
- VVV Tech30 is seeking a skilled and experienced Data Engineer to join our growing team. This role is crucial in ensuring the efficient and reliable processing of our payroll data. You will be responsible for designing, building, and maintaining data pipelines, transforming raw... [Data]
- ...Project Role: Data Engineer. Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy... [Data, Full time, Work at office, Immediate start]
- ...commerce in India by combining world-class infrastructure, robust logistics operations, and technology excellence. About the Role: Data Engineer. We're looking for a Data Engineer who can design, optimize, and own our high-throughput data infrastructure. Are... [Data]
- ...Key Responsibilities: Design, develop, and maintain robust ETL/ELT pipelines to collect, clean, transform, and load data from diverse sources. Build and optimize data warehouses and data lakes using modern cloud platforms (e.g., GCP, AWS, Azure). Work with structured... [Data]
- ...Key Responsibilities: 1. Data Migration and Management: Execute data migration projects for Salesforce applications using Informatica, including data extraction, transformation, and loading (ETL). Develop and implement data migration strategies to ensure accurate and efficient... [Data]
- ...For Aptar, the only priority position right now is Data Engineer (Priority Role). Experience: 5+ years. Notice period: Immediate joiners only. Location: Gurgaon, Noida, Pune, Mumbai, Bangalore, Hyderabad, Mohali, Panchkula. Core Responsibilities: • Data integration... [Data, Immediate start]
- ...Job Title: Technical Data Project Manager. Key Skills: Snowflake, Informatica, ETL, SQL, Project Management. Experience: 10-15 years. Location: Greater Noida, Pune, and Hyderabad. Mode: Work from Office. We at Coforge are seeking a 'Technical Data Project Manager'... [Data, Full time, Work at office]
- ...Project Role: Data Architect. Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration. Must-have skills: Data Engineering. Good-to-have skills: NA. Minimum... [Data, Full time, Work at office]
- ...Design and model the DWH, pipelines, and data products. Create new ETL processes and optimize existing ones, ensuring quality data deliverables. Work with various teams to define key metrics (KPIs) and provide the methodologies to accurately measure performance. Create... [Data]
- ...AWS or Cloud SQL knowledge. Should have knowledge of any one of these ETL tools: Talend, DataStage, Informatica ISS (cloud version), Data Fusion, Dataflow, Dataproc. Must have SQL/PL/SQL scripting experience. Should possess mandatory Linux/Unix skills. Expertise in... [Data]
- ...Key Deliverables: Build scalable data pipelines integrating Oracle and SQL Server data. Automate data ingestion and processing for AI/ML workflows. Optimize data storage, compute performance, and security compliance. Collaborate with AI/ML teams for model-... [Data]
- ...Role: Data Architect. Location: Gurugram. Mode: Hybrid (3 days WFO). Experience: 12+ years. Employment type: Contract. Duration: 6+ months. Key Responsibilities: Define and drive the data architecture roadmap aligned with business and analytics goals. ... [Data, Contract work, Hybrid work]
- ...Job Description. Role: Data Engineer. Skills: Minimum of 6-9 years of hands-on experience with AWS Glue, Python, and PySpark; strong coding skills. Experience with Airflow and SQL. Authoring ETL processes using Python and PySpark. ETL process monitoring... [Data]
- ...Responsibilities: Collaborate with business stakeholders to identify and document requirements for data engineering projects. Analyze existing data processes and workflows to identify opportunities for improvement and optimization. Work closely with data engineers... [Data]
- ...Key Responsibilities: 1. Data Governance & Management: Establish and maintain a Data Usage Hierarchy to ensure structured data access. Define data policies, standards, and governance frameworks to ensure consistency and compliance. Implement Data Quality Management practices... [Data]
- ...To leverage expertise in data architecture and management to design, implement, and optimize a robust data warehousing platform for the pharmaceutical industry. The goal is to ensure seamless integration of diverse data sources, maintain high standards of data quality and... [Data]
- Join us as a Data Engineer. We're looking for someone to build effortless, digital-first customer experiences to help simplify our organisation and keep our data safe and secure. Day-to-day, you'll develop innovative, data-driven solutions through data pipelines, modelling... [Data]
- ...Job Title: Data Architect (GCP/Databricks). Location: Noida, India. Experience: 10+ years. About the Role: We are seeking an experienced Data Engineer Architect to design, architect, and optimize large-scale enterprise data platforms. The ideal candidate will... [Data]
- ...Responsibilities: Design, build, and maintain efficient and reliable data pipelines using Scala, Databricks, and Kafka. Collaborate with data scientists and analysts to understand data requirements and provide solutions that meet their needs. Optimize ETL processes... [Data]