Average salary: Rs 120,000/year
- ...candidate will lead end-to-end migration initiatives, provide technical leadership, and ensure high-quality ETL solutions for enterprise data integration. Must-Have Skills: Strong hands-on experience in Informatica PowerCenter (development, enhancements, performance tuning)...
- ...Req ID: 343256. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an Informatica Developer to... (Long-term contract, Hybrid work, Work at office, Remote job, Flexible hours)
- ...Description: Hindi language is compulsory. We are seeking a detail-oriented Data Entry Operator to join our team. The ideal candidate will be responsible for inputting, updating, and maintaining data across various platforms while ensuring accuracy and confidentiality...
- ...Actively hiring: Data Entry Specialist. Start date: Immediately. CTC (annual): ₹3,00,000 - 8,00,000/year... (Remote job, Part time, Immediate start)
$72,800 p.a.
- ...documents for cleaning and process validation of equipment and products in compliance with site SOPs and regulatory guidance. Provides data to support management evaluation of performance trends. Owns quality records (change control, CAPAs, deviations) and delivers to... (Full time, Local area, Relocation, Shift work, Weekend work)
- ...events, including logistics and materials. 5. Documentation: Prepare and manage reports, presentations, and confidential documents. 6. Data management: Maintain records and files, ensuring accuracy and confidentiality. 7. Project assistance: Support the executive team with... (Remote job, Work at office, Immediate start)
- ...Responsibilities: Data Pipeline Architecture: Design, develop, and optimize end-to-end data pipelines to extract, transform, and load (ETL) data from various sources into our data warehouse. Ensure data quality, reliability, and performance throughout the pipeline. Data...
- ...packages etc.); support budgeting and bookkeeping procedures; create and update records and databases with personnel, financial and other data; track stocks of office supplies and place orders when necessary; submit timely reports and prepare presentations/proposals as... (Part time, Internship, Work at office)
- ...Job description: We are seeking a highly experienced Senior Databricks Data Engineer to design, build, and optimize our data lakehouse environment. You will leverage your 10+ years of expertise to develop complex data pipelines, ensure data quality, and drive innovation...
- ...Key Responsibilities: Understand and analyze data architecture of various source systems and map data efficiently into Master Data systems. Manage and maintain master data using MDM tools such as Boomi Master Data Hub. Perform data profiling, data analysis, and ensure...
- ...Job Responsibilities: Develop and maintain data pipelines for large-scale data processing. Work with streaming data technologies, including Kafka, KSQL, and MirrorMaker. Design and implement near-real-time data streaming solutions. Optimize ETL processes for performance...
- ...Share resume at [HIDDEN TEXT]. Role Overview: We are looking for a QA Consultant with strong hands-on experience in ETL/Data Pipeline Testing and API Testing. Experience with MDM Testing is preferred. The ideal candidate will have strong SQL skills, good understanding of data... (Long-term contract)
- ...Optimize and enhance existing ETL workflows for increased performance and reliability. Collaborate with cross-functional teams to gather data requirements and translate them into technical specifications. Perform data profiling, cleansing, and validation to ensure data... (Full time)
- ...Description: We are seeking an experienced Syniti Developer to join our team in India. The ideal candidate will have a strong background in data management and integration, with a focus on using Syniti tools to deliver high-quality data solutions. Responsibilities: Design,...
- ...Immediate. Role Overview: We are seeking an experienced Python Developer with strong programming skills (not scripting) to develop and optimize data connectors for DataHub/Acryl Data. The candidate should have hands-on experience in data engineering concepts, building ETL components,... (Remote job)
- Description: Looking for a Senior Data Engineer skilled in Azure Databricks, Apache Spark (PySpark), and Unity Catalog to design, optimise, and govern large-scale data pipelines on Azure. Responsibilities: Build and optimise ETL/ELT pipelines using Databricks and PySpark...
- ...Responsibilities: Design, develop, and optimize Tableau dashboards and reports. Write efficient SQL queries to extract, transform, and analyze data from multiple sources. Collaborate with business and engineering teams to understand requirements and translate them into data...
- ...This is a remote position. 10+ years of experience required. Design and develop scalable, high-performance data architecture solutions on Snowflake. Create and maintain data models. Integrate Snowflake with various data sources and ETL (Extract, Transform, Load) tools... (Remote job)
- ...Oracle HCM Cloud Consultant with strong techno-functional expertise to design, develop, and manage HCM BI and OTBI reports, dashboards, and data integrations. The ideal candidate will demonstrate leadership in solution delivery, mentoring junior team members, and ensuring seamless...
- ...growing team. As a key member of our consulting practice, you will be responsible for designing, developing, and implementing cutting-edge data solutions leveraging the Snowflake Data Cloud. Responsibilities: Solution Design & Architecture: Lead the design and architecture of...
- ...Key Responsibilities: Architect and implement end-to-end data solutions using Azure/AWS/GCP Databricks. Design and develop data lakehouses, ETL/ELT pipelines, and data integration frameworks. Define best practices for data ingestion, transformation, orchestration...
- ...ProArch is looking for a highly skilled and experienced Azure Data Engineer to join our team. As an Azure Data Engineer, you will be responsible for designing, implementing, and maintaining data solutions using the Azure data platform. Responsibilities: Design, develop... (Remote job)
- Description: Job Title: SAP Data Migration Specialist (Syniti). Experience: 5 to 8 years. Location: Remote. Shift: 7:00 AM start. Job Description: We are looking for an SAP Data Migration Specialist with experience in Syniti tools to handle end-to-end data migration activities... (Early shift)
- ...Implementing ETL processes. Creating comprehensive test suites using Python. Validating data quality through advanced SQL queries. Collaborating with Data Scientists, Engineers, and Software teams. Developing and monitoring data tools, frameworks...
- 3-5 years of relevant experience in SQL, PL/SQL, Python. 2-3 years of experience working with cloud data platforms such as Snowflake, Redshift, Databricks, etc. 2-3 years of experience in AWS (DMS, Glue, Lambda). 3-5 years of experience in ETL/ELT and data modelling, such as Star...
- ...operational best practices. Design, develop, and manage ETL/ELT pipelines using Python (PySpark) in Databricks. Leverage Unity Catalog for data lineage, security, and governance management. Implement and maintain CI/CD pipelines for Databricks deployments using Git and DevOps...
- Description: Responsibilities: Validation Framework Development: Build scalable rule-based, statistical, and pattern-driven data validation frameworks using Pandas, NumPy, and distributed processing tools. Develop mechanisms for contradiction detection, record reconciliation...
- ...10 AM GST to 7 PM GST). Employment Type: Contractual, 2 months & extendable. Required Skills & Experience: 3+ years of experience as a Data Engineer or similar role. Hands-on experience with Databricks (SQL, PySpark, Delta Lake). Strong proficiency in Python and SQL for data... (Contract work, Remote job)
- ...Databricks Solution Architecture: Design and implement scalable, secure, and efficient Databricks solutions that meet client requirements. 2. Data Engineering: Develop data pipelines, architect data lakes, and implement data warehousing solutions using Databricks. 3. Data Analytics:...
- Description: Roles & Responsibilities: Data Ingestion & Pipeline Development: Design, build, and maintain scalable data ingestion pipelines for structured and unstructured data. Develop batch and real-time data processing workflows to support analytics and business reporting...
