Average salary: Rs 1,223,384 per year
Rs 10 lakh p.a.
...Job Description:
The ETL Developer will provide support on projects including designing, building, and maintaining metadata models and complex ETL packages. This position is responsible for data reporting, converting customer requests into technical details, and ensuring...
Primary Purpose: We are seeking a talented and motivated Qlik Developer to join our team, playing a pivotal role in creating automated, scalable data models and designing BI applications using Qlik for our eClinical platform within the AWS cloud infrastructure. Your main duties...
Job Description: As a Salesforce Data Cloud Developer, you will be responsible for: - Configuring and customizing Salesforce Data Cloud components, including Data Ingestion, Cleansing, Validation, Transformation, Data Mapping, Identity Resolution, Insight Building, Segmentation,...
...several countries, we serve global clients across multiple time zones.
Job Description: Big Data Developer
About the role:
Expertise in data processing, orchestration, parallelization, and transformation
Expertise in Spark Job...
Rs 6 - 30 lakhs p.a.
...Required Skills: Spark (Scala/Python) programming, Big Data (Hive)
Roles & Responsibilities:
Design and develop data applications using selected tools and frameworks as required and requested.
Work with disparate data sets.
Process unstructured data into a form suitable...
...employs approximately 17,000 people globally.
Job Description
Develop and maintain responsive dash applications using modern front-end... ...design principles
- Experience with Power BI or other data visualisation tools
- Experience with modern cloud-based data &...
Job Description: We are seeking a highly skilled and motivated Snowflake Developer with 2 to 5 years of experience to join our team. The ideal candidate will have a strong background in data warehousing and hands-on experience with the Snowflake cloud data platform. As a Snowflake...
...DESCRIPTION: Must have: - Min. 3 years' experience building pipelines using GCP BigQuery - Min. 2-3 years' experience working on large-scale data warehouses like Teradata - Experience building and optimizing data warehousing pipelines (ELT and ETL), architectures, and data...
Job Description:We are looking for a talented SAS Developer with specialized skills in SAS DI (Data Integration) to join our team. The ideal candidate will possess strong proficiency in SAS DI along with experience in MSSQL and additional technical competencies as outlined...
Job Description: - Must have 4+ years of IT experience. - Must have good experience in Spark and Scala. - Develop and maintain big data applications using Spark and Scala, ensuring high performance, reliability, and scalability. - Implement Spark data processing techniques, performance...
...challenges and deliver immediate value.
We are seeking a skilled and experienced Analytics Developer to join our dynamic team. The ideal candidate will have a strong background in building data warehouses and in building and maintaining data pipelines and transformations. Proficiency...
Mandatory Skills: REST API, ETL, JSON, Data Integration, SQL. Responsibilities: - Design, develop, and implement ETL processes to extract, transform, and load data from various sources into data warehouses or data marts. - Collaborate with cross-functional teams to understand...
Roles & Responsibilities: - 5+ years' experience in Qlik. - Should have a good understanding of Data Modelling and Set Analysis operations along with SQL scripting. - Good understanding of visualization, providing the customer with the required reports, i.e. charts/graphs, etc. - Good...
About the Role: - We are seeking a highly skilled and versatile ETL Developer to join our team and play a key role in designing, developing, and maintaining our data integration pipelines. - You will leverage your expertise in various tools and technologies to extract, transform...
...using Ab Initio's Graphical Development Environment (GDE), ensuring data accuracy, consistency, and availability. - Data Integration: ... ...addressing and resolving bottlenecks as needed. - Error Handling: Develop robust error handling and logging mechanisms to track and manage...
Job Overview: We are looking for a seasoned Python Developer with over 10 years of experience in IT, with a focus on data engineering. The ideal candidate will have extensive knowledge of Python, PySpark, Azure Databricks, ADF, SQL, and data warehouse concepts. This role requires...
Position Overview: We are seeking a talented SnapLogic Developer to join our team. As a SnapLogic Developer, you will be responsible for designing... ...the SnapLogic platform. You will leverage your expertise in data integration, API management, and cloud technologies to build...
...Opening for Big Data Developer
Primary Location : India-Bangalore (Pritech)
Experience: 3.5 to 7 years
Mode of work: 3 days work from office and 2 days work from home
Mode of interview: F2F (face-to-face) on 4th May
Mandatory Skills:
Sound knowledge of Spark...
...Designation: Data Engineer - Informatica Developer
Job Location: Bangalore (Hybrid)
Experience: 4-6 Years
Job Description
About the Role
- Design, develop, and implement data integration workflows using Informatica PowerCenter or other Informatica products...
Skill: Data Informatica Developer
Experience: 7-12 Years
Job Type: Full-Time Employment
Notice Period: Immediate to 15 Days Only
Role & Responsibilities: - We are seeking a Data Developer with a minimum of 7-12 years of work experience in designing, developing, testing, and deployment...
Job Description: - The candidate should have 6+ years of hands-on experience in BI and 5+ years' experience in Qlik. - Should have a good understanding of Data Modelling and Set Analysis operations along with SQL scripting. - Good understanding of visualization, providing the customer...
Job Summary: - We are seeking an experienced ETL Developer to join our team. - The ETL Developer will be responsible for designing, developing, implementing, and maintaining ETL processes for our data warehouse. - The ideal candidate will have a strong understanding of ETL principles...
...HiveQL along with basic performance tuning knowledge
Proficient understanding of Business Intelligence solutions: operational
Perform data formatting and assess the suitability and quality of candidate data sets; data analysis capabilities will be a plus
Basic knowledge...
...your talent and ambition to make a difference. We will create a world of opportunities for you.
Job Details
Job Title: Big Data – Kafka & Airflow Expert
Location: Bangalore / Pune / Hyderabad / Noida / Kolkata
Quick joiners needed.
Key Responsibilities...
Requirements: - A Bachelor's Degree in Computer Science or an equivalent degree is required. - 7 to 12 years of data engineering experience around database marketing technologies and data management, with strong technical understanding. - Strong hands-on experience in open-source components...
Position: Tableau Developer
Job Description: - Design, develop, test, and maintain Tableau reports and dashboards based on user requirements... ...with Tableau reports and dashboards. - Develop Tableau data visualizations and dashboards using different levels of Tableau calculations...
Job Title: Java Developer
Job Summary: We are seeking a skilled Java Developer with expertise in core Java, multithreading, collections, data structures, algorithms, and databases. The ideal candidate will have a strong background in developing high-performance, scalable, and...
Job Description: Job Role: We are seeking a skilled BI Developer to join our team in the vibrant cities of Bangalore, Hyderabad, or Pune. As... ..., and maintaining business intelligence solutions that empower data-driven decision-making across the organization. Key Responsibilities...
...of experience building backend systems using any of the following languages: Java. - The candidate should have strong experience with DSA (Data Structures and Algorithms). - Someone who takes ownership of delivery, not only closing their own work but also identifying dependencies...
...Job Description:
Data & Analytics
- Ability to handle large scale datasets
- Hands-on experience handling batch and real-time data processing on Google services (Pub/Sub, Kafka, etc.)
- Ability to migrate analytics data from DynamoDB to Cloud Spanner, BigQuery...