Average salary: Rs 370,249 per year
Search Results: 246 vacancies
Rs 12 - 16 lakhs p.a.
...Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Apache Spark
Good-to-have skills: Microsoft SQL Server, Unix-to-Linux migration, Data Warehouse ETL Testing, Amazon Web Services
Job requirements: ...
Rs 7 - 15 lakhs p.a.
...Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Apache Spark
Good-to-have skills: Python
Job requirements / key responsibilities:
(a) Lead the team and contribute to the...
Rs 7 - 11 lakhs p.a.
...Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Apache Spark
Good-to-have skills:
Job requirements / key responsibilities:
A: Strong experience in creating Scala/Spark jobs for data transformation...
Rs 12 - 16 lakhs p.a.
...Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Apache Spark
Good-to-have skills: Hadoop
Job requirements / key responsibilities:
A: Create Scala/Spark jobs for data transformation and...
...Candidate should have worked on at least one production project, not only understanding or theoretical knowledge.
3. Should have basics of Spark and Kafka -- at least good POC experience, and at least basic knowledge of Spark with Java or Scala.
4. Hive, SQL, HDFS, Sqoop -- hands-on...
...Mumbai, Pune, Nagpur, Indore, Delhi/NCR, Ahmedabad.
Job Description:
- 6+ years of overall data analytics and BI experience
- Experience in Spark, Hive, Scala
- Build ETL data pipelines that fetch data from a variety of sources such as flat files, relational databases, and APIs
- ...
...Must-have skills: Apache Spark
Good-to-have skills: Data Warehouse ETL Testing
Key Responsibilities:
A: The resource will write and review complex SQL statements
B: The resource will work on ETL, preferably on OWB (Oracle Warehouse Builder)
C: The resource will work on...
Key Responsibilities:
- Develop, implement, and maintain Spark applications for data processing and analytics.
- Design and optimize Spark jobs for performance and scalability.
- Implement Delta Lake solutions for efficient data storage and management.
- Build streaming solutions for...
...Mathematics, Computer Engineering, Management
Skills for Lead Data Engineer:
Desired skills for a lead data engineer include:
- Python
- Spark
- Java
- Hive
- SQL
- Hadoop architecture
- Large-scale search applications and building high-volume data pipelines
- Message queuing
- NoSQL...
...Skills: Python, PySpark, Azure Databricks, Azure Data Factory, Azure Data Lake, SQL.
- Deep knowledge of and experience working with Python/Scala and Spark
- Experienced in Azure Data Factory, Azure Databricks, Azure Data Lake, Blob Storage, Delta Lake, Airflow
- Experience working with...
Job Description:
Technical Expertise:
- Should have experience working with Microsoft Azure tools like Spark, Databricks, Synapse (knowledge of these will be an added advantage).
- Should be very strong on BI and EDWH concepts.
- Must have good experience working on Microsoft...
Rs 12 - 20 lakhs p.a.
...Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Apache Spark
Good-to-have skills: No technology specialization
Job requirements / key responsibilities:
1. Set up and configure Databricks for an...
...production environment.
Should have good working experience with:
- Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)
- Spark batch processing
- Setting up ETL pipelines
- Python or Java programming language (mandatory)
- Data lake and data warehousing domains...
Rs 20 - 21 lakhs p.a.
...Responsibilities
Roles and Responsibilities of a Cloudera & Spark Developer
Secure data management and portable cloud-native data analytics delivered in an open, hybrid data platform. Whether you're powering business-critical AI applications or real-time analytics at scale...
...stewards / data custodians / end users, etc.
- Demonstrated experience building efficient data pipelines using tools and technologies like Spark and cloud-native tools such as EMR, Azure Data Factory, Informatica, Ab Initio, etc.
- Understand the need for both batch and stream ingestion...
...Years
Location: Gurugram/Delhi NCR
Notice Period: Max 30 days
Job Description: We are looking for an experienced MLOps engineer with expertise in Spark/PySpark, MLOps/LLMOps/DLOps, CI/CD, Kafka, Python, distributed computing, GitHub, data pipelines, cloud hosting, Azure services,...
Job Description:
- Experience with design and coding across one or more platforms and languages (e.g., Java, Spark/SQL) as appropriate
- Hands-on expertise with application design, software development, and automated testing
- Proficient in big data technologies
- Designs, codes...
Job Title: Data Engineer (Spark, Scala, Java)
Location: Gurgaon (hybrid mode)
Experience: 4-10 years
Job Description: We are seeking a highly skilled data engineer with expertise in Apache Spark, Scala, and Java to join our dynamic team. The ideal candidate will have a strong background...
Rs 8 - 20 lakhs p.a.
...Skills required: Python + AWS/Airflow (candidate should be very strong in Python), Spark, SQL.
Experience: 3-8 years
Salary: 8 to 20 LPA
Location: Bangalore
No. of positions: 4 (2 SSE, 2 TC)
Job Description
Role requires experience in AWS...
...years’ strong experience in web application development using Java
- Prior experience building data lakes or data pipelines using Spark
- Strong experience in data engineering practices
- Good hands-on experience with streaming technologies like Kafka/Pulsar
- Must have...