Average salary: Rs 1,360,000 per year
Search Results: 338 vacancies
Rs 12 - 16 lakhs p.a.
...Role Description : Design, build and configure applications to meet business process and application requirements. Must have Skills : Apache Spark Good to Have Skills : Microsoft SQL Server, Unix to Linux Migration, Data Warehouse ETL Testing, Amazon Web Services Job Requirements :...
Rs 12 - 16 lakhs p.a.
...Role Description : Design, build and configure applications to meet business process and application requirements. Must have Skills : Apache Spark Good to Have Skills : Hadoop Job Requirements : Key Responsibilities : A: Create Scala/Spark jobs for data transformation and...
Rs 7 - 15 lakhs p.a.
...Role Description : Design, build and configure applications to meet business process and application requirements. Must have Skills : Apache Spark Good To Have Skills : Python Programming Language Job Requirements : Key Responsibilities : A: Lead the team and contribute to the...
...Must have Skills: Apache Spark
Good to Have Skills: Data Warehouse ETL Testing
Key Responsibilities:
A: The resource will write and review complex SQL statements
B: The resource will work on ETL preferably on OWB
C: The resource will work on database...
Rs 12 - 20 lakhs p.a.
...Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have Skills : Apache Spark Good to Have Skills : No Technology Specialization Job Requirements : Key Responsibilities : 1: Set up and configure Databricks for...
Rs 7 - 11 lakhs p.a.
...Role Description : Design, build and configure applications to meet business process and application requirements. Must Have Skills : Apache Spark Good To Have Skills : Job Requirements : Key Responsibilities : A: Strong experience in creating Scala/Spark jobs for data...
Key Responsibilities : Develop, implement, and maintain Spark applications for data processing and analytics. Design and optimize Spark jobs... ...Computer Science, Engineering, or a related field. Proficiency in Apache Spark 3.x and Scala programming. Hands-on experience with Delta...
Rs 7 - 10 lakhs p.a.
...Job Title: Java Apache Camel Developer (5+ Years' Experience)
Location: Gurugram / Bangalore / Noida
Position Type: Full-time (WFO)/Remote
About Us:
Solvexis Consulting Private Limited is a dynamic and innovative IT consulting firm specializing in SAP solutions...
...Architect to join our dynamic team with expertise in Scala, Kafka, Spark, Big Data, and Batch Processing. As a Data Architect, you will... ...proficiency in Scala programming language.- In-depth knowledge of Apache Spark for large-scale data processing.- Experience in designing and...
Rs 7 - 11 lakhs p.a.
...Role Description : Design, build and configure applications to meet business process and application requirements. Must have Skills : Apache Kafka Good to Have Skills : Spring Boot Job Requirements : Key Responsibilities : A: Candidate should be able to analyze the requirements...
...stewards / data custodians / end users etc.- Demonstrated experience building efficient data pipelines using tools or technologies like Spark, and cloud-native tools like EMR, Azure Data Factory, Informatica, Ab Initio, etc.- Understand the need for both batch and stream ingestion...
Rs 12 - 16 lakhs p.a.
...Role Description : Design, build and configure applications to meet business process and application requirements. Must have Skills : Apache Kafka Good to Have Skills : No Technology Specialization Job Requirements : Key Responsibilities : 1: Should perform the software design...
...engineering workflow- Experience with S3, EC2, EMR, Lambda, Glue, Athena, Airflow, Redshift.- Strong SQL and Scala skills using the Spark framework.- Experience with enterprise data warehousing implementations using batch and real-time systems.- Expertise in working with...
Location : Chennai, Pune, Noida, Kochi, Hyderabad, Trivandrum. Job Description :- Proficient in SQL, Spark, Scala, and AWS, with a strong command over these technologies.- Minimum 6-7 years of relevant experience in Spark and SQL, plus 2-3 years of hands-on practice in AWS.-...
...Candidate should have worked in at least one Production project, not only understanding or theoretical knowledge. 3. Should have basics of Spark and Kafka -- at least good POC experience, and at least basic knowledge of Spark with Java or Scala. 4. Hive, SQL, HDFS, Sqoop - hands-on...
...build robust data architecture. Extensive experience in data modeling and database design. At least 6+ years of hands-on experience in the Spark/Big Data tech stack. Stream processing engines - Spark Structured Streaming/Flink. Analytical processing on Big Data using Spark. At least...
...Production environment. Should have good working experience on :- Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)- Spark - batch processing- Setting up ETL pipelines- Python or Java programming language is mandatory.- Worked in Data Lake, Data Warehousing domain...
Rs 20 - 21 lakhs p.a.
...Responsibilities
Roles And Responsibilities Of Cloudera & Spark Developer
Secure data management and portable cloud-native data analytics delivered in an open, hybrid data platform. Whether you're powering business-critical AI applications or real-time analytics at scale...
Rs 8 - 20 lakhs p.a.
...Skills Required: Python + AWS/Airflow (candidate should be very good in Python), Spark, SQL.
Experience: 3-8 years
Salary: 8 to 20 LPA
Location: Bangalore
No. of positions: 4 (2 SSE, 2 TC)
Job Description
Role requires experience in AWS...
Job Description :We are seeking a highly skilled and motivated Data Lead with expertise in Spark, Scala, Kafka, Big Data, and Batch Processing. As a Data Lead, you will play a key role in leading and managing the end-to-end data processing pipeline, ensuring the reliability...