Average salary: Rs 1,375,000 per year
Search Results: 387 vacancies
Job Description: As a Kafka Architect specializing in Spark and Apache Server, you will play a key role in designing, architecting, and implementing real-time data streaming solutions using Apache Kafka, Apache Spark, and related technologies. You will work closely with our...
Rs 12 - 16 lakhs p.a.
...Role Description: Design, build and configure applications to meet business process and application requirements. Must have Skills: Apache Spark. Good to Have Skills: Hadoop. Job Requirements: Key Responsibilities: A: Create Scala/Spark jobs for data transformation and...
Rs 12 - 16 lakhs p.a.
...Role Description: Design, build and configure applications to meet business process and application requirements. Must have Skills: Apache Spark. Good to Have Skills: Microsoft SQL Server, Unix to Linux Migration, Data Warehouse ETL Testing, Amazon Web Services. Job Requirements:...
Rs 7 - 15 lakhs p.a.
...Role Description: Design, build and configure applications to meet business process and application requirements. Must have Skills: Apache Spark. Good To Have Skills: Python Programming Language. Job Requirements: Key Responsibilities: a: Lead the team and contribute to the...
Rs 12 - 20 lakhs p.a.
...Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have Skills: Apache Spark. Good to Have Skills: No Technology Specialization. Job Requirements: Key Responsibilities: 1: Set up and configure Databricks for...
..., particularly utilizing Flink and Kafka, alongside expertise in Spark and AWS technologies. As a Data Engineer, you will play a crucial... ...Design, develop, and optimize stream processing pipelines using Apache Flink and Kafka to process real-time data streams efficiently. Data...
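For illustration only: the core task such Flink/Kafka listings describe, keyed aggregation over timestamped event streams, can be sketched in plain Python. Tumbling-window counting is one common Flink pattern; the event fields and 60-second window size below are hypothetical, not taken from any listing.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Group (timestamp, key) events into fixed tumbling windows and count
    occurrences per key - the shape of a Flink keyed window aggregation."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs  # align to window boundary
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Hypothetical click events: (epoch_seconds, user_id)
events = [(0, "a"), (10, "a"), (30, "b"), (65, "a"), (70, "b"), (119, "b")]
print(tumbling_window_counts(events))
# {0: {'a': 2, 'b': 1}, 60: {'a': 1, 'b': 2}}
```

In an actual Flink job the windowing, state, and fault tolerance are handled by the framework; this sketch only shows the aggregation logic a candidate would express there.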
...(Must-have). 2+ years of experience in Python programming (Must-have). Sound knowledge of distributed systems and data processing with Spark. Knowledge of any tool for scheduling and orchestration of data pipelines or workflows, preferably Airflow (Must-have). 1+ years experience...
Rs 12 - 30 lakhs p.a.
...Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have Skills: Apache Kafka. Good to Have Skills: Functional Test Planning. Job Requirements: Key Responsibilities: a: Work as a Kafka developer; implement...
Job Profile: Spark (PySpark) Developer
Industry Type: IT Services
Job description:
- The developer must have sound knowledge in Apache Spark and Python programming.
- Deep experience in developing data processing tasks using PySpark, such as reading data from external sources...
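As a rough sketch of the task this listing names (read external records, filter, aggregate), here is the same pipeline shape in plain Python; in PySpark the equivalent steps would be spark.read, then filter, groupBy, and agg. The record fields and salary figures below are hypothetical.

```python
def summarize_salaries(rows):
    """Filter malformed rows and compute an average salary per city -
    the shape of a typical Spark groupBy/agg job, in plain Python."""
    valid = [r for r in rows if r.get("salary") and r.get("city")]  # filter step
    totals = {}
    for r in valid:  # equivalent of groupBy("city").agg(avg("salary"))
        city = r["city"]
        s, n = totals.get(city, (0, 0))
        totals[city] = (s + r["salary"], n + 1)
    return {city: s / n for city, (s, n) in totals.items()}

rows = [
    {"city": "Chennai", "salary": 1200000},
    {"city": "Chennai", "salary": 1600000},
    {"city": "Pune", "salary": 1400000},
    {"city": "Pune", "salary": None},  # dropped by the filter step
]
print(summarize_salaries(rows))  # {'Chennai': 1400000.0, 'Pune': 1400000.0}
```

The point of Spark is that these same filter/group/aggregate steps run partitioned across a cluster rather than in one process.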
...our team.
- You will play a key role in designing, developing, and maintaining large-scale data processing pipelines using Python and Spark/PySpark.
- Your expertise in distributed computing frameworks and DevOps tools will be instrumental in building efficient and scalable...
...vast data sets using distributed processing tools such as Akka and Spark.
- Evaluate and enhance existing data pipelines in collaboration... ...file types.
- Proficiency with ETL and data pipeline tools such as Apache NiFi, Airflow, etc.
- Strong coding skills in Java or Scala, and...
...Mode: Hybrid
Notice Period: Immediate to 15 days
JD:
Qualifications:
Mandatory Skills & Experience:
Strong GCP; expertise in Apache Spark, Beam, and Airflow; Python experience; SQL.
Good understanding of the Hadoop eco-system and Hive.
Exceptional troubleshooting...
Rs 7 - 11 lakhs p.a.
...Role Description: Design, build and configure applications to meet business process and application requirements. Must have Skills: Apache Kafka. Good to Have Skills: Spring Boot. Job Requirements: Key Responsibilities: A: Candidate should be able to analyze the requirements...
Location: Chennai, Pune, Noida, Kochi, Hyderabad, Trivandrum.
Job Description:
- Proficient in SQL, Spark, Scala, and AWS, with a strong command over these technologies.
- Minimum 6-7 years of relevant experience in Spark and SQL, plus 2-3 years of hands-on practice in AWS.
-...
...Location
Offer in hand if any:
Pan Card No:
Notice period/how soon you can join:
- Should have 4-8 years of experience in Java and Spark SQL, and have worked in a Unix/Linux background.
- Should have good knowledge of Oracle Database SQL queries and views.
- Experienced to...
...engineering workflow- Experience with S3, EC2, EMR, Lambda, Glue, Athena, Airflow, Redshift.- Strong experience with SQL and Scala skills using Spark framework.- Experience with enterprise Data warehousing implementations using Batch and real time systems.- Expertise in working with...
...Job Description Responsibilities:
- Configure, fine-tune, and optimize Spark and YARN administration for high performance and resource efficiency.
- Set up, configure, and manage Data Lakehouses using Hudi and/or Delta Lake for efficient data storage and processing...
...Production environment. Should have good working experience on:
- Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)
- Spark
- Batch processing
- Setting up ETL pipelines
- Python or Java programming language is mandatory.
- Worked in Data Lake, Data...
...Designing and implementing data pipelines using Azure Data Factory (ADF).
- Developing and maintaining data processing scripts using Python, Spark, and Scala.
- Building and optimizing data storage solutions using Azure Data Lake Storage (ADLS), Blob Storage, and Synapse.
-...
...designing, implementing, and maintaining data processing systems. You will work with large-scale data sets using technologies such as Apache Spark, Scala, Hadoop, Jenkins CI/CD, and Microservices. This is an excellent opportunity to contribute to the development of cutting-edge...