Average salary: Rs 1,320,689 per year
Search Results: 2,776 vacancies
Rs 7 - 11 lakhs p.a.
...Role Description : Design, build and configure applications to meet business process and application requirements. Must Have Skills : Apache Spark Good To Have Skills : Job Requirements : Key Responsibilities : A: Strong experience in creating Scala/Spark jobs for data...
Rs 7 - 15 lakhs p.a.
...Role Description : Design, build and configure applications to meet business process and application requirements. Must have Skills : Apache Spark Good To Have Skills : Python Programming Language Job Requirements : Key Responsibilities : a: Lead the team and contribute to the...
...able to work on multiple medium to large projects. The successful candidate will have excellent technical skills (Apache/Confluent Kafka, Big Data technologies, Spark/Pyspark) and also will be able to take oral and written business requirements and develop efficient code to...
Rs 12 - 16 lakhs p.a.
...Role Description : Design, build and configure applications to meet business process and application requirements. Must have Skills : Apache Spark Good to Have Skills : Microsoft SQL Server, Unix to Linux Migration, Data Warehouse ETL Testing, Amazon Web Services Job Requirements :...
Rs 12 - 16 lakhs p.a.
...Role Description : Design, build and configure applications to meet business process and application requirements. Must have Skills : Apache Spark Good to Have Skills : Hadoop. Job Requirements : Key Responsibilities : A: Create Scala/Spark jobs for data transformation and...
Rs 12 - 20 lakhs p.a.
...Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have Skills : Apache Spark Good to Have Skills : No Technology Specialization Job Requirements : Key Responsibilities : 1: Set up and configure Databricks for...
...designing, building, and maintaining data pipelines, with expertise in setting up and optimizing data processing frameworks such as Apache Spark or similar, Kafka/Pulsar, and Data Lake. Responsibilities: Design and develop robust, scalable, and efficient data pipelines to...
Rs 10 - 16 lakhs p.a.
...Role Description : Design, build and configure applications to meet business process and application requirements. Must have Skills : Apache Spark Good to Have Skills : Java Enterprise Edition, Apache Kafka, AAAP (Accenture Advanced Analytics Platform) Job Requirements : Key...
...Job Title: SPARK L3 Support Engineer
Location: Bangalore or Hyderabad
Workspace Type: Hybrid (preferably office, at least 3 days...
...Hadoop and Cloud Streaming technologies.
Open Source Ecosystem (Linux, Apache, etc.).
Strong knowledge of Databricks and open-source Spark,...
Rs 13 - 16 lakhs p.a.
...Assist in defining requirements and designing applications to meet business process and application requirements.
Must have Skills : Apache Spark
Good to Have Skills : Python Programming Language
Job Requirements : Key Responsibilities : Python, Linux OS. Has hands-on...
...of 6+ years of experience in developing ETL jobs using any industry-leading ETL tool.
Ability to design, develop, and optimize Apache Spark applications for large-scale data processing.
Ability to implement efficient data transformation and manipulation logic using Spark...
.../MCA or equivalent
Strong experience with the following technologies:
Hands-on Development/Administration experience with Apache Ignite
Conceptual knowledge of distributed caching frameworks such as Redis.
Experience in Spring Boot, PL/SQL, and Java...
Job Description :
- Looking for a minimum of 3+ years of experience working as a data engineer.
- Experience working with the Airflow orchestration tool is mandatory.
- Hands-on experience working with AWS, Azure, Kafka, or Scala is required.
- Must have experience of working in...
Rs 12 - 30 lakhs p.a.
...Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have Skills : Apache Kafka Good to Have Skills : Functional Test Planning Job Requirements : Key Responsibilities : a : Work as a Kafka developer Implement...
...(Must-have). 2+ years of experience in Python programming (Must-have). Sound knowledge of distributed systems and data processing with Spark. Knowledge of any tool for scheduling and orchestration of data pipelines or workflows, preferably Airflow (Must-have). 1+ years experience...
...of overall technology experience that includes at least 5+ years of hands-on data engineering utilizing the power of Python and the Apache Spark framework to design, build, and maintain data pipelines and perform large-scale data processing tasks.
- Writing efficient and...
...skilled and experienced Senior Python Developer with expertise in Apache Airflow to join our team. The ideal candidate will have a strong... ...Qualifications :
- Experience with other data processing technologies such as Spark, Kafka, or Hadoop.
- Familiarity with containerization...
...Greetings From Maneva!
Job Description
Job Title: Apache PySpark
Location: Chennai
Experience: 4-10 Years... ...warehousing project.
Must have excellent knowledge of Apache Spark and Python programming experience
Deep experience in...
...similar data warehouse technology.
- Having working experience with dbt and other technologies of the modern data stack, such as Snowflake, Apache Airflow, Fivetran, AWS, git, Looker.
- Experience in agile processes, such as Scrum.
- Extensive experience in writing advanced SQL...
...Engineer
Location : Bangalore
Experience : 3+ Yrs.
Responsibilities :
- Develop and maintain scalable data pipelines for batch processing using Apache Spark in Big Data projects.
- Utilize the Scala programming language to implement efficient data processing solutions.
- Collaborate with cross...