Average salary: Rs 900,000 / year
Search Results: 13,301 vacancies
...in web development frameworks such as Play or Akka - Solid understanding of relational databases, SQL - Experience with designing and developing RESTful APIs and microservices - Familiarity with version control systems (e.g., Git) and collaborative development workflows - Strong...
...Airflow, Redshift. - Strong SQL and Scala skills, with experience using the Spark framework. - Experience with enterprise data warehousing... ...knowledge - Give timely, balanced, honest, and caring feedback to help develop others. Primary Skill Set: - Hands-on experience with AWS and Big Data...
Roles & Responsibilities:
- Design, develop, and test applications using Apache Spark to meet business process and application requirements.
- Collaborate with cross-functional teams to ensure successful project delivery, including working with business analysts, project...
Rs 7 - 15 lakhs p.a.
...Role: Application Developer. Role Description: Design, build and configure applications to meet business process and application requirements. Must-have Skills: Apache Spark. Good-to-have Skills: Python programming language. Job Requirements: Key Responsibilities: a) Lead...
...to deliver data and analytical services, including data ingestion, transformation, storage, and reporting, and strives to continuously develop new and improved data engineering capabilities. The key accountabilities: - Design data solutions to meet business, technical and user...
...environment alongside software engineers, and a solid foundation in developing solutions for marketing, and advertising organizations.... ...Data Science and/or Statistical analysis- Expertise in Python, Spark, SQL or other modern programming languages- Experience in using...
...Description: Roles & Responsibilities: - Must have 4 years of experience in Python. - Must have 4 years' relevant experience in Apache Spark or PySpark. - Must have 4 years of experience in SQL (must be proficient in writing advanced, complex queries). - Must have 4 years' relevant...
Total experience - 3-8 yrs
Location - Pan India (hybrid model)
Mandatory - Java, and Snowflake or Spark
Process - L1 + L2 + Client
NP (notice period) - 30-45 days
Hands-on experience - Java, Snowflake or Snowpark
Programming experience - 3+ years in Java
Minimum 1 year experience...
Rs 8 - 20 lakhs p.a.
...Python + AWS/Airflow (candidate should be very strong in Python), Spark, SQL.
Experience: 3-8 years
Salary: 8 to 20 LPA... ...cloud services needed to fulfil the technical design
Design, develop, and deliver data integration interfaces in AWS
Design, develop...
...Job Role: Data Engineer. Experience: Spark/Streaming - 2-3 years - 7-9 years
Location: Pune/Bangalore/Chennai/Hyderabad/Mumbai... ...Data Engineer (Spark/Streaming)
Roles & Responsibilities:
1. Develop and maintain architectures such as databases and large-scale...
...~5-8 years' experience in Core Java (8 and above) and Spring.
~ Proficiency with SQL
~ Automated unit testing using JUnit.
~ Spark and Scala are desirable.
~ Exposure to Big Data technologies (e.g. Parquet, Dremio etc.) is desirable.
~ Good knowledge of Investment...
...one of our MNCs based out of Mumbai.
Skill: Data Informatica Developer
Experience: 7-12 years
Job Type: Full Time Employment... ...Teradata, Hadoop or Cloud - Snowflake,
Unix shell scripting/Python/Spark Framework.
Candidate should have knowledge of tools used for...
...OUR PARTNER COMPANY AAPNA INFOTECH
Responsibilities:
Design, develop, test, and maintain Java-based applications.
Write clean,... ...well-known Scala and Spring libraries and frameworks, including Spark.
Affinity for learning and applying new technologies and solving...
...Dear Candidates, we are looking for an Axiom Developer.
Location - Goregaon East, Mumbai
Please find the job description below.
Responsibilities... ...• DevOps Tooling
• Experience with Java/Scala/Python and Spark Framework
• Exposure to BFSI and finance industry
Non-...
...Minimum 2 years of experience in Big Data technologies
· Hands-on experience with the Hadoop stack - HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required in building end-to-end data pipelines....
...Proficient in building and maintaining data pipelines in Databricks. Proficient in data engineering technologies such as Delta Lake, Apache Spark, Azure Data Factory and similar. Proficient in big data engineering programming languages such as Python and/or Scala. Experience in T-...
...Immediate to 15 days only. Role & Responsibilities: - We are seeking a Data Developer with a minimum of 7-12 years of work experience in designing,... ..., Hadoop or Cloud - Snowflake, - Unix shell scripting/Python/Spark Framework. - Candidate should have knowledge of tools used for...
...backend code interacts with the front-end systems. 13. Designing and developing APIs (REST/RESTful)
This Role Requires:
Requirement -... ...throughput data-related architecture and technologies (e.g. Kafka, Spark, Hadoop). 7. You are familiar with the most up-to-date technologies...
...or equivalent
Role & Responsibilities:
This is a Developer role as part of Liquidity Powai development team. The person would... ...different departments, countries and regions.
Scala with Spark Framework
Hadoop ecosystem
Good Unix / Linux OS...
...product development initiatives.
Responsibilities:
Designing, developing and maintaining core system features, services and engines... ...at least one server-side framework like Servlets, Spring, Java Spark (Java).
Proficient in using ORM/Data access frameworks like Hibernate...