Average salary: Rs 1,390,538 per year
Rs 11 - 15 lakhs p.a.
...Description : Design, build and configure applications to meet business process and application requirements. Must have Skills : Apache Spark Good to Have Skills : Microsoft SQL Server, Unix to Linux Migration, Data Warehouse ETL Testing, Amazon Web Services Job Requirements :...
Rs 12 - 16 lakhs p.a.
...Description : Design, build and configure applications to meet business process and application requirements. Must have Skills : Apache Spark Good to Have Skills : Hadoop Job Requirements : Key Responsibilities : A: Create Scala/Spark jobs for data transformation and...
Rs 7 - 11 lakhs p.a.
...Description : Design, build and configure applications to meet business process and application requirements. Must Have Skills : Apache Spark Good To Have Skills : Job Requirements : Key Responsibilities: A: Strong experience in creating Scala/Spark jobs for data transformation...
Rs 7 - 15 lakhs p.a.
...Description : Design, build and configure applications to meet business process and application requirements. Must have Skills : Apache Spark Good To Have Skills : Python Programming Language Job Requirements : Key Responsibilities: a) Lead the team and contribute to the...
Rs 12 - 20 lakhs p.a.
...Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have Skills : Apache Spark Good to Have Skills : No Technology Specialization Job Requirements : Key Responsibilities : 1. Set up and configure Databricks for an...
...Location
Offer in hand if any:
PAN Card No:
Notice period/how soon you can join:
- Should have 4-8 years of experience in Java and Spark SQL, with a Unix/Linux background.
- Should have good knowledge of Oracle Database SQL queries and views.
- Experienced to...
...languages commonly used in data engineering, such as Python, Java, or Scala, and experience with data processing frameworks such as Apache Spark, Apache Kafka, or Apache Beam.- Hands-on experience with cloud-based data platforms and services, such as AWS, Azure, or Google Cloud...
Rs 10 - 16 lakhs p.a.
...Description : Design, build and configure applications to meet business process and application requirements. Must have Skills : Apache Spark Good to Have Skills : Java Enterprise Edition, Apache Kafka, AAAP (Accenture Advanced Analytics Platform) Job Requirements : Key...
...We are looking for an exceptional Spark/Scala engineer with 5+ years of experience who will be responsible for:
Experience : 5+ yrs
Location : Hyderabad & Indore
Responsibilities:
Implementing large-scale Spark applications and fine-tuning them at runtime
Design and implement...
...client-facing projects, including working in close-knit teams- 3+ years of experience and interest in Big Data technologies (Hadoop / Spark / Relational DBs)- 3+ years of experience working on projects in the cloud, ideally AWS or Azure- Data Warehousing experience with...
...working experience in Elasticsearch to join our team. The ideal candidate should have a minimum of 2 years of experience in Hadoop and Spark development. This role will involve working with large datasets and implementing data processing solutions using Hadoop and Spark,...
...stewards / data custodians / end users etc- Demonstrated experience of building efficient data pipelines using tools or technologies like Spark, cloud-native tools like EMR, Azure Data Factory, Informatica, Ab Initio etc.- Understand the need for both batch and stream ingestion...
...We are seeking experienced Scala/Spark developers with extensive experience to join our growing Data Engineering practice, delivering transformative solutions. Immediate joiners are strongly encouraged to apply.
Location : Pune, Bangalore & Chennai
Skill : Spark Scala
Total...
Exp : 4-9 yrs
Location : Mumbai & Bangalore
Notice : Immediate - max 30 days
Role Profile :
- Minimum 4 yrs of experience, including 3+ years of proven experience working with the Scala programming language and its ecosystem
- Strong understanding of functional programming concepts and design patterns...
Rs 20 - 21 lakhs p.a.
...Responsibilities
Roles and Responsibilities of a Cloudera & Spark Developer
Secure data management and portable cloud-native data analytics delivered in an open, hybrid data platform. Whether you're powering business-critical AI applications or real-time analytics at scale...
...Job Description
Mission: As an Apache Spark™ Backline Engineer, you will help our customers be successful with the Databricks Data Intelligence platform by resolving important technical customer escalations and assisting the support team. You will be the technical bridge between...
Job Description :- Design and develop real-time data ingestion pipelines using Databricks and Spark Streaming to enable timely processing of large volumes of data.- Implement complex data transformations and aggregations to extract actionable insights from streaming data sources...
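As a rough illustration of the windowed aggregation such a streaming pipeline computes, here is a minimal sketch in plain Python (no Spark cluster required; the event data and window size are invented examples — a real job would express this with Spark Structured Streaming's `groupBy(window(...))`):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (epoch_seconds, key) events into fixed-size tumbling windows
    and count occurrences per key — the same result a Structured Streaming
    groupBy(window(...), key).count() would produce for this input."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # floor to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical clickstream events: (timestamp, event_type)
events = [(0, "click"), (5, "click"), (12, "view"), (14, "click")]
result = tumbling_window_counts(events, window_seconds=10)
# window [0, 10) counts "click" twice; window [10, 20) sees one "view" and one "click"
```

In a real Databricks job the same logic would run incrementally over a micro-batch or continuous stream, with watermarks to bound late data.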
...working in an agile environment (e.g. user stories, iterative development, etc.).- Knowledge and working experience in Elasticsearch is mandatory.- 3-5 years of experience in Hadoop & Elasticsearch mandatory.- 3-5 years of experience in Spark is mandatory. (ref:hirist.tech)
...range of enterprise technology transformations and solutions at some of the world's leading multinational organizations.
Skills : Apache Spark
Location : Bangalore
Years of Experience : 7.5 yrs
As an Application Developer, you will be responsible for designing, building, and...
...Skills : Python, PySpark, Azure Databricks, Data Factory, Data Lake, SQL.- Deep knowledge and experience working with Python/Scala and Spark- Experienced in Azure Data Factory, Azure Databricks, Azure Data Lake, Blob Storage, Delta Lake, Airflow.- Experience working with...
...Designing and implementing data pipelines using Azure Data Factory (ADF).- Developing and maintaining data processing scripts using Python, Spark, and Scala.- Building and optimizing data storage solutions using Azure Data Lake Storage (ADLS), Blob Storage, and Synapse.-...
...-scale enterprise data solutions and applications using one or more AWS data and analytics services in combination with 3rd parties - Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, and Glue.- Experience with an AWS cloud data lake for the development of real-time or near-real...
...Design, develop, and maintain data processing pipelines using Apache Spark and Scala.
Optimize Spark jobs for performance, scalability, and reliability.
Work closely with data engineers and data scientists to implement data-driven solutions.
Develop and maintain...
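Optimizing Spark jobs for scalability, as the responsibilities above require, often comes down to how records are partitioned. The sketch below mimics hash partitioning and the common "key salting" fix for skewed aggregations, in plain Python with an invented key name (Spark's actual partitioner differs in detail):

```python
import zlib

def partition_of(key: str, num_partitions: int) -> int:
    # Deterministic stand-in for Spark's hash partitioner:
    # partition = hash(key) mod num_partitions.
    return zlib.crc32(key.encode()) % num_partitions

def salted_keys(key: str, salt_factor: int) -> list:
    # Salting splits one hot key into salt_factor synthetic keys so a
    # skewed aggregation spreads over several partitions; partial results
    # are merged back together in a cheap second aggregation step.
    return ["{}#{}".format(key, i) for i in range(salt_factor)]

hot_key = "popular_user"  # hypothetical skewed key
unsalted = {partition_of(hot_key, 8)}                       # every record hits one partition
salted = {partition_of(k, 8) for k in salted_keys(hot_key, 8)}
```

The same idea underlies Spark-side remedies such as `repartition`, adaptive query execution's skew-join handling, and manual salting before a `groupBy`.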
...Experience working with data platforms, including EMR, Databricks, etc.- Experience working with distributed technology tools, including Spark, Presto, Scala, Python, Databricks, Airflow- Developed PySpark code for AWS Glue jobs and for EMR. Worked on scalable distributed...
...engineering, with a strong focus on large-scale data platforms and data products.- Strong experience with big data technologies such as Hadoop, Spark, and Hive- Proficiency in either Scala or Java programming language.- Experience leading and managing teams of data engineers-...
...our team.- You will play a key role in designing, developing, and maintaining large-scale data processing pipelines using Python and Spark/PySpark.- Your expertise in distributed computing frameworks and DevOps tools will be instrumental in building efficient and scalable...
...engineering/administration with a focus on Data Mesh/Virtualization technologies.- Proficiency in distributed file and table formats, Spark, Object Storage (preferably Dell ECS), and metadata management.- Solid understanding of hardware systems performance and distributed systems...
...An experienced Apache Spark & Java developer will be responsible for supporting and enhancing an enterprise-wide data platform. In this role, you will not only be designing and coding, but also collaborating with team members to help support core capabilities in the enterprise...
...Experience : 7-10 years
Location : Anywhere in India
Education : BE, B.Tech, any tech graduate
Must-Have Technical Skills :
- 3+ years of Spark or Scala
- 2+ years of Hadoop/Big Data using tools like Hive, Spark, PySpark, Scala, and RDBMS/SQL
Strongly Preferred : GCP, including...
Job Profile : Spark (PySpark) Developer
Industry Type : IT Services
Job Description :
- The developer must have sound knowledge of Apache Spark and Python programming.
- Deep experience in developing data processing tasks using PySpark, such as reading data from external sources...
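To show the shape of the PySpark ingestion task described above, here is a hedged sketch: the bucket path and column name are invented, it assumes the caller supplies a `pyspark.sql.SparkSession`, and it is not runnable without a Spark installation — it illustrates the read-clean-return pattern only.

```python
def build_ingest_job(spark):
    """Read CSV data from an external source, clean it, and return a
    DataFrame ready to be written out. `spark` is a pyspark.sql.SparkSession
    (pyspark is deliberately not imported here, so the sketch loads without it)."""
    df = (
        spark.read
        .option("header", "true")
        .csv("s3://example-bucket/input/")   # hypothetical external source
    )
    # Drop rows missing the key column, then de-duplicate on it.
    return df.dropna(subset=["id"]).dropDuplicates(["id"])
```

A production job would typically add an explicit schema instead of relying on headers, and finish with a `df.write` to the target table or path.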