Average salary: Rs 437,999 per year


Search Results: 300 vacancies

 ...Technologies should be Hive, Pig, Sqoop, ZooKeeper, Cloudera - these are Hadoop ecosystem tools. 2. Candidate should have worked in at least one...  ...understanding or theoretical knowledge. 3. Should have basics of Spark and Kafka - at least good POC experience. At least basic... 

Huquo Consulting

Gurgaon
5 days ago
 ...Build processes supporting data transformation, data structures, metadata, dependency and workload management. Big data tools like Hadoop, Spark, Kafka, etc. Message queuing, stream processing, and highly scalable big data data stores. Building and optimizing big data... 

HuQuo

Gurgaon
5 days ago

Rs 12 - 16 lakhs p.a.

 ...process and application requirements. Must have Skills: Apache Spark. Good to Have Skills: Microsoft SQL Server, Unix to Linux Migration...  ...actions, Spark configuration and tuning techniques. B: Knowledge of Hadoop architecture: execution engines, frameworks, application tools. C... 

ICS Consultancy Services

Gurgaon
a month ago

Rs 7 - 11 lakhs p.a.

 ...process and application requirements. Must Have Skills: Apache Spark. Good To Have Skills: Job Requirements: Key Responsibilities: A: Strong...  ...query tuning and performance optimization. B: Good understanding of Hadoop architecture and distributed systems, e.g. CAP theorem, partitioning... 

ICS Consultancy Services

Noida
a month ago

Rs 12 - 16 lakhs p.a.

 ...Design, build and configure applications to meet business process and application requirements. Must have Skills: Apache Spark. Good to Have Skills: Hadoop. Job Requirements: Key Responsibilities: A: Create Scala/Spark jobs for data transformation and aggregation. B: Write... 

ICS Consultancy Services

Noida
a month ago
 ...understanding of the mechanism necessary to successfully implement a change. Good to have: - Experience in data management and data lineage tools like Collibra, Alteryx and Solidatus. - Experience in Data Vault Modeling. - Knowledge of Hadoop, Google BigQuery is a plus. (ref: hirist.tech)

HuQuo Consulting Pvt. Ltd.

Gurgaon
5 days ago
 ...- 2 years of experience working with cloud platforms (e.g., Microsoft Azure, AWS, GCP). - 2 years of experience in Python, Databricks/Hadoop. - 2 years of experience in MLOps. Good to have: - Curiosity, consultative mindset, and eagerness to explore new technologies. - Ownership... 

Talent Socio

Gurgaon
12 days ago
 ...position include: - Provides technical leadership in the Big Data space (Hadoop stack like M/R, HDFS, Pig, Hive, HBase, Flume, Sqoop, NoSQL...  ...Engineer: Desired skills for lead data engineer include: - Python - Spark - Java - Hive - SQL - Hadoop architecture - Large-scale search... 

HuQuo Consulting Pvt. Ltd.

Gurgaon
10 days ago
Job Description: - 4-7 years of experience in DE development. - Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet). - Python programming language is mandatory. - PySpark. - Excellent with SQL. - Excellent with Airflow is a plus. - Good aptitude... 

HuQuo Consulting Pvt. Ltd.

Gurgaon
17 days ago
We are seeking a skilled AWS Hadoop Administrator to join our team and manage our Hadoop infrastructure on the AWS platform. In this role...  ...ecosystem on AWS such as EMR, S3, EC2, IAM policy, MWAA/Airflow, Hadoop, YARN, Spark, MLflow. Experience with Linux environments and Bash... 

Valiance Analytics

Gurgaon
24 days ago

Rs 7 - 15 lakhs p.a.

 ...Description: Design, build and configure applications to meet business process and application requirements. Must have Skills: Apache Spark. Good To Have Skills: Python Programming Language. Job Requirements: Key Responsibilities: A: Lead the team and contribute to the... 

ICS Consultancy Services

Gurgaon
a month ago
 ...tuning, and deploying the apps to the production environment. Should have good working experience with: - Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet) - Spark batch processing - Setting up ETL pipelines - Python or Java programming language is mandatory. - Worked... 

HuQuo

Gurgaon
5 days ago

Rs 20 - 21 lakhs p.a.

 ...Responsibilities: Roles and responsibilities of a Cloudera & Spark Developer: Secure data management and portable cloud-native data analytics...  ...Enterprise includes CDH, the world's most popular open-source Hadoop-based platform, as well as advanced system management and data... 

MM Staffing & Career Consultants Pvt Ltd

Gurgaon
a month ago
 ...Must have Skills: Apache Spark. Good to Have Skills: Data Warehouse ETL Testing. Key Responsibilities: A: The resource will write and review complex SQL statements. B: The resource will work on ETL, preferably on OWB. C: The resource will work on... 

Opalforce Inc

Gurgaon
more than 2 months ago
 ...Mumbai, Pune, Nagpur, Indore, Delhi/NCR, Ahmedabad. Job Description: - 6+ years of overall Data Analytics and BI experience. - Experience in Spark, Hive, Scala. - Build data pipelines for ETL that fetch data from a variety of sources such as flat files, relational databases and APIs. -... 

Forward eye technologies

Delhi
8 days ago

Rs 12 - 20 lakhs p.a.

 ...Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have Skills: Apache Spark. Good to Have Skills: No Technology Specialization. Job Requirements: Key Responsibilities: 1: Set up and configure Databricks for an... 

ICS Consultancy Services

Gurgaon
a month ago
 ...Skills: Python, PySpark, Azure Databricks, Data Factory, Data Lake, SQL. - Deep knowledge and experience working with Python/Scala and Spark. - Experienced in Azure Data Factory, Azure Databricks, Azure Data Lake, Blob Storage, Delta Lake, Airflow. - Experience working with... 

Unique Occupational Services Private Limited

Noida
21 days ago
Key Responsibilities: Develop, implement, and maintain Spark applications for data processing and analytics. Design and optimize Spark jobs for performance and scalability. Implement Delta Lake solutions for efficient data storage and management. Build streaming solutions for... 

HuQuo Consulting Pvt. Ltd.

Gurgaon
5 days ago
Job Description: Technical Expertise: - Should have experience working with Microsoft Azure tools like Spark, Databricks, Synapse (knowledge of these will be an added advantage). - Should be very strong on BI and EDWH concepts. - Must have good experience working with Microsoft... 

Ally eXecutive HR Consulting

Noida
a month ago
 ...Proven real-time exposure to and use of contemporary data mining, cloud computing, and data management ecosystems like Google Cloud, Hadoop, HDFS, and Spark. - Proficient in data modelling that can represent complex data structures while ensuring accuracy, consistency, and... 

IFLOWTECH SOLUTIONS PRIVATE LIMITED

Gurgaon
6 hours ago