Average salary: Rs 1,412,933 per year
Search Results: 8,740 vacancies
Job Description:
1. Technologies should be Hive, Pig, Sqoop, ZooKeeper, Cloudera - these are Hadoop ecosystem tools
2. Candidate should have worked on at least one production project, not only understanding or theoretical knowledge
3. Should have basics of Spark and Kafka -- at...
Rs 5 - 30 lakhs p.a.
...deployment, and cloud-native OSS Platforms. Additionally, we have developed our own cloud-native probing solutions for network debugging,... ...
Qualifications and Skills
~4-6 years of experience in Hadoop administration
~ Proficiency in NoSQL databases like MongoDB,...
...Development team to implement data strategies, build data flows, and develop conceptual data models.- Recognize the need for a specific... ...like Collibra, Alteryx and Solidatus- Experience in Data Vault Modeling.- Knowledge of Hadoop and Google BigQuery is a plus. (ref:hirist.tech)
...operationalize and deploy offline models into production-ready systems.- Develop and deploy robust tools and services to streamline machine... ...Azure, AWS, GCP).- 2 years of experience in Python, Databricks/Hadoop.- 2 years of experience in MLOps. Good to have: Curiosity, consultative...
...datasets- Build processes supporting data transformation, data structures, metadata, dependency and workload management- Big data tools like Hadoop, Spark, Kafka, etc.- Message queuing, stream processing, and highly scalable - big data- data stores- Building and optimizing - big...
...cloud computing, and data management ecosystems like Google Cloud, Hadoop, HDFS, and Spark. - Proficient in Data Modelling that can... ...Management, and Data Governance in General.- Programming Hands on : Develop and maintain software components using Python, PySpark, and GCP services...
Job Description:
- Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)
- Python programming language is mandatory
- PySpark
- Excellent with SQL
- Excellent with Airflow is a plus
Good to Have:
- Airflow
- Good aptitude, strong problem-solving abilities, and analytical...
We are seeking a skilled AWS Hadoop Administrator to join our team and manage our Hadoop infrastructure on the AWS platform. In this role... ...team, you will collaborate closely with data engineers, developers, and other stakeholders to support our big data initiatives and...
...experience in Data modelling, data mapping, and data conversion/migration Added Advantage
Understanding of Big Data/NoSQL databases like Hadoop, MongoDB, Snowflake, PostgreSQL
Should have experience in the administration, configuration, and maintenance of multiple Pentaho...
...days Max.
Mandatory Skills: Unix Shell Scripting, SQL, Ab-Initio Developer, ETL; should be good at Programming.
Job Description:
- 1-4 years... ...Knowledge of the ETL side of Cloud platforms like AWS or Azure and of the Hadoop platform is also an added advantage.
- Experience in working with...
Rs 18 lakh p.a.
...Job Description
Roles And Responsibilities of ETL Developer
• Participate in Design discussions and update Design documents
• Transforms... ...• Specialization in database engineering skills – SQL, NoSQL, Hadoop, etc.
• Exposure to warehousing architecture processes – MOLAP...
...Software Development experience.- Hands on experience in designing and developing applications using Java EE platforms.- Object Oriented analysis... ...PL/SQL experience in multiple database platforms is preferred.- Hadoop, Exadata experience is desired and MicroStrategy/Tableau...
...Lead Consultant Java Developer - ITO080250
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes... ...· Experience in development using Java, Spring, Hibernate with Hadoop
· SQL
· Knowledge of Hadoop and related technologies including...
...with appropriate reasoning.- Experience in any relational database (Oracle, MySQL, etc.) is a significant plus- Experience working with Hadoop technologies and the Spark framework.- Foundational understanding of Cloud-based development in general and working on AWS deployed...
...0% quality assurance parameters
Do
# Instrumental in understanding the requirements and design of the product/ software
# Develop software solutions by studying information needs, systems flow, data usage, and work processes
# Investigating problem areas...
...language (Python, C, C++, Java, SQL).- Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau).
Title: Business Intelligence Developer
About the Job: As a Business Intelligence Developer, you'll be at the forefront of leveraging data to...
...responsiveness of the application- Creating database schemas and developing database objects (SQL/NoSQL) that represent and support business... ...or SQL Server).- NoSQL knowledge (MongoDB, Redis, Cassandra or Hadoop).- Strong understanding of NodeJS, Nest.js, and JavaScript.- Understanding...
...Amantya Technologies - Gurgaon is hiring for:
Title - Java Backend Developer
No of Positions - 3
Exp - 3+ Years
Job Description :... ...an added advantage.
- Experience with big data tools such as Hadoop, Hive, Spark, etc. is a plus.
- Knowledge with modern data...
Designation : Software Developer – Application
Job Location : Gurgaon
Report Manager : Architect - Software
Objective Of This Role: (Brief description of the role, why are we hiring, and what will...
...Collaborate with enterprise architects, data architects, developers & engineers, data scientists, and information designers to lead the identification... ...level.
Hands-on experience in Big Data technologies – Hadoop, Sqoop, Hive, and Spark, including DevOps.
Strong SQL (Hive/...