Average salary: Rs 700,000 per year
...Job Title: Sr Hadoop Developer
Location: Hyderabad, India
Experience: 3 to 8 years
Job Description:
We are seeking a Senior Hadoop Developer with a minimum of 3 years of experience to join our dynamic team. The ideal candidate will have extensive experience in designing...
...Job Description: Good hands-on experience in Hadoop
Strong technical knowledge in PySpark
Good communication skills to handle client requirements.
Must Have Skills: Hadoop + PySpark
Experience: 4-6 years
Notice Period: 0-15 Days
Work Timing: Regular Shift...
We have an immediate opportunity for a Hadoop Data Engineer (4 to 8 years)
Job Role: Hadoop Data Engineer
Job Location: Pune
Experience: 4–8 years
Notice Period: 0–15 days
About Company:
At Synechron, we believe in the power...
...datasets
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management
- Big data tools like Hadoop, Spark, Kafka, etc.
- Message queuing, stream processing, and highly scalable big data stores
- Building and optimizing big...
...with 5+ years on PySpark/NoSQL is mandatory)
1. Person should be strong in PySpark
2. Should have hands-on experience in the MWAA (Airflow) / AWS EMR (Hadoop, Hive) framework
3. Hands-on and working knowledge of Python
4. Knowledge of AWS services like EMR, S3, Lambda, Step Functions, Aurora -...
We are seeking a skilled AWS Hadoop Administrator to join our team and manage our Hadoop infrastructure on the AWS platform. In this role, you will be responsible for deploying, configuring, monitoring, and maintaining our Hadoop clusters to ensure optimal performance, scalability...
...Greetings From Maneva!
Job Description
Job Title: Cloudera Hadoop Architect
Location: Chennai / Pune / GNDC
Experience: 8–15 years
Relevant Experience: 10 years
Job Requirements:
Strong knowledge and Hands on experience...
...leading on client-facing projects, including working in close-knit teams
- 3+ years of experience and interest in Big Data technologies (Hadoop / Spark / Relational DBs)
- 3+ years of experience working on projects within the cloud, ideally AWS or Azure
- Data Warehousing...
...Job title: Senior Hadoop Developer
Experience: 4+ years
Notice Period: Immediate to 15 days only
Location: Manyata Embassy Business Park, Bangalore.
Mode of Employment: Contract
Mode of Work: Hybrid (2-3 days a week, mandatory)
Working experience in elastic...
Rs 15–20 lakhs p.a.
...Role Description: Design, build, and configure applications to meet business process and application requirements.
Must Have Skills: Hadoop
Good to Have Skills: Banking Strategy
Job Requirements:
1. Responsibilities:
A. Understanding the requirements of input-to-output transformations...
Responsibilities:
- Designing and implementing fine-tuned, production-ready data/ML pipelines on the Hadoop platform.
- Driving optimization, testing, and tooling to improve quality.
- Reviewing and approving high-level & detailed designs to ensure that the solution delivers to the business...
...working in an agile environment (e.g. user stories, iterative development, etc.).
- Knowledge of and working experience in Elasticsearch is mandatory.
- 3-5 years of experience in Hadoop & Elasticsearch is mandatory.
- 3-5 years of experience in Spark is mandatory. (ref:hirist.tech)
...AWS/Azure cloud data stores and their DB/DW-related service offerings.
- Should have knowledge and experience of Big Data technologies (Hadoop ecosystem) and NoSQL databases.
- Should have technical expertise and working experience in at least 2 reporting tools among Power BI,...
Job Description: Hadoop/Omnia Dev Engineers to develop and deliver code for the assigned work in accordance with time, quality, and cost standards.
Job responsibilities:
- Interact with business stakeholders and designers to understand business requirements.
- Hadoop...
Role: Hadoop Administrator
Industry Type: IT
Employment Type: Full Time
Role Category: Permanent
Department: Engineering
Job Description:
- Configure various property files like core-site.xml, hdfs-site.xml, mapred-site.xml based on the job requirements.
- Manage and review Hadoop...
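For reference, the property files named in this posting follow Hadoop's standard XML configuration format. A minimal sketch of core-site.xml is shown below; the hostname and port here are illustrative placeholders, not values from any posting:

```xml
<!-- core-site.xml: fs.defaultFS tells HDFS clients where the NameNode lives -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode.example.com:8020</value>
  </property>
</configuration>
```

hdfs-site.xml and mapred-site.xml use the same `<configuration>`/`<property>` layout, just with different property names (replication factor, MapReduce framework settings, and so on).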
...manner.
Work closely with Functional Analysis and Quality Assurance teams
Your skills and experience
Extensive experience with Hadoop and Java/Scala-related technologies, such as
Understanding of the Hadoop ecosystem and the HDFS file system.
Understanding of different...
Mandatory Skills:
- 4+ years of hands-on experience with Hadoop and system administration, with sound knowledge of Unix-based operating system internals.
- Working experience on the Cloudera CDP and CDH and Hortonworks HDP distributions.
- Linux experience (RedHat, CentOS, Ubuntu).
- Experience...
Job Profile: Hadoop Administrator
WFH and WFO (Hiring Office: Jaipur and Ahmedabad)
Role: Big Data Engineer
Industry Type: IT Services & Consulting
Job description:
- Good understanding of SDLC and agile methodologies
- Installation and configuration of Hadoop clusters, including...
...Job Description:
We are seeking a skilled and experienced Hadoop Developer with expertise in Java and Kafka to join our team. The ideal candidate will play a crucial role in designing, developing, and maintaining our Hadoop-based data processing and analytics solutions...
Job Description :We are seeking a skilled Hadoop Developer with mandatory working experience in Elasticsearch to join our team. The ideal candidate should have a minimum of 2 years of experience in Hadoop and Spark development. This role will involve working with large datasets...
...engineering, with a strong focus on large-scale data platforms and data products.
- Strong experience with big data technologies such as Hadoop, Spark, and Hive
- Proficiency in either the Scala or Java programming language.
- Experience leading and managing teams of data engineers
-...
...understanding of the mechanisms necessary to successfully implement a change.
Good to have:
- Experience in data management and data lineage tools like Collibra, Alteryx, and Solidatus
- Experience in Data Vault modeling.
- Knowledge of Hadoop and Google BigQuery is a plus. (ref:hirist.tech)
Job Description:
- 4-7 years of experience in DE development.
- Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)
- Python programming language is mandatory.
- PySpark
- Excellent with SQL
- Excellent with Airflow is a plus.
- Good aptitude...
...data-intensive, distributed environments, with a minimum of 5 years of experience in big data technologies like Spark, Hive, HBase, Hadoop, etc.
Programming background (mandatory): Scala, Spark, and Java/Python
- Experience in the following technologies: MapReduce, HDFS, YARN...
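Several of the postings above ask for MapReduce experience. As a rough illustration of the programming model only (not Hadoop's actual Java API), the classic word-count job can be sketched in plain Python: the map step emits (word, 1) pairs, a shuffle groups the pairs by key, and the reduce step sums each group.

```python
from collections import defaultdict

def map_phase(line):
    """Map step: emit a (word, 1) pair for every word in an input line."""
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    """Shuffle step: group all emitted values by their key (the word)."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce step: sum the counts collected for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

# Toy input standing in for HDFS blocks processed in parallel by mappers.
lines = ["Hadoop Spark Hive", "spark hive hive"]
pairs = [pair for line in lines for pair in map_phase(line)]
counts = reduce_phase(shuffle(pairs))
print(counts)  # {'hadoop': 1, 'spark': 2, 'hive': 3}
```

In a real Hadoop job the map and reduce functions run on different cluster nodes and the shuffle is handled by the framework; the sketch only shows the data flow a candidate would be expected to reason about.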
...projects, including working in close-knit teams
- Overall 5+ years of experience, with at least 3+ years in Big Data technologies (Hadoop / Spark / Relational DBs) and similar experience working on projects within the cloud, ideally AWS or Azure
- Data Warehousing experience...
...FIS architectural standards.
What you will be doing:
The administrator will be responsible for installation and configuration of Hadoop, deployment of applications across multiple clusters and instances, and Cloudera cloud environment setup.
Monitoring of the environment...
...any industry-standard tool.
- Good experience working with RDBMS/MPP systems as well as NoSQL environments like object storage, MongoDB, Hadoop/Hive, etc.
- Good data profiling / analysis experience.
- Preferably having Financial Services industry experience.
- Excellent...
...scale distributed data processing systems with one or more technologies such as MS SQL Server, PostgreSQL, MongoDB, Cassandra, Redis, Hadoop, Spark, Hive, Teradata, or MicroStrategy
- A high-level understanding of automation in a cloud environment; AWS experience preferred.
-...
...join our team and play a crucial role in building and maintaining our big data infrastructure. You will leverage your expertise in the Hadoop ecosystem and data processing frameworks to design, implement, and optimize scalable data pipelines.
Responsibilities:
- Design,...
Job Description:
1. Technologies should be Hive, Pig, Sqoop, ZooKeeper, Cloudera - these are Hadoop ecosystem tools
2. Candidate should have worked in at least one production project, not only understanding or theoretical knowledge
3. Should have the basics of Spark and Kafka -- at...