Average salary: Rs 1,860,859 per year
Job Designation: ETL Developer
WFH / WFO
Job Description:
- Work Experience: Overall 3+ years of experience working on Databases and Data Warehouses...
- Should have knowledge and experience of Big Data technologies (Hadoop ecosystem) and NoSQL databases.
- Should have technical expertise...
...scheduled execution.
- Conduct both functional and technical testing of ETL jobs, utilizing automation tools whenever possible.
- Utilize your... ...to identify and design effective test scenarios.
- Experience with Hadoop Hive is essential for testing data transformation and querying...
...Job Title: Sr Hadoop Developer
Location: Hyderabad, India
Experience: 3 to 8 yrs
Job Description:
We are seeking a Senior Hadoop... ...with high velocity and variety.
Architect data pipelines and ETL processes to ingest, transform, and store data from various sources...
...teams
- 3+ years of experience and interest in Big Data technologies (Hadoop / Spark / Relational DBs)
- 3+ years of experience working on... ...and constructing relational and dimensional data models using any ETL/ELT tools (e.g. Talend, Informatica, Alteryx, etc.)
- Possess ETL experience...
...Job Description: Good hands-on experience in Hadoop
Strong technical knowledge in PySpark
Good communication skills to handle client requirements.
Must Have Skills: Hadoop + PySpark
Experience: 4-6 years
Notice Period: 0-15 Days
Work Timing: Regular Shift...
...Job Description:
We are seeking a skilled and experienced Hadoop Developer with expertise in Java and Kafka to join our team. The ideal... ...field.
Key skills required: Java, Hadoop, Spark, Kafka, SQL, ETL tools, good to have cloud knowledge
Proven experience as a...
...datasets
- Build processes supporting data transformation, data structures, metadata, dependency and workload management
- Big data tools like Hadoop, Spark, Kafka, etc.
- Message queuing, stream processing, and highly scalable big data stores
- Building and optimizing big...
...Have a good day!
We have immediate opportunity for Hadoop Data Engineer – 4 to 8 years
Job Role: Hadoop Data Engineer
Job Location: Pune
Experience: 4-8 years
Notice Period: 0-15 days
About Company:
At Synechron, we believe in the power...
Job Description: We are seeking a skilled Hadoop Developer with mandatory working experience in Elasticsearch to join our team. The ideal candidate... ...Spark jobs for performance and scalability.
- Develop and maintain ETL (Extract, Transform, Load) processes to ingest and process large...
We are seeking a skilled AWS Hadoop Administrator to join our team and manage our Hadoop infrastructure on the AWS platform. In this role, you will be responsible for deploying, configuring, monitoring, and maintaining our Hadoop clusters to ensure optimal performance, scalability...
...Responsibilities:
Primary:
1. Good understanding of advanced ETL concepts and administration activities to support R&D/Project... ...or knowledge in Big Data related tools (Spark, Hive, Kafka, Hadoop, Hortonworks, Python, R) would be an advantage
5. Should...
...with 5+ years on PySpark/NoSQL is mandatory)
1. Should be strong in PySpark
2. Should have hands-on experience in the MWAA (Airflow) / AWS EMR (Hadoop, Hive) framework
3. Hands-on working knowledge of Python
4. Knowledge of AWS services like EMR, S3, Lambda, Step Functions, Aurora -...
...Greetings From Maneva!
Job Description
Job Title Cloudera Hadoop Architect
Location Chennai / Pune / GNDC
Experience: 8-15 Years
Relevant Experience: 10 Years
Job Requirements:
Strong knowledge and Hands on experience...
...design and development, data integration and ingestion, designing ETL architectures using a variety of ETL tools and techniques.
- Plan and... ...of experience, with at least 3+ years in Big Data technologies (Hadoop / Spark / Relational DBs) and similar experience working on projects...
...experience in big data related technologies like Spark, Hive, HBase, Hadoop, etc.
Programming background (mandatory): Scala, Spark and Java... ...the code, and error handling.
- Experience in building end-to-end ETL pipelines
- Self-starter and resourceful personality with the...
Rs 15 - 20 lakhs p.a.
...Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: Hadoop
Good to Have Skills: Banking Strategy
Job Requirements:
1. Responsibilities:
A. Understanding the requirements of input-to-output transformations...
Rs 18 lakh p.a.
...Job Description
Roles And Responsibilities of ETL Developer
• Participate in Design discussions & updating Design documents
• Transforms... ...• Specialization in database engineering skills – SQL, NoSQL, Hadoop, etc.
• Exposure to warehousing architecture processes – MOLAP,...
...Job title: Senior Hadoop Developer
Experience: 4+ Years
Notice Period: Immediate to 15 days only
Location: Manyata Embassy Business Park, Bangalore.
Mode of Employment: Contract
Mode Of Work: Hybrid (2-3 days in a Week- Mandatory)
Working experience in elastic...
Skills:
- Hadoop
- Python
- Spark
- PySpark
- ETL (Extract, Transform, Load)
Roles & Responsibilities:
- Data Ingestion: Develop and maintain data pipelines for ingesting raw data from various sources into the Hadoop ecosystem.
- Data Processing: Utilize Python and Spark to process...
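The ingestion and processing duties listed above follow the standard extract-transform-load pattern. A minimal sketch in plain Python (the record fields are hypothetical; in a real deployment the transform step would typically run as a PySpark job over data in HDFS):

```python
# Minimal ETL sketch. The record fields ("name", "salary") are hypothetical;
# a production pipeline would express transform() as a Spark job.

def extract(rows):
    """Simulate pulling raw records from a source system."""
    return [dict(r) for r in rows]

def transform(rows):
    """Drop incomplete records and normalise the salary field to float."""
    cleaned = []
    for r in rows:
        if r.get("name") and r.get("salary") is not None:
            r["salary"] = float(r["salary"])
            cleaned.append(r)
    return cleaned

def load(rows, sink):
    """Append cleaned records to the target store (here, a plain list)."""
    sink.extend(rows)
    return len(rows)

source = [{"name": "a", "salary": "100"}, {"name": "", "salary": "5"}]
sink = []
loaded = load(transform(extract(source)), sink)
# loaded == 1; the record with the empty name is filtered out
```

The same three-stage shape carries over directly when the list is replaced by a DataFrame and the filter/cast by Spark transformations.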
Mandatory Skills:
- 4+ years of hands-on experience in Hadoop and system administration, with sound knowledge of Unix-based operating system internals.
- Working experience on Cloudera CDP and CDH and Hortonworks HDP distributions.
- Linux experience (RedHat, CentOS, Ubuntu).
- Experience...
Job Description: Hadoop/Omnia Dev Engineers to develop and deliver code for the work assigned in accordance with time, quality and cost standards.
Job responsibilities:
- Interact with business stakeholders and designers to understand and implement business requirements.
- Hadoop...
Responsibilities:
- Designing and implementing fine-tuned, production-ready data/ML pipelines on the Hadoop platform.
- Driving optimization, testing, and tooling to improve quality.
- Reviewing and approving high-level and detailed designs to ensure that the solution delivers to the business...
Role: Hadoop Administrator
Industry Type: IT
Employment Type: Full Time
Role Category: Permanent
Department: Engineering
Job Description:
- Configure various property files like core-site.xml, hdfs-site.xml, mapred-site.xml based on the job requirement.
- Manage and review Hadoop...
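The property files named above are standard Hadoop XML configuration files. As a sketch, a minimal core-site.xml pointing cluster clients at the NameNode could look like this (the host name and port are placeholders for site-specific values):

```xml
<!-- core-site.xml: minimal example; namenode.example.com is a placeholder -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode.example.com:8020</value>
  </property>
</configuration>
```

hdfs-site.xml and mapred-site.xml follow the same property/name/value structure for HDFS and MapReduce settings respectively.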
...Mandatory Skills: Unix Shell Scripting, SQL, Ab Initio Developer, ETL; should be good at programming.
Job Description:
- 1-4 years hands... ...Knowledge of the ETL side of cloud platforms like AWS or Azure and of the Hadoop platform is also an added advantage.
- Experience working with banking...
...working in an agile environment (e.g. user stories, iterative development, etc.).
- Knowledge and working experience in Elasticsearch is mandatory.
- 3-5 years of experience in Hadoop & Elasticsearch is mandatory.
- 3-5 years of experience in Spark is mandatory.
...manner.
Work closely with Functional Analysis and Quality Assurance teams
Your skills and experience
Extensive experience with Hadoop & Java/Scala related technologies such as
Understanding of the Hadoop ecosystem and the HDFS file system.
Understanding of different...
Job Profile: Hadoop Administrator
WFH and WFO (Hiring Offices: Jaipur and Ahmedabad)
Role: Big Data Engineer
Industry Type: IT Services & Consulting
Job description:
- Good understanding of SDLC and agile methodologies
- Installation and configuration of Hadoop clusters, including...
Job Description:
Primary & mandatory: PySpark
Secondary: GCP
- At least 5 years of experience in Big Data, Hadoop Data Platform (HDP) and ETL, and capable of configuring data pipelines.
- Possess the following technical skills: SQL, Python, PySpark, Hive, Unix, ETL, Control-M...
...understanding of the mechanisms necessary to successfully implement a change.
Good to have:
- Experience in data management and data lineage tools like Collibra, Alteryx and Solidatus
- Experience in Data Vault modeling.
- Knowledge of Hadoop and Google BigQuery is a plus.
...engineering, with a strong focus on large-scale data platforms and data products.
- Strong experience with big data technologies such as Hadoop, Spark, and Hive
- Proficiency in either the Scala or Java programming language.
- Experience leading and managing teams of data engineers...