Average salary: Rs 2,065,333 per year
Experience: 8 to 12 years of experience in Big Data, Apache Hadoop system implementation, and administration.
Education: BE (IT... ...and recommend new technologies.
11. Stay updated with industry developments and emerging technologies.
What skills should you have? ...
Job Description : Hadoop/Omnia Dev Engineers develop and deliver code for assigned work in accordance with time, quality, and cost standards. Job responsibilities :- Interact with business stakeholders and designers to understand and implement business requirements.- Hadoop...
Summary : As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process... ...using PySpark. Your typical day will involve working with Hadoop Administration and collaborating with cross-functional teams to...
...At least 3 years of experience building production-grade data processing systems as a Data Engineer
In-depth knowledge of:
The Hadoop ecosystem.
Building applications with Apache Spark; experience with Spark Streaming
Columnar storage solutions like Apache HBase;...
...Iterators in Scala.- Experience with multi-threading will be helpful.- Experience working with Kafka will be helpful.- Knowledge of Hadoop MapReduce, HDFS, HBase, and Hive will be considered a plus.- Exposure to DevOps and SQL (Postgres, MS SQL) will be considered an...
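Several listings above cite Hadoop MapReduce as a plus. As a rough illustration of the programming model only (plain Python, not the Hadoop API; all names here are invented for the sketch), the map, shuffle, and reduce phases of a word count look like this:

```python
from collections import defaultdict

def map_phase(document: str):
    """Map: emit a (word, 1) pair for every word in the input."""
    for word in document.lower().split():
        yield word, 1

def shuffle(pairs):
    """Shuffle: group intermediate values by key, as the framework would."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts emitted for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data big plans", "data pipelines"]
pairs = (pair for doc in docs for pair in map_phase(doc))
counts = reduce_phase(shuffle(pairs))
print(counts["big"])   # 2
print(counts["data"])  # 2
```

In real Hadoop the shuffle is done by the framework between distributed mappers and reducers; this sketch just shows the data flow interviewers for these roles tend to ask about.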
...solutions.- Work with stakeholders to understand business objectives and develop value-added, data-driven solutions, frameworks and reporting... ...- Should have knowledge and experience of Big Data technologies (Hadoop ecosystem) and NoSQL databases.- Should have technical expertise...
...organization's data-driven initiatives.Key Responsibilities :- Design, develop, and deploy scalable and reliable data pipelines to ingest,... ...solutions using distributed computing frameworks such as Apache Hadoop, Apache Spark, or Apache Flink.- Develop and maintain ETL (Extract...
...Science, Engineering, or a related field.- Master's degree preferred.- Proven experience as a Big Data Developer or similar role, with strong knowledge of Big Data and Hadoop ecosystems.- Hands-on experience with HDFS, Spark, MapReduce, Sqoop, and Hive.- Experience in loading...
Rs 15 - 20 lakhs p.a.
...Role : Application Developer Role Description : Design, build and configure applications to meet business process and application requirements. Must-Have Skills : Hadoop Good-to-Have Skills : Banking Strategy Job Requirements : 1: Responsibilities: A: Understanding the requirements...
Job Description :- Must have working experience designing, building, installing, configuring and supporting Hadoop.- Good to have: Teradata, Cloud & Snowflake knowledge.- Must have working experience with IntelliJ IDEA, AutoSys (Control-M), WinSCP, PuTTY & GitHub.- Translate...
...with 5+ years on PySpark/NoSQL is mandatory) 1. Candidate should be strong in PySpark 2. Should have hands-on experience with the MWAA (Airflow) / AWS EMR (Hadoop, Hive) framework 3. Hands-on, working knowledge of Python 4. Knowledge of AWS services like EMR, S3, Lambda, Step Functions, Aurora -...
...Job Description:
We are seeking a skilled and experienced Hadoop Developer with expertise in Java and Kafka to join our team. The ideal candidate will play a crucial role in designing, developing, and maintaining our Hadoop-based data processing and analytics solutions...
...and provide deadline estimates for implementing new features
Develop software applications using technologies that include but are not limited... ...relational (Oracle) and non-relational databases (SQL, MongoDB, Hadoop, etc.), with a RESTful microservice architecture
Implement...
...Life at Visa.
Job Description
A Sr Site Reliability Engineer has to perform a... ...and demonstrate a deep understanding of Hadoop and its related tools, such as Hive, Spark... ...issues/outages affecting multiple users.
Develop standards: The team would prepare...
...and implementing fine-tuned, production-ready data/ML pipelines on the Hadoop platform.- Driving optimization, testing, and tooling to improve... ...roadmap.- Understanding business requirements and solution design to develop and implement solutions that adhere to big data architectural...
...- 15 days. Salary : 15-26 LPA. JD : You will be responsible for developing and implementing data science projects, analyzing large and complex... ...with big data processing frameworks such as Spark or Hadoop.- Strong analytical and problem-solving skills, with the ability...
...experience with RedHat Linux and Cloudera is mandatory.- Experience installing, configuring, upgrading, managing, and administering Cloudera Hadoop- Responsible for deployments, and monitor for capacity, performance, and/or troubleshooting issues- Work closely with data scientists...
...bring your talent and ambition to make a difference. We will create a world of opportunities for you.
Job Details
Job Title : Hadoop Administrator
Location: Bangalore / Pune / Hyderabad / Noida / Kolkata
Quick joiners needed
Minimum Requirement :
Must...
...experience in the following areas. Mandatory Skills :- Tableau, Big Data, Hadoop, Data Warehousing, Legacy BI Migrations, Cloud technologies.-... ...vital for a solutions architect, who will help determine, develop, and improve technical solutions in support of business goals.-...
...Responsibilities
Design, develop, and deploy scalable Big Data applications using Hadoop ecosystem technologies (Hadoop, HDFS, Hive, etc.).
Collaborate with data scientists and business analysts to understand requirements and translate them into technical solutions...
...Development team to implement data strategies, build data flows and develop conceptual data models.- Recognize the need for a specific... ...like Collibra, Alteryx and Solidatus- Experience in Data Vault Modeling.- Knowledge of Hadoop, Google Big Query is a plus. (ref:hirist.tech)
Job Profile : Hadoop Administrator. WFH and WFO (hiring offices: Jaipur and Ahmedabad). Role: Big Data Engineer. Industry Type: IT Services & Consulting. Job... ...and prevent bottlenecks- Providing technical support to developers and end-users as needed- Awareness of the latest technologies and...
Skills : Hadoop, Java, Python, Scala, SQL. Requirements :- 3-4+ years as a backend developer- Worked at a good startup or graduated from a good engineering college- Experience at an early-stage startup in a role that involves building scalable systems to extract information from 15M+ domains....
...Role - Hadoop and Big Data Administrator Location - Indore, MP/Noida
Years of Experience required - 2-5 Years
Job Description-
You will work in a multi-functional role with a combination of expertise in System and Hadoop administration. You will work in a team that...
Job Description : 1. Required technologies: Hive, Pig, Sqoop, ZooKeeper, Cloudera - these are Hadoop ecosystem tools. 2. Candidate should have worked on at least one production project, not just have understanding or theoretical knowledge. 3. Should have basics of Spark and Kafka -- at...
Mandatory Skills :- Data scientist experience- Hadoop, Scala- Python- Machine learning- SQL- Candidates from product organizations only. Requirements from past experience... ...of documents) is strongly preferred.- End-to-end ownership of developing the model, writing production-quality code and deploying the...
...team that supports the product. You will help develop and gain insight into the application architecture... ...thinking and visionary goals. As a Sr. Engineer, you'll take the lead as you. Role... ...building data pipelines using Scala, Spark, Hadoop, HiveQL, etc. Have experience with streaming...
...datasets- Build processes supporting data transformation, data structures, metadata, dependency and workload management- Big data tools like Hadoop, Spark, Kafka, etc.- Message queuing, stream processing, and highly scalable big data stores- Building and optimizing big...
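The stream-processing skill asked for above usually comes down to a few core ideas, such as windowed aggregation over an unbounded event stream. A minimal, framework-free sketch in plain Python (a tumbling count window; the function name, window size, and data are all illustrative, not from any listing):

```python
from typing import Iterable, Iterator, List

def tumbling_windows(events: Iterable[int], size: int) -> Iterator[List[int]]:
    """Group a (possibly unbounded) event stream into fixed-size windows."""
    window: List[int] = []
    for event in events:
        window.append(event)
        if len(window) == size:
            yield window      # emit a full window downstream
            window = []
    if window:                # flush a final partial window
        yield window

# Aggregate each window, e.g. a per-window sum over events 1..7
sums = [sum(w) for w in tumbling_windows(range(1, 8), size=3)]
print(sums)  # [6, 15, 7]
```

Engines like Spark Streaming or Kafka Streams add distribution, fault tolerance, and time-based (rather than count-based) windows on top of this same pattern.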
Mandatory Skills :- 4+ years of hands-on experience - Hadoop, system administration with sound knowledge of Unix-based operating system internals.- Working experience on Cloudera CDP/CDH and Hortonworks HDP distributions.- Linux experience (RedHat, CentOS, Ubuntu).- Experience...
Job Description :- Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)- Python programming language is mandatory- PySpark- Excellent with SQL. Good to have :- Airflow- Good aptitude, strong problem-solving abilities, and analytical...
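The "excellent with SQL" requirement above is the same skill whether the engine is Hive, Postgres, or SQLite. A self-contained grouping-and-aggregation example using Python's built-in sqlite3 module (the table name, columns, and rows are invented for illustration; HiveQL and Postgres accept essentially the same query):

```python
import sqlite3

# In-memory database so the example needs no Hadoop/Hive cluster
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, bytes INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("a", 100), ("a", 250), ("b", 50)],
)

# Per-user totals, largest first -- the bread-and-butter query shape
rows = conn.execute(
    """
    SELECT user_id, SUM(bytes) AS total
    FROM events
    GROUP BY user_id
    ORDER BY total DESC
    """
).fetchall()
print(rows)  # [('a', 350), ('b', 50)]
```

At Hadoop scale the same statement would run over a Hive table backed by HDFS files (e.g. Parquet), with the engine parallelizing the GROUP BY across the cluster.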