Average salary: Rs 1,238,122 per year
Hands-on coder with good experience in programming languages like Java, Python or Scala.
- Hands-on experience with the Big Data stack: Hadoop, MapReduce, Spark, HBase, and Elasticsearch.
- Good understanding of programming principles and development practices like checkin...
...and increase their business profitability
Good knowledge of software configuration management systems
Awareness of latest technologies... ...and team management
Technical and Professional Requirements:
Primary skills: Technology - Big Data - Hadoop - Hadoop Administration
...developed solutions, as well as implement industry-leading packaged software. This team has embarked on a major transformational journey to... ...complex database queries and scripts for RDBMS (like Oracle), Hadoop (like HBase, Hive, etc.) and NoSQL (like MongoDB).
• Must have working...
...bring your talent and ambition to make a difference. We will create a world of opportunities for you.
Job Details
Job Title : Hadoop Administrator
Location: Bangalore / Pune / Hyderabad / Noida / Kolkata
Quick joiners needed
Minimum Requirement :
Must...
...Position : Hadoop Developer
Experience: 6+ Years
Location : Chennai/Bangalore(Work from Office)
Mandatory Skills: PySpark, Hadoop, GCP, Python
Professional experience with a cloud platform
Developer must have sound knowledge in Apache Spark and Python programming...
Responsibilities:
- Designing and implementing fine-tuned, production-ready data/ML pipelines on the Hadoop platform.
- Driving optimization, testing, and tooling to improve quality.
- Reviewing and approving high-level & detailed designs to ensure that the solution delivers to the business...
....
About the role....
Managing, installing & configuring Hadoop (Hive, HDFS, Ambari & NiFi). Designing the architecture and to... ...tuning multi-node Hadoop clusters
Strong experience implementing software and/or solutions in Cloud (Azure, AWS, GCP)
Experience...
CSA ID: Hadoop | CSA Status: Active | Title: Hadoop Developer | Job Type: Contract
JD: Hadoop developer or big data engineer.
Advanced knowledge of the Hadoop ecosystem and its components.
In-depth knowledge of Hive, HBase, and Spark
Familiarity with MapReduce
Knowledge of...
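The MapReduce familiarity these postings ask for boils down to the map, shuffle, and reduce phases. A toy word count in plain Python (a sketch of the model only, no Hadoop cluster or library assumed; all names here are illustrative):

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc):
    # Map: emit a (word, 1) pair for every word in one input record.
    return [(word.lower(), 1) for word in doc.split()]

def shuffle(pairs):
    # Shuffle: group all emitted values by key, as Hadoop does
    # between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the grouped counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["hadoop stores data", "spark and hadoop process data"]
pairs = chain.from_iterable(map_phase(d) for d in docs)
counts = reduce_phase(shuffle(pairs))
print(counts["hadoop"])  # 2
print(counts["data"])    # 2
```

In a real Hadoop job the same three roles are played by the Mapper, the framework's shuffle/sort, and the Reducer, running in parallel across the cluster.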
...leading on client-facing projects, including working in close-knit teams
- 3+ years of experience and interest in Big Data technologies (Hadoop / Spark / relational DBs)
- 3+ years of experience working on projects in the cloud, ideally AWS or Azure
- Data Warehousing...
3-4 years of experience with building big data applications and robust data pipelines running on Hadoop cluster with exposure to building frameworks related to batch and stream consumption using Big Data tech stack such as Spark, Kafka, Hive, HDFS, HBase, along with exposure...
...BigData-Hadoop/Scala Engineer
Position Overview
Job Title: Data Engineer (ETL, Big Data, Hadoop, Spark, GCP)
Corporate Title: Associate
Location: Bangalore, India
Role Description
Senior Engineer is responsible for developing and delivering elements...
...with 5+ years on PySpark/NoSQL is mandatory)
1. Person should be strong in PySpark
2. Should have hands-on experience with the MWAA (Airflow) / AWS EMR (Hadoop, Hive) framework
3. Hands-on, working knowledge of Python
4. Knowledge of AWS services like EMR, S3, Lambda, Step Functions, Aurora -...
Job Description:
- Must have working experience designing, building, installing, configuring and supporting Hadoop.
- Good to have Teradata, Cloud & Snowflake knowledge.
- Must have working experience with IntelliJ IDEA, AutoSys (Control-M), WinSCP, PuTTY & GitHub.
- Translate...
...Design, code, test, implement, maintain and support applications software that is delivered on time and within budget. Work closely with... ...Administrator will be responsible for installation and Configuration of Hadoop, Deployment of applications across multiple Clusters and...
Skills: Hadoop, Python, Spark, PySpark, ETL (Extract, Transform, Load)
Roles & Responsibilities:
- Data Ingestion: Develop and maintain data pipelines for ingesting raw data from various sources into the Hadoop ecosystem.
- Data Processing: Utilize Python and Spark to process...
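The ingest-and-process responsibilities above follow the classic extract/transform/load shape. A minimal plain-Python sketch of that pattern (a stand-in for a PySpark job; the feed, field names, and threshold behavior are hypothetical, for illustration only):

```python
import csv
import io
import json

# Hypothetical raw feed; a real pipeline would read from Kafka, files, or an API.
RAW = "id,amount\n1,10.5\n2,not_a_number\n3,4.0\n"

def extract(raw_text):
    # Extract: parse raw CSV rows from the source feed.
    return list(csv.DictReader(io.StringIO(raw_text)))

def transform(rows):
    # Transform: cast types and drop malformed records.
    clean = []
    for row in rows:
        try:
            clean.append({"id": int(row["id"]), "amount": float(row["amount"])})
        except ValueError:
            continue  # bad record: skip (a real job might route it to a quarantine table)
    return clean

def load(rows):
    # Load: here we just serialize; a real pipeline would write to HDFS/Hive.
    return json.dumps(rows)

out = load(transform(extract(RAW)))
print(out)  # the malformed second row is dropped
```

In Spark the same steps map onto `spark.read`, DataFrame transformations, and `df.write`, with the parallelism handled by the cluster.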
...quantitative field. They should also have hands-on experience using many software/tools such as:
- Advanced SQL and relational databases, query... ...
- ..., dependency and workload management
- Big data tools like Hadoop, Spark, Kafka, etc.
- Message queuing, stream processing, and...
Rs 5 - 30 lakhs p.a.
...Hadoop/Omnia Dev Engineers: to develop and deliver code for the work assigned in accordance with time, quality and cost standards.
Job responsibilities:
Interact with business stakeholders and designers to understand and implement business requirements.
Hadoop development...
...Please find the details below: MSTR and Haas/Hadoop Testing
Current CTC:
Expected CTC (in INR):
Official Notice Period:
Qualification:
Minimum time required to join:
Total IT Experience:
Relevant Experience:
Current Location:
Preferred Location:
Latest...
...Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases.
- Experience with data pipeline and...
Mandatory Skills:
- Data scientist experience
- Hadoop, Scala
- Python
- Machine learning
- SQL
- Only from product orgs
Requirements from Past Experience:
- Prior experience working with large-scale datasets (tens of millions of documents) is strongly preferred.
- End-to-end ownership of...
...projects, including working in close-knit teams
- Overall 5+ years of experience, with at least 3+ years in Big Data technologies (Hadoop / Spark / relational DBs) and similar experience working on projects in the cloud, ideally AWS or Azure
- Data Warehousing experience...
Skills: Hadoop, Java, Python, Scala, SQL
Requirements:
- 3-4+ years as a backend developer
- Worked at a good startup or attended a good engineering college
- Working with an early-stage startup in a role that involves:
- Building scalable systems to extract information from 15Mn+ domains.
-...
...Iterators in Scala.
- Experience with multi-threading will be helpful.
- Experience working with Kafka will be helpful.
- Knowledge of Hadoop MapReduce, HDFS, HBase, and Hive will be considered a plus.
- Exposure to DevOps and SQL (Postgres, MS SQL) will be considered an...
...EMR, S3, RDS), Python programming skills and shell scripting. Work experience in any CI/CD environment is a must. 2) Secondary Skills: Experience in managing Hadoop clusters and Airflow instances. Integrating the Ranger and Hive Metastore services in EKS.
Hadoop, AWS, cloud computing...
...Job Title : Hadoop Admin
Years of Experience : 4-8 Years
Job Location : Bangalore
Job Description :
Mandatory Skills :
Cloudera CDP - Hadoop Administration experience - Mandatory
Good knowledge and experience of Postgres, MySQL DB Administrator/ Support...
...experience using Azure SQL or SQL DWH (Synapse)
- Knowledge of batch and streaming data architectures
- Experience using big data technologies (Hadoop, Hive, HBase, Spark and others)
- Strong knowledge of SQL (MSSQL or MySQL or PostgreSQL or Presto)
- Demonstrated strength in data...
...Lead Consultant Scala/Spark/Hadoop Developer - ITO079537
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility,...
Job Description:
- 4-7 years of experience in DE development.
- Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)
- Python programming language is mandatory.
- PySpark
- Excellent with SQL
- Excellent with Airflow is a plus.
- Good aptitude...
...impacting Target technology
- Support to perform knowledge-based recovery of incidents and events
- Understanding of and experience with Hadoop, Spark, HBase, Linux, Chef, shell scripting
- Support to communicate with end-users through both verbal and written communication...
Key Skills: Java, Apache Spark, Apache Kafka, Hadoop, AWS
Job Description:
- Designing, installing, testing, and maintaining scalable data... ...business requirements and industry standards
- Creating custom software components and analytics applications
- Research data acquisition...