Average salary: Rs 1,238,122 / year
Search Results: 10,962 vacancies
...Job title: Senior Hadoop Developer
Experience: 4+ Years
Notice Period: Immediate to 15 days only
Location: Manyata Embassy Business Park, Bangalore.
Mode of Employment: Contract
Mode of Work: Hybrid (2-3 days per week, mandatory)
Working experience in elastic...
Responsibilities:
- Designing and implementing fine-tuned, production-ready data/ML pipelines on the Hadoop platform.
- Driving optimization, testing, and tooling to improve quality.
- Reviewing and approving high-level and detailed designs to ensure that the solution delivers to the business...
...leading on client-facing projects, including working in close-knit teams
- 3+ years of experience and interest in Big Data technologies (Hadoop / Spark / relational DBs)
- 3+ years of experience working on projects in the cloud, ideally AWS or Azure
- Data Warehousing...
...Design, code, test, implement, maintain, and support application software that is delivered on time and within budget. Work closely with...
...Administrator will be responsible for installation and configuration of Hadoop, and deployment of applications across multiple clusters and...
...Job Title: Java + Hadoop Developer
About Us
Capco, a Wipro company, is a global technology and management consulting firm. Awarded... ...Hadoop.
Responsibilities:
Develop and maintain robust software solutions using Java and Hadoop.
Collaborate with cross-functional...
Skills: Hadoop, Java, Python, Scala, SQL
Requirements:
- 3-4+ years as a backend developer
- Worked at a good startup or attended a good engineering college
- Working with an early-stage startup in a role that involves:
- Building scalable systems to extract information from 15Mn+ domains.
- ...
...bring your talent and ambition to make a difference. We will create a world of opportunities for you.
Job Details
Job Title: Hadoop Administrator
Location: Bangalore / Pune / Hyderabad / Noida / Kolkata
Quick joiners needed
Minimum Requirements:
Must...
Job Description:
- 4-7 years of experience in DE development
- Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)
- Python programming language is mandatory
- PySpark
- Excellent with SQL
- Excellent with Airflow is a plus
- Good aptitude...
Key Skills: Java, Apache Spark, Apache Kafka, Hadoop, AWS
Job Description:
- Designing, installing, testing, and maintaining scalable data... ...business requirements and industry standards
- Creating custom software components and analytics applications
- Research data acquisition...
...projects, including working in close-knit teams
- Overall 5+ years of experience, with at least 3+ years in Big Data technologies (Hadoop / Spark / relational DBs) and similar experience working on projects in the cloud, ideally AWS or Azure
- Data Warehousing experience...
...coder with good experience in programming languages like Java or Python.
- Hands-on experience with the Big Data stack: PySpark, HBase, Hadoop, MapReduce, and Elasticsearch.
- Good understanding of programming principles and development practices like check-in policy, unit...
...engineering, with a strong focus on large-scale data platforms and data products.
- Strong experience with big data technologies such as Hadoop, Spark, and Hive
- Proficiency in either the Scala or Java programming language
- Experience leading and managing teams of data engineers
- ...
...experience in S3, Dataproc, Spark, Python, Cloud Functions, orchestration using Airflow/Composer
- Ability to handle and migrate hundreds of TB of data; code optimization
- Good communication skills; self-motivated, with quick learning capabilities
Spark, Dataproc, Hadoop...
Skills:
- Hadoop
- Python
- Spark
- PySpark
- ETL (Extract, Transform, Load)
Roles & Responsibilities:
- Data Ingestion: Develop and maintain data pipelines for ingesting raw data from various sources into the Hadoop ecosystem.
- Data Processing: Utilize Python and Spark to process...
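As an illustration of the ingest-then-process pattern this role describes, here is a minimal sketch in plain Python (not Spark or Hadoop; the sample data, field names, and functions are all invented for the example):

```python
import csv
import io

# Hypothetical raw source: CSV text as it might arrive before landing in a data lake.
RAW = """id,amount
1,10.5
2,3.25
3,7.0
"""

def extract(text):
    """The 'ingestion' step: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """The 'processing' step: cast types and derive a new field."""
    return [
        {"id": int(r["id"]),
         "amount": float(r["amount"]),
         "amount_cents": round(float(r["amount"]) * 100)}
        for r in rows
    ]

def load(rows):
    """Stand-in for writing to a sink (e.g. HDFS/Hive); here it just returns the rows."""
    return rows

result = load(transform(extract(RAW)))
print(len(result))                 # 3
print(result[0]["amount_cents"])   # 1050
```

In a real Hadoop/PySpark pipeline the same three stages would typically be Spark DataFrame reads, transformations, and writes; the structure of the pipeline is what carries over.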
...with 5+ years on PySpark/NoSQL is mandatory)
1. Should be strong in PySpark
2. Should have hands-on experience in the MWAA (Airflow) / AWS EMR (Hadoop, Hive) framework
3. Hands-on, working knowledge of Python
4. Knowledge of AWS services like EMR, S3, Lambda, Step Functions, Aurora -...
....
About the role....
Managing, installing & configuring Hadoop (Hive, HDFS, Ambari & NiFi). Designing the architecture and to... ...tuning multi-node Hadoop clusters
Strong experience implementing software and/or solutions in Cloud (Azure, AWS, GCP)
Experience...
...Principal Consultant, QA: Scala/Spark/Hadoop - ITO080274
Genpact (NYSE: G) is a global professional services and solutions firm delivering... ...with developers and stakeholders to ensure the quality of software products.
• Identify and report bugs, issues, and defects in a...
Job Description:
- Must have working experience designing, building, installing, configuring, and supporting Hadoop.
- Good to have: Teradata, Cloud & Snowflake knowledge
- Must have working experience with IntelliJ IDEA, AutoSys (Control-M), WinSCP, PuTTY & GitHub.
- Translate...
...experience using Azure SQL or SQL DWH (Synapse)
- Knowledge of batch and streaming data architectures
- Experience using big data technologies (Hadoop, Hive, HBase, Spark, and others)
- Strong knowledge of SQL (MSSQL, MySQL, PostgreSQL, or Presto)
- Demonstrated strength in data...
...implementation. You show expertise in applying the appropriate software engineering patterns to build robust and scalable systems. You are... ...experience in building data pipelines using Scala, Spark, Hadoop, HiveQL, etc. Have experience with streaming frameworks such as Kafka...