Search Results: 898 vacancies
...working in an agile environment (e.g. user stories, iterative development, etc.). - Knowledge of and working experience in Elasticsearch is mandatory. - 3-5 years of experience in Hadoop & Elasticsearch mandatory. - 3-5 years of experience in Spark is mandatory. (ref:hirist.tech)
Job Description: We are seeking a skilled Hadoop Developer with mandatory working experience in Elasticsearch to join our team. The ideal candidate should have a minimum of 2 years of experience in Hadoop and Spark development. This role will involve working with large datasets...
...leading on client-facing projects, including working in close-knit teams - 3+ years of experience and interest in Big Data technologies (Hadoop / Spark / Relational DBs) - 3+ years of experience working on projects in the cloud, ideally AWS or Azure - Data Warehousing experience...
...projects, including working in close-knit teams - Overall 5+ years of experience, with at least 3+ years in Big Data technologies (Hadoop / Spark / Relational DBs) and similar experience working on projects in the cloud, ideally AWS or Azure - Data Warehousing...
...engineering, with a strong focus on large-scale data platforms and data products. - Strong experience with big data technologies such as Hadoop, Spark, and Hive - Proficiency in either Scala or Java. - Experience leading and managing teams of data engineers -...
Skills: Hadoop, Python, Spark, PySpark, ETL (Extract, Transform, Load). Roles & Responsibilities: - Data Ingestion: Develop and maintain data pipelines for ingesting raw data from various sources into the Hadoop ecosystem. - Data Processing: Utilize Python and Spark to process...
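The ingest-then-process pattern this listing describes can be sketched in miniature. Plain Python stands in for the PySpark job the listing assumes, so the example stays self-contained; the field names and `ingest_records` helper are hypothetical illustrations, not the employer's actual pipeline:

```python
# Minimal sketch of the ingest -> process pattern described in the listing.
# ingest_records() and the record fields are hypothetical stand-ins.

def ingest_records(raw_lines):
    """Ingestion: parse raw CSV-like lines into one dict per record."""
    records = []
    for line in raw_lines:
        source, value = line.strip().split(",")
        records.append({"source": source, "value": int(value)})
    return records

def process_records(records):
    """Processing: drop malformed rows, then aggregate totals per source."""
    totals = {}
    for rec in records:
        if rec["value"] < 0:  # skip malformed/negative measurements
            continue
        totals[rec["source"]] = totals.get(rec["source"], 0) + rec["value"]
    return totals

raw = ["web,10", "web,5", "app,7", "app,-1"]
print(process_records(ingest_records(raw)))  # {'web': 15, 'app': 7}
```

In a real Spark deployment the same two stages would typically become a `spark.read` ingestion step and a DataFrame filter/aggregation, distributed across the cluster.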
...Principal Consultant, QA: Scala/Spark/Hadoop - ITO080274
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility...
...configuration and deployment, along with the ability to build custom solutions. Have experience in building data pipelines using Scala, Spark, Hadoop, HiveQL, etc. Have experience with streaming frameworks such as Kafka. Have experience with Data Warehousing, Data Modelling and Data...
...• Experience in creating data-driven business solutions and solving data problems using a wide variety of technologies such as Hadoop, Hive, Spark, MongoDB, and NoSQL, as well as traditional data technologies (RDBMS) like MySQL, a plus
• Ability to program in one or more scripting...
...optimization and security.
~3+ years of experience: big data using Apache Spark in developing distributed processing applications; building... ...programming languages & tools/services: Java or Scala, Big Data, Hadoop, Spark, Spark SQL, Presto / Hive, Cloud (preferably AWS), Docker,...
...Cloud Spanner, BigQuery
- Hands-on experience in S3, Dataproc, Spark, Python, Cloud Functions, Orchestration using Airflow/Composer
-... ...of data, code optimization
- Good communication skills, self-motivation, and quick learning capabilities
Spark, Dataproc, Hadoop...
...• Experience in creating data-driven business solutions and solving data problems using a wide variety of technologies such as Hadoop, Hive, Spark, MongoDB, and NoSQL, as well as traditional data technologies (RDBMS) like MySQL or Oracle DB, a plus.
• Experience developing large...
...Job title: Senior Hadoop Developer
Experience: 4+ Years
Notice Period: Immediate to 15 days only
Location: Manyata Embassy Business... ...experience in Hadoop is mandatory.
4-5 Years of experience in Spark is mandatory.
If you lack interest in the contract or have...
...and implementing fine-tuned, production-ready data/ML pipelines on the Hadoop platform. - Driving optimization, testing, and tools to improve... ...frameworks - Ability to troubleshoot and optimize complex queries on the Spark platform - Expert in optimizing 'big data' data/ML pipelines,...
Rs 12 - 16 lakhs p.a.
...Design, build and configure applications to meet business process and application requirements. Must Have Skills: Apache Spark. Good to Have Skills: Hadoop. Job Requirements / Key Responsibilities: A: Create Scala/Spark jobs for data transformation and aggregation. B: Write...
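Responsibility A above ("jobs for data transformation and aggregation") names the classic map-then-reduce-by-key shape of a Spark job. A compact stand-in in plain Python, since the listing's actual Scala/Spark code is not shown; the event tuples and field names are illustrative assumptions:

```python
from collections import defaultdict

# Transformation + aggregation, mirroring the map/reduceByKey
# (or groupBy/agg) stages a Scala/Spark job would use.
events = [("2024-01-01", "IN", 3), ("2024-01-01", "US", 5), ("2024-01-02", "IN", 2)]

# Transformation (map): project each event to a (country, amount) pair.
pairs = [(country, amount) for _, country, amount in events]

# Aggregation (reduceByKey): sum amounts per country.
totals = defaultdict(int)
for country, amount in pairs:
    totals[country] += amount

print(dict(totals))  # {'IN': 5, 'US': 5}
```

In Spark the same pipeline would be expressed lazily over a distributed dataset (e.g. `rdd.map(...).reduceByKey(_ + _)` in Scala), with the framework handling the shuffle between the two stages.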
...Job Description: Good hands-on experience in Hadoop
Strong technical knowledge of PySpark
Good communication skills to handle client requirements.
Must Have Skills: Hadoop + PySpark
Experience: 4-6 years
Notice Period: 0-15 Days
Work Timing: Regular Shift...
Rs 12 - 16 lakhs p.a.
...process and application requirements. Must Have Skills: Apache Spark. Good to Have Skills: Microsoft SQL Server, Unix to Linux Migration... ...actions, Spark configuration and tuning techniques. B: Knowledge of Hadoop architecture: execution engines, frameworks, application tools. C...
...the world's leading multinational organizations. Skills: Apache Spark. Location: Bangalore. Years of Experience: 7.5 yrs. As an Application... ...understanding of Spark architecture and related technologies, including Hadoop, Hive, and HBase. - Experience with programming languages such as...
...Design, develop, and maintain data processing pipelines using Apache Spark and Scala.
Optimize Spark jobs for performance, scalability,... ...Streaming.
Experience with big data technologies such as Hadoop, Hive, and Kafka.
Proficiency in SQL for data querying and...
...FIS architectural standards.
What you will be doing:
The administrator will be responsible for installation and configuration of Hadoop, deployment of applications across multiple clusters and instances, and Cloudera cloud environment setup.
Monitoring of the environment...