Search Results: 774 vacancies
...leading on client-facing projects, including working in close-knit teams - 3+ years of experience and interest in Big Data technologies (Hadoop / Spark / Relational DBs) - 3+ years of experience working on projects in the cloud, ideally AWS or Azure - Data Warehousing experience...
...either Java or Python, or both) Big Data experience - 2-5 years - (Hadoop, Snowflake, or Kafka) Notice period - immediate or maximum up to 3... ...with Pure/Alloy. 4. Working knowledge of open-source tools such as AWS Lambda, Prometheus, Spark, Hadoop, or Snowflake. (ref:hirist.tech)
Job Description :We are seeking a skilled Hadoop Developer with mandatory working experience in Elasticsearch to join our team. The ideal... ...candidate should have a minimum of 2 years of experience in Hadoop and Spark development. This role will involve working with large datasets...
...projects, including working in close-knit teams - Overall 5+ years of experience, with at least 3+ years in Big Data technologies (Hadoop / Spark / Relational DBs) and similar experience working on projects in the cloud, ideally AWS or Azure - Data Warehousing...
...working in an agile environment (e.g. user stories, iterative development, etc.). - Knowledge and working experience in Elasticsearch is mandatory. - 3-5 years of experience in Hadoop & Elasticsearch mandatory. - 3-5 years of experience in Spark is mandatory. (ref:hirist.tech)
...engineering, with a strong focus on large-scale data platforms and data products. - Strong experience with big data technologies such as Hadoop, Spark, and Hive - Proficiency in either the Scala or Java programming language - Experience leading and managing teams of data engineers -...
Skills: - Hadoop - Python - Spark - PySpark - ETL (Extract, Transform, Load) Roles & Responsibilities: - Data Ingestion: Develop and maintain data pipelines for ingesting raw data from various sources into the Hadoop ecosystem. - Data Processing: Utilize Python and Spark to process...
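The ingestion-and-processing responsibilities above follow the classic extract-transform-load pattern. A minimal, stdlib-only Python sketch of that pattern is shown below; in the role described, extraction would target files landing in HDFS and the transform step would run on Spark, so the in-memory CSV, the field names, and the SQLite "warehouse" here are all hypothetical stand-ins:

```python
import csv
import io
import sqlite3

# Hypothetical raw input; in the posting above this would be files
# arriving in HDFS, not an in-memory CSV string.
RAW_CSV = """user_id,event,amount
1,purchase,19.5
2,refund,-5.0
3,purchase,42.25
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse raw CSV text into dict records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(records: list[dict]) -> list[tuple]:
    """Transform: keep purchases only and cast types
    (a stand-in for the Spark processing step)."""
    return [
        (int(r["user_id"]), r["event"], float(r["amount"]))
        for r in records
        if r["event"] == "purchase"
    ]

def load(rows: list[tuple]) -> sqlite3.Connection:
    """Load: write the cleaned rows into a warehouse table (SQLite here)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (user_id INTEGER, event TEXT, amount REAL)")
    conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)
    return conn

conn = load(transform(extract(RAW_CSV)))
total = conn.execute("SELECT SUM(amount) FROM events").fetchone()[0]
print(total)  # 61.75
```

The three small functions mirror the pipeline stages named in the posting; at scale, each stage would be swapped for its distributed counterpart (HDFS ingestion, Spark jobs, a real warehouse) without changing the overall shape.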
...data pipelines.- Proficiency in programming languages such as Python, Java, or Scala.- Experience with big data technologies such as Hadoop, Spark, or Kafka.- Strong SQL skills and experience with relational databases (e.g., MySQL, PostgreSQL).- Experience with cloud platforms...
...A Sr. Site Reliability Engineer performs a variety of tasks and must demonstrate a deep understanding of Hadoop and its related tools, such as Hive, Spark, and HDFS. Some of the main responsibilities are:
Single window support: Leverage deep understanding of Hadoop...
...Greetings From Maneva!
Job Description
Job Title: Hadoop Developer
Location: Bangalore
Experience: 5-10 Years
Job Requirements:
Hadoop: strong in Scala and Spark along with SQL, and a good understanding of handling large volumes of data.
Hadoop...
...expertise.
Job Description:
We are seeking a skilled AWS Hadoop Administrator to join our team and manage our Hadoop infrastructure... ...and ecosystem technologies such as HDFS, YARN, MapReduce, Hive, Spark, and Kafka.
Experience deploying and managing Hadoop clusters...
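As one illustration of the cluster-administration work such postings describe, HDFS block replication is a routine knob an administrator sets in `hdfs-site.xml`. The property name `dfs.replication` is a standard Hadoop setting; the value shown is only an illustrative sketch, not a recommendation for any particular cluster:

```xml
<configuration>
  <!-- Number of copies HDFS keeps of each block; 3 is the common default. -->
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
</configuration>
```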
...skills in SQL and advanced-level SQL. - Knowledge of Big Data processing clusters (Azure HDInsight / AWS EMR / Hadoop / Databricks) - Big data processing experience using Spark, Spark SQL / PySpark / Azure Data Factory / AWS Glue, etc. - Working experience with Jupyter Notebook /...
...collaborative Agile development process: - Software design, Scala & Spark development, automated testing of new and existing components in... ...8+. - Experience with most of the following technologies (Apache Hadoop, Scala, Apache Spark, Spark Streaming, YARN, Kafka, Hive, Python,...
...Minimum of 3 years of experience with a focus on ML Engineering.- Big Data Tools : Familiarity with big data processing tools like Hadoop, Spark, SQL or similar.- Familiarity with AWS/Databricks or another cloud environment is required.- Programming Skills : Proficiency in...
...Responsibilities:
Design, deploy, and support highly available and scalable Hadoop clusters for high data volumes.
Ensure performance, security,... ...issues of batch jobs and should be able to understand the code (Spark, Python, Java).
Deep understanding of Cloudera Manager,...
Remote job
...and collaborative environment, leveraging your expertise in Hive, Hadoop, and PySpark to unlock valuable insights from our data. Key... ...Implement performance tuning and optimization strategies for Hadoop and Spark. Data Governance: - Implement data security and access controls to...
...configuration and deployment, along with the ability to build custom solutions - Have experience building Data Pipelines using Scala, Spark, Hadoop, HiveQL, etc. - Have experience with streaming frameworks such as Kafka. - Have experience with Data Warehousing, Data Modelling and...
...Job Description: Good hands-on experience in Hadoop
Strong technical knowledge in pyspark
Good communication skills to handle client requirements.
Must Have Skills: Hadoop + Pyspark
Experience: 4-6 years
Notice Period: 0-15 Days
Work Timing: Regular Shift...
Job Description: - Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet) - Python programming language is mandatory. - PySpark - Excellent with SQL - Excellent with Airflow is a plus. Good to Have: - Airflow - Good aptitude, strong problem-solving abilities, and analytical...
...Years Location: Anywhere in India Education: BE, B.Tech, Any Tech Graduate Must-Have Technical Skills: including 3+ years Spark or Scala - 2+ years of Hadoop/Big Data using tools like Hive, Spark, PySpark, Scala, and RDBMS/SQL. Strongly Preferred: GCP, including GCS (Google...