Average salary: Rs 1,350,000 per year
Search Results: 180 vacancies
Job Description: Hadoop/Omnia Dev Engineers. Develop and deliver code for the work assigned in accordance with time, quality, and cost standards.
Job responsibilities:
- Interact with business stakeholders and designers to understand and implement business requirements.
- Hadoop...
Job Description:
- Must have working experience in designing, building, installing, configuring, and supporting Hadoop.
- Good to have Teradata, Cloud & Snowflake knowledge.
- Must have working experience with IntelliJ IDEA, AutoSys (Control-M), WinSCP, PuTTY & GitHub.
- Translate...
...with 5+ years on PySpark/NoSQL is mandatory)
1. Person should be strong in PySpark
2. Should have hands-on experience with the MWAA (Airflow) / AWS EMR (Hadoop, Hive) framework
3. Hands-on working knowledge of Python
4. Knowledge of AWS services like EMR, S3, Lambda, Step Functions, Aurora -...
Skills:
- Hadoop
- Python
- Spark
- PySpark
- ETL (Extract, Transform, Load)
Roles & Responsibilities:
- Data Ingestion: Develop and maintain data pipelines for ingesting raw data from various sources into the Hadoop ecosystem.
- Data Processing: Utilize Python and Spark to process...
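The ingest-and-process duties above follow a standard extract/transform/load shape. The sketch below is hypothetical and not part of any listing: in a real Hadoop deployment these stages would typically use PySpark DataFrames, but plain Python is used here so the example stays self-contained. The in-memory string and list stand in for an HDFS landing file and a Hive table.

```python
import csv
import io

# Extract: raw CSV records from a source (an in-memory string standing
# in for a file landed in HDFS).
RAW = """id,amount
1,100
2,250
3,-40
"""

def extract(source: str):
    """Parse raw CSV text into a list of dictionaries."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows):
    """Cast types and filter out invalid (negative-amount) records."""
    cleaned = []
    for row in rows:
        amount = int(row["amount"])
        if amount >= 0:
            cleaned.append({"id": int(row["id"]), "amount": amount})
    return cleaned

def load(rows, target: list):
    """Append cleaned records to the target store (a list standing in
    for a Hive table or HDFS output directory)."""
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract(RAW)), warehouse)
```

In PySpark the same shape would typically become `spark.read.csv(...)` for extract, a `filter`/`withColumn` chain for transform, and `df.write.saveAsTable(...)` for load.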
...with common business use cases of analytics.
Nice-to-Have:
- Experience with data engineering and big data tools (e.g., Google BigQuery, Hadoop/AWS EMR, Kafka).
- Programming in NoSQL environments.
- Experience with data visualization and presentation using tools like Tableau,...
Primary skill: Scala, Spark, Hive, HBase, SQL
Secondary skill: Teradata
Notice Period: maximum 30 days
Need candidates who have experience in application development and data analytics.
Candidates whose major experience is in data importing, data loading, ...
Greetings From Maneva!
Job Description
Job Title - Apache Spark / Python / Hadoop Developer
Location - Bangalore / Chennai
Experience - 4 - 10 Years
Job Requirement:
Skills:-
Apache Spark, Unix, MS Excel.
Hadoop or Dataiku (one with experience, the other can...
Rs 15 - 20 lakhs p.a.
...Role Description: Design, build, and configure applications to meet business process and application requirements.
Must Have Skills: Hadoop
Good to Have Skills: Banking Strategy
Job Requirements:
1. Responsibilities:
A. Understanding the requirements of input-to-output transformations...
...data science concepts
- Experience working with SQL and relational databases
- Proficiency in big data technologies such as Spark or Hadoop
- Excellent understanding of machine learning algorithms and techniques
- Solid knowledge of Azure cloud services and infrastructure
- Strong...
...software systems and building them in a way that is scalable, maintainable, and robust
Experience in designing application solutions in the Hadoop ecosystem
Deep understanding of the concepts in Hive, HDFS, YARN, Spark, Spark SQL, Scala, and PySpark
HDFS file formats and...
Jobs for Humanity is collaborating with FIS Global to build an inclusive and just employment ecosystem. We support individuals coming from all walks of life.
Company Name: FIS Global
Job Description
Position Type: Full time
Type of Hire: Experienced...
...Responsibilities
Design, develop, and deploy scalable Big Data applications using Hadoop ecosystem technologies (Hadoop, HDFS, Hive, etc.).
Collaborate with data scientists and business analysts to understand requirements and translate them into technical solutions...
...Analysis Tools (Online Analytical Processing (OLAP), ETL frameworks)
Servers Knowledge
~ Hortonworks & Cloudera distributions of Hadoop; knowledge of other distributions is an added advantage.
Skills Nice to Have
~ Experience with Big Data tools - not limited to - Python, Spark...
...delivering solutions using Azure including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake
· Experience with big data technologies (Hadoop)
· Databricks & Azure Big Data Architecture certification would be a plus
· Must be team oriented with strong collaboration,...
...Kubernetes, Microservices
Good knowledge of caching technologies like Redis and message queues like Kafka, SQS, MQ, etc.
Knowledge of Big Data, Hadoop would be a plus.
You should be a creative problem-solver who demonstrates clear and thoughtful approaches to challenging technical...
...for identifying accelerators of customer satisfaction and loyalty
Tech Skill
Experience with SQL and data warehousing (e.g. GCP/Hadoop/Teradata/Oracle/DB2)
Experience using tools in BI, ETL, Reporting /Visualization/Dashboards
Programming experience in languages...
...writing complex SQL queries to extract and manipulate data from relational databases (e.g., MySQL, PostgreSQL).
# Experience with Hadoop ecosystem tools, particularly Hive for data warehousing and analytics.
# Strong analytical skills with the ability to interpret data...
..., building and executing data pipelines using the ETL/ELT tool Apache NiFi (source, extraction, processor & sink modules).
Big Data Hadoop - Hortonworks HDP 3.1.x & core components covering the data engineering, management, and operations stack
Big Data Hadoop - Detailed Knowledge...
...side scripting
Has broad experience from either a development or operations perspective
Expert understanding of data analytics, Hadoop, MapReduce, and visualization is a plus
Experience working in a software engineering environment is a plus
Drive complex deployments...
...Requirements
A solid foundation in Java/Scala. A passion for developing and supporting data-driven systems with Big Data components (Hadoop/Spark/Hive).
Strong problem-solving skills; the ability to synthesize information and work in a DevOps way.
Good documentation habits...