Average salary: Rs 1,795,863 per year
Search Results: 128 vacancies
...datasets - Build processes supporting data transformation, data structures, metadata, dependency and workload management - Big data tools like Hadoop, Spark, Kafka, etc. - Message queuing, stream processing, and highly scalable big data stores - Building and optimizing big...
Job Description: - 4-7 years of experience in DE (data engineering) development - Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet) - Python programming language is mandatory - PySpark - Excellent with SQL - Excellent with Airflow is a plus (good to have) - Good aptitude...
Job Description: 1. Technologies should be Hive, Pig, Sqoop, ZooKeeper, Cloudera - these are Hadoop ecosystem tools 2. Candidate should have worked on at least one production project, not only understanding or theoretical knowledge 3. Should have basics of Spark and Kafka -- at...
...understanding of the mechanism necessary to successfully implement a change. Good to have: - Experience with data management and data lineage tools like Collibra, Alteryx, and Solidatus - Experience in Data Vault modeling - Knowledge of Hadoop and Google BigQuery is a plus. (ref:hirist.tech)
We are seeking a skilled AWS Hadoop Administrator to join our team and manage our Hadoop infrastructure on the AWS platform. In this role, you will be responsible for deploying, configuring, monitoring, and maintaining our Hadoop clusters to ensure optimal performance, scalability...
...candidate should have a minimum of 3 years of experience in big data engineering and possess strong expertise in Python, SQL, PySpark, Hadoop, Hive, Delta Lake, and Airflow. Responsibilities: - Design, develop, and maintain large-scale data processing systems - Implement data...
...Experience in fine-tuning open-source LLMs
Ability to analyze and interpret data
Familiarity with big data technologies such as Hadoop and Spark.
Experience with cloud platforms such as AWS, Google Cloud, or Azure.
Experience with version control systems such as...
...process and tools.
Clear understanding of and experience with Python and PySpark, or Spark and Scala, along with Hive, Airflow, Impala, Hadoop, and RDBMS architecture.
Experience in writing Python programs and SQL queries.
Experience in SQL Query tuning.
Experienced in...
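The SQL query tuning skill listed above can be illustrated with a minimal sketch. It uses Python's built-in sqlite3 module purely so the example is self-contained (an assumption; in these roles the target engines would typically be Hive or Impala); the table, columns, and index name are hypothetical. The idea is the same everywhere: inspect the query plan, add an index on the filter column, and confirm the plan switches from a full scan to an index search.

```python
import sqlite3

# In-memory database with a hypothetical events table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i, i % 100, float(i)) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM events WHERE user_id = 42"

# Before indexing: the planner must scan the whole table
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# Add an index on the filter column, then re-check the plan
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(plan_before)  # plan mentions a table SCAN
print(plan_after)   # plan now uses idx_events_user
```

In Hive or Impala the equivalent habit is running `EXPLAIN` on a query and checking partition pruning and join strategy rather than indexes, but the workflow of reading the plan before and after a change carries over directly.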
...it has successfully trained over 20,000 IT professionals in a variety of courses, including Business Analytics, Data Science, Big Data Hadoop, Artificial Intelligence, Digital Marketing, and more.
Role Description
This is a full-time on-site role for a Journalist at Madrid...
...Learning, LLM application development, DevOps, and UI design.
Expert proficiency in interrogating distributed databases (MapReduce, Hadoop, Hive) is also highly desired; should have worked on web automation projects.
Positive work attitude with ability to work...
...Excellent communication and collaboration abilities
Preferred Qualifications :
Experience with big data technologies (e.g., Hadoop, Spark)
Knowledge of cloud platforms (e.g., AWS, GCP, Azure)
Familiarity with machine learning and natural language processing techniques...
...to reporting and MI development.
• SQL and SAS expertise (required to review report builds prior to go-live)
• Tableau, Hive, Impala, Hadoop knowledge
• Credit Risk reporting and analysis experience
• Stakeholder management and strong communication and organizational...
...Regression, Logistic Regression, Random Forest and time series modelling
~ Strong conceptual knowledge of Big Data technologies in the Hadoop ecosystem (Spark, Hive, Pig)
~ Good written and oral communication, with the ability to communicate effectively with stakeholders
~ Prior...
...IIIT etc.)
Good analytical thought process and aptitude for problem solving
Hands-on experience using SQL/PySpark SQL in a big data Hadoop/Spark environment
Hands-on experience in data cleaning and feature creation using Python/PySpark
Strong business acumen with a...
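The data cleaning and feature creation skill mentioned above can be sketched minimally. This uses plain Python rather than PySpark so it runs standalone (an assumption; on a cluster the same steps would be PySpark DataFrame transformations); the records, field names, and derived `senior` flag are hypothetical.

```python
# Hypothetical raw records: inconsistent casing, stray whitespace, missing values
raw = [
    {"name": " Alice ", "age": "34", "salary": "1800000"},
    {"name": "bob",     "age": "",   "salary": "950000"},
    {"name": "CAROL",   "age": "29", "salary": None},
]

def clean(record):
    """Normalise text fields, coerce numerics, and derive a feature."""
    name = record["name"].strip().title()
    age = int(record["age"]) if record["age"] else None
    salary = float(record["salary"]) if record["salary"] else None
    # Feature creation: a simple derived flag from the cleaned age
    senior = age is not None and age >= 30
    return {"name": name, "age": age, "salary": salary, "senior": senior}

cleaned = [clean(r) for r in raw]
print(cleaned[0]["name"])  # Alice
```

In PySpark the `clean` logic would typically become column expressions (`trim`, `initcap`, `cast`, `when`) applied with `withColumn`, so the transformation runs distributed instead of in a Python loop.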
...Studio
• Execution paradigms – low latency/streaming, batch
Preferred Qualifications/ Skills
• Data platforms – Big Data (Hadoop, Spark, Hive, Kafka etc.) and Data Warehouse (Teradata, Redshift, BigQuery, Snowflake etc.)
• Visualization Tools - PowerBI, Tableau...
...Azure or GCP)
~ 6+ years of experience with Big Data technologies, including Apache Spark™, AI, Data Science, Data Engineering, Hadoop, Cassandra, and others
~ Coding experience in Python, R, Java, Apache Spark™ or Scala
Benefits
Private medical insurance...
...Very good knowledge of data warehousing, SQL, and Unix shell scripting. - Knowledge of the ETL side of cloud platforms like AWS or Azure and of the Hadoop platform is an added advantage. - Experience working with banking domain data is an added advantage. - Excellent technical...
...SQL to manipulate and analyze data. - Employ visualization techniques to present data insights clearly. - Utilize Big Data tools such as Hadoop and Hive for data analysis. - Design tests and integrate data from various sources to provide a holistic view of analytics...
Rs 18 lakh p.a.
...Expertise in scripting languages like Python, Bash, Perl, etc. preferred
• Specialization in database engineering skills – SQL, NoSQL, Hadoop, etc.
• Exposure to warehousing architecture processes – MOLAP, ROLAP, EDW, etc.
• Exposure to Big Data components like Hadoop...
...The courses imparted to the students are Business Analytics, Data Science with R and Python, Data Analytics with R and Python, Big Data Hadoop, Machine Learning, Artificial Intelligence, Digital Marketing, Software Testing, UI/UX Design, Amazon Web Services (AWS) in the...