Average salary: Rs 1,284,555 per year
More stats ...datasets
- Build processes supporting data transformation, data structures, metadata, dependency and workload management
- Big data tools like Hadoop, Spark, Kafka, etc.
- Message queuing, stream processing, and highly scalable big data stores
- Building and optimizing big...
We are seeking a skilled AWS Hadoop Administrator to join our team and manage our Hadoop infrastructure on the AWS platform. In this role, you will be responsible for deploying, configuring, monitoring, and maintaining our Hadoop clusters to ensure optimal performance, scalability...
Job Description:
1. Technologies: Hive, Pig, Sqoop, ZooKeeper, Cloudera - these are Hadoop ecosystem tools
2. Candidate should have worked on at least one production project, not only understanding or theoretical knowledge
3. Should have basics of Spark and Kafka -- at...
Job Description:
- 4-7 years of experience in DE development
- Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)
- Python programming language is mandatory
- PySpark
- Excellent with SQL
- Excellent with Airflow is a plus
- Good aptitude...
...understanding of the mechanism necessary to successfully implement a change.Good to have :- Experience in data management and data lineage tools like Collibra, Alteryx and Solidatus- Experience in Data Vault Modeling.- Knowledge of Hadoop, Google Big Query is a plus. (ref:hirist.tech)
...Learning, LLM application development, DevOPS and UI designing.
Expert proficiency interrogating distributed databases (MapReduce, Hadoop, Hive) is also highly desired. Should have worked on handling web automation projects.
Positive work attitude with ability to work...
...to reporting and MI development.
• SQL and SAS expertise (required to review report builds prior to going live)
• Tableau, Hive, Impala, Hadoop knowledge
• Credit Risk reporting and analysis experience
• Stakeholder management and strong communication and organizational...
...ability to problem solve
Experience in (or a willingness to get to grips with) database interrogation and analysis tools, such as Hadoop, SQL and SAS
Strong problem solving skills with an emphasis on product development
Experience using statistical computer languages...
...IIIT etc.)
Good analytical thought process and aptitude for problem solving
Hands-on experience using SQL/PySpark SQL in a big data Hadoop/Spark environment
Hands-on experience in data cleaning and feature creation using Python/PySpark
Strong business acumen with a...
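The data-cleaning and feature-creation requirement above is concrete enough to sketch. In practice this would run in Python/PySpark on a Hadoop/Spark cluster; the minimal stand-in below uses plain Python dicts as rows, and the column names (age, income) are hypothetical, not from any listing:

```python
# Minimal sketch of a clean-then-derive-features step. Plain Python dicts
# stand in for DataFrame rows; field names (age, income) are illustrative.

def clean(rows):
    """Drop rows with missing values and coerce numeric fields."""
    cleaned = []
    for row in rows:
        if row.get("age") is None or row.get("income") is None:
            continue  # drop incomplete records
        cleaned.append({"age": int(row["age"]), "income": float(row["income"])})
    return cleaned

def add_features(rows):
    """Derive a simple feature: income per year of age."""
    for row in rows:
        row["income_per_age"] = row["income"] / row["age"]
    return rows

raw = [
    {"age": "42", "income": "50000"},
    {"age": None, "income": "10000"},   # dropped by clean()
    {"age": "30", "income": "90000"},
]
features = add_features(clean(raw))
print(len(features))  # 2
```

The same drop/coerce/derive shape maps directly onto PySpark's `dropna`, `cast`, and `withColumn` when the data no longer fits on one machine.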
...Studio
• Execution paradigm – low latency/streaming, batch
Preferred Qualifications/ Skills
• Data platforms – Big Data (Hadoop, Spark, Hive, Kafka etc.) and Data Warehouse (Teradata, Redshift, BigQuery, Snowflake etc.)
• Visualization Tools - PowerBI, Tableau...
Rs 18 lakh p.a.
...Expertise in scripting languages like Python, Bash, Perl, etc. preferred
• Specialization in database engineering skills – SQL, NoSQL, Hadoop, etc.
• Exposure to warehousing architecture processes – MOLAP, ROLAP, EDW, etc.
• Exposure to Big Data components like Hadoop...
...Very good knowledge of data warehousing, SQL and Unix shell scripting.- Knowledge of the ETL side of cloud platforms like AWS or Azure, and of the Hadoop platform, is also an added advantage.- Experience in working with banking domain data is an added advantage.- Excellent technical...
.../ Python, should have experience in Data Ingestion, Integration and data Wrangling, Computation, Analytics pipelines and exposure to Hadoop ecosystem components. You are also required to have hands-on knowledge on at least one of AWS, GCP, Azure cloud platforms.Role & Responsibilities...
Rs 12 - 16 lakhs p.a.
...conversant with Apache Spark architecture, RDDs, various transformations and actions, Spark configuration and tuning techniques B: Knowledge of Hadoop architecture; execution engines, frameworks, application tools C: PySpark using the Spark MLlib library D: Exposure to data warehousing...
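The "transformations and actions" distinction in point A comes down to lazy evaluation: Spark transformations (map, filter) only describe a computation, and nothing executes until an action (collect, count, reduce) consumes the result. Python's built-in `map`/`filter` are also lazy iterators, which makes them a convenient stand-in for the concept; this is an analogy, not PySpark code:

```python
# Spark "transformations" are lazy: they build a plan without running it, and
# only an "action" triggers execution. Python's map/filter behave the same
# way, so they illustrate the concept; a real RDD pipeline looks analogous.

data = range(1, 7)  # stand-in for an RDD of 1..6

doubled = map(lambda x: x * 2, data)             # transformation: nothing runs yet
divisible = filter(lambda x: x % 3 == 0, doubled)  # another lazy transformation

# Only here, at the "action", does the whole pipeline actually execute:
result = list(divisible)
print(result)  # [6, 12]
```

In PySpark the same pipeline would be `rdd.map(...).filter(...).collect()`, with the driver seeing no work done until `collect()` is called.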
...Athena, EMR, Spark, S3.
~4+ years of experience as a SQL developer or similar role.
~ Fundamentals of distributed systems like Apache Hadoop, Apache Spark.
~ Should be experienced in data structures, data modeling, data lakes, data architecture.
~ Excellent understanding...
...Machine learning: TensorFlow, PyTorch, or Scikit-learn
Artificial intelligence: AWS SageMaker, Azure ML, or GCP AI Platform
Big data: Hadoop, Spark, or Kafka
Programming languages: Python, Scala, or Java
Desirable skills-
Data engineering: Databricks, AWS EMR, or...
...structures, algorithms, and statistical techniques for data analysis and modeling.
Experience with big data technologies (e.g., Hadoop, Spark) and cloud computing platforms (e.g., AWS, Azure, GCP) is a plus.
Strong problem-solving skills and the ability to think creatively...
...modeling
· Strong SQL query-writing skills
· Strong analytical, profiling and troubleshooting skills
· Strong knowledge of the Big Data/Hadoop platform
· Strong knowledge of Unix shell and experience with Python
· Experience in data integration and data conversions
·...
...of data models like Snowflake schema and Star schema, and conceptual understanding of Data Warehouse, Big Data, Dimensional Modeling, Hadoop- Exposure to working in an Agile framework. Tools: JIRA, Confluence. Responsibilities:- Support the build of outcome-driven visualisations...
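The Star-schema knowledge asked for above reduces to a fact table joined to dimension tables. A minimal runnable sketch using Python's built-in sqlite3 (table and column names are illustrative, not from the listing; real warehouses like BigQuery, Teradata or Snowflake differ in DDL but share the join pattern):

```python
import sqlite3

# A tiny star schema: one fact table (fact_sales) pointing at one dimension
# table (dim_product). Names and data are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (sale_id INTEGER PRIMARY KEY,
                              product_id INTEGER REFERENCES dim_product,
                              amount REAL);
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales  VALUES (10, 1, 99.0), (11, 1, 1.0), (12, 2, 5.0);
""")

# The typical dimensional query: aggregate the fact table grouped by a
# dimension attribute reached through the foreign-key join.
rows = con.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 5.0), ('widget', 100.0)]
```

A Snowflake schema simply normalises the dimension further (e.g. dim_product referencing a dim_category), adding joins but reducing redundancy.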
Rs 12 - 16 lakhs p.a.
...and configure applications to meet business process and application requirements. Must have Skills : Apache Spark Good to Have Skills : Hadoop, Job Requirements : Key Responsibilities : A: Create scala/spark jobs for data transformation and aggregation B: Write Scala doc style...
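Responsibility A above (jobs for data transformation and aggregation) follows a transform-then-reduce-by-key shape regardless of language. The listing wants Scala/Spark; the sketch below shows the same shape in plain Python for brevity, with hypothetical record fields (country, clicks):

```python
from collections import defaultdict

# Plain-Python sketch of a transform-and-aggregate job, not Spark code.
# Record fields (country, clicks) are invented for illustration.
records = [
    {"country": "IN", "clicks": 3},
    {"country": "IN", "clicks": 5},
    {"country": "US", "clicks": 2},
]

# Transformation: normalise each record into a (key, value) pair.
pairs = ((r["country"], r["clicks"]) for r in records)

# Aggregation: sum values per key, as Spark's reduceByKey would.
totals = defaultdict(int)
for country, clicks in pairs:
    totals[country] += clicks

print(dict(totals))  # {'IN': 8, 'US': 2}
```

In Scala/Spark the equivalent is roughly `records.map(r => (r.country, r.clicks)).reduceByKey(_ + _)`, distributed across the cluster instead of a single loop.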
...working with unstructured datasets.
Proficiency in Spark technology and the Elastic pipeline engine.
Experience with big data tools: Hadoop, Spark, Kafka, etc.
Experience with relational SQL and NoSQL databases.
Experience with data pipeline and workflow management...
...with different priorities at the same time in a fast-paced environment- Other: Elastic stack, RESTful APIs, Node.js- Big data tools: Hadoop, Spark, Presto, Kafka, etc.- Object-oriented languages: Java, C++, etc.- Excellent self-management and problem-solving skills- Can work...
...Scala(Preferred) or Python
Additional Skills: Exposure to multiple technologies: Java, Big Data, Spark, PySpark, Spark SQL, Scala, Hadoop, HDFS, YARN, Hive, GCP Cloud, Google Cloud Dataflow, Big Query, Dataproc, Elasticsearch, MySQL, Oracle, Cloudera, Azure HDInsight, AWS...
...foundation in OOPS and functional programming concepts.
Big Data Technologies : Hands-on experience with data processing frameworks (e.g., Hadoop, Spark, Apache Hive, NiFi, Ozone, Kudu), ideally including streaming technologies (Kafka, Spark Streaming, Flink, etc.).
Database...
...- Experience in managing Python code and collaborating with the customer on model evolution- Good knowledge of database management and Hadoop/Spark, SQL, Hive, Python (expertise)- Superior analytical and problem-solving skills- Should be able to work on a problem independently...
...and testing using AWS DynamoDB, EKS, Kafka, Kinesis/Spark/Streaming/Python to enable seamless data ingestion for processing on the Hadoop platform.
Data governance and data discovery on a cloud platform
Data processing framework using Spark, Glue, PySpark, Kinesis...
...with hands-on experience in Talend.- Strong knowledge of data modeling, ETL development, and data warehousing concepts.- Experience with Hadoop, Spark, and other big data technologies.- Excellent programming skills in Java, Scala, or Python.- Strong analytical and problem-...
...modeling
Strong SQL query-writing skills
Strong analytical, profiling and troubleshooting skills
· Strong knowledge of the Big Data/Hadoop platform
· Strong knowledge of Unix shell and experience with Python
Experience in data integration and data conversions...
Rs 20 - 21 lakhs p.a.
...What is Cloudera Enterprise? Enterprise Platform for Big Data. Cloudera Enterprise includes CDH, the world’s most popular open source Hadoop-based platform, as well as advanced system management and data management tools plus dedicated support and community advocacy from our...
Experience: 7-11 years. Key Skills: Hadoop, Hive, GCP, API, design principles, design architecture, data management, data governance. Responsibilities:- Working with the DAO ESG delivery leads, Portfolio and Platform architects from multiple GB/GFs, Technical Lead & Engineers...