Average salary: Rs 360,999 per year
Search Results: 103 vacancies
...Professional Services Role: Big Data Infrastructure Implementation and Support
Experience: 8 to 12 years of experience in Big Data, Apache Hadoop System Implementation, and administration.
Education: BE (IT/Computers), B.Tech, M.Tech, MCA
Work Location: Riyadh, Saudi Arabia...
...understanding of the mechanism necessary to successfully implement a change.
Good to have:
- Experience with data management and data lineage tools such as Collibra, Alteryx, and Solidatus
- Experience in Data Vault modeling
- Knowledge of Hadoop and Google BigQuery is a plus. (ref:hirist.tech)
Job Description:
- 4-7 years of experience in data engineering (DE) development
- Hadoop ecosystem (HDFS, Hive, YARN, file formats such as Avro/Parquet)
- Python programming language is mandatory
- PySpark
- Excellent SQL skills
- Experience with Airflow is a plus
- Good aptitude...
...datasets
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management
- Big data tools such as Hadoop, Spark, Kafka, etc.
- Message queuing, stream processing, and highly scalable big data stores
- Building and optimizing big...
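The message-queuing and stream-processing pattern this listing asks for can be sketched with Python's standard library alone. This is a toy, in-process stand-in: the `queue.Queue` plays the role a broker such as Kafka would in production, and the event names are hypothetical.

```python
import queue
import threading

# In-process stand-in for a message broker such as Kafka.
broker = queue.Queue()
SENTINEL = None  # signals end of stream

def producer(events):
    """Publish each event to the queue, then signal completion."""
    for event in events:
        broker.put(event)
    broker.put(SENTINEL)

def consumer(counts):
    """Consume events until the sentinel, keeping running counts."""
    while True:
        event = broker.get()
        if event is SENTINEL:
            break
        counts[event] = counts.get(event, 0) + 1

counts = {}
t_cons = threading.Thread(target=consumer, args=(counts,))
t_prod = threading.Thread(target=producer, args=(["click", "view", "click"],))
t_cons.start()
t_prod.start()
t_prod.join()
t_cons.join()
print(counts)  # {'click': 2, 'view': 1}
```

The consumer never sees the whole dataset at once, only one event at a time, which is the property that lets the same logic scale to unbounded streams.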
Job Description:
1. Technologies: Hive, Pig, Sqoop, ZooKeeper, Cloudera - these are Hadoop ecosystem tools
2. Candidate should have worked on at least one production project, not just theoretical knowledge
3. Should have the basics of Spark and Kafka -- at...
...volumes of structured and unstructured data.
- Implement data processing solutions using distributed computing frameworks such as Apache Hadoop, Apache Spark, or Apache Flink
- Develop and maintain ETL (Extract, Transform, Load) processes to ensure data quality and consistency...
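The ETL responsibility named above follows a fixed three-phase shape regardless of framework. A stdlib-only miniature (the CSV payload and `sales` table are invented for illustration; in a real pipeline Spark or Flink would replace the in-memory steps):

```python
import csv
import io
import sqlite3

# Hypothetical raw extract; in practice this arrives from files, a queue, or an API.
RAW = "id,amount\n1,10.5\n2,\n3,7.25\n"

# Extract: parse the raw source into records.
rows = list(csv.DictReader(io.StringIO(RAW)))

# Transform: enforce data quality - drop rows with missing amounts, cast types.
clean = [(int(r["id"]), float(r["amount"])) for r in rows if r["amount"]]

# Load: write the cleaned records into the target store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", clean)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone()
print(total)  # (2, 17.75)
```

Keeping the transform step pure (no I/O) is what makes quality rules easy to test before they run against production volumes.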
...experience with Red Hat Linux and Cloudera is mandatory.
- Experience installing, configuring, upgrading, managing, and administering Cloudera Hadoop
- Responsible for deployments; monitor for capacity, performance, and/or troubleshooting issues
- Work closely with data scientists...
...candidate should have a minimum of 3 years of experience in big data engineering and possess strong expertise in Python, SQL, PySpark, Hadoop, Hive, Delta Lake, and Airflow.
Responsibilities:
- Design, develop, and maintain large-scale data processing systems
- Implement data...
...pattern recognition and predictive modeling
Experience with Excel, PowerPoint, Tableau, and SQL
Familiarity with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure) is a plus
Great problem-solving skills and the ability to translate business questions...
...technologies and frameworks
Exposure to Terraform, Kubernetes, Python and Shell Scripting is essential
Hands-on experience in Spark, Hive, Hadoop, or Presto.
Familiarity with Elasticsearch and MongoDB is a plus
Liaise with Product Management, DevOps, QA, and other teams...
...data mining and segmentation techniques
Experience with Excel, PowerPoint, Tableau, and SQL
Familiarity with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure) is a plus
Great problem-solving skills and the ability to translate business questions...
...to spot trends and tease out patterns in large datasets
• Good analytical and problem-solving skills, and ability to set goals and meet deadlines in a fast-paced working environment
Preferred Qualifications/skills
• Knowledge of Hadoop/Hive is good to have...
...Description for Java Spring Boot with Big Data-406562-1
Total Exp: 8-12 years
Location: Bangalore
Job Skill Required:
Java Spring Boot + Big Data
Skill - Core Java + Java Spring + Big Data/Hadoop
Java Spring is critical to have
Hands on experience in Big Data...
...AWS/Azure cloud data stores and their DB/DW-related service offerings.
Should have knowledge and experience of Big Data technologies (Hadoop ecosystem) and NoSQL databases.
Should have technical expertise and working experience in at least 2 reporting tools among Power...
...and testing using AWS services (DynamoDB, EKS, Kafka, Kinesis/Spark/Streaming/Python) to enable seamless data ingestion and processing on the Hadoop platform.
Data Governance and Data Discovery on Cloud Platform
Data processing framework using Spark, Glue, PySpark, Kinesis...
...process and tools.
● Clear understanding of and experience with Python and PySpark, or Spark and Scala, along with Hive, Airflow, Impala, Hadoop, and RDBMS architecture.
● Experience in writing Python programs and SQL queries.
● Experience in SQL Query tuning.
● Experienced...
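The SQL query tuning this listing asks about usually starts with reading the query plan. A self-contained demonstration with Python's built-in `sqlite3` (the `events` table and index name are invented for the example; plan wording varies by SQLite version):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, action TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [(i % 100, "click") for i in range(1000)])

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"

# Without an index, the engine must scan the whole table.
plan = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]
print(plan)  # e.g. "SCAN events"

# An index on the filtered column lets the engine seek instead of scan.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
plan_indexed = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]
print(plan_indexed)  # e.g. "SEARCH events USING COVERING INDEX idx_events_user ..."
```

The same scan-versus-seek reading applies to `EXPLAIN` output in Hive, Impala, or any RDBMS, even though the plan syntax differs.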
...Insights
About Kyvos Insights Inc. : Kyvos Insights is committed to unlocking the power of big data analytics with its unique OLAP on Hadoop technology. Backed by years of analytics expertise and a passion for big data, the company aims to revolutionize big data analytics by...
...regression, deep learning, NLP, etc.)
Experience with a ML/data-centric programming language (such as Python, Scala, or R) and ML libraries (pandas, numpy, scikit-learn, etc.)
Experience with Apache Hadoop / Spark (or equivalent cloud-computing/map-reduce framework...
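The map-reduce model that Hadoop and Spark implement can be shown in miniature with a stdlib-only word count. This is a single-process sketch of the two phases; the frameworks' contribution is distributing exactly these steps across a cluster:

```python
from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word occurrence."""
    for line in lines:
        for word in line.split():
            yield (word, 1)

def reduce_phase(pairs):
    """Shuffle + reduce: group pairs by key, then sum each group's counts."""
    for word, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        yield (word, sum(count for _, count in group))

lines = ["big data big compute", "big cluster"]
result = dict(reduce_phase(map_phase(lines)))
print(result)  # {'big': 3, 'cluster': 1, 'compute': 1, 'data': 1}
```

The `sorted`/`groupby` step stands in for the shuffle that a cluster performs over the network, which is why shuffle cost dominates tuning discussions in these frameworks.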
...on AWS/Azure cloud data stores and their DB/DW-related service offerings.
Have knowledge and experience of Big Data technologies (Hadoop ecosystem) and NoSQL databases.
Have technical expertise and working experience in at least 2 reporting tools among Power BI, Tableau...
Rs 25 - 30 lakhs p.a.
...functional team. Excellent communication and collaboration skills
Bonus points for: Experience working with big data platforms (Spark, Hadoop, etc.)
Experience with cloud platforms (AWS, GCP, Azure, etc.) Experience with A/B testing and experimentation methodologies...