Average salary: Rs 700,000 per year
Search Results: 41,255 vacancies
Job Profile: Hadoop Administrator (WFH and WFO; hiring offices: Jaipur and Ahmedabad)
Role: Big Data Engineer
Industry Type: IT Services & Consulting
Job Description:
- Good understanding of SDLC and agile methodologies
- Installation and configuration of Hadoop clusters, including...
Job Description:
- 4-7 years of experience in data engineering (DE) development
- Hadoop ecosystem (HDFS, Hive, YARN; file formats like Avro/Parquet)
- Python programming language is mandatory
- PySpark
- Excellent with SQL
- Experience with Airflow is a plus
- Good aptitude...
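As a rough illustration of the MapReduce processing model that underlies the Hadoop ecosystem named in these listings, here is a toy word count in plain Python (not actual Hadoop or PySpark code; the sample input lines are invented):

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit (word, 1) pairs, as a Hadoop mapper would
    for line in lines:
        for word in line.lower().split():
            yield (word, 1)

def shuffle(pairs):
    # Shuffle: group all values by key, as the framework does between phases
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: aggregate the grouped values (here, sum the counts per word)
    return {word: sum(counts) for word, counts in grouped.items()}

lines = ["big data big pipelines", "data engineering"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["big"])   # 2
print(counts["data"])  # 2
```

The same three phases appear in real HDFS-backed jobs; the framework's contribution is running map and reduce in parallel across nodes and handling the shuffle over the network.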
...25-35 LPA
Job Description:
Responsibilities:
- Responsible for data analysis, data profiling and data sourcing
- Creating Data Quality...
..., profiling and troubleshooting skills
- Strong knowledge of the Big Data/Hadoop platform
- Strong knowledge of Unix shell and experience with...
...Deploying machine learning and deep learning models developed by data science teams into production environments, ensuring they...
...be beneficial.
- Familiarity with distributed computing and big data technologies like Hadoop and Spark
- Familiarity with machine learning operations (...
...of the technical team alongside Engineers, Data Scientists and Data Users, you will be expected... ...most of the following technologies (Apache Hadoop, Scala, Apache Spark, Spark Streaming, YARN...
...management in JIRA.
- Understanding of big data modelling techniques using relational...
...libraries to ease development, monitoring and control of data and models
- Identify potential improvements to the current...
...and operationalization of Data Engineering & AI/ML models across the Big Data ecosystem (PySpark, Hadoop, Snowflake, Python)
- Experience in architecture, design,...
Big Data + Databricks Developer
Experience: 5-8 years (3+ years as a Big Data Engineer; minimum 1 year in Databricks)
Notice Period: Immediate - 30 Days
Location: Bangalore / Pune
Skills:
- Experience in the Big Data/Hadoop ecosystem; hands-on experience in Spark and Hive. Mandatory in...
...Description:
- 5-8 years of total experience, with solid data engineering experience, especially in open source,... ...environments, with a minimum of 5 years of experience in big data technologies like Spark, Hive, HBase, Hadoop, etc.
Programming background:
- Mandatory: Scala, Spark...
...Position Overview
Job Title- Data Engineer (Oracle, Big Data, Hadoop, Spark, GCP)
Location- Magarpatta, Pune
Role Description –
The Senior Engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness...
We are seeking a highly skilled and motivated Hadoop Developer to join our team and play a crucial role in developing and maintaining our big data processing pipelines.
The Ideal Candidate:
Experience:
- 5+ years of experience in developing and maintaining big data applications...
...of ETL jobs, utilizing automation tools whenever possible.
- Utilize your in-depth knowledge of Big Data ETL to identify and design effective test scenarios.
- Experience with Hadoop Hive is essential for testing data transformation and querying functionalities.
- Expertise in...
...Experience: Overall 3+ years of experience working on databases, data warehouses, data integration and BI/reporting solutions with... ...service offerings.
- Should have knowledge and experience of Big Data technologies (Hadoop ecosystem) and NoSQL databases.
- Should have technical...
Job Description:
As a Big Data Engineer, you will be responsible for designing, developing, and maintaining our big data infrastructure... ...and collaborative environment, leveraging your expertise in Hive, Hadoop, and PySpark to unlock valuable insights from our data.
Key...
Job Description:
- Excellent skills in SQL, including advanced-level SQL
- Knowledge of a Big Data processing cluster (Azure HDInsight / AWS EMR / Hadoop / Databricks)
- Big data processing experience using Spark, Spark SQL, PySpark, Azure Data Factory, AWS Glue, etc.
- Working experience...
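As a sketch of the kind of "advanced-level SQL" these listings ask for, here is a window-function query (ranking salaries within each department) run against an in-memory SQLite table with made-up rows; Spark SQL and Hive accept the same `RANK() OVER (...)` syntax:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (name TEXT, dept TEXT, salary INTEGER)")
conn.executemany(
    "INSERT INTO emp VALUES (?, ?, ?)",
    [("asha", "eng", 90), ("ravi", "eng", 70), ("meena", "ops", 60)],
)

# Window function: rank employees by salary within each department,
# without collapsing rows the way GROUP BY would
rows = conn.execute("""
    SELECT name, dept,
           RANK() OVER (PARTITION BY dept ORDER BY salary DESC) AS rnk
    FROM emp
    ORDER BY dept, rnk
""").fetchall()
print(rows)  # [('asha', 'eng', 1), ('ravi', 'eng', 2), ('meena', 'ops', 1)]
```

The table name and rows are invented for illustration; the point is the `PARTITION BY` / `ORDER BY` window clause, a standard interview topic for these roles.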
As a Data Engineer, your specific responsibilities include the following:
- Create and maintain... ...of data sources using SQL and AWS big data technologies
- Build analytics tools that... ...and workload management
- Big data tools like Hadoop, Spark, Kafka, etc.
- Message queuing, stream...
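The message-queuing pattern named in the listing above can be sketched with Python's stdlib `queue` standing in for a real broker like Kafka (the event names are invented; a real setup would add partitions, persistence, and consumer groups):

```python
import queue
import threading

topic = queue.Queue()   # stands in for a Kafka topic
SENTINEL = None         # marks the end of the stream

def producer():
    # Publish a small stream of events, then close the stream
    for event in ["click", "view", "click"]:
        topic.put(event)
    topic.put(SENTINEL)

consumed = []

def consumer():
    # Consume until the sentinel arrives, as a stream processor would
    while True:
        event = topic.get()
        if event is SENTINEL:
            break
        consumed.append(event)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(consumed)  # ['click', 'view', 'click']
```

Decoupling producer from consumer through a queue is the core idea; brokers like Kafka add durability and fan-out on top of it.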
...experience.
Your Role and Responsibilities
Understand a data warehousing solution and be able to work independently in... ...an ETL tool like Informatica (5+ years of experience), Big Data technologies like the Hadoop ecosystem and its various components, along with different tools...
...Responsibilities
Design, develop, and deploy scalable Big Data applications using Hadoop ecosystem technologies (Hadoop, HDFS, Hive, etc.).
Collaborate with data scientists and business analysts to understand requirements and translate them into technical solutions...
...Collaborate with enterprise architects, data architects, developers & engineers, data scientists, and information designers... ...at the organization level.
Hands-on experience in Big Data technologies – Hadoop, Sqoop, Hive and Spark including DevOps.
Strong SQL (Hive...
...experience leading on client-facing projects, including working in close-knit teams
- 3+ years of experience and interest in Big Data technologies (Hadoop / Spark / relational DBs)
- 3+ years of experience working on projects within the cloud, ideally AWS or Azure
- Data...
Data Modeler Responsibilities:
As a data modeler, you will be responsible for understanding and translating business needs into data models... ...like Collibra, Alteryx and Solidatus
- Experience in Data Vault modeling
- Knowledge of Hadoop and Google BigQuery is a plus. (ref:hirist.tech)