Average salary: Rs 2,013,437 / year
Search Results: 173 vacancies
Key Skills:
- 4-7 years of experience in DE development
- Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)
- Python programming language is mandatory
- PySpark
- Excellent with SQL
- Excellent with Airflow is a plus
Good to Have:
- Airflow
- Good aptitude, strong...
...Position - Hadoop Developer
Location - Pune/Hyderabad/Gurgaon/Bangalore (Hybrid)
Experience - 5+ years
We are looking for a talented Hadoop... ...and expertise in the Hadoop ecosystem (MapReduce, Apache Spark, Hive), data integration using JDBC/ODBC connectors, Impala, and databases...
...DataProc, Dataflow, Pub/Sub.
- Experience with Big Data tools such as Hadoop and Apache Spark (PySpark)
- Experience developing DAGs in Apache... ...of data to the cloud using Big Data technologies like Spark, Hive, Talend, Java
- Interact with customers on a daily basis to ensure smooth...
...Pune/Bangalore/Hyderabad
Must have: Scala, Spark, CI/CD, Airflow, Hadoop, SQL
Job Description: As a key member of the technical team... ...Apache Hadoop, Scala, Apache Spark, Spark Streaming, YARN, Kafka, Hive, Python, ETL frameworks, MapReduce, SQL, RESTful services).
- Sound...
...understanding of the mechanism necessary to successfully implement a change.
Good to have:
- Experience in data management and data lineage tools like Collibra, Alteryx and Solidatus
- Experience in Data Vault modeling
- Knowledge of Hadoop, Google BigQuery is a plus. (ref:hirist.tech)
...source, data-intensive, distributed environments with a minimum of 5 years of experience in big data technologies like Spark, Hive, HBase, Hadoop, etc.
Programming background:
- Mandatory: Scala, Spark and Java/Python
- Experience in the following technologies - MapReduce,...
Job Description:
- 4+ years of overall Data Analytics and BI experience
- Experience in Spark, Hive, Scala
- Build data pipelines for ETL that fetch data from a variety of sources such as flat files, relational databases and APIs
- Python scripting with a focus on data transformation...
...Position Overview
Job Title - Data Engineer (Oracle, Big Data, Hadoop, Spark, GCP)
Location - Magarpatta, Pune
Role... ...CI/CD pipelines.
Proficient in Hadoop, Python, Spark, SQL, Unix, Hive.
Understanding of on-prem and GCP data security.
Hands on experience...
...Greetings From Maneva!
Job Description
Job Title: Cloudera Hadoop Architect
Location: Chennai / Pune / GNDC
Experience: 8-15 years
Relevant Experience: 10 years
Job Requirements:
Strong knowledge and hands-on experience...
Skills:
- Hadoop
- Python
- Spark
- PySpark
- ETL (Extract, Transform, Load)
Roles & Responsibilities:
- Data Ingestion: Develop and maintain data pipelines for ingesting raw data from various sources into the Hadoop ecosystem.
- Data Processing: Utilize Python and Spark to process...
...Table
- Proven real-time exposure and use of contemporary data mining, cloud computing, and data management ecosystems like Google Cloud, Hadoop, HDFS, and Spark.
- Proficient in data modelling that can represent complex data structures while ensuring accuracy, consistency, and...
...in Databricks
Notice Period: Immediate - 30 days
Location: Bangalore / Pune
Skills:
- Experience in the Big Data/Hadoop ecosystem. Hands-on experience in Spark and Hive. Mandatory in Databricks.
- Data Pipeline Development: Design, develop, and maintain scalable and efficient data...
...that makes better decisions, drives innovation and delivers better business results.
Title and Summary
Senior Software Engineer - Hadoop, Spark with Python, SQL, Tableau / Power BI
Overview:
Are you a skilled and visionary Data Analyst with a passion for shaping robust...
...or more of the following subject areas:
Very good knowledge of the following technologies is needed:
Java / Scala, Spark, Hadoop
Hive, Workflow orchestrators like Airflow, Control-M
Automation through Python / Bash / shell scripting
Data Pipelines on-prem or on...
...NoSQL, document databases, HDFS, caching, data access virtualization, streaming and messaging.
~ Required Experience in Spark, Hadoop, Hive, XML processing with XSLT, Apache Flume, Pig, Cloudera ecosystem.
~ Required Experience with Spring Framework and Spring Boot,...
...Analyst with good decision-making, analytical and problem-solving skills.
~ Working knowledge / experience of Big Data frameworks like Hadoop, Hive and Spark.
~ Hands-on experience in query languages like HQL or SQL (Spark SQL) for data exploration.
~ Data mapping:...
..., and data pipeline orchestration with tools such as Apache Airflow and NiFi.
o Large-scale / Big Data technology, such as Hadoop, Spark, Hive, Impala, PrestoDB, Kafka.
o Fast and distributed data processing in Ray, Dask, DuckDB, Polars, cuDF
o Building performant...
...enterprise databases and ensure follow industry best practices around data privacy.
• Expertise in using Python or Scala, Spark, Hadoop platforms & tools (Hive, Impala, Airflow, NiFi, Sqoop), SQL to build Big Data products & platforms.
• Experience in Java/.NET, Scala, or Python...
...attitude to getting stuff done!
Good understanding of Big Data Hadoop implementations, on-prem and in the cloud.
Our Tech Stack:
Languages... ...Dataflow, RxJava, Micrometer
Databases: MySQL, BigQuery, Apache Hive, Apache Spark SQL, Neo4j, Oracle
Scheduling Tools: Control-M, Airflow...
...implementation of large-scale Big Data, micro-services, event driven distributed systems.
• Expertise in Big Data Technologies (Hadoop, Spark, Hive, HBase), Web Applications (Spring Boot, Angular, Java, PCF/Cloud Native), Web Services (REST/OAuth), Event Technologies (Kafka...