Average salary: Rs 933,749 / year
Search Results: 6,033 vacancies
...Required: 4-5 yrs. Location: Gurgaon
Immediate Joiners only
JOB DESCRIPTION:
Skillset:
Should have strong experience in big data technologies.
Data pipelines – experience building data pipelines for data migration
Big Data – PySpark,...
...working on Java for at least the last 3 years (must have a good understanding of multithreading concepts, data structures and algorithms, and Maven).
Docker/K8s/GCP is a strong plus.
Cassandra/Kafka/Maven – nice to have.
Experience with integration and / or batch frameworks...
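Several of these listings ask for "building data pipelines for data migration." The pattern behind that phrase is extract-transform-load; below is a minimal, stdlib-only Python sketch of it (all names and data are illustrative, and in the roles above this would typically be a PySpark job rather than plain Python):

```python
import csv
import io

def extract(source: str) -> list[dict]:
    """Read rows from a CSV export (an in-memory string here for illustration)."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    """Normalize types and drop incomplete records."""
    clean = []
    for row in rows:
        if row["salary"]:  # skip records missing a salary value
            clean.append({"name": row["name"], "salary": int(row["salary"])})
    return clean

def load(rows: list[dict]) -> dict:
    """Stand-in for a write to the target store; returns a load summary."""
    return {"loaded": len(rows), "total_salary": sum(r["salary"] for r in rows)}

# A toy legacy export with one incomplete record.
legacy_export = "name,salary\nasha,900000\nravi,\nmeena,1100000\n"
summary = load(transform(extract(legacy_export)))
print(summary)  # {'loaded': 2, 'total_salary': 2000000}
```

The three stages stay separate so each can be tested on its own; a real migration would swap `extract` and `load` for reads and writes against the source and target systems.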
Job Description: Responsibilities:
- Support a critical project initiative to migrate programming code, processes, data, and reports from the YellowBrick / DBeaver Big Data environment to the Google Cloud Platform Big Data environment.
- Expert in SQL, Google BigQuery and Google Cloud...
Rs 5 - 10 lakhs p.a.
# Hiring Alert #
We are looking for Immediate Joiners
Skills:
Big Data with GCP
GCP; Hive; PySpark; Python
GCP Data Engineer with 3+ years of hands-on experience in BigQuery, Dataproc, Airflow, Cloud Composer, GCP Hydra Services, Cloud Data Optimization...
Job Description:
- 4-7 years of experience in DE development.
- Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet).
- Python programming language is mandatory.
- PySpark.
- Excellent with SQL.
- Excellent with Airflow is a plus.
- Good aptitude...
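The "PySpark + excellent SQL" combination these postings ask for usually means expressing SQL-style aggregations over large datasets. A stdlib-only sketch of the idea (illustrative data; in PySpark this would be `df.groupBy("dept").agg(...)` over Hive or Parquet tables):

```python
from collections import defaultdict

# Toy event records; in a data engineering job these would come from Hive/Parquet.
events = [
    {"dept": "sales", "amount": 120},
    {"dept": "sales", "amount": 80},
    {"dept": "ops", "amount": 50},
]

# Equivalent of SQL: SELECT dept, SUM(amount) FROM events GROUP BY dept
totals: dict[str, int] = defaultdict(int)
for e in events:
    totals[e["dept"]] += e["amount"]

print(dict(totals))  # {'sales': 200, 'ops': 50}
```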
Role: Data Engineer
Location: Gurgaon
Designation: AM - AVP (as per interview performance & EXL fitment)
Work Timings: 1pm - 10pm
Role Overview... ...jobs.
- Must have experience with managing and transforming big data sets using PySpark, Spark-Scala.
- Experience with AWS services...
Must-Have:
- 5+ years of experience in the software and IT industry.
- 3+ years of relevant work experience in data engineering.
- Hands-on in Python/Scala data engineering application development.
- Hands-on in developing streaming applications.
- Knowledge of test automation and scripting...
...specializing in Application & Process Integration, API Management, Data Engineering, Data Science/MLOps, DevOps & SRE, Cloud Solutions,... ...skills.
- Proficiency in SQL and database management.
- Experience with big data technologies such as Hadoop, Spark, or Kafka.
- Strong problem...
...active with strong analytical skills, raising clarifications and risks, and solution-driven.
- Prior work experience in handling SQL / ETL or Big Data testing.
- Proven track record in leading teams and test deliveries for ST, SIT and UAT.
- Experience in JIRA, Zephyr, Traceability, Test...
...the design, development, and maintenance of scalable and efficient data architectures that support the collection, storage, and analysis... ...modern technologies such as cloud platforms (e.g., AWS, Azure, GCP), big data technologies (e.g., Snowflake, Redshift, BigQuery), and...
Job Description: As Manager, Data Engineering, you will be responsible for translating client requirements into design, and for architecting and implementing Azure Cloud based big data solutions for clients. Your role will focus on delivering high-quality solutions by independently...
Job Description: We are seeking a skilled Big Data GCP Engineer to join our team. The ideal candidate will have substantial experience with Google Cloud Platform (GCP) and a strong background in Big Data technologies. This role involves working with various GCP services and...
...Collaborate with enterprise architects, data architects, developers & engineers, data scientists, and information designers to lead the... ...its optimization at organization level.
Hands-on experience in Big Data technologies – Hadoop, Sqoop, Hive and Spark including DevOps...
...- Responsible for designing, deploying, and managing high-quality data solutions in the AWS cloud ecosystem.
- Create and maintain optimal... ...time systems.
- Expertise in working with efficient data models in a big data ecosystem.
- Experience with Docker/containerization,...
Job Description: Role: Data Engineer
- Overall experience of 5.5 years, with a minimum of 4 years of relevant experience in Big Data technologies.
- Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow...
...Byndr, and Learn. We handle UK and US projects the same way.
Experience: 4+ years
Client: American Express
Location: Gurgaon (Hybrid)
Big Data with Talend – Job Description: We are looking for a talented Big Data Engineer with hands-on experience in Talend to join our team. The...
...Java, Spark/SQL) as appropriate.
- Hands-on expertise with application design, software development and automated testing.
- Proficient in Big Data technologies.
- Designs, codes, tests, corrects and documents large and/or complex programs and program modifications from supplied...
...business requirements.
- Technical requirements of the analytic solutions.
- Data requirements of the analytic solution processes.
- The person will... ...jobs.
- Must have experience with managing and transforming big data sets using PySpark, Spark-Scala.
- Experience with AWS services...
Rs 5 - 7 lakhs p.a.
...Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Big Data analysis tools and techniques.
Good-to-have skills: Amazon Web Services, Linux with Spark and shell scripting, Ansible, Puppet and Kubernetes...
Your Skills & Experience:
- Overall experience of 5.5 years, with a minimum of 4 years of relevant experience in Big Data technologies.
- Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow and other...