Average salary: Rs 89,998 / year
Search Results: 4,835 vacancies
...solutions. Utilize a deep understanding
of data integration and big data design principles in creating custom
solutions or... ...Well-versed in, with working knowledge of, data platform-related
services on AWS
· Bachelor's degree and year of work experience of 4 to...
...software product development organization building modern and scalable big data applications.- Expertise with Scala functional programming (... ...data processing e.g. Spark.- Experience implementing RESTful web services in Scala, Python, or similar languages.- Experience with NoSQL...
...experience in writing complex SQL, especially around OLAP systems
Sound knowledge of ETL tools like Informatica; 5+ years of experience with Big Data technologies like the Hadoop ecosystem and its various components, along with tools including Spark, Hive, Sqoop, etc.
In-...
...building and maintaining large scale distributed systems.- Previous experience with web services, and large-scale systems is a must- Extensive experience in data engineering and working with Big data- You have at least 3 years of experience leading a team of engineers -...
...experience of 5.5 years with a minimum of 4 years of relevant experience in Big Data technologies- Hands-on experience with the Hadoop stack -... ...Redshift, Azure SQLDW, GCP BigQuery, etc.- Well-versed, with working knowledge of data platform-related services on Azure. (ref:hirist.tech)
Title: Kafka Developer Location: Pan India
Experience: 4+ years
Notice period: Immediate to 15 days
Company Profile:
Tech Mahindra represents the connected world, offering innovative and customer-centric information technology experiences. We #Rise together to create sustainable...
...contribute to the optimization of the existing Trino codebase.- Experience working with Presto is a significant advantage.- Familiarity with big data platforms and frameworks is a plus (e.g., Hadoop, Spark)- Experience with monitoring and visualization tools like Elasticsearch and Grafana...
...and code for resiliency. - Monitor and support data platforms and services: respond to alerts, improve process efficiency, reduce costs,... ...and loading), and data storage solutions. - Experience working with big data technologies such as Hadoop, Apache Hive, Redshift, Kafka, or...
...YARN, Kafka, Hive, Python, ETL frameworks, MapReduce, SQL, RESTful services).- Sound knowledge of working on Unix/Linux platforms- Hands-on... ...& Jenkins) and requirement management in JIRA.- Understanding of big data modelling using relational and non-relational techniques...
...talents, so we do everything we can to make that possible. We dream big together, supporting each other to make our individual and... ...under license.
Fortune and Fortune Media IP Limited are not affiliated with, and do not endorse products or services of, ServiceNow....
...offers training and development solutions to Individuals, Enterprises and Institutions.
Job Overview:
We are seeking a Faculty (Big Data) for our China location, who will be an integral part of NIIT, contributing to educational excellence by providing instruction in...
Job Description:- Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)- Python programming language is mandatory- PySpark- Excellent with SQL. Good to Have:- Airflow- Good aptitude, strong problem-solving abilities, and analytical...
...Experian is the world’s leading global information services company. During life’s big moments — from buying a home or a car to sending a child to college to growing a business by connecting with new customers — we empower consumers and our clients to manage their data...
...active with strong analytical skills, raising clarifications, risks and solution driven- Prior work experience in handling SQL / ETL or Big Data Testing- Proven track record in leading team, Test deliveries for ST, SIT and UAT- Experience in JIRA, Zephyr, Traceability, Test...
...Enterprise Data Modelling and Semantic Modelling & working experience in ERwin, ER/Studio, PowerDesigner, or Similar- Logical/Physical model on Big Data sets or modern data warehouse & working experience in ERwin, ER/Studio, PowerDesigner, or Similar- Agile Process (Scrum cadences,...
Job Description:- 10+ years of design and development experience with big data technologies on Azure, AWS, or GCP- Azure & Databricks preferred, with experience in Azure DevOps- Experience in data visualization technology like Power BI- Proficient in Python, PySpark...
...team. We're highly collaborative and work across all areas of our technology stack. We enable critical services for the business, qualify complex compute changes, enable big data analytics and trail-blaze new engineering solutions for the cloud.
Responsibilities
*...
Job Description: Primary & mandatory: PySpark. Secondary: GCP.- At least 5 years of experience in Big Data, Hadoop Data Platform (HDP), and ETL, and capable of configuring data pipelines.- Possess the following technical skills: SQL, Python, PySpark, Hive, Unix, ETL, Control-...
...projects and solutions.
Demonstrate hands-on skills in processing large data sets, expertise in data structure, data access patterns, Big Data concepts, and cloud computing.
Work with Data Product Owners to understand data requirements, and to build ETL processes....
...years of extensive experience in the design, build, and deployment of PySpark-based applications.- Expertise in handling complex, large-scale Big Data environments (preferably 20 TB+).- Minimum 3 years of experience with the following: Hive, YARN, HDFS, preferably on Hortonworks Data...