Average salary: Rs 1,750,000 / year
Search Results: 4,424 vacancies
...Information Technology, Engineering.
● At least 8 years of experience in Big Data development.
○ Strong hands-on experience with Python, PySpark, and JavaScript
● Primary Skills
○ Knowledge of SQL and relational databases
○ Proficiency in developing, deploying, and debugging...
...days. Location: Bangalore/Chennai/Pune. Roles & Responsibilities: - 5+ years of working experience in ETL and scalable data pipelines using Scala, Python, PySpark, Hadoop, Apache Spark, Spark SQL, Kafka, NiFi, and incremental data load with Big Data technologies. - Experience working with...
...following:
1. Manipulating, analyzing, and interpreting complex data sources to tell a story from data through analyses.
2. Programming in Python, PySpark, and SQL for data pipeline development used to support analytics.
3. Trending and visualizing data within Tableau.
- Experience...
Senior Cloud Data Engineer (AWS, Python, PySpark). Job Title: Senior AWS Cloud Data Engineer. Work Location: Hyderabad/Kolhapur/Bangalore/Chennai. Experience: 6 to 9 years. Required Skills: Python, PySpark, AWS services such as DMS, SAM, Glue, Redshift, Lambda, S3, with Apache Airflow...
Experience Required: - Minimum 7+ years' experience in Data Engineering. - Must have good knowledge and experience in Python. - Must have good knowledge of PySpark. - Must have good knowledge of Databricks. - Must have good experience in AWS (Glue, Athena, Redshift, EMR) -...
...years of hands-on Databricks (DB) experience. - Should have thorough knowledge of creating jobs using PySpark. Should be extremely good with SQL and possess good exposure to Python. - Should be able to create new clusters and cluster pools and attach existing clusters to a pool in DB...
...Job Role: Software Developer (Python, PySpark, AWS, DevOps)
Experience: 5 years. Mandatory Skills: Python, PySpark, AWS, DevOps
Location: Pune Work mode: Hybrid
Budget: 16 LPA Education: Graduate
Technology Skills:
Strong knowledge of Python/PySpark and Bash...
...Salesforce, SAP, Codata is a plus. - Candidates with AWS Certification will be preferred. Tools: Aurora Postgres DB, SQL, Spark, Kafka, Python, Redshift, Snowflake, Airflow, Glue (ETL, Catalog, Crawler), Lambda, SQS, SNS, Data Lake, CloudWatch, CloudTrail, DBT, AWS SDK, Boto...
...maintain scalable data pipelines and ETL processes using Databricks and PySpark. Collaborate with data scientists, analysts, and other... ...with a focus on Databricks and PySpark. Strong programming skills in Python and SQL. Hands-on experience with cloud platforms such as Azure or...
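The ETL pipelines these Databricks/PySpark listings describe reduce to three stages: extract, transform, load. A minimal plain-Python sketch of that shape (no Spark dependency); `extract`, `transform`, and `load` are hypothetical stand-ins for reading a source, cleaning records, and writing to a warehouse:

```python
def extract():
    # Stand-in for reading raw records from a source system
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "3.0"}]

def transform(rows):
    # The 'T' in ETL: clean and type the raw records
    return [{"id": r["id"], "amount": float(r["amount"])} for r in rows]

def load(rows, sink):
    # Stand-in for writing to a warehouse table
    sink.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
```

In a real PySpark job the same three stages would typically be a read, a chain of DataFrame transformations, and a write, but the pipeline structure is the same.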
🌟 Join Our Team: Senior Data Analyst - PySpark 🌟🏦 Domain: BFSI
🔍 Role Overview:
Define & gather source data for insights & use cases.
Map & join multiple data sets from various sources.
Identify data inconsistencies & propose migration solutions.
Aid in...
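Mapping and joining data sets from multiple sources and flagging inconsistencies, as this analyst role describes, can be sketched in plain Python. The two source dicts below are hypothetical systems keyed by customer id:

```python
# Two hypothetical source systems keyed by customer id
crm = {1: "alice@example.com", 2: "bob@example.com"}
billing = {1: "alice@example.com", 2: "robert@example.com", 3: "carol@example.com"}

# Full-outer-join style comparison: walk keys present in either source
inconsistencies = []
for key in sorted(crm.keys() | billing.keys()):
    if crm.get(key) != billing.get(key):
        # Mismatched value or record missing from one source
        inconsistencies.append(key)
```

Here id 2 is flagged for a mismatched value and id 3 for being absent from one source; a migration proposal would then decide which system is authoritative for each case.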
...Apache Kafka pipelines for real-time data streaming and event-driven architectures.
Development and deep technical skills in Python, PySpark, Scala, NiFi, and SQL/procedures.
Working knowledge and understanding of Unix/Linux operating systems and tools like awk, ssh, crontab...
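The event-driven architecture mentioned alongside Kafka above is, at its core, a consume-process loop over a stream of events. A minimal plain-Python sketch of that loop, with an in-memory deque standing in for a Kafka topic partition:

```python
from collections import deque

# An in-memory deque stands in for a Kafka topic partition
topic = deque(["click:1", "click:2", "purchase:3"])
counts = {}

def process(event):
    # Event handler: aggregate events by type as they arrive
    kind, _, _ = event.partition(":")
    counts[kind] = counts.get(kind, 0) + 1

# Consume loop: poll and process until the topic is drained
while topic:
    process(topic.popleft())
```

A real consumer would poll a broker, commit offsets, and run forever, but the handler-per-event structure is the same.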
..., Chennai. Interview mode is walk-in drive. Requirements: Mandatory skills: (8+ years of experience in Data Engineering, with 5+ years on PySpark/NoSQL, is mandatory)
1. Person should be strong in PySpark
2. Should have hands-on experience in the MWAA (Airflow) / AWS EMR (Hadoop, Hive) framework
3. ...
Main Skills - Azure Databricks, PySpark, SQL, and Azure Cloud. Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect. Experience... ...Delta Lake Architecture - Should have hands-on experience in SQL, Python, and Spark (PySpark) - Candidate must have experience in AWS/Azure...
...Bachelor's or Master's degree in Computer Science, Engineering, or a related field.- Proven experience working with Databricks, Spark, and PySpark in real-time data processing environments.- Strong understanding of streaming data architectures, messaging systems (e.g., Kafka),...
...: Good development practices: - Hands-on coder with good experience in programming languages like Java or Python. - Hands-on experience with the Big Data stack like PySpark, HBase, Hadoop, MapReduce, and ElasticSearch. - Good understanding of programming principles and development...
...Consultant, PySpark/Hadoop Developer - ITO080454
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes... .../ Data Lake solutions;
• Well versed and hands-on in Python, Spark, PySpark, HDFS, Hive, and Hadoop; Spark SQL optimization.
•...
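One classic example of the Spark SQL optimization this listing asks for is the broadcast (map-side) hash join: a small table is shipped to every worker and joined by hash lookup, avoiding a shuffle of the large side. The core idea, sketched in plain Python with hypothetical fact and dimension tables:

```python
# Small dimension table, "broadcast" as an in-memory hash map
dim = {1: "electronics", 2: "books"}

# Large fact table rows: (order_id, category_id)
facts = [(100, 1), (101, 2), (102, 1)]

# Map-side join: each fact row probes the hash map locally,
# so the large side never has to be shuffled across the network
joined = [(order_id, dim[cat_id]) for order_id, cat_id in facts if cat_id in dim]
```

In Spark SQL itself this corresponds to broadcasting the small side of the join (e.g. via a broadcast hint) rather than letting both sides shuffle.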
Azure Data Architect. Mandatory Skills: Solution Architecture - PySpark + Databricks + ADF + Synapse is mandatory. Job Description: We are seeking a highly skilled and experienced Azure Data Architect to join our team. As an Azure Data Architect, you will play a key role in designing...
...Provide technical leadership in architecting end-to-end solutions in cloud environments. 3. Develop and maintain a deep understanding of PySpark, SQL, Palantir Foundry, and other cloud development technologies. 4. Collaborate with cross-functional teams to integrate different...
...years
Must Have:
1. Basic data structures, algorithms, and problem-solving knowledge
2. Functional programming: Spark, Scala/Python
3. Big data: Hadoop (Spark and Hive)
4. Any RDBMS (preferably MySQL)
WHY JOIN CAPCO?
You will work on engaging projects with...
...engineering field. The ideal candidate will have strong expertise in PySpark, SQL databases, and Big Data programming for data transformation... ...experience in data engineering. - Strong proficiency in PySpark, Python, Spark, and SQL. - Experience with Big Data technologies and...