Average salary: Rs 397,199/year
Search Results: 6,201 vacancies
...Role: Python Developer (Python + Airflow + AWS) Location: Hyderabad
Duration: Permanent
Job Description
· Strong knowledge of Python/Pyspark and Bash scripting
· Experience working with DataFrames and the pandas, SciPy, and NumPy libraries is a plus
· Strong hands-on working...
...engineering experience with 3+ years of hands-on Databricks (DB) experience.- Should have thorough knowledge of job creation using PySpark. Should be extremely good with SQL and have good exposure to Python.- Should be able to create new clusters and cluster pools and attach...
...data and metrics is a huge plus.
Responsibilities
Actively develop, enhance and maintain data pipelines and workflows for marketing... ...the frameworks and automations in ETL processes
Develop PySpark and DBT ETL pipelines for data ingestion and transformation....
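For context on what an ingest-and-transform duty like the one above involves, here is a minimal sketch of the transform step in plain Python; a real pipeline would express the same logic with PySpark DataFrames or DBT models, and the field names ("user_id", "amount") are hypothetical:

```python
# Minimal ETL-style transform sketch; field names are hypothetical.
def transform(records):
    """Drop malformed rows and normalize amounts to float."""
    cleaned = []
    for row in records:
        if row.get("user_id") is None:
            continue  # reject rows missing the key
        try:
            amount = float(row.get("amount", 0))
        except (TypeError, ValueError):
            continue  # reject non-numeric amounts
        cleaned.append({"user_id": row["user_id"], "amount": amount})
    return cleaned

raw = [
    {"user_id": 1, "amount": "10.5"},
    {"user_id": None, "amount": "3"},   # dropped: no key
    {"user_id": 2, "amount": "oops"},   # dropped: bad amount
]
print(transform(raw))  # → [{'user_id': 1, 'amount': 10.5}]
```

In PySpark the same logic would be a `filter` plus a `withColumn` cast; in DBT it would be a model with `WHERE`/`CAST` clauses.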
Azure Data Architect. Mandatory Skills: Solution Architecture - PySpark + Databricks + ADF + Synapse is mandatory. Job Description: We are... ...a deep understanding of Azure data services, and the ability to develop scalable and efficient data solutions that meet business requirements...
..., Chennai. Interview mode is walk-in drive. Requirements: Mandatory skills: (8+ years of experience in data engineering with 5+ years on PySpark/NoSQL is mandatory) 1. Should be strong in PySpark 2. Should have hands-on experience in the MWAA (Airflow) / AWS EMR (Hadoop, Hive) framework 3....
Required technical skills: PySpark, Hive, HDFS, Sqoop, shell scripting, Kafka, SQL, UNIX, Linux, AWS S3, TWS, Airflow, Bitbucket. Description: - Monitor EOD jobs, Kafka notifications, and application status - Rerun any failed jobs (TWS/Airflow/Jobserver) - Alert respective stakeholders...
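The rerun-failed-jobs duty above is the pattern that schedulers like TWS and Airflow automate with retry settings; as a rough sketch of the idea in plain Python (the job callable and retry count are assumptions, not anything from these ads):

```python
import time

def run_with_retries(job, max_retries=3, delay_seconds=0):
    """Run a job callable, rerunning it on failure up to max_retries times.
    Re-raises the last error so an alerting hook can notify stakeholders,
    mirroring the monitor/rerun/alert duty described above."""
    last_error = None
    for attempt in range(1, max_retries + 1):
        try:
            return job()
        except Exception as exc:  # a real monitor would match specific errors
            last_error = exc
            time.sleep(delay_seconds)  # back off before the rerun
    raise last_error

# Demo: a flaky job that fails twice, then succeeds on the third run.
attempts = {"n": 0}
def flaky_job():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("EOD job failed")
    return "done"

print(run_with_retries(flaky_job))  # → done
```

In Airflow the equivalent is declarative: `retries` and `retry_delay` on the task, with `on_failure_callback` for the alerting step.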
...Java or Python.- Hands-on experience with the Big Data stack: PySpark, HBase, Hadoop, MapReduce, and Elasticsearch.- Good... ...and technology and translate them into large-scale engineering developments- Excellent experience in application development and support, integration...
Job: PySpark/Databricks Engineer. Open for multiple locations with WFO and WFH. Job Description: We are looking for a PySpark solutions developer and data engineer who is able to design and build solutions for one of our Fortune 500 client programs, which aims to build a data...
...Hadoop Data Platform (HDP) and ETL and capable of configuring data pipelines.- Possess the following technical skills: SQL, Python, PySpark, Hive, Unix, ETL, Control-M (or similar)- At least 2 years of experience in Cloudera Data Platform (CDP)- Skills and experience to support...
Senior Cloud Data Engineer (AWS, Python, PySpark) Job Title: Senior AWS Cloud Data Engineer. Work Location: Hyderabad/Kolhapur/Bangalore... ...Artifactory, GitHub, Jenkins, VersionOne, etc. (is a plus).- AWS Developer or Solution Architect certification is a plus.- Advanced level...
...DevOps practices is a plus.- Programming experience with Python, PySpark, shell scripting, and SQL is mandatory. Good to have: Azure... ...Azure services such as Synapse, Azure Storage, and Data Factory.- Developing and maintaining data pipelines to efficiently extract, transform...
...SQL
T-SQL, Dynamic SQL skills
Azure DevOps
Good to have data visualisation/Power BI experience
Exposure to Python, Scala, PySpark, Databricks is a MUST
Additional Qualifications:
Bachelor's degree with 3+ years of professional experience. A Computer Science Degree...
...Walk-in Drive in Hyderabad, Telangana on 4th May [Saturday], 2024, and we believe your skills in Databricks, Data Factory, SQL, and PySpark or Spark align perfectly with what we are seeking.
Experience Level: 3 years to 25 years
Details of the Walk-in Drive:
Date...
...responsible for conceptualizing and executing clear, quality code to develop the best software. You will also support our customers and... ...sharing and other Snowflake capabilities
Good knowledge of Python/PySpark and advanced features of Python
Support business development...
...analysis, design, development and implementation of Data Engineering applications
3+ years of development experience in Python and PySpark
Strong knowledge of data warehouse and database concepts
Strong knowledge of SQL and query tuning
Working experience with AWS...
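The "query tuning" skill the listing above asks for often starts with reading a query plan and adding the right index; here is a minimal, self-contained sketch using the stdlib `sqlite3` module (the `orders` table and its columns are hypothetical, and production tuning would target the warehouse's own EXPLAIN tooling):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i % 100, i * 1.0) for i in range(1000)])

query = "SELECT * FROM orders WHERE customer_id = 7"

# Before tuning: the planner has to scan the whole table.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1])

# Tuning step: index the filtered column, then re-check the plan.
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1])
```

The second plan detail should now reference `idx_orders_customer`, showing the query was satisfied via an index search rather than a full scan.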
...Principal Consultant - Snowflake Developer - BFS036026
Genpact (NYSE: G) is a global professional services and solutions firm delivering... ...hybrid tables, transient tables.
· Knowledge of Snowpark and PySpark is a plus.
· Should have knowledge of orchestration, data migration...
...Experience working on Spring Framework/Spring Boot concepts/microservices, GraphQL- Experience working with AWS Glue (Python), Big Data (PySpark), Control-M (ETL batch job implementation stack)- Experience in cloud-native development on cloud providers like AWS, GCP or Azure-...
...Mandatory Skills
Experience in SAP MII/SAP PCo/SAP UI5
Capable of developing architecture/solutions based on the SAP MII/SAP PCo/SAP UI5 platform.
Experience in integration with L2/L3/L4 (ECC, SAP ME, MES) systems.
Experience in requirements, gap analysis, design,...
...analytical and interpersonal skills
· Ability to prioritize and work on your own
Roles and Responsibilities
· Developing and maintaining data architecture and data models
· Create standardized procedures for data flows using Python scripting...
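A "standardized procedure for data flows" of the kind this bullet describes often amounts to a named, ordered sequence of step functions that every flow runs through; here is a minimal sketch under that assumption (the step names and the numeric feed are hypothetical):

```python
def run_pipeline(data, steps):
    """Apply a standardized, ordered list of step functions to the data,
    so every data flow follows the same procedure."""
    for step in steps:
        data = step(data)
    return data

# Hypothetical standard steps for a numeric feed.
def drop_nulls(values):
    return [v for v in values if v is not None]

def to_float(values):
    return [float(v) for v in values]

standard_steps = [drop_nulls, to_float]
print(run_pipeline([1, None, "2.5"], standard_steps))  # → [1.0, 2.5]
```

Keeping the steps in one shared list is what makes the procedure "standardized": new flows reuse the same validated sequence instead of re-implementing it.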
...0% quality assurance parameters
Do
# Instrumental in understanding the requirements and design of the product/ software
# Develop software solutions by studying information needs, systems flow, data usage, and work processes
# Investigating problem areas...