Average salary: Rs 762,499 per year
Rs 11.5 - 12.5 lakhs p.a.
...As a Custom Software Engineer, you will develop custom software solutions to design, code, and enhance components across systems or applications... ...Technical Skills: Must-have: proficiency in PySpark; strong hands-on experience with PySpark, including optimization...
Full time · Immediate start
Rs 6.5 - 17 lakhs p.a.
...Design, implement, and optimize ETL pipelines and data processing workflows using PySpark. Work on distributed computing frameworks for large-scale data processing. Collaborate with Databricks and other cloud platforms for data storage and transformation. Perform data...
Rs 4 - 6 lakhs p.a.
...ETL/Data Warehousing: Build ETL/Data Warehouse transformation processes. Solution Development: Develop Big Data and non-Big Data cloud-based enterprise solutions using PySpark, SparkSQL, and related frameworks/libraries. Framework Development: Develop scalable, re-...
Rs 4 - 7 lakhs p.a.
...We are seeking a proactive Senior Snowflake PySpark Developer to lead the design and maintenance of data pipelines in cloud environments. You will be responsible for building robust ETL processes using Snowflake, PySpark, SQL, and AWS Glue. This role requires strong expertise...
Rs 4 - 6 lakhs p.a.
...Warm welcome from SP Staffing Services! We're excited to tell you about a permanent opportunity for a PySpark Developer to join our team. Experience: We're seeking professionals with 4 to 6 years of experience. Location: This role is open in Chennai, Hyderabad,...
Permanent employment
- About Client: Our client is a French multinational information technology (IT) services and consulting company headquartered in Paris, France. Founded in 1967, it has been a leader in business transformation for over 50 years, leveraging technology to address a wide range...
Contract work · Hybrid work · Immediate start
- ...Exp: 5 to 8 years. Loc: ... No. of positions: 10. NP: Immediate to 30 days. Job Description: Must have strong knowledge of and experience with Databricks, PySpark, Python, SQL, and AWS. Need to work on latest features like DLT and Unity Catalog. SQL or equivalent RDBMS. Strong and experienced on Spark and Databricks...
Immediate start
- ...solutions. The ideal candidate should have strong expertise in PySpark/Python, SQL, and ETL processes, along with hands-on experience in... ...platforms and cloud environments. Key Responsibilities: - Design, develop, and maintain robust ETL/data pipelines. - Work on data warehouse...
Full time · Hybrid work · Immediate start
Rs 3.5 - 14 lakhs p.a.
...Face-to-Face Interview (Mandatory). Key Responsibilities: Design, develop, and test software solutions. Work on data processing and transformation... ...such as Spark. Programming: Proficiency in PySpark, Python, Scala, Java, and SQL for data manipulation...
- ...are seeking a skilled Data Engineer with strong expertise in PySpark and Apache Airflow to design, build, and optimize scalable data... ...-based data platforms. Key Responsibilities: Design, develop, and maintain scalable ETL/ELT pipelines using PySpark...
- ...Join us as a Data Engineer - PySpark at Barclays, responsible for supporting the successful delivery of location strategy projects to plan... ...E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR, for an individual contributor, they develop...
Immediate start
- ...reliable data pipelines for ingestion, transformation, and integration across diverse data sources and destinations. Develop and optimize batch workflows (PySpark, SQL, orchestration) and support real-time/streaming pipelines (Kafka or similar) when applicable. Improve...
- ...Sr. Azure Data Engineer (Must: Databricks + PySpark). Experience: 5-8 years. Shift: 11:00 am to 8:00 pm (the candidate should be flexible to overlap with US business hours if required). Remote. Contract: 6 months. Budget: 120 LPM. Advance...
Remote job · Contract work · US shift · Flexible hours · Shift work
- ...Join us as an Engineer - PySpark/Snowflake at Barclays, responsible for supporting the successful delivery of location strategy projects... ...E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR, for an individual contributor, they develop technical...
Permanent employment · Immediate start
- ...Position: Data Engineer - AWS & PySpark. Location: Nagpur/Pune. Type of Employment: Full-Time... ...and Databricks on large-scale distributed data environments. Develop reusable data ingestion frameworks, transformation modules, and...
Full time
- Job Title: Databricks on AWS and PySpark Engineer. Job Summary: We're seeking an experienced Databricks on AWS and PySpark Engineer to join... ...pipelines and architectures using Databricks on AWS and PySpark. - Develop and optimize data processing workflows using PySpark and...
- ...Skills - Snowflake & Spark: Building and managing scalable data pipelines. Spark-based transformations and ETL workflows. Expertise in PySpark, including optimization techniques and cost management. Snowflake-specific capabilities: performance tuning and query...
Hybrid work
- ..., Pune, Bangalore). Notice Period: Immediate joiners. Key skills: PySpark, Azure, Databricks (Mandatory). Skills & Experience: - 8+ years of... ...and data lakes. - Ensuring data is properly secured and protected. - Developing and implementing data governance policies and procedures. - Collaborating...
Immediate start · Remote job
Rs 18 - 25 lakhs p.a.
...object-oriented approach. What are we looking for? Problem-solving skills, prioritization of workload, commitment to quality. PySpark, Python, SQL (Structured Query Language). Roles and Responsibilities: In this role, you need to analyze and solve moderately complex...
