Average salary: Rs 141,000 / year
- ...their data and make data-driven decisions. Key Responsibilities: - Design and implement scalable and reliable data pipelines using Hadoop, Spark, and other big data technologies to process large volumes of data for various business applications. - Develop and maintain data models...
- ...frameworks. Good to Have: - Experience with BigQuery, Vertex AI, Cloud Storage, or Dataflow on Google Cloud Platform. - Experience with Spark / PySpark or big data technologies. - Exposure to MLOps, model deployment, and CI/CD pipelines. - Knowledge of LLMs or Generative AI is...
- ...engineering trucks, shuttle vans, electric carts), including checking oil and fluid levels, tire pressure/wear, charging batteries, and replacing spark plugs. Perform preventative maintenance on tools and equipment, including cleaning and lubrication. Maintain proper maintenance...
  Full time
Rs 8 - 13 lakhs p.a.
...capability of writing complex SQL queries), ADB, PySpark, Python, Synapse, Delta Tables, Unity Catalog. Hands-on in Python, PySpark or Spark SQL. Hands-on in Azure Analytics and DevOps. Taking part in Proof of Concepts (POCs) and pilot solution preparation. Ability to...
Rs 3.5 - 15 lakhs p.a.
...to optimize models for real-world performance. Preferred Qualifications: Experience with Big Data technologies like Hadoop and Spark. Familiarity with Docker, Kubernetes, and containerization tools. Knowledge of MLOps practices and automating ML pipelines. Experience...
- ...Responsibilities: Design and own end-to-end data pipelines and Snowflake data models. Process large-scale data with Databricks (PySpark/Spark SQL). Translate business needs into scalable architectures and robust ETL/ELT frameworks. Collaborate with teams to deliver...
Rs 5 - 7 lakhs p.a.
...models for high accuracy and performance in real-world scenarios. Preferred Skills: Experience with Big Data technologies (Hadoop, Spark, etc.). Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes). Experience in automating ML...
- ...time-series queries - Experience with Cloud Composer (Apache Airflow) for pipeline orchestration - Working knowledge of Dataproc (Apache Spark) for batch ingestion and incremental processing - Experience with AI-assisted development tools such as GitHub Copilot or similar -...
  Full time · Local area · Remote job
- ...queries for data extraction, validation, and reporting. Process and analyze large datasets using Big Data technologies such as Hadoop, Spark, PySpark, and Databricks. Perform data analysis using Python libraries (Pandas, NumPy, etc.) to identify trends and insights...
- ...Scala for ML development. • Knowledge of cloud-based ML platforms (AWS, Azure, GCP). • Experience with big data processing (Spark, Hadoop, or Dask). • Ability to scale ML models from prototypes to production. • Strong analytical and problem-solving skills...
- ...world needs state of the art technology. Our infrastructure, hosted on-prem and in the cloud (Azure and GCP), includes MapR, Airflow, Spark, Kafka, Jupyter, Kubeflow, Jenkins, GitHub, Tableau, Power BI, Synapse (Analytics), Databricks and further interesting tools. We like...
  Hybrid work · Work at office
- ...project is to migrate the dashboards/reports from Hadoop to GCP, which will require the Hive and Pig jobs to be migrated to BigQuery / Spark. Ensuring data consistency and accuracy through data validation and cleansing techniques. Working together with cross-functional...
  Full time · Hybrid work
- ...Modification) IT Security/SecMCO (Secure MCO) Shipment Alert QualiTrack UV IPR (Under Value, Intellectual Property Rights) SPARK. In this role, you will act as the senior technical authority and strategic advisor for all ServiceNow solutions within this...
  Long term contract · Full time · Temporary work · Hybrid work · Work at office · Flexible hours
- ...Experience: 5-10 Years. Must have: ETL/Data Testing/Azure Databricks + Azure Data Factory: 5+ years (Required). Strong SQL + Python/PySpark: 5+ years (Required). Experienced in end-to-end ETL (Data Pipeline) QA testing using various technologies in Azure cloud involving...
  Full time
- ...representatives, data owners, end users, application designers and data architects to model current and new data. Must have skills: Apache Spark. Good to have skills: NA. Minimum 7.5 years of experience is required. Educational Qualification: 15 years full time...
  Full time · Work at office · Immediate start
Rs 7 - 12 lakhs p.a.
...of AWS services and strong fundamentals of Data Warehousing (DW) / Data Lakehouse (DH) concepts. Coding & Automation: Utilize Spark/Python for scripting and automation within the Databricks environment. Containerization & Orchestration: Work with Docker and...
- ...Jenkins or equivalent). Strong SQL skills. Secondary Skills: Cloud & Containerization (Docker, Kubernetes), Azure Databricks, Apache Spark (PySpark), Banking domain knowledge. About the Role: We are seeking a Python Fullstack Developer with deep expertise in Python,...
  Permanent employment
- ...automation for more than 60 years. The success of our customers is what drives us. Through our future-oriented work, Leuze continuously sparks new ideas, actively contributing to progress within the industry. Website: Below are the Key Responsibilities but not...
- ...solving for Bharat & the world! Who we are not looking for: - Anyone looking for a part-time stint - If education and skilling don't spark your curiosity & interest - Impact creation is something you would not want to work for - Not willing to call the team at 2 AM when...
  Full time · Part time · Work at office
- ...at least 1 year's experience on MS Fabric. ~ Must possess excellent programming skills with SQL and Python. ~ Experience working on the Spark platform. ~ Should be able to use services and tools to ingest, egress, and transform data from multiple sources, including SAP,...
  Full time
- ..., Python, and Azure ~ Experience with Fivetran, ETL/ELT design, and data modeling. Preferred: Databricks experience (PySpark, Spark SQL, Delta Lake); orchestration (Airflow), streaming (Kafka), or infrastructure-as-code (Terraform)...
- ...Experience of testing against modern cloud platforms and containerized applications (AWS/Azure). ~ Understanding of Kafka / Hadoop (Spark) and/or event-driven design and principles. ~ Understanding of job scheduler tools (Control-M, Autosys, etc.). ~ Experience of the...
  Full time · Work at office
- ...Design and implement data transformation pipelines to convert raw data into curated datasets for analytics and reporting. Optimize Spark jobs and SQL queries to improve performance and reduce compute costs. Implement data quality validation, monitoring, and error...
  Full time
- ...Last Date to Apply: 04th May 2026. Uniting curious minds: Behind every innovative solution, there are people working together to transform the future. With careers sparked by initiative and lifelong learning, we unite curious minds, and you could be one of them...
  Work at office
- ...effectively. About Company: DataFlair Web Services is a leading provider of online training in niche technologies like Big Data-Hadoop, Spark and Scala, HBase, Kafka, Storm, etc. We aim to reach the masses through our unique pedagogy model for self-paced learning and Instructor...
- ...to premium lifestyle and luxury brands, we build identities that create impact. What this role is really about: You'll be the spark that starts conversations, builds connections, and identifies new business opportunities for the agency. This is not a target-heavy sales...
  Internship · Work at office

