Average salary: Rs 1,150,000 per year
- ...Engineer. Experience: 4 to 6 years. Location: Chennai. Must have: Databricks, Spark, Snowflake Schema, Azure. Role and responsibilities: Develop... ...and infrastructure, with a strong focus on Databricks and the Apache Spark framework. Demonstrate proficiency in Python and other...
- ...mechanisms to ensure data integrity and system resilience within Spark jobs. Optimize PySpark jobs for performance, including partitioning... ...practices in a big data environment. Proficiency in PySpark, Apache Spark, and related big data technologies for data processing,... (Permanent employment, Full time)
- ...Job description: Experience: minimum 5+ years. Skills: Java, Spring Boot, Apache Camel, Kafka, JUnit, Spring Security (ADFS), basic Kubernetes. Expectations: should be able to work independently and handle client-facing roles. Technical skill requirements: Core...
- ...responsible for the installation, configuration, maintenance, and performance of critical enterprise systems, including Linux servers, Apache Server, and Oracle WebLogic. The ideal candidate will have strong scripting abilities and experience writing SQL queries to...
- ...Responsibilities: development of applications in Java, Spring Core, and Apache Camel; batch execution; development of REST web services and REST APIs; development of stored procedures, PL/SQL, and views; database design and optimization of SQL queries; best...
- ...and resolve issues related to data processing and storage. Key skills & competencies: strong knowledge of big data frameworks (Hadoop, Spark, Flink, Kafka); hands-on experience with cloud platforms (AWS, Azure, GCP); proficiency in SQL and NoSQL databases (MongoDB, Cassandra,...
- ...production rollout and infrastructure configuration. Demonstrable experience of successfully delivering big data projects using Kafka and Spark. Exposure to NoSQL databases such as Cassandra, HBase, DynamoDB, and Elasticsearch. Experience working with PCI Data... (Long term contract)
- ...infrastructure. You are a successful candidate if you have: strong experience with data engineering tools and frameworks (e.g., Python, SQL, Spark, Airflow); hands-on experience with cloud data platforms (AWS, Azure, or GCP); familiarity with federated data architectures and...
- ...Snowflake for data ingestion and processing. Understand and apply PySpark best practices and performance tuning techniques. Experience with Spark architecture and its components (e.g., Spark Core, Spark SQL, DataFrames). ETL & data warehousing: apply a strong understanding of ETL...
- ...Chain Analytics. Strong problem-solving and quantitative skills. Experience working with large datasets and distributed computing tools (e.g., Spark, Hadoop) is a plus. Familiarity with data visualization tools (e.g., Tableau, Power BI, matplotlib, seaborn). (ref: hirist.tech) (Hybrid work, Immediate start, Shift work)
- ...technologies. Proficiency in programming languages such as SQL, Python, Spark, and Java. Good working knowledge and practical experience with... ...and managing data pipeline and workflow management tools (e.g., Apache Airflow, Luigi, etc.). Knowledge of cloud platforms such as...
- ...or libraries such as regular expressions (regex). Prior exposure to data analysis in tabular form, for example with Pandas or Apache Spark/Databricks. Experience using basic statistics relevant to data science, such as precision, recall, and statistical significance... (Long term contract, Immediate start, Home office, Flexible hours)
- ...parallelism and scalability in data processing, modeling, and inference through Spark, Dask, RAPIDS AI, or RAPIDS cuDF. Ability to build Python-based APIs (e.g., using FastAPI, Flask, or Django). Experience with Elasticsearch and Apache Solr is a plus, as are vector databases...
- ...Familiarity with data warehousing solutions and ETL tools (e.g., Apache NiFi, Talend). Experience with programming languages such as Python... ...Understanding of big data technologies (e.g., Hadoop, Spark) is a plus. Knowledge of cloud platforms (e.g., AWS, Azure, GCP...
- ...experience with SQL, Python, and modern ETL frameworks (e.g., dbt, Apache Airflow). Understanding of data orchestration concepts and... ...Experience with real-time data processing frameworks (e.g., Kafka, Spark Streaming). Exposure to data observability and monitoring tools. Understanding... (Long term contract)
- ...Databricks and Snowflake for data engineering and analytics. Big Data: experience working with big data technologies (e.g., Hadoop, Apache Spark). NoSQL: familiarity with NoSQL databases (e.g., columnar or graph databases like Cassandra, Neo4j). Business-related...
- ...Python, or Scala. Expertise in Hadoop ecosystem technologies such as HDFS, Hive, Pig, MapReduce, Spark, and Kafka. Experience with big data processing frameworks such as Apache Flink and Apache Storm. Experience with cloud-based Hadoop solutions such as Amazon EMR and...
- ...data pipelines, ELT processes, and workflow orchestration using Apache Airflow, Python, and PySpark to ensure the efficient and reliable... ...and data modeling. Proficiency in using Apache Airflow and Spark for data transformation, data integration, and data management... (Permanent employment, Full time)
- ...transformation workflows using SQL-based stream processing with Apache Flink for real-time analytics and low-latency data products... ...Develop SQL-based batch and micro-batch data pipelines using Apache Spark to process large-scale datasets efficiently. Engineer robust... (Permanent employment)
- ...least one modern language (Java, Python) and big data frameworks (Spark or Kafka). Strong command of system design and the ability to develop... ...working with distributed messaging solutions such as Azure EventHub, Apache Kafka, and Solace. Experience building and optimizing...
- ...on-prem and in the cloud (Azure and GCP); includes MapR, Airflow, Spark, Kafka, Jupyter, Kubeflow, Jenkins, GitHub, Tableau, Power BI,... ...normalization to ensure data quality and integrity. Awareness of Apache Spark and Hadoop. Awareness of Agile/Scrum ways of working... (Hybrid work, Work at office)
- ...architecture. Implement real-time event streaming using Apache Kafka. Build, optimize, and maintain Kafka producers, consumers... ...of monitoring and logging tools (Prometheus, Grafana, ELK stack). Familiarity with big data frameworks (Spark, Flink)... (Full time)
- ...prioritize tasks in a dynamic work environment. Technical skills: strong data querying and processing skills using SQL, Oracle, Apache Spark, or similar. Data visualization tools: Power BI, Business Objects, Crystal, or a similar tool. Data warehousing and ETL... (Full time)
- ...tools. Work with distributed data processing frameworks like Spark, Hadoop, Hive, or similar. Implement ETL processes for structured... ...Python for data processing and scripting. Experience with Apache Spark, the Hadoop ecosystem (Hive, HDFS, HBase), and Kafka. Solid understanding... (Full time, Hybrid work, Immediate start, Flexible hours)
- ...required. 5+ years of industry experience with hands-on experience in Apache technologies. Expertise in Python with knowledge of the Django... ...Science or a related technical discipline. Familiarity with software engineering skills including Java, Spark, Python, Scala, etc.... (Full time)
- ...EKS. Proven experience in Java. Working experience with: AWS Athena and Glue, PySpark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Apache Spark, Databricks on AWS, Snowflake on AWS. Proficient in AWS Redshift, S3, Glue, Athena, DynamoDB. AWS certification: AWS Certified Solutions Architect... (Permanent employment, Full time)
- ...experience with RDBMS and NoSQL databases; excellent communication and presentation skills. Preferred skills: experience with BDD, Apache Spark, C#, Jira, GitLab, Confluence, Docker, and Kubernetes; understanding of CI/CD processes with GitLab. Experience: minimum of 3 to 6...
- ..., data scientists, and business stakeholders to gather requirements and deliver high-quality solutions. Architect and optimize Apache Spark workloads for performance, cost, and reliability. Ensure best practices in data modeling, governance, security, and compliance...
- ...with building and maintaining a cloud system. Familiarity with databases like DB2 and Teradata. Strong working knowledge of Apache Spark, Apache Kafka, Hadoop, and MapReduce. Strong troubleshooting skills and the ability to design for scalability and flexibility. Expertise...
- ...solutions that are forward-thinking in data integration. Programming experience in Scala or Python, plus SQL. Working experience with Apache Spark is highly preferred. Familiarity with some of these AWS and Azure services: S3, ADLS Gen2, AWS Redshift, AWS Glue, Azure Data...