...experience managing the Big Data application stack, including HDFS, YARN, Spark, Hive and HBase - A deeper understanding of all the configurations... ...applications on YARN - Experience with one or more automation tools such as Ansible, Terraform, etc. - Experience working with CI/CD...
Primary Skills : Databricks with PySpark/Spark/Python and SQL. Secondary Skills : ADF. Job Overview : - 7+ years of experience with detailed... ...warehouse technical architectures, ETL/ELT and reporting/analytic tools. - 4+ years of work experience with very large data warehousing...
...We are looking for an exceptional Spark/Scala engineer with 5+ years of experience who will be responsible for:
Experience : 5+ Yrs
Location... ...pipelines using Airflow
Experience with SQL, CI/CD, the Gradle build tool, and Docker
Familiar with big data - REST web server integration,...
...Job Description
Mission : As a Spark Backline Engineer, you will help our customers be successful with the Databricks Data Intelligence... ...guides and runbooks.
Contribute to automation and tooling programs to make daily troubleshooting efficient.
Work with the...
...designing, developing, and maintaining large-scale data processing pipelines using Python and Spark/PySpark. - Your expertise in distributed computing frameworks and DevOps tools will be instrumental in building efficient and scalable data solutions. Responsibilities : - Design...
...Engineer or DataOps role. - 3+ years of experience working with SQL and Spark. - 2+ years of experience with Microsoft Azure Data Lake, Azure... ...having multiple sources. - Experienced in the use of standard ETL tools and techniques (e.g., SSIS or any equivalent ETL tool). - Familiar...
...Kochi, Hyderabad, Trivandrum. Job Description : - Proficient in SQL, Spark, Scala, and AWS, with a strong command of these technologies. -... ...effectively optimizing performance. - Adept at leveraging these tools to enhance efficiency and productivity within projects. - Crucial...
...maintain scalable data pipelines for batch processing using Apache Spark in Big Data projects. - Utilize the Scala programming language to... ...the data ecosystem as needed. - Explore and implement visualization tools to provide insights into data for stakeholders. Requirements : - Bachelor...
...languages such as Hive, Kafka, and high-performance data libraries including Spark.
• Develop and deploy high-performance ETL processes using... .../ Scrum and waterfall methodologies.
• Experience with Agile tools such as Jira and the Atlassian suite.
• Passionate about...
Exp : 4-9 Yrs. Location : Mumbai & Bangalore. Notice : Immediate - Max 30 days. Role Profile : - Minimum 4 years of experience, of which 3+ years is proven experience working with the Scala programming language and its ecosystem - Strong understanding of functional programming concepts and design patterns...
...related field- Strong knowledge of machine learning, statistical analysis, and data visualization tools and technologies- Experience with big data processing frameworks, such as Spark or Hadoop- Strong analytical and problem-solving skills, with the ability to analyze complex...
...years of experience and interest in Big Data technologies (Hadoop / Spark / Relational DBs) - 3+ years of experience working on projects... ...constructing relational and dimensional data models using any ETL/ELT tools (e.g., Talend, Informatica, Alteryx) - Possess ETL experience...
...Design, develop, and maintain robust data pipelines using Apache Spark, Python, and other relevant technologies.- Implement data ingestion... ...and SQL.- Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus.- Excellent problem-solving...
...governance and security protocols. - Evaluate new technologies and tools to extend or improve our data platforms. - Work closely with the... ...experience in data engineering. - Expertise in big data technologies like Spark and Kafka. - Strong programming skills in Python - Proven...
...equivalent designing large data-heavy distributed systems and/or high-traffic web apps. - Must have 4+ years of experience using big data tools such as Spark / PySpark. - Must have experience in Airflow pipeline orchestration. - At least 4 years of hands-on implementation experience...
Java Developer with APM Tool Experience. Experience Level : 5+ years. Location : Bengaluru, Gurugram, Pune, Chennai, Hyderabad, Jaipur. Work Mode : Hybrid. Role Overview : We are seeking a skilled Java Developer with hands-on experience in Application Performance Monitoring (APM) tools...
...will be responsible for Extract, Transform, Load (ETL) using ETL tools, data integration, data modeling, and analytical skills. - The role... ..., data analysis and data mining, etc. - Knowledge of Hadoop, Hive, Spark and Impala will be an added advantage - Exposure to AWS Glue, Redshift...
...range of enterprise technology transformations and solutions at some of the world's leading multinational organizations. Skills - Apache Spark. Location - Bangalore. Years of Experience - 7.5 yrs. As an Application Developer, you will be responsible for designing, building, and...
...Design, develop, and maintain data processing pipelines using Apache Spark and Scala.
Optimize Spark jobs for performance, scalability, and reliability.
Work closely with data engineers and data scientists to implement data-driven solutions.
Develop and maintain...
...Experience in Amazon Web Services (AWS) or other cloud platform tools - Experience working with data warehousing tools, including Dynamo... ...Experience working with distributed technology tools, including Spark, Presto, Scala, Python, Databricks, Airflow - Developed the PySpark...
...or in the cloud.- In-depth knowledge of distributed processing frameworks like Apache Spark.- Strong understanding of open-source big data technologies, including Hadoop ecosystem tools.- Familiarity with data warehousing concepts and solutions (e.g., Hive, Redshift).- Experience...
...insights with different stakeholders using reporting and visualization tools. - Work on large datasets to meet functional and business... ...Airflow, Redshift. - Strong SQL and Scala skills using the Spark framework. - Experience with enterprise data warehousing...
...is a global investment banking company headquartered in the US, with 75,000+ employees globally, and is looking for a Scala / Spark engineer to join their Bangalore regional team. Note : Looking for candidates who can join within 30 days. Work Mode : Hybrid. Experience : 5 to 8 years. Some...
Job Description :As a Kafka Architect specializing in Spark and Apache Server, you will play a key role in designing, architecting, and implementing real-time data streaming solutions using Apache Kafka, Apache Spark, and related technologies. You will work closely with our...
...IntelliJ IDEA, AutoSys, WinSCP, PuTTY & GitHub. - Designing, building, installing, configuring and supporting Hadoop. - Transform data using Spark & Scala. - Translate complex functional and technical requirements into detailed design. - Perform analysis of vast data stores and...
...Qualifications :- Bachelor's degree in Computer Science or related field.- 5+ years of experience in data engineering.- Proficiency in Python, Spark, and SQL.- Experience with BigQuery, Kafka, and cloud platforms.- Strong problem-solving and communication skills. (ref:hirist.tech)
...Job Title: Tool Maker
Location: Bangalore
Company: Laxmi Electronics
Responsibilities:
1. Fabricate, repair, and maintain precision tools, dies, jigs, and fixtures.
2. Interpret engineering drawings and specifications to create tooling components.
3. Operate...
..., particularly utilizing Flink and Kafka, alongside expertise in Spark and AWS technologies. As a Data Engineer, you will play a crucial... ...with the latest advancements in stream processing technologies, tools, and best practices, and incorporate them into the development process...
...: Anywhere in India. Education : BE, B.Tech, Any Tech Graduate. Must-Have Technical Skills : - 3+ years of Spark or Scala - 2+ years of Hadoop/Big Data using tools like Hive, Spark, PySpark, Scala, and RDBMS/SQL. Strongly Preferred : GCP, including GCS (Google Cloud Storage...
...and ingestion, designing ETL architectures using a variety of ETL tools and techniques. - Plan and execute secure, good-practice data... ...experience, with at least 3+ years in Big Data technologies (Hadoop / Spark / Relational DBs) and similar experience working on projects...