Average salary: Rs 1,150,000 per year
Search Results: 145 vacancies
...Design, develop, and maintain data processing pipelines using Apache Spark and Scala
Optimize Spark jobs for performance, scalability, and reliability
Work closely with data engineers and data scientists to implement data-driven solutions
Develop and maintain ETL...
Skills : Hadoop, Python, Spark, PySpark, ETL (Extract, Transform, Load)
Roles & Responsibilities :
- Data Ingestion: Develop and maintain data pipelines for ingesting raw data from various sources into the Hadoop ecosystem.
- Data Processing: Utilize Python and Spark to process...
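The ingest-then-process responsibility above can be sketched as an extract-transform-load sequence. This is a minimal illustration in plain Python; the record fields (`user`, `amt`) and the in-memory sink are hypothetical, and a production pipeline would use PySpark DataFrames on a Hadoop cluster rather than Python lists.

```python
import json

def extract(raw_lines):
    """Parse raw JSON lines into records, skipping malformed ones."""
    records = []
    for line in raw_lines:
        try:
            records.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # drop bad rows; a Spark job might route them to a quarantine path
    return records

def transform(records):
    """Keep only complete records and normalise field names and types."""
    return [
        {"user_id": r["user"], "amount": float(r["amt"])}
        for r in records
        if "user" in r and "amt" in r
    ]

def load(records, sink):
    """Append transformed records to the target store (here, just a list)."""
    sink.extend(records)
    return len(records)

raw = ['{"user": "u1", "amt": "9.5"}', 'not json', '{"user": "u2", "amt": "3"}']
sink = []
loaded = load(transform(extract(raw)), sink)
```

The same three stages map directly onto `spark.read`, DataFrame transformations, and `DataFrame.write` in a real PySpark job.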
Mandatory Skills : AWS, Python, SQL, Spark, Airflow, Snowflake
Responsibilities :
- Create and manage cloud resources in AWS
- Data ingestion from different data sources which expose data using different technologies, such as: RDBMS, REST API, flat files, Streams, and Time series...
...Experience : 7-10 Years
Location : Anywhere in India
Education : BE, B.Tech, Any Tech Graduate
Must-Have Technical Skills :
- 3+ years of Spark or Scala
- 2+ years of Hadoop/Big Data using tools like Hive, Spark, PySpark, Scala, and RDBMS/SQL
Strongly Preferred: GCP, including...
Location : Chennai, Pune, Noida, Kochi, Hyderabad, Trivandrum.
Job Description :
- Proficient in SQL, Spark, Scala, and AWS, with a strong command over these technologies.
- Minimum 6-7 years of relevant experience in Spark and SQL, plus 2-3 years of hands-on practice in AWS.
- ...
...engineering workflow
- Experience with S3, EC2, EMR, Lambda, Glue, Athena, Airflow, Redshift.
- Strong SQL and Scala skills using the Spark framework.
- Experience with enterprise Data warehousing implementations using Batch and real-time systems.
- Expertise in working with...
...Responsibilities
Design, develop, and deploy scalable Big Data applications using Apache Spark.
Collaborate with data scientists and business analysts to understand requirements and translate them into technical solutions.
Write efficient and optimized code...
Rs 7 - 11 lakhs p.a.
...Description : Design, build, and configure applications to meet business process and application requirements.
Must Have Skills : Apache Spark
Good To Have Skills :
Job Requirements :
Key Responsibilities :
- Strong experience in creating Scala/Spark jobs for data transformation...
Rs 12 - 16 lakhs p.a.
...Description : Design, build, and configure applications to meet business process and application requirements.
Must Have Skills : Apache Spark
Good To Have Skills : Microsoft SQL Server, Unix to Linux Migration, Data Warehouse ETL Testing, Amazon Web Service
Job Requirements :...
Job Description :
- Experience in application development in Scala and Spark, with ScalaTest/JUnit libraries and Maven.
- Experience on Hive and Spark using the Cloudera data platform.
- Knowledge of performance optimization on the Hadoop platform, Spark, and Scala.
- Ability to write...
...Options, Derivatives), regulatory trade reporting for various countries in Asia and Europe
Job Title:
Data Integration Engineer (TDH, Spark, ETL)
Date:
Department:
DDD - Data & AI Solutions
Location:
Chennai
Business Line / Function:
Wealth Management...
...the code
· Responsible for integrating finished models into larger data processes using UNIX scripting languages such as ksh, Python, Spark, Scala, etc.
· Produce and maintain documentation for released data sets, new programs, shared utilities, or static data. This must...
...Code, Unit test, Hive, HDFS, Kafka, and Spark (Scala/PySpark).
· Build libraries, user defined functions, and frameworks around Hadoop/Spark.
· Exposure to cloud platform such as AWS or equivalent is desired.
· Develop user defined functions to provide custom Hive, HDFS...
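The user-defined-function responsibility above boils down to writing an ordinary function and registering it with the engine. The masking rule below is hypothetical; in PySpark the function would be wrapped with `pyspark.sql.functions.udf` and applied to a DataFrame column, while Hive would load it as a custom UDF.

```python
def mask_account(account_id: str) -> str:
    """Keep the last four characters of an identifier, mask the rest.

    Plain Python here; a Spark job would register this as a UDF and
    apply it column-wise instead of looping over rows.
    """
    if len(account_id) <= 4:
        return account_id
    return "*" * (len(account_id) - 4) + account_id[-4:]

masked = [mask_account(a) for a in ["AC12345678", "9921"]]
```

Keeping the core logic in a plain, testable function like this is a common practice, since the same code can then be unit-tested without a Spark session.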
...fundamentals in data mining & data processing methodologies
Sound understanding of Big Data & RDBMS technologies, such as SQL, Hive, Spark, Databricks, Snowflake, or PostgreSQL
Orchestration and messaging frameworks: Jenkins, Messaging queue frameworks (Kafka)...
...perfect blend of local traditions and flavors, ensuring unquestionable authenticity. Each box of our sweets evokes nostalgic memories and sparks conversations. With decades of experience and the expertise of our local chefs, we provide unmatched taste and panache, packaged...
...Apache Airflow for orchestrating and scheduling ETL workflows. - Big Data Technologies : Familiarity with big data technologies like Spark, Hadoop, and related frameworks. - Data security best practices and compliance standards. - Data Modelling : Understanding of data modelling...
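The Airflow orchestration requirement above rests on one idea: ETL steps form a directed acyclic graph and run in dependency order. This toy sketch shows the pattern with Python's standard-library `graphlib`; the task names and dependencies are invented, and Airflow itself would express them with a `DAG` object and operators.

```python
from graphlib import TopologicalSorter

# task -> set of upstream tasks that must finish first (hypothetical names)
dependencies = {
    "extract_orders": set(),
    "extract_users": set(),
    "transform_join": {"extract_orders", "extract_users"},
    "load_warehouse": {"transform_join"},
}

def run_pipeline(deps):
    """Execute tasks in an order that respects every dependency edge."""
    executed = []
    for task in TopologicalSorter(deps).static_order():
        executed.append(task)  # a real scheduler would dispatch the task here
    return executed

order = run_pipeline(dependencies)
```

Airflow adds scheduling, retries, and backfills on top of this ordering, but the dependency graph is the part an ETL developer designs.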
...the bar for excellence on a regular basis. We, in turn, work hard to bring out the best in them as we strive to help them find their spark and become the best version of themselves that they can be.
If all this sounds like an environment you’ll thrive in, then you’re...
...s degree, with a minimum of 3+/5+ years of experience working successfully in globally distributed teams
Must have experience working on Spark, Kafka, and Python
Apply experience with cloud storage and computing for data pipelines in GCP (GCS, BQ, Composer, etc.)
Write pipelines...
...database migration transformation and integration solutions for any Data warehousing project.
Must have excellent knowledge of Apache Spark and Python programming experience
Deep experience in developing data processing tasks using PySpark such as reading data from...
...multitask, and drive your own projects
~ Proficiency in Python/PySpark, Scala or Java
~ Proficiency in SQL
~ Experience with Databricks/Spark, AWS ecosystem, Hadoop, Pig, Hive, Flink, or Beam.
~ Experience with orchestration tools such as Airflow
~ Comfortable with CI/CD...