Hiring for Big Data Engineer (Spark, Scala, AWS)
Job description:
- Spark, Scala, AWS or Spark, Python with AWS Lambda exposure.
- Design,... ...language like Scala with Spark.
- Good command of and working experience with Hadoop MapReduce, HDFS, Hive, HBase, and/or NoSQL databases.
- Hands-on... ...experience in defining and implementing large-scale cloud-based solutions.
Big data engineering experience setting up data lakes (Hadoop, Hive, HDFS, Spark, APIs, Collibra, etc.)
Applicant must have working experience in AWS IaaS, PaaS, storage, networking, and databases; analyzing...
...Bachelor's/Master's degree in Data Science or a related field.
• Basic understanding of Java and Scala/Hadoop
• Understanding of Big Data technologies and solutions (Spark, Hadoop, Hive,
MapReduce) and multiple scripting and configuration languages (YAML, Python).
•...
...transformation.
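The MapReduce model these postings name can be illustrated with a minimal, stdlib-only Python sketch of a word count. This is not Hadoop code, and the input lines are made up for illustration; it only shows the map (emit key-value pairs) and reduce (aggregate per key) phases the framework distributes across a cluster:

```python
from collections import defaultdict

def map_phase(line):
    """Map: emit a (word, 1) pair for every word in a line."""
    return [(word.lower(), 1) for word in line.split()]

def reduce_phase(pairs):
    """Reduce: sum the counts emitted for each distinct key."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Hypothetical input; Hadoop would instead read splits from HDFS.
lines = ["spark and hive", "spark and kafka"]
pairs = [p for line in lines for p in map_phase(line)]
word_counts = reduce_phase(pairs)
# word_counts == {'spark': 2, 'and': 2, 'hive': 1, 'kafka': 1}
```

In a real Hadoop job the shuffle between the two phases groups pairs by key across machines; here the grouping happens inside `reduce_phase`.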
Experience Required: above 12 years
Job Description:
Tech Stack: Java 1.6+, Spring Boot, Spring MVC, Hadoop, Hive, HDFS,
MapReduce, Spark Batch & Spark Streaming, Scala, Kafka
~ Experience developing enterprise-grade data integration solutions
~ Good...
...Role Requirements:
Strong programming experience in Java/Scala or Python.
Knowledge of the Hadoop ecosystem and strong hands-on experience with Hive and Spark.
Understanding of workflow orchestration tools such as Apache Oozie and Apache Airflow.
Experience on...
...Locations: Bangalore, Raipur, Indore
Role Requirements:
Tech Stack: Java 1.6+, Spring Boot, Spring MVC, Hadoop, Hive, HDFS, MapReduce, Spark Batch & Spark Streaming, Scala, Kafka
Proven hands-on Software Development experience
Proven working experience...
...techniques.
Experience in Cloud Computing.
Technical Skills -
Programming languages: Java and Scala
Good understanding of Spark Internals
Good understanding of Unix Internals
Should have experience working on trading, telecom, gaming, or risk engines in...
...Expert proficiency in programming and query languages, including Python and SQL.
- Proficiency in big data processing, including Apache Spark and/or Databricks.
- Proficiency in technologies such as MLflow, Azure ML services, and AutoML.
- Proven track record of leading projects...
...detailed design conforms to user expectations.
Design, build, and test data processing pipelines in a GCP environment using Python (PySpark), Spark, and Scala code.
Provide support with application testing, UAT, and application migration in GCP.
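The pipeline work described above typically follows a filter-then-aggregate shape. As a rough, stdlib-only Python sketch of that shape (the record fields are hypothetical, and a real implementation would use PySpark DataFrames reading from GCS or BigQuery rather than in-memory dicts):

```python
# Hypothetical input records; a production pipeline would read these
# from cloud storage and run the stages on a Spark cluster.
records = [
    {"user": "a", "amount": 10.0, "valid": True},
    {"user": "b", "amount": -5.0, "valid": False},
    {"user": "a", "amount": 2.5, "valid": True},
]

def run_pipeline(rows):
    """Filter out invalid rows, then total the amount per user."""
    valid = (r for r in rows if r["valid"])   # filter stage
    totals = {}
    for r in valid:                            # aggregate stage
        totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]
    return totals

totals = run_pipeline(records)
# totals == {'a': 12.5}
```

Keeping each stage a pure function over the rows, as above, is what makes the same logic straightforward to port to PySpark's `filter`/`groupBy` operators and to cover with unit tests during UAT.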
Areas of expertise we are...
...Java.
Strong background in statistical analysis and data visualization.
Experience with big data platforms and tools such as Hadoop, Spark, or Kafka.
Excellent understanding of AI frameworks and libraries such as TensorFlow and PyTorch.
Advanced knowledge of cloud...