Search Results: 10 vacancies
Hiring for Big Data Engineer (Spark, Scala, AWS)
Job description: Spark, Scala, AWS or Spark, Python with AWS Lambda exposure. Design,... ...MapR, AWS), AWS preferred. Hands-on experience with data ingestion tools: Apache NiFi, Apache Airflow, Sqoop, and Oozie. Hands-on working experience...
...in Java/Scala or Python.
Knowledge of the Hadoop ecosystem and strong hands-on experience with Hive and Spark.
Understanding of workflow orchestration tools such as Oozie and Apache Airflow.
Experience working with monitoring and logging frameworks such as Splunk or ELK....
Rs 1.8 - 3 lakhs p.a.
...database independently. - Strong in Smarty and the Zend Framework, including the installation process. - Strong in JavaScript, AJAX, jQuery. - Knowledge of Apache HTTP Server and Linux. - Knowledge of XML, HTML, HTML5, XHTML, CSS. - Should be able to handle the project independently. - Can understand...
...Science, or related field.
• Basic understanding of Java and Scala/Hadoop
• Understanding of Big Data technologies and solutions (Spark, Hadoop, Hive,
MapReduce) and multiple scripting and programming languages (YAML, Python).
• Understanding of Amazon Cloud Platform and...
...defining and implementing large cloud-based solutions.
Big-data engineering experience setting up data lakes (Hadoop, Hive, HDFS, Spark, APIs, Collibra, etc.)
Applicant must have working experience in AWS IaaS, PaaS, storage, networking, and databases, analyzing and identifying...
....
Strong background in statistical analysis and data visualization.
Experience with big data platforms and tools such as Hadoop, Spark, or Kafka.
Excellent understanding of AI frameworks and libraries such as TensorFlow and PyTorch.
Advanced knowledge of cloud computing...
...detailed design conforms to user expectations.
Design, build, and test data processing pipelines in a GCP environment using Python, Spark, PySpark, and Scala code.
Provide support with application testing, UAT, and application migration in GCP.
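The pipeline responsibilities above can be sketched in miniature. The snippet below is a hedged, self-contained stand-in (plain Python, no Spark cluster required) showing the extract-transform-load shape such a batch job follows; the record fields, values, and aggregation are illustrative assumptions, not taken from the listing.

```python
# Minimal ETL sketch: the extract -> transform -> aggregate stages that a
# Spark/PySpark batch pipeline typically mirrors. Fields are illustrative only.
from collections import defaultdict

raw_rows = [                     # "extract": stand-in for rows read from GCS/HDFS
    "2024-01-01,orders,120.50",
    "2024-01-01,refunds,-15.00",
    "2024-01-02,orders,99.99",
]

def parse(row):
    """Transform one CSV line into a (date, category, amount) record."""
    date, category, amount = row.split(",")
    return date, category, float(amount)

totals = defaultdict(float)      # "load" target: per-category totals
for date, category, amount in map(parse, raw_rows):
    totals[category] += amount   # aggregate step (a reduceByKey in Spark terms)

print(dict(totals))              # per-category totals, e.g. orders ~ 220.49
```

In an actual PySpark job the same stages would be expressed as `spark.read`, a `map`/`withColumn` transform, and a grouped aggregation, with Spark handling partitioning across the cluster.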
Areas of expertise we are...
...techniques.
Experience in Cloud Computing.
Technical Skills -
Programming languages - Java and Scala
Good understanding of Spark Internals
Good understanding of Unix Internals
Should have experience working on trading, telecom, gaming, or risk engines in...
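One "Spark internals" concept the skills list above alludes to is the split between narrow transformations (computed per partition, no data movement) and wide transformations (which trigger a shuffle, repartitioning records by key). The following is a hedged plain-Python sketch of that hash-partitioned shuffle; the data and partition count are made-up assumptions for illustration.

```python
# Toy model of Spark's shuffle: records are hash-partitioned by key so that a
# wide transformation (groupByKey/reduceByKey) sees each key on one partition.
# Input data and partition count are illustrative assumptions.
NUM_PARTITIONS = 2

records = [("a", 1), ("b", 2), ("a", 3), ("c", 4), ("b", 5)]

# Narrow step: a map runs independently on each record (no data movement).
mapped = [(k, v * 10) for k, v in records]

# Wide step: shuffle -- route each record to the partition its key hashes to.
partitions = [[] for _ in range(NUM_PARTITIONS)]
for k, v in mapped:
    partitions[hash(k) % NUM_PARTITIONS].append((k, v))

# After the shuffle, every partition can reduce its keys locally.
reduced = {}
for part in partitions:
    for k, v in part:
        reduced[k] = reduced.get(k, 0) + v

print(reduced)   # key totals: a=40, b=70, c=40 (key order may vary)
```

The narrow step scales without network traffic; the shuffle is the expensive stage, which is why understanding where wide transformations occur matters when tuning Spark jobs.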
...Experience Required- Above 12 Years
Job Description:
Tech Stack: Java 1.6+, Spring Boot, Spring MVC, Hadoop, Hive, HDFS,
MapReduce, Spark Batch & Spark Streaming, Scala, Kafka
~ Experience developing enterprise-grade data integration solutions
~ Good knowledge of Java...
...Bangalore, Raipur, Indore,
Role Requirements:
Tech Stack: Java 1.6+, Spring Boot, Spring MVC, Hadoop, Hive, HDFS, MapReduce, Spark Batch & Spark Streaming, Scala, Kafka
Proven hands-on Software Development experience
Proven working experience in Java...