Average salary: Rs 1,302,124 per year
Job Title: Java + Hadoop Developer
About Us
Capco, a Wipro company, is a global technology and management consulting firm. It was awarded Consultancy of the Year at the British Bank Awards and has been ranked in the Top 100 Best Companies for Women in India 2022 by Avtar...
...Responsibilities:
Utilize solid experience in software development using Scala, Java, or Python to build solutions on Hadoop, Spark, and Hive technologies.
Design, develop, and maintain ETL solutions using Hadoop, Spark, and related technologies.
Implement...
...Job Description:
We are seeking a skilled and experienced Hadoop Developer with expertise in Java and Kafka to join our team. The ideal candidate will play a crucial role in designing, developing, and maintaining our Hadoop-based data processing and analytics solutions...
Skills: Hadoop, Java, Python, Scala, SQL
Requirements:
- 3-4+ years as a backend developer
- Worked at a good startup or attended a good engineering college
- Working with an early-stage startup in a role that involves building scalable systems to extract information from 15Mn+ domains....
...Job Title: Senior Hadoop Developer
Experience: 4+ Years
Notice Period: Immediate to 15 days only
Location: Manyata Embassy Business Park, Bangalore.
Mode of Employment: Contract
Mode of Work: Hybrid (2-3 days per week, mandatory)
Working experience in elastic...
Job Description: Hadoop/Omnia Dev Engineers, to develop and deliver code for the work assigned in accordance with time, quality, and cost standards.
Job Responsibilities:
- Interact with business stakeholders and designers to understand business requirements.
- Hadoop...
...Experience: 8 to 12 years of experience in Big Data, Apache Hadoop System Implementation, and administration.
Education: BE (IT... ...and recommend new technologies.
11. Stay updated with industry developments and emerging technologies.
What skills should you have?...
Key Skills: Java, Apache Spark, Apache Kafka, Hadoop, AWS
Job Description:
- Designing, installing, testing, and maintaining scalable data management systems... ...opportunities and new uses for existing data
- Developing data modeling, mining, and production processes
Required...
...At least 3 years of experience building production-grade data processing systems as a Data Engineer
In-depth knowledge of:
The Hadoop ecosystem.
Building applications with Apache Spark; experience with Spark Streaming
Columnar storage solutions like Apache HBase;...
Summary : As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process... ...using PySpark. Your typical day will involve working with Hadoop Administration and collaborating with cross-functional teams to...
...Technologies should be Hive, Pig, Sqoop, ZooKeeper, Cloudera - these are Hadoop ecosystem tools
2. Candidate should have worked in at least one... ...good POC experience. At least basic knowledge of Spark with Java or Scala
4. Hive, SQL, HDFS, Sqoop - hands-on experience
5. Spark,...
...Iterators in Scala.
- Experience with multi-threading will be helpful.
- Experience working with Kafka will be helpful.
- Knowledge of Hadoop MapReduce, HDFS, HBase, and Hive will be considered a plus.
- Exposure to DevOps and SQL (Postgres, MS SQL) will be considered an...
...initiatives.
Key Responsibilities:
- Design, develop, and deploy scalable and reliable data... ...distributed computing frameworks such as Apache Hadoop, Apache Spark, or Apache Flink.
- Develop... ...in programming languages such as Java, Scala, or Python, with experience in big...
...Solutions.
- Work with stakeholders to understand business objectives and develop value-added, data-driven solutions, frameworks, and reporting... ...
- Should have knowledge and experience of Big Data technologies (Hadoop ecosystem) and NoSQL databases.
- Should have technical expertise...
Rs 15 - 20 lakhs p.a.
...Role: Application Developer
Role Description: Design, build, and configure applications to meet business process and application requirements.
Must Have Skills: Hadoop
Good to Have Skills: Banking Strategy
Job Requirements: 1: Responsibilities: A: Understanding the requirements...
Job Description:
- Must have working experience designing, building, installing, configuring, and supporting Hadoop.
- Good to have Teradata, Cloud, and Snowflake knowledge.
- Must have working experience with IntelliJ IDEA, AutoSys (Control-M), WinSCP, PuTTY, and GitHub.
- Translate...
...work closely with your Scrum team and program team to test, develop, refine, and implement quality software in production via... ...development using programming languages & tools/services: Java or Scala, Big Data, Hadoop, Spark, Spark SQL, Presto/Hive, Cloud (preferably AWS), Docker...
Primary skills expected:
- Data engineering problem solving.
- Able to understand requirements and provide detailed implementation steps, considering dependencies with upstream systems, downstream systems, and other modules.
- PySpark data ingestion and transformation: batch...
...with 5+ years on PySpark/NoSQL is mandatory)
1. Should be strong in PySpark
2. Should have hands-on experience with the MWAA (Airflow) / AWS EMR (Hadoop, Hive) framework
3. Hands-on, working knowledge of Python
4. Knowledge of AWS services like EMR, S3, Lambda, Step Functions, Aurora -...
...estimates for implementing new features
Develop software applications using technologies that include but are not limited to core Java (11+), Kafka or another messaging system, Web Frameworks... ...and non-relational databases (SQL, MongoDB, Hadoop, etc.), with RESTful microservice...
...good experience in programming languages like Java, Python, or Scala.
- Hands-on experience with the Big Data stack: Hadoop, MapReduce, Spark, HBase, and Elasticsearch.
-... ...and translate them into large-scale engineering developments
- Excellent experience in Application...
...good experience in programming languages like Java or Python.
- Hands-on experience with the Big Data stack: PySpark, HBase, Hadoop, MapReduce, and Elasticsearch.
- Good... ...translate them into large-scale engineering developments
- Excellent experience in Application development...
...experience with RedHat Linux and Cloudera is mandatory.
- Experience installing, configuring, upgrading, managing, and administering Cloudera Hadoop
- Responsible for deployments, and monitoring for capacity, performance, and/or troubleshooting issues
- Work closely with data scientists...
Job Profile: Hadoop Administrator
WFH and WFO (hiring offices: Jaipur and Ahmedabad)
Role: Big Data Engineer
Industry Type: IT Services & Consulting
Job... ...and prevent bottlenecks
- Providing technical support to developers and end-users as needed
- Awareness of the latest technologies and...
...Lead Consultant - Java, Hadoop, Python and API automation - ITO077577
With a startup spirit and 115,000+ curious and courageous minds, we... ...reliability, scalability, and performance of software applications.
- Develop and maintain automated test suites for API endpoints.
-...
Mandatory Skills:
- Data scientist experience
- Hadoop, Scala
- Python
- Machine learning
- SQL
- Only from... ...preferred.
- End-to-end ownership of developing the model, writing production-quality code... ...Programming Languages: Hands-on with Python and Java.
- Strong background in relational...
...- Expertise in R/Python, SQL, Git, and Docker
- At least basic coding skills in one of the following languages: C#/Java/C/C++/Scala
- Hands-on experience with Hadoop
- Teamwork spirit, i.e., experience in joint problem solving with 3+ data scientists
- Independently make decisions...
...Development team to implement data strategies, build data flows, and develop conceptual data models.
- Recognize the need for a specific... ...like Collibra, Alteryx, and Solidatus
- Experience in Data Vault modeling.
- Knowledge of Hadoop and Google BigQuery is a plus. (ref:hirist.tech)
...Responsibilities
Design, develop, and deploy scalable Big Data applications using Hadoop ecosystem technologies (Hadoop, HDFS, Hive, etc.).
Collaborate with data scientists and business analysts to understand requirements and translate them into technical solutions...
...'s experience.
- Strong hands-on knowledge and experience in Java 8
- Strong knowledge of using Generics, Lambdas, Streams, Optional,... ...Experience working with Kafka will be helpful.
- Knowledge of Hadoop MapReduce, HDFS, HBase, and Hive will be considered a plus.
- Exposure...