Search Results: 458 vacancies
...bring your talent and ambition to make a difference. We will create a world of opportunities for you.
Job Details
Job Title: Hadoop Administrator
Location: Bangalore / Pune / Hyderabad / Noida / Kolkata
Quick joiners needed
Minimum Requirements:
Must...
Responsibilities:
- Designing and implementing fine-tuned, production-ready data/ML pipelines on the Hadoop platform.
- Driving optimization, testing, and tooling to improve quality.
- Reviewing and approving high-level & detailed designs to ensure that the solution delivers to the business...
...statistical analysis, and data visualization tools and technologies
- Experience with big data processing frameworks, such as Spark or Hadoop
- Strong analytical and problem-solving skills, with the ability to analyze complex data sets and identify trends and insights
- Excellent...
...Iterators in Scala.
- Experience with multi-threading will be helpful.
- Experience working with Kafka will be helpful.
- Knowledge of Hadoop MapReduce, HDFS, HBase, and Hive will be considered a plus.
- Exposure to DevOps and SQL (Postgres, MS SQL) will be considered an...
...leading on client-facing projects, including working in close-knit teams
- 3+ years of experience and interest in Big Data technologies (Hadoop / Spark / Relational DBs)
- 3+ years of experience working on cloud projects, ideally AWS or Azure
- Data Warehousing...
Mandatory Skills:
- Data scientist experience
- Hadoop, Scala
- Python
- Machine learning
- SQL
- Only from product orgs

Requirements from Past Experience:
- Prior experience working with large-scale datasets (tens of millions of documents) is strongly preferred.
- End-to-end ownership of...
...Position Overview
Job Title: Data Engineer (ETL, Big Data, Hadoop, Spark, GCP)
Corporate Title: Associate
Location: Bangalore, India
Role Description
The Senior Engineer is responsible for developing and delivering elements of engineering solutions to accomplish...
...experience using Azure SQL or SQL DWH (Synapse)
- Knowledge of batch and streaming data architectures
- Experience using big data technologies (Hadoop, Hive, HBase, Spark, and others)
- Strong knowledge of SQL (MSSQL, MySQL, PostgreSQL, or Presto)
- Demonstrated strength in data...
Job Description:
Mandatory Skills: Microsoft Azure, Hadoop, Spark, Databricks, Airflow, Kafka, PySpark
Requirements:
- Experience working with distributed technology tools for developing batch and streaming pipelines using SQL, Spark, Python, Airflow, Scala, Kafka
- Experience...
...Hands-on coder with good experience in programming languages like Java, Python, or Scala.
- Hands-on experience with the Big Data stack: Hadoop, MapReduce, Spark, HBase, and ElasticSearch.
- Good understanding of programming principles and development practices like check-in...
Skills:
- Hadoop
- Python
- Spark
- PySpark
- ETL (Extract, Transform, Load)

Roles & Responsibilities:
- Data Ingestion: Develop and maintain data pipelines for ingesting raw data from various sources into the Hadoop ecosystem.
- Data Processing: Utilize Python and Spark to process...
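The ingestion and processing duties above follow the standard extract-transform-load pattern. A minimal sketch in plain Python, standing in for a PySpark job (the file names, field names, and transform rule here are illustrative assumptions, not from the posting):

```python
import csv
import json
from pathlib import Path

def extract(csv_path):
    """Read raw records from a CSV source (stand-in for an HDFS ingest)."""
    with open(csv_path, newline="") as f:
        return list(csv.DictReader(f))

def transform(records):
    """Clean records: drop rows missing 'amount', cast it to float."""
    cleaned = []
    for r in records:
        if not r.get("amount"):
            continue  # skip incomplete rows
        r["amount"] = float(r["amount"])
        cleaned.append(r)
    return cleaned

def load(records, out_path):
    """Write processed records as JSON lines (stand-in for a Hive/HDFS sink)."""
    with open(out_path, "w") as f:
        for r in records:
            f.write(json.dumps(r) + "\n")

# Demo run on a tiny hypothetical source file.
Path("raw.csv").write_text("id,amount\n1,10.5\n2,\n3,7\n")
load(transform(extract("raw.csv")), "processed.jsonl")
```

In a real Hadoop deployment the same three stages would map onto `spark.read`, DataFrame transformations, and a write to HDFS or Hive rather than local files.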
...Solutions Architect (Data & BI) with around 10-15 years of experience in the following areas.
Mandatory Skills:
- Tableau, Big Data, Hadoop, Data Warehousing, Legacy BI Migrations, Cloud technologies.
- Experienced in BI implementation and BI migration involving Tableau.
Preferred...
...projects, including working in close-knit teams
- Overall 5+ years of experience, with at least 3+ years in Big Data technologies (Hadoop / Spark / Relational DBs) and similar experience working on cloud projects, ideally AWS or Azure
- Data Warehousing experience...
...field.
- Master's degree preferred.
- Proven experience as a Big Data Developer or similar role, with strong knowledge of Big Data and Hadoop ecosystems.
- Hands-on experience with HDFS, Spark, MapReduce, Sqoop, and Hive.
- Experience in loading and transforming large datasets...
...architectural standards.
What you will be doing:
The administrator will be responsible for installation and configuration of Hadoop, deployment of applications across multiple clusters and instances, and Cloudera cloud environment setup.
Monitoring of the environment...
...configuration and deployment, along with the ability to build custom solutions.
- Have experience building data pipelines using Scala, Spark, Hadoop, HiveQL, etc.
- Have experience with streaming frameworks such as Kafka.
- Have experience with Data Warehousing, Data Modelling, and Data...
...with 5+ years on PySpark/NoSQL is mandatory)
1. Person should be strong in PySpark.
2. Should have hands-on experience in the MWAA (Airflow) / AWS EMR (Hadoop, Hive) framework.
3. Hands-on, working knowledge of Python.
4. Knowledge of AWS services like EMR, S3, Lambda, Step Functions, Aurora -...
...EMR, S3, RDS), Python programming skills, and shell scripting. Work experience in any CI/CD environment is a must.
2) Secondary Skills: Experience in managing Hadoop clusters and Airflow instances. Integrating Ranger and Hive Metastore services in EKS.
Skills: Hadoop, AWS, cloud computing...
...troubleshoot complex issues (familiarity with open-source contributions a plus).
- Leverage extensive experience in distributed systems (e.g., the Hadoop ecosystem) to build reliable and fault-tolerant data solutions.
- Develop and implement multithreaded programs while ensuring effective...
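The multithreading requirement above is, in practice, about coordinating shared state safely. A minimal sketch in Python of a queue-fed worker pool with a lock guarding shared results (the worker count, task values, and sentinel scheme are illustrative assumptions, not from the posting):

```python
import threading
from queue import Queue

def worker(tasks: Queue, results: list, lock: threading.Lock) -> None:
    """Pull items off the queue until a None sentinel; store squares."""
    while True:
        item = tasks.get()
        if item is None:  # sentinel: no more work for this thread
            tasks.task_done()
            break
        with lock:  # guard the shared results list
            results.append(item * item)
        tasks.task_done()

tasks: Queue = Queue()
results: list = []
lock = threading.Lock()
threads = [threading.Thread(target=worker, args=(tasks, results, lock))
           for _ in range(4)]
for t in threads:
    t.start()
for n in range(10):
    tasks.put(n)
for _ in threads:
    tasks.put(None)  # one sentinel per worker thread
tasks.join()
for t in threads:
    t.join()
```

The `Queue` hands out work thread-safely, and the explicit lock makes the shared-state update safe regardless of how the interpreter interleaves the threads.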
...extraordinary and make a real difference for companies and the planet.
About the role....
Managing, installing & configuring Hadoop (Hive, HDFS, Ambari & NiFi). Designing the architecture and integrating it with the o9 product. Involved in Hadoop maintenance & monitoring...