Search Results: 166 vacancies
Skills:
- Hadoop
- Python
- Spark
- PySpark
- ETL (Extract, Transform, Load)
Roles & Responsibilities:
- Data Ingestion: Develop and maintain data pipelines for ingesting raw data from various sources into the Hadoop ecosystem.
- Data Processing: Utilize Python and Spark to process...
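The responsibilities above describe a standard extract-transform-load flow: ingest raw data, clean it, and write it onward. A minimal plain-Python sketch of that flow (a PySpark pipeline would use `spark.read` and DataFrame transforms instead; the fields and values here are invented for illustration):

```python
import csv
import io

# Extract: raw CSV as it might arrive from a source system (hypothetical fields).
raw = """user_id,amount,country
1,10.5,IN
2,,US
3,7.25,IN
"""

def extract(text):
    """Parse raw CSV text into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows with missing amounts and cast types."""
    out = []
    for r in rows:
        if r["amount"]:
            out.append({"user_id": int(r["user_id"]),
                        "amount": float(r["amount"]),
                        "country": r["country"]})
    return out

def load(rows):
    """Stand-in for writing to HDFS/Hive: aggregate totals per country."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

totals = load(transform(extract(raw)))
print(totals)  # {'IN': 17.75} -- the row with a missing amount is dropped
```

The same extract/transform/load split carries over directly to Spark jobs; only the data structures (DataFrames instead of dicts) and sinks (HDFS/Hive instead of an in-memory dict) change.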
...writing complex SQL queries to extract and manipulate data from relational databases (e.g., MySQL, PostgreSQL).
- Experience with Hadoop ecosystem tools, particularly HIVE for data warehousing and analytics.
- Strong analytical skills with the ability to interpret data...
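The "complex SQL queries" these postings ask about typically mean joins plus grouped aggregation. A self-contained sketch using Python's built-in sqlite3 as a stand-in for MySQL/PostgreSQL (table names and data are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical schema: customers and their orders.
cur.executescript("""
CREATE TABLE customers (id INTEGER, region TEXT);
CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'north'), (2, 'south');
INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0);
""")

# Join + aggregate: total order value per region, largest first.
rows = cur.execute("""
    SELECT c.region, SUM(o.amount) AS total
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    GROUP BY c.region
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('north', 150.0), ('south', 75.0)]
```

The same query shape (join, `GROUP BY`, `ORDER BY`) runs unchanged on MySQL or PostgreSQL, and in HiveQL with only minor dialect differences.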
...created to help and reach this aim.
As a production engineer on the Hadoop HDP/CDP stack in the IT domain “DATAPLATFORMS” in the DATA... ...Installation & configuration of the components: Zookeeper, HDFS, Spark, Hive, Kerberos, Sentry, Anaconda, Hue, Kafka…
o HDP/CDP clusters...
Rs 1 - 1.5 lakhs p.a.
...Currently, we have openings for MBA freshers who have the spark, confidence, and zeal to learn something new every day.
Requirements
~ Educational Background: A bachelor's degree in business administration or related field is required. Preferably an MBA degree from...
Rs 12 - 16 lakhs p.a.
...process and application requirements. Must-have Skills: Apache Spark. Good-to-have Skills: Microsoft SQL Server, Unix to Linux Migration... ...actions, Spark configuration and tuning techniques. B: Knowledge of Hadoop architecture; execution engines, frameworks, application tools. C...
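The "Spark configuration and tuning techniques" mentioned above usually refer to adjusting settings such as shuffle parallelism and executor resources. An illustrative `spark-defaults.conf`-style fragment (the values are placeholders, not recommendations):

```properties
# Parallelism of shuffle stages; often tuned relative to total executor cores.
spark.sql.shuffle.partitions   200
# Per-executor resources.
spark.executor.memory          4g
spark.executor.cores           4
# Adaptive query execution (Spark 3.x) can coalesce small shuffle partitions.
spark.sql.adaptive.enabled     true
```

In practice these values are derived from cluster size and workload profile, and the same keys can also be set per-job via `spark-submit --conf`.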
Rs 7 - 11 lakhs p.a.
...process and application requirements. Must Have Skills: Apache Spark. Good To Have Skills: Job Requirements: Key Responsibilities: A: Strong... ...query tuning and performance optimization. B: Good understanding of Hadoop architecture and distributed systems, e.g. CAP theorem, partitioning...
We are seeking a highly skilled and motivated Hadoop Developer to join our team and play a crucial role in developing and maintaining our... ...Proficiency in distributed data processing frameworks like Apache Spark (Spark SQL, Spark Streaming).- Expertise in data warehousing and...
...Proficiency in writing complex SQL queries to extract and manipulate data from relational databases (e.g., MySQL, PostgreSQL)
- Experience with Hadoop ecosystem tools, particularly HIVE for data warehousing and analytics
- Strong analytical skills with the ability to interpret data,...
Rs 12 - 16 lakhs p.a.
...applications to meet business process and application requirements. Must-have Skills: Python Programming Language. Good-to-have Skills: Hadoop, Apache Spark. Job Requirements: Key Responsibilities: A: The role will be responsible for analysing/profiling data in line with business use...
...Years. Location: Anywhere in India. Education: BE, B.Tech, Any Tech Graduate. Must-Have Technical Skills, including:
- 3+ years Spark or Scala
- 2+ years of Hadoop/Big Data using tools like Hive, Spark, PySpark, Scala, and RDBMS/SQL
- Strongly Preferred: GCP, including GCS (Google...
...to identify and design effective test scenarios.- Experience with Hadoop Hive is essential for testing data transformation and querying... ...Expertise in automated testing frameworks is a plus, with Python/Spark experience being a valuable advantage.- Plan and execute all testing...
...travel. While our products are in development, our mission remains clear: to engineer unparalleled experiences that transcend boundaries, sparking a revolution where innovation fuels extraordinary journeys.
Location : Coimbatore, Tamil Nadu
Hardware...
...develop, and deploy scalable Big Data applications using Apache Spark.
Collaborate with data scientists and business analysts to understand... ...or Python programming languages.
Hands-on experience with Hadoop, Hive, and/or HBase.
Familiarity with distributed computing...
...Overview:
The Databricks Spark Developer plays a crucial role in harnessing the power of data, using Databricks and Spark to develop and maintain efficient data pipelines. They are responsible for implementing scalable and reliable solutions that enable data-driven decision-making...
...scale data on Redshift and S3, and develop data pipelines using Spark/Scala on EMR, SQL-based ETL, Airflow and Java services.
We are looking... ...issues.
- Experience with big data technologies such as: Hadoop, Hive, Spark, EMR
- Experience with any ETL tool like Informatica...
...Preferred Qualifications:
Degree in Data Science, Statistics, or related field.
Experience with big data technologies such as Hadoop, Spark, or Hive.
Knowledge of deep learning frameworks such as TensorFlow or PyTorch.
Familiarity with natural language processing...
...Unix, Autosys and DB
Responsibilities
Development using Apache Spark & the Python structured data API/library.
Strong hands on... ...(Source, Extraction and Processors & Sink modules).
Big Data Hadoop - Hortonworks HDP 3.1.x & core Components covering data engineering...
...Identifying opportunities for architects in the account and outside through known contacts, Contribute towards domain and technology accelerator kits, Mentoring junior architects
5. Team Management: Team attrition %, Employee satisfaction score
Hadoop
...exposure
4. Excellent working environment
How You'll Grow
At HCL Tech, we offer continuous opportunities for you to find your spark and grow with us. We want you to be happy and satisfied with your role and to really learn what type of work sparks your brilliance the...
...for Team Leaders experienced in UK or US Life & Pension only.