...Experience: 5-7 years (in either Java or Python, or both)
Big Data experience: 2-5 years (Hadoop, Snowflake, or Kafka)
Education: Degree in Computer Science
Domain: Financial Services or Pharma background
Budget: up to 28 LPA
Job Description...
...Nice-to-have exposure:
• ITOM & collaboration tools: ServiceNow, Microsoft Teams, Tableau
• Data/data structures: Oracle, SQL, MongoDB, Hadoop, Cloudera, Spark, Teradata
• API development using 3rd-party libraries, REST APIs
• Enterprise Architecture experience
• DBA experience...
...jobs, and data integration solutions. You will be working in a dynamic and collaborative environment, leveraging your expertise in Hive, Hadoop, and PySpark to unlock valuable insights from our data.
Key Responsibilities:
Data Ingestion and Integration:
·...
...engineer.
~ 2+ years' experience using Azure services like Data Lake, Data Factory, Databricks, etc.
~ Experience with big data tools: Hadoop, Spark, Kafka, etc.
~ Seeking candidates with advanced proficiency in SQL, including complex querying, optimization, data modelling,...
...techniques (LLMs a plus), classical machine learning, A/B testing
Data Storage & Processing: Spark, Hive, distributed data storage systems (Hadoop, BigQuery, EMR) (relational databases a plus: Oracle, SAP, DB2, Teradata, MS SQL Server, MySQL)
Data Visualization: Business...
...data pipelines for processing and analyzing large datasets using Spark, Airflow, Bodo, Flume, Flink, etc.
- Utilize technologies like Hadoop, Spark, Kafka, and NoSQL databases.
- Develop data ingestion, transformation, and aggregation strategies.
- Design and implement...
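The ingest → transform → aggregate flow these responsibilities describe can be sketched in plain Python; this is a minimal illustration with hypothetical field names, not the posting's actual stack (a production pipeline would use Spark or Kafka rather than in-memory lists):

```python
from collections import defaultdict

def ingest(records):
    # Ingestion: keep only well-formed records (hypothetical schema: user, amount).
    return [r for r in records if "user" in r and "amount" in r]

def transform(records):
    # Transformation: normalize user names and cast amounts to float.
    return [{"user": r["user"].strip().lower(), "amount": float(r["amount"])}
            for r in records]

def aggregate(records):
    # Aggregation: total amount per user.
    totals = defaultdict(float)
    for r in records:
        totals[r["user"]] += r["amount"]
    return dict(totals)

raw = [{"user": " Alice ", "amount": "10.5"},
       {"user": "bob", "amount": "2"},
       {"user": "alice", "amount": "1.5"},
       {"bad": "row"}]
print(aggregate(transform(ingest(raw))))  # {'alice': 12.0, 'bob': 2.0}
```

The same three stages map directly onto a Spark job's read, column transformations, and `groupBy().agg()` calls.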
...Job Description: Good hands-on experience in Hadoop
Strong technical knowledge of PySpark
Good communication skills to handle client requirements
Must-Have Skills: Hadoop + PySpark
Experience: 4-6 years
Notice Period: 0-15 Days
Work Timing: Regular Shift...
...Pipelines using custom and packaged tools such as Azure Data Factory, Airflow or equivalent
• Experience with a Spark-based platform such as Databricks or Hadoop
• Cluster configurations including scaling, pooling, memory, and cost optimizations
• Data prep, migration, analysis,...
...with 5+ years on PySpark/NoSQL is mandatory)
1. Should be strong in PySpark
2. Should have hands-on experience with the MWAA (Airflow) / AWS EMR (Hadoop, Hive) framework
3. Hands-on working knowledge of Python
4. Knowledge of AWS services like EMR, S3, Lambda, Step Functions, Aurora -...
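Under the hood, MWAA/Airflow schedules tasks by topologically sorting their dependency graph; the ordering itself can be demonstrated with the standard library's `graphlib` (the task names below are hypothetical stand-ins for a typical EMR workflow, not real Airflow API calls):

```python
from graphlib import TopologicalSorter

# Hypothetical EMR-style workflow: create a cluster, submit a step,
# wait for it, then terminate. Each key depends on the tasks in its set.
dag = {
    "add_step": {"create_cluster"},
    "watch_step": {"add_step"},
    "terminate_cluster": {"watch_step"},
}
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['create_cluster', 'add_step', 'watch_step', 'terminate_cluster']
```

In a real Airflow DAG the same dependencies would be expressed with the `>>` operator between operators.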
...Job title: Senior Hadoop Developer
Experience: 4+ years
Notice Period: Immediate to 15 days only
Location: Manyata Embassy Business Park, Bangalore.
Mode of Employment: Contract
Mode of Work: Hybrid (2-3 days a week, mandatory)
Working experience in elastic...
...leading on client-facing projects, including working in close-knit teams
- 3+ years of experience and interest in Big Data technologies (Hadoop / Spark / relational DBs)
- 3+ years of experience working on projects in the cloud, ideally AWS or Azure
- Data Warehousing...
Responsibilities:
- Designing and implementing fine-tuned, production-ready data/ML pipelines on the Hadoop platform
- Driving optimization, testing, and tooling to improve quality
- Reviewing and approving high-level & detailed designs to ensure the solution delivers to the business...
Job Description: We are seeking a skilled Hadoop Developer with mandatory working experience in Elasticsearch to join our team. The ideal candidate should have a minimum of 2 years of experience in Hadoop and Spark development. This role will involve working with large datasets...
...working in an agile environment (e.g. user stories, iterative development, etc.)
- Knowledge of and working experience in Elasticsearch is mandatory
- 3-5 years of experience in Hadoop & Elasticsearch is mandatory
- 3-5 years of experience in Spark is mandatory (ref:hirist.tech)
...engineering, with a strong focus on large-scale data platforms and data products
- Strong experience with big data technologies such as Hadoop, Spark, and Hive
- Proficiency in either the Scala or Java programming language
- Experience leading and managing teams of data engineers
-...
...projects, including working in close-knit teams
- Overall 5+ years of experience, with at least 3+ years in Big Data technologies (Hadoop / Spark / relational DBs) and similar experience working on projects in the cloud, ideally AWS or Azure
- Data Warehousing experience...
...any industry-standard tool
- Good experience working with RDBMS MPP systems as well as NoSQL environments like object storage, MongoDB, Hadoop / Hive, etc.
- Good data profiling / analysis experience
- Preferably with Financial Services industry experience
- Excellent...
Skills:
- Hadoop
- Python
- Spark
- PySpark
- ETL (Extract, Transform, Load)
Roles & Responsibilities:
- Data Ingestion: Develop and maintain data pipelines for ingesting raw data from various sources into the Hadoop ecosystem
- Data Processing: Utilize Python and Spark to process...
Job Description:
- 4-7 years of experience in DE development
- Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)
- Python programming language is mandatory
- PySpark
- Excellent with SQL
- Airflow is a plus
- Good aptitude...
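"Excellent with SQL" in data-engineering postings like this usually implies window functions and aggregations; below is a self-contained sketch using the standard library's `sqlite3` (the table and values are invented, and the equivalent HiveQL for the same query is nearly identical):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('north', 10), ('north', 30), ('south', 5), ('south', 25);
""")
# Rank each sale within its region, highest amount first.
rows = con.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rk
    FROM sales
    ORDER BY region, rk
""").fetchall()
print(rows)  # [('north', 30.0, 1), ('north', 10.0, 2), ('south', 25.0, 1), ('south', 5.0, 2)]
```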
...FIS architectural standards.
What you will be doing:
The administrator will be responsible for installation and configuration of Hadoop, deployment of applications across multiple clusters and instances, and Cloudera cloud environment setup.
Monitoring of the environment...
...Job Title: Java + Hadoop Developer
About Us
“Capco, a Wipro company, is a global technology and management consulting firm. It was named Consultancy of the Year at the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar &...
...working towards design, architecture, development, and operationalization of Data Engineering & AI/ML models across the Big Data ecosystem (PySpark, Hadoop, Snowflake, Python)
- Experience in architecture, design, and implementation of data-intensive applications for practical use cases
-...
...experience in S3, Dataproc, Spark, Python, Cloud Functions, and orchestration using Airflow/Composer
- Ability to handle and migrate hundreds of TB of data; code optimization
- Good communication skills, self-motivated, with quick learning capabilities
Spark, Dataproc, Hadoop...
...coder with good experience in programming languages like Java or Python.
- Hands-on experience with the Big Data stack: PySpark, HBase, Hadoop, MapReduce, and Elasticsearch.
- Good understanding of programming principles and development practices like check-in policy, unit...
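The MapReduce model listed above boils down to two small functions; this pure-Python word count is only a sketch of the programming model, since a real Hadoop job distributes the map, shuffle, and reduce phases across a cluster:

```python
from itertools import groupby
from operator import itemgetter

def mapper(line):
    # Map phase: emit a (word, 1) pair per token, as in classic word count.
    return [(w.lower(), 1) for w in line.split()]

def reducer(pairs):
    # Reduce phase: group by key and sum counts; the sort stands in
    # for Hadoop's shuffle/sort between the two phases.
    pairs = sorted(pairs)
    return {key: sum(n for _, n in grp)
            for key, grp in groupby(pairs, key=itemgetter(0))}

lines = ["big data big plans", "data wins"]
mapped = [kv for line in lines for kv in mapper(line)]
print(reducer(mapped))  # {'big': 2, 'data': 2, 'plans': 1, 'wins': 1}
```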
...extraordinary and make a real difference for companies and the planet .
About the role....
Managing, installing & configuring Hadoop (Hive, HDFS, Ambari & NiFi). Designing the architecture and integrating with the o9 product. Involved in Hadoop maintenance & monitoring...
...Principal Consultant, QA: Scala/Spark/Hadoop - ITO080274
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility...
...configuration and deployment, along with the ability to build custom solutions
- Experience in building data pipelines using Scala, Spark, Hadoop, HiveQL, etc.
- Experience with streaming frameworks such as Kafka
- Experience with Data Warehousing, Data Modelling and Data...
...Functions, GC Storage, Python
Mandatory Requirements:
- Knowledge of Big Data architecture patterns and experience in the delivery of Big Data and Hadoop ecosystems
- Strong experience in GCP across multiple large projects involving GCP BigQuery and ETL
- Experience in GCP-based Big Data...
...and play a key role in designing, developing, and maintaining our big data infrastructure. You will leverage your expertise in Java, Hadoop, and Kafka to build efficient data pipelines, handle real-time data streams, and ensure high-quality data processing for our data analytics...
..., MD)
• Experience in creating data-driven business solutions and solving data problems using a wide variety of technologies such as Hadoop, Hive, Spark, MongoDB, and NoSQL; traditional data technologies like RDBMS/MySQL a plus
• Ability to program in one or more...