Average salary: Rs 1,600,000 / year
Search Results: 10 vacancies
...from distinguished institutions and diverse professional backgrounds to help us achieve our goal of building electoral strategies that spark conversations, effect change, and help shape electoral and legislative ecosystems in our country.
About the Role:
As a part of...
...creating, managing, and monitoring data warehouses such as BigQuery.
- Experience with data processing frameworks and tools (e.g., Hadoop, Spark, Apache Airflow), including ETL pipelines (see the DAG sketch after this list).
- Strong understanding of machine learning algorithms and experience with libraries/frameworks...
...) and Neural Networks (NN) for deep learning applications.
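For illustration, a minimal Airflow ETL DAG of the kind this requirement refers to might look like the sketch below. The DAG id, schedule, and task bodies are hypothetical stand-ins, and the load step is stubbed where a BigQuery load would go.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Pull raw records from a source system (stubbed here).
    return [{"id": 1, "amount": "42.50"}]

def transform(ti):
    # Airflow injects the TaskInstance (ti) because the signature declares it.
    rows = ti.xcom_pull(task_ids="extract")
    return [{**r, "amount": float(r["amount"])} for r in rows]

def load(ti):
    rows = ti.xcom_pull(task_ids="transform")
    print(f"loading {len(rows)} rows into the warehouse")  # stand-in for a BigQuery load

with DAG(
    dag_id="daily_etl",             # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",              # Airflow 2.4+ keyword; earlier versions use schedule_interval
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```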
Bonus points for:
Experience working with big data platforms (Spark, Hadoop, etc.).
Experience with cloud platforms (AWS, GCP, Azure, etc.).
Experience with A/B testing and experimentation methodologies...
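As a concrete illustration of the A/B-testing work involved, a two-proportion z-test is one standard way to compare conversion rates between variants; the counts below are made up, and statsmodels is one of several libraries that provide it.

```python
# Two-proportion z-test for an A/B experiment, using statsmodels.
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical numbers: conversions and sample sizes per variant.
conversions = [310, 355]    # control, treatment
visitors    = [10000, 10000]

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {z_stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("difference is statistically significant at the 5% level")
else:
    print("no significant difference detected")
```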
...Python and experience with modern data stack tools such as Airflow, dbt, Airbyte, Fivetran, Stitch, Segment, Apache Superset, Snowflake, Spark, etc.
- Education in Computer Science, Engineering, or training in a related field.
- Knowledge of Pub/Sub architectures or other...
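By way of example, publishing an event to a Google Cloud Pub/Sub topic with the google-cloud-pubsub client could look like the following sketch; the project id, topic name, and event fields are hypothetical.

```python
# Publishing events to a Google Cloud Pub/Sub topic.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# "my-project" and "user-events" are hypothetical names.
topic_path = publisher.topic_path("my-project", "user-events")

event = {"user_id": 123, "action": "signup"}
future = publisher.publish(topic_path, data=json.dumps(event).encode("utf-8"))
print(f"published message id: {future.result()}")
```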
...machine learning frameworks
Knowledge of data mining and data visualization
Experience with big data technologies such as Hadoop and Spark
Strong problem-solving and analytical skills
Excellent written and verbal communication skills
Ability to work independently...
...data processing and analytics using Machine Learning.
- Redshift: Expertise in managing and optimizing Amazon Redshift data warehouses.
- Spark: Advanced knowledge of Apache Spark for large-scale data processing (a minimal sketch follows this list).
- Hadoop: Solid experience with the Hadoop ecosystem, including...
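A small PySpark sketch of the kind of large-scale aggregation the Spark point refers to; the input path and column names are hypothetical.

```python
# Aggregating a large dataset with PySpark's DataFrame API.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-revenue").getOrCreate()

# "s3://my-bucket/orders/" is a hypothetical input location.
orders = spark.read.parquet("s3://my-bucket/orders/")

daily_revenue = (
    orders
    .groupBy(F.to_date("created_at").alias("day"))
    .agg(F.sum("amount").alias("revenue"),
         F.countDistinct("customer_id").alias("customers"))
    .orderBy("day")
)
daily_revenue.show()
```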
...team.
Excellent communication and collaboration skills.
Bonus points for:
Experience working with big data platforms (Spark, Hadoop, etc.).
Experience with cloud platforms (AWS, GCP, Azure, etc.); a minimal AWS example follows this list.
Experience with A/B testing and experimentation...
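As the minimal AWS example referenced in the cloud-platforms point above, uploading a pipeline artifact to S3 with boto3 might look like this; the bucket and key names are hypothetical.

```python
# Uploading a pipeline artifact to S3 with boto3.
import boto3

s3 = boto3.client("s3")
# Bucket and key names are hypothetical.
s3.upload_file(
    Filename="daily_metrics.parquet",
    Bucket="my-analytics-bucket",
    Key="exports/2024-01-01/daily_metrics.parquet",
)
print("upload complete")
```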
...accessibility.
Optimize data processing and storage architectures for performance and cost efficiency, utilizing technologies such as Python, Spark, and AWS services.
Develop and maintain monitoring and alerting systems to ensure the reliability and availability of data pipelines...
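A monitoring check of the sort described might, in its simplest form, alert when a pipeline output goes stale; this sketch uses only the standard library, and the SLA threshold and output path are hypothetical.

```python
# Minimal freshness check for a pipeline output file.
import logging
import time
from pathlib import Path

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline-monitor")

FRESHNESS_SLA_SECONDS = 6 * 3600  # hypothetical SLA: output refreshed every 6 hours

def check_freshness(output_path: str) -> bool:
    """Alert (via log) if a pipeline's output is missing or older than the SLA."""
    path = Path(output_path)
    if not path.exists():
        log.error("ALERT: output %s is missing entirely", output_path)
        return False
    age = time.time() - path.stat().st_mtime
    if age > FRESHNESS_SLA_SECONDS:
        log.error("ALERT: output %s is %.0f s stale (SLA %d s)",
                  output_path, age, FRESHNESS_SLA_SECONDS)
        return False
    log.info("output %s is fresh (%.0f s old)", output_path, age)
    return True

if __name__ == "__main__":
    check_freshness("/data/exports/daily_metrics.parquet")  # hypothetical path
```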
...Apache Solr/Elasticsearch)
- Building semantic search engines for a Big Data dataset (see the Elasticsearch sketch after this list)
- Big Data processing and associated frameworks (Apache Spark/PySpark)
- Utilising natural language processing algorithms to build comparison and ranking algorithms.
Operational Competencies:
- ...
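To illustrate the Elasticsearch side of such a search stack, a minimal index-and-query round trip with the official Python client is sketched below. The index name, document, and local cluster URL are hypothetical, and a semantic engine would swap the full-text match query for a vector/kNN query.

```python
# Indexing and querying documents with the official Elasticsearch Python client.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # hypothetical local cluster

# Index a document.
es.index(index="articles", id=1, document={
    "title": "Scaling semantic search",
    "body": "Embedding-based retrieval over large corpora...",
})
es.indices.refresh(index="articles")

# Full-text match query; a semantic engine would use a vector/kNN query here.
resp = es.search(index="articles", query={"match": {"body": "semantic retrieval"}})
for hit in resp["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])
```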
Rs 12 - 20 lakhs p.a.
...AWS Cloud, NoSQL databases (MongoDB, Cassandra), message brokers (ActiveMQ/RabbitMQ/Apache Kafka), and Big Data technologies (Hadoop/Hive/Spark); see the Kafka sketch below.
Additionally, candidates must be able to review code, produce technical specification documents, and have knowledge of code quality tools...
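As a sketch of the message-broker experience mentioned above, producing and consuming with the kafka-python client might look like the following; the broker address and topic name are hypothetical.

```python
# Producing and consuming messages with kafka-python.
from kafka import KafkaProducer, KafkaConsumer

# "localhost:9092" and "orders" are hypothetical broker and topic names.
producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("orders", value=b'{"order_id": 1, "status": "created"}')
producer.flush()

consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating if no messages arrive
)
for msg in consumer:
    print(msg.topic, msg.offset, msg.value)
```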