Cloud Architect Job Description Template
Our company is looking for a Cloud Architect to join our team.
Responsibilities:
- Respond to technical issues in a professional and timely manner;
- Work closely with IT security to monitor the company's cloud privacy;
- Develop and organize cloud systems;
- Create a well-informed cloud strategy and manage the adoption process;
- Identify the top cloud architecture solutions to successfully meet the strategic needs of the company;
- Regularly evaluate cloud applications, hardware, and software;
- Drive thought leadership and support evangelism activities in collaboration with Sales;
- Evaluate products and track cloud services and their impact on the reference architecture;
- Create point-of-view documents on cloud solutions;
- Work with the Cloud Practice office to provide a technical solutions/services roadmap;
- Execute tool proofs of concept (POCs) on the cloud;
- As needed, participate in customer project delivery and manage customer expectations;
- Drive business innovation through cloud-based systems;
- Prepare technical proposals and work on sizing and estimation;
- Drive technical discussions and explain Mindtree solutions and product options to meet customer needs.
Requirements:
- Experience supporting and working with cross-functional teams in a dynamic environment;
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift;
- Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.;
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores;
- Experience in AWS & Azure Cloud;
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets;
- Experience with big data tools: Hadoop, Spark, Kafka, etc.;
- Experience building processes that support data transformation, data structures, metadata, dependency, and workload management;
- Strong project management and organizational skills;
- Experience with stream-processing systems: Storm, Spark Streaming, etc.;
- Strong analytic skills related to working with unstructured datasets;
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra;
- A successful history of manipulating, processing and extracting value from large disconnected datasets;
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.