Responsibilities:
Lead the design and implementation of our ETL data pipelines, cloud deployments, and CI/CD workflows, ensuring they run efficiently and reliably.
Architect and automate infrastructure using tools such as Terraform, Ansible, or equivalent, with a focus on scalability, security, and maintainability.
Develop and maintain integrations with various cloud services, optimizing for performance and reliability.
Collaborate with software engineers to incorporate infrastructure-as-code (IaC) principles into the development lifecycle.
Implement and maintain security best practices, including access control, encryption, and vulnerability management.
Troubleshoot complex issues, conduct root cause analysis, and implement solutions to ensure high system availability.
Mentor and guide junior team members, fostering a culture of continuous improvement and knowledge sharing.
Requirements:
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Proven experience as a DevOps Engineer, with a focus on cloud environments (e.g., AWS, Azure, GCP) and ETL data pipelines (e.g., ELK, Snowflake, Databricks, Kafka, Spark).
Extensive expertise in infrastructure as code (IaC) tools, such as Terraform, CloudFormation, or equivalent.
Strong programming and scripting skills (e.g., Python, Bash) in Linux environments.
Experience developing integrations with cloud services and APIs.
In-depth understanding of containerization and orchestration tools, such as Docker and Kubernetes.
Knowledge of cybersecurity concepts and best practices.
Excellent problem-solving skills and attention to detail.
Strong communication skills.
Leadership and people-management experience are a plus.