We are looking for a Data Engineer.
RESPONSIBILITIES
Design, code and optimize scalable data pipelines
Design and build core storage and surrounding infrastructure
Participate in, or lead design reviews with peers and stakeholders to decide amongst available technologies
Review code developed by other developers and provide feedback to ensure best practices (e.g., style guidelines, checking code in, accuracy, testability, and efficiency)
Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on hardware, network, or service operations and quality
Requirements:
Bachelor's degree or equivalent practical experience
4 years of experience with software development in Python or other programming languages, and with data structures/algorithms
3 years of experience testing, maintaining, or launching ETL and/or ELT pipelines, and 1 year of experience with software design and architecture
Practical knowledge of SQL scripting
2 years of experience working with a cloud provider (GCP and/or AWS)
Would be a plus:
Master's degree or PhD in Computer Science or related technical field
Practical experience with containerizing and orchestrating pipelines using Docker, Airflow, Kubernetes, and/or alternatives
Practical experience setting up CI/CD for pipelines with Terraform, Ansible, or alternatives
Practical understanding of Google Coding Style Guidelines, Google Documentation Style Guidelines, TDD approach and GitFlow best practices
Knowledge of data visualization services
Experience developing large-scale infrastructure or distributed systems, and experience with streaming technologies and storage architecture
This position is open to all candidates.