Responsibilities
Design, develop, and maintain scalable ETL processes using Python and modern orchestration tools like Apache Airflow.
Architect and optimize data pipelines for Snowflake to support data warehousing and real-time data needs.
Manage and integrate cloud infrastructure, particularly in AWS, to ensure seamless data flow.
Work with Elasticsearch (or similar tools) to handle large-scale data indexing and searching.
Build and maintain streaming data pipelines to capture real-time game events and integrate them into the data platform.
Ensure data integrity, quality, and governance across all sources and destinations.
Collaborate with data analysts, game developers, and other stakeholders to understand data requirements and optimize systems.
Troubleshoot and resolve issues in data pipelines and streaming services.
Requirements
3+ years of experience as a Data Engineer or in a similar role.
Proficiency with Snowflake (or similar) for data warehousing and SQL query optimization.
Strong experience with Apache Airflow or similar orchestration tools for managing ETL workflows.
Hands-on experience with AWS cloud services (e.g., S3, Kinesis, Lambda).
Hands-on proficiency in Python.
Knowledge of real-time data streaming frameworks (e.g., Kafka, Kinesis).
Strong understanding of database architecture, data modeling, and data integration.
Excellent problem-solving skills and attention to detail.
Ability to work independently and as part of a collaborative team.
Nice to Have
Experience with Elasticsearch (or similar technology) for indexing and querying large datasets.
Knowledge of containerization technologies like Docker and Kubernetes.
Experience working in the gaming or entertainment industry.