Responsibilities:
Develop and maintain ETL processes and data pipelines to ensure smooth, accurate data flow.
Design, build, and manage Airflow DAGs to orchestrate complex data workflows and ensure reliable execution of ETL and analytics tasks.
Design and optimize relational databases, including conceptual and physical data modeling (ERD), schema design, and performance tuning to support scalable analytics and business applications.
Collaborate with cross-functional teams to translate business needs into technical solutions.
Build and maintain algorithm-driven automation to streamline workflows and reduce manual effort.
Build and maintain custom data scripts for integrations with brand and publisher partners.
Support data visualization efforts through BI tools.
Gain in-depth understanding of discrepancy processes across departments and improve data accuracy.
Requirements:
3+ years of hands-on experience in data engineering or BI development.
Strong SQL skills (complex queries, stored procedures).
Practical experience with ETL processes and data pipelines.
Experience with Airflow, Dagster, or similar orchestration tools.
1+ years of hands-on experience with Python.
Knowledge of database architecture (ERD, OLTP/OLAP models).
Strong analytical mindset with the ability to translate business requirements into data solutions.
B.Sc. degree in Information Systems Engineering, Industrial Engineering, or a related field.
Experience with dbt, BigQuery, or AWS – an advantage.