Responsibilities:
Design, develop, and maintain scalable data pipelines using tools such as Airflow and dbt.
Manage and optimize our data warehouse in Snowflake, ensuring data integrity and performance.
Collaborate with analytics and business teams to understand data requirements and deliver appropriate solutions.
Implement and maintain data integration processes between various systems and platforms.
Monitor and troubleshoot data pipeline issues, ensuring timely resolution and minimal disruption.
Stay current with industry trends and emerging technologies to continually improve our data infrastructure.
Requirements:
3+ years of experience in data engineering or a related field.
Proficiency in SQL and experience with modern lakehouse data modeling.
Hands-on experience with data pipeline orchestration tools like Apache Airflow.
Experience with dbt for data transformation and modeling.
Familiarity with data visualization tools such as Tableau.
Strong programming skills in languages such as Python or Java.
Hands-on experience with AWS data services (or those of another major cloud provider).
Excellent problem-solving skills and attention to detail.
Strong communication skills and the ability to work collaboratively in a team environment.
Relevant academic degree in Computer Science, Engineering, or a related field (or equivalent work experience).
Preferred Qualifications:
Experience in the travel or insurance industries.
Familiarity with Mixpanel or similar analytics platforms.
Knowledge of data security and privacy best practices.