To succeed in this role, you must be technical and up to date with modern technology in order to automate manual data collection processes. You should have strong analytical and data modeling skills and the ability to work well with other teams across the organization.
What you'll do
Lead the design, implementation, optimization, and monitoring of ELT processes, data pipelines, and data lake solutions.
Lead and mentor a team of data engineers, providing guidance, feedback, and professional development opportunities.
Continuously explore and implement methods to improve and ensure data quality, reliability, and availability across all data platforms.
Collaborate with data analysts, data scientists, and other stakeholders to understand data requirements and deliver solutions that meet their needs.
Optimize and troubleshoot existing data pipelines and systems to improve performance and efficiency.
Implement and enforce best practices for data management, security, and governance.
Stay current with emerging trends and technologies in data engineering and make recommendations for their adoption.
Explore and implement AI-driven solutions to enhance data processing, analysis, and predictive modeling.
Manage project timelines, resources, and deliverables to ensure successful project outcomes.
Requirements:
5+ years of experience in data engineering and ETL/ELT development within a DWH environment, with at least 2 years in a leadership or team lead role.
Strong expertise in SQL and experience with relational databases such as PostgreSQL, MySQL, or similar.
Proficiency with DBT, strong data modeling skills, and an excellent understanding of DWH architecture and methodology.
Experience with cloud platforms such as AWS, GCP, or Azure, and their data services.
Experience with Airflow, Python, Docker, and Git.
Deep understanding of MySQL/RDS replication and Change Data Capture (CDC) methods.
Strong problem-solving and analytical abilities.
Experience working in an agile development environment.
Bonus Skills:
Experience with Kubernetes.
Experience with Datadog or other enterprise monitoring systems.
Proficiency with data visualization tools such as Tableau.
Familiarity with AI and machine learning frameworks, and experience integrating AI solutions into data workflows.