This role requires expertise in distributed architectures, data infrastructure, and high-scale processing, along with the ability to lead initiatives, mentor developers, and drive engineering excellence.
What you'll do:
Architect and optimize our data export infrastructure, ensuring scalability, performance, and cost efficiency.
Develop scalable backend platforms, integrating with BigQuery, Snowflake, AWS S3, and more.
Lead high-impact projects across engineering, platform, and product teams.
Establish best practices in system design and reliability, focusing on monitoring and optimization.
Mentor developers and drive high engineering standards.
Drive company-wide technical initiatives, dedicating ~20% of your time to our Staff Engineers Forum.
What you have:
10+ years of experience in software development, working on high-scale distributed systems.
3+ years in technical leadership roles, driving architectural decisions and mentoring developers.
Expertise in data infrastructure, pipelines, and cloud-based solutions.
Proven ability to design and build scalable backend platforms, ensuring performance, reliability, and security.
Hands-on experience with PostgreSQL, Kafka, Spark, Airflow, and Python.
Knowledge of distributed data processing, event-driven architectures, and scalable data access solutions.
Experience with the BigQuery ecosystem, leveraging it for scalable, efficient data access and integration with cloud-native services.
Strong collaboration skills, working closely with product, data, and engineering stakeholders.
Bonus points:
Experience with functional programming (Clojure/Scala).
Familiarity with modern frontend technologies and full-stack best practices.
Background in high-scale data platforms (BigQuery, Snowflake, or Spark).
Experience in AI-driven development, AI agents, or agentic workflows.
A referral from one of our employees.