Our engineering challenges center on big data and stream processing at very high scale, and that scale keeps growing as existing customers increase their usage and new customers join. We are also seeking better ways to accurately analyze traffic patterns, identify attackers and attack vectors, and give customers improved visibility into their security posture.
Who are you?
5+ years of software engineering experience (3+ years in Scala/Java/Go).
2+ years of experience leading an engineering team (3+ engineers).
Experience with data pipeline development for streaming/batch processing.
Bachelor's degree in Computer Science (or equivalent), or relevant employment background.
Experience designing, developing, and debugging complex distributed systems (microservices, event-driven).
Experience developing large-scale, data-oriented applications with high throughput and data volume.
Passion for writing clean, extensible, and robust code.
Preferred (but not mandatory) experience:
Hands-on experience with Kafka and the technologies in its ecosystem (Kafka Streams, Kafka Connect, etc.) or a similar technology – a big advantage.
NoSQL columnar databases.
Building microservices in AWS.