What You'll Do:
Develop ETL jobs to gather data from multiple sources and provide insights into various product areas (a minimal sketch follows this list)
Build data warehouses where large volumes of metrics and data will be stored
Interact with product groups across the organization to collect key metrics via APIs, Kafka integrations, or direct data access
Configure and respond to uptime alerts for the services you control
Keep services up and running in a healthy state
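To make the ETL responsibility above concrete, here is a minimal sketch of such a job, assuming a hypothetical metrics endpoint and using sqlite3 as a stand-in for a real warehouse connection; the URL, table name, and fields are illustrative, not our actual stack.

```python
"""Minimal ETL sketch: pull metrics from a product API and load them
into a warehouse table. All names (URL, table, fields) are hypothetical."""
import json
import sqlite3  # stand-in for a real warehouse connection
from urllib.request import urlopen

def extract(url: str) -> list[dict]:
    # Pull raw metric records from a product team's API.
    with urlopen(url) as resp:
        return json.load(resp)

def transform(records: list[dict]) -> list[tuple]:
    # Keep only the fields the warehouse schema expects.
    return [(r["product"], r["metric"], float(r["value"])) for r in records]

def load(rows: list[tuple], db_path: str) -> None:
    # Load into a simple metrics table, creating it on first run.
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS product_metrics "
        "(product TEXT, metric TEXT, value REAL)"
    )
    con.executemany("INSERT INTO product_metrics VALUES (?, ?, ?)", rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    rows = transform(extract("https://internal.example.com/api/metrics"))
    load(rows, "warehouse.db")
```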
What We're Looking For:
4+ years of programming experience, with proficiency in at least one strongly typed, object-oriented programming language. Go (Golang) or Python preferred.
Knowledge of services from at least two of the major cloud providers: AWS, Azure, and GCP.
Experience developing and consuming RESTful web services (a toy endpoint is sketched after this list).
Experience interacting with major cloud provider APIs to provision and monitor cloud infrastructure (sketched below). We use Amazon Web Services (AWS) APIs the most, along with Azure and Google Cloud Platform (GCP).
Understanding of the data structures and commands of a distributed key-value caching solution such as Redis (sketched below).
Experience using a relational database (RDBMS) such as Postgres, with accompanying knowledge of SQL (sketched below).
Experience with data modeling and Extract-Transform-Load (ETL) concepts.
Bachelor's degree or equivalent work experience. Proficiency with common algorithms, data structures, and code whiteboarding.
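For the RESTful services item above, a toy read-only endpoint built with only the Python standard library; the route and payload are hypothetical.

```python
"""A toy RESTful endpoint using only the standard library; the route
and response payload are hypothetical."""
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class MetricsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve a single JSON resource; anything else is a 404.
        if self.path == "/v1/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), MetricsHandler).serve_forever()
```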
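For the cloud provider API item, a sketch of monitoring provisioned infrastructure with AWS's boto3 client; it assumes credentials are already configured, and the instance ID is hypothetical.

```python
"""Sketch of monitoring via a cloud provider API (AWS boto3 here);
assumes configured credentials, and the instance ID is hypothetical."""
from datetime import datetime, timedelta
import boto3

ec2 = boto3.client("ec2")
cloudwatch = boto3.client("cloudwatch")

# List running instances (the provisioned infrastructure).
reservations = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)
for res in reservations["Reservations"]:
    for inst in res["Instances"]:
        print(inst["InstanceId"], inst["InstanceType"])

# Pull average CPU for one instance over the last hour (the monitoring).
stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    StartTime=datetime.utcnow() - timedelta(hours=1),
    EndTime=datetime.utcnow(),
    Period=300,
    Statistics=["Average"],
)
print(stats["Datapoints"])
```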
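For the Redis item, a sketch of a few of its core data structures via redis-py; the host and key names are hypothetical.

```python
"""Sketch of common Redis data structures via redis-py; host and key
names are hypothetical."""
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# String: cache a value with a one-hour TTL.
r.set("session:abc123", "user-42", ex=3600)

# Hash: store a small record under one key.
r.hset("user:42", mapping={"name": "Ada", "plan": "pro"})
print(r.hgetall("user:42"))

# Sorted set: a leaderboard keyed by score.
r.zadd("leaderboard", {"ada": 310, "grace": 295})
print(r.zrevrange("leaderboard", 0, 9, withscores=True))
```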
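For the RDBMS/SQL item, a sketch of parameterized querying against Postgres with psycopg2; the connection string and schema are hypothetical.

```python
"""Sketch of relational access with psycopg2; connection parameters
and schema are hypothetical."""
import psycopg2

conn = psycopg2.connect("dbname=analytics user=etl host=localhost")
with conn, conn.cursor() as cur:
    # Parameterized query: never interpolate values into SQL strings.
    cur.execute(
        "SELECT product, AVG(value) FROM product_metrics "
        "WHERE metric = %s GROUP BY product",
        ("daily_active_users",),
    )
    for product, avg_value in cur.fetchall():
        print(product, avg_value)
conn.close()
```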
Bonus Points:
Experience with analytical databases.
Understanding of the data structures and APIs used for full-text search over application logs and event data in Elasticsearch (sketched below).
Experience with Cassandra, its wide-column data model, and CQL (sketched below).
Experience using graph structures (i.e., nodes and edges), graph data, and graph databases (a small sketch follows this list).
Experience using a message queue (sketched below). We use Kafka.
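For the Elasticsearch item, a sketch of a full-text query over application logs, assuming the 8.x elasticsearch-py client; the index pattern, field, and query text are hypothetical.

```python
"""Sketch of full-text search over application logs in Elasticsearch;
assumes the 8.x Python client, and the index/field names are hypothetical."""
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Match query: analyzed full-text search on the log message field.
resp = es.search(
    index="app-logs-*",
    query={"match": {"message": "connection timeout"}},
    size=10,
)
for hit in resp["hits"]["hits"]:
    print(hit["_source"]["message"])
```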
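For the Cassandra item, a sketch of a wide-column access pattern with the DataStax Python driver; the keyspace, table, and data are hypothetical.

```python
"""Sketch of a wide-column pattern with the DataStax Cassandra driver;
keyspace, table, and values are hypothetical."""
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])
session = cluster.connect("metrics_ks")

# Partition key groups a product's rows; the clustering column orders them.
session.execute(
    """CREATE TABLE IF NOT EXISTS events (
           product text, event_time timestamp, value double,
           PRIMARY KEY (product, event_time)
       ) WITH CLUSTERING ORDER BY (event_time DESC)"""
)
rows = session.execute(
    "SELECT event_time, value FROM events WHERE product = %s LIMIT 10",
    ("checkout",),
)
for row in rows:
    print(row.event_time, row.value)
cluster.shutdown()
```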
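For the graph structures item, a minimal adjacency-list representation with a breadth-first traversal, using only the standard library.

```python
"""Minimal graph sketch: adjacency-list representation and BFS,
using only the standard library."""
from collections import deque

# Directed graph as an adjacency list: node -> list of neighbor nodes.
graph = {
    "a": ["b", "c"],
    "b": ["d"],
    "c": ["d"],
    "d": [],
}

def bfs(start: str) -> list[str]:
    # Visit nodes in breadth-first order, tracking what we've seen.
    seen, order, queue = {start}, [], deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for nbr in graph[node]:
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return order

print(bfs("a"))  # ['a', 'b', 'c', 'd']
```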
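For the message queue item, a sketch of producing and consuming JSON messages with the kafka-python client; the broker address and topic name are hypothetical.

```python
"""Sketch of produce/consume with kafka-python; the broker address
and topic name are hypothetical."""
import json
from kafka import KafkaConsumer, KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode(),
)
producer.send("product-metrics", {"product": "checkout", "dau": 1234})
producer.flush()

consumer = KafkaConsumer(
    "product-metrics",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode()),
)
for msg in consumer:
    print(msg.value)
    break  # sketch: read one message and stop
```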