As part of our team, you'll be responsible for Dynamic Yield customers' data, including ingestion, preparation, and serving.
To handle around one billion events per day, you will work with leading frameworks such as Spark, Kafka, Airflow, Flink, Cassandra, Redis, Snowflake, dbt, and Elasticsearch, running on Kubernetes (k8s).
Role:
– Design, code, and maintain Big Data solutions – both batch and stream processing
– Be fully responsible for the product's lifecycle – from design and development to deployment
– Bring a strong opinion to the table and be proactively involved with product planning
– Collaborate closely within and across teams
– Improve application performance
– Troubleshoot and resolve data issues
Requirements:
At least 4 years of experience with React
At least 5 years of JavaScript experience
At least 4 years of experience building backend systems with NodeJS
Object Oriented Programming
SQL/NoSQL database experience (MySQL, Redis) a plus
A degree in Computer Science or a related discipline
Excellent verbal and written communication skills in English
This position is open to all candidates.