We are looking for an experienced and passionate Data Engineer to join the Data Engineering Team at our rapidly growing TLV R&D site! You will be instrumental in maintaining current data pipelines and the data warehouse, as well as in investigating and implementing new technologies to meet future analytical needs.
Responsibilities include working alongside developers from the BI and Backend teams, architects, and business decision makers to implement data pipelines and improve data architecture and infrastructure.
The Data Engineering Team focuses on building long term, scalable solutions for our growing data needs.
Requirements:
Bachelor's degree in CS or another relevant field
Experience in Python or other modern object-oriented programming languages
Experience in Data Modeling using dbt or equivalent
Experience with ELT platforms like Fivetran, Airbyte, Rivery, etc.
Experience with Data Warehouse technologies like Snowflake, BigQuery, Redshift, etc.
Experience in working and delivering end-to-end projects independently
Experience with at least one cloud provider, preferably AWS
Strong written and verbal communication skills in English and Hebrew
Advantages:
Familiarity with DB internals, design considerations and management
Familiarity with Orchestration platforms like Airflow, Luigi, Prefect, Dagster, etc.
Familiarity with Data Validation and Testing using dbt, Great Expectations or others
Familiarity with Event Streaming platforms like Kafka, Redpanda, etc.
This position is open to all candidates.