Using the newest technologies, we're working on solving a major problem every enterprise faces today: governing employee access to thousands of 3rd-party vendors (GitHub, SendGrid, Atlassian, and many more) and ensuring there is no leftover or unwanted access to any of the organization's SaaS and AI assets. The SaaS and AI security field is complex and challenging, so we're looking for super-talented people who aren't afraid of technical challenges and of breaking down barriers to build great solutions.
The job
As a Senior Data Engineer, you will have a leading role in developing our data platform, creating and extending our data infrastructure to allow research, development and BI across the company. You will work with multiple stakeholders (architects, analysts, data scientists, and more) to help enrich our data with additional insights.
Responsibilities
Implement robust cloud-based data infrastructure and pipelines as part of our Data Lakehouse infrastructure.
Collaborate with analysts, data scientists and other stakeholders to develop new products and features.
Ensure data integrity by extending our monitoring infrastructure.
Take part in challenging data migration and remodeling efforts.
Contribute to a culture of learning and knowledge-sharing within the team.
Requirements:
5+ years of hands-on experience building scalable data infrastructure, particularly data lake or data warehouse architectures.
Extensive experience with cloud-based data platforms such as AWS, Azure, or GCP, and with tools such as Spark, Kafka, Athena, Airflow, dbt, and AWS Glue.
Strong programming skills in Python and significant SQL experience.
Independent learner with a "can do" attitude and a strong sense of ownership.
Experience with containerization technologies is an advantage.
This position is open to all candidates.


















