Are you a smart, curious, nimble, and relentless leader and quick learner, ready to tackle one of society's largest emerging problems?
If you answer yes to all of the above, we might have one of the most interesting Senior Data Engineer positions out there for you.
At Allies, we are reinventing successful aging, working with many of the largest insurance carriers and their policyholders to introduce game-changing wellness and financial products.
At the core of our product is our ability to create impact, measure our program performance, and support our daily operations teams. All of these rely on accurate, high-quality data. The ability to merge unstructured data from multiple sources into a single data platform and to continuously deliver high-quality data to all stakeholders is crucial to the success of our business.
Responsibilities:
Play a pivotal role in data architecture and design decisions.
Design, build, launch, expand, and maintain our high-quality data platform using efficient and reliable data pipelines, ETLs, and tools.
Design & build pipelines, ETLs, and tools to ingest data incorporating Airflow and AWS services such as Glue, Athena, Kinesis, Redshift, S3, and more.
Contribute to the process methodology for Data Engineering by implementing methodologies that are repeatable, scalable, and efficient.
Build and maintain data systems, databases, and warehouses, ensuring data integrity, availability, and performance.
Work closely with engineers, clinical researchers, data scientists, actuaries, marketing specialists, and operation managers on data initiatives.
Use a variety of technologies and techniques to solve data problems, such as Airflow, Python, Spark, and AWS services including Glue, Athena, Kinesis, Redshift, and S3.
Deliver data to the data stores and formats best suited for analytics.
Requirements:
4+ years of experience as a data engineer, including infrastructure development
3+ years of experience with SQL and Python
2+ years of experience with Airflow, relational databases, and AWS services
Strong experience with ETL processes and data pipeline design and implementation
Proven experience with data warehouses, data lakes, relational and columnar databases, NoSQL databases, schema design, and dimensional data modeling
Degree in Engineering/Mathematics/Statistics/Information Systems or equivalent
Excellent written and verbal communication skills (Hebrew and English)
This position is open to all candidates.