We are seeking an adept Senior Data Engineer with a passion for tackling complex challenges across a diverse range of technologies.
Your role will involve a deep commitment to software design, code quality, and performance optimization.
As part of our Engineering team, your mission will be to empower critical-infrastructure operators to detect, investigate, and respond to complex attacks and data breaches on their networks.
You will play a pivotal role in developing pipelines to efficiently extract, transform, and load massive volumes of data.
Your expertise will contribute to the creation of a scalable, high-performance data lake that serves as a foundation for other services within the platform. Additionally, you will be responsible for translating intricate requirements into meticulous and actionable designs.
Responsibilities:
Play a key role in developing data pipelines that efficiently extract, transform, and load vast volumes of data.
Architect and build a scalable, high-performance data lake that supports various services within the platform.
Translate intricate requirements into meticulous design plans, maintaining a focus on software design, code quality, and performance.
Collaborate with cross-functional teams to implement data-warehousing and data-modeling techniques.
Apply your expertise in Core Linux, SQL, and scripting languages to create robust solutions.
Leverage your proficiency in cloud platforms such as AWS, GCP, or Azure to drive strong data engineering practices.
Utilize your experience with streaming frameworks, such as Kafka, to handle real-time data processing.
Employ your familiarity with industry-standard visualization and analytics tools, like Tableau and R, to provide insightful data representations.
Demonstrate strong debugging skills, identifying issues such as race conditions and memory leaks.
Solve complex problems with an analytical mindset and contribute to a positive team dynamic.
Bring your excellent interpersonal skills to foster collaboration and maintain a positive attitude within the team.
Requirements:
5+ years of experience in developing large-scale cloud systems.
Proficiency in Core Linux, SQL, and at least one scripting language.
Strong data engineering skills with expertise in cloud platforms like AWS, GCP, or Azure.
Expertise in developing pipelines for ETL processes, handling extensive data loads.
Familiarity with streaming frameworks, such as Kafka, or similar technologies.
Knowledge of data-warehousing and data-modeling techniques.
Practical experience with industry-standard visualization and analytics tools such as Tableau and R.
Strong understanding of operating system concepts.
Proven ability to diagnose and address issues like race conditions and memory leaks.
Adept problem solver with analytical thinking abilities.
Outstanding interpersonal skills and a positive attitude.
Demonstrated ability to collaborate effectively within a team.
Advantages:
Previous experience working with on-premises solutions.
This position is open to all candidates.