What you’ll be doing:
Discover and assess emerging technologies and innovations to ensure they align with our technological goals and business value.
Develop and integrate software services into the Data Center lifecycle.
Incorporate new data sources to consistently expand the capabilities of the centralized Data Lake.
Collaborate with various teams to define and implement solutions for their business requirements.
Work closely with architects, data engineers, and data scientists to shape solutions and meet business needs.
Contribute to defining the architecture for next-generation monitoring and analytics solutions for large-scale Data Centers.
Expand the capabilities of existing solutions and Open Source projects, and contribute to the community.
What we need to see:
BSc or MSc in Computer Engineering or Computer Science.
5+ years of experience in software development.
Deep understanding of modern application platforms and design paradigms.
Knowledge of network stacks (Ethernet/IP, InfiniBand) is a notable advantage.
Experience with modern analytics tools and platforms such as Spark and Databricks.
Advanced coding skills that transfer across development languages and tools.
Ways to stand out from the crowd:
Familiarity with cloud-native development and deployment methodologies.
Experience with public cloud solutions (AWS, GCP, Azure).
Proficiency in GPU-accelerated solutions.
Experience with large-scale data processing systems.
Existing contributions to open-source communities.