What you'll be doing:
Discover and assess emerging technologies and innovations, ensuring they align with our technology goals and deliver business value.
Develop and integrate software services into the Data Center life-cycle.
Incorporate new data sources to consistently expand the capabilities of the centralized Data Lake.
Build high-performance ETL pipelines with advanced data integrity control.
Collaborate with various teams to define and implement solutions for their business requirements.
Work closely with architects and data scientists to shape solutions and meet business needs.
Contribute to defining the architecture for next-generation monitoring and analytics solutions for large-scale Data Centers.
Expand the capabilities of existing solutions and open-source projects, and contribute back to the community.
What we need to see:
BSc. or MSc. in Computer Engineering or Computer Science.
5+ years of software development experience.
Experience with modern analytics tools and platforms such as Spark and Databricks.
Deep understanding of large-scale telemetry solutions and automation technologies, along with modern application platforms and paradigms.
Advanced coding skills, agnostic of development languages and tools, spanning ETL pipelines to standalone solutions.
Ways to stand out from the crowd:
Familiarity with cloud-native development and deployment methodologies.
Experience with public cloud solutions (AWS, GCP, Azure).
Background in data center design and technologies.
Experience with large-scale data processing systems.
Existing contributions to open-source communities.