We are seeking a highly skilled and experienced Machine Learning Architect to join our Innovation team, a unit at the forefront of technological advancement in sports data analytics. This role demands expertise in designing and implementing comprehensive machine learning systems, with a strong foundation in complex algorithm development, data modeling, and predictive analytics. You will manage the entire lifecycle of machine learning development, from ideation with stakeholders to the deployment of scalable models in production settings. By integrating strategy, research, and product development, you will play a pivotal role in transforming visionary ideas into industry-leading solutions and securing our position as a pioneer in the sports-tech landscape.
Responsibilities:
Develop and implement machine learning systems, overseeing the complete lifecycle from conceptualization to deployment.
Collaborate with internal teams and external partners to deliver innovative solutions that enhance market leadership.
Architect, build, and scale production-grade data pipelines and services.
Take ownership of significant projects, guiding them from inception through to successful deployment.
Requirements:
5+ years of industry experience, with a bachelor’s degree or higher in Computer Science, Information Systems, or a related technical field, or equivalent experience.
Demonstrated experience with large language models (LLMs) and a strong background in Natural Language Processing (NLP) and Machine Learning models, including deployment, monitoring, and performance evaluation.
Proven ability to thrive in a fast-paced, high-pressure environment, with impeccable attention to detail and strong decision-making and problem-solving skills.
Excellent communication skills, capable of effectively articulating technical concepts to both technical and non-technical stakeholders.
Familiarity with the software development life cycle and Agile methodologies.
3+ years of experience in delivering production-grade data pipelines and backend services.
Expertise in data pipeline construction, distributed architectures, SQL and NoSQL databases, and data lake/warehouse design and implementation.
Knowledge of modern development and CI/CD tooling (e.g., Git, Docker, Kubernetes), ETL tools (e.g., AWS Glue, Apache Airflow), and messaging systems (e.g., Kafka, RabbitMQ) is highly desirable.
This position is open to all candidates.