We are looking for a data scientist with proven hands-on skills to join the team and build high-performing, scalable applications that turn time-series telemetry into data products. You will collaborate with great people from multiple disciplines to create innovative digital products and integrate them within the Sensing-as-a-Service cloud platform.
Responsibilities:
Rapidly experiment and build models to clean signals and provide valuable insights to our customers.
Design algorithms and SW solutions, considering large scale, high availability, security, robustness, performance, and cloud aspects.
Work hands-on with the data using SQL, PySpark, and other tools to extract, analyze, and visualize data.
Understand and oversee all phases of the development life cycle, such as automation, CI/CD, TDD, integrations, builds, and deployment.
Work with cross-functional stakeholders such as product management, marketing, and SW engineering.
Make strategic recommendations on data collection, integration, and retention requirements incorporating business best practices.
Requirements:
BSc in Computer Science, Electrical Engineering, or an equivalent field; an advanced degree is an advantage.
Strong background in statistics
Experience with AI/ML/DL methods
At least 3 years of development experience; experience with Python is an advantage.
Experience with cloud data environments, big-data ETL processes, and cloud architecture (AWS/GCP/Azure).
Experience with IoT is an advantage.
Familiarity with some ML/data-processing frameworks and libraries (TensorFlow, Keras, PyTorch, JAX, Spark, Scikit-learn, Pandas)
Strong self-learning skills and the ability to work independently.
Strong spoken and written English
Strong interpersonal and communication skills.
This position is open to all candidates.