We are seeking a highly skilled and experienced Solution Architect with a deep understanding of DataOps to join our team. In this role, you will be responsible for designing and implementing data-centric solutions that enable our customers to achieve their business objectives, with a strong focus on building efficient and reliable ETL pipelines. You will work closely with clients to understand their needs, translate them into technical requirements, and architect scalable, reliable, and secure data pipelines.
Key Responsibilities:
Collaborate with clients to gather requirements and understand their data challenges.
Design and implement end-to-end DataOps solutions, including data ingestion, processing, storage, and analysis (see the illustrative sketch after this list).
Leverage cloud technologies and best practices to architect scalable and cost-effective data architectures.
Ensure data quality, integrity, and security throughout the data lifecycle.
Automate data pipelines and workflows to improve efficiency and reduce manual effort.
Stay up-to-date with the latest DataOps trends and technologies.
Provide technical guidance and mentorship to other team members.
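To give candidates a concrete sense of the work, here is a minimal, hypothetical sketch (in Python) of the kind of ETL pipeline this role designs. The names used (Record, extract, transform, load) are illustrative assumptions only and do not reflect any specific production system.

from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class Record:
    user_id: int
    amount: float

def extract(raw_rows: Iterable[dict]) -> List[Record]:
    # Ingest raw rows (e.g. from a file or API response) into typed records.
    return [Record(user_id=int(r["user_id"]), amount=float(r["amount"])) for r in raw_rows]

def transform(records: List[Record]) -> List[Record]:
    # Apply a simple data-quality rule: drop records with non-positive amounts.
    return [r for r in records if r.amount > 0]

def load(records: List[Record]) -> None:
    # Stand-in for writing to a warehouse table (e.g. Snowflake).
    for r in records:
        print(f"loaded user={r.user_id} amount={r.amount:.2f}")

if __name__ == "__main__":
    raw = [{"user_id": "1", "amount": "19.99"}, {"user_id": "2", "amount": "-5"}]
    load(transform(extract(raw)))

In practice you would add orchestration, monitoring, and real source and warehouse connectors around this structure, but the extract/transform/load shape stays the same.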
Requirements:
Proven experience as a Solution Architect or in a similar role, with a strong focus on DataOps.
Deep understanding of data management principles, data modeling, and ETL processes.
Expertise in cloud technologies (AWS, Azure, GCP) and data and analytics platforms (Hadoop, Spark, Snowflake, Looker).
Strong knowledge of programming languages (Python, Java, Scala) and scripting languages (Bash, PowerShell).
Experience with DevOps tools and practices (CI/CD, containerization, orchestration).
Excellent communication and collaboration skills, with the ability to work effectively with both technical and non-technical stakeholders.
Strong problem-solving and analytical skills.
Bachelor's degree in Computer Science, Engineering, or a related field.
This position is open to all candidates.