Senior Data Engineer (Customer)
Responsibilities:
Design and build data solutions that support our Credit Card and Servicing business goals.
Develop advanced data pipelines to support infrastructure, architecture, and product growth initiatives.
Create ETL/ELT processes and SQL queries to move data into the data warehouse and other data stores.
Own and evolve data lake pipelines, including maintenance, schema management, and ongoing improvements.
Collaborate with stakeholders across Product, Backend Engineering, and Data Science to align technical work with business outcomes.
Implement new tools and modern development approaches that improve both scalability and business agility.
Ensure adherence to coding best practices and development of reusable code.
Continuously monitor the data platform and recommend enhancements to architecture, performance, and cost efficiency.
Requirements:
4+ years of experience as a Data Engineer.
4+ years of Python and SQL experience.
4+ years of experience in data modeling and building scalable ELT/ETL pipelines on leading data warehouses (Snowflake preferred; also Redshift, BigQuery).
3+ years of experience designing and managing automated data pipelines using Apache Airflow.
3+ years of experience developing scalable, production-grade data models with dbt.
Hands-on experience with cloud environments (AWS preferred) and big data technologies.
Strong troubleshooting and debugging skills in large-scale systems.
Proven experience packaging applications with Docker and using Argo Workflows to automate, execute, and monitor containerized task sequences.
Experience with design patterns and coding best practices.
Proficiency with Git and modern source control.
Basic Linux/Unix system administration skills.
Nice to Have:
BS/MS in Computer Science or related field.
Experience with NoSQL or other large-scale databases.
Experience with microservices architecture.
Familiarity with Airbyte or other modern ETL platforms.
Experience with Apache Spark or Apache Kafka and the broader Data Engineering ecosystem.
This position is open to all candidates.