Who we are:
Shape a brighter financial future with us.
Together with our members, we’re changing the way people think about and interact with personal finance.
We’re a next-generation financial services company and national bank using innovative, mobile-first technology to help our millions of members reach their goals. The industry is going through an unprecedented transformation, and we’re at the forefront. We’re proud to come to work every day knowing that what we do has a direct impact on people’s lives, with our core values guiding us every step of the way. Join us to invest in yourself, your career, and the financial world.
The role
As a Risk Data Engineer, you will be a pivotal member of the data team, leveraging extensive knowledge and skills to design and implement highly scalable, robust data solutions that drive significant functional impact and contribute directly to company goals. This role demands deep expertise in data modeling, alongside proficiency in Python, dbt, Apache Airflow, and Snowflake, applied creatively and effectively to resolve complex data issues. You will independently determine methods to solve most problems, take ownership of managing identified risks within your projects, and influence data strategy within the team. You will also provide technical leadership and mentorship to junior engineers, fostering a culture of excellence and continuous improvement.
What you'll do:
- Partner with data engineers and business stakeholders to understand complex data requirements, and independently lead the delivery of comprehensive data solutions with significant functional impact.
- Lead the design and implementation of advanced, robust data models for complex analytical and operational needs, ensuring optimal performance, scalability, and adherence to best practices.
- Lead the development and maintenance of sophisticated ETL/ELT pipelines using dbt, Python, and Apache Airflow, addressing high-exposure risks and ensuring data integrity.
- Proactively develop and optimize intricate data workflows and pipelines on Snowflake, pushing the boundaries of performance and advanced features.
- Take ownership of the design and maintenance of a modern data platform, contributing significantly to its strategic direction.
- Drive the implementation of advanced and robust data quality processes, handling most escalations related to data accuracy, reliability, and compliance.
- Lead code reviews, provide expert feedback, and champion data engineering best practices across the team and department.
- Proactively troubleshoot and resolve highly complex data pipeline issues, proposing solutions that avoid unintended cross-functional impact.
- Drive the adoption of team tools and frameworks, simplifying processes and reducing complexity.
- Provide technical leadership and mentorship to junior data engineers, guiding them in best practices, complex problem-solving, and career development.
What you’ll need:
- 5+ years of advanced experience in data engineering, with deep and demonstrated expertise in complex data modeling and end-to-end pipeline development.
- Expert proficiency in Python for complex data processing, automation, and building scalable data applications.
- Extensive hands-on experience with dbt for advanced data transformation, modeling, and managing complex data dependencies.
- Expertise in Apache Airflow for designing, orchestrating, and optimizing highly complex and critical data workflows.
- Deep knowledge of Snowflake, including advanced features, cost optimization, and architecting solutions for large-scale data processing.
- Advanced proficiency in SQL for complex querying, data manipulation, and performance tuning.
- Expert-level understanding and practical experience with implementing data governance principles, data cataloging, metadata management, and advanced data quality tools.
- Experience with data observability tools and practices to monitor data pipelines, quality, and lineage.
- Proven ability to provide technical leadership and mentor junior team members.
Nice to have:
- Experience in implementing CI/CD pipelines for data engineering workflows.
- Familiarity with data cataloging and metadata tools.
- Knowledge of cloud-native data platforms and tools.
- Experience with Large Language Models (LLMs), including data preparation, fine-tuning, or operationalizing LLM-based applications.
- Exceptional problem-solving and analytical skills, with meticulous attention to detail and the ability to resolve complex issues in creative and effective ways.
- Excellent communication and collaboration skills, with the ability to influence peers and key stakeholders to gain buy-in.
- Proactive mindset and eagerness to learn and adapt to new technologies, independently determining methods to solve most problems.
- Ability to distill complex concepts, facilitate dialogue, and understand key objectives and strategic priorities to influence team and department strategy.
- Strong ability to adapt to shifting priorities, proactively thinking through and communicating downstream implications to partners.
- Approachable and accessible while supporting and contributing to other team members' work, nurturing positive working relationships within an established network and developing relationships with functional leadership.