Here at Lower, we believe homeownership is the key to building wealth, and we're making it easier and more accessible than ever. As a mission-driven fintech, we simplify the home-buying process through cutting-edge technology and a seamless customer experience.
With tens of billions in funded home loans and top ratings on Trustpilot (4.8), Google (4.9), and Zillow (4.9), we're a leader in the industry. But what truly sets us apart? Our people. Join us and be part of something bigger.
Job Description:
We are seeking a Senior Data Engineer to play a key role in building and optimizing our data infrastructure to support business insights. In this role, you will design and enhance denormalized analytics tables in Snowflake, build scalable ETL pipelines, and ensure data from diverse sources is transformed into accurate, reliable, and accessible formats. You will collaborate with business and sales stakeholders to gather requirements, partner with developers to ensure critical data is captured at the application level, and optimize existing frameworks for performance and integrity. This role also includes creating robust testing frameworks and documentation to ensure quality and consistency across data pipelines.
What you'll do:
Data Pipeline Engineering:
Design, develop, and optimize high-performance ETL/ELT pipelines using Python, dbt, and Snowflake.
Build and manage real-time ingestion pipelines leveraging AWS Lambda and change data capture (CDC) systems.
Cloud & Infrastructure:
Develop scalable serverless solutions with AWS, adopting event-driven architecture patterns.
Manage containerized applications using Docker and infrastructure as code via GitHub Actions.
Advanced Data Management:
Create sophisticated, multi-layered Snowflake data models optimized for scalability, flexibility, and performance.
Integrate and manage APIs for Salesforce, Braze, and various financial systems, emphasizing robust error handling and reliability.
Quality Assurance & Operations:
Implement robust testing frameworks, data lineage tracking, monitoring, and alerting.
Enhance and manage CI/CD pipelines, drive migration to modern orchestration tools (e.g., Dagster, Airflow), and manage multi-environment deployments.
Who you are:
5 years of data engineering experience, ideally with cloud-native architectures.
Expert-level Python skills, particularly with pandas, SQLAlchemy, and asynchronous processing.
Advanced SQL and Snowflake expertise, including stored procedures, external stages, performance tuning, and complex query optimization.
Strong proficiency with dbt, including macro development, testing, and automated deployments.
Production-grade pipeline experience, specifically with Lambda, S3, API Gateway, and IAM.
Proven experience with REST APIs, authentication patterns, and handling complex data integrations.
Preferred Experience
Background in financial services or fintech, particularly loan processing, customer onboarding, or compliance.
Experience with real-time streaming platforms like Kafka or Kinesis.
Familiarity with Infrastructure as Code tools (Terraform, CloudFormation).
Knowledge of BI and data visualization tools (Tableau, Looker, Domo).
Container orchestration experience (ECS, Kubernetes).
Understanding of data lake architectures and Delta Lake.
Technical Skills
Programming: Python (expert), SQL (expert), Bash scripting.
Cloud: AWS (Lambda, S3, API Gateway, CloudWatch, IAM).
Data Warehouse: Snowflake, dimensional modeling, query optimization.
ETL/ELT: dbt, pandas, custom Python workflows.
DevOps: GitHub Actions, Docker, automated testing.
APIs: REST integration, authentication, error handling.
Data Formats: JSON, CSV, Parquet, Avro.
Version Control: Git, GitHub workflows.
What Sets You Apart
Systems Thinking: You see the big picture, designing data flows that scale and adapt with the business.
Problem Solver: You quickly diagnose and resolve complex data issues across diverse systems and APIs.
Quality Advocate: You write comprehensive tests, enforce data quality standards, and proactively prevent data issues.
Collaborative: You thrive working alongside analysts, developers, and product teams, ensuring seamless integration and teamwork.
Continuous Learner: You actively seek emerging data technologies and best practices to drive innovation.
Business Impact: You understand how your data engineering decisions directly influence and drive business outcomes.
Benefits & Perks
Competitive salary and comprehensive benefits (healthcare, dental, vision, 401(k) match)
Hybrid work environment (primarily remote, with two days a week in downtown Columbus, Ohio)
Professional growth opportunities and internal promotion pathways
Collaborative, mission-driven culture recognized as a local and national best place to work
If you don't think you meet all of the criteria above but are still interested in the job, please apply. Nobody checks every box, and we're looking for someone excited to join the team.
Lower provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.
This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
Required Experience:
Senior IC
"Best mortgage ever." Technology and people work together to create a simple, connected experience, led by your very own expert. It's the perfect balance to make sure this decision is your best one ever.