Location: Onsite 3 days/week in Jersey City, NJ; no relocation
Start: ASAP
Interview Process: 2 rounds (1 virtual, 1 in person)
*When submitting, please make sure the resume is 3 pages or less (required by BBH); also include the candidate's LinkedIn profile with picture and full name.*
Must have:
- Apache Airflow / dbt
- Kubernetes
- OpenShift
- Python
- Strong communication skills, both written and verbal
- 8 years of experience
We are seeking a highly skilled Senior Data Engineer with 8 years of hands-on experience in enterprise data engineering, including deep expertise in Apache Airflow DAG development, dbt Core modeling and implementation, and cloud-native container platforms (Kubernetes / OpenShift).
This role is critical to building, operating, and optimizing scalable data pipelines that support financial and accounting platforms, including enterprise system migrations and high-volume data processing workloads.
The ideal candidate will have extensive hands-on experience in workflow orchestration, data modeling, performance tuning, and distributed workload management in containerized environments.
Key Responsibilities:
Data Pipeline & Orchestration
- Design, develop, and maintain complex Airflow DAGs for batch and event-driven data pipelines (a minimal DAG sketch follows this list)
- Implement best practices for DAG performance, dependency management, retries, SLA monitoring, and alerting
- Optimize Airflow scheduler, executor, and worker configurations for high-concurrency workloads
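For illustration only, here is a minimal sketch of the kind of Airflow DAG this work involves, assuming a recent Airflow 2.x installation; the DAG id, schedule, and callables are hypothetical placeholders rather than details of this role.

```python
# Minimal illustrative sketch: an Airflow DAG with retries, a task SLA,
# and a simple dependency chain. All names here are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

default_args = {
    "owner": "data-engineering",
    "retries": 3,                      # retry transient failures
    "retry_delay": timedelta(minutes=5),
    "sla": timedelta(hours=1),         # flag task runs that exceed 1 hour
}

def extract(**context):
    # placeholder: pull a batch of records from a source system
    pass

def transform(**context):
    # placeholder: apply business transformations
    pass

def load(**context):
    # placeholder: write results to the warehouse
    pass

with DAG(
    dag_id="example_batch_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load   # explicit dependency management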
dbt Core & Data Modeling
- Lead dbt Core implementation, including project structure, environments, and CI/CD integration (see the orchestration sketch after this list)
- Design and maintain robust dbt models (staging, intermediate, marts) following analytics engineering best practices
- Implement dbt tests, documentation, macros, and incremental models to ensure data quality and performance
- Optimize dbt query performance for large-scale datasets and downstream reporting needs
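As a hedged illustration of one common integration pattern (not necessarily the one used here), dbt Core runs can be orchestrated from Airflow; the project path and target below are hypothetical, and dbt is assumed to be installed in the Airflow worker environment.

```python
# Illustrative sketch only: running and testing a dbt Core project from Airflow.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/airflow/dbt/analytics_project"   # hypothetical project path

with DAG(
    dag_id="example_dbt_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"dbt run --project-dir {DBT_DIR} --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"dbt test --project-dir {DBT_DIR} --target prod",
    )
    dbt_run >> dbt_test   # build models, then validate them with dbt tests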
Cloud (Kubernetes & OpenShift)
- Deploy and manage data workloads on Kubernetes / OpenShift platforms
- Design strategies for workload distribution, horizontal scaling, and resource optimization
- Configure CPU/memory requests and limits, autoscaling, and pod scheduling for data workloads (illustrated in the sketch after this list)
- Troubleshoot container-level performance issues and resource contention
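For illustration, a minimal sketch of setting CPU/memory requests and limits for a containerized data task launched from Airflow onto Kubernetes/OpenShift. The image, namespace, and command are hypothetical, and the operator import path varies with the cncf-kubernetes provider version.

```python
# Illustrative sketch: a pod-based data task with explicit resource
# requests and limits. All names and values are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator
from kubernetes.client import models as k8s

resources = k8s.V1ResourceRequirements(
    requests={"cpu": "500m", "memory": "1Gi"},   # guaranteed baseline
    limits={"cpu": "2", "memory": "4Gi"},        # hard ceiling per pod
)

with DAG(
    dag_id="example_k8s_workload",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    process_batch = KubernetesPodOperator(
        task_id="process_batch",
        name="process-batch",
        namespace="data-pipelines",                    # hypothetical namespace
        image="registry.example.com/etl-job:latest",   # hypothetical image
        cmds=["python", "process.py"],
        container_resources=resources,
        get_logs=True,
    )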
Performance & Reliability
- Monitor and tune end-to-end pipeline performance across Airflow, dbt, and data platforms
- Identify bottlenecks in query execution, orchestration, and infrastructure
- Implement observability solutions (logs, metrics, alerts) for proactive issue detection (see the callback sketch after this list)
- Ensure high availability, fault tolerance, and resiliency of data pipelines
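As one hedged example of proactive detection, Airflow supports failure and SLA-miss callbacks; the notification function below is a hypothetical stand-in for a real alerting integration (email, chat webhook, paging tool).

```python
# Illustrative sketch: lightweight alerting via Airflow callbacks.
import logging
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

log = logging.getLogger(__name__)

def notify_failure(context):
    # Invoked when a task fails; context carries the task instance and exception.
    ti = context["task_instance"]
    log.error("Task %s.%s failed: %s", ti.dag_id, ti.task_id, context.get("exception"))

def notify_sla_miss(dag, task_list, blocking_task_list, slas, blocking_tis):
    # Invoked when one or more tasks in the DAG miss their SLA.
    log.warning("SLA missed in %s for tasks: %s", dag.dag_id, task_list)

with DAG(
    dag_id="example_monitored_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
    sla_miss_callback=notify_sla_miss,
    default_args={
        "on_failure_callback": notify_failure,
        "sla": timedelta(minutes=30),
    },
) as dag:
    PythonOperator(task_id="critical_step", python_callable=lambda: None)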
Collaboration & Governance
- Work closely with data architects, platform engineers, and business stakeholders
- Support financial reporting, accounting, and regulatory data use cases
- Enforce data engineering standards, security best practices, and governance policies