DevOps Engineer
Boston, MA
Seeking an experienced DevOps Engineer to support our cloud data warehouse modernization initiative, migrating from a SQL Server/AWS-based system to a Snowflake-based data platform. The DevOps Engineer is responsible for developing, maintaining, and optimizing the data pipelines and integration processes that support analytics, reporting, and business operations. The DevOps Engineer will design and implement CI/CD pipelines, automate data pipeline deployments, and ensure operational reliability across Snowflake, Informatica, and Apache Airflow environments. This role works closely with our IT team supporting the Department of Mental Health and the Department of Public Health Hospitals.
DETAILED LIST OF JOB DUTIES AND RESPONSIBILITIES:
- Build and maintain CI/CD (Continuous Integration/Continuous Delivery) pipelines for Snowflake, Informatica (IICS), and Airflow DAG (Directed Acyclic Graph) deployments
- Implement automated code promotion between development, test, and production environments
- Integrate testing, linting, and security scanning into deployment processes
- Develop IaC (Infrastructure as Code) using Terraform or similar tools to manage Snowflake objects, network, and cloud resources
- Manage configuration and environment consistency across multi-region/multi-cloud setups
- Maintain secure connectivity between cloud and on-prem systems (VPNs, private links, firewalls)
- Implement logging and alerting for Airflow DAGs, Informatica workflows, and Snowflake performance (a minimal Airflow sketch follows this list)
- Develop proactive monitoring dashboards for job failures, data quality triggers, and warehouse usage
- Optimize pipeline performance, concurrency, and cost governance in Snowflake
- Own deployment frameworks for ETL/ELT code, SQL scripts, and metadata updates
- Support user access provisioning & RBAC alignment across Snowflake, Informatica, and Airflow
- Troubleshoot platform and orchestration issues; lead incident response during outages
- Enforce DevSecOps practices, including encryption, secrets management, and key rotation
- Implement audit logging, compliance, and backup/restore strategies aligned with governance requirements
- Participate in testing, deployment, and release management for new data workflows and enhancements.
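For illustration, a minimal sketch of the alerting pattern mentioned in the Airflow bullet above, assuming Airflow 2.x; the DAG id, task callables, and alert action are placeholders, not details from this posting:

```python
# Minimal Airflow DAG sketch with per-task failure alerting (assumes Airflow 2.4+
# for the `schedule` argument; older versions use `schedule_interval`).
# dag_id, task names, and the alert action are illustrative placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_on_failure(context):
    """Failure callback: here it only logs; a real setup might page or post to chat."""
    ti = context["task_instance"]
    print(f"ALERT: task {ti.task_id} in DAG {ti.dag_id} failed for run {context['ds']}")


def extract_to_stage():
    print("extracting...")  # placeholder: e.g. land files in a Snowflake stage


def load_to_snowflake():
    print("loading...")  # placeholder: e.g. COPY INTO a Snowflake table


with DAG(
    dag_id="example_elt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={
        "retries": 2,
        "retry_delay": timedelta(minutes=5),
        "on_failure_callback": notify_on_failure,  # alerting hook applied to every task
    },
) as dag:
    extract = PythonOperator(task_id="extract_to_stage", python_callable=extract_to_stage)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)

    extract >> load  # simple linear dependency
```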
Required Qualifications:
- 3-7 years in a DevOps, Cloud Engineering, or Data Platform Engineering role
- Snowflake (roles, warehouses, performance tuning, cost control)
- Apache Airflow (DAG orchestration, monitoring, deployments)
- Informatica IICS (pipeline deployment automation preferred)
- Strong CI/CD skills using GitLab, GitHub Actions, Azure DevOps, Jenkins, or similar
- Proficiency with Terraform, Python, and shell scripting (an illustrative Python sketch follows this list)
- Deep understanding of cloud platforms: AWS, Azure, or GCP
- Experience with containerization (Docker, Kubernetes), especially for Airflow
- Strong knowledge of networking concepts and security controls
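As a rough sketch of the Python and Snowflake items above (referenced in the scripting bullet), one way a promotion step might apply versioned SQL scripts with the `snowflake-connector-python` package; the connection values are placeholders read from the environment, and a production framework would add change tracking and rollback:

```python
# Sketch of a SQL-script promotion step using snowflake-connector-python.
# Connection values are placeholders; in practice credentials come from a
# secrets manager or key-pair auth, never from source control.
import os
from pathlib import Path

import snowflake.connector  # pip install snowflake-connector-python


def apply_sql_scripts(script_dir: str) -> None:
    """Apply each .sql file in sorted (versioned) order to the target environment."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role=os.environ.get("SNOWFLAKE_ROLE", "DEPLOY_ROLE"),         # assumed role name
        warehouse=os.environ.get("SNOWFLAKE_WAREHOUSE", "DEPLOY_WH"),  # assumed warehouse
    )
    try:
        for script in sorted(Path(script_dir).glob("*.sql")):
            print(f"applying {script.name}")
            conn.execute_string(script.read_text())  # handles multi-statement files
    finally:
        conn.close()


if __name__ == "__main__":
    # A CI job would point this at the scripts for the environment being promoted to.
    apply_sql_scripts("migrations/")
```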
Preferred Knowledge, Skills & Abilities:
- Experience migrating from SQL Server or other legacy DW platforms
- Knowledge of FinOps practices for Snowflake usage optimization (illustrated in the sketch after this list)
- Background in healthcare, finance, or regulated industries is a plus
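As a small illustration of the FinOps item above, a sketch that reports recent per-warehouse credit consumption from the standard SNOWFLAKE.ACCOUNT_USAGE share; the alert threshold and connection values are arbitrary examples:

```python
# Sketch: last-7-day credit usage per warehouse from SNOWFLAKE.ACCOUNT_USAGE.
# Requires a role granted access to the ACCOUNT_USAGE share; connection values
# are placeholders and the 10-credit threshold is an arbitrary example.
import os

import snowflake.connector

CREDIT_USAGE_SQL = """
    SELECT warehouse_name, SUM(credits_used) AS credits
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits DESC
"""


def report_credit_usage(threshold: float = 10.0) -> None:
    """Print weekly credits per warehouse, flagging heavy consumers for review."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    try:
        for warehouse, credits in conn.cursor().execute(CREDIT_USAGE_SQL):
            flag = "  <-- review" if credits > threshold else ""
            print(f"{warehouse}: {credits:.2f} credits{flag}")
    finally:
        conn.close()
```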
Education and Certification:
- Bachelor's degree, or equivalent years of experience, in Computer Science, Information Systems, Data Engineering, Health Informatics, or a related field.
| Skills | Yrs of experience | Project name | Rating |
| --- | --- | --- | --- |
| 3-7 years in a DevOps, Cloud Engineering, or Data Platform Engineering role | | | /10 |
| Snowflake (roles, warehouses, performance tuning, cost control) | | | /10 |
| Apache Airflow (DAG orchestration, monitoring, deployments) | | | /10 |
| Informatica IICS (pipeline deployment automation preferred) | | | /10 |
| Strong CI/CD skills using GitLab, GitHub Actions, Azure DevOps, Jenkins, or similar | | | /10 |
| Proficiency with Terraform, Python, and shell scripting | | | /10 |
| Deep understanding of cloud platforms: AWS, Azure, or GCP | | | /10 |
| Experience with containerization (Docker, Kubernetes), especially for Airflow | | | /10 |
| Strong knowledge of networking concepts and security controls | | | /10 |
| Experience migrating from SQL Server or other legacy DW platforms | | | /10 |
| Knowledge of FinOps practices for Snowflake usage optimization | | | /10 |