Senior Data Engineer with CI/CD - Finance (Regulatory)
New York, NY (onsite from day 1; hybrid, 3 days per week in office).
Position type: W2 contract
Job Description:
We are seeking a skilled Data Engineer with expertise in Databricks, Snowflake, Python, PySpark, SQL, and Release Management to join our dynamic team. The ideal candidate will have a strong background in the banking domain and will be responsible for designing, developing, and maintaining robust data pipelines and systems to support our banking operations and analytics.
Responsibilities:
- Design, develop, and maintain scalable and efficient data pipelines using Snowflake, PySpark, and SQL.
- Write optimized, complex SQL queries to extract, transform, and load data.
- Develop and implement data models, schemas, and architecture that support banking domain requirements.
- Collaborate with data analysts, data scientists, and business stakeholders to gather data requirements.
- Automate data workflows and ensure data quality, accuracy, and integrity.
- Manage and coordinate release processes for data pipelines and analytics solutions.
- Monitor, troubleshoot, and optimize the performance of data systems.
- Ensure compliance with data governance, security, and privacy standards within the banking domain.
- Maintain documentation of data architecture, pipelines, and processes.
- Stay up to date with the latest industry trends and incorporate best practices.
Requirements:
- Proven experience as a Data Engineer or in a similar role, with a focus on Snowflake, Python, PySpark, and SQL.
- Strong understanding of data warehousing concepts and cloud data platforms, especially Snowflake.
- Hands-on experience with release management, deployment, and version control practices.
- Solid understanding of banking and financial services industry data and compliance requirements.
- Proficiency in Python scripting and PySpark for data processing and automation.
- Experience with ETL/ELT processes and tools.
- Knowledge of data governance, security, and privacy standards.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- Expertise in CI/CD practices and implementation.
- Strong financial background, with experience in regulatory and compliance requirements.
Preferred Qualifications:
- Good knowledge of Azure and Databricks is highly preferred.
- Knowledge of Apache Kafka or other streaming technologies.
- Familiarity with DevOps practices and CI/CD pipelines.
- Prior experience working in the banking or financial services industry.