Job Title: Senior Data Engineer (Banking)
Location: New York City, NY
Position type: W2 contract
Mandatory skills: Data engineering, Snowflake, Databricks, Python, PySpark, SQL, Banking
Key Responsibilities:
- Design and develop robust ETL/ELT pipelines using Snowflake, Databricks, Python, PySpark, and SQL.
- Build and optimize data warehouses, data marts, and real-time data solutions for banking applications.
- Collaborate with quantitative analysts, data scientists, and business stakeholders to deliver actionable data products.
- Implement data governance, quality checks, and monitoring frameworks aligned with banking regulations (SOX, GDPR, CCAR).
- Mentor junior engineers and contribute to architectural decisions and best practices.
Required skills and experience:
- 9 years of hands-on data engineering experience, with 3 years in banking/financial services.
- Expertise in Snowflake (SnowSQL, performance tuning, security) and Databricks (Delta Lake, Spark optimization).
- Proficiency in Python and PySpark for large-scale data processing.
- Advanced SQL skills for complex data modeling and query optimization.
- Experience with cloud platforms (AWS/Azure) and CI/CD tools (Jenkins, Git).
- Strong understanding of banking data domains: trading, risk, compliance, customer transactions.
Certifications and additional skills:
- SnowPro, Databricks Certified, AWS/Azure Cloud certifications.
- Knowledge of real-time streaming (Kafka, Spark Streaming).
- Experience with data orchestration tools (Airflow, Dagster).
- Familiarity with BI/visualization tools (Tableau, Power BI).