Role: SSIS Developer
Location: NJ/NY
About the job
We are seeking an SSIS Developer with experience designing and developing data pipelines using Azure Databricks, Data Factory, and Data Lake. The role involves managing large volumes of data, building complex ETL solutions, and working closely with business teams to deliver robust data transformation and analytics solutions.
Responsibilities:
- Design and develop ETL pipelines using SSIS for data transformation.
- Work with Azure stack components such as Data Lake and SQL DW to build robust data solutions.
- Write SQL, Python, and PySpark code for efficient data processing and transformation.
- Understand and translate business requirements into technical designs.
- Develop mapping documents and transformation rules as per project scope.
- Communicate project status to stakeholders, ensuring smooth project execution.
Requirements (must have):
- 10-12 years of experience in SSIS ETL for data ingestion, data processing, and analytical pipelines across big data and relational databases.
- Hands-on experience with Azure services: ADLS, Azure Databricks, Data Factory, Synapse, and Azure SQL DB.
- Experience with SQL, Python, and PySpark for data transformation and processing.
- Familiarity with DevOps and CI/CD deployments.
- Strong communication skills and attention to detail in high-pressure situations.
- Experience in the insurance or financial industry is preferred.
Regards,
Manoj
Derex Technologies INC
Contact: Ext 206
Additional Information:
All your information will be kept confidential according to EEO guidelines.
Remote Work: No
Employment Type: Full-time