Job Title: AWS Data Architect
Location: Columbia, SC
Work Model: Onsite
Job Type: Long Term Contract
Experience: Min. 14 Years
Role Overview
We are seeking a highly experienced, hands-on AWS Data Architect to lead the design and execution of our cloud data ecosystem. The primary focus is the end-to-end migration and modernization of the legacy on-premises Microsoft stack (SQL Server EDW, SSIS, and multi-platform reporting) to a cloud-native AWS architecture.
Core Roles & Responsibilities
1. Architectural Strategy & Migration Design
Design and evolve a modern Lakehouse / Data Mesh architecture using Amazon S3, AWS Glue, and Amazon Redshift
Lead the migration of the MS SQL Server EDW to AWS, ensuring performance and data integrity
Modernize pipelines by refactoring SSIS packages into AWS Glue, Step Functions, or MWAA (Airflow)
Drive BI/report modernization (SSRS, Crystal Reports, Power BI, Tableau, Hyperion to Amazon QuickSight)
Implement governance, scalability, and compute optimization (Athena/Lambda vs. EMR/MSK)
2. Hands-on Engineering & Implementation
Convert legacy SSIS ETL logic into Python / Spark (Glue / EMR)
Perform database migration using AWS DMS & SCT
Build real-time streaming solutions (Amazon Kinesis / MSK)
Automate infrastructure via Terraform / AWS CDK / CloudFormation
3. Optimization, Security & Compliance
Tune Amazon Redshift (distribution styles, sort keys, query performance)
Optimize AWS costs (S3 lifecycle, Glue job efficiency)
Implement security & governance (IAM, Lake Formation, KMS, Secrets Manager)
Technical Skills & Experience Requirements
Mandatory AWS Expertise
Migration / Storage: AWS DMS, SCT, Amazon S3
Processing & Analytics: AWS Glue, EMR, Lambda, Redshift (RA3), Athena
Data Stores: DynamoDB, Aurora (PostgreSQL/MySQL), Neptune
Messaging / Orchestration: Kinesis, MSK, SQS, Step Functions, MWAA (Airflow)
Legacy & General Skills
Deep expertise in SQL Server, SSIS, SSRS
Experience with Crystal Reports, Power BI, Tableau, Hyperion
Advanced Python & SQL (T-SQL / Spark SQL)
DevOps / CI-CD (Git, Jenkins/GitLab/CodePipeline)
Data Formats
Strong understanding of Parquet, Avro, and Delta Lake
Required Skills:
Tableau, Spark SQL, GitLab, SQL Server, Spark, AWS, SQL, PostgreSQL, T-SQL, SSIS, DevOps, AWS S3, Git, Power BI, MySQL, Python