Job Title: AWS Data Architect
Location: Fort Mill, SC / New York / New Jersey (Hybrid)
Role Overview:
Incedo is seeking a seasoned Backbridge Solution Lead / Architect to drive the design and delivery of a critical data integration and processing platform leveraging AWS-native services. This solution will serve as a strategic data bridge across key enterprise domains (such as transactions, positions, and commissions), supporting broader data modernization and platform consolidation goals.
The ideal candidate will bring deep experience in AWS-based data lake and streaming architectures, with the ability to operate across business and technical stakeholders to ensure scalable, secure, and compliant data solutions that integrate seamlessly with the enterprise data ecosystem.
Key Responsibilities:
- Lead the architecture and delivery of the Backbridge solution, serving as a near-real-time data integration layer across systems
- Collaborate with enterprise data architects, domain leads, and product owners to align on business rules, governance, and platform integration
- Design and build data pipelines leveraging AWS Glue, Lambda, MSK (Kafka), and Step Functions for both batch and streaming scenarios
- Develop API-based integration services using API Gateway and Lambda to connect with upstream and downstream systems
- Ensure role-based access control, data encryption, auditing, and compliance using IAM, Secrets Manager, and Lake Formation
- Oversee orchestration of workflows using Glue Workflows and Step Functions
- Support CI/CD automation using Terraform and GitHub Actions
- Promote best practices in performance optimization, error handling, and observability using CloudWatch and CloudTrail
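To give candidates a concrete flavor of the near-real-time integration work described above, here is a minimal, hypothetical sketch of a Lambda handler wired to an Amazon MSK event source mapping. The field names (`acct`, `sym`, `qty`) and the canonical output schema are illustrative, not part of the actual platform; the event shape follows AWS's documented Lambda/MSK integration, where records arrive grouped by topic-partition with base64-encoded values.

```python
import base64
import json


def normalize(record: dict) -> dict:
    """Map a raw transaction event onto a canonical schema (hypothetical fields)."""
    return {
        "account_id": record["acct"],
        "symbol": record["sym"],
        "quantity": float(record["qty"]),
        "source": record.get("src", "unknown"),
    }


def handler(event, context):
    """Lambda entry point for an MSK event source mapping.

    MSK delivers records under event["records"], keyed by topic-partition,
    with each record's payload base64-encoded in its "value" field.
    """
    out = []
    for records in event.get("records", {}).values():
        for r in records:
            payload = json.loads(base64.b64decode(r["value"]))
            out.append(normalize(payload))
    # In a real deployment these would be written to S3/Iceberg or a
    # downstream topic; returning them keeps the sketch self-contained.
    return {"normalized": out, "count": len(out)}
```

In practice the handler body would hand normalized records to a writer (S3, a downstream Kafka topic, or a Glue-managed table) rather than return them.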
- Act as a key advisor in technical decision-making for future-state data architecture
Required Skills & Experience:
Cloud & AWS Technologies:
- Proficiency in AWS services: S3, Glue, Lambda, Step Functions, CloudWatch, CloudTrail, IAM, Lake Formation, API Gateway, Secrets Manager, and Parameter Store
- Experience working with Amazon MSK (Kafka) for streaming ingestion and distribution
- Familiarity with Athena, Redshift Spectrum, and QuickSight for reporting use cases
Data Lake & Processing Frameworks:
- Deep knowledge of Amazon S3, Apache Iceberg, AWS Glue Data Catalog, and Lake Formation for data governance
- Prior experience with Snowflake or Databricks as part of a cloud-native data strategy
- Strong ETL development skills using Python, PySpark, and Glue Jobs
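As an illustration of the kind of ETL logic this role involves, a minimal sketch of an upsert-style deduplication step, keeping the most recent record per business key, is shown below. This mirrors the semantics a Glue job would typically push into an Iceberg table via MERGE INTO; the field names (`account_id`, `symbol`, `as_of`) are assumed for the example.

```python
from typing import Iterable


def latest_positions(records: Iterable[dict]) -> list[dict]:
    """Keep only the most recent record per (account_id, symbol) key.

    Comparable to MERGE INTO upsert semantics against an Iceberg table:
    later as_of timestamps replace earlier ones for the same key.
    Field names are illustrative, not from an actual schema.
    """
    latest: dict[tuple, dict] = {}
    for rec in records:
        key = (rec["account_id"], rec["symbol"])
        current = latest.get(key)
        if current is None or rec["as_of"] > current["as_of"]:
            latest[key] = rec
    return list(latest.values())
```

In a PySpark Glue job the same logic would be expressed with a window function (`row_number()` over a key partition ordered by `as_of` descending) rather than a Python dict.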
Orchestration & Integration:
- Experience in orchestrating batch/streaming jobs with Glue Workflows and Step Functions
- Expertise in building and managing RESTful APIs and secured endpoints via API Gateway
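For the API integration work, a minimal, hypothetical sketch of a Lambda handler behind API Gateway (Lambda proxy integration) serving a `GET /positions/{account_id}` route is shown below. The in-memory `POSITIONS` table is a stand-in for a query against the platform's data store; the route and response schema are assumptions for illustration.

```python
import json

# Hypothetical in-memory stand-in for a query against the bridge's data store.
POSITIONS = {
    "A1": [{"symbol": "IBM", "quantity": 100.0}],
}


def handler(event, context):
    """API Gateway Lambda proxy handler for GET /positions/{account_id}.

    The return value follows the proxy-integration response format:
    statusCode, optional headers, and a JSON-encoded body string.
    """
    account_id = (event.get("pathParameters") or {}).get("account_id")
    if account_id not in POSITIONS:
        return {"statusCode": 404, "body": json.dumps({"error": "account not found"})}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(POSITIONS[account_id]),
    }
```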
DevOps & CI/CD:
- Hands-on experience with Terraform, GitHub Actions, and version-controlled infrastructure
- Exposure to structured delivery frameworks, and familiarity with secure SDLC and regulatory-compliant DevOps practices
BI and Analytics:
- Understanding of Athena and QuickSight to support ad-hoc queries and reporting capabilities for operational teams
Preferred Qualifications:
- Experience working in the financial services industry or a highly regulated environment
- Exposure to Broker-Dealer and Wealth Management (WM) data domains (e.g., Book of Records, positions, transactions, accounts, commissions)
- AWS Certifications (e.g., Solutions Architect, Data Analytics Specialty)
- Strong communication and leadership skills to work across business and technical teams