Role: Software Development Senior Specialist
Location: Dallas, TX (Onsite)
Type: Contract
Day-to-Day Job Duties:
- The engineer will join the datastore-migration Factory team, responsible for the end-to-end migration of datastores from an on-prem data lake to an AWS-hosted Lakehouse. This is a high-visibility, critical project.
- Refactoring and migrating extraction logic and job scheduling from legacy frameworks to the new Lakehouse environment.
- Executing the physical migration of underlying datasets while ensuring data integrity.
- Acting as a technical liaison to internal clients, facilitating handoff and sign-off conversations with data owners.
- Translating and optimizing legacy SQL and Spark-based consumption patterns for compatibility with Snowflake and Iceberg.
- Understanding usage patterns to deliver the required data products.
- Working on data reconciliation frameworks to ensure migrated data is functionally equivalent to production data (a minimal sketch of one such check follows this list).
- Working with internal data management platforms and learning new workflows and language constructs as necessary.
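As a rough illustration of the reconciliation work described above, the PySpark sketch below compares a legacy table against its migrated copy using a row-count gate and an order-independent content fingerprint. The table names (legacy_hive.trades, lakehouse.trades) and the xxhash64 fingerprint strategy are assumptions for illustration only, not the team's actual framework.

```python
# Illustrative sketch only -- table names and the fingerprint strategy
# are hypothetical, not a prescribed framework.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("migration-recon").getOrCreate()

source = spark.table("legacy_hive.trades")  # hypothetical legacy table
target = spark.table("lakehouse.trades")    # hypothetical migrated table

# Cheapest first gate: row-count parity.
if source.count() != target.count():
    raise ValueError("row counts diverge")

def fingerprint(df):
    """Hash every row, then sum the hashes into one order-independent value."""
    return (df.select(F.xxhash64(*df.columns).alias("h"))
              .agg(F.sum(F.col("h").cast("decimal(38,0)")).alias("fp"))
              .collect()[0]["fp"])

# Equal fingerprints give strong (though not absolute) evidence of equivalence.
if fingerprint(source) != fingerprint(target):
    raise ValueError("content fingerprints diverge")
```

Summing per-row hashes makes the fingerprint insensitive to row order, which matters because migrated tables rarely preserve the physical ordering of the source.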
Core Data Engineering Competencies:
- Temporal Data Modeling: Managing state changes over time (e.g., SCD Type 2; a minimal sketch follows these bullets).
- Schema Management: Expertise in schema evolution (Apache Iceberg) and enforcement strategies.
- Performance Optimization: Knowledge of data partitioning and clustering.
- Architectural Theory: Balancing normalization vs. denormalization and natural vs. surrogate keys.
- Extraction & Logic: Kafka, ANSI SQL, FTP, Apache Spark
- Data Formats: JSON, Avro, Parquet
- Platforms: Hadoop (HDFS/Hive), Snowflake, Apache Iceberg, Sybase IQ
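To make the temporal-modeling competency concrete, here is a minimal two-step SCD Type 2 sketch, assuming Spark SQL over Iceberg tables; the dim_customer and customer_updates tables and the tracked address column are hypothetical.

```python
# Illustrative sketch only -- one common SCD Type 2 pattern on an Iceberg
# table via Spark SQL. All table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("scd2-sketch").getOrCreate()

# Step 1: close the current row for any customer whose tracked attribute changed.
spark.sql("""
    MERGE INTO lakehouse.dim_customer t
    USING staging.customer_updates u
      ON t.customer_id = u.customer_id AND t.is_current = true
    WHEN MATCHED AND t.address <> u.address THEN
      UPDATE SET is_current = false, valid_to = current_timestamp()
""")

# Step 2: insert a fresh current row for every key that now lacks one
# (newly arrived customers plus the customers just closed in step 1).
spark.sql("""
    INSERT INTO lakehouse.dim_customer
    SELECT u.customer_id,
           u.address,
           current_timestamp() AS valid_from,
           CAST(NULL AS TIMESTAMP) AS valid_to,
           true AS is_current
    FROM staging.customer_updates u
    LEFT JOIN lakehouse.dim_customer t
      ON t.customer_id = u.customer_id AND t.is_current = true
    WHERE t.customer_id IS NULL
""")
```

The two-pass shape (close old rows, then insert fresh current rows) keeps each statement simple; the anti-join in step 2 picks up both brand-new keys and the keys whose current rows were just closed.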
Professional Attributes:
- Demonstrates strong integrity and ethical decision-making.
- Acts as a trusted team player, collaborating across teams.
- Communicates with clarity and confidence.
- Works effectively with global teams.
- Delivery-focused with strong ownership.
- Brings high energy and urgency with professionalism.
- Shows intellectual curiosity and a drive for continuous improvement.
Basic Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Applied Mathematics, Engineering, or a related field.
- Experience: Minimum of 3-5 years of hands-on coding experience; ability to troubleshoot SQL and basic scripting.
- 3 years of professional proficiency in Python or Java.
- 3 years of experience with SDLC and CI/CD best practices, including Kubernetes (K8s) deployments.
Nice to Have:
- Migration experience
- Financial services domain
- Lakehouse experience