- Education: Bachelor's or Master's in Computer Science, Applied Mathematics, Engineering, or a related quantitative field.
- Experience: Minimum of 3-5 years of professional hands-on-keyboard coding experience in a collaborative, team-based environment. Ability to troubleshoot SQL; basic scripting experience.
- Languages: Professional proficiency in Python or Java.
- Methodology: Deep familiarity with the full Software Development Life Cycle (SDLC), CI/CD best practices, and Kubernetes (K8s) deployment experience.
2. Core Data Engineering Competencies:
- Candidates must demonstrate a sophisticated understanding of the following modeling concepts to ensure data correctness during reconciliation:
- Temporal Data Modeling: Managing state changes over time (e.g., SCD Type 2).
- Schema Management: Expertise in schema evolution (ref: Apache Iceberg) and enforcement strategies.
- Performance Optimization: Advanced knowledge of data partitioning and clustering.
- Architectural Theory: Balancing normalization vs. denormalization and the strategic use of natural vs. surrogate keys.
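To make the temporal-modeling competency above concrete, here is a minimal SCD Type 2 sketch in plain Python (all names and the record layout are hypothetical, chosen only for illustration): when a tracked attribute changes, the current version of the row is closed out and a new version is appended, preserving full history.

```python
from datetime import date

def scd2_upsert(history, key, new_attrs, as_of):
    """Slowly Changing Dimension Type 2 update: close the open
    version for `key` (if its attributes changed) and append a
    new version effective from `as_of`."""
    for row in history:
        if row["key"] == key and row["end_date"] is None:
            if row["attrs"] == new_attrs:
                return history  # no change, nothing to version
            row["end_date"] = as_of  # close out the old version
            break
    history.append({"key": key, "attrs": new_attrs,
                    "start_date": as_of, "end_date": None})
    return history

# Usage: a customer moves city; the old row is closed, a new one opened.
hist = []
scd2_upsert(hist, "cust-1", {"city": "London"}, date(2023, 1, 1))
scd2_upsert(hist, "cust-1", {"city": "Paris"}, date(2024, 6, 1))
```

After the second call, `hist` holds two versions of `cust-1`: the London row with an `end_date` of 2024-06-01 and an open Paris row, which is exactly the state-over-time behavior the reconciliation work described above depends on.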
Stack Requirements:
- While candidates are not expected to be experts in every tool, the collective team must cover the following technologies:
- Extraction & Logic: Kafka, ANSI SQL, FTP, Apache Spark
- Data Formats: JSON, Avro, Parquet
- Platforms: Hadoop (HDFS/Hive), Snowflake, Apache Iceberg, Sybase IQ
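The partitioning and clustering knowledge called out above can be sketched with a toy hash-partitioner in Python. This is only an illustration of the idea (the bucket count and key names are made up, and real engines such as Iceberg's `bucket()` transform use Murmur3 rather than MD5): records with the same key always land in the same bucket, so scans and joins can prune the other partitions.

```python
import hashlib

def bucket(key: str, num_buckets: int = 4) -> int:
    """Deterministically map a record key to a partition bucket.
    Toy version of hash-bucket partitioning; not any engine's
    actual transform."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_buckets

# Same key, same bucket -- the property partition pruning relies on.
assert bucket("order-123") == bucket("order-123")
```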