Job Title: Data Architect
Location: Dallas, TX - In-person interview after L1
Type: Contract
Job Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Hands-on experience with the Snowflake cloud data platform, including data ingestion, transformation, and orchestration.
- Strong background in building and maintaining data warehouse solutions on Snowflake.
- Proficiency in SQL, Python, or other programming languages for data processing and automation.
- Experience with ETL/ELT tools, data pipeline development, and Apache Airflow workflow management.
- Proficiency in real-time data processing (Spark Streaming, Flink, Kafka Streams).
- Experience with cloud data warehouses (Snowflake) and data lakes (Delta Lake, Iceberg).
- Familiarity with NoSQL databases (MongoDB, Cassandra) and key-value stores (Redis, DynamoDB) is highly desirable.
- Experience with batch and streaming pipelines (Kafka, Kinesis, Pub/Sub).
- Experience with Azure cloud platforms, Azure Event Hubs, and their integration with Snowflake.
- Understanding of marketing technologies, customer data platforms, and data integration challenges.
- Knowledge of data quality, data governance, and security practices in data engineering.
- Strong problem-solving skills and the ability to optimize data processes for performance and scalability.
- Good communication and teamwork skills to collaborate with data architects, analysts, and marketing teams.
- Relevant certifications (e.g., Snowflake, Azure, Cloud and Big Data) are a plus.