What Is the Role
We are seeking a Junior Data Engineer to join our Data Engineering & Architecture team. In this role you will contribute to our mission of building a world-class Data Platform. Your work will directly impact our strategy by making it easier for internal customers to leverage data as a strategic asset. This position offers an excellent opportunity to help craft critical initiatives that support real business outcomes.
What You Will Be Doing
- Assist in developing, implementing, and optimizing data pipelines using Apache Airflow for workflow orchestration.
- Develop and maintain ETL/ELT processes using Apache Spark and dbt.
- Collaborate with senior engineers to implement real-time data streaming solutions using Apache Kafka and Apache Flink.
- Learn to use Terraform for infrastructure-as-code to manage our Google Cloud Platform (GCP) resources.
- Document best practices and guide users in adopting them on our platform.
- Monitor and audit the data platform to ensure policies and procedures are followed.
- Participate in code reviews to improve code quality and verify that standard processes are followed.
- Process access requests in accordance with platform policies and procedures.
What You Bring
- 1 year of experience or significant internships in data-related projects.
- Demonstrated interest in data engineering through personal projects, coursework, or contributions to open-source projects.
- Strong analytical and problem-solving skills.
- Excellent verbal and written communication skills.
- Eagerness to learn and stay updated with the latest data engineering trends.
- Proficiency in programming languages, particularly Bash, Python, and SQL.
- Knowledge of database systems, including both SQL and NoSQL databases.
- General understanding of data pipelines, ETL/ELT processes, data modeling, and data warehousing concepts.
- Basic knowledge of Git and DevOps practices.
- Bonus: Familiarity with data processing frameworks such as Kafka, Flink, Spark, and Iceberg.
#LI-DS1