Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes.
- Use SQL and relational databases to extract, transform, and load data from various sources.
- Work with AWS services such as Step Functions, Lambda, S3, CloudWatch, EventBridge, Secrets Manager, SNS, SES, ECR, EKS, MWAA (Airflow), and CodeArtifact to build scalable, reliable data solutions.
- Collaborate with cross-functional teams to gather requirements and design data solutions that meet business needs.
- Develop and implement best practices for data management, including data quality, data governance, and data security.
- Optimize performance and efficiency of data pipelines and ETL processes.
- Monitor data pipelines and troubleshoot issues as needed.
- Document technical designs, processes, and procedures.
- Stay updated on industry trends and emerging technologies in data engineering.
Requirements:
- Four-year college degree in computer science or data engineering, or commensurate work experience.
- 3 years of experience using Snowflake (or an equivalent SQL-based tool).
- Expertise in SQL and relational databases.
- Knowledge of and experience with the AWS services listed above.
- Proficiency in ETL development and in data manipulation languages such as Python.
- Experience with dbt Core or dbt Cloud is a plus.
- Strong communication (verbal and written) and interpersonal skills.
- Ability to work independently and collaboratively in a fast-paced environment.
- Problem-solving mindset with attention to detail.