Our Opportunity:
Chewy is looking for a Data Engineer I to join our growing Transportation Systems team. You will work side by side with hardworking, passionate, and motivated engineers, and you will gain hands-on experience in data engineering, working on projects that span building data pipelines, ETL processes, and data warehouse management. The ideal candidate has a strong interest in building and maintaining cloud databases, ingesting data through a variety of methods (including non-SQL technologies like SOAP and REST), and joining datasets from different cloud-based source systems in a centralized database. If you are equally passionate about supply chain and transportation management information systems, e-commerce, and career growth, an opportunity at Chewy may be a match!
What You'll Do:
- Develop and maintain data pipelines to extract, transform, and load (ETL) data from various sources into our data lake.
- Configure custom data pipelines within Snowflake/AWS/Databricks for ingestion
- Maintain real-time alerting and debugging tools
- Design and implement solutions on a cloud platform using Infrastructure as Code (Terraform)
- Maintain, support, and develop within the Supply Chain Transportation Data Mart Snowflake instance, including code build/review, auditing, performance tuning, and security.
- Create and maintain technical user documentation and models for the Data Mart
What You'll Need:
- Bachelor of Science or equivalent experience in Computer Science, Engineering, Information Systems, or a related field.
- Excellent verbal and written communication skills, and the ability to explain the details of complex concepts to non-expert partners in a simple and understandable way.
- Strong knowledge of SQL and relational databases
- Python programming skills for data processing and pipeline development
- Experience with ETL (Extract, Transform, Load) processes
- Basic knowledge of data warehousing concepts
- Familiarity with cloud platforms (especially AWS services like S3, Lambda, and Airflow)
- Version control experience (Git)
Bonus:
- Experience translating ambiguous customer requirements into clear problem definitions and delivering on them.
- Experience designing and delivering analytical projects.
- Experience with Python data libraries (pandas, NumPy)
- Some exposure to big data technologies (Hadoop, Spark)
- Understanding of data modeling