Data Engineer III - Rx Systems

Employer Active

1 Vacancy

Job Location

Pittston, PA - USA

Monthly Salary

Not Disclosed

Job Description

Our Opportunity:

Chewy's Pharmacy Operations Team is looking for a Data Engineer III to join the pack! In this role, you will apply your extensive expertise in data engineering to build and maintain high-quality data pipelines that enable the development of insights, data visualizations, and models to drive operational and business improvements. To deliver this, you will develop and implement advanced data solutions and technologies that scale to meet the needs of our evolving business. Come join a team dedicated to redefining pharmacy operations, where your work will directly influence strategic decisions and customer experiences!

What You'll Do:

  • Implement the strategy, design, execution, and configuration of our evolving data stack, including customer and pet data, order transactions, operational flow data, and prescription records, among other complex data sets.
  • Lead the evaluation, implementation, and deployment of emerging tools and technologies to improve our productivity as a team.
  • Build and monitor data pipelines for accuracy, missing data, enhancements, changes, and volume to ensure all data is captured and processed accurately and on time (a minimal monitoring sketch follows this list).
  • Work with cross-functional stakeholders to define and document requirements for building high-quality, impactful data products.
  • Reconcile data issues and alerts between various systems, finding opportunities to innovate and drive changes that continuously improve data quality.
  • Develop and deliver communication and education plans on data engineering capabilities, standards, and processes.
  • Code, test, and document new or modified data systems to create robust and scalable applications for reporting and data analytics.
  • Own and document data pipelines, monitoring data accuracy and data lineage.
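
For illustration only: a minimal Python sketch of the kind of pipeline monitoring described above (row volume, missing values, freshness). The column names, thresholds, and alert output are hypothetical placeholders, not Chewy's actual tooling.

    # Minimal illustration of batch monitoring checks: row volume,
    # missing values, and freshness. Column names and thresholds are
    # hypothetical placeholders.
    from datetime import datetime, timedelta, timezone

    import pandas as pd


    def check_batch(df: pd.DataFrame, *, min_rows: int, required_cols: list[str],
                    max_age_hours: int = 24) -> list[str]:
        """Return a list of human-readable data-quality issues for one batch."""
        issues = []

        # Volume check: catch empty or silently truncated loads.
        if len(df) < min_rows:
            issues.append(f"row count {len(df)} below expected minimum {min_rows}")

        # Completeness check: required columns must exist and be non-null.
        for col in required_cols:
            if col not in df.columns:
                issues.append(f"missing column: {col}")
            elif df[col].isna().any():
                issues.append(f"{df[col].isna().sum()} null values in {col}")

        # Freshness check: the newest record should be recent.
        if "updated_at" in df.columns:
            newest = pd.to_datetime(df["updated_at"]).max()
            if newest < datetime.now(timezone.utc) - timedelta(hours=max_age_hours):
                issues.append(f"stale data: newest record is {newest}")

        return issues


    if __name__ == "__main__":
        batch = pd.DataFrame({
            "order_id": [1, 2, 3],
            "rx_number": ["A1", None, "A3"],               # one missing value
            "updated_at": [datetime.now(timezone.utc)] * 3,
        })
        for issue in check_batch(batch, min_rows=100,
                                 required_cols=["order_id", "rx_number"]):
            print("ALERT:", issue)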

What You'll Need:

  • Bachelor's degree in MIS, Computer Science, Computer Engineering, or a relevant field.
  • 3 years of proven experience in data warehousing, modeling, and ETL pipeline development, along with a proficient understanding of dimensional and relational database architecture.
  • 2 years of professional Python scripting to automate data workflows, monitor systems, and continuously optimize processes.
  • Advanced technical experience using SQL in a cloud environment (Snowflake preferred).
  • Proficiency in building and optimizing ETL pipelines using AWS Glue, Control-M, PySpark, dbt, and other applications (see the PySpark sketch after this list).
  • Experience with setting up end-to-end data pipelines for new and/or changing businesses in an enterprise environment.
  • Hands-on experience with cloud computing platforms such as AWS and Google Cloud.
  • Experience developing solutions for cloud computing services and infrastructure with AWS (S3, Athena, Glue, Lambda).
  • Familiarity with Tableau, Looker, or a similar visualization/business intelligence platform.
  • Proven ability to work collaboratively with data scientists, analysts, and business stakeholders to gather requirements and deliver impactful data solutions.
  • Ability to effectively operate both independently and as part of a team.
  • Self-motivated with strong problem-solving and self-learning skills.
  • The position may require travel.
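
For illustration only: a minimal PySpark sketch of the kind of ETL step the AWS Glue/PySpark bullet above refers to, reading raw order data from S3, standardizing it, and writing a partitioned curated layer. The bucket paths and column names are hypothetical, not Chewy's actual pipelines.

    # Sketch of a simple batch ETL step: read raw CSV order data,
    # standardize timestamps, drop duplicates, and write partitioned
    # Parquet. Paths and columns are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl_example").getOrCreate()

    raw = (
        spark.read
        .option("header", "true")
        .csv("s3://example-raw-bucket/orders/")          # hypothetical source path
    )

    cleaned = (
        raw
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .withColumn("order_date", F.to_date("order_ts"))
        .dropDuplicates(["order_id"])                    # remove replayed records
        .filter(F.col("order_id").isNotNull())
    )

    (
        cleaned.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-curated-bucket/orders/")  # hypothetical target path
    )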

Bonus:

  • Master's degree in Computer Science, Data Science, or a related field.
  • Experience with dbt for transformation and testing in the ELT process.
  • Proficiency using Apache Airflow or other DAG frameworks (see the DAG sketch after this list).
  • Expertise in crafting and implementing data pipelines using multiple modern data engineering approaches and tools: Spark, PySpark, Java, Docker, cloud-native data warehouses (Snowflake, Redshift), Kafka/Confluent, etc.
  • Experience with CI/CD processes and platforms.
  • Experience with Oracle OSvC data structures.
  • Experience crafting APIs or data services to expose data to downstream applications.
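
For illustration only: a minimal Apache Airflow DAG sketch (assuming Airflow 2.4+, which accepts the schedule argument) of the orchestration style the DAG-framework bullet above refers to. The DAG and task names are hypothetical.

    # Minimal daily DAG: extract, then transform. Task bodies are
    # placeholders for real pipeline steps.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract() -> None:
        print("pull new order and prescription records from the source system")


    def transform() -> None:
        print("clean, deduplicate, and load the curated tables")


    with DAG(
        dag_id="rx_pipeline_example",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)

        extract_task >> transform_task  # transform runs only after extract succeeds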

Employment Type

Full Time
