Data Engineer
Hybrid - Seattle WA
As a Data Engineer II, you will bring a high level of technical knowledge as well as an ability to share that knowledge with your co-workers. You will help form the core of our engineering practice at the company by contributing to all areas of development and operations (pre-production to production). You will be an example of what good engineering looks like and help those around you refine their skills. You will be part of a day-to-day production release team and may perform on-call support functions as needed. A DevOps mindset is key to success in this role, as engineers are commonly part of full DevOps teams that own all parts of software development: release pipelines, production monitoring, security, and support.
- Data Engineering Projects
- Data pipeline creation and maintenance. Stack: Google Cloud Platform (GCP), Azure Cloud, Azure Databricks, Snowflake
- Includes engineering documentation, knowledge transfer to other engineers, future enhancements, and maintenance
- Create secure data views and publish them to the Enterprise Data Exchange via Snowflake for other teams to consume
- Data pipeline modernization and migration via Databricks Delta Live Tables (DLT) and Unity Catalog
- Leverage the existing CI/CD process for pipeline deployment
- Adhere to PII encryption and masking standards
- Data Engineering Tools/Techniques
- Orchestration tools: ADF, Airflow, Fivetran
- Languages: SQL, Python
- Data modeling: star and snowflake schemas
- Streaming: Kafka, Event Hubs, Spark, Snowflake streaming
- DevOps Support
- Support improvements to the current CI/CD process
- Production monitoring and failure support
- Provide an escalation point and participate in on-call support rotations
- Participate in discussions on how to improve DevOps
- Be aware of product releases and how they impact our business
- Take part in Agile ceremonies
- Perform engineering assignments using existing procedures and best practices
- Conduct research to aid in product troubleshooting and optimization efforts
- Participate in and contribute to our Engineering Community of Practice
Qualifications:
- Completed Bachelor's degree or diploma (or equivalent experience) in Computer Science, Software Engineering, or Software Architecture preferred; candidates with substantial, relevant industry experience are also eligible
- 5 years of relevant engineering experience
- Google Professional Data Engineer Certification is preferred
- Experience with Bigtable, clickstream data migration, and semi-structured and unstructured data management
- Experience with Google Cloud Platform (GCP) and BigQuery
- Experience with developing complex SQL queries
- Experience with CI/CD principles and best practices
- Experience with Azure Data Factory, Azure Databricks, Snowflake, and Storage Accounts
- Experience working with a Data Engineering team and an understanding of Data Engineering practices
- Ability to learn, understand, and work quickly with new and emerging technologies, methodologies, and solutions in the cloud/IT space
- Experience with bug tracking and task management software such as Jira
- Experience managing outages, customer escalations, crisis management, and other similar circumstances
Must-haves:
- Acknowledges the presence of choice in every moment and takes personal responsibility for their life.
- Possesses an entrepreneurial spirit and continuously innovates to achieve great results.
- Communicates with honesty and kindness, and creates the space for others to do the same.
- Leads with courage, knowing the possibility of greatness is bigger than the fear of failure.
- Fosters connection by putting people first and building trusting relationships.
- Integrates fun and joy as a way of being and working (a.k.a. doesn't take themselves too seriously).