Data Engineer

Job Location: Lehi, UT - USA

Monthly Salary: Not Disclosed

Vacancies: 1

Job Description

About Us:
Tech9 is shaking up a 20-year-old industry, and we're not slowing down. Recognized by Inc. 5000 as one of the nation's fastest-growing companies, we are dedicated to building innovative, highly complex web applications. Our team is passionate about delivering quality software that meets the highest standards. We offer a 100% remote working environment with a collaborative and supportive team, allowing you to focus on what you do best. Our current need is for a Data Engineer to join our US team. This role will allow you to work directly with our client, where you will get hands-on experience working with new projects and teams.


Why Join Us

  • Challenging Problems: You will tackle complex and exciting challenges.

  • Flexibility: We offer a flexible and autonomous working environment.

  • Collaboration: Work with skilled and friendly teammates who are committed to quality.

  • Support: We fully support your efforts to build software the right way.

  • Tools: We provide the necessary tools for you to excel at your job.

  • Remote Work: Enjoy the benefits of a 100% remote work environment.


About this Project: Security & Monitoring Data

This initiative focuses on internal security and code safety across development teams. The project involves handling vulnerability data and security monitoring data from multiple sources, ensuring they are properly ingested, standardized, and available in the data warehouse. While most data sources are already ingested, the Data Engineer will complete the ingestion of the remaining sources, validate existing pipelines, and ensure the Databricks warehouse meets security, compliance, and performance standards.

The role also involves integrating streaming data pipelines, working with graph APIs, and enabling Power BI dashboards that provide clear visibility into security and vulnerability metrics. Additionally, the engineer will support automation and deployment efforts through Azure DevOps pipelines and infrastructure-as-code, ensuring a secure, modern, and reliable data ecosystem.
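
For illustration only, a streaming ingestion step of this kind often looks roughly like the sketch below: security monitoring events arriving as JSON files are picked up by Databricks Auto Loader, lightly standardized, and appended to a Delta table that downstream models and dashboards can query. The paths, schema fields, and table name are hypothetical placeholders, not details of the actual project.

  # Minimal streaming ingestion sketch for Databricks (PySpark).
  # All paths, schema fields, and table names are hypothetical examples.
  from pyspark.sql import SparkSession
  from pyspark.sql.functions import upper, col, current_timestamp
  from pyspark.sql.types import StructType, StructField, StringType, TimestampType

  spark = SparkSession.builder.getOrCreate()

  # Assumed shape of an incoming security monitoring event.
  event_schema = StructType([
      StructField("event_id", StringType()),
      StructField("source", StringType()),        # scanner or monitoring tool name
      StructField("severity", StringType()),
      StructField("detected_at", TimestampType()),
      StructField("asset", StringType()),
  ])

  # Pick up newly arriving JSON files from a landing zone with Auto Loader.
  raw_events = (
      spark.readStream
      .format("cloudFiles")
      .option("cloudFiles.format", "json")
      .schema(event_schema)
      .load("/mnt/security/landing/monitoring/")      # hypothetical path
  )

  # Light standardization before the data reaches the warehouse layer.
  standardized = (
      raw_events
      .withColumn("severity", upper(col("severity")))
      .withColumn("ingested_at", current_timestamp())
  )

  # Append to a Delta table that DBT models and Power BI reports can build on.
  (
      standardized.writeStream
      .format("delta")
      .option("checkpointLocation", "/mnt/security/checkpoints/monitoring/")
      .outputMode("append")
      .toTable("security.monitoring_events")          # hypothetical table name
  )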


Key Responsibilities

  • Ingest and integrate security and vulnerability data sources into Databricks.

  • Validate and standardize the data warehouse to meet security and compliance requirements.

  • Build and maintain streaming pipelines for real-time security monitoring data.

  • Work with graph APIs and REST APIs to expand the organization's security data ecosystem (an illustrative sketch follows this list).

  • Design and optimize ETL workflows using DBT, Databricks notebooks, and Python.

  • Develop and support Power BI dashboards and reports (backend and frontend) to deliver actionable insights.

  • Implement and maintain DevOps pipelines in Azure for automation, CI/CD, and secure deployments.

  • Collaborate with client stakeholders, outsourced vendors, and internal teams to ensure smooth delivery.

  • Participate in Agile sprint cycles, making and delivering on commitments.

  • Communicate clearly and effectively in English, both technically and cross-functionally.
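
As a purely illustrative example of the API integration work mentioned above, the sketch below pages through a graph-style REST endpoint for vulnerability alerts and maps the records onto standardized warehouse columns. The endpoint URL, authentication, and field names are assumptions made for the example, not the client's actual API.

  # Illustrative sketch: pull vulnerability alerts from a graph-style REST API
  # and standardize them for warehouse ingestion. The endpoint, token handling,
  # and field names are hypothetical placeholders.
  import requests

  API_URL = "https://api.example.com/v1/security/alerts"   # hypothetical endpoint
  TOKEN = "..."   # obtained through the platform's own auth flow

  def fetch_alerts(url, token):
      """Page through the alerts endpoint and collect every record."""
      headers = {"Authorization": f"Bearer {token}"}
      records = []
      while url:
          response = requests.get(url, headers=headers, timeout=30)
          response.raise_for_status()
          payload = response.json()
          records.extend(payload.get("value", []))
          # Graph-style APIs commonly return a next-page link; stop when absent.
          url = payload.get("@odata.nextLink")
      return records

  def standardize(record):
      """Map raw API fields onto standardized warehouse column names."""
      return {
          "alert_id": record.get("id"),
          "severity": (record.get("severity") or "").lower(),
          "title": record.get("title"),
          "detected_at": record.get("createdDateTime"),
      }

  if __name__ == "__main__":
      rows = [standardize(r) for r in fetch_alerts(API_URL, TOKEN)]
      print(f"Fetched {len(rows)} alerts ready to load into Databricks")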


Required Skills & Experience

  • Cloud & Data Platforms: Strong experience with Microsoft Azure (Data Lake, Synapse, etc.), Databricks, and Azure DevOps.

  • ETL & Data Warehousing: Expertise in dimensional modeling, ETL pipelines, and DBT.

  • Programming & APIs: Proficiency in Python for data engineering and automation; familiarity with graph APIs and REST APIs.

  • Streaming Data: Hands-on experience with streaming ingestion and processing.

  • Security Data Engineering: Experience working with vulnerability/security monitoring data ingestion and validation.

  • Reporting & Visualization: Advanced experience with Power BI (backend and frontend) including semantic modeling and SQL endpoints.

  • Collaboration & Communication: Excellent English communication skills with proven success in Agile environments.


Preferred Skills

  • Terraform: Hands-on experience with infrastructure-as-code for cloud deployments.

  • ML Document Processing: Familiarity with ML-based document processing solutions.


Interview Process Overview

The process is designed to be thoughtful, efficient, and focused on both technical ability and team fit.

  1. 30-minute on-demand HireVue screening where you'll respond to situational and behavioral questions to help us understand your ownership mindset, adaptability, and approach to collaboration.
  2. 10-minute virtual Q&A session with our recruiter to clarify the role and answer any questions you may have. This is not an interview, just a conversation to ensure alignment.
  3. 60-minute live technical interview with one of our Senior Data Engineers.
  4. 60-minute technical interview with our Director of Engineering or CTO.
  5. 15-30-minute chat with the hiring manager.
  6. 30-60-minute session with the client.

#LI-Remote

#UnitedStates



To ensure you've received our notifications, please whitelist our email domains.


Required Experience: Senior IC

Employment Type: Full-Time

