Challenging Problems: You will tackle complex and exciting challenges.
Flexibility: We offer a flexible and autonomous working environment.
Collaboration: Work with skilled and friendly teammates who are committed to quality.
Support: We fully support your efforts to build software the right way.
Tools: We provide the necessary tools for you to excel at your job.
Remote Work: Enjoy the benefits of a 100% remote work environment.
This initiative focuses on internal security and code safety across development teams. The project involves handling vulnerability and security monitoring data from multiple sources, ensuring it is properly ingested, standardized, and available in the data warehouse. While most data sources are already ingested, the Data Engineer will complete the ingestion of the remaining sources, validate existing pipelines, and ensure the Databricks warehouse meets security, compliance, and performance standards.
The role also involves integrating streaming data pipelines, working with graph APIs, and enabling Power BI dashboards that provide clear visibility into security and vulnerability metrics. Additionally, the engineer will support automation and deployment efforts through Azure DevOps pipelines and infrastructure-as-code, ensuring a secure, modern, and reliable data ecosystem.
Ingest and integrate security and vulnerability data sources into Databricks.
Validate and standardize the data warehouse to meet security and compliance requirements.
Build and maintain streaming pipelines for real-time security monitoring data.
Work with graph APIs and REST APIs to expand the organization's security data ecosystem.
Design and optimize ETL workflows using DBT, Databricks notebooks, and Python.
Develop and support Power BI dashboards and reports (backend and frontend) to deliver actionable insights.
Implement and maintain DevOps pipelines in Azure for automation, CI/CD, and secure deployments.
Collaborate with client stakeholders, outsourced vendors, and internal teams to ensure smooth delivery.
Participate in Agile sprint cycles, making and delivering on commitments.
Communicate clearly and effectively in English, both technically and cross-functionally.
Cloud & Data Platforms: Strong experience with Microsoft Azure (Data Lake, Synapse, etc.), Databricks, and Azure DevOps.
ETL & Data Warehousing: Expertise in dimensional modeling, ETL pipelines, and DBT.
Programming & APIs: Proficiency in Python for data engineering and automation; familiarity with graph APIs and REST APIs.
Streaming Data: Hands-on experience with streaming ingestion and processing.
Security Data Engineering: Experience working with vulnerability/security monitoring data ingestion and validation.
Reporting & Visualization: Advanced experience with Power BI (backend and frontend), including semantic modeling and SQL endpoints.
Collaboration & Communication: Excellent English communication skills, with proven success in Agile environments.
Terraform: Hands-on experience with infrastructure-as-code for cloud deployments.
ML Document Processing: Familiarity with ML-based document processing solutions.
#LI-Remote
#UnitedStates
Required Experience:
Senior IC
Full-Time