(Internal Title: Business System Configuration/Development II)
We are seeking a highly skilled professional for a 6-month contract position (with eligibility for extension) to support our client: an experienced Data Engineer for a high-impact executive reporting initiative at a major capital infrastructure owner.
This role will work as part of a multi-disciplinary team integrating data from EPC contractor systems (e.g., Oracle Primavera P6, Deltek Cobra, EcoSys) into the client's Snowflake data platform, enabling interactive executive dashboards in Power BI.
The role focuses on data ingestion, transformation, modeling, and automation, working closely with business analysts, Power BI developers, and the client's IT team.
Responsibilities:
Work within the client's CI/CD process (GitHub/DevOps) to design and manage data pipelines
Design and architect ingestion methods while building secure pipelines and data transformations in the client environment
Design, build, and maintain data ingestion pipelines using Snowflake (e.g., Snowpipe, COPY INTO) to ingest files delivered from EPC partners (e.g., via SharePoint or API); a brief ingestion sketch follows this list
Develop and optimize SQL-based transformations and data models to support reporting KPIs across engineering, procurement, construction, and commissioning
Implement and manage data validation routines to ensure data quality, completeness, and conformance to business rules
Collaborate with Business Analysts, the Data Architect, and Power BI Developers to translate KPI definitions into technical data logic
Work with EPC contractor data extracts in various formats (CSV, Excel, JSON) and normalize these into structured Snowflake tables
Monitor daily/weekly data jobs and work with the team to troubleshoot and resolve ingestion issues
Participate in technical design discussions and contribute to project timelines and documentation
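For illustration only, a minimal sketch of the kind of Snowflake file ingestion described above, assuming a hypothetical RAW schema, internal stage, file format, and landing table; all object and column names (RAW.EPC_SCHEDULE_LANDING, RAW.EPC_CSV_FORMAT, RAW.EPC_SCHEDULE_STAGE) are illustrative and not taken from the client environment:

    -- Hypothetical landing table for a contractor schedule extract (illustrative columns only).
    CREATE TABLE IF NOT EXISTS RAW.EPC_SCHEDULE_LANDING (
        ACTIVITY_ID      STRING,
        ACTIVITY_NAME    STRING,
        CONTRACTOR_ID    STRING,
        CONTRACTOR_NAME  STRING,
        PERCENT_COMPLETE NUMBER(5,2),
        DATA_DATE        DATE
    );

    -- File format and internal stage for CSV extracts delivered by EPC partners.
    CREATE FILE FORMAT IF NOT EXISTS RAW.EPC_CSV_FORMAT
        TYPE = CSV
        SKIP_HEADER = 1
        FIELD_OPTIONALLY_ENCLOSED_BY = '"';

    CREATE STAGE IF NOT EXISTS RAW.EPC_SCHEDULE_STAGE
        FILE_FORMAT = (FORMAT_NAME = 'RAW.EPC_CSV_FORMAT');

    -- Bulk load any staged schedule files; a Snowpipe (CREATE PIPE ... AS COPY INTO ...)
    -- could wrap the same statement for continuous ingestion from cloud storage.
    COPY INTO RAW.EPC_SCHEDULE_LANDING
    FROM @RAW.EPC_SCHEDULE_STAGE
    PATTERN = '.*schedule.*[.]csv'
    ON_ERROR = 'SKIP_FILE';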
Requirements
Experience:
5 years of experience as a Data Engineer or in a similar role
Advanced-level SQL with proven experience in Snowflake-specific SQL syntax, functions, and query optimization
Experience ingesting structured files using Snowpipe, COPY INTO, External Tables, or similar mechanisms
Strong understanding of Snowflake architecture, including Virtual Warehouses, storage/compute separation, micro-partitioning, and Time Travel
Experience with data modeling best practices (star schema, SCDs, conformed dimensions) in Snowflake; a brief modeling sketch follows this list
Hands-on experience building ETL/ELT workflows with dbt, Azure Data Factory, or custom SQL jobs targeting Snowflake
Comfort working with stakeholders across technical and business teams to clarify logic and validate outputs
Experience with JIRA, Git, and collaborative development practices
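As an illustration of the modeling practice referenced above, a minimal sketch of maintaining a conformed contractor dimension from the hypothetical landing table in the earlier ingestion sketch; it shows a simple Type 1 (overwrite) SCD via MERGE, and the ANALYTICS schema and all object names are assumptions, not client objects (a Type 2 variant would add effective-date and current-flag columns):

    -- Hypothetical conformed dimension in a star schema (ANALYTICS schema assumed to exist).
    CREATE TABLE IF NOT EXISTS ANALYTICS.DIM_CONTRACTOR (
        CONTRACTOR_KEY  NUMBER AUTOINCREMENT,
        CONTRACTOR_ID   STRING,
        CONTRACTOR_NAME STRING,
        UPDATED_AT      TIMESTAMP_NTZ
    );

    -- Type 1 SCD: overwrite attributes for known contractors, insert new ones.
    MERGE INTO ANALYTICS.DIM_CONTRACTOR d
    USING (
        SELECT DISTINCT CONTRACTOR_ID, CONTRACTOR_NAME
        FROM RAW.EPC_SCHEDULE_LANDING      -- landing table from the ingestion sketch above
        WHERE CONTRACTOR_ID IS NOT NULL
    ) s
        ON d.CONTRACTOR_ID = s.CONTRACTOR_ID
    WHEN MATCHED THEN UPDATE SET
        d.CONTRACTOR_NAME = s.CONTRACTOR_NAME,
        d.UPDATED_AT      = CURRENT_TIMESTAMP()
    WHEN NOT MATCHED THEN INSERT (CONTRACTOR_ID, CONTRACTOR_NAME, UPDATED_AT)
        VALUES (s.CONTRACTOR_ID, s.CONTRACTOR_NAME, CURRENT_TIMESTAMP());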
Education:
Bachelor's degree in Computer Engineering, Data Science, or a related discipline; Master's degree preferred
CSM, ITIL V4, and certifications in tools like Microsoft SQL Server, Power BI, Microsoft Fabric, Azure, or GCP
Nice to Have:
Exposure to Project Management Information Systems such as Oracle Primavera P6, Deltek Cobra, EcoSys, etc.
Snowflake certifications such as SnowPro Core or SnowPro Advanced: Data Engineer, other Snowflake specialty certifications, or performance optimization badges are a strong asset
Experience supporting Power BI projects or building models that feed enterprise dashboards
Familiarity with SharePoint-based data delivery or M365 ecosystems
Understanding of capital project controls, scheduling, procurement, or commissioning