Client: State of Utah
Job ID: 150760
Job Title: Salt Lake City UT - IT - Salt Lake County - N/A - Microsoft Fabric Analytics Engineer
Job Location: 2001 South State Street Salt Lake City Utah 84114
Projected Start date: 12/01/2025
Duration: Length of assignment is anticipated to be 2 6 months
This is a repost of 148923. Previously interviewed candidates should not reapply, but previously submitted candidates who did not interview and who meet the requirements may apply.
Hybrid schedule: All SLC employees and contractors are required to be onsite two days per week (Tuesdays and Wednesdays).
Candidate must be a US citizen; local candidates only. This is a job requirement: any bid for a non-local candidate will be rejected.
No not-to-exceed bill rate. Bids should be in line with market rates.
Salt Lake County is looking for a Microsoft Fabric Analytics Engineer
Why is this a need? There is an immediate need to provide tax administration agencies with reporting as TORUS goes live.
Required skills:
- Azure Cosmos DB (2-3 years of hands-on experience specifically with the NoSQL API and complex JSON data.)
- Azure Synapse Analytics (3-5 years of experience with a focus on Synapse Link and T-SQL)
- Synapse Serverless SQL Pools (3-5 years of experience with a focus on Synapse Link and T-SQL)
- Data Integration (Azure Data Factory) (3-5 years of experience building and managing data pipelines.)
- Power BI (5 years of experience as a Power BI expert with a strong portfolio showcasing advanced skills in data modeling, DAX, and Power Query (M Language).)
- SQL (5 years of experience with advanced SQL for querying and manipulation.)
- Proven experience in a collaborative cross-functional team environment
Preferred Skills
- Spark Pools
- Git or Azure DevOps for code management
- AWS or Google Cloud
- Azure Data Analyst Associate (PL-300)
- Azure Data Engineer Associate (DP-203)
- Azure Cosmos DB Developer Specialty
Job Requirements
- Azure Cosmos DB: This is the foundational skill. The contractor must have knowledge of Cosmos DB, including its different APIs (especially the NoSQL API), and a strong understanding of how to query and handle complex, semi-structured JSON data.
- Azure Synapse Analytics: This is the central hub for the reporting solution. The contractor needs to be proficient in Azure Synapse, particularly in using Synapse Link for Cosmos DB to enable near-real-time analytics without impacting transactional workloads. This includes:
- Synapse Serverless SQL Pools: The ability to write T-SQL queries against the Cosmos DB analytical store.
- Data Integration (Azure Data Factory): Experience with building data pipelines to ingest, transform, and load data from various sources into Synapse.
- Power BI: This is the reporting layer. The person must be an expert in Power BI including:
- Data Modeling: Strong skills in creating efficient data models, including star schemas, to handle large data volumes and complex relationships.
- DAX (Data Analysis Expressions): Proficiency in writing complex DAX measures, calculated columns, and tables to create meaningful metrics and insights.
- Power Query (M Language): Experience with data cleansing, shaping, and transformation in the Power Query editor.
- Report & Dashboard Development: The ability to design and build interactive, visually appealing, high-performance reports and dashboards.
- SQL: A solid understanding of SQL is critical for querying and manipulating data in Synapse.
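As a rough illustration of the complex-JSON skill described above, the sketch below flattens a nested document of the kind Cosmos DB's NoSQL API returns into flat, dot-delimited columns ready for a reporting table. The parcel document, field names, and values are hypothetical, not taken from the posting.

```python
import json

def flatten(doc, prefix=""):
    """Recursively flatten nested dicts into dot-delimited columns.

    Lists and scalar values are kept as-is; this is a sketch, not a
    full JSON normalizer.
    """
    row = {}
    for key, value in doc.items():
        col = f"{prefix}{key}"
        if isinstance(value, dict):
            row.update(flatten(value, f"{col}."))
        else:
            row[col] = value
    return row

# Hypothetical tax-parcel document, shaped like a Cosmos DB NoSQL item.
doc = json.loads("""
{
  "id": "parcel-001",
  "owner": {"name": "Jane Doe", "address": {"city": "Salt Lake City", "zip": "84114"}},
  "assessedValue": 350000
}
""")

flat = flatten(doc)
print(flat["owner.address.city"])  # Salt Lake City
```

In the actual solution this shaping would typically happen in Synapse (T-SQL over the Cosmos DB analytical store) or Power Query rather than in application code; the point here is the nested-to-flat mapping itself.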
Key Responsibilities
The contractor should be able to handle the entire project lifecycle from initial design to final deployment. Their responsibilities should include:
- Solution Architecture: Designing a scalable and cost-effective reporting solution on Azure that leverages Cosmos DB, Azure Synapse, and Power BI.
- Data Engineering: Setting up and configuring Azure Synapse Link for the county's Cosmos DB containers and developing data pipelines to prepare data for reporting. This includes creating T-SQL views in Synapse Serverless SQL pools.
- Business Intelligence Development: Developing Power BI reports and dashboards including data models DAX measures and visualizations to meet complex reporting requirements.
- Performance Tuning: Optimizing queries in Synapse and DAX calculations in Power BI to ensure fast report load times and data refreshes.
- Stakeholder Collaboration: Working with county staff and other stakeholders to understand business requirements and translate them into technical specifications and reports.
Desirable Qualifications
- Experience with large datasets: Proven experience working with large-scale data and optimizing solutions for performance and cost.
- Communication Skills: The ability to clearly explain complex technical concepts to non-technical business users.
- Spark Pools (nice-to-have): Familiarity with PySpark or other languages for more complex data transformations within Synapse.