Role: MS Fabric Developer
Location: Minneapolis, MN (Onsite)
Job Type: Contract
JD:
Fabric Developer
Skills: Microsoft Fabric, Python, PySpark, ETL, Data Integration, Azure
Job Responsibilities
Understand requirements and perform data analysis.
Set up Microsoft Fabric and its components.
Build secure, scalable solutions across the Microsoft Fabric platform.
Create and manage Lakehouses.
Implement Data Factory processes for data ingestion, scalable ETL, and data integration.
Design, implement, and manage comprehensive data warehousing solutions for analytics using Fabric.
Create and schedule data pipelines using Azure Data Factory.
Build robust data solutions using Microsoft data engineering tools such as Notebooks, Lakehouses, and Spark applications.
Build and automate deployment pipelines using CI/CD tools to release Fabric content from lower to higher environments.
Set up and use Git for repository management and versioning of Fabric components.
Create and manage Power BI reports and semantic models.
Write and optimize complex SQL queries to extract and analyze data, ensuring efficient data processing and accurate reporting.
Job Qualifications
Mandatory
Bachelor's degree in computer science or a similar field, or equivalent work experience.
3+ years of experience working with Microsoft Fabric.
Expertise in working with OneLake, Lakehouse, Warehouse, and Notebooks.
Strong understanding of Power BI reports and semantic models in Fabric.
Proven record of building ETL and data solutions using Azure Data Factory.
Strong understanding of data warehousing concepts and ETL processes.
Hands-on experience building data warehouses in Fabric.
Strong skills in Python and PySpark.
Practical experience implementing Spark in Fabric, including scheduling Spark jobs and writing Spark SQL queries.
Experience using Data Activator for effective data asset management and analytics.
Ability to flex and adapt to different tools and technologies.
Strong learning attitude.
Good written and verbal communication skills.
Demonstrated experience working on a team spread across multiple locations.