Tasks
Analyze and define data requirements
Establish an automated data extraction process for Sciforma, SWIFT, and other PPM tools (e.g., API or other preferred methods)
Develop data lake structure and populate with data
Develop reports/queries for advanced data analytics
Develop Power BI reports/data visualization using advanced queries
Provide knowledge transfer
Minimum Qualifications
Five years of Data Lake Developer experience
Two engagements with two different entities in a Senior Data Lake Developer role
Preferred Qualifications
Experience in data lake configuration and setup, and in ingesting data from various data sources using ETL and APIs
Experience in Azure Cloud services and solutions
Experience working with an enterprise data warehouse
Experience as an ETL/ELT developer using tools such as Azure Synapse Pipelines, Azure Data Factory, and Apache Spark pools with Python scripts
Experience in Azure DevOps Services using Azure Repos (Git), Azure Data Studio, Azure Analytics, data mapping, and deployment artifacts and release packages for test and production environments
Experience in building end-to-end, scalable data solutions, from sourcing raw data and transforming it to producing analytics reports
Experience in Python (ETL and Data Visualization libraries)
Experience in Azure SQL databases, including SQL Database, Managed Instance, and Data Warehouse
Experience in Azure platform services such as Blob Storage, Event Hubs, and monitoring services
Experience in creating data structures optimized for storage and various query patterns (e.g., Parquet)
Experience in building secure Power BI reports, dashboards, and paginated reports
Experience working within an Agile SDLC methodology