Title: Data Engineer
Location: Phoenix, AZ 85255 (Hybrid)
Experience: 10 Years
Duration: 12-month contract
Key Responsibilities:
Solid experience with database design practices and data warehousing concepts.
Design, develop, and maintain data pipelines and ETL processes to extract, transform, and load data from various sources into our data warehouse.
Hands-on experience delivering data reporting requirements on on-premises SQL Server using SSIS and/or other integration tools.
Expertise coding and implementing batch and streaming ETL data pipelines in cloud-based data infrastructure using Azure Data Factory, Spark, Python, and Scala on Databricks.
Experience working with Azure Data Lake Storage and Azure SQL Database.
Experience working with Databricks Delta Lake, the Medallion Architecture, and Unity Catalog.
Optimize data ingestion and transformation processes to ensure efficient and scalable data flows. Monitor and optimize database performance, including query tuning and index optimization.
Develop and maintain data models, schemas, and data dictionaries for efficient data storage and retrieval.
Perform data analysis, data profiling, and data cleansing to ensure data quality and accuracy.
Perform data validation, quality checks, and troubleshooting to identify and resolve data-related issues.
Develop Power BI dashboards, reports, and visualizations to effectively communicate data insights. Implement data transformations, data modeling, and data integration processes in Power BI.
Create and maintain technical documentation related to database structures, schemas, and reporting solutions.
Experience with MS Office tools such as Excel, Word, and PowerPoint.
Experience with build and deployment tools such as Azure DevOps, GitHub, and Jenkins.
Experience working on an Agile development team and delivering features incrementally using Jira and Confluence.
Stay up to date with industry trends and best practices in data engineering and Power BI, and proactively recommend and implement improvements to our data infrastructure.
Ability to multitask and remain adaptable and nimble within a team environment.
Strong communication, interpersonal, analytical, and problem-solving skills.
Preferred:
Experience with REST APIs and JSON.
Experience in the banking and financial domains.
Industry certification (Azure, GCP, or AWS).
Familiarity with a variety of programming languages, such as Java, JavaScript, and C/C++.
Familiarity with containerization and orchestration technologies (Docker, Kubernetes) and experience with shell scripting (Bash, Unix shell, or Windows shell) is preferable.
Experience with Power Automate and Power Apps.
Experience with Collibra Data Lineage and Data Quality modules.
Education:
Bachelor's degree in Computer Science or Information Systems, along with work experience in a related field.