ETL Developer
Location: Resource can be based out of DFW or PHX (Dallas/Phoenix).
Contract: 6 Months
Required Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
5 years of experience in ETL/ELT development, with at least 2 years focused on Azure Data Factory.
Strong proficiency in SQL, PostgreSQL, and data transformation logic.
Experience with Azure Blob Storage, Azure Functions, and Databricks.
Hands-on experience integrating with Coupa and SAP platforms.
Familiarity with CI/CD pipelines and version control tools (e.g., Git, Azure DevOps).
Experience with Unix/Linux shell scripting for automation and process orchestration.
Expertise in data extraction, transformation, and loading from/to various data sources (e.g., relational databases, flat files, XML, JSON).
Solid understanding of data warehousing concepts, dimensional modeling, and big data processing.
Excellent problem-solving, communication, and documentation skills.
Preferred / Good-to-Have Qualifications:
Experience with IBM DataStage development.
Experience with Power BI, Snowflake, or Apache Spark.
Knowledge of data lake architecture, data mesh, or data fabric concepts.
Microsoft Azure certifications (e.g., DP-203: Data Engineering on Microsoft Azure).
Experience with REST/SOAP APIs and middleware platforms for enterprise integration.