Job description
Experience: 5-10 Years
Employment Type: W2
Work Authorization: GC / US Citizens Only
Location: Englewood, CO (Remote/Other locations may be considered)
Job Summary:
We are seeking an experienced ETL Developer with strong Python and SQL expertise to design, develop, and maintain scalable data pipelines. The ideal candidate will be responsible for building reliable ETL/ELT workflows, ensuring data quality, and supporting analytics and business intelligence initiatives across the organization.
Roles & Responsibilities
- Design, develop, test, and maintain robust ETL/ELT data pipelines from multiple data sources (databases, APIs, flat files) to data warehouses or data lakes
- Write and optimize complex SQL queries, scripts, stored procedures, and functions for data extraction, transformation, and loading
- Use Python (and shell scripting where required) to automate ETL workflows, implement custom business logic, manage file transfers, and handle error processing
- Implement data quality checks and validation rules to ensure data accuracy, completeness, and consistency
- Monitor ETL jobs, troubleshoot failures, and optimize the performance and scalability of data pipelines
- Collaborate with data analysts, data architects, and business stakeholders to gather requirements and define data models
- Create and maintain technical documentation, including data flow diagrams, source-to-target mappings, and process documentation
- Support and enhance data warehouse solutions, applying data modeling best practices such as star and snowflake schemas
Required Skills & Qualifications:
Technical Skills
- Strong expertise in SQL, including complex queries, joins, aggregations, performance tuning, and database optimization
- Proficiency in Python for data processing, automation, and scripting (experience with libraries such as Pandas, NumPy, and SQLAlchemy is a plus)
- Solid understanding of relational databases, data warehousing concepts, and data modeling
- Experience with ETL tools or frameworks such as Informatica, Talend, SSIS, Apache Airflow, AWS Glue, or Azure Data Factory (preferred)
- Experience with version control systems (Git)
Education & Experience
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 5-10 years of hands-on experience in ETL development, data engineering, or related roles