Job Title: BI ETL Developer/ Data Engineer
Job Location: Austin TX
Required Skills: Strong ETL/SQL expert with working knowledge of Python.
- ETL concepts
- Data warehousing concepts
- Advanced SQL concepts
- Data validation / data quality checks
- CI/CD techniques
- Programming languages: Java, Python
- Cloud platform: GCP
Primary Roles & Responsibilities:
- Manage the end-to-end lifecycle of data pipelines, including extraction, transformation, and loading into the Google Cloud (GCP) data warehouse.
- Conduct in-depth data analysis and apply statistical methods to derive insights.
- Execute code development within the designated DEV environment.
- Perform comprehensive validation and testing prior to deploying code to UAT or other staging environments.
- Document and maintain records of all test outcomes.
- Facilitate the code submission process by preparing ChangeLists for peer review.
- Oversee the final deployment of code across multiple environments, including UAT, PREPROD, and PROD.
Job Description:
- Develop and manage ETL data pipelines to populate the data warehouse from various custom and third-party systems.
- Create, deploy, and refine comprehensive full-stack Data and BI solutions, covering everything from extraction and storage to transformation and visualization.
- Utilize SQL and Python to build and maintain robust data analysis scripts.
- Provide ongoing support and development for dashboards and reports via Google PLx and Looker Studio.
- Enhance existing business intelligence tools and create new dashboards to drive organizational growth.
- Conduct detailed data examinations and apply statistical analysis techniques.
- Monitor performance and implement necessary infrastructure optimizations.
- Demonstrate excellent collaboration, interpersonal communication, and writing skills, with the ability to work in a team environment.
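To illustrate the kind of SQL-plus-Python pipeline work described above, here is a minimal, hypothetical extract-transform-load sketch. It uses Python's built-in sqlite3 module as a stand-in for the actual GCP warehouse, and the table and column names are invented for illustration only.

```python
# Minimal ETL sketch: sqlite3 stands in for the real warehouse (e.g. on GCP).
# The "sales" table and its columns are illustrative, not from this posting.
import sqlite3

def run_etl(source_rows):
    """Extract raw (region, amount) rows, transform them, load into a table."""
    conn = sqlite3.connect(":memory:")  # stand-in for the target warehouse
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")

    # Transform: normalize region names, drop rows with a missing amount.
    cleaned = [
        (region.strip().upper(), float(amount))
        for region, amount in source_rows
        if amount is not None
    ]

    # Load: bulk insert the cleaned rows.
    conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)
    conn.commit()
    return conn

conn = run_etl([(" west ", "10.5"), ("East", None), ("west", "4.5")])
print(conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region").fetchall())
# → [('WEST', 15.0)]  (the East row is dropped for its missing amount)
```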
Minimum Qualifications:
- Candidates must possess 6 to 8 years of professional experience.
- Due to the high-velocity nature of this project, individuals with extensive experience will achieve the most effective results.
Responsibilities in this role:
- Design, develop, and maintain scalable and robust ETL/ELT processes and data pipelines using various tools and technologies.
- Build and optimize data warehouses, data lakes, and other data storage solutions to support analytical and operational needs.
- Implement data quality checks and monitoring to ensure the accuracy, completeness, and consistency of data.
- Work with large datasets, performing data modeling, schema design, and performance tuning.
- Create data models that are easy for BI tools to consume, and build dashboards on top of them.
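The data quality checks mentioned above could, for example, take the shape of row-level completeness and consistency rules. The sketch below is hypothetical; the field names (`order_id`, `amount`) and the specific rules are assumptions, not part of this posting.

```python
# Hedged sketch of row-level data quality checks (completeness + consistency).
# Field names and rules are illustrative assumptions.
def quality_check(rows, required_fields):
    """Return a list of (row_index, issue) pairs for rows that fail a check."""
    issues = []
    for i, row in enumerate(rows):
        # Completeness: every required field must be present and non-empty.
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append((i, f"missing {field}"))
        # Consistency: a monetary amount should never be negative.
        amount = row.get("amount")
        if isinstance(amount, (int, float)) and amount < 0:
            issues.append((i, "negative amount"))
    return issues

rows = [
    {"order_id": "A1", "amount": 25.0},
    {"order_id": "", "amount": -3.0},
]
print(quality_check(rows, ["order_id", "amount"]))
# → [(1, 'missing order_id'), (1, 'negative amount')]
```

In practice such checks would run inside the pipeline after each load, with failures logged and surfaced to monitoring rather than printed.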