About You
You are a Data Engineer with strong data warehousing expertise and solid engineering foundations, passionate about designing, building, and maintaining scalable and reliable data platforms on Google Cloud Platform (GCP). You thrive in environments where you can apply best practices in data modeling, automation, and performance optimization while collaborating across teams to enable data-driven decision-making.
You Bring to Applaudo the Following Competencies:
- Bachelor's degree in Computer Science, Information Systems, or a related field, or equivalent practical experience.
- 5 years of hands-on experience as a Data Engineer.
- Proven expertise in data warehousing concepts, including star schema, dimensional modeling, surrogate keys, and slowly changing dimensions (SCDs).
- Advanced SQL skills for complex queries, window functions, and performance tuning.
- Strong experience with Google Cloud Platform (GCP), including BigQuery, Data Fusion, Dataflow, Cloud Storage, and Looker.
- Experience designing and maintaining scalable ETL/ELT pipelines and optimizing data flows from multiple sources (APIs, databases, streams, files).
- Proficient in Python for data processing, integration, and automation.
- Solid understanding of data governance, lineage, and pipeline observability best practices.
- Knowledge of software engineering principles such as version control, CI/CD, testing automation, and Infrastructure as Code.
- Strong organizational skills and the ability to adapt quickly to changing priorities.
- Experience implementing cost-efficient BigQuery architectures (partitioning, clustering) (nice to have).
- GCP Professional Data Engineer certification or equivalent (nice to have).
You Will Be Accountable for the Following Responsibilities:
- Design and maintain data warehouses and analytical models using star schemas, dimensions, facts, and hierarchies.
- Build and maintain scalable ETL/ELT pipelines integrating multiple data sources, ensuring reliability and performance.
- Implement data solutions leveraging GCP core data services such as BigQuery, Data Fusion, Dataflow, Cloud Storage, and Looker.
- Ensure data quality, security, and cost optimization across all data pipelines and environments.
- Collaborate with data architects, analysts, and business stakeholders to deliver end-to-end data solutions.
- Write efficient, maintainable Python code for data integration, processing, and automation.
- Apply software engineering best practices, including CI/CD, version control, and Infrastructure as Code.
- Implement and maintain observability across pipelines (logging, monitoring, and alerting).
- Contribute to continuous improvement by evaluating emerging technologies and optimizing existing data workflows.
Additional Information:
Here at Applaudo Studios, values such as trust, communication, respect, excellence, and teamwork are our keys to success. We know we are working with the best, and so we treat each other with respect and admiration as a matter of course.
Submit your application today and don't miss this opportunity to join the Best Digital team in the Region!
We truly appreciate all the hard, outstanding work our team does every day at Applaudo Studios, and that's why the perks we offer are thoughtfully designed as a way to thank them for their commitment and excellence.
Some of our perks and benefits:
- Celebrations
- Special discounts*
- Entertainment area*
- Modern Work Spaces*
- Great work environment
- Private medical insurance*
*Benefits may vary according to your location and/or availability. Request further information when applying.
Remote Work:
Yes
Employment Type:
Full-time