Description:
As a Senior Data Engineer, you will leverage your deep knowledge of GCP and your experience with cloud migration to build and maintain scalable, secure, and reliable data solutions. You will collaborate closely with architects, engineers, and operations teams to deliver high-quality cloud-based solutions.
Qualifications:
- 8+ years of experience in data engineering, with at least 3 years of hands-on data engineering experience on Google Cloud-based data solutions.
- Strong hands-on experience with Google Cloud Platform (GCP) services, including BigQuery, Dataflow, Dataproc, Pub/Sub, and Data Catalog/Dataplex.
- Proficiency in SQL, Python, and PySpark or Apache Beam.
- Strong understanding of data warehousing, data analysis, data profiling, data quality, and data mapping.
- Experience building and optimizing ETL/ELT processes for large-scale data environments.
- Experience migrating on-premises warehouses, ETL workflows, and BI tools to GCP.
- Familiarity with data modeling, schema design, and BigQuery optimization.
- Hands-on experience with infrastructure-as-code tools like Terraform.
- Experience working with real-time data streaming and batch processing.
- Strong problem-solving skills and ability to work in a fast-paced environment.
- Excellent communication and collaboration skills.
- Experience with CI/CD tools and automation frameworks (e.g., Jenkins, GitLab CI, Terraform, Tekton).
Responsibilities:
- Design, develop, and optimize data pipelines and ETL workflows using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage.
- Perform data cleaning, transformation, and validation to ensure accuracy and consistency across various data sources.
- Understand solution patterns, help application teams implement them, and maintain data warehousing solutions to support analytics and reporting needs.
- Ensure data integrity, security, and governance best practices are followed across all data solutions.
- Work closely with cross-functional application teams to provide hands-on help with migrations.
- Work closely with BI teams, data scientists, analysts, and business stakeholders to understand data needs and translate them into scalable solutions.
- Monitor, troubleshoot, and improve data pipelines to ensure high availability and performance.
- Advocate for best practices in data modeling, schema design, and query optimization.
- Orchestrate data processing and workflows using Astronomer and Terraform.
- Collaborate in the design and implementation of machine learning data pipelines where applicable.
- Stay up to date with the latest GCP data technologies and industry trends.
- Leverage LLMs to create accelerators wherever feasible.
Preferred Qualifications:
- GCP certifications (Professional Cloud Architect & Professional Data Engineer).
- Experience with machine learning pipelines and AI-driven data solutions.
- Experience with IBM DataStage, Informatica, dbt, and Databricks.
- Knowledge and understanding of BI tools such as Qlik Sense, Power BI, etc.
- Experience with containerization (Docker, Kubernetes).