Job Description:
We are looking for a highly skilled and motivated Data Engineer to join our team. The ideal candidate will be responsible for designing, building, and maintaining scalable data infrastructure that drives business intelligence, advanced analytics, and machine learning initiatives. You must be comfortable working autonomously, navigating complex challenges, and driving projects to successful completion in a dynamic cloud environment.
Core Responsibilities
Design and Optimization: Design, implement, and optimize clean, well-structured, and performant analytical datasets to support high-volume reporting, business analysis, and data science model development.
Pipeline Development: Architect, build, and maintain scalable and robust data pipelines for diverse applications, including business intelligence and advanced analytics.
Big Data & Streaming: Implement and support Big Data solutions for both batch (scheduled) and real-time/streaming analytics.
Collaboration: Work closely with product managers and business teams to understand data requirements and translate them into technical solutions.
DevOps Integration: Collaborate with DevOps teams to ensure smooth deployment, monitoring, and maintenance of data pipelines and infrastructure in cloud environments.
Required Skills & Experience
Cloud Platform Expertise (GCP Focus): Extensive hands-on experience working in dynamic cloud environments, with a strong preference for Google Cloud Platform (GCP) services, specifically:
o BigQuery: Expert-level skills in data ingestion, performance optimization, and data modeling within a petabyte-scale environment.
o Experience with other relevant GCP services, such as Cloud Storage, Cloud Dataflow/Beam, or Pub/Sub.
Programming & Querying:
o Python: Expert-level programming proficiency in Python, including experience with relevant data engineering libraries.
o SQL: A solid command of advanced SQL for complex querying, data processing, and performance tuning.
Data Pipeline Orchestration: Prior experience using workflow management and orchestration tools (e.g., Apache Airflow, Cloud Composer, Dagster, or similar).
DevOps/CI/CD:
o Strong understanding of DevOps principles and practices.
o Experience with CI/CD pipelines, automation tools, and deployment strategies.
o Familiarity with version control systems (Git) and tools like GitLab CI/CD, GitHub Actions, or Jenkins.
o Knowledge of containerization (Docker) and orchestration tools (Kubernetes) is a plus.
Monitoring & Automation: Ability to implement monitoring solutions and automate operational tasks to ensure reliability and scalability.