Remote Data Engineer needs 10 years of experience.
Remote Data Engineer requires:
Must be local to Santa Clara.
Cloud Platform Expertise (GCP Focus): Extensive hands-on experience working in dynamic cloud environments, with a strong preference for Google Cloud Platform (GCP) services, specifically:
BigQuery: Expert-level skills in data ingestion, performance optimization, and data modeling within a petabyte-scale environment.
Experience with other relevant GCP services such as Cloud Storage, Cloud Dataflow/Beam, or Pub/Sub.
Programming & Querying:
Python: Expert-level programming proficiency in Python, including experience with relevant data engineering libraries.
SQL: A solid command of advanced SQL for complex querying, data processing, and performance tuning.
Data Pipeline Orchestration: Prior experience using workflow management and orchestration tools (e.g., Apache Airflow, Cloud Composer, Dagster, or similar).
DevOps/CI/CD: Experience with version control (Git) and familiarity with CI/CD practices and tools (e.g., GitLab, GitHub Actions) to automate deployment and testing processes.
Remote Data Engineer duties:
Design, implement, and optimize clean, well-structured, and performant analytical datasets to support high-volume reporting, business analysis, and data science model development.
Architect, build, and maintain scalable and robust data pipelines for diverse applications, including business intelligence and advanced analytics.
Implement and support Big Data solutions for both batch (scheduled) and real-time/streaming analytics.
Work closely with product managers and business teams to understand data requirements and translate them into technical solutions.