Role: Senior Data Architect
Location: Issaquah WA (Day 1 Onsite)
Job Type: Full-Time
Must Have Skills:
Data Pipeline, C#, Python, Google Cloud Platform (GCP), Data Quality
Job Description:
Looking for a Data Architect who will play a key role in designing, developing, and implementing data pipelines and data integration solutions using Python and Google Cloud Platform services.
Responsibilities:
- Develop, construct, test, and maintain data acquisition pipelines for large volumes of structured and unstructured data, including both batch and real-time processing.
- Develop and maintain data pipelines and ETL processes using Python (see the sketch after this list).
- Design, build, and optimize data models and data architecture for efficient data processing and storage.
- Implement data integration and data transformation workflows to ensure data quality and consistency.
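For illustration, here is a minimal sketch (not part of the original posting) of the kind of batch ETL pipeline these responsibilities describe, using Apache Beam's Python SDK. The bucket, project, table, and schema names are hypothetical.

    # Minimal batch ETL sketch with Apache Beam's Python SDK.
    # Reads CSV lines from GCS, parses them, and appends rows to BigQuery.
    # All resource names (bucket, project, table, schema) are hypothetical.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def parse_line(line):
        # Turn an "id,amount" CSV line into a BigQuery-ready dict.
        record_id, amount = line.split(",")
        return {"id": record_id, "amount": float(amount)}

    with beam.Pipeline(options=PipelineOptions()) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")
            | "Parse" >> beam.Map(parse_line)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.orders",
                schema="id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

The same pipeline runs on Dataflow by adding --runner=DataflowRunner (plus project and region) to the pipeline options, which is what makes Beam relevant to the GCP requirements below.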
Required:
- Working experience as a Data Engineer.
- Experience migrating large-scale applications from legacy systems to modern architectures.
- Strong programming skills in Python and experience with Spark for data processing and analytics.
- Experience with Google Cloud Platform services such as GCS, Dataflow, Cloud Functions, Cloud Composer, Cloud Scheduler, Datastream (CDC), Pub/Sub, BigQuery, and Dataproc, along with Apache Beam (batch and stream data processing).
- Ability to develop JSON messaging structures for integration with various applications.
- Experience leveraging DevOps and CI/CD practices (GitHub, Terraform) to ensure the reliability and scalability of data pipelines.
- Experience with scripting languages such as Shell and Perl.
- Ability to design and build ingestion pipelines using REST APIs (see the sketch after this list).
- Experience with data modeling, data integration, and ETL processes.
- Strong knowledge of SQL and database systems.
- Familiarity with managing cloud-native databases.
- Understanding of security integration in CI/CD pipelines.
- Understanding of data warehousing concepts and best practices.
- Proficiency in working with large-scale data sets and distributed computing frameworks.
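To make the REST ingestion requirement concrete, here is a minimal hedged sketch that pulls JSON records from a REST endpoint and publishes them to Pub/Sub for downstream pipeline processing. The endpoint URL, project ID, and topic name are placeholders, not details from this posting.

    # Sketch of a REST-to-Pub/Sub ingestion step.
    # Pulls JSON records from a REST endpoint and publishes each one to a
    # Pub/Sub topic for downstream Dataflow/Beam processing.
    # The URL, project ID, and topic name below are placeholders.
    import json
    import requests
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("example-project", "raw-orders")

    response = requests.get("https://api.example.com/v1/orders", timeout=30)
    response.raise_for_status()

    for record in response.json():
        data = json.dumps(record).encode("utf-8")
        # publish() returns a future; result() blocks until the broker acks.
        publisher.publish(topic_path, data=data).result()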
Note: Visa-independent candidates are highly preferred.