Data Architect with Python and GCP Cloud, Full-Time Role

Saransh Inc


Job Location:

Issaquah, WA - USA

Monthly Salary: Not Disclosed
Posted on: 30+ days ago
Vacancies: 1 Vacancy

Job Summary

Role: Senior Data Architect
Location: Issaquah WA (Day 1 Onsite)
Job Type: Full-Time
Must-Have Skills:
Data Pipelines, C#, Python, Google Cloud Platform (GCP), Data Quality
Job Description:
Looking for a Data Architect who will play a key role in designing, developing, and implementing data pipelines and data integration solutions using Python and Google Cloud Platform services.
Responsibilities:
  • Develop, construct, test, and maintain data acquisition pipelines for large volumes of structured and unstructured data, including both batch and real-time processing.
  • Develop and maintain data pipelines and ETL processes using Python.
  • Design, build, and optimize data models and data architecture for efficient data processing and storage.
  • Implement data integration and data transformation workflows to ensure data quality and consistency.
Required:
  • Working experience as a Data Engineer
  • Experienced in migrating large-scale applications from legacy systems to modern architectures.
  • Strong programming skills in Python and experience with Spark for data processing and analytics.
  • Experience with Google Cloud Platform services such as GCS, Dataflow, Cloud Functions, Cloud Composer, Cloud Scheduler, Datastream (CDC), Pub/Sub, BigQuery, Dataproc, etc., with Apache Beam (batch and stream data processing).
  • Ability to develop JSON messaging structures for integration with various applications.
  • Leverage DevOps and CI/CD practices (GitHub, Terraform) to ensure the reliability and scalability of data pipelines.
  • Experience with scripting languages such as Shell, Perl, etc.
  • Ability to design and build ingestion pipelines using REST APIs.
  • Experience with data modeling data integration and ETL processes
  • Strong knowledge of SQL and database systems
  • Familiarity with managing cloud-native databases.
  • Understanding of security integration in CI/CD pipelines.
  • Understanding of data warehousing concepts and best practices
  • Proficiency in working with large-scale data sets and distributed computing frameworks
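To make the JSON messaging and data-quality requirements above concrete, here is a minimal, hypothetical sketch of the kind of transform step this role involves: validating raw records against simple quality rules and wrapping the survivors in a JSON message envelope for a downstream consumer (e.g. a Pub/Sub topic). All field and schema names are illustrative, not taken from the posting.

```python
import json

# Illustrative data-quality rule: these fields must be present.
REQUIRED_FIELDS = ("order_id", "amount")

def validate(record: dict) -> bool:
    """Basic quality check: required fields exist and amount is numeric."""
    return (all(f in record for f in REQUIRED_FIELDS)
            and isinstance(record["amount"], (int, float)))

def to_message(record: dict) -> str:
    """Wrap a clean record in a versioned JSON envelope for integration."""
    return json.dumps({"schema": "orders.v1", "payload": record}, sort_keys=True)

# Second record fails validation (missing "amount") and is dropped.
raw = [{"order_id": 1, "amount": 9.5}, {"order_id": 2}]
messages = [to_message(r) for r in raw if validate(r)]
```

In a production pipeline the same validate/transform functions would typically run inside Apache Beam `ParDo` steps on Dataflow rather than a list comprehension.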
Note: Visa-independent candidates are highly preferred.
