Google Cloud Data Engineer
Job Description:
We are seeking a highly skilled Data Engineer with extensive experience in Google Cloud Platform (GCP) data services and big data technologies. The ideal candidate will be responsible for designing, implementing, and optimizing scalable data solutions while ensuring high performance, reliability, and security.
Key Responsibilities:
Design, develop, and maintain scalable data pipelines and architectures using GCP data services.
Implement and optimize solutions using BigQuery, Dataproc, Composer, Pub/Sub, Dataflow, GCS, and Bigtable.
Work with GCP databases such as Bigtable, Spanner, Cloud SQL, and AlloyDB, ensuring performance, security, and availability.
Develop and manage data processing workflows using Apache Spark, Hadoop, Hive, Kafka, and other big data technologies.
Ensure data governance and security using Dataplex, Data Catalog, and other GCP governance tooling.
Collaborate with DevOps teams to build CI/CD pipelines for data workloads using Cloud Build, Artifact Registry, and Terraform.
Optimize query performance and data storage across structured and unstructured datasets.
Design and implement streaming data solutions using Pub/Sub, Kafka, or equivalent technologies.
Required Skills & Qualifications:
8-15 years of experience
Strong expertise in GCP services: Dataflow, Pub/Sub, Cloud Composer, Workflows, BigQuery, Cloud Run, and Cloud Build.
Proficiency in Python and Java, with hands-on experience in data processing and ETL pipelines.
In-depth knowledge of relational databases (SQL, MySQL, PostgreSQL, Oracle) and NoSQL databases (MongoDB, Scylla, Cassandra, DynamoDB).
Experience with big data platforms such as Cloudera, Hortonworks, MapR, Azure HDInsight, and IBM Open Platform.
Strong understanding of AWS data services such as Redshift, RDS, Athena, and SQS/Kinesis.
Familiarity with data formats such as Avro, ORC, and Parquet.
Experience handling large-scale data migrations and implementing data lake architectures.
Expertise in data modeling, data warehousing, and distributed data processing frameworks.
GCP Professional Data Engineer certification or equivalent.
Good to Have:
Experience in BigQuery, Presto, or equivalent.
Exposure to Hadoop, Spark, Oozie, and HBase.
Understanding of cloud database migration strategies.
Knowledge of GCP data governance and security best practices.
Together, as owners, let's turn meaningful insights into action.
Life at CGI is rooted in ownership, teamwork, respect, and belonging. Here you'll reach your full potential because:
You are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction.
Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise.
You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons.
Come join our team, one of the largest IT and business consulting services firms in the world.
CGI is one of the few end-to-end consulting firms with the scale, reach, capabilities, and commitment to meet clients' enterprise digital transformation needs. Our 77,500 consultants and professionals work side-by-side with clients in 10 industries across more than 400 locations.