Description
This role is for an experienced GCP Data Engineer who can build cloud analytics platforms to meet expanding business requirements with speed and quality using lean Agile practices. You will analyse and manipulate large datasets supporting the enterprise, activating data assets to support Enabling Platforms and Analytics in GCP. You will be responsible for designing the transformation and modernisation on GCP, as well as landing data from source applications into GCP. Experience with large-scale solutions and operationalising data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates with a broad set of technology skills across these areas who can demonstrate an ability to design the right solutions, with an appropriate combination of GCP and third-party technologies, for deployment on the Google Cloud Platform.
The key deliverables include Data Platform migration and modernisation from Teradata to GCP, enabling a modern technical solution that not only handles existing products and services but also provides leading-edge digital data product capability, along with the operational efficiency and tooling to manage the business day-to-day and to grow and innovate into the future.
Responsibilities:
- Lead the technical Data Engineering team in Chennai, working closely with Data business partners to build the modern data warehouse in GCP using Google-native tools.
- Work closely with teams in the US and Europe to ensure a robust, integrated migration aligned with Global Data Engineering patterns and standards.
- Design and deploy data pipelines with automated data lineage.
- Develop reusable Data Engineering patterns
- Design and build production data engineering solutions that deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- Build Analytical Domains for Originations, Dealer, Vehicle, and Wholesale by collaborating with Product Owners.
- Build Data Marts/Products from the Analytical Domains to cater to end-user requirements.
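As a rough illustration of the "reusable Data Engineering patterns" responsibility above (purely a sketch; the names `run_pipeline`, `drop_missing_id`, and `upper_region` are hypothetical and not part of any Ford or GCP codebase), a reusable pipeline pattern in Python might compose small, independently testable steps:

```python
from typing import Callable, Iterable, List

# A "step" consumes records and yields transformed records.
Step = Callable[[Iterable[dict]], Iterable[dict]]

def run_pipeline(records: Iterable[dict], steps: List[Step]) -> List[dict]:
    """Apply each step in order; each step stays reusable across pipelines."""
    for step in steps:
        records = step(records)
    return list(records)

# Example steps: filter out invalid rows, then normalise a field.
def drop_missing_id(records: Iterable[dict]) -> Iterable[dict]:
    return (r for r in records if r.get("id") is not None)

def upper_region(records: Iterable[dict]) -> Iterable[dict]:
    return ({**r, "region": r["region"].upper()} for r in records)

rows = [{"id": 1, "region": "eu"}, {"id": None, "region": "us"}]
print(run_pipeline(rows, [drop_missing_id, upper_region]))
# → [{'id': 1, 'region': 'EU'}]
```

The same composition idea scales up to frameworks such as Apache Beam (used by Dataflow), where each step would be a `PTransform` in a pipeline graph rather than a plain generator function.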
Position Opportunities:
The Senior Data Engineer role within FCE Data Engineering supports the following opportunities for successful individuals:
- Key player in a high-priority program to unlock the potential of Data Engineering Products and Services and to secure operational resilience for Ford Credit Europe
- Explore and implement leading-edge technologies, tooling, and software development best practices
- Experience of leading specific business and technical strategies, working with senior stakeholders including FCE Regulatory Reporting and FCE Data Management Services as the core customers
- Experience of managing data warehousing and product delivery within a financially regulated environment
- Experience of collaborative development practices within an open-plan, team-designed environment
- Experience of working with third-party suppliers and of supplier management
- Continued personal and professional development with support and encouragement for further certification
Qualifications
Essential:
- 10 years of experience in data engineering with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles)
- 7 years of complex SQL development experience
- 3 years of Cloud experience (GCP preferred) with solutions designed and implemented at production scale
- Strong understanding of key GCP services, especially those related to data processing (batch/real-time) leveraging Terraform, BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Build, Airflow, and Pub/Sub, alongside storage services including Cloud Storage, Bigtable, and Cloud Spanner
- Excellent problem-solving skills, with the ability to design and optimise complex data pipelines
- Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global, and diverse team
- Experience developing with microservice architecture on a container orchestration framework
- Experience designing pipelines and architectures for data processing
- Strong evidence of self-motivation to continuously develop your own engineering skills and those of the team
- Proven record of working autonomously in areas of high ambiguity without day-to-day supervisory support
- Evidence of a proactive mindset to problem solving and willingness to take the initiative
- Strong prioritisation, coordination, organisational, and communication skills, and a proven ability to balance workload and competing demands to meet deadlines
Desired:
- Professional Certification in GCP (e.g. Professional Data Engineer).
- Data engineering or development experience gained in a regulated financial environment
- Experience with Teradata to GCP migrations is a plus.
- Strong expertise in SQL and experience with programming languages and frameworks such as Python, Java, and/or Apache Beam
- Experience working with and managing Senior Stakeholder expectations and delivering against a strategic road map within a product organisation
- Experience of coaching and mentoring Data Engineers
- Experience with data security governance and compliance best practices in the cloud.
- Knowledge of additional cloud services and infrastructure as code (Terraform, CloudFormation)
- Experience in building solution architectures, provisioning infrastructure, and delivering secure and reliable data-centric services and applications in GCP
- An understanding of current architecture standards and digital platform services strategy
Required Experience:
Senior IC