Experience: 6 to 10 Years
Responsibilities:
- Develop and maintain cloud solutions using GCP tools such as Google Cloud Storage, BigLake, BigQuery, Cloud Composer, and Dataform
- Design and implement efficient ETL/ELT processes for data warehousing and BI solutions, including optimization for bulk data handling
- Write and maintain high-quality code using SQL, Python, and PySpark
- Implement data historization and versioning strategies for software and data
- Create and maintain CI/CD pipelines using GitLab
- Analyze business requirements and design overall solutions for data integration and consolidation
- Apply agile methodologies such as Kanban or Scrum in project management
- Utilize data modeling techniques typical for data warehouses (DWH), including UML and metadata-based frameworks
- Collaborate in intercultural nearshore/offshore project environments
- Ensure adherence to DevOps practices and state-of-the-art build and deployment methods
Additional Information:
Please Note: Fraudulent job postings/job scams are increasingly common. Beware of misleading advertisements and fraudulent communications issuing offer letters on behalf of T-Systems in exchange for a fee. Please look for an authentic T-Systems email ID - .
Stay vigilant. Protect yourself from recruitment fraud!
To know more, please visit: Fraud Alert
Remote Work:
No
Employment Type:
Full-time