As a Data Architect you will be expected to take an architecture lead role on our clients' solution delivery engagements, with high levels of customer engagement. This will involve ongoing analysis of business requirements throughout the lifetime of the service. Candidates will have a strong understanding of data architecture, analytics design, and project delivery lifecycles, with an emphasis on working in client-facing environments.
Qualifications:
- A minimum of 5 years' data architecture experience.
- Data Architecture: frameworks, standards, and principles, e.g. DAMA-DMBOK, DSA, and relational/non-relational data modelling across conceptual, logical, and physical domains.
- Data management: data quality, metadata management, reference and master data management, data integration & interoperability, and data storage.
- Data governance: business glossary, data standards, data catalogues, data dictionaries, and data lineage.
- Enterprise Data Warehouse/Lakehouse/Data Mart and analytics design.
- Experience in formal architectural tools, methods, and documentation.
- Experience with utility customer environments is beneficial, including CIM, CNAIM, and RIGs data experience.
- Understanding of business data and its various sources, including but not limited to structured data within SQL Server, SAP S/4HANA, Oracle, MongoDB, and PostgreSQL, and unstructured data within multiple EDRMS and Content Management Systems.
- Understanding of streaming data technologies and methodologies.
- Experience in mainstream cloud Data Lakehouse platforms (such as Apache Spark, Microsoft Fabric, Databricks, and Snowflake) and associated industry-standard/portable data formats (e.g. Delta Lake, Iceberg, Parquet, CSV, JSON, Avro, ORC, and XML).
- Experience in analysing and understanding enterprise business data sources, data volumes, data velocity, data variety, and data value, and advising on the tools and techniques that best fit each layer within the Lakehouse architecture.
- Experience in implementing a data lake schema based on analysis of data lake use cases, performance, and flexibility needs.
- An understanding of data lifecycle management within ETL and data streaming processes.
- An understanding of data quality frameworks within Data Lakehouse, ETL, and data streaming processes.
- An understanding of the importance of a data catalogue within a Data Lakehouse.
- Experience in using a range of tools for performing ETL/data orchestration.
- Experience in following best practice when performing ETL, such as data cleansing, validation, enrichment, deduplication, and lineage.
- Experience in using a range of tools and languages to access and retrieve data from the data lake, e.g. Python, PySpark, and SQL.
- Experience in implementing security and compliance within the Data Lakehouse to prevent unauthorized access, modification, and leakage.
Additional Information:
At Version 1 we believe in providing our employees with a comprehensive benefits package that prioritises their wellbeing, professional growth, and financial stability.
One of our standout advantages is the ability to work with a hybrid schedule along with business travel, allowing our employees to strike a balance between work and life. We also offer a range of tech-related benefits, including an innovative Tech Scheme to help keep our team members up to date with the latest technology.
We prioritise the health and safety of our employees, providing private medical and life insurance coverage as well as free eye tests and contributions towards glasses. Our team members can also stay ahead of the curve with incentivized certifications and accreditations, including AWS, Microsoft, Oracle, and Red Hat.
Our employee-designed Profit Share scheme divides a portion of our company's profits each quarter amongst employees. We are dedicated to helping our employees reach their full potential, offering Pathways Career Development Quarterly, a programme designed to support professional growth.
Ekta Bahl, Talent Acquisition Capability Partner
#LI-EB1
Remote Work:
No
Employment Type:
Full-time