Who You'll Work With
You'll be part of the IT applications team as a Lead Data Engineer.
What You'll Do
- Collaborate with stakeholders and source data system teams to understand data requirements
- Architect and implement scalable workspaces, data lakes, dimensional models, data pipelines, data warehouses, and other ETL/ELT processes using Fabric.
- Work with Fabric assets, Power BI, and other services to build end-to-end data solutions
- Ensure data quality, security, and compliance with regulations by implementing data validation, logging, monitoring, and role-based access controls.
- Perform root cause analysis on internal/external data and processes to answer specific business questions and identify opportunities for improvement.
- Manage platform cost optimization, data quality/governance, and performance tuning
- Follow software quality process and methodology standards, including those for design, data quality, code version control, defect/change request tracking, documentation, work product review, unit testing, and environment management.
- Review requirements/user stories and provide feedback to the team, including participation in and input to the requirements process
- Integrate AI/ML models and GenAI capabilities into data products and workflows
- Help the QA and functional teams identify and define testing strategies for existing and new features
- Ensure that solutions developed by the development teams fit the business needs
- Work well under pressure and meet deadlines
- Be comfortable working evening hours (2pm to 11pm IST)
Qualifications:
- 8 years of experience in data engineering roles, preferably in a global enterprise environment
- Strong hands-on experience with Microsoft Fabric (Data Lake, Data Warehouse, data pipelines) and the broader Microsoft ecosystem.
- Expertise in Power BI semantic models and datasets for building dashboards and reports
- Strong DAX and Power Query skills
- Expert proficiency in SQL, Python, and PySpark for data processing
- Must have implemented ETL solutions to integrate data from various sources into Azure Data Lake and Data Warehouse
- Good knowledge of enterprise data warehousing (EDW)
- Strong understanding of data management processes such as data normalisation and modelling, as well as data security principles, data access control, and confidentiality.
- Good to have: experience with Copilot or other AI/ML solutions, with at least basic exposure to GenAI (LLMs, prompt engineering, AI API integration)
- Familiarity with software quality assurance best practices, methodologies, and tools such as Jira and Git
- Experience with other SaaS/cloud ERP and CRM systems such as NetSuite, Salesforce, or SAP S/4HANA is a plus
- Excellent problem-solving, communication, and collaboration skills
Additional Information:
Arista stands out as an engineering-centric company. Our leaders, including founders and engineering managers, are all engineers who understand sound software engineering principles and the importance of doing things right.
We hire globally into our diverse team. At Arista, engineers have complete ownership of their projects. Our management structure is flat and streamlined, and software engineering is led by those who understand it best. We prioritize the development and utilization of test automation tools.
Our engineers have access to every part of the company, providing opportunities to work across various domains. Arista is headquartered in Santa Clara, California, with development offices in Australia, Canada, India, Ireland, and the US. We consider all our R&D centers equal in stature.
Join us to shape the future of networking and be part of a culture that values invention, quality, respect, and fun.
Remote Work:
Yes
Employment Type:
Full-time