Role Description
- We are seeking experienced software data engineers in Charlotte, N.C., with 10 years of relevant experience to support the design and development of a strategic data platform for an investment bank.
Role Objectives
- These roles will be part of the Data Strategy team, which spans the Capital Markets and Securities teams' broker-dealer and swap-dealer entities.
- These roles will be involved in the active development of the data platform in close coordination with the client team, beginning with the establishment of a reference data system for securities and pricing data and later expanding to other data domains.
- The consulting team will need to follow internal development standards to contribute to the overall agenda of the Data Strategy team.
- The implementation of this strategic platform on the Azure cloud platform will require the solutions and know-how listed in the qualifications below.
Qualifications and Skills
- 10 years of experience in software development using Python, PySpark, and related frameworks.
- Proven experience as a Data Engineer, including hands-on experience with the Azure cloud.
- The candidate needs to demonstrate solid PySpark experience at the foundational level (building solutions from the ground up); see the PySpark sketch at the end of this section.
- Databricks experience does not necessarily imply PySpark proficiency.
- The Databricks framework can be used in many ways, including with Scala rather than Python.
- For this role, Databricks experience is nice to have.
- Experience implementing solutions using:
  - Azure cloud services
  - Azure Data Factory
  - Azure Data Lake Storage Gen2
  - Azure Databases
  - Azure Data Fabric
  - API gateway management
  - Azure Functions
- Well versed in Azure Databricks
- Strong SQL skills with RDBMS or NoSQL databases
- Experience developing APIs using FastAPI or similar Python frameworks (see the FastAPI sketch at the end of this section)
- Familiarity with the DevOps lifecycle (Git, Jenkins, etc.) and CI/CD processes
- Good understanding of ETL/ELT processes
- Experience in the financial services industry, including financial instruments, asset classes, and market data, is a plus.
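
As a reference point for the foundational PySpark requirement above, the following is a minimal, illustrative sketch: it builds its own SparkSession and derives a latest-price-per-instrument view with plain DataFrame operations, without relying on a Databricks runtime. The input path and column names (instrument_id, price, as_of_date) are hypothetical placeholders, not details of the client's platform.

```python
# Minimal, illustrative PySpark sketch (no Databricks runtime assumed).
# The input path and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("reference-data-sketch")
    .getOrCreate()
)

# Read raw pricing records: one row per instrument per as_of_date.
prices = spark.read.parquet("/data/raw/prices")

# Keep only the most recent price per instrument using a window function.
latest_first = Window.partitionBy("instrument_id").orderBy(F.col("as_of_date").desc())
latest_prices = (
    prices
    .withColumn("rn", F.row_number().over(latest_first))
    .filter(F.col("rn") == 1)
    .drop("rn")
)

latest_prices.write.mode("overwrite").parquet("/data/curated/latest_prices")

spark.stop()
```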
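
For the API development requirement, a minimal FastAPI sketch along the following lines illustrates the expected style of work. The resource name and fields (prices, instrument_id, price) are assumptions for illustration only, and the in-memory dictionary stands in for a real database.

```python
# Minimal, illustrative FastAPI sketch; the resource and its fields are
# placeholders, and the in-memory dict stands in for a real database.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="reference-data-sketch")

class Price(BaseModel):
    instrument_id: str
    price: float

_prices: dict[str, Price] = {}

@app.put("/prices/{instrument_id}")
def upsert_price(instrument_id: str, body: Price) -> Price:
    # Store or replace the price record for this instrument.
    _prices[instrument_id] = body
    return body

@app.get("/prices/{instrument_id}")
def get_price(instrument_id: str) -> Price:
    # Return the stored price, or 404 if the instrument is unknown.
    if instrument_id not in _prices:
        raise HTTPException(status_code=404, detail="instrument not found")
    return _prices[instrument_id]
```

Assuming the file is saved as app.py, it can be run locally with `uvicorn app:app --reload`.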