About this role
At BlackRock, technology has always been at the core of what we do, and today our technologists continue to shape the future of the industry with their innovative work. We are not only curious but also collaborative and eager to embrace experimentation as a means to solve complex challenges. Here you'll find an environment that promotes working across teams, businesses, regions and specialties, and a firm committed to supporting your growth as a technologist through curated learning opportunities, tech-specific career paths, and access to experts and leaders around the world.
We are seeking a highly skilled and motivated senior-level Data Engineer to join the Private Market Data Engineering team within Aladdin Data at BlackRock, to drive our Private Market Data Engineering vision of making private markets more accessible and transparent for clients. In this role you will work cross-functionally with Product, Data Research, Engineering and Program Management.
Engineers looking to work in the areas of orchestration, data modeling, data pipelines, APIs, storage, distribution, distributed computation, consumption and infrastructure are ideal candidates. The candidate will have extensive experience developing data pipelines using Python, the Apache Airflow orchestration platform, DBT (Data Build Tool), Great Expectations for data validation, Apache Spark, MongoDB, Elasticsearch, Snowflake and PostgreSQL. In this role you will be responsible for designing, developing and maintaining robust, scalable data pipelines. You will collaborate with various stakeholders to ensure the data pipelines are efficient, reliable and meet the needs of the business.
Design, develop and maintain data pipelines using the Aladdin Data Enterprise Data Platform framework.
Develop data transformation using DBT (Data Build Tool) with SQL or Python.
Develop ETL/ELT data pipelines using Python and SQL, and deploy them as containerized applications on a Kubernetes cluster.
Ensure data quality and integrity through automated testing and validation using tools like Great Expectations.
Implement all observability requirements in the data pipeline.
Optimize data workflows for performance and scalability.
Monitor and troubleshoot data pipeline issues, ensuring timely resolution.
Document data engineering processes and best practices whenever required.
Develop APIs for data distribution on top of the standard data model of the Enterprise Data Platform.
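The responsibilities above center on the extract, transform, validate and load stages of a data pipeline. A minimal, purely illustrative sketch of those stages in plain Python (all names and sample data are hypothetical; in this role, orchestration would live in Airflow DAGs, transformations in DBT models, and validation in Great Expectations suites):

```python
from dataclasses import dataclass


@dataclass
class Trade:
    fund: str
    amount: float


def extract() -> list[dict]:
    # Stand-in for reading raw records from a source system
    # (e.g. an upstream API, MongoDB, or PostgreSQL).
    return [{"fund": "Fund A", "amount": "100.5"},
            {"fund": "Fund B", "amount": "250.0"}]


def transform(rows: list[dict]) -> list[Trade]:
    # Stand-in for a DBT model: cast types and normalize fields.
    return [Trade(fund=r["fund"].strip(), amount=float(r["amount"]))
            for r in rows]


def validate(trades: list[Trade]) -> None:
    # Great-Expectations-style expectations, expressed here as assertions.
    assert all(t.amount >= 0 for t in trades), "expect amount >= 0"
    assert len({t.fund for t in trades}) == len(trades), "expect fund unique"


def load(trades: list[Trade]) -> int:
    # Stand-in for writing to a warehouse table (e.g. Snowflake).
    return len(trades)


def run_pipeline() -> int:
    # In production this sequencing would be an orchestrated DAG,
    # with each step as a separate, monitored task.
    trades = transform(extract())
    validate(trades)
    return load(trades)


print(run_pipeline())  # prints 2 (rows loaded)
```

The point of the sketch is the separation of stages: each step has a single responsibility, which is what makes pipelines like these testable, observable and easy to orchestrate as independent tasks.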
Must have 5 to 8 years of experience in data engineering with a focus on building data pipelines.
Strong programming skills in Python.
Experience with Apache Airflow or any other orchestration framework for data orchestration.
Proficiency in DBT for data transformation and modeling.
Experience with data quality validation tools like Great Expectations or any other similar tools.
Strong SQL skills and experience with relational databases like SQL Server and PostgreSQL.
Experience with cloud-based data warehouse platforms like Snowflake.
Experience working with NoSQL databases like Elasticsearch and MongoDB.
Experience working with container orchestration platforms like Kubernetes in AWS and/or Azure cloud environments.
Experience with cloud platforms like AWS and/or Azure.
Experience working with backend microservices and APIs using Java or C#.
Ability to work collaboratively in a team environment.
Detail-oriented, with a passion for learning new technologies and strong analytical and problem-solving skills.
Experience with Financial Services applications is a plus.
Effective communication skills both written and verbal.
Bachelor's or Master's degree in Computer Science, Engineering or a related field.
Our benefits
To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents, and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.
Our hybrid work model
BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.
About BlackRock
At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children's educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress.
This mission would not be possible without our smartest investment: the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued and supported, with networks, benefits and development opportunities to help them thrive.
For additional information on BlackRock, please visit our website, Twitter (@blackrock) and LinkedIn. BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other attributes protected at law.
Required Experience: IC
Full-Time