*Must be able to work on-site in Woburn, MA five days a week*
The Enterprise Data Management team is a nimble cross-functional group that thrives on solving complex data challenges. We manage enterprise data assets, engineer scalable reporting solutions, and deliver actionable insights that inform strategic decisions across the bank.
We're seeking an AI Developer with a strong analytical mindset and a passion for coding: someone who can bridge the gap between raw data and business intelligence. This role offers the opportunity to work with modern data stacks and contribute to the evolution of our data infrastructure.
KEY RESPONSIBILITIES
Data Analysis & Insights Generation (30%)
- Perform exploratory data analysis (EDA) and statistical profiling using Python (leveraging libraries such as Pandas, NumPy, and SciPy) and Java for backend data processing tasks.
- Develop reusable scripts and modular code for data wrangling, anomaly detection, and KPI tracking.
- Apply object-oriented programming principles to build scalable data pipelines and analytical utilities.
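A minimal sketch of the kind of profiling and anomaly-detection task described above, using Pandas. The dataset, column names, and z-score threshold are illustrative assumptions, not requirements from this posting:

```python
import pandas as pd

# Hypothetical transaction data; columns are illustrative only.
df = pd.DataFrame({
    "account": ["A", "A", "B", "B", "B"],
    "amount": [120.0, 95.0, 300.0, 310.0, 9_000.0],
})

# Statistical profiling: per-account summary statistics.
profile = df.groupby("account")["amount"].agg(["count", "mean", "std"])

# Simple anomaly flag via z-scores; a low threshold is used here only
# because this illustrative sample is tiny.
z = (df["amount"] - df["amount"].mean()) / df["amount"].std()
df["is_anomaly"] = z.abs() > 1.5
```

In practice this logic would be wrapped in reusable, tested functions (per the "modular code" bullet) rather than run as a flat script.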
Data Visualization & Reporting (40%)
- Design and implement interactive dashboards using Tableau, Power BI, or custom-built web interfaces.
- Utilize DAX, Power Query, and Salesforce APIs to integrate disparate data sources into unified reporting layers.
- Translate complex datasets into intuitive visual narratives that support executive decision-making.
Data Extraction & Preparation (40%)
- Build and maintain ETL workflows using Python, Java, and SQL-based tools to extract data from PostgreSQL, MSSQL, and cloud-based data lakes.
- Automate data cleansing and transformation routines using Apache POI (for Excel automation in Java), VBA, and Power Query.
- Ensure data integrity through rigorous validation, schema enforcement, and exception handling.
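A minimal sketch of the validation, schema enforcement, and exception handling mentioned above, as it might appear in a Python ETL step. The expected schema, column names, and rules are illustrative assumptions:

```python
import pandas as pd

# Assumed schema for illustration only.
EXPECTED_COLUMNS = {"account_id", "balance", "as_of_date"}

def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Reject frames that violate the expected schema or basic rules."""
    missing = EXPECTED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"missing columns: {sorted(missing)}")
    if df["account_id"].isna().any():
        raise ValueError("null account_id found")
    # Coerce dates; unparseable values become NaT and are rejected.
    df = df.assign(as_of_date=pd.to_datetime(df["as_of_date"], errors="coerce"))
    if df["as_of_date"].isna().any():
        raise ValueError("unparseable as_of_date values")
    return df

clean = validate(pd.DataFrame({
    "account_id": [1, 2],
    "balance": [100.0, 250.5],
    "as_of_date": ["2024-01-31", "2024-02-29"],
}))
```

Raising early on bad input keeps downstream transformations simple and makes failures visible in pipeline logs.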
REQUIREMENTS
- Programming Proficiency: Strong command of Python for data analysis and scripting, and working knowledge of Java for backend data processing and integration tasks.
- SQL Expertise: Advanced querying skills across relational databases (PostgreSQL, MSSQL).
- Data Engineering Mindset: Familiarity with ETL concepts, data modeling, and pipeline orchestration.
- Tool-Agnostic Flexibility: Comfortable switching between tools and languages to solve problems efficiently.
- Collaborative Communication: Ability to work closely with data scientists, business stakeholders, and technical teams to translate requirements into analytical solutions.
NICE TO HAVES (Or Things You'll Get To Learn)
- Experience with DuckDB, Polars, or other high-performance analytical engines.
- Exposure to cloud data platforms like AWS Redshift, Azure Synapse, or Google BigQuery.
- Familiarity with Git for version control and collaborative development.
- Interest in machine learning, predictive modeling, or statistical inference.
- Prior experience in financial services or other regulated industries.
QUALIFICATIONS
- 2 years of experience in data analysis, software development, or business intelligence, preferably in financial services or a regulated industry.
- Proficiency in Python and Java, with experience in data manipulation, automation, and backend integration.
- Strong SQL skills and familiarity with relational databases (PostgreSQL, MSSQL).
- Experience with data visualization tools (Power BI, Tableau) and dashboard development.
- Familiarity with ETL processes, data modeling, and version control systems (e.g., Git).
- Experience with Excel (including Power Query and VBA) and process documentation tools (Visio, Lucidchart).
- Excellent communication and stakeholder management skills.
- Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field.
- High proficiency in technical writing and documentation.
Required Skills: AI
Basic Qualification:
Additional Skills:
Background Check: No
Drug Screen: No