Integration Data Engineer

Job Location

Iselin, NJ - USA

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Our client, a major bank in Central NJ, is looking for an Integration Data Engineer.
Hybrid commute: 3 days on-site at Central NJ locations and 2 days remote per week.

This is a permanent, full-time career opportunity with a base salary range of $110K-$135K DOE, plus a bonus of around 20% and a great benefits package.

We are looking for an Integration Data Engineer with a background in SQL and data warehousing for enterprise-level systems. The ideal candidate is comfortable working with business users and also brings business-analyst expertise.

Major Responsibilities:
  • Design, develop, and deploy Databricks jobs to process and analyze large volumes of data (a minimal sketch follows this list).
  • Collaborate with data engineers and data scientists to understand data requirements and implement appropriate data processing pipelines.
  • Optimize Databricks jobs for performance and scalability to handle big data workloads.
  • Monitor and troubleshoot Databricks jobs; identify and resolve issues or bottlenecks.
  • Implement best practices for data management, security, and governance within the Databricks environment.
  • Experience designing and developing Enterprise Data Warehouse solutions.
  • Demonstrated proficiency with Data Analytics and Data Insights.
  • Proficient in writing SQL queries and programming, including stored procedures and reverse engineering existing processes.
  • Leverage SQL, a programming language (Python or similar), and/or ETL tools (Azure Data Factory, Databricks, Talend, and SnowSQL) to develop data pipeline solutions that ingest and exploit new and existing data sources.
  • Perform code reviews to ensure fit to requirements, optimal execution patterns, and adherence to established standards.
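
To illustrate the kind of Databricks work described in the first bullet, here is a minimal PySpark sketch of a batch job that ingests raw files, applies a simple transformation, and writes a Delta table. The paths, table name, and column names are hypothetical placeholders, not details taken from this posting.

```python
# Minimal PySpark batch job sketch (all paths, table names, and columns are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_ingest_job").getOrCreate()

# Ingest: read raw CSV files from a landing zone.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/landing/transactions/")
)

# Transform: standardize a column name, drop bad rows, stamp a load date.
clean = (
    raw.withColumnRenamed("txn_amt", "transaction_amount")
       .filter(F.col("transaction_amount").isNotNull())
       .withColumn("load_date", F.current_date())
)

# Load: append the result to a Delta table partitioned by load date.
(
    clean.write
    .format("delta")
    .mode("append")
    .partitionBy("load_date")
    .saveAsTable("analytics.transactions_clean")
)
```

On Databricks a job like this would typically be scheduled as a workflow and monitored for failures and bottlenecks; the Delta format and the partitioning choice here are assumptions made for the sake of the example.
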
Skills:
5 years - Enterprise Data Management
5 years - SQL Server-based development of large datasets
5 years - Data Warehouse architecture, with hands-on experience on the Databricks platform; extensive PySpark experience is good to have
3 years - Python (NumPy, Pandas) coding experience
3 years - experience in the Finance/Banking industry, with some understanding of Securities and Banking products and their data footprints
Experience with Snowflake utilities such as SnowSQL and Snowpipe - good to have
Experience in data warehousing - OLTP, OLAP, dimensions, facts, and data modeling (see the sketch after this list)
Previous experience leading an enterprise-wide Cloud Data Platform migration, with strong architectural and design skills
Capable of discussing enterprise-level services independent of technology stack
Experience with Cloud-based data architectures, messaging, and analytics
Superior communication skills
Cloud certification(s) preferred
Any experience with Regulatory Reporting is a plus
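
As a small illustration of the dimensional-modeling skills listed above, here is a hedged pandas sketch that reshapes an OLTP-style extract into a simple star schema (one dimension table plus one fact table) and then runs an OLAP-style rollup. The column names, data, and surrogate-key approach are illustrative assumptions, not a prescribed design.

```python
# Tiny star schema (dimension + fact) built from an OLTP-style extract.
# All column names and data are hypothetical.
import pandas as pd

# OLTP-style extract: one wide, denormalized table.
orders = pd.DataFrame({
    "order_id":   [1001, 1002, 1003],
    "customer":   ["Acme", "Globex", "Acme"],
    "segment":    ["Retail", "Institutional", "Retail"],
    "order_date": pd.to_datetime(["2024-01-05", "2024-01-06", "2024-02-01"]),
    "amount":     [250.0, 1200.0, 310.0],
})

# Dimension: one row per distinct customer, keyed by a surrogate key.
dim_customer = (
    orders[["customer", "segment"]]
    .drop_duplicates()
    .reset_index(drop=True)
)
dim_customer["customer_key"] = dim_customer.index + 1

# Fact: measures plus a foreign key that points at the dimension.
fact_orders = orders.merge(dim_customer, on=["customer", "segment"])[
    ["order_id", "customer_key", "order_date", "amount"]
]

# OLAP-style rollup: total amount by customer segment.
summary = (
    fact_orders.merge(dim_customer, on="customer_key")
    .groupby("segment", as_index=False)["amount"].sum()
)
print(summary)
```

In a warehouse such as Databricks or Snowflake the same shape would normally be expressed with SQL DDL and MERGE logic rather than pandas; the intent here is only to show the dimension/fact split and the kind of rollup it enables.
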

Education
Minimally, a BA degree in an engineering and/or computer science discipline
Master's degree strongly preferred


Please email your resume or use this link to apply directly:
email:
Check ALL our Jobs:
Keywords: SQL, Oracle, Python, Databricks, ETL, data warehousing, SSIS, Snowflake, NumPy, Pandas, PySpark, OLTP, OLAP

Employment Type

Full Time
