
Informatica Data Engineer

Employer Active

1 Vacancy

Job Location

USA

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Req ID: 2627337

Informatica Data Engineer
Remote Allowed

Please submit an updated resume (make sure the requested skills appear in the most recent projects; refer to the skill matrix). You need to complete all forms/pages of the attachments, i.e., the RTR, the skill matrix (number of years with each requested technology), and three references (refer to the forms). Two of the references must be supervisor-level; provide the full name, email, and phone number of each reference. Please provide the forms in Word format only.

I. DESCRIPTION OF SERVICES

Client requires the services of one (1) Informatica Data Engineer, hereafter referred to as Candidate(s), who meets the general qualifications of Applications/Software Development Developer/Programmer Analyst Level 3 and the specifications outlined in this document for the Client.

Job Description

Performs advanced (senior-level) data pipeline development work with Informatica Cloud (IICS), including data integration, source-to-target data modeling, Extract-Transform-Load (ETL) development, consuming Oracle data connections and RESTful API application-based data connections, targeting Snowflake data connections, designing Snowflake target data lake databases, and data warehouse modeling with ELT. This candidate will use Informatica Cloud to build mass ingestion pipelines or other EL processes from an Oracle transaction database(s) to our Snowflake Data Lake. Additional duties may include working within Snowflake to create other API-based data ingestion routines and/or setting up Data Sharing of these accumulated data. Dimensional modeling of data may be required in our Data Warehouse to accomplish other objectives, such as data reporting and improved-performance data handling.
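As a minimal illustration of the dimensional modeling mentioned above, the sketch below builds a tiny star schema and runs a typical reporting query against it. This is a sketch only: the table and column names are hypothetical, and it uses SQLite purely as a self-contained stand-in, while the posting's actual warehouse platform is Snowflake.

```python
import sqlite3

# Hypothetical star schema: one fact table keyed to dimension tables.
# All names are illustrative; the real target platform is Snowflake.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, cal_date TEXT);
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_orders  (
    order_id     INTEGER PRIMARY KEY,
    date_key     INTEGER REFERENCES dim_date(date_key),
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    amount       REAL
);
INSERT INTO dim_date     VALUES (20240101, '2024-01-01'), (20240102, '2024-01-02');
INSERT INTO dim_customer VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO fact_orders  VALUES (100, 20240101, 1, 10.0),
                                (101, 20240102, 1, 15.0),
                                (102, 20240102, 2, 20.0);
""")

# Typical reporting query: aggregate fact rows by a dimension attribute.
rows = db.execute("""
    SELECT c.name, SUM(f.amount)
    FROM fact_orders f JOIN dim_customer c USING (customer_key)
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)  # [('Acme', 25.0), ('Globex', 20.0)]
```

Separating facts from dimensions this way is what enables the "data reporting and improved-performance data handling" objectives: reporting queries join one narrow fact table to small dimension tables rather than scanning wide transactional rows.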

  • Development of ETL/ELT data mappings and workflows for data pipeline development with Informatica Cloud Data Integration.
  • Practical experience using and building Informatica Mass Ingestion pipelines.
  • Demonstrated experience with Oracle Database as a data connector source.
  • Expert with Snowflake as a target database platform.
  • Experience with the Snowflake platform and ecosystem. Knowledge of Snowflake data sharing and Snowpark is a plus.
  • Knowledge of the advantages of, and previous experience working with, Informatica pushdown optimization.
  • Experience with Snowflake database creation, optimization, and architectural advantages.
  • Practical experience with Snowflake SQL.
  • Determines database requirements by analyzing business operations, applications, and programming; reviewing business objectives; and evaluating current systems.
  • Obtains data model requirements; develops and implements data models for new projects; and maintains existing data models and data architectures.
  • Creates graphics and other flow diagrams (including ERDs) to present complex database designs and data models more simply.
  • Performs related work as assigned.
  • Practical experience with one-time data loads as well as Change Data Capture (CDC) for bulk data movement.
  • Creation of technical documentation for processes and interfaces is a key element of this role, as the team is working on release two of a multi-release effort.
  • Ability to review the work of others, troubleshoot, and provide feedback and guidance to meet tight deliverable deadlines is required.
  • Ability to promote code from development environments to production.
  • Familiarity with GitHub or equivalent version control systems.
  • Experience working with state agencies, as well as with security protocols and processes.
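The Change Data Capture (CDC) requirement above can be sketched as follows. This is not Informatica Cloud or Snowflake code; it uses in-memory SQLite databases with a hypothetical `orders` table as stand-ins for the Oracle source and Snowflake target, purely to show the watermark-then-upsert pattern that a CDC bulk-movement job implements.

```python
import sqlite3

# Hypothetical stand-ins for an Oracle source and a Snowflake target;
# this illustrates the CDC upsert pattern only, not the actual tooling.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for db in (src, tgt):
    db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT)")
    db.executemany("INSERT INTO orders VALUES (?,?,?)",
                   [(1, 10.0, "2024-01-01"), (2, 20.0, "2024-01-01")])  # initial full load

# Changes arrive in the source after the initial load.
src.execute("UPDATE orders SET amount = 25.0, updated_at = '2024-01-02' WHERE id = 2")
src.execute("INSERT INTO orders VALUES (3, 30.0, '2024-01-02')")

def capture_changes(source, watermark):
    """Rows changed since the last successful sync (the CDC delta)."""
    return source.execute(
        "SELECT id, amount, updated_at FROM orders WHERE updated_at > ?",
        (watermark,)).fetchall()

def apply_changes(target, rows):
    """Upsert the delta into the target (a merge, like Snowflake's MERGE)."""
    target.executemany(
        "INSERT INTO orders VALUES (?,?,?) "
        "ON CONFLICT(id) DO UPDATE SET amount = excluded.amount, "
        "updated_at = excluded.updated_at", rows)

delta = capture_changes(src, "2024-01-01")
apply_changes(tgt, delta)
print(tgt.execute("SELECT id, amount FROM orders ORDER BY id").fetchall())
# [(1, 10.0), (2, 25.0), (3, 30.0)]
```

Moving only the delta rather than reloading the full table is what distinguishes CDC from the one-time bulk loads the posting also asks about, and it is the same idea behind Informatica's incremental mass ingestion options.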

Employment Type

Full Time

Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala