Role: ETL/Informatica Developer
Mode of Employment: Direct Hire (FTE)
Location: Charlotte NC (Onsite)
Roles & Responsibilities
ETL Process Development:
- Design, develop, and maintain ETL processes using Informatica PowerCenter or other relevant Informatica tools.
- Deep understanding of HDFS, YARN, MapReduce, Hive, Pig, HBase, Flume, Sqoop, ZooKeeper, and Oozie.
- Experience with Spark, Kafka, and NoSQL databases.
- Experience with Agile methodology.
- Experience with code versioning tools such as Bitbucket.
- SQL Proficiency: Utilize SQL/PL-SQL to extract, transform, and load data.
- Exposure to advanced transformations, such as parsing JSON/XML messages.
- Experience with job scheduling tools such as AutoSys.
- Data Integration: Integrate data from various sources, ensuring data consistency and quality.
- Data Warehouse Design: Design and maintain data warehouses to support business intelligence activities.
- Performance Optimization: Optimize SQL scripts and queries for speed and efficiency.
- Troubleshooting: Identify and resolve issues in ETL processes.
- Documentation: Create and maintain technical documentation for ETL processes.
- Testing: Perform unit, integration, and system testing on ETL processes.
- Collaboration: Collaborate with cross-functional teams to ensure successful implementation of ETL processes.
- Data Quality: Ensure data quality by implementing data cleansing and transformation processes.
- Data Modeling: Develop and maintain relational and dimensional data models.