Role: Data Engineer
Location: Milford or Milwaukee, WI (Onsite)
Skills: Data Warehousing applications, ETL (IBM InfoSphere), AWS infrastructure tools (EBS, S3, EC2, Elastic IP, Route 53, VPC), and cloud infrastructure management and automation technologies.
Job Description:
- IT experience in the design, development, and implementation of large- and medium-scale Data Warehousing applications using the IBM InfoSphere Information Server ETL tool (DataStage), SQL, and Unix.
- 5–7 years of experience with a broad portfolio of AWS infrastructure tools (EBS, S3, EC2, Elastic IP, Route 53, VPC) and with cloud infrastructure management and automation technologies.
- Experience working with structured, semi-structured, and unstructured data.
- Exposure to modern data/analytics architectures (Big Data, Cloud, etc.).
- Exposure to advanced analytics tools (Python, R, etc.).
- Scripting skills (shell, Python, Ruby) for monitoring and automation.
- Developing and maintaining pipelines for efficient data extraction, transformation, and loading (ETL) processes.
- Advanced knowledge and expertise in designing and developing ETL jobs using IBM InfoSphere DataStage, sound experience with other IIS suite components such as Information Analyzer, and basic knowledge of QualityStage.
- Performance tuning and optimization of ETL jobs and SQL queries in ad hoc and complex reporting environments.
- Worked with various databases, including Oracle 10g/9i/8i, DB2, SQL Server, and Teradata.
- Experience writing PL/SQL stored procedures, functions, and packages.
- Proficient in writing Unix shell scripts and extensive experience working with Unix operating systems.
- Extensive ETL experience loading data from flat files, XML, Oracle, and DB2 into ODBC databases, and performing various transformations using the Filter, Transformer, Join, Lookup, Aggregator, Change Capture, Remove Duplicates, Shared Container, Nested Condition, Routines, and Command activity stages to create robust mappings in DataStage Designer.
Must have:
- Proven expertise in designing, developing, and deploying ETL workflows using IBM InfoSphere DataStage within large-scale data warehousing environments.
- Hands-on experience with AWS cloud infrastructure, including services such as EC2, S3, EBS, VPC, Route 53, and Elastic IP, for scalable and secure data solutions.
- Strong knowledge of data pipeline development, including extraction, transformation, and loading (ETL) of structured, semi-structured, and unstructured data.
- Proficient in SQL and PL/SQL programming, including writing stored procedures, functions, and complex queries for data transformation and validation.
- Adept at Unix shell scripting for process automation, job scheduling, and system monitoring.
- Exposure to modern data architectures, including big data and cloud-based analytics platforms.
- Familiar with advanced analytics tools such as Python and R for data exploration and statistical analysis.
- Skilled in ETL performance tuning and optimization for high-volume data processing and real-time reporting needs.
- Experienced with relational databases, including Oracle, DB2, SQL Server, and Teradata.
- Knowledge of data profiling and quality assessment using Information Analyzer and basic exposure to QualityStage.
- Expertise in building complex transformations in DataStage using Transformers, Joins, Lookups, Aggregators, Routines, and more.
Regards,
Manoj
Derex Technologies INC
Contact : Ext 206
Additional Information :
All your information will be kept confidential according to EEO guidelines.
Remote Work :
No
Employment Type :
Full-time