Position Name: Senior Data Engineer
Work Location: PARK5 City & State: Reston Virginia
Duration: 12 Months
We are looking for a senior Ab Initio developer with many years of data integration and data warehouse project experience. NoSQL/MongoDB and AWS experience is highly preferred. The candidate must be a self-starter and able to work with minimal supervision. Please start sourcing for this position; let me know if you have any questions.

The Senior Data Engineer is responsible for orchestrating, deploying, maintaining, and scaling cloud or on-premise infrastructure targeting big data and platform data management (relational and NoSQL, distributed and converged), with an emphasis on reliability, automation, and performance. This role will focus on developing solutions and helping transform the company's platforms to deliver data-driven, meaningful insights and value to the company.

ESSENTIAL FUNCTIONS:
1) Work with Business Analysts and the Product team to gather data requirements
2) Design and build Ab Initio data graphs and data pipelines to extract data from various databases, flat files, and message queues
3) Transform the data to create a consumable data layer for various application uses
4) Support data pipelines with bug fixes and additional enhancements
5) Document technical design, operational runbook, etc.

Qualifications: To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

Education Level: Bachelor's Degree
Education Details: Computer Science, Information Technology, Engineering, or a related field

Experience:
- Total of 10 years of IT experience, predominantly in the Data Integration / Data Warehouse area
- At least 5 years of ETL design and development experience using Ab Initio (required)
- 1-2 years of data integration project experience on a Hadoop platform, preferably Cloudera
- Ab Initio CDC (Change Data Capture) experience in a Data Integration/ETL project setting is great to have
- Working knowledge of HDFS, Hive, Impala, and other related Hadoop technologies
- Working knowledge of various AWS services is nice to have
- Sound understanding of SQL and the ability to write well-performing SQL queries
- Good knowledge of OLTP and OLAP data models and other data warehouse fundamentals
- Rigor in high code quality, automated testing, and other engineering best practices; ability to write reusable code components
- Ability to unit test code thoroughly and to troubleshoot issues in production environments
- Some working experience with Unix/Linux shell scripting (required)
- Must be able to work independently and support other junior developers as needed
- Some Java development experience is nice to have
- Knowledge of Agile development practices is required