Role: Sr. Data Engineer
W2 only
Location: Chicago, IL (nearby candidates preferred)
Hybrid (3 days in office, 2 days WFH)
Must have: Azure Databricks, ETL, Python or Hadoop, Informatica or Ab Initio, data lake, SQL.
AWS S3 and Databricks are the top two skills needed (must have). Candidates who don't know Databricks must be able to pick it up quickly.
Python and Spark; Hadoop/Ab Initio/Informatica would be helpful. The role involves working with pipelines and archiving data, and may involve moving data into a new data lake.
Below are the skills for the two Data Engineer positions:
Required qualifications, capabilities, and skills:
BS degree in Computer Science or certification in software engineering.
Proficient in data analysis, data engineering, data modeling, and database management.
Strong understanding of RDBMS, NoSQL, big data, SQL, and ETL tools.
Experience programming with at least one modern language such as Java, Python, or Unix shell.
Proficiency in REST APIs, microservices, distributed systems, and cloud (hybrid) computing.
Strong understanding of Agile methodologies, with the ability to work in at least one of the common frameworks.
Strong understanding of techniques such as CI/CD, TDD, cloud development, resiliency, and security.
Proven experience with business analysis, design, development, testing, deployment, maintenance, and improvement.
Preferred qualifications, capabilities, and skills:
Experience working with data-intensive software (such as big data, data warehouses, data lakes).
AWS experience, including development with AWS S3, Lambda, MSK, EC2, IAM, and related data products.
Experience with Databricks, Amazon RDS, Oracle, Hadoop/Cloudera, HUE, Hive, Impala.
Experience with GraphQL.
Full Time