Design and maintain robust, secure, and high-performance data pipelines to support DAISY's processing and analytics workflows
Implement authentication and authorization mechanisms to secure data access and movement
Work with both SQL and NoSQL databases to store and retrieve structured and unstructured data efficiently
Apply data caching and optimization strategies to improve application responsiveness and system performance
Build, deploy, and manage containerized applications using Docker
Collaborate with developers, architects, and security teams to ensure system compliance and performance
Troubleshoot pipeline and infrastructure issues in development and production environments
Contribute to architecture discussions and make recommendations for technology direction
5 years of experience in data engineering or backend data development
Strong hands-on experience in building data pipelines (ETL/ELT) for high-volume systems
In-depth knowledge of data security, authentication, and authorization strategies
Experience with SQL databases (e.g. PostgreSQL, SQL Server) and NoSQL databases (e.g. MongoDB, Redis)
Experience with caching strategies (e.g. Redis, Memcached)
Proficient in Docker: building, managing, and deploying containers
Familiarity with data processing frameworks (e.g. Apache Kafka, Spark) is a plus
Experience with CI/CD tools and version control (e.g. Git/GitHub)
Experience with Kubernetes for orchestration of containerized workloads
Background working on internal enterprise platforms or workflow systems
Knowledge of data governance, encryption, and compliance best practices
Strong scripting skills in Python, Bash, or similar
Experience in the Oil & Gas industry is a plus
Full-time