Premise offers a wide range of dynamic, purpose-driven career opportunities. We are currently looking for a Data Engineer to join our team in Brentwood, TN.
About the role: The Data Engineer will be responsible for building and maintaining scalable data pipelines, managing our Azure Delta Lake architecture, and supporting analytics initiatives across the organization. The ideal candidate will have a strong background in Azure Databricks, data engineering, and pipeline development. The successful candidate will work closely with our data engineering and data science teams to ensure seamless integration of data pipelines and Lakehouse architecture. This role is also responsible for upholding coding standards, promoting code to higher environments, and supporting CI/CD.
Essential Functions:
Data Pipeline Development:
Develop and maintain data pipelines using Azure Databricks, Apache Spark, and Python (see the sketch after this list).
Collaborate with data engineering teams to integrate data from EPIC Clarity, Caboodle, and other on-premises source systems into Azure.
Ensure data quality, integrity, and security throughout the data pipeline.
Automate unit test cases as part of the data pipeline.
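For illustration only, the snippet below is a minimal sketch of the kind of Databricks/PySpark ingestion step this role covers, assuming a Databricks workspace with Delta Lake available; the source path (/mnt/raw/encounters), table name (bronze_encounters), and key column (encounter_id) are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch: ingest raw files and append them to a Delta table (Databricks/PySpark).
# Path, table, and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("encounter-ingest").getOrCreate()

# Read raw extracts landed from an on-premises source system.
raw = spark.read.format("parquet").load("/mnt/raw/encounters")

# Basic data-quality step: drop rows missing the key and stamp the load time.
clean = (
    raw.dropna(subset=["encounter_id"])
       .withColumn("_ingested_at", F.current_timestamp())
)

# Append to a bronze-layer Delta table in the Lakehouse.
(clean.write.format("delta")
      .mode("append")
      .saveAsTable("bronze_encounters"))
```

In a Databricks notebook the SparkSession is already provided as spark, and a transformation like the dropna/withColumn step above can be factored into a function and covered by automated unit tests, as called for in this role.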
Lakehouse Architecture:
Contribute to the development and maintenance of our Lakehouse architecture using Azure Databricks and Delta Lake (see the sketch after this list).
Ensure data governance and security in the Lakehouse environment.
Execute proofs of concept (POCs) to evaluate new tools and concepts introduced by Azure and Databricks.
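For illustration only, here is a minimal sketch of routine Delta Lake work in a Lakehouse, assuming Databricks with Delta Lake; the table names (bronze_encounters, silver_patients) and the patient_id merge key are hypothetical, not details from this posting.

```python
# Minimal sketch: upsert staged changes into a Delta table and run maintenance (Databricks).
# Table names and the merge key are hypothetical placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.table("bronze_encounters")             # staged changes (hypothetical)
target = DeltaTable.forName(spark, "silver_patients")  # curated Delta table (hypothetical)

# Upsert by business key; Delta Lake provides ACID transaction guarantees.
(target.alias("t")
       .merge(updates.alias("s"), "t.patient_id = s.patient_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())

# Routine maintenance: compact small files and clean up files past the retention window.
spark.sql("OPTIMIZE silver_patients")
spark.sql("VACUUM silver_patients RETAIN 168 HOURS")
```

Governance controls would typically be layered on top with Unity Catalog or SQL GRANT statements rather than handled inside the pipeline code itself.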
Troubleshooting and Optimization:
Troubleshoot data pipeline and Lakehouse architecture issues, identifying and resolving bottlenecks and errors.
Optimize data pipeline and Lakehouse architecture performance, scalability, cost, and reliability.
Collaboration and Integration:
Collaborate with data science teams to integrate data pipelines and Lakehouse architecture with data science workflows.
Ensure seamless data exchange between data engineering and data science teams.
Participate in peer reviews to uphold coding standards.
Serve as a subject matter expert for PySpark and Scala.
Job Requirements:
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Certifications such as AWS Certified Solutions Architect, Cisco CCNP, Microsoft Azure Solutions Architect, or ITIL are preferred.
Databricks and Azure certifications preferred.
4 years of experience in cloud development practice.
Strong expertise in cloud platforms (AWS, Azure, GCP), virtualization, and networking.
Experience with infrastructure automation tools (Terraform, Ansible, Kubernetes).
Preferred Experience:
Knowledge of Talend, Power BI, and Qlik.
Proficiency in Linux and Windows server administration.
Solid knowledge of cybersecurity best practices and disaster recovery planning.
Strong leadership experience in managing technical teams and project management skills.
Knowledge of log management and monitoring tools (Splunk, Datadog, Prometheus).
Familiarity with containerization and microservices.
Full Time