Role: Databricks Engineer
Location: Blue Ash, OH
Minimum Experience: 8-15 Years
Job Description:
- 5 years of proven professional experience with Databricks and/or Snowflake
- 4 years of Python development
- Object-oriented programming
- 3 years of distributed data processing (PySpark, Snowpark)
- Proficiency with CI/CD practices
- Automated data pipeline orchestration
- Data observability: logging, monitoring, and alerting
- API development
- Data quality checks
- Cloud technologies (Azure preferred)
Roles & Responsibilities:
- Develop distributed data processing pipeline solutions
- Orchestrate multi-step data transformation pipelines
- Perform unit, integration, and regression testing on packaged code
- Build transformation logic and code in an object-oriented programming style
- Enhance CI/CD pipelines along the path to production
- Create data quality checks for ingested and post-processed data
- Ensure data observability via alerting and monitoring of automated pipeline solutions
- Maintain and enhance existing applications
- Build cloud resources via infrastructure as code
- Provide mentoring to junior team members
- Participate in retrospective reviews
- Participate in the estimation process for new work and releases
- Bring new perspectives to problems
- Be driven to improve yourself and the way things are done.
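To illustrate the data quality and observability responsibilities above, here is a minimal sketch in plain Python of batch-level quality checks with logged results. All names (QualityResult, check_no_nulls, run_checks) are hypothetical, not part of this posting; in practice these checks would typically run against Databricks/Snowflake tables rather than in-memory rows.

```python
# Hypothetical sketch: data quality checks with logging for observability.
# Names and structure are illustrative only, not an assumed team standard.
import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("pipeline.quality")


@dataclass
class QualityResult:
    check: str
    passed: bool
    detail: str = ""


def check_no_nulls(rows, column):
    """Fail if any row is missing a value for `column`."""
    missing = [i for i, r in enumerate(rows) if r.get(column) is None]
    detail = f"{len(missing)} null(s) in '{column}'" if missing else ""
    return QualityResult(f"no_nulls:{column}", not missing, detail)


def check_row_count(rows, minimum):
    """Fail if the batch is smaller than expected."""
    passed = len(rows) >= minimum
    detail = "" if passed else f"got {len(rows)} rows, expected >= {minimum}"
    return QualityResult(f"row_count>={minimum}", passed, detail)


def run_checks(rows, checks):
    """Run each check, log pass/fail for monitoring, and return results."""
    results = [check(rows) for check in checks]
    for r in results:
        if r.passed:
            logger.info("PASS %s", r.check)
        else:
            # In a real pipeline, a failure here would trigger an alert.
            logger.error("FAIL %s: %s", r.check, r.detail)
    return results
```

Usage: `run_checks(rows, [lambda r: check_no_nulls(r, "amount"), lambda r: check_row_count(r, 1)])` returns one `QualityResult` per check, which a caller can use to fail the pipeline step or fire an alert.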