Job Title: DevOps Data Engineer
Company: Reef Capital
Salary: $125,000-$175,000
Location: Lehi, UT; On-site
*Please note that we cannot offer relocation or visa sponsorship for this position.
About Reef:
Reef Capital is a Utah-based, vertically integrated investment and development firm founded in 2005 with three primary lines of business: Investments, Real Estate, and Lifestyle. Reef's team is currently involved in some of the most prominent development transactions in Utah and other targeted geographies across the United States.
Built on two decades of success, our investment approach combines proven expertise with purposeful innovation. Reef's team has completed more than 500 transactions across all lines of business. In addition to our real estate investment strategy, Reef and/or its affiliates own and operate various businesses that add significant long-term value to its projects. We have grown rapidly, from about 25 employees to well over 400, and manage assets on behalf of over 750 institutional and individual partners worldwide.
With the motto "Expect the Best," Reef's most prominent developments include Black Desert Resort, a $2 billion luxury resort in Ivins, Utah; Marcella, a luxury private golf community in Park City, Utah; Tributer Resort, Virginia's newest premier private lakeside golf destination; Cornerstone Club, a 5,000-acre residential community, private club, and resort in Telluride, Colorado; Sweetens Cove, a renowned and evolving golf destination nestled in the Tennessee Valley; and the restoration of the historic Coco Palms Resort in Wailua on the island of Kauai, Hawaii.
At Reef, our mission is to recruit, develop, and retain entrepreneurial individuals who desire to build and create something long-lasting and meaningful. Our business enables bright, committed people to work in high-performing teams within an environment that allows each person to achieve their professional objectives. Reef values a strong culture dedicated to the health and well-being of our employees.
Position Summary:
Reef is seeking an experienced and highly motivated DevOps Data Engineer to design, build, and maintain scalable data infrastructure, pipelines, and integrations that support enterprise analytics, reporting, and data-driven decision-making. This role is essential for centralizing data from multiple sources and ensuring data quality and accessibility while automating processes and aligning data solutions with business objectives. The ideal candidate will combine expertise in data pipelines, cloud platforms, scripting, and integration with strong project collaboration skills.
Key Responsibilities:
Data Infrastructure and Pipelines
- Design, build, and maintain data lakes (e.g., using AWS Lake Formation, S3, Snowflake, or similar platforms such as Microsoft Fabric) for centralized data storage, ingestion from multiple sources, and governance.
- Implement and optimize data pipelines for ETL (Extract, Transform, Load) processes, ensuring data quality, security, and compliance while handling large-scale enterprise data volumes from SaaS applications.
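At its simplest, the kind of ETL pipeline described above can be sketched in Python. This is an illustrative sketch only, not part of the role's actual stack: the source rows, field names, and the single data-quality rule are all invented for the example, and a real implementation would read from a SaaS API or S3 and write to Snowflake or Parquet rather than an in-memory list.

```python
# Minimal ETL sketch: pull records from a (hypothetical) SaaS export,
# validate and normalize them, then stage them for loading into a data lake.

def extract(raw_rows):
    """Simulate extraction; in practice this would call a SaaS API or read S3."""
    return [dict(row) for row in raw_rows]

def transform(rows):
    """Drop rows failing a basic quality check and normalize field values."""
    cleaned = []
    for row in rows:
        if not row.get("email"):          # data-quality rule: require an email
            continue
        cleaned.append({
            "email": row["email"].strip().lower(),
            "amount_usd": float(row.get("amount", 0)),
        })
    return cleaned

def load(rows, target):
    """Append to an in-memory 'lake' stand-in; real code would write to a warehouse."""
    target.extend(rows)
    return len(rows)

lake = []
raw = [{"email": " Ada@Example.com ", "amount": "12.50"}, {"amount": "3"}]
loaded = load(transform(extract(raw)), lake)
```

The point of the separation is that each stage can be tested, monitored, and retried independently, which is what makes pipelines like this operable at enterprise scale.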
Data Migration and Optimization
- Plan and execute data migrations with a focus on preserving data integrity and minimizing downtime.
- Optimize data flows and storage for advanced analytics, reporting, and real-time syncing.
Integrations and Automation
- Lead integration projects to enable interoperability across enterprise applications (e.g., Salesforce, Microsoft 365, HubSpot).
- Develop scripts in Python and PowerShell for custom data ingestion, transformation, automation, and system-efficiency tasks.
- Identify automation opportunities to reduce manual data handling and enhance integrated environments.
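A hedged illustration of the scripting work above: deduplicating contact records exported from two enterprise apps so downstream reports see a single merged view. All system names and fields here are hypothetical, and "later sources win" is just one possible conflict-resolution policy.

```python
# Sketch: merge contact records from two (hypothetical) enterprise-app exports,
# keyed on lowercased email, replacing a manual copy-paste reconciliation step.

def merge_contacts(crm_rows, marketing_rows):
    """Join rows on email; fields from later sources overwrite earlier ones."""
    merged = {}
    for row in crm_rows + marketing_rows:
        key = row["email"].lower()
        merged.setdefault(key, {}).update(row)  # accumulate fields per contact
        merged[key]["email"] = key              # keep the normalized key
    return list(merged.values())

crm = [{"email": "PAT@example.com", "phone": "801-555-0100"}]
mkt = [{"email": "pat@example.com", "company": "Acme"}]
contacts = merge_contacts(crm, mkt)
```

Scripts like this are typically scheduled (Task Scheduler, cron, or a pipeline orchestrator) so the integrated environment stays consistent without manual data handling.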
Analytics and Reporting Support
- Build custom reports and visualizations using tools like Microsoft SQL scripting, Power BI, or SharePoint.
- Support data governance, security best practices, and documentation.
Project Management and Collaboration
- Collaborate with cross-functional teams to translate business needs into technical data solutions.
- Lead data-related projects, including migrations, pipeline deployments, and technology initiatives.
- Research and recommend emerging technologies (e.g. cloud-native data tools) to support data-driven growth.
Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5 years of experience in data engineering, system administration, or a similar role.
- Expertise with enterprise applications such as Microsoft 365, Salesforce, HubSpot, Adobe, and DocuSign.
- Hands-on experience with data platforms (e.g., Snowflake, AWS services) and ETL processes.
- Strong knowledge of SSO, API integrations, and data interoperability.
- Advanced scripting skills in PowerShell (including O365 modules) and Python (mandatory).
- Proficiency in Microsoft SQL scripting, reporting, and SharePoint.
- Experience with data migrations, pipeline optimization, and IT/data projects.
- Ability to manage multiple priorities, timelines, and stakeholders effectively.
- Excellent problem-solving, analytical thinking, communication, and collaboration skills.
- Detail-oriented with a proactive approach to identifying and addressing challenges.
Company Benefits:
- 401(k) Plan with Company Match
- Generous Health Plan with HSA Match
- Flexible Paid Time-off
- Daily Company Lunches
- Cell Phone Service Allowance
- Discounts at Company-owned Resorts and Golf Courses
Required Experience:
IC