Workato transforms technology complexity into business opportunity. As the leader in enterprise orchestration, Workato helps businesses globally streamline operations by connecting data, processes, applications, and experiences. Its AI-powered platform enables teams to navigate complex workflows in real time, driving efficiency and agility.
Trusted by a community of 400,000 global customers, Workato empowers organizations of every size to unlock new value and lead in today's fast-changing world. Learn how Workato helps businesses of all sizes achieve more at workato.com.
Ultimately, Workato believes in fostering a flexible, trust-oriented culture that empowers everyone to take full ownership of their roles. We are driven by innovation and looking for team players who want to actively build our company.
But we also believe in balancing productivity with self-care. That's why we offer all of our employees a vibrant and dynamic work environment along with a multitude of benefits they can enjoy inside and outside of their work lives.
If this sounds right up your alley, please submit an application. We look forward to getting to know you!
Also feel free to check out why:
Business Insider named us an enterprise startup to bet your career on
Forbes Cloud 100 recognized us as one of the top 100 private cloud companies in the world
Quartz ranked us the #1 best company for remote workers
At Workato, we're redefining business automation by integrating innovative technologies that drive digital transformation. We're seeking a highly skilled Senior Data Engineer to lead the design, development, and optimization of our modern data infrastructure. In this role, you will work extensively with advanced tools such as dbt, AutomateDV, Trino, Snowflake, Apache Iceberg, and Apache Airflow to build robust, scalable, and efficient data pipelines that power our decision-making and analytics capabilities.
You will work closely with data scientists: providing a Data Vault for them, integrating their models into the vault, and consolidating data from several sources into a single data warehouse (see the sketch after this list):
Product usage data
ETL data from AI services
Business data
External data
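To make the Data Vault integration concrete, here is a minimal, hypothetical Python sketch of the standard hub-keying pattern: business keys from different sources are normalized and hashed so the same entity lands on the same hub key. The record shapes and column names are illustrative assumptions, not our actual schema.

```python
import hashlib

def hub_hash_key(business_key: str) -> str:
    """Derive a deterministic Data Vault hub key by hashing a normalized business key."""
    return hashlib.md5(business_key.strip().upper().encode("utf-8")).hexdigest()

# Hypothetical records for the same customer arriving from two sources.
product_usage = {"customer_id": "acct-42", "events": 120}   # product usage data
crm_record    = {"customer_id": "ACCT-42", "plan": "pro"}   # business data

# Normalization plus hashing gives both rows the same hub key,
# so they integrate cleanly into a single warehouse entity.
assert hub_hash_key(product_usage["customer_id"]) == hub_hash_key(crm_record["customer_id"])
```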
In this role, you will also be responsible for the following:
Data Pipeline Development:
Design, develop, and maintain data pipelines and ETL processes using dbt and Apache Airflow to ensure seamless data integration, transformation, and validation across diverse data sources.
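As a hedged sketch of this dbt-plus-Airflow pattern (assuming Airflow 2.x; the DAG id, schedule, and project path are illustrative assumptions, not our actual configuration), an orchestrating DAG might look like:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily pipeline: build dbt models, then validate them with dbt tests.
with DAG(
    dag_id="dbt_daily_transform",  # assumed name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",  # assumed path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )
    dbt_run >> dbt_test  # tests gate downstream consumers
```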
Data Infrastructure Management:
Architect and implement scalable data solutions using Snowflake as the data warehouse, and leverage Trino for efficient querying across distributed data sets.
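For illustration, a federated read through Trino's Python client might look like the sketch below; the host, catalogs, and table names are assumptions, but the shape (one SQL statement joining a Snowflake table with an Iceberg table) is the kind of cross-source query Trino enables.

```python
import trino  # Trino's DB-API client

# Hypothetical connection details.
conn = trino.dbapi.connect(
    host="trino.example.internal",
    port=8080,
    user="data-eng",
    catalog="snowflake",
    schema="analytics",
)

cur = conn.cursor()
# One query spanning two catalogs: Snowflake accounts joined to Iceberg events.
cur.execute("""
    SELECT a.account_id, count(*) AS events
    FROM snowflake.analytics.accounts AS a
    JOIN iceberg.raw.product_events AS e ON e.account_id = a.account_id
    GROUP BY a.account_id
""")
for row in cur.fetchall():
    print(row)
```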
Modern Data Technologies:
Integrate and optimize data workflows using AutomateDV and Apache Iceberg to manage data versioning, quality, and lifecycle, ensuring reliability and compliance.
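AutomateDV itself is a set of dbt macros, so the Python sketch below only illustrates the Iceberg half of this item: snapshot metadata is what gives the lakehouse versioned, auditable data. It assumes a PyIceberg-configured catalog, and the catalog and table names are hypothetical.

```python
from pyiceberg.catalog import load_catalog

# Assumes connection details in pyiceberg's standard configuration.
catalog = load_catalog("default")
table = catalog.load_table("raw.product_events")  # hypothetical table

# Iceberg retains every snapshot, which underpins data versioning and audits.
for snap in table.metadata.snapshots:
    print(snap.snapshot_id, snap.timestamp_ms)

current = table.current_snapshot()
print("current:", current.snapshot_id if current else None)
```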
Collaboration & Leadership:
Work closely with data scientists, analysts, and business stakeholders to translate requirements into technical solutions. Mentor junior engineers and lead code reviews to promote best practices in data engineering.
Performance & Optimization:
Continuously monitor, troubleshoot, and optimize data processes to ensure high performance, minimal downtime, and optimal resource utilization.
Innovation & Best Practices:
Stay abreast of emerging trends in data engineering and automation, driving innovation and adopting new tools and techniques that enhance data processing and integration capabilities.
Education & Experience:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
5 years of experience in data engineering, with a proven track record of designing and managing large-scale data infrastructures.
Technical Expertise:
Proficiency in dbt for data transformation and modeling.
Experience with AutomateDV for automating Data Vault modeling in dbt.
Hands-on expertise with Trino as a distributed SQL query engine.
Deep understanding of Snowflake architecture and its ecosystem.
Knowledge of Apache Iceberg for managing large analytic datasets.
Strong background in orchestrating workflows using Apache Airflow.
Proficiency in SQL and at least one programming language (Python preferred).
Analytical & Problem-Solving Skills:
Ability to analyze complex data challenges and design innovative, data-driven solutions.
Strong debugging skills and attention to detail.
Soft Skills:
Excellent communication and collaboration skills.
Demonstrated leadership and mentoring capabilities.
Ability to thrive in a fast-paced, dynamic environment.
Familiarity with cloud data platforms (AWS, GCP, or Azure) and containerization technologies.
Experience in agile development methodologies.
Proven track record of working in automation-centric environments.
Required Experience:
Senior IC
Full Time