Company Overview:
Pibythree is a leading data solutions provider dedicated to helping organizations harness the power of their data. Our mission is to deliver innovative, reliable, and scalable data management solutions that enable our clients to make informed decisions. We prioritize a culture of collaboration, continuous learning, and excellence, where every team member is empowered to contribute to our collective success. As we continue to grow, we are committed to maintaining a workplace that fosters creativity, diversity, and professional development.
Role Responsibilities:
- Design and implement data models to support business needs.
- Develop and maintain ETL processes for extracting, transforming, and loading data into Snowflake.
- Create and optimize SQL queries to enhance performance and data retrieval.
- Collaborate with data analysts and business stakeholders to understand data requirements.
- Use dbt (data build tool) to develop and manage data transformations.
- Monitor data pipelines and ensure data integrity and accuracy.
- Prepare documentation for data processes and architecture.
- Implement best practices for data management and security measures.
- Conduct performance tuning and ensure efficient data processing.
- Assist in troubleshooting and resolving data-related issues.
- Work with cross-functional teams to ensure smooth data delivery operations.
- Stay updated on industry trends and emerging technologies relevant to data engineering.
- Participate in code reviews and provide constructive feedback.
- Contribute to the design and architecture of data solutions on the cloud.
- Mentor junior engineers and share knowledge within the team.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 3 years of experience in data engineering or a related role.
- Strong experience with Snowflake and related cloud services.
- Proficiency in SQL and experience with dbt.
- Experience with ETL tools and data warehousing concepts.
- Solid understanding of data modeling techniques.
- Hands-on experience with Python or similar programming languages.
- Familiarity with data visualization tools is a plus.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Ability to work in a fast-paced environment and manage multiple priorities.
- Strong attention to detail and commitment to quality.
- Experience working with large datasets and data lakes.
- Knowledge of data security and compliance regulations.
- Willingness to learn and adapt to new technologies.
Skills:
data modeling, SQL, problem solving, data warehousing, Snowflake, Azure SQL, dbt, Python, data security, data visualization, shell scripting, data integrity, Python scripting, ETL processes