This is a remote position.
At Softgic, we work with the coolest: with those who build, with those who love what they do, with those who have 100% attitude, because that's our #Cooltura. Join our purpose of making life easier with technology and be part of our team as a Data Engineer.
Compensation:
USD 20 - 28/hour.
Location:
Remote (anywhere).
Mission of Softgic:
At Softgic S.A.S. we work for the digital and cognitive transformation of our clients. Aware that quality is an essential factor for us, we incorporate the following principles into our policy:
- Deliver quality products and services.
- Achieve the satisfaction of our internal and external clients.
- Encourage in our team the importance of training to grow professionally and personally through development plans.
- Comply with the applicable legal and regulatory requirements.
- Promote continuous improvement of the quality management system.
What makes you a strong candidate:
- You are proficient in Azure Data Lake, Azure SQL, ELT (Extract, load, transform), and Python.
- English: Native or fully fluent.
- Spanish: Native or fully fluent.
Responsibilities and more:
- Design, develop, and maintain scalable data architectures using SQL Server, Azure SQL Database, and Snowflake on Azure.
- Implement and manage data pipelines using Azure Data Factory, supporting ETL and ELT processes.
- Work with SQL Change Data Capture (CDC) along with Debezium to enable real-time and incremental data processing.
- Work with streaming technologies such as Kafka and Azure Event Hubs to deliver near-real-time analytics and reporting.
- Manage Azure Data Lake to store and process structured and unstructured data efficiently.
- Design and optimize Data Vault and Star Schema models for data warehousing solutions.
- Develop and maintain ETL/ELT workflows using Python and SQL-based tools.
- Leverage Databricks for big data processing, machine learning, and advanced analytics.
- Ensure data quality, governance, and security across multiple data environments.
- Build and maintain analytical reports using Sigma.
- Collaborate with business stakeholders and data analysts to ensure data solutions align with business needs.
- Monitor and troubleshoot data pipelines to ensure reliability, accuracy, and efficiency.
- Support disaster recovery planning and high-availability data strategies.
- Stay up to date with emerging data engineering technologies and best practices.
Requirements
Abilities:
- 5 - 7 years of experience as a data architect or senior-level data engineer.
- Expertise in SQL Server (SSMS, T-SQL, SSIS, SSRS, SSAS) and Azure SQL Database.
- Strong experience in data modeling, including Data Vault and Star Schema methodologies.
- Proficiency in ETL/ELT development and data pipeline management.
- Hands-on experience with Snowflake on Azure and Databricks for big data processing.
- Experience working with streaming technologies (e.g., Kafka, Flink, Event Hubs).
- Strong analytical and problem-solving skills with a focus on data integrity and scalability.
- Knowledge of Python for data transformation, automation, and analytics is a bonus.
Requirements:
- Ability to sit or stand for extended periods of time as required.
- Ability to work in a fast-paced, deadline-driven environment with minimal supervision.
Benefits
- We're certified as a Great Place to Work.
- Opportunities for advancement and growth.
- Paid time off.
- Formal education and certifications support.
- Benefits with partner companies.
- Referral program.
- Flexible working hours.
Azure Data Lake, Azure SQL, ELT (Extract, load, transform), ETL (Extract, transform, load), Microsoft SQL Server, Python