Skills and Experience
Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
Minimum of 2 years of experience in data engineering, data infrastructure, or a similar technical role.
Proficiency with ETL tools and frameworks (e.g., Informatica, Talend, Apache NiFi, Airflow).
Experience with data warehousing platforms (e.g., Snowflake, Redshift, BigQuery, Azure Synapse, or similar).
Strong knowledge of database design, management, and optimization (SQL and NoSQL databases).
Experience in data modeling and building scalable data architectures.
Familiarity with big data technologies (e.g., Hadoop, Spark) and cloud data platforms (AWS, Azure, GCP).
Experience with API integration and working with RESTful/GraphQL APIs.
Understanding of data security best practices and regulatory compliance (e.g., HIPAA, GDPR).
Proficiency in scripting or programming languages (Python, SQL, shell scripting).
Strong analytical and problem-solving skills.
Excellent written and verbal communication skills; ability to clearly document and explain technical concepts.
Experience with version control systems (e.g., Git) and CI/CD pipelines is a plus.
Experience with Salesforce data integration and dashboarding is a plus (nice to have, but not required).
Attention to detail and ability to produce high-quality, error-free deliverables.