Job Title: Software Developer - Data Integration Platform
Requirements:
- Technical Skills (Mandatory):
- Python (Data Ingestion Pipelines): Proficiency in building and maintaining data ingestion pipelines using Python.
- Blazegraph: Experience with Blazegraph technology.
- Neptune: Familiarity with Amazon Neptune, a fully managed graph database service.
- Knowledge Graph (RDF Triples): Understanding of RDF (Resource Description Framework) and triple stores for knowledge graph management.
- AWS Environment (S3): Experience working with AWS services, particularly S3, for storage solutions.
- GIT: Proficiency in using Git for version control.
- Optional Skills (Good to Have):
- Azure DevOps (Optional): Experience with Azure DevOps for CI/CD pipelines and project management.
- Metaphactory by Metaphacts (Very Optional): Familiarity with Metaphactory, a platform for knowledge graph management.
- LLM / Machine Learning Experience: Experience with Large Language Models (LLM) and machine learning techniques.
- Big Data Solutions (Optional): Experience with big data solutions is a plus.
- SnapLogic / Alteryx / ETL Know-How (Optional): Familiarity with ETL tools such as SnapLogic or Alteryx is beneficial.
- Professional Experience:
- Professional Software Development: Demonstrated experience in professional software development practices.
- Years of Experience: 3-5 years of relevant experience in software development and related technologies.
- Soft Skills:
- Strong problem-solving skills.
- Excellent communication and teamwork abilities.
- Ability to work in a fast-paced and dynamic environment.
- Strong attention to detail and commitment to quality.
- Fluent in English (spoken and written).
- Educational Background:
- A degree in Computer Science, Engineering, or a related field is preferred.